Data Analysis · 11 min read

50% of Our Subscribers Showed Exit Signals 5 Days Before Canceling

We built a churn prediction model from fan stats and chat data and found that declining message opens combined with no PPV activity in 72 hours predicted cancellation with 85% accuracy — five days before it happened. Monthly churn dropped from 42% to 28% in the first 90 days.

OFAPI Team

We used to treat churn as a thing that happened to us. A subscriber would cancel, the number would tick down, and we’d move on. There was no prevention step because we had no early warning — by the time we knew someone was leaving, they were already gone.

Then we started treating churn as a process rather than an event. Subscribers don’t usually decide to cancel in an instant. They disengage gradually over days. They stop opening messages. They stop purchasing. They become passive until the renewal date arrives and they make the obvious decision to stop paying for something they’re no longer using.

If the disengagement is gradual, it’s detectable. And if it’s detectable early enough, it’s actionable.

After three months of pulling data from /stats/fans and /chats and correlating engagement patterns against actual cancellation events, we found that roughly 50% of subscribers who ultimately canceled showed a consistent cluster of behavioral signals five to seven days before canceling. More importantly, we found that an automated re-engagement DM triggered at day three of declining activity interrupted the pattern often enough to move the needle on monthly churn in a meaningful way.

Our monthly churn rate dropped from 42% to 28% for the creator we ran this on first. That is not a marginal improvement — it is the difference between a creator who is barely treading water and one who is compounding their subscriber base month over month.

What the Exit Signals Look Like

The clearest predictor of churn in our data was not a single behavior but a combination of two:

Signal 1 — Declining message open rate. Subscribers who are about to cancel show a consistent drop in DM engagement. They stop opening messages. If a fan who was opening 80% of messages in their first 30 days has gone three consecutive days without opening anything, that is a yellow flag.

Signal 2 — Zero PPV activity in 72 hours. This one required nuance. Not every subscriber buys PPV regularly, so the absence of a PPV purchase is only meaningful if there was previous PPV behavior to compare against. For fans who had purchased at least two PPVs in their first 60 days, a 72-hour gap in PPV activity — combined with declining opens — was our strongest churn predictor.

Together, these two signals flagged 85% of eventual churners before they canceled, with a false alarm rate of around 22%. In other words, the model caught 85 of every 100 fans who went on to cancel, while roughly 22 of every 100 flags it raised were fans who weren’t actually leaving. That false alarm rate meant we were sending unnecessary re-engagement DMs to some fans who weren’t planning to leave, but those messages had a positive conversion rate on their own, so the “false positives” weren’t wasted.
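For clarity on how those two rates fit together, here is a minimal sketch of the backtest arithmetic, using made-up confusion-matrix counts rather than our production data:

```python
# Hypothetical backtest counts -- illustrative only, not the real dataset.
flagged_and_churned = 85   # true positives: flagged fans who canceled
flagged_not_churned = 24   # false positives: flagged fans who stayed
missed_churners = 15       # false negatives: churners the model never flagged

total_churners = flagged_and_churned + missed_churners
total_flags = flagged_and_churned + flagged_not_churned

catch_rate = flagged_and_churned / total_churners      # share of churners caught
false_alarm_rate = flagged_not_churned / total_flags   # share of flags that were wrong

print(f"catch rate: {catch_rate:.0%}")
print(f"false alarms: {false_alarm_rate:.0%}")
```

Both rates matter independently: catch rate determines how much churn you can even attempt to prevent, while the false alarm rate determines how much re-engagement effort gets spent on fans who were staying anyway.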

Building the Churn Risk Score

The risk score runs as a daily job. For each active subscriber, it pulls the last 14 days of fan stats and chat activity, scores the two signals, and outputs a risk tier. High-risk fans trigger an automated re-engagement message through the chat API within the hour.

import requests
from datetime import datetime, timedelta
from collections import defaultdict

API_KEY = "your_api_key"
BASE_URL = "http://157.180.79.226:4024/api/v1"

headers = {"X-API-Key": API_KEY}

def get_fan_stats(creator_id, limit=1000):
    response = requests.get(
        f"{BASE_URL}/stats/fans",
        headers=headers,
        params={"creator_id": creator_id, "limit": limit},
        timeout=30  # don't let a stalled request hang the daily job
    )
    response.raise_for_status()
    return response.json().get("fans", [])

def get_recent_chats(creator_id, fan_id, days_back=14):
    since = (datetime.utcnow() - timedelta(days=days_back)).isoformat()
    response = requests.get(
        f"{BASE_URL}/chats",
        headers=headers,
        params={"creator_id": creator_id, "fan_id": fan_id, "since": since},
        timeout=30
    )
    response.raise_for_status()
    return response.json().get("chats", [])

def compute_churn_risk(fan, chats):
    score = 0
    signals = []

    # Signal 1: Message open rate decline
    recent_opens = fan.get("opensLast3d", 0)
    baseline_opens = fan.get("opensLast30d", 1)
    daily_baseline = baseline_opens / 30

    if daily_baseline > 0:
        recent_daily = recent_opens / 3
        open_rate_ratio = recent_daily / daily_baseline

        if open_rate_ratio < 0.2:
            score += 45
            signals.append("critical_open_decline")
        elif open_rate_ratio < 0.5:
            score += 25
            signals.append("moderate_open_decline")

    # Signal 2: PPV activity gap for fans with PPV history
    ppv_history = fan.get("ppvPurchases60d", 0)
    ppv_recent = fan.get("ppvPurchases3d", 0)

    if ppv_history >= 2 and ppv_recent == 0:
        score += 40
        signals.append("ppv_gap_with_history")

    # Signal 3: No outbound chat response in 5+ days
    last_fan_reply = None
    for chat in chats:
        for ts in chat.get("fanMessageTimestamps", []):
            # Normalize a trailing "Z" and drop the timezone so the
            # comparison against utcnow() below stays naive-vs-naive
            t = datetime.fromisoformat(ts.replace("Z", "+00:00")).replace(tzinfo=None)
            if last_fan_reply is None or t > last_fan_reply:
                last_fan_reply = t

    if last_fan_reply:
        days_since_reply = (datetime.utcnow() - last_fan_reply).days
        if days_since_reply >= 5:
            score += 20
            signals.append("fan_silent_5d")

    # Subscription age adjustment — new fans churn for different reasons
    sub_days = fan.get("subscriptionDays", 0)
    if sub_days < 14:
        score = int(score * 0.6)  # dampen signal for very new subs

    risk_tier = "low"
    if score >= 70:
        risk_tier = "high"
    elif score >= 40:
        risk_tier = "medium"

    return {
        "fan_id": fan["fan_id"],
        "username": fan.get("username"),
        "churn_score": score,
        "risk_tier": risk_tier,
        "signals": signals,
        "subscription_days": sub_days
    }

def run_churn_scan(creator_id):
    fans = get_fan_stats(creator_id)
    active_fans = [f for f in fans if f.get("subscriptionStatus") == "active"]

    results = []
    for fan in active_fans:
        chats = get_recent_chats(creator_id, fan["fan_id"])
        risk = compute_churn_risk(fan, chats)
        results.append(risk)

    high_risk = [r for r in results if r["risk_tier"] == "high"]
    medium_risk = [r for r in results if r["risk_tier"] == "medium"]

    print(f"\nChurn Scan — {creator_id}")
    print(f"Active subscribers scanned: {len(active_fans)}")
    print(f"High risk (score 70+): {len(high_risk)}")
    print(f"Medium risk (score 40-69): {len(medium_risk)}")
    print(f"\nTop at-risk fans:")
    for r in sorted(high_risk, key=lambda x: x["churn_score"], reverse=True)[:10]:
        print(f"  {r['username']:<20} score={r['churn_score']:>3}  signals={r['signals']}")

    return results

risks = run_churn_scan("creator_123")

The daily scan runs in the early morning, before the chatter team comes online. High-risk fans land in a queue that the chatter team works first thing, and automated re-engagement fires within an hour of the scan completing.

The Re-Engagement Message That Actually Works

We tested five different re-engagement message types over the first six weeks. Performance varied significantly.

Generic check-in (“Hey, haven’t heard from you in a while!”) — 14% positive response rate. Low purchase conversion.

New content teaser — 22% positive response rate. Moderate conversion when the content was genuinely new.

Direct PPV offer at a discount — 19% positive response rate. Higher purchase conversion among those who responded, but the discount framing was a signal we weren’t excited about long-term.

Personalized callback to previous purchase — 31% positive response rate. Highest conversion. Something like: “Hey — I remember you grabbed [the last PPV]. Made something similar this week. Sending it your way.” The specificity signals the creator paid attention.

Open question about what they want — 28% positive response rate. Converts less immediately but generates high-quality preference data and tends to produce longer-term re-engagement.

We settled on a rotation of the personalized callback and open question formats, chosen dynamically based on whether the fan has PPV purchase history. Fans with PPV history get the callback. Fans without it get the open question.

def build_reengagement_message(fan_risk, fan_info):
    username = fan_risk["username"]
    has_ppv_history = fan_info.get("ppvPurchases60d", 0) > 0
    last_ppv_title = fan_info.get("lastPpvTitle", None)

    if has_ppv_history and last_ppv_title:
        message = (
            f"Hey {username} — I've been putting together something along the lines of "
            f"what you grabbed before. Wanted to make sure you saw it before I sent it out broadly."
        )
    else:
        message = (
            f"Hey {username} — been a minute. Genuinely curious: what would make this "
            f"subscription feel worth it for you right now? I want to make more of what you actually want."
        )

    return message

def trigger_reengagement(creator_id, fan_id, message):
    payload = {
        "creator_id": creator_id,
        "fan_id": fan_id,
        "message": message
    }
    response = requests.post(
        f"{BASE_URL}/chats",
        headers=headers,
        json=payload,
        timeout=30
    )
    return response.status_code == 201
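To wire the scan output into the sender, we keep a thin, pure dispatch step between the two. The helper below is an illustrative sketch (plan_reengagement_sends is our own naming, not an API endpoint): it decides who gets which template, so the actual network send stays a one-line loop over trigger_reengagement:

```python
def plan_reengagement_sends(risk_results, fan_info_by_id):
    """Pair each high-risk fan with the message template they should receive.

    risk_results: output of run_churn_scan()
    fan_info_by_id: {fan_id: raw fan-stats dict} used for personalization
    """
    sends = []
    for risk in risk_results:
        if risk["risk_tier"] != "high":
            continue
        fan_info = fan_info_by_id.get(risk["fan_id"], {})
        # Fans with PPV history get the personalized callback;
        # everyone else gets the open question (see build_reengagement_message)
        has_ppv_history = fan_info.get("ppvPurchases60d", 0) > 0
        template = "callback" if has_ppv_history else "open_question"
        sends.append({"fan_id": risk["fan_id"], "template": template})
    return sends
```

Keeping selection pure means the queue can be dry-run and reviewed before anything is actually sent to a fan.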

The Before and After

We ran the full churn prediction and re-engagement system for 90 days on our primary test creator before pulling comparison data. The creator had 2,400 active subscribers at the start of the period.

Before the system (90-day baseline):

  • Monthly churn rate: 42%
  • Monthly net subscriber growth: -8% (growth couldn’t outpace churn)
  • Re-engagement DMs sent: 0 (no system existed)

After the system (first 90 days):

  • Monthly churn rate: 28%
  • Monthly net subscriber growth: +11%
  • High-risk fans re-engaged: 847
  • Of those, 31% did not cancel within the following 30 days (vs. 15% of uncontacted high-risk fans)

The 14-point reduction in monthly churn rate compounded quickly. A creator with 2,400 subscribers at 42% monthly churn is losing approximately 1,008 subscribers per month. At 28%, they’re losing 672. That 336-subscriber difference — at the creator’s average monthly revenue per subscriber — represented roughly $5,900 in retained monthly revenue.
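The arithmetic behind that figure, made explicit. The ~$17.50 average monthly revenue per subscriber is a back-of-envelope value implied by the numbers above, used here for illustration:

```python
subscribers = 2400
churn_before = 0.42
churn_after = 0.28
arpu = 17.50  # assumed average monthly revenue per subscriber (illustrative)

lost_before = round(subscribers * churn_before)  # ~1008 subs lost per month
lost_after = round(subscribers * churn_after)    # ~672 subs lost per month
retained = lost_before - lost_after              # 336 subscribers retained

retained_revenue = retained * arpu
print(f"Retained {retained} subs -> ${retained_revenue:,.0f}/month")
```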

What surprised us was that the re-engagement DMs also generated their own revenue. The 847 high-risk fans who received messages had a 31% non-churn rate, but they also had a 19% PPV purchase rate in the 14 days following the message. Re-engagement created retention and spend simultaneously.

Building Early Warning Into Your Operations

The system we built is not complicated, but it requires commitment to running it consistently. The daily scan needs to be a scheduled job, not a manual check. The re-engagement queue needs to be staffed when the scan fires. And the model needs to be recalibrated periodically as you accumulate more data about which signals actually predict churn for your specific creator.
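Recalibration can stay simple too. One sketch of the approach (the helper and its threshold grid are ours, nothing standard): replay past churn scores against known outcomes and pick the risk-score cutoff with the best precision/recall balance:

```python
def pick_threshold(scored_outcomes, candidates=(40, 50, 60, 70, 80)):
    """scored_outcomes: list of (churn_score, actually_churned) pairs
    from a past window. Returns the candidate threshold with the best F1."""
    best_threshold, best_f1 = None, -1.0
    for threshold in candidates:
        tp = sum(1 for s, churned in scored_outcomes if s >= threshold and churned)
        fp = sum(1 for s, churned in scored_outcomes if s >= threshold and not churned)
        fn = sum(1 for s, churned in scored_outcomes if s < threshold and churned)
        if tp == 0:
            continue  # threshold catches nothing; skip it
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        f1 = 2 * precision * recall / (precision + recall)
        if f1 > best_f1:
            best_threshold, best_f1 = threshold, f1
    return best_threshold
```

Run something like this monthly over the trailing window, and swap the high-risk cutoff if a different threshold consistently wins.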

What we found is that OnlyFans churn prediction is not really about sophisticated machine learning — it’s about having any predictive system at all. Most agencies have none. Moving from zero to a simple rule-based model with two signals produces most of the available lift. You can refine from there.

The data required for the model already sits in your API access. The hard part is building the habit of acting on it daily instead of treating churn as a post-hoc event to be mourned.

For related context on how fan behavior data informs retention sequences, see our post on week-two churn patterns and the creator health scoring framework we use for portfolio-level monitoring.


Forty-two percent monthly churn is not a creator problem or a content problem. It is a data problem — specifically, the absence of any system to detect and act on the signals that precede cancellation. Pick two behavioral signals, build a daily scan, and staff someone to work the queue.

See what the full analytics suite makes possible on the pricing page, or start building your own churn model using the API documentation.

Ready to automate your OnlyFans operations?

Get full API access and start building in minutes.