Score Decay: Keeping Your AI SDR Honest Over Time

The hottest lead on my dashboard once clicked a banner eight months ago.
He’d moved teams since. New boss, new priorities, new inbox. My AI SDR still flagged him as “🔥warm.” Sales chased. Nothing. Then nothing again. Then a “who is this?” reply that I deserved.
Old engagement wears a flattering cologne. It smells like momentum even when there’s none.
The Mirage
Most internal AI-SDR builds (mine included, once) treat engagement like a trophy you win forever. One ad click and you’re knighted “high intent.” A couple of site visits and your score climbs the ladder and never comes down.
That’s how you end up with tidy spreadsheets and messy weeks.
Because yesterday’s “hot” silently becomes today’s “maybe,” then “meh,” then “please stop calling me.” If we don’t let time drain the score, time drains our team.
How We Got Here
We do this because it’s easy to add signals and hard to subtract belief.
A product marketer asks for “webinar attended.” Growth wants “viewed pricing.” RevOps tosses in “opened three emails.” Each new input is another shiny gauge on the cockpit. Nobody volunteers to say, “Okay, now reduce it.”
And the CRM doesn’t scream when your scoring math keeps an April ad click equally valuable in December. It quietly nods and keeps handing out tasks.
I once shipped a version that behaved exactly like this. Sales loved the green bars… until the calls felt like time travel.
The Turn
The fix is boring and powerful: decay. As in, “How fast should a signal lose its value?”
If your average deal cycle is six months, engagement from last quarter should already count for less, and anything older than the cycle itself probably shouldn’t count at all. “If it’s older than six months, it’s a memory, not intent,” our sales lead Priya told me. She was right.
So we made the score leak.
Not dramatically. Not with a PhD. Just a steady, predictable fade that sales could understand at a glance.
Keep It Simple (First)
We started with linear monthly decay on anything that moves: ad clicks, site visits, replies, social engagement. The idea was plain: each month, a signal is worth a bit less until it’s worth nothing.
I like the envelope-math version:
New weight = max(0, 1 − months_since_event ÷ shelf_life_months)
If shelf life is 6 months, a March click is at half strength in June and worth nothing by September. Everyone gets that. Nobody has to trust a black box.
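Here’s a minimal sketch of that linear decay; the function and argument names are illustrative, not what our pipeline actually calls them:

```python
def linear_decay_weight(months_since_event: float, shelf_life_months: float = 6.0) -> float:
    """Linearly fade a signal's weight from 1.0 down to 0.0 over its shelf life."""
    if shelf_life_months <= 0:
        return 0.0
    return max(0.0, 1.0 - months_since_event / shelf_life_months)

# A March click scored with a 6-month shelf life:
print(linear_decay_weight(3, 6))  # 0.5 -- half strength in June
print(linear_decay_weight(6, 6))  # 0.0 -- worth nothing by September
```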
Later, sure, we can graduate to exponential decay (half-life models feel more “natural” and handle long tails well). But we earned our way there—by showing the linear version worked and didn’t surprise anyone.
Intent Isn’t Equal
Not all signals should fade at the same speed. A lazy “like” on a social post is a sneeze. A “request demo” is a diary entry.
So we separated low-intent from high-intent and gave them different shelf lives.
In practice:
A page view or ad impression faded fast. Think weeks, not months.
A whitepaper download or pricing page binge lasted longer. Think a quarter.
A demo request or trial signup stuck around. Think half a year (or until a clear “not now” event).
This wasn’t philosophical. It was practical. We mapped shelf lives to how long those actions correlated with meetings booked in our own data. The nice surprise: sales started trusting the score again, because it felt like the reality they lived.
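A rough sketch of how those tiers can live in one small config. The signal names and shelf lives below are placeholders; yours should come out of your own meetings-booked data:

```python
# Illustrative shelf lives per signal (months); tune these against your own funnel data.
SHELF_LIFE_MONTHS = {
    "ad_impression": 0.5,          # low intent: weeks, not months
    "page_view": 1.0,
    "whitepaper_download": 3.0,    # mid intent: roughly a quarter
    "pricing_page_visit": 3.0,
    "demo_request": 6.0,           # high intent: about half a year
    "trial_signup": 6.0,
}

def decayed_signal_value(signal: str, base_points: float, months_old: float) -> float:
    """Apply the signal's tiered linear decay to its base score."""
    shelf_life = SHELF_LIFE_MONTHS.get(signal, 1.0)  # unknown signals fade fast
    weight = max(0.0, 1.0 - months_old / shelf_life)
    return base_points * weight
```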
Make It Visible
We didn’t ship decay as a silent tweak. We showed it.
Side-by-side, before/after. Two weeks of outbound, same reps, same patch. The old model had an impressive to-do list and a mediocre connect rate. The decayed model had fewer tasks and more conversations.
We also added tiny receipts in the record: “This lead’s score decayed 22% in the last 60 days (last engagement: pricing page, Aug 3).” That one line started killing arguments in pipeline reviews. The reason was in the room.
Where It Broke
A few things we got wrong on the first pass:
I once decayed static fields (role seniority, company size). That was silly. Firmographics don’t melt like ad clicks. We fixed it: dynamic signals decay, static signals don’t.
We also forgot review windows. We let scores quietly fall off a cliff without nudging the owner. Cue frantic “why did this disappear?” threads. We added a 7-day “about to go stale” nudge. It turned cliff dives into gentle landings.
And yes, we briefly overweighted ancient webinar attendance because “it was a great webinar.” Nostalgia is not a strategy. We cut it down.
What Worked
The day we aligned decay to our funnel velocity, the noise dropped. Sales started reporting cleaner weeks: fewer zombie follow-ups, more current conversations. Our weekly “why is this scored high?” debates… mostly vanished.
Did everything get better? No. Decay doesn’t write copy or fix routing. But it stopped the system from lying to us.
The best part was cultural. By making time a first-class citizen in scoring, we gave the team permission to stop. To look at an eight-month-old click and say, “We missed it. Move on.” That’s healthier than pretending.
If You Want the Fancy Stuff
Exponential decay is nice once you need nuance. A half-life model (“this signal halves every 30 days”) is intuitive, and you can tune half-lives by intent tier. You can even mix models—linear early, exponential late—if your funnel has distinct phases.
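If you do graduate to half-lives, the math is still one line. This sketch assumes a per-signal half-life in days, and the numbers are placeholders to tune, not recommendations:

```python
def half_life_weight(days_since_event: float, half_life_days: float = 30.0) -> float:
    """Exponential decay: the signal's weight halves every half_life_days."""
    return 0.5 ** (days_since_event / half_life_days)

# With a 30-day half-life: full strength today, half at 30 days, a quarter at 60.
print(half_life_weight(0))    # 1.0
print(half_life_weight(30))   # 0.5
print(half_life_weight(60))   # 0.25
```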
But earn complexity. The only model sales won’t forgive is the one they don’t understand.
The Takeaway
Your AI SDR isn’t a memory palace. It’s a weather report.
Let yesterday’s sun fade. Let today’s storm matter. If your average cycle is six months, don’t let an April drizzle call itself December thunder.
Start with simple, transparent decay. Let intent age at the speed your deals actually move. Then, when the team can feel the honesty in the score, upgrade the math.
Thanks for reading—subscribe if you want the next chapter on half-life tuning and intent tiers.
Caption: The hottest lead in the room was eight months old; the room just didn’t know it yet.

Want To Dive Deeper?
If your marketing and sales team is struggling to meet quota, it’s time to give them a Side Kick!