A developer at one of my portfolio companies had been missing deadlines for two months.

Inconsistent hours.

No proactive communication.

A ghost in Slack.

The client project sat in a holding pattern, costing relationship capital I didn't have to spare.

So, after it came up in a few meetings, I flagged it with his director and gave some thoughts on how to handle it.

Within 48 hours, something changed. Online early. Tasks closed fast. Responsive. The director noticed. I noticed. The sentiment was “problem solved, no worries now”.

So… I moved on.

Six weeks later, everything reverted. The project slipped, again. The client relationship cracked, again. And I was back in the same conversation I thought I'd closed, again.

I thought we already solved this?

Not exactly. I'd celebrated an audition of what his performance could be - not a promise of what it would be.

The sudden performance spike was a signal that he knows what good looks like, chooses it when the pressure is on, and stops the moment it lifts. There isn't a list of possible explanations here. There's one: he simply doesn't want to be great in this role. But he will perform whenever he knows his back is against the wall.

That's not a turnaround. That's a recurring tax on the team around him.

At Pneuma, I watched this play out with a senior account lead. Strong two months after a hard conversation. I called it progress. Three months later, same patterns. Same conversation. Harder the second time because I'd already given credit I hadn't verified.

The mistake, in both cases, wasn't caring about improvement. It was moving on before I'd seen discipline, not just effort.

Working Theory: The 2-8-90 Stickiness Test

Before calling any performance change real, run three checkpoints - and measure against something concrete, not a feeling.

1. 2 days: Name the driver. What specifically changed? One thing. If the only thing that changed was that you had a hard conversation, that's not a driver - that's a warning shot. Name the actual adjustment: workload restructured, scope narrowed, direct feedback on one specific behavior.

2. 8 weeks: Track 2-3 measurable indicators without hovering. For client-facing roles: NRR movement, client satisfaction scores, late deliverable rate. For internal roles: on-time task completion, communication cadence, output per week. You're looking for consistency across different conditions - not a burst while the spotlight is on. Can they explain what changed and why it's working? Can they, and will they, hold to it?

3. 90 days: Hard proof. The behavior holds when the pressure is off. One process improvement they've documented themselves. It's routine - not reactive.

The 8-week window is where most stop watching, myself included. Don't make that same mistake :)

Don't trust an audition. Trust discipline.
