
Almost a year in: daily error messages from an ex-customer

A customer left us almost a year ago. Their systems still send me error messages every day. Nobody reacts. Apart from me.

The new provider simply doesn't react. Some time ago this customer left us, for business reasons I don't even hold against them. Cheaper, closer, a different setup. That happens. What has happened since then, though, really shouldn't happen. And that's exactly what this is about.

To this day I get several error messages every day from the monitoring chain we set up back then. The system officially runs elsewhere now. But the alerts show it isn't really running. Certificates expire. Cron jobs fail. Somewhere a service logs errors that nobody picks up. The new provider reacts to none of it. Occasionally I send a quick heads-up when it gets critical. Sometimes a thanks comes back, sometimes nothing.
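The post doesn't describe the monitoring chain itself, but to make the kind of alert concrete: a minimal certificate-expiry watchdog, the sort of check that keeps firing here, might look like this sketch. All names, thresholds, and the plain-Python approach are my own illustration, not the original setup:

```python
import ssl
import socket
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    # Parse a certificate's notAfter string, e.g. "Jun  1 12:00:00 2030 GMT",
    # and return the number of whole days until it expires (negative if past).
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

def check_cert(host: str, port: int = 443, warn_days: int = 14) -> None:
    # Fetch the peer certificate over TLS and warn if it expires soon.
    # warn_days is an arbitrary illustrative threshold.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    remaining = days_until_expiry(cert["notAfter"])
    if remaining < warn_days:
        print(f"WARNING: {host} certificate expires in {remaining} days")
```

Run from cron, a handful of lines like this is enough to produce exactly the daily mail stream the article describes, long after anyone has stopped reading it.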

Why I still do it at all

A fair question would be: why do you even care? They're not your customer any more. The mails could go into a rule, done. Honestly, I've thought about that. Several times. But I still don't do it, and I think it has to do with something deeper than customer retention.

I can't look away from certain things. When I see a productive system running for hours with an expired certificate, an old reflex wakes up in me. Not for the customer. For the system. For the people who maybe can't use it right now. For the whole quiet nonsense that happens when nobody's watching. That may sound old-fashioned. I'd rather be old-fashioned and sleep peacefully than cynical and efficient.

And one more thing: through this experience I see every day how much we used to do for the customer without them noticing. They knew their systems were running. They didn't know that someone at our place looked every day to see if they were running and quietly straightened things out before they became problems. That invisibility is part of the job. But it has consequences when it falls away.

What this case says about our industry

I don't know the new provider personally. Maybe they have good reasons not to react. Maybe they scoped the contract differently. Maybe the alerts simply aren't visible in their setup. But what I see from a distance is a symptom of a bigger pattern: many IT providers optimise for the visible. Everything the customer sees gets attended to. Everything they don't see gets left until the next outage.

That works, right up until the outage arrives. And then it usually costs many times what the maintenance would have cost. I've seen that often enough in my career to take a different path in my own agency. We deliberately invest in the invisible layer. That doesn't make us cheaper. It makes us calmer, and it makes our customers calmer.

What I take from the situation

I've learnt several things from this ex-customer case. First: loyalty is an asset, even when it's no longer being paid for. Anyone who knows I'll still send a mail after the contract ends, if something's burning, thinks twice before replacing me at all. That's not why I do it, but it's a nice side effect.

Second: handovers are the hardest discipline in our craft. Not because they're technically complicated, but because they demand honesty, on both sides. The old provider has to really lay everything open. The new one has to be willing to inherit reality, including all the alerts they'd rather not see. If either of those steps doesn't happen, you land exactly in the situation I'm describing here.

Third: I still believe in this industry. Despite everything. They're out there, the teams that quietly do good work. They're out there, the providers who take an alert seriously even when nobody is checking up on them. If I have a wish, it's this: that customers dare again to choose not the cheapest provider, but the one who still gets in touch after the handover, when something's burning.

The mails keep coming. I'll keep reading them. Not because I have to. Because I think systems deserve a person who at least looks.