Bug that wiped customer data saved the day – and a contract
- Reference: 1773041407
- News link: https://www.theregister.co.uk/2026/03/09/who_me/
This week, meet a reader we'll Regomize as "Caleb" who told us of the time he worked for an internet service provider that had problems with one of its biggest clients.
"They constantly complained about their network being slow and their internet access being even slower," Caleb told The Register .
"As this was one of our largest clients at the time, my boss grabbed me and we visited the customer."
The first thing Caleb and his boss saw was a collection of 3Com routers the client used to connect to the ISP and their other offices over IPX.
"I looked at the config and noticed the customer did not have a default route set," Caleb admitted. He wasn't sure if that was the problem, so he made some changes he thought might be useful.
The router Caleb worked on then rebooted, which he expected. But when it restarted, its previous configuration was gone. This was unexpected, but Caleb later discovered it was a known bug with the routers, perhaps due to dodgy NVRAM.
Caleb and his boss decided to restore the IP connection and figured the IPX side of the network was the client's problem.
Once they finished the job, Caleb and his boss noticed that the client's internet access had become blazing fast.
"We showed them proof that their own IPX networking setup was the reason for their slow internet. They were very happy and remained a customer for several years."
Looking back on the job, Caleb thinks it was his mistake that saved the day.
"If that router hadn't eaten its own configuration, we would have lost them as a customer," he wrote.
Have your mistakes turned into miracles? If so, [10]click here to send your story to Who, Me? We'd love the chance to tell your tale on a future Monday. ®
[10] mailto:whome@theregister.com
Well they did get to the route of the problem.
Managed to stop them switching.
I was worried the customer was about to firewall them
They switched configuration
I'd imagine the loss of that customer may have taken a byte out of their income but they managed to save the gig.
Several times I've seen messed-up network configurations successfully contain malware infestations.
The worrying bit was the way the IT department congratulated itself about this; they refused to believe they shouldn't celebrate having dodged a bullet by tripping over their own untied shoelaces.
BDTD.
It's not good that your digital infrastructure is a potpourri of old VMs, obsolete OSes, and abandoned servers running for years with no purpose, but when you get a full-blown intrusion (because of a gaping hole in a third-party piece of software) and the ransomware crew fails to encrypt the really juicy targets (and in several other cases their tool crashed), it kinda makes your day.
so .. er ?
having trouble following this
So they turned up, blamed the internal crappy IPX layout, had a go at fixing that, failed, then looked at the IP part they were actually responsible for and fixed that too, which apparently made a difference even though the crappy IPX weakest link was still faulty?
So what's the story? If they hadn't failed to fix the IPX, they might not have looked at the IP?
This sounds a bit like every time I've had to ring an ISP having diagnosed that the fault is definitely prior to the incoming connection, and having to sit through hours of scripts read out to me about how the iPad might not be so fast in the garden shed and the kids might be downloading Fortnite.
Re: so .. er ?
My understanding is that they went to investigate why a customer's internet access was poor.
While doing so, "Caleb" spotted a setting in the - customer owned - routers which was perhaps incorrect. He tweaked it and it rebooted. Unfortunately the routers had a bug which caused the rest of their configuration to be reset.
Caleb set the router up from scratch, and suddenly the customer internet connection was working properly.
Re: so .. er ?
In other words ... Luck !!!
Sometimes it works in your favour ...
:)
Accidental on purpose?
I was visiting a customer and was peripherally involved in a problem they had.
They had a home-grown configuration file which had had change upon change made to it – to the extent that no one knew how it worked. They had had meetings to discuss rewriting it, but thought the risk of it not working was too high.
I was at the customer for the weekend while they did a major upgrade, and they wanted all support people in. One person I was working with thought he could improve their configuration program.
At midnight, he found his changes made it worse; then he found he had been updating the master copy, and when he backed out his changes it still didn't work. The previous backups were corrupted, so could not be used.
He bit the bullet and rewrote it from scratch.
I went to the hotel at 0600, and came back at 1500 for the next day's support. He was still there. In the middle of the night I found him asleep at his desk (for an hour or two).
Sunday night about 8pm the major upgrade was successful, and the rewrite worked.
The guy was hyperactive from coffee and doughnuts. He said that most of the work was understanding what was still needed, and what was junk left over from earlier systems.
His boss, when told, was happy and annoyed (more happy than annoyed).
As I was checking some documentation on his laptop, I noticed some files saying backup of....
When I mentioned it, he said, "You didn't see them. I sort of caused the backups to be unreadable. I just wanted to fix this application."
"If that router hadn't eaten its own configuration, we would have lost them as a customer," he wrote.
So they made a packet out of them