A few days ago, a client’s data center "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Nothing responded over 4G either.
I then suspected a power failure, but the UPS should have sent an alert.
The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick and dealing with a serious family issue, but he got moving.
To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter-thick walls, and they were targeted by a professional gang. The gang used a tactic seen in similar hits: identify the main power line, tamper with it at night, and send a massive voltage spike through it.
The goal is to fry all alarm and surveillance systems. Even battery-backed ones rarely survive a surge like that. The thieves count on the fact that during the holidays the owners are away and fried systems can't send alerts, while monitoring companies often run on reduced staff and might not notice the "silence" immediately.
That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing anything on the inside to send an alert.
The team rushed to the site and found the mess. Luckily, they got hold of an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS for a spare and everything came back up.
The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy the servers and wipe any video evidence.
Nothing happened in the end. But in the meantime I had to sync all their data off-site (thankfully they have dual 1 Gbps FTTH), set up an emergency cluster, and make sure everything was redundant.
Never rely only on internal monitoring. Never.
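If it helps to picture it, the external check boils down to something like the sketch below. This is only a rough illustration of the idea, not my actual Uptime Kuma configuration; the IPs, ports, and alerting are placeholders.

```python
import socket

# Placeholder targets: in reality these would be the site's public IPs/services.
TARGETS = [("203.0.113.10", 443), ("203.0.113.11", 8291)]

def reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# This runs from the *outside*. If every target goes silent at once,
# raise the alarm through an out-of-band channel (email, push, etc.).
if not any(reachable(host, port) for host, port in TARGETS):
    print("ALERT: all monitored endpoints are unreachable")
```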
-
@stefano that advice also applies to monitoring scheduled backup jobs (or any other automated process). I use a service that emails me if I don't hit a specific URL roughly every 24 hours, and I hit that at the end of my backup job if it was successful.
Better than finding out the hard way, at some point down the road, that something broke my backup job and it hasn't run for the last month.
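Roughly, the tail end of my backup job looks like the sketch below (a minimal example; the ping URL and backup command are placeholders, not my real ones):

```python
import subprocess
import urllib.request

# Placeholder URL: the real one comes from the heartbeat/"dead man's switch" service.
PING_URL = "https://example.invalid/ping/my-backup-check"

# Run the actual backup job (placeholder command).
result = subprocess.run(["/usr/local/bin/run-backup.sh"])

if result.returncode == 0:
    # Report success only if the backup really finished cleanly.
    # If this request stops arriving, the service emails an alert.
    urllib.request.urlopen(PING_URL, timeout=10)
```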
@rhoot exactly, that's the right approach. I'm using something similar.
-
@stefano Oh, if the genre is horror, then don’t forget to tell the tale of the guy who said “Microsoft” three times in front of his mirror. What happened next, the Blue Mirror of Death, is chilling to the bone.
-
I am quite keen to look into Uptime Kuma. Our current monitor is antiquated.
On a side note, you guys are hilarious! I genuinely had a good laugh at your comments.
-
@stefano Sounds like a case of either good design or *very* good luck too that the UPS took the brunt of it.
We can't protect against everything, but we *can* have an idea for what to do when the unimagined happens.
-
@EnigmaRotor @marios @stefano You mean like, whatever happened to those crocodile pits and spike traps we used to see in the old Fu Manchu movies?
-
@EnigmaRotor @marios exactly. Life is hard - let's make it a little funnier
-
@mkj yes, that is (was) a very good UPS and it did its job.
-
@stefano That’s a rather cool war story. Great for a lecture.
-
@stefano @mkj Aye, there are different kinds of SPDs (surge protective devices), and unfortunately electrical installers usually go for the cheapest ones unless you specifically ask for certain specs when getting a quote. As always, it comes down to money, but at the very least the customer should be told about the limitations.
-
@toxy this will probably be a longer blog post (with some more details)
-
That’s part of the concept. I think we do need and deserve to get smiles on our faces, as often as we possibly can.