A few days ago, a client’s data center "vanished" overnight.

Uncategorized · Tags: sysadmin, horrorstories, ithorrorstories, monitoring · 111 posts, 46 commenters
• darkling@mstdn.social:

    @stefano Fortunately, the only thing that did fail after the aircon was the switch. (And a pair of ear muffs which had been hanging on a metal rail -- they'd melted.)

    The fire brigade turned up, checked everything, and ran some big positive pressure fans to get airflow through the room from one door to the other to cool everything down.

• robert@irrelevant.me.uk replied (#71):

    @darkling @stefano
    Ferranti Computer Systems, Cheadle (UK) circa 1982. I was a lowly apprentice, at the time working in the department that oversaw the various VAXen that most of the site used. Three full size machines and a handful of microVAX. Kept cool by *three* massive air conditioner units on the external wall. The server room was always chilly. /cont

• stefano@mastodon.bsd.cafe (original post):

      A few days ago, a client’s data center "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

      I then suspected a power failure, but the UPS should have sent an alert.

      The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

      To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

      The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

      That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing an internal alert to be triggered from the inside.

      The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

      The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

      Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

      Never rely only on internal monitoring. Never.

      #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
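
The takeaway generalizes: an external check only needs to probe the site's public endpoints from somewhere outside the network and escalate when everything goes silent at once. Below is a minimal Python sketch of that idea; the IP addresses, ports, and webhook URL are placeholders, not details of Stefano's actual Uptime Kuma setup, and it would run from cron or a systemd timer on a host outside the monitored site.

```python
#!/usr/bin/env python3
"""Minimal external reachability check, in the spirit of the Uptime Kuma
setup described above. All hosts and the webhook URL are placeholders."""

import json
import socket
import urllib.request

# Hypothetical public endpoints of the monitored site (host, TCP port).
TARGETS = [
    ("198.51.100.10", 443),  # e.g. router management over HTTPS
    ("198.51.100.11", 22),   # e.g. a jump host's SSH
]

ALERT_WEBHOOK = "https://alerts.example.org/hook"  # placeholder endpoint
TIMEOUT = 5  # seconds per connection attempt


def is_reachable(host: str, port: int) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT):
            return True
    except OSError:
        return False


def main() -> None:
    down = [(h, p) for h, p in TARGETS if not is_reachable(h, p)]
    if down and len(down) == len(TARGETS):
        # Every endpoint is silent at once: the same signature as the incident
        # above, so escalate instead of treating it as a single-host failure.
        payload = json.dumps({"alert": "site unreachable", "targets": down}).encode()
        req = urllib.request.Request(
            ALERT_WEBHOOK,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req, timeout=TIMEOUT)
    elif down:
        print(f"partial outage: {down}")
    else:
        print("all targets reachable")


if __name__ == "__main__":
    main()
```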

• rasteri@mastodon.scot replied (#72):

      @stefano I wonder how they generate a big enough power surge.

• _elena@mastodon.social:

        @EnigmaRotor reading this at lunch in a cafe near my house and I keep chuckling and smiling from ear to ear. @stefano is such a treasure 🙌🏆

• enigmarotor@mastodon.bsd.cafe replied (#73):

        @_elena @stefano And the café is the treasure island (“X” marks the place). 🎶“Heeeeee is a pirate, a jar of whiskey and a bottle of winnnneeeeee”🎶. Well that was a spontaneous Jack Sparrow moment. Sorry!

• robert@irrelevant.me.uk continued (#74):

    @darkling @stefano
    Until one morning I arrived to chaos. One of the aircons had failed, and the others, overstressed, had completely iced up; the reduced airflow had caused the temperature in the room to rise. It was pretty much the hottest I'd ever encountered anywhere!
    Fire doors and internal doors were propped open to get a bit of airflow, and the blocked aircons were turned off. The heat then had a chance to melt the ice, and they could be brought back online later. I think it took all day.

• jamesoff@mastodon.jamesoff.net:

    @mkj @stefano @rhoot oh if audio's getting involved, you can use `ping -a` 😄

• mkj@social.mkj.earth replied (#75):

    @jamesoff `ping -af` 🙂

    @stefano @rhoot

• jamesoff@mastodon.jamesoff.net replied (#76):

    @mkj @stefano @rhoot "i don't even see the pings any more... it's just blonde, brunette, airhorn"

• falkappel@sueden.social replied (#77):

    @rasteri Probably easy: you just need a big capacitor and a Van de Graaff belt generator (that thing from physics in school), and whoosh, enough voltage and current to melt e.g. a screwdriver (did that in school 😅) @stefano

• angel@triptico.com replied to the original post (#78):

    Oh my 😱
• penguin42@mastodon.org.uk replied to the original post (#79):

    @stefano There was an attack a few years back near here where they dropped burning rubbish into manholes around a data centre; the theory at the time was that it was to try and cut off some CCTV or alarm monitoring for something. Well caught!

• fisher@toots.nu replied to the original post (#80):

    @stefano Cool story bro, but it's too fictional, I'd say.
    First off, as a Ukrainian, I know that power lines can survive "the spikes" by just cutting the power at the very input. No damage to the equipment behind the input circuit breaker, nope. You just get a damaged input.
    Next, I used to work in a bank, and there we had a clear requirement for the data storage center: more than one power input is a must.

• fisher@toots.nu continued (#81):

    @stefano
    Third, given it's a data center, power consumption is probably tens of kW. The "gang" could well get killed playing with it.
    Fourth, if there is a power spike and a cut-off, it won't go unnoticed by those who control the power lines. They will be the first on site to see what happened.

• connynasch@mastodon.social replied to the original post (#82):

    @stefano thank you for this knowledge, I have boosted it for reference for others. 🤗

• fisher@toots.nu added (#83):

    @stefano but otherwise it's a cool horror story, yeah 😃

• stefano@mastodon.bsd.cafe:

    @_elena Thank you! Sure, I will 👍
    But, to be honest, I don't think any of those stories will ever be a film.

    The biggest, scariest one is yet to come, anyway...

• elaterite@mastoart.social replied (#84):

    @stefano I don't know, you told this short story like a pro. Starts out, ya, data center suddenly goes dark over the holidays, UPS fails, kinda ya, ya, still interesting, then you introduce the gold, two-meter-thick walls, professional thieves, wow, that's some drama! Although, I wonder how they were able to send such a massive power surge down the lines and why the bus mains didn't blow before the equipment was damaged. Looking forward to your next tale!

    @_elena

• enigmarotor@mastodon.bsd.cafe:

    @stefano Oh, if the genre is horror, then don’t forget to tell the tale of the guy who pronounced “Microsoft” 3 times before his mirror. What happened next, the blue mirror of death, is frightening to the bones.

• balderdoordash@mastodon.social replied (#85):

    @EnigmaRotor @stefano or the case of the red fire button killer

• gbsills@social.vivaldi.net replied to the original post (#86):

    @stefano thanks for sharing this.

• zeitverschreib@freundica.de shared this topic.
• pedro@social.bufu.link asked (#87):

    @stefano why do you assume there would be connectivity via 4G? I don't get this part, what am I missing?
• wwberrutti@ohai.social replied to the original post (#88):

    @stefano This is pretty important knowledge to have! I bookmarked it for future reference!

• saupreiss@pfalz.social replied (#89):

    @pedro

    Also 4G needs power.

    @stefano

• stefano@mastodon.bsd.cafe replied (#90):

    @fisher It's not a bank, and it's definitely not a top-tier data center. It's just a trading company in a typical industrial area in Northern Italy, with fewer than 50 employees. They've only got four aging servers in a rack. Facilities like this don't need bank-grade security or the kind of resilience you'd find in a war zone. I wish it were fictional! 😃
