A few days ago, a client’s data center "vanished" overnight.

Uncategorized
Tags: sysadmin, horrorstories, ithorrorstories, monitoring
111 posts, 46 commenters

• stefano@mastodon.bsd.cafe:

    @marios @EnigmaRotor consider this: https://it-notes.dragas.net/2024/07/22/install-uptime-kuma-freebsd-jail/

saupreiss@pfalz.social
#93

    @stefano

    That sounds like a hell of a yell for an official port…

    @marios @EnigmaRotor

• pedro@social.bufu.link:
      @stefano why do you assume that via 4G there would be connectivity? I don't get this part, what am I missing?

stefano@mastodon.bsd.cafe
#94

@pedro if the two FTTH providers are down, the router will use the failover 4G connection to reach my VPN (and alert me).
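
The same principle, alert paths that fail independently, is easy to sketch. A minimal Python illustration (all endpoint URLs are hypothetical, not Stefano's actual setup): try each transport in order and settle for the first one that still answers.

    import urllib.request

    # Hypothetical alert endpoints, in order of preference. The first is
    # reached over normal FTTH routing; the second assumes the 4G failover
    # VPN tunnel is up. All addresses are placeholders.
    ALERT_PATHS = [
        "https://monitor.example.com/alert",  # via FTTH 1 or FTTH 2
        "http://10.8.0.1:8080/alert",         # via the 4G VPN tunnel
    ]

    def send_alert(message: str) -> bool:
        """Deliver the alert over the first path that still responds."""
        for url in ALERT_PATHS:
            try:
                urllib.request.urlopen(url, data=message.encode(), timeout=10)
                return True
            except OSError:
                continue  # this path is down; try the next transport
        return False  # every path failed: exactly the silence to worry about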

• stefano@mastodon.bsd.cafe:

        @elaterite @_elena Fair question 🙂
        I'm just relaying what I was told and what I know about the company, for which I've been providing some services for many years. The details came directly from their internal manager and, honestly, I didn't have much interest in digging deeper into the technical specifics of the incident.
My focus was simply making sure their servers were back up and running and that their data was safe. Everything else (electrical infrastructure, physical security, and similar aspects) is outside of my scope and handled by other people.

elaterite@mastoart.social
#95

        @stefano Indeed. Still would be interesting to find out about the details of the infrastructure failure and how they pulled it off. Sounds like a good story for a documentary, especially if this is something that has happened in the past.

        @_elena

• stefano@mastodon.bsd.cafe (the original post):

          A few days ago, a client’s data center "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

          I then suspected a power failure, but the UPS should have sent an alert.

          The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

          To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

          The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

          That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing an internal alert to be triggered from the inside.

          The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

          The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

          Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

          Never rely only on internal monitoring. Never.

          #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
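
That lesson translates almost directly into code. As a rough sketch of what an external check like that Uptime Kuma instance does, here is a hypothetical cron-driven probe (the host addresses, webhook URL, and Linux-style ping flags are all assumptions) that treats total silence from a site as its own alarm condition:

    import subprocess
    import urllib.request

    # Hypothetical monitored IPs at the client site, plus an alert webhook.
    HOSTS = ["203.0.113.10", "203.0.113.11", "203.0.113.12"]  # router, UPS card, server
    WEBHOOK = "https://alerts.example.com/notify"

    def is_up(host: str) -> bool:
        """One ICMP probe with a 2-second wait (Linux ping flags)."""
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", host],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return result.returncode == 0

    def check_site() -> None:
        down = [h for h in HOSTS if not is_up(h)]
        if len(down) == len(HOSTS):
            # Every host dark at once points at power or uplink loss,
            # not a single crashed service, so escalate immediately.
            msg = f"ALL {len(HOSTS)} hosts unreachable: {', '.join(down)}"
            urllib.request.urlopen(WEBHOOK, data=msg.encode(), timeout=10)

    if __name__ == "__main__":
        check_site()  # run from cron on a machine OUTSIDE the monitored site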

indyradio@kafeneio.social
#96

          @stefano that's impressive. meanwhile I accidentally stumbled on your website:
          You have shared many useful items in a thoughtful way. I appreciate it, and am glad to let you know. 😀

• In reply to stefano@mastodon.bsd.cafe (#94):

indyradio@kafeneio.social
#97

@stefano @pedro power line monitoring is important even for "normal" failures, because some are destructive.
Since 9/11 there are a few new spooky things, and one is modulating the power with pulses.

• stefano@mastodon.bsd.cafe:

@bojanlandekic thank you! I'm just trying to spread some real-life experiences.

bojanlandekic@mastodon.social
#98

@stefano it is the criminals among us who make life difficult for all. Not even the greatest sci-fi authors have been able to imagine what a beautiful and fun future we would all have without them!

• In reply to stefano@mastodon.bsd.cafe (the original post, quoted above):

caymanpilot@mastodon.social
#99

                @stefano
                Wow! Cool story

• In reply to elaterite@mastoart.social (#95):

stefano@mastodon.bsd.cafe
#100

                  @elaterite @_elena The police are investigating, and I know some technicians are scheduled to go over in the next few days. There will also be an insurance report, so I’ll try to get some more information.

• In reply to indyradio@kafeneio.social (#96):

stefano@mastodon.bsd.cafe
#101

                    @indyradio thank you!!!

• In reply to stefano@mastodon.bsd.cafe (the original post, quoted above):

itthinx@mastodon.social
#102

                      @stefano Great story and appropriate setup!

• In reply to stefano@mastodon.bsd.cafe (the original post, quoted above):

dougiec3@libretooth.gr
#103

                        @stefano
                        Hey! Thanks for the inside story! I love happy endings.

• In reply to indyradio@kafeneio.social (#97):

pedro@social.bufu.link
#104

@indyradio @stefano modulating power with pulses? What is that? How does that work? What does it achieve?

I have so many questions...
Honestly, I know nothing about electrical wizardry; I went too deep into computer science and never really touched that layer much.

• In reply to stefano@mastodon.bsd.cafe (#94):

pedro@social.bufu.link
#105

@stefano how do you think they managed to burn 4G? I suppose the battery for 4G should not even be in the same "grid" as the other stuff, right? (I'm not sure anymore if I know how electricity works; guess I always took it for granted)

• In reply to stefano@mastodon.bsd.cafe (the original post, quoted above):

ondrejzizka@mastodon.social
#106

                              @stefano Thanks for all the info about the company's internal setup.

• In reply to stefano@mastodon.bsd.cafe (the original post, quoted above):

gme@bofh.social
#107

                                @stefano@mastodon.bsd.cafe
In the critical infrastructure sector, controls are designed to fail open (as in, break the circuit), and monitoring systems all have watchdog timers. If an "I'm still here!" ping is not received when it's expected, an alarm goes off.

                                I say this not to distract from your original point.

                                External monitoring for critical systems is a must.
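
That watchdog pattern is essentially a dead man's switch: the protected site must keep proving it is alive, and silence itself trips the alarm. A minimal Python sketch of the monitoring side (the port, intervals, and alert action are all hypothetical):

    import threading
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    HEARTBEAT_EVERY = 60          # seconds between expected "I'm still here!" pings
    GRACE = 3 * HEARTBEAT_EVERY   # tolerated silence before the alarm trips
    last_seen = time.monotonic()

    class Heartbeat(BaseHTTPRequestHandler):
        def do_POST(self):
            global last_seen
            last_seen = time.monotonic()  # the site checked in; reset the timer
            self.send_response(204)
            self.end_headers()

    def watchdog() -> None:
        while True:
            time.sleep(HEARTBEAT_EVERY)
            if time.monotonic() - last_seen > GRACE:
                # Replace with a real notification: mail, SMS, webhook...
                print("ALARM: no heartbeat received; site may be dark")

    # Run this on a machine that shares nothing with the monitored site.
    threading.Thread(target=watchdog, daemon=True).start()
    HTTPServer(("0.0.0.0", 8000), Heartbeat).serve_forever()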

• In reply to pedro@social.bufu.link (#105):

stefano@mastodon.bsd.cafe
#108

@pedro the 4G router was connected to the same UPS. So it wasn't destroyed, just off.

• In reply to ondrejzizka@mastodon.social (#106):


stefano@mastodon.bsd.cafe
#109

                                    @OndrejZizka I never named the company 😉

• In reply to stefano@mastodon.bsd.cafe (the original post, quoted above):

mixdup@mastodon.social
#110

@stefano And while not relying on internal monitoring, make sure your external monitoring doesn't share anything with the monitored systems: different ISP, different cloud provider if in the cloud, no shared infra at any level.

• In reply to stefano@mastodon.bsd.cafe (the original post, quoted above):

sharquaydius@mastodon.social
#111

                                        @stefano zapping the power lines, eh? Looks like the perfect solution to my nuisance neighbors with the big loudspeakers.
