Network Computing is part of the Informa Tech Division of Informa PLC


Things That Go Bump in the Data Center: Tales of Data Center Horrors


    The end of October means it’s time for costume parties, haunted houses, and your favorite scary movies. We all enjoy a frighteningly fun spooky story at a party or while wrapped up in a blanket at home, but for tech pros, these ghoulish tales come to life year-round, not just around Halloween.

    This year, SolarWinds asked its THWACK community of more than 150,000 tech pros to share some of the real-life scary stories they’ve encountered while working in the data center. A terrifying mishap with a chiller, HVAC unit, generator, or branch circuit, perhaps? Or maybe some eerie element outside the data center that affected their ability to conduct business as usual?

    A recent SolarWinds survey explored tech pros’ confidence in their current skillsets as they contemplate the IT environments of tomorrow. It revealed that despite tech pros’ lack of confidence in managing current and future environments, skill development is still on the back burner. While there’s no easy fix for leaks, accidental blackouts, or other mishaps, these unexpected nightmares, despite the headache, often offer important learning opportunities for everyone from admins to managers.

    Here are a few spooky tales fellow tech pros shared this year:


    The Storm Is Coming

    “My biggest nightmare was when I was called into the data center at night because the power had dropped, and the uninterruptible power supply (UPS) failed to switch over to the generator. All we could do was wait for the UPS technician to arrive and fix the issue. The technician arrived quickly, and the problem was resolved within an hour.

    “After the UPS was fixed, we then spent 45 minutes bringing everything up and verifying it was running smoothly. Everything was going fine until power went out again at 2:00 a.m.—the UPS switched over to the generator, which failed to start up, and the data center went black again.

    “Ultimately, once everything was working again, we were able to bring the data center back online a little after 6:00 a.m.—right before people were to report to work and the state agencies were to open. Success after a very long night.”

    – THWACK member: Richard Phillips, Senior Network Engineer
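
    Outages like this are why many shops wire UPS telemetry into their monitoring, so the switchover failure pages someone instead of going unnoticed until 2:00 a.m. As a minimal, hypothetical sketch (the key/value format follows Network UPS Tools' `upsc` output; the helper names are illustrative), a script can parse the status and flag the moment a UPS drops to battery:

```python
# Sketch: parse NUT-style `upsc` key/value output and flag when a UPS
# is running on battery. Field names follow NUT conventions ("OL" means
# on line power, "OB" means on battery); the helpers are illustrative.

def parse_upsc(output: str) -> dict[str, str]:
    """Turn `upsc` lines like 'ups.status: OB' into a dict."""
    status = {}
    for line in output.strip().splitlines():
        key, _, value = line.partition(": ")
        status[key] = value
    return status

def on_battery(status: dict[str, str]) -> bool:
    """NUT sets the 'OB' flag in ups.status when the UPS is on battery."""
    return "OB" in status.get("ups.status", "").split()

sample = "battery.charge: 87\nups.status: OB DISCHRG\nups.load: 42"
info = parse_upsc(sample)
print(on_battery(info))  # True -> time to alert, before the generator test fails
```

    Polling this every minute and alerting on the transition is enough to catch both the initial power drop and the second failure after the generator refuses to start.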

    “Our network engineer made a lot of changes to the switches in our data center. Everything went smoothly and was working properly afterward. That night, a bad storm caused a power outage, the uninterruptible power supply (UPS) running the switches ran out of juice, and everything shut off. When the power came back, nothing worked. The engineer hadn’t saved the configuration, and everything reverted [to its original state].”

    – THWACK member: chrispacifico, IT Manager – Systems
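
    The painful lesson in that story is the step that got skipped: persisting the running configuration so a power loss can’t revert it (on many switch CLIs, `copy running-config startup-config` or `write memory`). As a hedged illustration, an audit script could diff the two configs and flag the lines that would be lost; the helper below is hypothetical and operates on plain config text:

```python
import difflib

def unsaved_changes(running: str, startup: str) -> list[str]:
    """Return lines present in the running config but not yet saved to startup.

    These are exactly the changes that vanish if the switch loses power
    before someone writes the config to non-volatile storage.
    """
    diff = difflib.unified_diff(
        startup.splitlines(), running.splitlines(),
        fromfile="startup-config", tofile="running-config", lineterm="")
    # Keep only added lines, skipping the '+++' file header.
    return [line for line in diff if line.startswith("+") and not line.startswith("+++")]

startup = "hostname core-sw1\nvlan 10\n"
running = "hostname core-sw1\nvlan 10\nvlan 20\n"
print(unsaved_changes(running, startup))  # ['+vlan 20'] -> would be lost on power loss
```

    Run as a nightly check (pulling both configs from each device), a non-empty result is a reminder to save before the next storm does the reverting for you.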


    Spooky Throwbacks

    “In a previous life, the data center [I worked in] was under the parking deck, so when it rained, it leaked. At one point, we had plastic tarps suspended between the ceiling and the concrete above to channel the water into larger catch basins (and we would have to get up on ladders and use a shop vac to remove the water during storms). Later, we had trays made the same size as the 2x4 ceiling tiles, and we’d only have to remove water from those as needed. It was always fun when a new leak was discovered and we needed to find where it was coming in.”

    – THWACK member: Jfrazier, Global Monitoring Technologies Engineer III

    “Many years ago, we had a file transfer that was failing to one certain machine—when you moved files to it, they would corrupt and crash in the process. We worked on this issue for several days until my boss had this crazy idea. He swapped the network cables between two adjacent machines on the same subnet. Then, he asked us to test it again.

    “The original machine worked like a charm, but the swapped machine started having issues with the same process. We pulled the cable out and put a new one in and everything was working fine again.

    “We later checked the cable with a simple pair tester while wiggling the ends. It never showed an error. My boss said it was a haunted cable and took a set of wire cutters, cut the ends off, and tossed the whole thing in the trash.”

    – THWACK member: knucklebusted, Lead Network Engineer



    “We were adding a secondary fiber connection to all our off-site locations that had two possible service providers. At one location, a third party was running the line from the pole under a parking lot to our building when he hit a gas line. The equipment operator figured it out quickly, and the nearby buildings were evacuated just moments before the leak hit the hot pizza ovens next door. The pizza shop exploded with enough force to cause structural damage to us and other nearby buildings. We didn't open that location for six months.”

    – THWACK member: jm_sysadmin, Sr. Systems Engineer

    “At one of our data centers, the building/facilities services organization determined they needed to test the UPS, generator, and flywheel switchover mechanism. In preparation for the test, they brought in a technician to validate the functionality of the breakers and transfer switch. The technician decided the best method of testing that switch was to flip it, and managed to determine the flywheel was non-functional when two-thirds of the data center went dark.”

    – THWACK member: tsords


    Haunted Red Buttons

    “We had a large data center with a few AS/400s running various applications across Canada. The server room was pretty state-of-the-art at that time, with central cooling, proper cable management, alarms, fire suppression, cameras, and even an emergency power shutdown button.

    “Unfortunately, this button was at ‘butt’ height and didn’t have a protective cover on it. One administrator bent over, hit the button with his behind, and killed power to the entire room, including the AS/400s, shutting down all the enterprise applications across the country. He spent a very embarrassing week recovering the data and getting everything back up and running...”

    – THWACK member: shuckyshark

    “We had a vendor mopping under the raised floor in the data center. To exit the data center, we had a big red button by the door to release the maglock. One of the vendor’s employees instead went to the wall 15 feet away, where there was a small red button under a clear plastic cover labeled Emergency Power Off. Yep, you guessed it: lights off, power off, nothing but the deafening sound of an entire DASD farm spinning down.”

    – THWACK member: Jfrazier, Global Monitoring Technologies Engineer III


    Water Scares

    “I was working for a small ISP, and we had an inspection done on one of the data centers. It was in the basement, and the weekend after the inspection there was a flash flood. I woke up to calls on my cell phone that there were four feet of water in the basement. The data center was destroyed. We restored the VMs to the other data center to get everything working again and spent the next day getting everything cleaned up. Over the next month, we worked with insurance, ordered new equipment, and rebuilt the data center. In the end, we found out that the cause of all the water coming in was a check valve on the sewage line that wasn’t replaced after the inspection.”

    – THWACK member: neoceasar