Network Wiring Nightmares


Posted on Wed Aug 03, 2011 1:43 pm

While commenting on APWNH's home networking thread (viewtopic.php?f=14&t=77465), I was reminded of my last job, where I served as the IS manager. One of my projects was to untangle the wiring nightmare I had inherited in the core networking closet. A picture (or four) is worth a thousand words, so without further ado:
This was the state of affairs I was tasked with "managing". Somewhere in the lower-left corner, buried under a mess of cabling, is a 9-slot Cisco 6500 series switch with a number of 48-port 10/100 and gigabit cards, a couple of fiber cards, and I think one or two blanks. On the right were the patch panels for everything ever made. I think there were about 250 ports, roughly 150 of which were active.
Image
The time came to replace the switch under lease, and I knew I wasn't going to just replace it with the same thing. We decided to go from a 9-slot chassis to a 4-slot to handle the fiber and a few gig cards, and then use separate workgroup switches linked back to the core over gigabit copper for access to the majority of the clients. Bandwidth wasn't really a huge concern - the workgroup switches were essentially daisy-chained together, so all 5 of them shared 2 gigabit uplinks. There wasn't any heavy file sharing/moving going on - mostly network file access and email - and the few clients that needed better speeds got hooked directly to the core's gigabit ports. The impetus for the workgroup switches was wiring manageability: keeping the runs from patch panel to switch much shorter and easier to manage. First, though, we removed all the unused wiring we could - there were lots of inactive ports, but you just couldn't tell because of the spaghetti monster that had taken over:
Image
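As a sanity check on the "bandwidth wasn't a huge concern" call, here's a back-of-the-envelope worst-case oversubscription calculation. This is just a sketch - the 48-port-per-switch figure matches the switches mentioned in the thread, but the exact per-switch port counts are an assumption:

```python
# Rough worst-case oversubscription estimate for the daisy-chained
# workgroup switches described above: 5 switches sharing 2 gigabit
# uplinks back to the core, each with 48 10/100 access ports (assumed).

UPLINK_GBPS = 2 * 1.0          # two shared gigabit uplinks to the core
SWITCHES = 5                   # workgroup switches in the daisy chain
PORTS_PER_SWITCH = 48          # 10/100 access ports per switch
ACCESS_MBPS = 100              # per-port line rate

# Theoretical worst case: every access port saturated at once.
worst_case_gbps = SWITCHES * PORTS_PER_SWITCH * ACCESS_MBPS / 1000
oversubscription = worst_case_gbps / UPLINK_GBPS

print(f"Worst-case demand: {worst_case_gbps:.0f} Gb/s")
print(f"Oversubscription ratio: {oversubscription:.0f}:1")
```

A 12:1 worst-case ratio sounds scary on paper, but for an office doing mostly file access and email, actual concurrent demand never gets anywhere near line rate on every port, which is why this design held up fine in practice.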
Here's a close-up of the strategy we employed - a 48-port switch sandwiched between a few sets of patch panels so we could use 6-inch, 1-foot, and 2-foot patch cords.
Image
And here is the final result. Performance was the same or better, management was actually possible, and the wiring was easy to trace, remove, and change as needed.
Image
Has anyone else had similar nightmare wiring to deal with? How did you handle it?
i7-3770K | Asus P8Z77-V LK | 8GB DDR3-1600 | HD5850 | 128GB 840 Pro | Samsung F3 1TB | U2412M | Define R4 | Seasonic 520W M12II | Win7 Pro x64.
frumper15
Gerbil Team Leader
Silver subscriber
 
 
Posts: 237
Joined: Mon Jan 18, 2010 3:25 pm

Re: Network Wiring Nightmares

Posted on Wed Aug 03, 2011 7:47 pm

I worked in a datacenter with a raised floor that had foot-thick piles of cable in some places. They said it used to be worse - there used to be spots where the cables would push up the tiles.

Then they installed Cisco duplex panels randomly around the floor, but they only documented the locations of the server and switch ends.

Then there was the other datacenter, which had an insane number of patch panels between the servers and the switches. I want to say there were fifteen hops, but the logical part of my brain says that sounds absurd. Anyway, it was a stupid amount, and they couldn't test end-to-end because of the attenuation from all of the connections between the two endpoints.

What did we do? We used lots of profanity.
Flatland_Spider
Gerbil Elite
 
Posts: 824
Joined: Mon Sep 13, 2004 8:33 pm
Location: The 918/539

Re: Network Wiring Nightmares

Posted on Thu Aug 04, 2011 9:04 am

That's impressive. How did you handle the cleanup? How much downtime did that take?
Naito
Gerbil First Class
 
Posts: 197
Joined: Mon Feb 24, 2003 4:24 pm

Re: Network Wiring Nightmares

Posted on Thu Aug 04, 2011 4:55 pm

Naito wrote: That's impressive. How did you handle the cleanup? How much downtime did that take?

We got some reports from the network admin based on port activity on the switch itself, and anything that hadn't been active for more than 90 days was considered unused (the biggest problem was that ports would be made active for visitors, other office arrangements, or conference rooms, and were never unplugged or turned back off). We then painstakingly burrowed our way into the snake pit and removed all the inactive ones. I think only 1 or 2 should have been left on, and we discovered that a week or two later, as they belonged to people who were just getting back from vacation or medical leave. After those 75 or so cables were removed, things looked a little better. I think I have an intermediate picture - hold on [goes to look for picture...]
Image
Ok, I guess it doesn't look much better. More impressive is the amount of cable we removed:
Image
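The 90-day audit step described above could be sketched as a small script - a sketch only, assuming the admin's port-activity reports can be boiled down to a port-to-last-activity mapping (the port names and data format here are hypothetical, not from the actual switch):

```python
from datetime import datetime, timedelta

# Sketch of the 90-day inactivity audit described above. The input
# (port name -> last-activity timestamp) is hypothetical; in practice
# the network admin pulled equivalent reports from the switch itself.
INACTIVITY_CUTOFF = timedelta(days=90)

def find_inactive_ports(last_activity, now=None):
    """Return ports whose last recorded activity is older than the cutoff."""
    now = now or datetime.now()
    return sorted(
        port for port, last_seen in last_activity.items()
        if now - last_seen > INACTIVITY_CUTOFF
    )

# Example: two stale visitor/conference-room ports, one active desk port.
report = {
    "Gi1/0/12": datetime(2011, 3, 1),   # conference room, idle since March
    "Gi1/0/13": datetime(2011, 2, 15),  # visitor port, idle since February
    "Gi1/0/24": datetime(2011, 7, 30),  # recently active desk port
}
print(find_inactive_ports(report, now=datetime(2011, 8, 3)))
```

The one gotcha the post calls out - people on vacation or medical leave showing up as "inactive" - is inherent to any activity-based cutoff, so a grace period and a quick way to re-enable a port are worth planning for.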

After things were cleaned up as well as they could be with the existing configuration, we went about inserting the new 1U 48-port switches between the patch panels. Then came a bit of finagling to get a list of "special" clients with specific port needs. For example, some heavy data users needed gigabit ports instead of the standard 10/100, and we also made one of the new switches PoE (Power over Ethernet) for a second-phase VoIP phone rollout we were working on at the same time. With that information in hand, we brought up the new switches (core and workgroup) and spent a night pulling the old long spaghetti apart and making our nice short connections to the new switches. Here's about halfway through the night:
Image
In reality, the longest downtime anyone experienced was the 15 minutes or so at the beginning when we switched the new core to being the master (this is my understanding of what was going on, in non-technical terms - I'm not a network admin by any means), and that required a bit of preparation to make sure our production servers didn't freak out. After that, it was really only 15 to 30 seconds per port as we unplugged the old and plugged in the new. I think it went pretty smoothly, and doing it overnight meant very little interruption of service for anyone.

Looking back, it's probably one of my most satisfying projects in terms of the impact it will continue to have, and I wanted to share it in case it inspires anyone else in their work. FYI, I somewhat stole the idea of arranging things the way I did from these folks: http://www.neatpatch.com/ - but we didn't actually have ENOUGH space to use their product (except for one, which you'll notice in the final pic above), so using 6-inch and 1-foot lengths instead of their 2-foot idea was a close approximation, and I think it worked out extremely well.
frumper15

