
And the Song Remains the Same

Introduction

As you may remember, we moved recently, no…  not that one, this one.  While the first was successful, the second one was a little more painful but also very successful.  It took us a little longer than expected to bring everything back up.  But it is all back up now.

  • freed0 – Well that was nicely done
  • Chief Architect – <grumble><grumble>
  • freed0 – What?  You say that there are still some details left?
  • Chief Architect – <grumble><grumble>
  • freed0 – But everything is running isn’t it?
  • Chief Architect – <murmur><murmur>
  • freed0 – Yes, the office is still a little loud, the staging room not set, the staging racks still on another floor, our extra gear somewhere else, the doors to the pods all scraped up, and there are extra holes in the floor…  but besides that it looks good doesn’t it?
  • Chief Architect – <grumble><grumble>
  • Senior Analyst – <grumble><grumble>
  • freed0 – Are you guys taking language lessons from each other now?

So, it was busy and there are lots of little details left.  Although we had a lot of little things to complete with the last move as well.  Those finishing details always seem to take the most time.  The guys worked very long hours from start to finish.  While we did not have to bleed much to make the actual move, there was enough to be done that we had to consider fourth and fifth meals most days to keep going.

One other big change is that we migrated everything from 10Gb connections between the clusters to 40Gb.  I expect our needs will outgrow that in the next three years.  Now if we only had a real internet connection.  Anyone want to offer one or ten up?

What Really Happened

They say a picture is worth a thousand words, although from our Chief Architect we might only earn two.  But here is the photographic proof of the move and some data center/equipment porn for all you geeky people.

The Move

This is the old data center space with most of the racks already moved and the rest packed up.  Since we kept the racks we thought it easier to just ensure everything was tightly rack-mounted and wrapped up instead of de-racking and re-racking.  This required a lot of preparation time to acquire rails and such for all the devices that did not have them originally.  We had been getting ready for this move for almost nine months to ensure that it went as smoothly as possible and to limit the possible loss of gear.  In the end we did not lose very much considering that 84 racks were moved.

Everything was wrapped in blankets and then wrapped again in plastic to hold it all together.

Prior to moving into the new space, everything was unwrapped.  Kind of like a birthday party for grown-up kids.

The floors were lined with boards in both the starting and destination buildings.  This facilitated rolling the racks as rapidly as possible.  And those guys did roll them fast.  We thought that we might lose a mover or two a couple of times as they squished all the racks together.

The New Data Center

It seems strange to keep saying that, especially since it is not really all that new.  The biggest change is that we have gone from a concrete floor to a raised floor.  That also required a change of cooling.  We went from the Inter Row Cooling (IRC) system, which had water pipes above all the racks, to a more standard Computer Room Air Handler (CRAH) system.

Here you can see our new staging area minus the staging racks and our to-be-deployed gear.  Somehow the newly placed work benches are already crowded and messy.

Inside our one storage room where we maintain a mountainous selection of cables and such.

One of the five venerable CRAH units.  These bad boys are almost 30 years old!

The new network center for the entire data center.  In the third row you can see our new 40Gb backbone switch.

The new deployment of the racks looks very similar to the old data center since we are using the same pod cooling methodology.

We still need to look at properly balancing the airflow, but with the pods it is a standard hot and cold aisle build.

Our new office space has a window!  There is also a balcony, but the door is blocked off so we can only look wistfully at the outside.

Our Old Space

All that is left are the IRC units and a few cables we could not be bothered to unplug.

Conclusion

The move went well.  The end result is a space that is larger than our last one, with more power and more cooling.  This will allow us to continue to grow and fill the space more efficiently.  We have already been looking at growth options and methodology, and for the last several years we have really been focusing on replacement upgrades where possible instead of just adding hardware.  At a certain point more hardware is still needed for specific capacity types such as storage, although even there we have been aggressively replacing older, smaller-capacity hard drives with newer, higher-capacity drives, as you can see from these blogs here and here.

How Can You Help

Cash in easy-to-carry bags is always useful, but what is really needed are direct internet connections to our data center.  We limp along with a small incoming connection at this time.  We really need one or two 10Gb lines to solve all of that.  If you can provide something in the Bay Area (California), we would love to hear from you to see what can be done.  And beyond that, data, data, data….

  • freed0 – <smacks the Senior Analyst> Hey…  I am writing this, stop inserting your data fetish needs.
  • Senior Analyst – But… data is good.  Data make the world go around, data puts happy face on me…
  • freed0 – sigh, do we have any normal people here?

So, thanks everyone for the effort and continued support.