Wednesday, October 23, 2013

Rodrick Gorden's blog :: Norwegian Data Center Is Built in a Cave and Cooled by Fjords

For years, Robert Wakefield and Dameon Rustin lived with the problems of keeping Snelling Staffing Services' old, poorly designed data center up and running.

Not only were the intricate cable runs and varied server makes and models difficult to keep straight, but the building itself tended to compound their management headaches. "Our 15-ton air-conditioning unit was water-cooled, but the building [management] didn't clean the cooling tower very often," says Wakefield, vice president of IT at Snelling Staffing and Intrepid USA, a home healthcare firm also owned by Snelling's parent firm, Patriarch Partners. "Muck would get in, clog up our strainers and shut down the AC unit to our data center. That was a big problem."

In addition, the building owners would not give Snelling the OK to put in a diesel backup generator to power the data center. "Let's just say they weren't very helpful," says Wakefield, who spoke about his data center project at the recent Network World IT Roadmap Conference and Expo in Dallas.

Things began to change quickly once Patriarch bought up Intrepid in 2006. Wakefield and Rustin, Snelling's director of technology, were charged with building a brand-new data center that would not only solve the current Snelling problems, but also house Intrepid's data center and be ready to support any future growth.

"We had to build expandability into it because Patriarch is a private investment firm, and their goal is to buy more companies and roll them in," Wakefield says. "We were told to give ourselves about 100% growth room."

The downside? They needed to do all that with a budget of US$800,000 and a window of only six months. "It was a challenge," Wakefield says.

But it was a challenge they met head-on. Today, Snelling and Intrepid's new 1,100-square-foot data center in Dallas efficiently houses a variety of equipment, including:

  • A total of 137 servers (45 for Intrepid and 92 for Snelling), 37 of which are new dual-core, dual-processor AMD Opteron-based Sun Fire X-series Unix servers.
  • Three EMC storage systems, including an EMC CX400, a CX3-20 iSCSI system and an old SC4500, as well as a Quantum tape library.
  • A variety of networking components, including shared virus scanners and Web surfing control appliances.
  • A Liebert 100kVA uninterruptible power supply (UPS).
  • Two Emerson 10-ton and one Emerson 15-ton glycol-based AC units.

And even with all of that, Wakefield says he still has room to add nine more server racks.
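Those spare racks imply well over the mandated headroom. A quick back-of-the-envelope check (the 36-servers-per-rack figure comes from the TeraFrame cabinets described below; real density would be lower once storage and network gear take up slots):

```python
# Growth headroom check using the figures in the article.
current_servers = 137      # 45 Intrepid + 92 Snelling
spare_racks = 9            # empty rack positions still available
servers_per_rack = 36      # max 1U servers per TeraFrame cabinet

spare_slots = spare_racks * servers_per_rack                # 324 spare 1U slots
print(f"Spare slots: {spare_slots}")
print(f"Growth room: {spare_slots / current_servers:.0%}")  # ~236%
```

Even at half that density, the build clears Patriarch's 100% growth target comfortably.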
Getting there

Wakefield and Rustin first visited several data centers to get an idea of what could and could not be done. They also looked at a number of different locations before deciding in January on the Dallas building. Then, the real planning began.

"Once we had the dimensions, everything else came from that," Wakefield says. He and Rustin drew up 10 different floor plans and began calculating how many servers they'd need, and how much cabinet space. At that point, requirements began to fall into place. "High density became a requirement; virtualization became a requirement," he says.

Although the new data center is only 150 square feet larger than the old one, it needed to support more than 40 additional servers, plus provide room for growth. Wakefield considered going the blade server route to save space, but soon learned they were prohibitively expensive.

"Blades were pretty high cost-wise, and we had bought some of the Sun X-series boxes in the past," he says. "They are AMD-based, so they use less energy and put out less heat. And they're dual-core, dual-processor with about 8GB of RAM, so we could set up [virtual machines] on a good chunk of them, and that saved us a lot of space too."
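The article doesn't give a consolidation ratio, but the space arithmetic behind Wakefield's choice can be sketched. Everything below except the 8GB host RAM figure is an assumption for illustration:

```python
# Consolidation sketch: physical 1U boxes absorbed by one VM host.
# Only the 8GB host RAM figure is from the article; the rest is assumed.
host_ram_gb = 8            # Sun Fire X-series spec cited by Wakefield
hypervisor_reserve_gb = 1  # assumed memory held back for the hypervisor
vm_ram_gb = 1              # assumed allocation per small workload VM

vms_per_host = (host_ram_gb - hypervisor_reserve_gb) // vm_ram_gb
rack_units_saved = vms_per_host - 1  # each VM would otherwise be a 1U box
print(f"{vms_per_host} VMs per host, freeing {rack_units_saved}U of rack space")
```

Run across "a good chunk" of the 37 X-series boxes, even that modest density recovers several racks' worth of space.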

Wakefield says space constraints also led him to purchase new Chatsworth CPI TeraFrame high-density racks, each of which can hold as many as 36 1U servers. "They're vented at the top and handle air circulation really well," he says. "We're on a raised floor, so the cooling comes from below; it gets sucked in the front of the cabinet and then vented out the back and straight up the top. It's very efficient."
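A useful sanity check on the cooling design is to convert the three glycol units' combined tonnage (listed above) into kilowatts and compare it with the most heat the IT load could reject. The 3.517 kW-per-ton conversion is standard; treating the 100kVA UPS rating as the worst-case IT heat load (power factor of 1.0) is a simplifying assumption:

```python
# Cooling capacity vs. worst-case IT heat load.
KW_PER_TON = 3.517              # standard refrigeration conversion

cooling_tons = 2 * 10 + 1 * 15  # two 10-ton + one 15-ton glycol units
cooling_kw = cooling_tons * KW_PER_TON

ups_kva = 100                   # Liebert UPS rating
max_it_heat_kw = ups_kva * 1.0  # assumed power factor of 1.0

print(f"Cooling: {cooling_kw:.0f} kW vs. max IT load: {max_it_heat_kw:.0f} kW")
print(f"Capacity margin: {cooling_kw / max_it_heat_kw - 1:.0%}")  # ~23%
```

Splitting that capacity across three independent units also means a single compressor failure no longer takes down the room, unlike the old building's lone 15-ton unit.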
He addressed the AC problems by purchasing the glycol-based units, which are completely self-contained. "Now, all of our cooling is independent of the building," he says. "So if the building needs to shut down their water supply, it doesn't shut down my data center."
Wakefield has also planned for optimal power usage. A 600-amp power cabinet powers everything in the data center. "We have a UPS tied to that, and then we have a power distribution unit out on the floor in the data center that provides feeds to each cabinet," Wakefield explains. "Each cabinet has the ability for a single box to plug four power supplies into it, and each of those power supplies is on a different circuit for redundancy."

And if that's not enough, he's also planning to install a generator soon. That will provide backup power not only for the data center, but for critical business areas that support payroll and billing, so that Intrepid and Snelling can both stay open for business even during a power outage.
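The power chain can be sketched the same way. The 600-amp cabinet and 100kVA UPS are the article's figures; the 208V three-phase service and 0.9 power factor below are assumptions, since the article doesn't specify them:

```python
import math

# Power-chain sketch: cabinet capacity vs. the UPS-protected share.
amps = 600          # power cabinet rating, from the article
volts = 208         # assumed three-phase service voltage
power_factor = 0.9  # assumed

cabinet_kw = math.sqrt(3) * volts * amps * power_factor / 1000
ups_kva = 100       # Liebert UPS, from the article

print(f"Cabinet capacity: {cabinet_kw:.0f} kW")               # ~195 kW
print(f"UPS-protected fraction: {ups_kva / cabinet_kw:.0%}")  # ~51%
```

That gap helps explain the planned generator: the UPS rides through short interruptions, while the generator would carry the data center and the payroll and billing areas through extended outages.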

Wakefield says the new data center optimizes efficiency by enabling Snelling and Intrepid to share as much equipment as possible. Snelling's 43 locations are linked via an MPLS network to the data center, while Intrepid's 115 locations use a variety of DSL, frame relay and MPLS, with most gradually moving to MPLS over time. Each company has its own router, but they share a 10Gbps core switch in the data center.
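The article doesn't give per-site bandwidth, but a rough model shows why a single shared core switch comfortably serves both companies. The per-site access speed below is an illustrative T1-class guess, not a figure from the article:

```python
# Aggregate WAN demand vs. shared core capacity.
snelling_sites = 43          # MPLS-connected, from the article
intrepid_sites = 115         # DSL/frame relay/MPLS mix, from the article
assumed_mbps_per_site = 1.5  # illustrative T1-class access speed

wan_gbps = (snelling_sites + intrepid_sites) * assumed_mbps_per_site / 1000
core_gbps = 10               # shared core switch, from the article

print(f"Aggregate WAN demand: {wan_gbps:.2f} Gbps")  # ~0.24 Gbps
print(f"Core headroom: {core_gbps / wan_gbps:.0f}x") # ~42x
```

Even if every site's access link grew tenfold during the MPLS migration, the shared core would still have ample headroom.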

"Everywhere we can, we try and put in a common platform to save both companies money," he says. "We have a common core switch, as well as common e-mail, virus scanning and surf control for the Web."
Future-proofing

All of the new data center's cabinets are pre-wired, a move that was more expensive upfront, but will offer huge payback over time. Each cabinet has a 10G connection to a core switch. "If you need to put a new server in, you don't have to pull a fiber run all the way back to the switch," Wakefield says. "It's all there already. We just drop the server in, connect in our patch panels and we're ready to go."
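A rough port count shows why pre-wiring to the cabinet's maximum capacity pays off the way Wakefield describes. The 36-server cabinet maximum is the article's figure; the NIC count per server and the number of cabinets are illustrative assumptions:

```python
# Pre-wiring port-count sketch.
max_servers_per_cabinet = 36  # TeraFrame capacity, from the article
nics_per_server = 2           # assumed: primary plus redundant port
cabinets = 12                 # assumed cabinet count, for illustration

runs_per_cabinet = max_servers_per_cabinet * nics_per_server
print(f"Copper runs per cabinet: {runs_per_cabinet}")             # 72
print(f"Runs terminated up front: {cabinets * runs_per_cabinet}")
```

Pulling those runs once, during the build, is far cheaper than tracing and pulling individual cables every time a server lands, which was the old building's daily grind.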
In addition to prewiring 10G and fiber, he also future-proofed by installing Category 6 cabling to support not only both companies' data but also their voice via a new Cisco VoIP system. And all of this means the new data center should easily serve the two companies (and any others that may be added) for anywhere from five to seven years.

"Eventually, depending on new fiber technology, I may have to add some more fiber in, but in the grand scheme of things, it's pretty solid for several years to come," he says.
Doing it right

After many 80-plus-hour weeks for his staff, Wakefield says his team successfully cut over the Snelling side of the business in May and moved in the Intrepid side, from its old home in Edina, Minn., in July. They did it all, start to finish, in less than six months. "I wouldn't recommend that timeframe," he says.

But overall, Wakefield and Rustin are pleased with the results. "We spent years dealing with a poor setup," Wakefield says. "In the old building, when we wanted to add a server, we were always having to trace runs out to determine where they went, or crawling up on ladders to pull cable. Over the years, it just drove us crazy. And Dameon and I always said, if we ever get to build our own, we know what we're going to do. We'll do it right. And I think we did."



