Wednesday, October 23, 2013

Rodrick Gorden's blog :: Well-Planned Data Center Transformation Effort Delivers IT Efficiency Paybacks, Green IT Boost for Valero Energy







For years, Robert Wakefield and Dameon Rustin lived with the problems of keeping Snelling Staffing Services' old, poorly designed data center up and running.

Not only were the intricate cable runs and varied server makes and models difficult to keep straight, but the building itself tended to compound their management headaches. "Our 15-ton air-conditioning unit was water-cooled, but the building [management] didn't clean the cooling tower very often," says Wakefield, vice president of IT at Snelling Staffing and Intrepid USA, a home healthcare firm also owned by Snelling's parent firm, Patriarch Partners. "Muck would get in, clog up our strainers and shut down the AC unit to our data center. That was a big problem."

In addition, the building owners would not give Snelling the OK to put in a diesel backup generator to power the data center. "Let's just say they weren't very helpful," says Wakefield, who spoke about his data center project at the recent Network World IT Roadmap Conference and Expo in Dallas.
Things began to change quickly once Patriarch bought up Intrepid in 2006. Wakefield and Rustin, Snelling's director of technology, were charged with building a brand-new data center that would not only solve the current Snelling problems, but also house Intrepid's data center and be ready to support any future growth.

"We had to build expandability into it because Patriarch is a private investment firm, and their goal is to buy more companies and roll them in," Wakefield says. "We were told to give ourselves about 100% growth room."

The downside? They needed to do all that with a budget of US$800,000 and a window of only six months. "It was a challenge," Wakefield says.
But it was a challenge they met head-on. Today, Snelling and Intrepid's new 1,100-square-foot data center in Dallas efficiently houses a variety of equipment, including:

  • A total of 137 servers (45 for Intrepid and 92 for Snelling), 37 of which are new dual-core, dual-processor AMD Opteron-based Sun Fire X-series Unix servers.
  • Three EMC storage systems, including an EMC CX400, a CX3-20 iSCSI system and an old SC4500, as well as a Quantum tape library.
  • A variety of networking components, including shared virus scanners and Web surfing control appliances.
  • A Liebert 100kVA uninterruptible power supply (UPS).
  • Two Emerson 10-ton and one Emerson 15-ton glycol-based AC units.

And even with all of that, Wakefield says he still has room to add nine more server racks.
Getting there
Wakefield and Rustin first visited several data centers to get an idea of what could and could not be done. They also looked at a number of different locations before deciding in January on the Dallas building. Then, the real planning began.
               "Once               we               had               the               dimensions,               everything               else               came               from               that,"               Wakefield               says.

He               and               Ruston               drew               up               10               different               floor               plans               and               began               calculating               how               many               servers               they'd               need,               and               how               much               cabinet               space.

At               that               point,               requirements               began               to               fall               into               place.

"High-density               became               a               requirement;               virtualization               became               a               requirement,"               he               says.
Although the new data center is only 150 square feet larger than the old one, it needed to support more than 40 additional servers, plus provide room for growth. Wakefield considered going the blade server route to save space, but soon learned they were prohibitively expensive.

"Blades were pretty high cost-wise, and we had bought some of the Sun X-series boxes in the past," he says. "They are AMD-based, so they use less energy and put out less heat. And they're dual-core, dual-processor with about 8GB of RAM, so we could set up [virtual machines] on a good chunk of them, and that saved us a lot of space too."
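
To put rough numbers on that consolidation, here is a minimal Python sketch; the per-VM memory size and hypervisor overhead are illustrative assumptions, not figures from Wakefield.

    # Back-of-the-envelope VM density for one Sun Fire X-series host.
    # Host RAM comes from the article; the hypervisor overhead and
    # per-VM RAM are assumed values for illustration only.
    HOST_RAM_GB = 8      # dual-core, dual-processor box with about 8GB RAM
    HYPERVISOR_GB = 1    # assumed RAM reserved for the hypervisor
    VM_RAM_GB = 1.5      # assumed RAM per consolidated workload

    vms_per_host = int((HOST_RAM_GB - HYPERVISOR_GB) // VM_RAM_GB)
    print(f"~{vms_per_host} VMs per host")  # -> ~4 VMs per host

Even at that conservative density, the 37 new X-series boxes could carry on the order of 150 virtual machines, which suggests how consolidation let the combined companies fit into a room only 150 square feet larger than the old one.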

Wakefield says space constraints also led him to purchase new Chatsworth CPI TeraFrame high-density racks, each of which can hold as many as 36 1U servers. "They're vented at the top and handle air circulation really well," he says. "We're on a raised floor, so the cooling comes from below, it gets sucked in the front of the cabinet and then vented out the back and straight up the top. It's very efficient."
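
A quick capacity check shows how much headroom those racks leave; treating every server as a 1U box is a simplifying assumption, since the article doesn't break down form factors.

    import math

    # Rack capacity check for the Chatsworth TeraFrame racks,
    # assuming (for illustration) that every server is 1U.
    SERVERS = 137       # 45 Intrepid + 92 Snelling
    PER_RACK = 36       # maximum 1U servers per TeraFrame rack
    SPARE_RACKS = 9     # rack positions still open on the floor

    racks_needed = math.ceil(SERVERS / PER_RACK)  # -> 4 racks minimum
    spare_slots = SPARE_RACKS * PER_RACK          # -> 324 more 1U slots
    print(racks_needed, spare_slots)

Nine spare racks at 36 servers apiece works out to roughly 324 additional 1U slots, comfortably beyond the mandated 100% growth room of about 137 more servers.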
He addressed the AC problems by purchasing the glycol-based units, which are completely self-contained. "Now, all of our cooling is independent of the building," he says. "So if the building needs to shut down their water supply, it doesn't shut down my data center."
Wakefield has also planned for optimal power usage. A 600-amp power cabinet powers everything in the data center. "We have a UPS tied to that, and then we have a power distribution unit out on the floor in the data center that provides feeds to each cabinet," Wakefield explains. "Each cabinet has the ability for a single box to plug four power supplies into it, and each of those power supplies is on a different circuit for redundancy."

And if that's not enough, he's also planning to soon install a generator. That will provide backup power not only for the data center, but for critical business areas that support payroll and billing, so that Intrepid and Snelling can both stay open for business even during a power outage.
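
A rough power budget makes that headroom concrete; the power factor and per-server draw below are assumed values, not measurements from the Snelling build.

    # Rough load estimate against the Liebert 100kVA UPS.
    # Power factor and typical server draw are illustrative assumptions.
    UPS_KVA = 100
    POWER_FACTOR = 0.9      # assumed UPS output power factor
    SERVERS = 137
    TYPICAL_DRAW_W = 350    # assumed average draw of a 1U Opteron box

    usable_kw = UPS_KVA * POWER_FACTOR           # ~90 kW of real power
    load_kw = SERVERS * TYPICAL_DRAW_W / 1000    # ~48 kW estimated load
    print(f"{usable_kw:.0f} kW usable vs ~{load_kw:.0f} kW of server load")

Under those assumptions the servers draw roughly half the UPS capacity, leaving room for network gear and a good share of the planned growth; the four power supplies on separate circuits buy redundancy against a failed feed rather than extra capacity.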


               
               

Wakefield says the new data center optimizes efficiency by enabling Snelling and Intrepid to share as much equipment as possible. Snelling's 43 locations are linked via an MPLS network to the data center, while Intrepid's 115 locations use a variety of DSL, frame relay and MPLS, with most gradually moving to MPLS over time. Each company has its own router, but they share a 10Gbps core switch in the data center. "Everywhere we can, we try and put in a common platform to save both companies money," he says. "We have a common core switch, as well as common e-mail, virus scanning and surf control for the Web."
Future-proofing
All of the new data center's cabinets are pre-wired, a move that was more expensive upfront, but will offer huge payback over time. Each cabinet has a 10G connection to a core switch. "If you need to put a new server in, you don't have to pull a fiber run all the way back to the switch," Wakefield says. "It's all there already. We just drop the server in, connect in our patch panels and we're ready to go."
In addition to prewiring 10G and fiber, he also future-proofed by installing Category 6 cabling to support not only both companies' data but also their voice via a new Cisco VoIP system. And all of this means the new data center should easily serve the two companies (and any others that may be added) for anywhere from five to seven years.

"Eventually, depending on new fiber technology, I may have to add some more fiber in, but in the grand scheme of things, it's pretty solid for several years to come," he says.
Doing it right
After many 80-plus-hour weeks for his staff, Wakefield says his team successfully cut over the Snelling side of the business in May and moved in the Intrepid side, from its old home in Edina, Minn., in July. They did it all, start to finish, in less than six months. "I wouldn't recommend that timeframe," he says.
But overall, Wakefield and Rustin are pleased with the results. "We spent years dealing with a poor setup," Wakefield says. "In the old building, when we wanted to add a server, we were always having to trace runs out to determine where they went, or crawling up on ladders to pull cable. Over the years, it just drove us crazy. And Dameon and I always said, if we ever get to build our own, we know what we're going to do. We'll do it right. And I think we did."



