Rethinking the Data Center
An Internet.com Networking eBook


Contents

This content was adapted from Internet.com's InternetNews, ServerWatch, and bITa Planet Web sites. Contributors: Paul Rubens, Drew Robb, Judy Mottl, and Jennifer Zaino.

Rethinking the Datacenter, An Internet.com Networking eBook. © 2008, Jupitermedia Corp.

Enterprises Face Data Growth Explosion (Judy Mottl)

What's the State of Your Data Center? (Jennifer Zaino)

Create a Recession-Proof Data Center (Paul Rubens)

Greening Your Data Center -- You May Have No Choice (Paul Rubens)

Hardware for Virtualization: Do's and Don'ts (Drew Robb)

Why Tape Libraries Still Matter (Drew Robb)

Facilities Management Crosses Chasm to the Data Center (Paul Rubens)


Enterprises Face Data Growth Explosion
By Judy Mottl

If you think storing your enterprise data is a tough challenge now, it's nothing compared to what it might be in just a few years.

According to a study from research firm IDC and storage vendor EMC, data requirements are growing at an annual rate of 60 percent. Today, that figure tops 45 gigabytes for every person, or 281 exabytes total (equivalent to 281 billion GB).

What should concern IT managers is that the report predicts the total amount of digital information -- the "digital universe," as the study's authors call it -- will balloon to 1,800 exabytes by 2011.

The findings should serve as a wake-up call to enterprises, said Charles King, principal analyst at Pund-IT.

    "Creation of information is accelerating at a torridpace, and if organizations want the benefits of infor-mation they'll need effective management tools,"

    King wrote in a response to the IDC/EMC report.Chief factors responsible for the growth in data

    include burgeoning Internet access in emerging coun-tries, increasing numbers of datacenters supportingcloud computing and the rise in social networks, thestudy found.

Less than 5 percent of the digital universe is from data center servers, and only 35 percent is drawn from the enterprise overall, according to IDC.

Nevertheless, the IT impact will be extensive, ranging from the need to boost information governance to improving data security.

Individuals create about 70 percent of the digital universe, although companies are responsible for the security, privacy, reliability, and compliance of 85 percent of that data, the study said.

King said IT will need to cope by assessing relationships with business units that classify data.

Additionally, enterprises will have to set and enforce policies for data security, access and retention, and adopt tools for contending with issues like unstructured data search, database analytics, and resource pooling, he said.

Certain business segments may be more affected than others, since they churn out more data. The financial industry, for example, accounts for just 6 percent of the digital universe. Media and communications firms, meanwhile, collectively generate 10 times that amount, according to the study.

The EMC/IDC study also found that while not all information created and transmitted is stored, enterprises will be storing only about half of it, on average, through 2011.

The report comes as a follow-up to an earlier, similarly aggressive IDC forecast about data growth.

The research firm said it based its conclusions on estimates of how much data is captured or created annually from roughly 30 classes of devices or applications. It then converted the data to megabytes using assumptions about usage and compression.

What's the State of Your Data Center?
By Jennifer Zaino

If you're like the data center managers surveyed by Symantec in the fall of 2007, maybe things aren't as good as you'd like them to be.

Symantec issued the results of its inaugural State of the Data Center report, based on a survey of more than 800 data center managers in Global 2000 and other large companies worldwide, with average annual IT budgets of $75 million in the U.S. and $54 million outside the States.

"We found data center managers looking at a number of challenges and techniques to combat them," says Sean Derrington, Symantec's director of storage management. "Fixed costs are continuing to increase. Sixty-nine percent say expenditures are growing 5 percent a year, so a larger and larger piece of that IT budget goes to fixed costs. That doesn't leave much in the way of incremental dollars for IT managers to play around with."

Trying to get out of that vicious cycle, and free up dollars for more strategic uses, data center managers are attempting to contain costs by deploying new technologies such as server virtualization, but finding that the money they save on hardware sometimes gets swallowed up by the increasing management complexity of those environments. That's keeping an emerging technology such as virtualization from making the leap from the test and development environment to production systems, Derrington notes.

"So the recommendation is to figure out how to create a software infrastructure that runs across the entire data center and works across physical and virtual systems, so regardless of the technology you selected to constrain costs, you won't have to sacrifice skills training for IT staff. They perform the same task the same way," Derrington says.

Symantec, of course, makes products designed to meet this need, such as Veritas NetBackup for data protection and Veritas Cluster Server for disaster recovery and high availability.

One interesting finding of the survey was that increasing or significantly increasing demands by the business, in combination with overall data center growth, are compounding the problems of data center complexity. The respondents noted that service-level expectations have increased 85 percent over the past two years, and 51 percent admit to not having met service-level agreements in the same time period.

"As they're looking at negotiating service levels, they have to figure out how to deliver those services," says Derrington, "and they're hitting a wall because of inadequate staff."

Fifty-seven percent say staff skills do not meet current needs, and 60 percent say skill sets are too narrow.

For example, they don't just want a Tivoli storage administrator, but someone who can work across backup and data protection infrastructures. Sixty-six percent also say there are too many applications to manage.

Consistency in operations, supported by a standardized software infrastructure, can make all the difference here as well, Symantec believes.

"They want to automate the same things that are potentially repetitive," says Derrington. Take, for example, storage provisioning. That task includes storage administrators, server administrators, SAN architects, maybe the business, maybe procurement and finance. How can a company actually define the workflow and process so everyone knows what needs to be done, so the person on the job for a day provisioning storage does the same thing as someone who has been on the job for 10 years?
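One way to pin such a process down is to write the workflow itself as data that both tooling and people can follow. The sketch below is purely illustrative; the step names and roles are hypothetical examples, not Symantec's actual process:

    # Hypothetical storage-provisioning workflow: each step names the
    # responsible role, so a first-day administrator follows the same
    # path as a ten-year veteran.
    WORKFLOW = [
        ("request capacity",        "business unit"),
        ("approve budget",          "procurement/finance"),
        ("design LUN layout",       "SAN architect"),
        ("carve and zone storage",  "storage administrator"),
        ("mount and verify volume", "server administrator"),
    ]

    def run_workflow(workflow):
        # Print the steps in order, tagged with who performs each one.
        for step, role in workflow:
            print(f"{role:>24}: {step}")

    run_workflow(WORKFLOW)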

That's not to say data center managers are looking for automatons. In fact, part of the reason data center managers are having staffing troubles is that they want employees not only to solve technology problems, but also to understand the implications of technology for the business.

"The reason being that an individual could write the best code or figure out the best way to write a script to integrate Technology A with Technology B," Derrington says, "but if they don't understand how that impacts the business or how it delivers a business benefit, that person is not going to be as valuable."

Create a Recession-Proof Data Center
By Paul Rubens

You don't need a Nobel Prize in economics to realize that the world's economies are facing a slowdown or recession head on. And it doesn't take a genius, or a large leap of logic, to work out that your data center's budget is likely to face a cut.

Whether you have an inkling a cut is coming or you haven't yet been warned of an impending budget cut, establishing a course of action to cut costs now would be a wise move, according to Ken McGee, a vice president and Fellow at Gartner.

As far back as 2007, Gartner was warning about the need to prepare for a recession. Since then, things have obviously changed for the worse. "Since that time, the factors we based the research on -- such as GDP growth projections and expert predictions for the likelihood of a recession -- have worsened to a degree that convinces us it is now time for clients to prepare for cutting IT costs," McGee said in January.

McGee recommends dedicating top staff exclusively to investigating IT cost-cutting measures, and appointing a senior auditor or accountant to the team to provide an official record of the team's performance. He also recommends reporting progress to senior managers on a weekly basis and identifying a liaison with a legal representative, to make it easier to work through legal issues that may crop up in connection with maintenance and other contracts or penalty clauses. This is to ensure cost-cutting measures don't result in increased legal liabilities for your company.

So, having established that now is the time to take measures to help the data center weather a recession, the question is: Where should you look to cut costs?

Cost-Cutting Sweet Spots

One of the most significant data center costs is electricity, for powering both the computing equipment and the systems used to provide cooling. Virtualization can play a key role in reducing overall electricity consumption, as it reduces the number of physical boxes that need to be powered and cooled.

A single physical server hosting a number of virtual machines can replace two, three, or sometimes many more underutilized physical servers. Although a physical server working at 80 percent utilization uses more electricity than one working at 20 percent, it is still far more energy-efficient than running four servers at 20 percent along with the accompanying four disk drives, four inefficient power supplies, and so on.
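The arithmetic behind that claim is easy to sketch. Assuming, purely for illustration, a server that draws 250 W idle and 400 W flat out, with power scaling linearly in between (real power curves differ):

    # Compare one consolidated server at 80 percent utilization against
    # four servers at 20 percent. The idle/peak wattages and the linear
    # power model are illustrative assumptions, not measured figures.
    IDLE_W, PEAK_W = 250.0, 400.0

    def power_watts(utilization):
        # Linear interpolation between idle and peak draw.
        return IDLE_W + (PEAK_W - IDLE_W) * utilization

    one_big = power_watts(0.80)          # ~370 W
    four_small = 4 * power_watts(0.20)   # ~1,120 W

    print(f"1 server @ 80%:  {one_big:.0f} W")
    print(f"4 servers @ 20%: {four_small:.0f} W")
    # The consolidated box draws roughly a third of the electricity,
    # before even counting the extra disks and power supplies.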

Virtualization also shrinks costs by reducing the amount of hardware that must be replaced. If you operate fewer servers, you then have fewer to replace when they reach the end of their lives. And thanks to advanced virtual machine management software from the likes of Microsoft and VMware, the time spent setting up and configuring virtual machines (and thus the associated cost) can be much less than that spent managing comparable physical servers.

And virtualization need not be restricted to servers. What's true of servers is true of storage systems, too: Storage virtualization can cut costs by reducing overprovisioning and reducing the number of disks and other storage media that must be powered (and cooled), bought, and replaced.

This leads to the concept of automation. Data center automation can take a vast amount of investment, but it also promises significant cost savings. In a time of recession it's prudent to look at initiatives that carry a modest price point and offer a relatively fast payback period. These may include patch management and security alerting (which in turn may enable lower-cost remote working practices) and labor-intensive tasks, such as password resets. Voice authentication systems, for example, can dramatically reduce password reset costs in organizations that have large numbers of employees calling the IT help desk with password problems. Such systems automatically authenticate the user and reset relevant passwords.

Any automation software worth its salt also has the added benefit that when it reduces the number of man-hours spent dealing with a task, managers have the flexibility to choose between reducing data center human resource costs and reassigning employees to other tasks, including implementing further cost-cutting systems, thereby creating a virtuous circle.

Virtualization's Success Hampers Server Sales
By Andy Patrizio

IDC revised its forecast in terms of both server dollar and unit sales in the coming years. It attributes the downshift to the increasing popularity of virtualization and more powerful servers. In both cases, one server can accomplish what previously took several.

IDC reported unit sales slid in 2006, while dollar sales grew, an indication that fewer but more powerful machines are being sold. Instead of a 61 percent increase in server shipments by 2010, IDC now expects server sales will grow by 39 percent.

In projecting this trend out a few years, the research firm had to revise its server sales projections downward. Between now and 2010, IDC sees the x86-based server market shrinking by 9 percent in dollars, from $36 billion to $33 billion, and by 18 percent in unit sales, from 10.5 million servers to 8.7 million servers.

This is due to what Michelle Bailey, research vice president for IDC's Enterprise Platforms and Datacenter Trends division, called a "perfect storm" of virtualization and multi-core processors.

"On its own, multi-core wouldn't have been that interesting," she told internetnews.com. "It probably would have been just another speed bump. It's the addition of virtualization that lets you take advantage of multi-core much more quickly."

Virtualization lets you run multiple single-threaded apps and get the benefits of multi-core technology without having to rewrite applications to be multi-threaded. So a single machine with a dozen or more virtual environments can run the applications in a way a single-core system cannot.

"It allows you to fully exploit an unutilized processor. Virtualization is what we think of as the killer app for multi-core. It lets customers take advantage of multi-core early without having to re-architect for it," said Bailey.



A more straightforward, but contentious, strategy is application consolidation. Clearly the more applications your data center runs, the more complex and expensive it will be to manage them. Thus, consolidating on as few applications as possible makes good financial sense, assuming, of course, the apps are up to the required task. If these are open source applications, which in practice probably means Linux-based ones, then there's a potential for significant savings in terms of operating system and application license fees and client access licenses (CALs).

Bear in mind that significant support costs will remain, and Microsoft and other large vendors make the case that the total cost of ownership of open source software is no lower than that of closed source. But at the very least, you may be able to use open-source alternatives as bargaining chips to get a better deal from your existing closed-source vendors.

As well as looking at changes that can be made at the micro level, it's also useful to look at the macro level, at the way your whole data center operations are structured. For example, you may have set yourself a target of "the five nines" for system availability, but it's worth evaluating whether this is really necessary. How much would it reduce your costs to ease this target to 99.9 percent? And what impact would it have on the profitability of the business as a whole?
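To put numbers on that trade-off, each availability target translates directly into permitted downtime per year, and the gap between three nines and five nines is what you pay for. A quick calculation:

    # Permitted annual downtime at several availability targets. Easing
    # a five-nines target to 99.9 percent buys back almost nine hours of
    # allowable downtime a year.
    MINUTES_PER_YEAR = 365.25 * 24 * 60

    for target in (0.999, 0.9999, 0.99999):
        downtime_min = MINUTES_PER_YEAR * (1 - target)
        print(f"{target:.3%} availability -> {downtime_min:7.1f} min/year down")

    # 99.900% -> ~526 min/year (about 8.8 hours)
    # 99.990% -> ~53 min/year
    # 99.999% -> ~5.3 min/year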

If you can identify only a few applications that require 99.999 percent uptime, it's important to consider whether your data center is the best place from which to provide them. A specialized application service provider may be able to provide this sort of reliability at a lower cost for a fixed, per-user fee, with compensation if it falls below this service level. It certainly doesn't make sense to provide more redundancy than you need: That's simply pouring money down the drain. Also consider whether your data center is operating longer hours than necessary. Thanks to the power of remote management tools, you may find it makes more sense financially to leave it unmanned at certain times, while having a number of staff "on call" to sort out problems remotely, should the need arise.

Finally, it's worth mentioning best-practice IT management frameworks like the IT Infrastructure Library (ITIL) and the Microsoft Operations Framework (MOF). Aligning operations to these frameworks is a medium- to long-term project, but they are intended to ensure that all IT services, including those associated with the data center, are delivered as efficiently as possible.

If you can achieve that, you are a long way down the path to ensuring your data center can endure any slowdown the economy can throw at it -- not just this time, but the next time, and the time after that.


Virtualization's Success Hampers Server Sales... continued

IDC estimates that the number of virtualized physical servers will rise to more than 1.7 million by 2010, resulting in 7.9 million logical servers. Virtualized servers will represent 14.6 percent of all physical servers in 2010, compared to just 4.5 percent in 2005.

This means customers are growing more confident in the uptime reliability of x86-based hardware. While they haven't approached mainframes in reliability, x86 systems are a lot better than in previous years, and come with better configuration and management tools.

A virtualized server going down could have far greater impact than a single application server going down, but Bailey said IT is not as concerned about that. "I would say customer perception around putting too many eggs in one basket has changed. A virtual environment is no less available than a single environment," she said.

However, there won't be a great spillover benefit when it comes to power and cooling issues, a growing headache for IT. While Bailey sees the potential for server consolidation, she expects that virtualization will more likely extend the lifespan of a server, thus keeping more machines deployed, so there won't be a thinning of the herd. Worldwide, power and cooling cost IS organizations $30 billion in 2006, and that will hit $45 billion by 2010.

Greening Your Data Center -- You May Have No Choice
By Paul Rubens

The writing has been on the wall for some time. Electricity use in data centers is skyrocketing, sending corporate energy bills through the roof, creating environmental concerns, and generating negative publicity for large corporations.

Because IT budgets are limited, and because governments in Europe and the United States may soon impose carbon taxes on wasteful data centers, something's got to give. Data centers are going to have to "go green."

It's not as if no one saw this coming. The aggregate electricity use for servers actually doubled between 2000 and 2005, both in the U.S. and around the world as a whole, according to research conducted by Jonathan Koomey, a consulting professor at Stanford University.

In the U.S. alone, servers in data centers accounted for 0.6 percent of total electricity usage in 2005. But that's only half the story. When you include the energy needed to get rid of the heat generated by these servers, that figure doubles, so these data centers are responsible for about 1.2 percent of total U.S. electricity consumption, equivalent to the output of about five 1,000 MW power stations, and costing $2.7 billion -- about the gross national product of an entire country like Zambia or Nepal.

Unless data centers go green, energy costs could soon spiral out of control, according to Rakesh Kumar, a vice president at Gartner. In a report titled "Why 'Going Green' Will Become Essential for Data Centers," he says that because space is limited, many organizations are deploying high-density systems that require considerably more power and cooling than last-generation hardware.

Add to that rising global energy prices, and the proportion of IT budgets spent on energy could easily rise from 5 percent to 15 percent in five years. The mooted introduction of carbon taxes would make this proportion even higher. "When people look at the amount of energy being consumed and model energy prices, and think about risk management and energy supply, they should begin to get worried," Kumar said.


It's Not Easy Being Green

Since most data centers historically have not been designed with the environment in mind, Kumar says more than 60 percent of the energy used for cooling purposes is actually wasted. This is bad for the environment and reflects poorly on the organizations concerned -- especially if, as increasingly is the case, they have corporate social responsibility commitments. And as a growing number of companies adopt a "carbon neutral" policy (either out of genuine concern for the environment or for the positive PR this can produce), pressure from head office to reduce the carbon footprint of the data center, to help reduce overall carbon emissions, will become more intense. "There's no doubt that in the short term this problem is a financial one, but behind that there is the need of organizations to be seen to be green," he said.

So what can be done to "green" the data center? "There is no one solution that will solve the problem -- this is a collective issue, and it will require a raft of solutions," Kumar said. "You need to start by getting some metrics to understand the problem, because it's not going to go away."

The ideal solution is to start from the ground up by designing and building a new data center with energy efficiency in mind. This includes looking at the thermal properties of the building being constructed, the layout of the building for maximum cooling efficiency, and even the site of the building: Locating a new data center far from urban areas means that it might be more feasible to incorporate renewable energy sources such as wind turbines or solar panels into the design, for example. For more specific guidance, organizations can turn to standards such as the U.S. Green Building Council's Leadership in Energy and Environmental Design (LEED) certification. Vendor programs such as the Green Grid, an information network sponsored by AMD, IBM, HP, and Sun, may also be a useful source of information.

Assuming you're not quite ready to tear down your buildings and start again, there's still plenty you can do to reduce your electricity bill and your carbon footprint. Perhaps the most effective action you can take is to reduce the number of servers in use at any one time. Each server you switch off can directly reduce your electricity bill by up to $500 per year (and reduce the amount of carbon dioxide released into the air annually by perhaps 2,000 pounds), with about the same savings again realizable from reduced cooling requirements.
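Scaled across an estate, those per-server figures add up quickly. A minimal sketch using the article's numbers (the 50-server fleet is a made-up example):

    # Annual savings from decommissioning servers, using the article's
    # up-to-$500/year power figure and ~2,000 lb of CO2 per server, with
    # the same again saved on cooling. The fleet size is illustrative.
    servers_retired = 50
    power_saving_usd = 500    # direct electricity, per server per year
    co2_saving_lb = 2000      # per server per year

    total_usd = servers_retired * power_saving_usd * 2   # x2 for cooling
    total_co2 = servers_retired * co2_saving_lb * 2

    print(f"${total_usd:,}/year saved, {total_co2:,} lb CO2 avoided")
    # -> $50,000/year saved, 200,000 lb CO2 avoided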

It may be that you have servers that don't need to be on at all hours of the day and night, but it's more likely that you can reduce the number of servers you need through virtualization. If you run corporate applications on separate servers, many may be only 10 to 20 percent utilized. Virtualization can dramatically cut the number of physical servers you need, while technology from companies such as VMware can ensure that your virtual machines can be switched to higher-capacity physical machines during peak times.

If you do retire some servers, it obviously makes sense to get rid of the older ones. This has the added benefit of increasing your overall server energy efficiency, because newer multi-core chips can offer significant performance gains over older ones while using almost 50 percent less power. Power management technologies such as Intel's Demand Based Switching can further reduce individual processor electricity consumption by up to 30 percent.

Another area where you can make significant power savings is server power supplies themselves. That's because they can vary enormously in efficiency, especially under certain loads. Bad power supplies waste about half of the energy they consume (and thus the same again is used by cooling systems to dissipate the heat generated by this wasted energy). To compound this, power supplies running at a small fraction of their rated capacity are often even more inefficient. Look for power supplies with the 80 Plus certification -- this means that the power supply will run at at least 80 percent efficiency even when running at just 20 percent of its full capacity.
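The difference this makes is easy to quantify. Comparing a poor supply that wastes half its input against an 80 Plus unit, for components that draw 200 W (an illustrative load):

    # Wall-socket draw for a 200 W component load at two PSU efficiencies.
    # The 50 percent figure matches the "waste about half" claim above;
    # 80 percent is the 80 Plus floor. The 200 W load is illustrative.
    load_w = 200.0

    for label, efficiency in (("bad PSU", 0.50), ("80 Plus PSU", 0.80)):
        wall_w = load_w / efficiency
        wasted_w = wall_w - load_w
        print(f"{label}: draws {wall_w:.0f} W, wastes {wasted_w:.0f} W as heat")

    # bad PSU:     draws 400 W, wastes 200 W as heat
    # 80 Plus PSU: draws 250 W, wastes 50 W as heat
    # Every wasted watt must also be removed by the cooling plant.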

It's a Long Way to Tipperary

The answer to the question "How do you make your data center greener?" is similar to the traditional question from the Emerald Isle: "How do you get to Tipperary?" The answer in both cases is "If I were you, I wouldn't start from here." What this means is that while you can certainly make savings by switching power supplies and switching off unused machines, the real solution requires a total rethinking of the data center. This ranges from the design of the buildings and the cooling systems they contain, to the extensive use of virtualization to increase server utilization, all the way down to the use of energy-efficient equipment, from power supplies to smart, power-managed processors. It's not a cheap undertaking, but one that may prove vital for the survival of the data center, the corporation, and perhaps in a small way even the planet.

Hardware for Virtualization: Do's and Don'ts
By Drew Robb

Virtualization is catching on like never before. Just about every server vendor is advocating it heavily, and IT departments worldwide are buying into the technology in ever-increasing numbers.

    "The use of virtualization in the mainstream is now rel-atively commonplace, rather than just in developmentand test," said Clive Longbottom, an analyst at U.K.-based Quocirca. "In addition,business continuity based onlong-distance virtualization isbeing seen more often."

As a result, the time has come to more closely align hardware purchasing with virtualization deployment. So what are some of the important do's and don'ts of buying servers and other hardware for a virtual data center infrastructure? What questions should IT managers ask before they make selection decisions on servers? And how should storage virtualization gear be integrated into the data center?

Do's and Don'ts

There are, of course, plenty of ways to virtualize, depending on the applications being addressed. This article will focus on a typical case where infrastructure and business logic applications are the main targets.

With that in mind, one obvious target is memory. It is a smart policy to buy larger servers that hold more memory, to get the best return on investment. While single- and dual-processor systems can host multiple applications under normal circumstances, problems arise when two or more hit peak usage periods.

    "Our field experience hasshown that you can host more

    VMs [virtual machines] perprocessor and drive higheroverall utilization on the server

    if there are more resourceswithin the physical system,"said Jay Bretzmann, worldwidemarketing manager, System xat IBM. "VMware's code per-mits dynamic load balancingacross the unused processorresources allocated to sepa-rate virtual machines."

He advised buying servers with more reliability features, especially those that predict pending failures and send alerts so that workloads can be moved before the system experiences a hard failure. Despite the added cost, organizations should bear in mind that such servers are the cornerstone of any virtualization solution. Therefore, they deserve the lion's share of investment.

    "Businesses will lose significant productivity if the con-solidation server fails," said Bretzmann. "A hard crashcan lead to hours of downtime depending upon whatfailed."

Longbottom, however, made the point that an organization need not spend an arm and a leg on virtualization hardware, as long as it doesn't go too low end.

    "Cost of items should be low these items mayneed swapping in and out as time goes on," said

    Longbottom. "But don't just go for cheapest kitaround make sure that you get what is needed."

This is best achieved by looking for highly dense systems. Think either stackable within a 19-inch rack or usable as a blade chassis system. By focusing on such systems, overall cooling and power budgets can be better contained. Remember, too, not every server is capable of being managed in a virtual environment. Therefore, all assets should be recognizable by standard systems management tools.

Just as there are things you must do, several key don'ts should be observed as well. One that is often violated is that servers should not be configured with lots of internal storage.

    "Servers that load VMs from local storage don't havethe ability to use technologies like VMotion to moveworkloads from one server to another," cautionedBretzmann.

What about virtualizing everything? That's a no-no, too. Although many applications benefit from this technology, in some cases it actually makes things worse. For example, database servers should not be virtualized, for performance reasons.

    Support is another important issue to consider.

    "Find out if the adoption of virtualization will causeany application support problems," said Bretzmann."Not all ISVs have tested their applications with

    VMware."

Storage Virtualization

Most of the provisos covered above also apply to purchasing gear for storage virtualization.

    "Most of the same rules for classic physical environ-ments still apply to virtual environments it's really aquestion of providing a robust environment for theapplication and its data," said John Lallier, vice presi-dent of technology at FalconStor Software.

While virtual environments can shield users from hardware-specific dependencies, they can also introduce other issues. One concern when consolidating applications on a single virtualization server, for example, is that you may be over-consolidating to the detriment of performance and re-introducing a single point of failure. When one physical server fails, multiple virtual application servers are affected.

    "Customers should look for systems that can providethe same level of data protection that they alreadyenjoy in their physical environments," said Lallier.

He believes, therefore, that storage purchasers should opt for resilient and highly available gear that will keep vital services active no matter what hardware problems arise. In addition, Lallier suggests investing in several layers of protection for large distributed applications that may span multiple application servers. This should include disaster recovery (DR) technology so operations can quickly resume at remote sites. To keep costs down, he said users should select DR solutions that do not require an enormous investment in bandwidth.

As a cost-cutting measure, Lallier advocates doubling up virtual environments. If the user is deploying a virtual environment to better manage application servers, for example, why not use the same virtualization environment to better manage the data protection servers? As an example, FalconStor has created virtual appliances for VMware Virtual Infrastructure that enable users to make use of its continuous data protection (CDP) or virtual tape library (VTL) systems, which can be installed and managed as easily as application servers in this environment.

Of course, every vendor has a different take. NetApp provides an alternative to FalconStor using the snapshot technology available in its StoreVault S500. This storage array handles instant backups and restores without disrupting the established IT environment.

    "Useful products are able to host VMs over multipleprotocols, and the StoreVault can do it via NFS, iSCSIor FCP whatever your environment needs," saidAndrew Meyer StoreVault Product Marketing Managerat NetApp.

    "Don't get trapped into buying numerous productsfor each individual solution. One product that is flexi-ble with multiple options (can handle VMs, create a

    SAN, handle NAS needs, provide snapshots and repli-cation) may be a smarter investment as a piece ofinfrastructure." I

Why Tape Libraries Still Matter
By Drew Robb

Tape libraries aren't exactly a booming business or front-page news these days, but at the same time, they're not faring all that badly in the face of the disk-based backup onslaught. According to Freeman Reports, total revenue from all tape libraries declined 15.6 percent in 2006 compared to 2005, while unit shipments declined 4.5 percent.

Despite those statistics, tape users purchased more than 50 percent more capacity as they migrated to higher-capacity and higher-performance tape drives and cartridges. Thus, what looks like a fading industry on the surface is very much alive and kicking.

Revenue still amounted to a healthy $1.81 billion in 2006 and was expected to be $1.77 billion in 2007. According to Freeman Reports, it will rise to $2.15 billion by 2012. Within those numbers, older formats like 8-millimeter and DLT library sales continue to falter, offset by increased sales of LTO and half-inch cartridge libraries.

LTO has evolved into the dominant player, accounting for 88 percent of library unit shipments and 58 percent of library revenue. LTO capacity and throughput grew by leaps and bounds during the past few years. LTO-2 offered 200 GB native and 30 to 35 MB/s, whereas LTO-3 provides 400 GB and 80 MB/s, and the new LTO-4 delivers 800 GB and 120 MB/s. It is also the first open systems tape drive technology to incorporate native encryption.
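Those capacity and throughput figures translate into cartridge fill times as follows; this is a quick back-of-the-envelope check at native (uncompressed) speed, not vendor-verified math:

    # Time to fill one cartridge at native speed for each LTO generation
    # cited above (capacity in GB, throughput in MB/s).
    GENERATIONS = {
        "LTO-2": (200, 35),
        "LTO-3": (400, 80),
        "LTO-4": (800, 120),
    }

    for name, (capacity_gb, rate_mb_s) in GENERATIONS.items():
        hours = capacity_gb * 1000 / rate_mb_s / 3600
        print(f"{name}: {capacity_gb} GB at {rate_mb_s} MB/s -> {hours:.1f} h per cartridge")

    # LTO-2: ~1.6 h, LTO-3: ~1.4 h, LTO-4: ~1.9 h. Capacity has grown
    # faster than throughput, so a full-cartridge pass takes longer.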

With the growing popularity of disk-based backup and recovery solutions and the continued consolidation of tape library resources, however, tape is increasingly taking on a more specialized role in data protection. In many cases, tape is being used for disaster recovery and centralized backup.

    "Corporations must retain datafor long periods of time andensure compliance with internalservice-level agreements andgovernment regulations," saidMark Eastman, product linedirector, Tape AutomationSystems for Quantum. "As a

    result, customers are demanding higher security, capac-ity, performance and reliability across their tape invest-ments. Automation platforms incorporating the latest-

    generation LTO-4 technology deliver on these impor-tant features."


Quantum

On the vendor side, the top players are Sun Microsystems, IBM, and Quantum. Quantum gained serious ground in the enterprise tape library market with its acquisition of ADIC several years back.

At the high end of the scale, the Quantum Scalar i2000 has a starting price of $65,000. According to Eastman, the i2000 is designed to meet the rigors of high-duty-cycle data center operations and integration with disk-based backup solutions. It uses a standard 19-inch rack form and can hold 746 cartridges per square meter, as well as up to 192 LTO bulk loading slots in one library.

In the midrange, the Scalar i500 is priced beginning at $25,000. The entry-level Scalar 50 has a starting price of $8,000. One box contains 38 slots, and its Quantum StorageCare Vision data management and reporting tools enable users to monitor multiple tape libraries and disk systems from one screen.

    "Backup and restore capabilities are just as critical inbusy workgroups and remote environments as they areanywhere else," said Eastman. "The Scalar 50 tapelibrary provides them with an easy-to-use, reliable andscalable solution that simplifies the backup process."IBM Tape

According to IDC, IBM offers the leading enterprise tape drive in the TS1120. This tape drive comes with Encryption Key Manager for the Java platform (EKM) to encrypt information being written to tape.

    "EKM technology is used in high-end enterpriseaccounts by Fortune 100 companies in a variety ofindustries including banking, finance and securities,"said Master. "IBM's LTO tape offerings have achievednearly 900,000 drive shipments and over 10 million car-tridge shipments."

The company's highest-end tape library is the TS3500, which scales up to 6,800 slots and up to 192 LTO tape drives. Lower down the ladder comes the TS3310, which can deal with up to 398 slots and 18 LTO drives. The company offers various lower-end models, such as the TS3100 with 24 slots.

IBM also offers tape virtualization products, such as the TS7520 and TS7700. The TS7700 Tape Virtualization Engine is for mainframes and can be configured to participate in a grid environment.

    "Two or three TS7700s can communicate and replicatewith each other over an IP network," said Master. "Thisarrangement helps reduce or eliminate bottlenecks inthe tape environment, supports the re-reference of vol-umes without the physical delays typical to tape I/O,helps increase performance of tape processes, andhelps to protect data and address business continuityobjectives."

Sun StorageTek

Like the other big vendors, Sun provides encryption for tape systems. The StorageTek T10000 tape drive, for example, includes this feature and has a starting price of $37,000.

At the high end on the tape library side is the StorageTek SL8500, with a starting price of $195,830. It can house up to 56 petabytes (70,000 slots) and can be shared among mainframe, Solaris, AS/400, Windows, Linux and Unix systems.

Lower down the line is the StorageTek SL500 (starting at $16,400), an 8U rackmount tape automation model that scales from 30 to 575 LTO slots and can deal with multiple cartridge types, such as LTO and SDLT/DLT-S4. Its maximum capacity is around 460 terabytes (uncompressed).

    "We are seeing strong adoption of the scalablelibraries in the distributed and small business space, asevidenced by continued growth of the SL500," saidAlex North group manager for tape at SunMicrosystems. "The SL500 is particularly good for suchapplications as e-mail servers, database applicationsand file servers."


Green Tape

As for the future of tape, these vendors are committed to it and believe it will continue to play an important role. In fact, as green data center trends strengthen, tape usage will accelerate.

    "Tape storage TCO is as much as an order of magni-tude less expensive than disk storage," said BruceMaster, senior program manager, Worldwide TapeStorage Systems Marketing at IBM. "Its consumption ofenergy for power and cooling is anywhere from 20 to100 times less expensive than disk storage." I

Facilities Management Crosses Chasm to the Data Center
By Paul Rubens

It wasn't so long ago that the facilities management (FM) team stalked the corridors of office buildings with greasy blue coats and large bunches of keys. That image is now as out of date as carbon paper and typing pools: Today's facilities manager is more likely to be found in a white short-sleeved shirt behind a 21-inch flat-screen monitor, looking at CAD drawings and updating an asset database in a high-tech basement lair.

The role of the FM department has changed, too. If you are involved in planning and running a modern data center, it's a good idea to get facilities management involved. Today's FM departments have much to offer data centers and the administrators that manage them. Working with them helps facilitate a flexible data center that is green and energy-efficient. Together, the two departments enable the data center to supply the desired IT services to the people who need them, at close to optimal cost.

First, let's clear up some basics: The facilities management department does not dictate what technology is used in the data center. That's an IT decision, and nothing will change that. "Essentially, facility management is about power, cooling and fire protection, and also, where data centers are concerned, physical access controls," said Kevin Janus, vice president of the International Facility Management Association (IFMA) IT Council. "It is not involved in what servers you run, but it is concerned with the environment in which they will live."

A facility manager can help with a number of environmental factors, purely because he has a complete overview of a building and its current and planned future uses -- something IT staff probably lack. "Obviously you don't want the IT department creating a data center when there are kitchens on the floor above, because of the danger of leaks," Janus points out.

But the real issues are power and air conditioning. Air conditioning is the number one consumer of power. Servers, as anyone who has worked in a data center can testify, generate a great deal of heat. The high-density racks that are becoming increasingly common in today's data centers consume vast amounts of power, and a similar amount of power is needed to dissipate this heat. That makes the planning and layout of the data center, and the provision of power and air conditioning equipment, crucial.

    This falls clearly under the FM purview.

How can FM help? In an organization of any size, it's likely that the facility managers will have a computer-aided facility management (CAFM) package at their disposal. Among other things, a CAFM will usually store CAD floor plans of the building and a database of assets. For the data center, this will likely include plans showing the layouts of racks. In many cases, the database will hold the location of each server, the applications running on these servers, and information about the departments that "own" each application, where relevant.

Software tools can also carry out calculations to work out the amount of power that must be supplied in a given area of the data center, and the corresponding cooling capacity needed to remove the resulting heat. Information like this is clearly invaluable for the IT department, because no matter what IT strategy is in place, the available power and cooling capacity presents constraints. The only way the IT department can be free to install and run the hardware it wants is if FM has already put in place the power and cooling it requires. And the only way for FM to know the IT requirements is for the two departments to communicate regularly.
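A simplified version of the calculation such tools perform: convert the electrical load in a zone into the heat that must be removed (roughly 3.412 BTU/hr per watt), then size the cooling in tons (12,000 BTU/hr per ton). The rack count and per-rack draw below are illustrative assumptions:

    # Rough cooling-capacity estimate for one zone of a data center.
    # Ten racks at 8 kW each are made-up numbers; the conversion factors
    # are the standard ones (3.412 BTU/hr per watt, 12,000 BTU/hr per ton).
    racks = 10
    kw_per_rack = 8.0

    load_w = racks * kw_per_rack * 1000
    heat_btu_hr = load_w * 3.412
    cooling_tons = heat_btu_hr / 12000

    print(f"IT load: {load_w / 1000:.0f} kW")
    print(f"Heat output: {heat_btu_hr:,.0f} BTU/hr")
    print(f"Cooling needed: {cooling_tons:.1f} tons, before any safety margin")
    # -> 80 kW, ~272,960 BTU/hr, ~22.7 tons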

    "The IT strategy may call for increased use of virtualiza-tion two of three years down the line, but they won'tnecessarily know what implication that has for the facili-ty, especially in terms of A/C," said Chris Keller, a pastpresident of the IFMA's IT Council. "But it's also impor-

    tant to look at how the strategy will impact on peopleand the office layout elsewhere in the building. If the ITdepartment wants to replace printer stations with inex-pensive printers on every desk, then more power andA/C is going to be needed throughout the building orit won't be possible."

When a data center space is initially populated, the FM department can help design the layout of the racks to maximize the efficiency of the cooling systems. Detailing current thinking on hot and cold aisles and other energy-efficient data center layout techniques is beyond the scope of this article. However, bear in mind that input from the FM department and the software tools at its disposal makes it possible to design a data center layout that will use significantly less energy, and cost less to keep at an acceptable operating temperature, than a badly laid out one.

What about making changes to existing data centers? "The contents of racks have to be managed, and if the A/C can't handle it, then racks or individual servers have to be moved," said Keller. "Then the question is, how do you know which servers you are moving, and how do you keep track of where they are going? The FM department has, in a CAFM database, the place to store that information, and can offer it to the IT department. There's no point in the IT department doing it all again when the information already exists. From the CEO's point of view, redundancy is not the way to go," he said.

The message from the basement, then, is very clear. By involving the FM team in the planning and layout of your data center, it can provide the tools and resources to ensure the data center will be practical to run, and as green and energy-efficient as possible. By keeping lines of communication open between the two organizations, the data center will be flexible enough to accommodate the changes that you have planned, so you can deliver the services you want in the way you want, without worrying about where you are going to put the boxes, whether you are going to run out of power, or whether the servers might melt when a new system is deployed.

This content was adapted from Internet.com's InternetNews, ServerWatch, and bITa Planet Web sites. Contributors: Paul Rubens, Drew Robb, Judy Mottl, and Jennifer Zaino.
