Enabling a Dynamic Datacenter with Microsoft Virtualization

Datacenter Virtualization 2008 - 2009

For the latest information, please see http://www.microsoft.com/virtualization

Published: June 2008

Contents

Introduction
Dynamic IT
    Characteristics of a Dynamic IT Organization
    Core Infrastructure Optimization Model
    The Dynamic Datacenter
    Virtualization as Part of Core Infrastructure Optimization Models
    Products Engineered for a Dynamic Datacenter
Datacenter Challenges
    Controlling Costs
    Improving Availability
    Increasing Agility
Virtualization Scenarios
    Scenario 1: Server Consolidation
    Scenario 2: Business Continuity
    Centralized, Policy-Based Management
Virtualization Technologies
    Windows Server 2008
    Microsoft System Center
    Host Clustering and Quick Migration
Interoperability and Partnerships
    XenSource/Citrix Partnership
    Microsoft Cross-Platform Extensions
    ESX Interoperability
Tools to Get Started
    Microsoft Assessment and Planning: MAP Tool
    Microsoft Integrated Virtualization ROI Tool
    Server Virtualization with Advanced Management
Why Choose Microsoft?
Conclusion

Introduction

This whitepaper examines strategies for moving an organization toward more dynamic IT using datacenter virtualization technologies. Datacenters evolve from manual and reactionary to automated and proactive, and from cost centers to strategic assets, through a series of stages. This paper will show how virtualization is a key technology to help datacenters move through those stages, reduce cost, increase security and availability, and enable a more agile business.

This paper provides concrete scenarios showing how virtualization can enable server consolidation and business continuity. This paper also examines the technologies that underlie those solutions, which include Windows Server 2008, Hyper-V, and System Center.

Finally, this paper explains the partnerships that Microsoft has formed with organizations such as XenSource/Citrix to ensure that Microsoft supports heterogeneous environments including Linux workloads, and the engineering investments that Microsoft has made to support non-Microsoft technologies such as Xen and ESX Server.

Dynamic IT

Technology accumulates in the datacenter over time, leaving many organizations in a position where their IT resources are fully allocated simply maintaining what they have, with no time left over to focus on strategic initiatives. All legacy applications must be maintained. IT organizations have to support existing capabilities while meeting new business needs. Often viewed as a cost center, IT must meet these challenges while operating under tight financial constraints.

Characteristics of a Dynamic IT Organization

As organizations move toward dynamic IT, the capabilities of IT change, and the role of IT in the organization grows. Dynamic IT organizations have the following characteristics:

Aligned

First, dynamic IT is aligned with the business. This seems obvious, but creating this synergy is often easier said than done.

Becoming dynamic ensures that IT is thoroughly connected with business requirements, by aligning the new goals that the business generates and wants to embrace with the actual IT implementation. Being dynamic means an expanded point of view and a willingness to embrace new players in the IT life cycle – for example, a business architect or analyst. It’s very important to maintain a robust real-time connection between business requirements and IT, making sure that you can connect and synchronize the system used predominantly by those business architects and analysts with technology management solutions in your organization.

Adaptable

Systems must be adaptable to change. Industry trends and new technologies generate significant interest, but IT must be able to evaluate new technologies with business needs in mind, and rapidly incorporate new technology as part of strategic initiatives. While moving forward, IT must not jeopardize prior investments and tools that are already in place providing critical functionality.

Efficient

IT organizations must stay within budget. Simply purchasing expensive technology does not enable a dynamic datacenter, especially if such technology ends up as "shelfware." While investments should be expected as organizations move from reactive, manual approaches to proactive, automated processes, these investments need to be made with key success criteria and expected payoff defined from the outset.

As IT moves from being viewed as infrastructure to being a business asset that provides information for decision makers, and becomes a key component in new business initiatives, IT can garner additional budget, as IT is seen as enabling profit rather than simply keeping the lights (or e-mail) on.

Empowering People

These elements combine to empower people within the organization. Helping to make the enterprise "people ready" means allowing people access to the information they need. It means making sure that IT services become literally like a dial tone: computing on demand, wherever people need it – in various form factors and with all the technologies that they require.

Core Infrastructure Optimization Model

Microsoft is helping businesses break out of reactive IT and move toward a vision of automated dynamic systems and applications. To enable organizations along this path, Microsoft has created the Core Infrastructure Optimization Model. This model defines four optimization levels (Basic, Standardized, Rationalized, and Dynamic) for each capability. The characteristics of these optimization levels are as follows:

Optimization Level 1: Basic

The basic infrastructure and platform is characterized by manual, localized processes; minimal central control; and non-existent or unenforced IT policies and standards regarding security, backup, image management and deployment, compliance, and other common IT standards. There is a general lack of knowledge regarding the details of the infrastructure and platform currently in place and which tactics will have the greatest impact to improve it.

The overall health of applications and services is unknown due to a lack of tools and resources. Data is stored in file shares and personal drives with disparate search tools. Records management is through manual, paper-based processes. There is no vehicle for sharing accumulated knowledge across IT.

Customers benefit substantially by moving from a basic level to a standardized level—dramatically reducing costs through developing standards, policies, and controls with an enforcement strategy, automating many manual and time-consuming tasks, adopting best practices, and aspiring to make IT a strategic asset rather than a burden.

Optimization Level 2: Standardized

The standardized infrastructure and platform introduces controls through the use of standards and policies to manage desktops and servers and how machines are introduced to the network, and through the use of Active Directory® directory services to manage resources, security policies, and access control. Customers in a standardized state have realized the value of basic standards and some policies, yet they are still quite reactive.

Generally, all patches, software deployments, and desktop services are provided through medium touch with medium to high cost. However, these organizations have a reasonable inventory of hardware and software and are beginning to manage licenses.

Content is consolidated and records retention is managed using disconnected repositories with basic search capabilities.

Security measures are improved with a locked-down perimeter, but internal security may still be a risk.

[Figure: The Core Infrastructure Optimization Model. As an organization moves from Basic through Standardized and Rationalized to Dynamic, IT evolves from a cost center, to an efficient cost center, to a business enabler, and finally to a strategic asset.]

Customers benefit by moving from this standardized state to a rationalized state with their infrastructure and platform by gaining substantial control and having proactive policies and processes that prepare them for the spectrum of circumstances from opportunity to catastrophe. Service management becomes a recognized concept and the organization is taking steps to implement it.

Optimization Level 3: Rationalized

The rationalized infrastructure and platform is where the costs involved in managing desktops and servers are at their lowest, and where processes and policies have matured to begin playing a large role in supporting and expanding the business. Security is very proactive, and responses to threats and challenges are rapid and controlled.

The use of zero-touch deployment minimizes costs, time to deploy, and technical challenges. The number of images is minimal, and the process for managing desktops is very low touch. Organizations at a rationalized level have a clear inventory of hardware and software, and only purchase the licenses and computers they need. Document and records management and search are considered strategic enablers for the business and are integrated with one or more business productivity infrastructure investments, and IT has defined processes and procedures to provision search integration with new line-of-business applications.

Customers benefit on a business level by moving from this rationalized state to a dynamic state. The benefits of implementing new or alternative technologies to take on a business challenge or opportunity far outweigh the incremental cost. Service management is implemented for a few services with the organization taking steps to implement it more broadly across IT.

Optimization Level 4: Dynamic

Customers with a dynamic infrastructure and platform are fully aware of the strategic value their infrastructure provides in helping them run their business efficiently and stay ahead of competitors.

Costs are fully controlled; there is integration between users and data, desktops, and servers; collaboration between users and departments is pervasive; and mobile users have nearly on-site levels of service and capabilities regardless of location.

Processes are fully automated, often incorporated into the technology itself, allowing IT to be aligned with and managed according to business needs. Additional investments in technology yield rapid, measurable benefits for the business.

Customers benefit from increasing the percentage of their infrastructure and platform that is dynamic through heightened levels of service, competitive and comparative advantage, and the ability to take on bigger business challenges. Service management is implemented for all critical services, with service level agreements and operational reviews.

Self-Evaluation

Currently, most organizations are at the basic stage, where IT is seen as a cost center. As organizations adopt standard technologies and practices, IT can become an efficient cost center. But organizations really want to move beyond seeing IT as a cost center; they want to rationalize IT so it becomes a business enabler. Eventually, organizations want IT to be dynamic – a strategic asset that provides a competitive advantage.

As organizations move through the optimization models, they find it easier to lower and control IT costs; they’re able to increase availability, security, and the agility of the business, shortening the time from idea to implementation.

Microsoft has developed a self-assessment tool that you can use to determine your current optimization level. It’s recommended that you assess your organization before proceeding with virtualization solutions. This will help you and your organization identify virtualization initiatives that will provide the most value at each level of the optimization model.

The Dynamic Datacenter

A datacenter comprises physical hardware and, potentially, virtual machines, application workloads, and models that govern service levels, regulatory compliance, and other IT and business policies.

Physical Layer

At the physical layer, it's important for the dynamic datacenter to be able to provision physical systems efficiently. This includes configuring bare-metal hardware and installing and configuring software, from the operating system through the workloads, without IT personnel resorting to low-level scripting. Once systems are provisioned, they need to be patched and kept up to date without manual intervention. Finally, organizations need to be able to multicast configurations to provision numerous servers rapidly. Microsoft provides this functionality to the dynamic datacenter with System Center Configuration Manager.

Virtual Layer

With virtualization, there's another layer of provisioning for the dynamic datacenter. This includes the provisioning of the hypervisor and the virtual machines. With Windows Server 2008, Microsoft provides the Hyper-V hypervisor as a feature of the operating system, and Hyper-V is enabled through a server role.
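
As an illustration, the role can be enabled from an elevated command prompt. This is a minimal sketch; the role and component identifiers shown are the ones we understand shipped with Windows Server 2008, but verify them against your build:

    # Sketch: enabling the Hyper-V role on a full installation of
    # Windows Server 2008 (role identifier assumed to be "Hyper-V").
    servermanagercmd -install Hyper-V

    # On a Server Core installation, the optional component is enabled
    # with ocsetup (component name assumed to be Microsoft-Hyper-V):
    start /w ocsetup Microsoft-Hyper-V

    # A restart is required before the hypervisor loads.
    shutdown /r /t 0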

As a consumer of its own technology, Microsoft has fully virtualized the TechNet and MSDN Web sites using Hyper-V, realizing significant cost savings. These sites serve 11.5 million and 15 million visitors per month, respectively, and Hyper-V has proven stable and high performing in this environment. Microsoft is continuing to roll Hyper-V out across its datacenters.

Application Layer

With just hardware virtualization, you can get great benefits from server consolidation; but if you have thousands of physical servers, that will result in thousands of virtual servers. While this will save space and power and will help availability, there are additional benefits that can be achieved with application virtualization.

By separating the operating system from the applications, an organization may find it needs only ten or twenty base images for hundreds or thousands of servers. Through application streaming technology, applications can be streamed to systems on demand. This is viable today with many desktop applications, and Microsoft is investing in engineering to allow server workloads to stream to servers when needed.

This level of application virtualization allows a single base image to be patched, and all instances using that base image get the benefit, without patching each instance individually. This also allows applications to be serviced, patched, and migrated, without costly uninstall/reinstall or upgrade operations that may make the application unavailable.

Model Layer

Applications typically deploy across many servers. Most applications require three to five servers, while some require hundreds. A model cohesively brings together those applications, servers, and configurations. It also allows the people who build applications to understand the application components and configure them in a standard way.

This starts with the business analyst, who comes up with the application requirements. An architect then defines the application architecture and deployment model. Developers implement the application, and it is deployed into the environment as dictated by the model. The model can also apply governance rules.

Microsoft has started to apply this model-driven approach – in particular, in Operations Manager, Virtual Machine Manager, Configuration Manager, and the Visual Studio development tools – and model-driven operations is an area where Microsoft will continue to invest. In this environment, IT defines the models and the models drive the datacenter. The model directs how the operating system and applications are pulled together, and the applications are composited on the fly.

Management

Microsoft brings datacenter management under one roof with the System Center suite of products. Microsoft has heard from customers that they want one set of management tools to manage their physical and virtual environments, and that virtualization solutions and management tools need to support a heterogeneous environment. Microsoft has gone in exactly that direction by partnering with XenSource/Citrix, supporting Linux workloads, and managing Virtual Server, Hyper-V, Xen, and ESX environments.

Virtualization as Part of Core Infrastructure Optimization Models

Virtualization technology is a key factor in helping organizations optimize their IT. Organizations at a basic optimization level can realize power-saving goals and can substantially improve resource utilization. Virtualization can assist in application testing, staging, and moving workloads into production. While IT may still be viewed as a cost center, virtualization can help that center become much more efficient.

[Figure: As an organization progresses from Basic through Standardized and Rationalized to Dynamic, virtualization helps it reduce total cost of ownership, increase availability, and enable agility.]

Virtualization is also crucial when it comes to simplified backup and even disaster recovery, reducing downtime caused by catastrophic events from days to hours or even minutes. Virtualization can ensure that applications remain available, independent of hardware servicing. Virtualization greatly simplifies increasing the resources available to applications. At this stage, IT is no longer seen as a cost center but as an empowering agent that enables business goals and increases agility.

In the most advanced organizations, business units can acquire their own infrastructure through self-service provisioning. Dynamic provisioning can automatically bring new resources on- and off-line as the workload demands. Migration of workloads can happen on the fly, with no interruption to users. Problems can be detected and mitigated with minimal manual effort.

Products Engineered for a Dynamic Datacenter

A truly dynamic datacenter utilizes a variety of technologies and best practices to optimize operations. Microsoft’s offerings extend well beyond hardware virtualization, providing the technologies that organizations need. Individually, technologies provide critical functionality, and in combination they provide the functionality needed for dynamic operations.

Terminal Services virtualizes the presentation layer, allowing administration and productivity independent of location. Profile virtualization untethers users from specific desktop hardware. Server virtualization allows for consolidation and other datacenter optimizations. Application virtualization disconnects applications from a particular operating system instance. Desktop virtualization allows users to access their desktop from anywhere, and provides datacenter performance and connectivity for user workloads. System Center provides interoperable management.

[Figure: Microsoft's virtualization portfolio under System Center management: desktop virtualization (Windows Vista Enterprise Centralized Desktop), presentation virtualization, server virtualization, profile virtualization (document redirection, offline files), and application virtualization (Microsoft Application Virtualization, App-V).]

Datacenter Challenges

The main challenges to datacenters are well known. Many datacenters are seen as cost centers and are charged with the task of providing the necessary services for the least expense. Hard costs typically come in the form of power, square footage, hardware, and administrative staff. When IT is seen as an asset and a business enabler, datacenters are expected to be extremely efficient and to provide services with relatively little administrative staff. Staff is expected to focus more on strategic priorities and less on day-to-day operations.

Many datacenter services are expected to be “always on,” ready to meet the needs of a distributed and often global workforce. Maintenance windows are exceedingly small, and the IT organization is expected to comply with internal service level agreements. A high value is placed on any servicing that can be performed without service disruptions. Datacenters are also required to be secure and to comply with applicable regulations. Security breaches and vulnerabilities affect availability, result in large expense, expose the company to liability, and damage the company’s reputation. To maintain security, patches must be applied on a regular basis with minimal to no impact on availability.

When seen as a strategic asset, IT is expected to facilitate company agility. Successful businesses seek to implement new strategies, products, and services at great speed. Measurable initiatives are set, and IT must provide business intelligence services to decision makers. To meet these needs, IT is expected to provision systems rapidly. If a workload spikes, IT must allocate resources with no service disruption. Changes must be implemented quickly, without the datacenter devolving into a hodgepodge of undocumented one-off configurations. IT must be able to swiftly certify new applications for operation and ensure that updates and upgrades do not break existing workloads.

Controlling Costs

The "low-hanging fruit" in many datacenters is server consolidation. By consolidating servers, datacenters can see an immediate reduction in power use. Datacenters can keep some unused servers as spare capacity and decommission others to free up precious floor space and reduce cooling requirements.

The first step in server consolidation is converting appropriate physical servers to virtual servers. Servers with single workloads and low utilization are the most logical initial candidates for consolidation. Server consolidation can be especially valuable for legacy workloads that are tied to discontinued hardware.

Green IT

The U.S. Department of Energy has said that the datacenter is the fastest-growing energy consumer in the United States today, with 61 billion kilowatt-hours going toward datacenter power consumption and a projected ten to fifteen additional power plants needed by 2011 to keep up.

According to Gartner Research, energy costs could soon account for more than 50 percent of the total information technology budget for a typical datacenter.

Windows Server 2008 Power Efficiency

While power consumption is often viewed as a hardware issue, Microsoft has made significant engineering investments to ensure that Windows Server 2008 uses energy efficiently. As a result, Windows Server 2008 out of the box uses approximately 10 percent less energy than Windows Server 2003 running an identical workload.

This is largely because Windows Server 2008 includes updated support for Advanced Configuration and Power Interface (ACPI) processor power management (PPM) features, including support for processor performance states (P-states) and processor idle sleep states on multiprocessor systems. These features simplify power management in Windows Server 2008 and can be managed easily across servers and clients using Group Policies.
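
As an illustration, the active power plan can be inspected and changed with the in-box powercfg utility; in a datacenter, the same preference would typically be distributed through Group Policy rather than set per server. A minimal sketch (the GUID shown is the well-known identifier of the Balanced plan):

    # Sketch: checking and setting the active power plan on Windows Server 2008.
    powercfg -list                 # enumerate the available power schemes
    powercfg -getactivescheme      # show the scheme currently in effect

    # Activate the Balanced scheme, which lets the OS use processor P-states
    # (381b4222-f694-41f0-9685-ff5bb260df2e is the well-known Balanced GUID).
    powercfg -setactive 381b4222-f694-41f0-9685-ff5bb260df2e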

Virtualization Power Savings

While a 10 percent power savings is significant, virtualization provides an opportunity for a vastly greater impact. One of the primary goals of almost all forms of virtualization is making the most efficient use of available system resources. Microsoft's Hyper-V dramatically improves capacity utilization because it allows for the consolidation of underutilized servers. This translates to less space required, less cooling necessary, and fewer kilowatt-hours of power – all of which saves money and reduces the environmental footprint.

Microsoft’s measurements with Hyper-V show a near one-to-one energy savings for each server consolidated. In other words, the power consumption of the host OS does not substantially increase as guests are added.

To put these savings into perspective, consider these actual measurements, which compare the power consumption of 10 physical IIS Web servers with that of 10 virtualized IIS servers running on Hyper-V.

Physical/Virtual Comparison Data

Server Setup                                       Average Watts   kWh/year   Cost     KG of CO2
Standalone IIS x10                                 5,001           43,839     $4,007   34,084
One Hyper-V server with 10 IIS7 virtual machines   512             4,490      $410     3,491
Savings                                            4,489           39,349     $3,597   30,593
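
The annual figures follow directly from the measured wattage: average watts × 8,766 hours in a year (365.25 days × 24), with cost and CO2 derived from per-kWh factors implied by the table (roughly $0.091 per kWh and 0.78 kg of CO2 per kWh – both inferred here, not stated in the source). A quick check of the standalone row, as a PowerShell sketch:

    # Sketch: reproducing the table's annual figures from the measured wattage.
    $watts      = 5001                      # average draw of 10 standalone IIS servers
    $kwhPerYear = $watts * 8766 / 1000      # 8,766 hours = 365.25 days x 24
    $cost       = $kwhPerYear * 0.0914      # $/kWh rate implied by the table
    $co2        = $kwhPerYear * 0.7775      # kg CO2/kWh factor implied by the table
    "{0:N0} kWh/year, `${1:N0}, {2:N0} kg CO2" -f $kwhPerYear, $cost, $co2
    # -> 43,839 kWh/year, $4,007, 34,085 kg CO2 (matches the table within rounding)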

Within Microsoft's own IT department, which services more than 100,000 employees and contractors, there have been tremendous savings in both test/development and production virtualization implementations. As shown in the table below, the savings go beyond just power. Virtual machines allocate disk space only as needed, resulting in lower overall storage requirements. The conversion from a physical to a virtual system also greatly lowers costs by reducing cabling needs and the number of servers and racks required, among other costs.

Microsoft Test and Development Datacenter Savings

           Number of Servers          Hard Drive Space   Rack Space   Power
Physical   477 systems (~$5k each)    19 terabytes       30 racks     525 amps
Virtual    20 systems (~$20k each)    8 terabytes        2 racks      8 amps
Savings    ~$2,000,000                11 terabytes       28 racks     517 amps

[Chart: Annual power consumption (kWh/year) of virtual versus physical servers at 1, 4, and 10 servers, on a scale of 0 to 50,000 kWh/year.]

Improving Availability

An IT organization is constrained by the skills and knowledge possessed by its staff. When different solutions require specialized skill sets, the organization can become strained. Initially, virtualization was new and different, and it required specialized skills and training. Virtual management tools often provided little or no functionality for the physical environments and workloads.

But virtualization has matured, and Microsoft has worked to ensure that administrators can manage physical and virtual environments using existing skills and knowledge. Microsoft’s virtualization is provided through the familiar “server role” metaphor, and System Center tools are designed to provide consistency across heterogeneous environments. This includes managing a variety of operating systems (including Windows, UNIX, and Linux) and a variety of virtualization technologies (including Virtual Server, Hyper-V, and ESX).

Microsoft Virtualization also improves availability by building on top of Windows Clustering and enabling “quick migration” of virtual machines between physical hosts. These technologies allow you to service and patch the host OS without incurring downtime for the guest workload. Decoupling the workload from the hardware ensures that the guest OS can be migrated if the host fails or needs servicing. System Center’s automated patch management keeps systems up to date, and baseline monitoring keeps hosts and guests from drifting from a defined baseline configuration.

System Center Data Protection Manager uses the same technology to back up the host, guest virtual machines, and workloads. For example, DPM can provide continuous data protection for a SQL Server or Exchange workload running in a virtual machine, and can back up the virtual machine image for disaster recovery.

Virtual Machine Manager and Operations Manager can monitor host utilization, guest performance, and application performance; can recommend migration of a guest to a host with more resources; and can even automate the move.

In combination, these technologies ensure that your datacenter applications remain secure and available.

Increasing Agility

As the perception of IT moves from cost center to strategic asset, IT becomes recognized as an enabler of business agility. Companies that rapidly respond to market changes and opportunities need IT to provide the infrastructure that will power new initiatives. This includes speedy provisioning of computing power for development, test, and production operations. In some cases, departments may even be able to provision their own infrastructure without requiring IT assistance. This self-service can support rapid prototyping and afford the services needed for development and quality assurance (QA).

Any long-lived organization has legacy workloads that entail chronic, time-consuming IT support. As IT resources are diverted to procure discontinued hardware and complete lengthy certification processes, fewer IT resources are available for strategic initiatives. Virtualization liberates the IT organization from these and other chronic issues related to legacy applications. Because applications are isolated from the hardware, IT is free to host legacy virtual machines on the latest hardware. Virtualization can also simplify backup and recovery as well as other common tasks.

Virtualization Scenarios

This next section provides a walkthrough of two real-world scenarios (server consolidation and business continuity) in which Microsoft's virtualization helped datacenters move toward dynamic IT. Centralized, policy-based management is required to manage the physical and virtual infrastructure that both scenarios depend on efficiently.

• Scenario 1: Server Consolidation
• Scenario 2: Business Continuity
  » High Availability
  » Disaster Recovery
• Centralized, Policy-Based Management

The server consolidation scenario focuses on achieving lower costs through server consolidation. This includes reducing hardware, space, and power costs, as well as reducing management complexity.

The business continuity scenario focuses on maximizing system uptime and availability through server virtualization. This includes reducing the impact of disruptive events, enabling disaster recovery, and streamlining maintenance. It also includes dynamic resource allocation and streamlined workload provisioning to efficiently support changing business needs.

The centralized management scenario examines management complexity and shows how centralized, policy-based management brings the datacenter under control.

Scenario 1: Server Consolidation

With greater demand on IT to solve business challenges, datacenters quickly fill to capacity, and each new server purchase increases capital and operating expenditures as well as power and cooling costs. At the same time, servers are underutilized. Typically, server workloads consume only 5 percent of their total physical capacity, wasting hardware, space, and electricity. Because of application compatibility issues, IT has to separate applications by running them in different silos and on different servers, resulting in significant server sprawl. Provisioning new servers is a lengthy, labor-intensive process measured in days and months, making it difficult for IT to keep pace with the much faster rate of business growth and change.

What Is Server Consolidation and Provisioning?

Server consolidation is performed by converting physical servers to virtual machine (VM) files that can be centrally stored and managed, allowing for dynamic deployment based on load and available resources. The number of required physical machines is reduced, while server utilization and business agility are dramatically improved.
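
In a VMM-managed environment, this conversion is exposed as a physical-to-virtual (P2V) job that can be run from the console or scripted. The sketch below is illustrative only: it assumes the Virtual Machine Manager 2008 PowerShell snap-in and cmdlet names as we understand them, the server names and paths are hypothetical, and the real New-P2V cmdlet also takes volume and network configuration parameters omitted here.

    # Sketch: converting a physical server to a virtual machine with the
    # VMM 2008 PowerShell snap-in. Names and paths are hypothetical.
    Add-PSSnapin Microsoft.SystemCenter.VirtualMachineManager

    $vmm    = Get-VMMServer -ComputerName "vmm01.contoso.com"
    $creds  = Get-Credential                            # admin account on the source machine
    $target = Get-VMHost -ComputerName "hyperv01.contoso.com"

    New-P2V -VMMServer $vmm -SourceComputerName "legacyweb01" -Credential $creds `
            -VMHost $target -Name "legacyweb01-vm" -Path "D:\VMs"
    # (Additional required parameters, such as volume selection, omitted.)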

Benefits: By consolidating multiple workloads onto a single hardware platform via server virtualization, you can maintain a one workload/one server ratio while reducing physical server sprawl. Your business will be fully supported with less hardware, resulting in lower equipment costs, lower electrical consumption (thanks to reduced server power and cooling), and less physical space required to house the server farm.

Virtualization can also simplify and accelerate provisioning. The acquisition of workload resources and hardware can be decoupled. Adding the capability required for a particular business process (say, a Web commerce engine) becomes streamlined and immediate. In an advanced virtualized environment, workload requirements can be self-provisioning, resulting in dynamic resource allocation.

While virtualization-based server consolidation can provide many benefits, it can also add complexity if the environment is not managed properly. The savings from hardware consolidation could be offset by increases in IT management overhead. Because creating VMs is so easy, an unintentional and unnecessary sprawl can result that far exceeds physical server sprawl and that outpaces the tools used to manage VMs. A properly managed virtual infrastructure, however, automatically determines which servers are the best candidates for virtualization, converts them to virtual machines, and provisions them to the right hosts in minutes, rather than the weeks or months it takes to procure and configure physical servers manually.

[Figure: Microsoft server virtualization. A non-virtualized infrastructure of physical servers is consolidated into a managed virtualization infrastructure, where System Center Virtual Machine Manager provisions virtual servers from a VM library on a storage area network.]

Scenario 2: Business Continuity

The business continuity scenario examines technologies used for maximizing system uptime and availability through server virtualization. This includes delivering high availability through technologies such as clustering, reducing the impact of disruptive events, enabling disaster recovery, and streamlining maintenance. It also includes dynamic resource allocation and streamlined workload provisioning to efficiently support changing business needs.

Scenario 2.1: High Availability

Traditionally, all layers of computing environments – hardware, OS, applications, storage – have been static, configured to interact properly with and support a specific computing solution. Components are installed in particular computers, resulting in a tightly bound system that does not adapt well to changes. Creating new capability entails procuring and configuring the hardware, software, and interfaces.

What Is High Availability Using Virtualization?

In a virtualized stack, each element is logically isolated and independent. By separating the different layers within the stack, you facilitate greater flexibility and simplified change management – you no longer need to configure each element in order for all to work together. Computing components are essentially turned into on-demand services that are available instantly. This makes it easy to dynamically add, update, and support all elements of the infrastructure, creating the foundation for utility computing – and a much more nimble business. Organizations use virtualization to create a more dynamic server infrastructure, enabling them to fulfill their SLAs, increase the availability of their server infrastructure, and avoid disruptive events.

Benefits: Disruptive events and server downtime are reduced when virtualization is introduced, meaning increased availability of your systems to your employees – and your business to your customers. When workloads do go down, however, they are quickly and automatically migrated to an online server. Virtualization allows you to maintain an instant failover plan that provides business continuity throughout disruptive events.

[Figure: Microsoft server virtualization in a high-availability configuration: active virtual hosts with a standby virtual host (N+1), spanning HA and non-HA physical servers.]

Scenario 2.2: Disaster Recovery

As every enterprise knows – typically from firsthand experience – hardware and software failures, natural disasters, and even planned maintenance result in downtime that can bring business to a halt. Not only does this interruption lead to frustrated end users and overwhelmed IT departments, it can also result in damaged brand names and the loss of critical information and revenue. Implementing a reliable, rapid recovery strategy can be time-consuming and expensive. You must maintain recovery equipment, often in a separate location, that mirrors your production environment. This means that upgrades and changes to your primary and recovery systems must occur simultaneously. Because of the difficulty and time involved, many companies simply don't support comprehensive business continuity or disaster recovery plans to cover all their devices, data, and applications.

What Is Business Continuity/Disaster Recovery?

Virtualization can simplify efforts to maintain bulletproof continuity and disaster recovery strategies for all these assets. By compartmentalizing workloads, you prevent one application from affecting the performance of another or causing a system crash. Even less stable legacy applications can be operated in a secure, isolated environment.

A holistic virtualization strategy also allows you to maintain an instant failover plan that provides business continuity throughout disruptive events. By enabling you to convert OS and application instances into data files, this approach can help automate and streamline backup, replication, and transfer – providing more robust business continuity and speeding recovery in the case of an outage or natural disaster.

Organizations use virtualization to create a more efficiently managed server infrastructure, which reduces disruptive events, simplifies disaster and recovery planning, and decreases the costs associated with backing up servers.

Benefits: Easy data backup, redundant infrastructure, and replication ensure that the impact of any disaster on your business is greatly reduced. You’ll also discover that flexibility on a day-to-day basis is increased when workloads are shifted between physical servers, enabling your organization to perform maintenance without disrupting service. This approach provides data protection of Virtual Server hosts and virtual machines, regardless of which operating system they are running, while automatically minimizing outage of the protected virtual machines.

[Figure: A globally managed virtualization infrastructure: System Center Virtual Machine Manager coordinates the virtual environment, with System Center Data Protection Manager deployed at both the primary and recovery sites.]

Centralized, Policy-Based Management

Managing thousands of desktops, applications, and servers is incredibly complicated and requires vast resources. In traditional, static environments, where every layer of the stack is linked to another layer, much manual labor is necessary to provision, upgrade, change, or remove elements of the infrastructure. For example, in order to install, terminate, or even troubleshoot applications, IT often is required to take possession of each client device and conduct a tremendous amount of testing and QA before confirming that everything works properly.

Virtualizing the entire computing infrastructure provides tremendous time and cost savings, as well as flexibility benefits. However, attempting to separately manage each layer of the stack and each instance within those layers (such as individual virtual machines) creates a much more complex situation than is necessary. Using different tools for virtualized resources can result in duplicate or competing processes for managing resources, adding complexity to the IT infrastructure. This can undermine the benefits of virtualization. A virtualized world that isn’t well managed can be less reliable and perhaps even more expensive than its nonvirtualized counterpart.

What Is Centralized, Policy-Based Management?

Centralized, policy-based management – of both virtual and physical assets – lets IT handle enterprise-wide provisioning and changes from a central location. This greatly reduces the resources and time needed to administer the infrastructure, and provides a unified toolset that manages both Microsoft applications and third-party virtualization platforms such as VMware.

Benefits: With virtualization, you will realize an enormous reduction in the resources and time needed to administer your business’s computing infrastructure. This will allow you to simplify your support requirements, making you much more agile and responsive to business needs.

In addition, with a unified toolset that manages both Microsoft and third-party virtualization platforms (such as VMware and Xen), your management approach becomes far more consistent and sophisticated.

[Figure: Third-party solutions manage only the virtual layer of a third-party virtualization infrastructure. Microsoft centralized, policy-based management through System Center (Virtual Machine Manager, Configuration Manager, Operations Manager) spans the workload layer, the virtual layer (including third-party hypervisors), the physical layer, and storage management.]

Virtualization Technologies

At this point, you've seen the vision of the dynamic datacenter, common datacenter challenges, and scenarios for addressing those challenges. Next, you will see a more in-depth examination of the technologies that come together to provide these virtualization solutions.

The foundation for datacenter virtualization is Windows Server 2008, which includes the Hyper-V hypervisor operating system feature. The hypervisor is installed by the familiar administrative task of configuring a server role. Windows Server 2008 was designed for interoperability, and Hyper-V was specifically engineered to be a great hypervisor for Windows, Linux, and UNIX guests.

Unified and consistent management is provided by the System Center family of products. Virtual Machine Manager provides the administrative console for provisioning and maintaining virtual machines. Operations Manager monitors physical and virtual environments and provides guidance to optimize IT operations. Configuration Manager allows the quick provisioning of physical servers, along with automated patching for physical and virtual environments. Data Protection Manager provides the foundation for backup, restore, and disaster recovery, and through a single tool allows IT to back up virtual machines and their internal workloads.

Windows Server 2008

Windows Server 2008 provides many advantages for an organization. Active Directory integration allows you to use the same directory management features for virtual machines and physical machines, and permits you to delegate management of the virtual environment and machines using the same techniques and policies that you currently use to delegate management of physical machines.

With 64-bit technology and SMP support, virtual environments scale to meet the needs of demanding workloads. With support for up to four processors per virtual machine, your virtual machines get the most performance from multithreaded applications.

Hyper-V Hypervisor

The actual hypervisor is a very thin layer of code on top of the hardware that presents a very small attack surface. The hypervisor was developed under the industry-leading Microsoft Security Development Lifecycle, which ensures product team security education, threat modeling, code reviews, static analysis, fuzz and penetration testing, and a robust security response.

There are two kinds of hypervisors: monolithic and microkernel. A monolithic hypervisor is a relatively thick layer between the guest operating systems and the hardware. Monolithic hypervisors carry their own hardware drivers, which are different from the hardware drivers in the guest operating systems. The hypervisor controls guest access to processors, memory, and input/output (I/O), and isolates guests from one another.

Because a monolithic hypervisor is relatively large and carries multiple drivers, it presents a significant attack surface. If the hypervisor is compromised, through either the hypervisor code or the third-party drivers that it loads, the entire physical host and all guests can be compromised, too.

Rather than accepting this unnecessary risk, Microsoft developed Hyper-V using microkernel architecture. In this model, the hypervisor is a thin layer between the guests and the hardware. The hypervisor provides simple partitioning functionality that leverages virtualization extensions to the processor. Guest operating systems use their own native drivers. This means that the hypervisor contains no third-party code that could introduce vulnerabilities. The microkernel hypervisor also supports more hardware, as OEMs already produce OS drivers and need not produce separate hypervisor drivers.

With a guest using its own drivers, the size of the trusted computing base (TCB) is reduced, as guests are not routed through parent partition (or Dom-0) drivers.

Microsoft believes the microkernel approach is best, as it ensures that all hypervisor code is Microsoft code produced under the Security Development Lifecycle, presenting the smallest attack surface possible. As OEMs are not required to produce hypervisor drivers, more hardware is available, and the possibility of systems performing differently when virtualized is diminished. Modern processors contain virtualization extensions, which allow the hypervisor to be a much thinner software layer.

Microsoft System Center

Virtualization technology is only a portion of the virtualization solution. All datacenter operations require management tools for both the physical and the virtual layers. Datacenters also require the provisioning of software, as manual provisioning is not adequate to meet the needs of agile, cost-conscious businesses. Further, datacenters require operational monitoring, alerts, and problem mitigation. Finally, datacenters require quick and granular backup and recovery, scaling to full disaster recovery scenarios as necessary. Microsoft provides this critical functionality through the System Center suite of products.

[Figure: Monolithic versus microkernel hypervisor architectures. The monolithic hypervisor hosts its own drivers between the hardware and the virtual machines; the microkernel hypervisor is a thin layer beneath a parent partition and child partitions, each of which uses its own drivers.]

Virtualization Management in System Center

Virtualization Management Capability             VMM   Operations Mgr   Configuration Mgr   DPM
Server consolidation through virtual migration    X
Virtual machine provisioning and configuration    X
Server health monitoring and management                  X
Performance reporting and analysis                       X
Patch management, software upgrades                                       X
Virtual machine backup and restore                                                           X
Disaster recovery                                                                             X

System Center Virtual Machine Manager

Microsoft understands that datacenters tend to be heterogeneous environments, often containing a mixture of operating systems, databases, and application workloads provided by a variety of vendors. Customers have said that it's critical for physical servers to host disparate operating systems and that they don't want separate management tools for these disparate workloads. IT operations simply don't want one set of management tools for Linux and another for Windows – one set of management tools for ESX and another for Hyper-V. Microsoft has listened to customers, and is proud that Virtual Machine Manager 2008 will manage Virtual Server, Hyper-V, Xen, and ESX as first-class hypervisors. This provides access to the entire virtual environment through one pane of glass.

Customers have also said that they value the ease of use of graphical management tools, as well as the wizards that make administrative tasks intuitive and consistent and that increase the productivity of IT personnel. Customers also need powerful scripting capabilities in order to perform consistent operations on hundreds or thousands of machines; scripting allows the unique circumstances and needs of individual businesses and datacenters to be addressed. Virtual Machine Manager 2008 provides both capabilities. At the end of every wizard function in Virtual Machine Manager 2008, you're presented with the option to save the wizard's actions as a PowerShell script. In fact, Virtual Machine Manager is built on top of PowerShell, ensuring that any operation performed by VMM is scriptable. This provides the ease of use that Windows administrators expect, along with the power to script complex operations customized for the needs of the datacenter.

It’s important to place virtual machines on physical servers that can provide the needed resources. Virtual Machine Manager’s intelligent placement recommends the best server for placement of a new machine and for migrating an existing workload with the goal of providing more resources. For Hyper-V virtualization, VMM allows instantaneous migration with the click of a button. When managing ESX, VMM allows you to perform live migrations using the same intelligent placement. Even live migration and other ESX operations are scripted as PowerShell scripts.
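
Because every VMM operation is backed by PowerShell, placement advice can be scripted as well as clicked. A minimal sketch, assuming the VMM 2008 cmdlet names as we understand them (Get-VMHostRating is the placement engine's rating cmdlet; the server and VM names are hypothetical):

    # Sketch: asking VMM's intelligent placement to rate hosts for a VM,
    # then migrating the VM to the top-rated host. Names are hypothetical.
    Add-PSSnapin Microsoft.SystemCenter.VirtualMachineManager
    Get-VMMServer -ComputerName "vmm01.contoso.com" | Out-Null

    $vm    = Get-VM -Name "payroll-vm"
    $hosts = Get-VMHost

    # Rate each candidate host for this VM and pick the highest rating.
    $best = Get-VMHostRating -VM $vm -VMHost $hosts |
            Sort-Object -Property Rating -Descending |
            Select-Object -First 1

    Move-VM -VM $vm -VMHost $best.VMHost -Path "D:\VMs"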

For mission-critical workloads, you can simply click a “high availability” checkbox and VMM will place the virtual machine on a clustered server. VMM handles all configuration on top of Windows Server 2008’s greatly simplified clustering.

Operations Manager

System Center Operations Manager allows datacenters to monitor their physical and virtual environments with a single tool. Operations Manager has long allowed datacenters to monitor operating systems and workloads, and this functionality continues whether the workload is running on a physical or a virtual server. In addition, Operations Manager allows datacenters to monitor the physical hosts running the virtual machines.

It’s important to monitor not just the overall CPU, memory, and I/O of hosts, but also the performance of the workloads within hosts in order to determine when more resources are needed so that workload performance meets requirements. System Center is designed with these scenarios in mind, and coordinates between the physical and virtual environments. Operations Manager also integrates with Virtual Machine Manager, providing tips that VMM can use when recommending virtual machine migration to more suitable hosts, and can even perform the migration automatically.

Data Protection Manager

System Center Data Protection Manager (DPM) forms the foundation of backup, restore, and disaster recovery functionality. DPM provides great functionality in its continuous data protection for the workloads themselves, ensuring that you never lose more than fifteen minutes of data from SQL Server, Exchange, SharePoint, and other workloads. DPM offers granular restore of such things as individual mailboxes, up to a complete bare-metal restore of machines.

DPM is capable of protecting virtual machines without hibernation or downtime. Using shadow copy-based block-level protection of your virtual disks, DPM delivers fast backup that does not consume inordinate amounts of disk space. This gives datacenters a single backup and recovery tool for both physical and virtual workloads.

With replication technologies, DPM facilitates disaster recovery by restoring system images to a backup datacenter.

Host Clustering and Quick Migration

Host clustering and quick migration allow IT organizations to minimize or eliminate downtime when servicing workloads.

For unplanned downtime, such as a physical host failure, quick migration can have the workload up and running on a new host within seconds. Clustering of the host allows the virtual workloads to fail over to the new host from shared disks. This works regardless of whether the guest operating system is Windows, Linux, or UNIX.

For planned downtime, guests can be clustered so that any node in the cluster can be taken off-line and serviced while other cluster nodes handle the workload. This allows guest patching and other maintenance without service interruption.

Virtual Machine Manager 2008 will allow you to designate, with a simple checkbox, mission-critical virtual machines as “high availability.” The appropriate server configuration will then be performed, and virtual machines will be properly sited and clustered.

Interoperability and Partnerships

CIOs report that, more and more, datacenters are using virtualization technology from multiple vendors. The main vendors (Microsoft, Citrix, and VMware) all have strengths that may make one more appropriate than another for a specific scenario. In addition, datacenters that already use virtualization are likely using VMware but are eager to introduce competing technologies that promise cost savings and unified management.

To address this need, Microsoft has focused on interoperability. Microsoft supports Windows, Linux, and UNIX guest operating systems to ensure that an organization can virtualize its existing workloads onto a single hypervisor technology. Microsoft is also leading the industry with management support for disparate hypervisors, allowing organizations to choose the best hypervisor technology for specific workloads and to manage them through a single pane of glass.

To meet the needs of customers, Microsoft has formed strategic partnerships with Citrix and certain open source projects, and has built interoperability to make certain that ESX operates as a first-class hypervisor in Virtual Machine Manager.

XenSource/Citrix Partnership

Microsoft has partnered with Citrix to provide first-class support for Xen-enabled Linux workloads on Hyper-V through the Linux Integration Components. With these components, Linux operating systems achieve the same near-physical performance as Windows virtual workloads by avoiding hardware emulation and using the Virtual Service Provider (VSP), Virtual Service Client (VSC), and VMBus. This allows Hyper-V to host Windows and Linux workloads side by side, and ensures that those workloads have strong performance and scalability characteristics.

In addition, Citrix is enabling XenEnterprise to manage Hyper-V, in much the same way that Microsoft has enabled Virtual Machine Manager to manage non-Microsoft hypervisors. Finally, Citrix's development of XenDesktop allows customers to connect to virtual desktops running in the datacenter.

Microsoft Cross-Platform Extensions

Cross-platform extensions allow Operations Manager to manage and monitor Linux and UNIX operating systems, as well as open-source Web servers and databases such as Apache and MySQL. Microsoft has worked to ensure that Linux administrators would be comfortable using the cross-platform extensions, in part by building the extensions on top of such industry-standard technologies as WS-Management and OpenPegasus. Microsoft will be contributing code to (and has joined the steering committee for) the OpenPegasus project.

ESX Interoperability

Microsoft recognizes that datacenters are heterogeneous environments; it is becoming the norm for organizations to use a variety of hypervisors. Customers will migrate workloads to Hyper-V as it is adopted, since it is substantially more cost-effective than other hypervisor solutions, but they have said they want to manage their hosts and guests using a single set of management tools. Microsoft has stepped up to provide first-class support for ESX, Hyper-V, and Virtual Server in Virtual Machine Manager. This allows organizations to develop one set of virtualization skills and to manage all workloads through a single pane of glass. Microsoft's System Center products integrate with one another to provide the best provisioning, management, and monitoring functionality available.

Tools to Get Started

Microsoft provides a variety of tools to help you analyze your existing environment, calculate the cost and ROI of virtualization, and perform an implementation.

Microsoft Assessment and Planning: MAP Tool

A successful IT project rollout begins with understanding the network environment. The MAP tool works as a remote inventory engine: install MAP on a single desktop or server, connect it to the IT network, and supply the correct credentials, and it uses WMI and other protocols to find and assess computers on the network. MAP can generate technology-specific assessments and recommendation reports in both Microsoft Word and Microsoft Excel.
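
MAP's internals are not documented here; as a rough illustration of the kind of agentless WMI inventory it performs, the machine names and output file below are hypothetical.

```powershell
# Rough illustration of agentless WMI inventory in the spirit of MAP:
# query OS and hardware details remotely, with no agent on the target.
"server01", "server02" | ForEach-Object {
    $os = Get-WmiObject -ComputerName $_ -Class Win32_OperatingSystem
    $cs = Get-WmiObject -ComputerName $_ -Class Win32_ComputerSystem
    New-Object PSObject -Property @{
        Name     = $_
        OS       = $os.Caption
        CPUs     = $cs.NumberOfProcessors
        MemoryGB = [math]::Round($cs.TotalPhysicalMemory / 1GB, 1)
    }
} | Export-Csv -Path inventory.csv -NoTypeInformation
```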

Some of the features of the MAP tool include:

• Integrated portal with automated tools and guidance, from desktops to servers
• Agentless inventory of clients, servers, applications, devices, and roles
• Technology migration, readiness assessment, and proposal generation

Additionally, the MAP tool provides:

• Hardware and device compatibility for Windows Vista, Windows Server 2008, and Microsoft Office System 2007
• Server virtualization candidates for consolidation with Hyper-V and Virtual Server 2005 R2
• Infrastructure assessment for the environment in order to leverage Microsoft Application Virtualization (formerly SoftGrid)

You can access the MAP tool at http://www.microsoft.com/map.

Microsoft Integrated Virtualization ROI Tool

The Microsoft Integrated Virtualization ROI Tool is designed to help organizations make the business case for virtualization solutions and easily compare the cost of Microsoft's solutions to competing technologies.

The ROI Tool can assist organizations in rapidly determining their particular ROI with Microsoft's virtualization solution. The tool allows you to enter information about the business's current infrastructure, including hardware, operating systems, and workloads. It then helps determine the virtualized infrastructure and estimate the cost to implement it, taking hardware costs and software licensing into consideration in order to provide comprehensive pricing. The ROI Tool also compares competitive products to illustrate the cost savings the Microsoft solution offers your business.
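
The tool's internal model is not published; as a back-of-envelope sketch of the kind of consolidation arithmetic involved, with every figure invented for illustration:

```powershell
# Back-of-envelope consolidation savings; all figures are made up.
# 100 physical servers consolidated 8:1 onto virtualization hosts.
$serversBefore      = 100
$consolidationRatio = 8
$hostsAfter = [math]::Ceiling($serversBefore / $consolidationRatio)  # 13

$facilityCostPerServer = 1200   # yearly power, cooling, and space
$before = $serversBefore * $facilityCostPerServer
$after  = $hostsAfter * $facilityCostPerServer

"Annual facility cost before: {0:N0}" -f $before
"Annual facility cost after:  {0:N0}" -f $after
"Annual facility savings:     {0:N0}" -f ($before - $after)
```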

You can access the ROI Tool at http://www.microsoft.com/virtualization/roitool.

Server Virtualization with Advanced Management

Server Virtualization with Advanced Management is an offering from Microsoft Consulting Services that enables customers to maximize the value of their infrastructure investments through managed virtualization by providing a proven methodology, best practices, and the highest level of expertise in the industry.

Customers see increased IT system cost-efficiency through server consolidation; a reduction in hardware, space, and utilities costs; and centralized management of physical and virtual server assets. This offering drives greater IT operating efficiency through managed virtualization, helping to reduce costs, maximize system availability, and increase operational agility.

[Figure: scope of the offering, spanning virtual servers, virtual presentation, virtual applications, physical servers, backup and storage, and desktop infrastructure.]

Why Choose Microsoft?

Microsoft provides a comprehensive, end-to-end toolset for creating, provisioning, managing, and securing both the virtual and the physical infrastructures of your organization, including the management of third-party virtualization solutions.

Using familiar interfaces and common management consoles, an environment based on Microsoft technologies delivers the promised cost, service-level, and agility benefits while reducing the system complexity that can result from disparate point solutions. Your IT organization can harness the power of virtualization across the enterprise while simultaneously improving the efficiency and effectiveness of your operations.

Conclusion

Dynamic IT shifts the IT organization from being a cost center to being a strategic asset of the business. With Dynamic IT, common datacenter tasks are automated, freeing the IT organization from repetitive manual operations. As less IT time is consumed maintaining existing infrastructure, more time is available to focus on strategic initiatives.

Server consolidation is the first step in controlling costs. Reducing the number of physical servers saves on power, space, and cooling. However, virtualization can add complexity by requiring the administration of both physical and virtual servers. The key to reducing that complexity is unified tools that manage, monitor, and provision the physical and virtual environments. The Microsoft System Center suite provides unified management of the physical and virtual environments, the operating systems, and the applications.

Datacenters are heterogeneous environments. Virtualization will introduce even more heterogeneity, as companies introduce different hypervisors into the same datacenter. System Center is designed for heterogeneity, with the ability to manage Windows, Linux, and UNIX workloads, and Xen, ESX, Virtual Server, and Hyper-V hypervisors.

Microsoft's virtualization solutions let you maximize uptime and reduce the impact of disruptive events. Using quick migration and clustering, workloads can be kept available while servers are patched and hardware is serviced. System Center tools can monitor the physical and virtual environments and alert personnel to issues before they result in a service outage. Using Data Protection Manager, organizations can achieve near-continuous backup of virtual servers and continuous data protection of the workloads running on them. This allows organizations to use one tool to recover anything from something as small as an individual user's mailbox to something as large as an entire datacenter.

Microsoft provides Hyper-V as part of the Windows operating system, rather than as a completely separate technology. This leverages existing tools, skills, and hardware, and ensures seamless integration with technologies such as Active Directory. Microsoft provides the full virtualization solution, including server, desktop, presentation, application, and storage virtualization. Microsoft fully supports the entire stack, from the hypervisor through the operating system to the Microsoft server workloads. Integrated management ensures that you have a complete view of your operations through a single set of tools. Microsoft offers this technology at an affordable price, enabling a more rapid return on investment.

To move forward with virtualization to enable a dynamic datacenter, it’s important to ready your team. Your team should seek to understand virtualization solutions, and team members should see for themselves how Microsoft offers the lowest cost and the most integrated, interoperable management tools.

Your first tactical step should be performing a MAP analysis to determine the level of impact that virtualization could have on your organization. Next, use the ROI tool to discern the cost of a virtualized implementation and when that implementation would pay for itself.

The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication.

This white paper is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS OR IMPLIED, IN THIS DOCUMENT.

Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in, or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

© 2008 Microsoft Corporation. All rights reserved.

Microsoft, [Microsoft trademarks used in this white paper, listed alphabetically] are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.
