Saturday, March 5, 2011

Virtualization


Since its origin, Information Technology has been a resource for improving efficiency. From Morse code's role in the Civil War to Singapore's vastly improved shipping economy, Information Technology has provided the means for systematic improvement in communications. Revolving around the transmission and reception of data, Information Technology has generally thrived by providing ever greater convenience and speed of data access, and the accessibility and availability of information consistently push the competitiveness of global economics. The ability of an organization to provide customers with the fastest and highest quality results, whether in products or services, has birthed a perpetual influx of next-generation IT products. These products have delivered dramatic improvements in global communication and information efficiency, but the devices themselves have proven consistently wasteful in resources and power management. Global consumers have expressed an unquenchable thirst for ever-faster communication speeds while blindly ignoring the consequences of those improvements, and manufacturers, for the most part, have shown complete disinterest in the environmental impact of their products, focusing merely on satisfying the consumer's lust for speed. Recently, environmental concerns over these products have emerged, largely due to the recognition of global warming. These concerns have directed unwanted attention toward the magnitude of the IT industry's wastefulness and brought eco-unfriendly culprits to the forefront of environmental improvement efforts. From this industry-wide analysis, the supporting infrastructure of computer networks has been identified as one of the most significant contributors to environmental degradation. The sheer size of the intertwining networks of the Internet presents an ideal opportunity for environmental issues to arise, and network servers have made good on that opportunity, largely because of their immense quantity.
Server issues confronted by virtualization:
Networks and the Internet are accessible across the majority of the modern world. These massive webs of communication are supported by a conglomerate of information-hoarding datacenters, which store the information consumers demand. A datacenter is essentially a central location for groups of interlinked servers. Common networks are supported by a mesh of thousands of these datacenters containing an estimated forty-four million servers; the Internet powerhouse Google alone accounts for nearly 2% of the world's active servers (Nielson 2009). Not only does a standard server draw an average of 250 to 500 watts, these machines generally run on an uninterrupted cycle for the majority of their lives (West 2010). Through simple arithmetic, the low-end estimate of global server power consumption works out to more than 250 billion watt-hours, roughly 260 gigawatt-hours, every single day. Looking only at that low estimate, these constantly running machines burn through about as much energy in a day as Greenland consumes in a year (Central 2009). For devices that account for such a significant share of the world's power consumption, it would seem these machines would be strictly regulated for efficiency; this is far from the case. These millions of devices manage energy and resources at a level of wastefulness that would render any other machine unusable.
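To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python based on the figures quoted above (roughly forty-four million servers drawing 250 to 500 watts each, around the clock). The constants are the text's estimates, not measurements.

```python
# Back-of-the-envelope check of the server energy figures cited above.
# The inputs are the estimates quoted in the text (Nielson 2009; West 2010);
# they are illustrative, not measured values.

SERVERS_WORLDWIDE = 44_000_000   # estimated active servers worldwide
WATTS_PER_SERVER_LOW = 250       # low-end continuous draw per server, in watts
WATTS_PER_SERVER_HIGH = 500      # high-end continuous draw per server, in watts

def daily_energy_gwh(servers: int, watts: float) -> float:
    """Energy consumed per day, in gigawatt-hours, assuming 24/7 operation."""
    watt_hours_per_day = servers * watts * 24
    return watt_hours_per_day / 1e9  # Wh -> GWh

low = daily_energy_gwh(SERVERS_WORLDWIDE, WATTS_PER_SERVER_LOW)
high = daily_energy_gwh(SERVERS_WORLDWIDE, WATTS_PER_SERVER_HIGH)
print(f"Estimated daily consumption: {low:,.0f} to {high:,.0f} GWh")
# Low end: ~264 GWh per day, in the neighborhood of the 250+ GWh figure above.
```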
Servers run uninterrupted for the majority of their energy-consuming lives, yet they spend most of that time idle. Unless a server receives a client request for its services, it patiently waits to be called upon. As a result, servers are generally put to productive use only 5-15% of the time they are in operation (Kumar 2010). A server sitting idle gives the impression that the machine is consuming the bare minimum of resources; this could not be farther from the truth. While machines do modestly reduce their power consumption while idle, they still draw 60 to 90% of their full-workload power (Kumar 2010). Sadly, this blatant misuse of energy is only half the problem with server inefficiency. Even as these devices constantly burn through valuable energy, each machine barely scratches the surface of its capacity. Of the tens of millions of servers in operation, each leaves 90-95% of its capacity unused. This means we are running ten to twenty times more servers than we need, while each machine siphons an absurd amount of energy from the power grid that it never puts to work (Chu 2008). The combination of these inefficiencies results in each machine emitting roughly four tons of CO2 per year; collectively, servers pump about 175 million metric tons of CO2 into the atmosphere annually, more than Thailand's total emissions (Moore 2003). With the environment screaming for some kind of consolidation, a new breed of server has finally begun to take hold. An industry-wide push has targeted all of the issues listed above through newly designed virtualized servers.
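The utilization and emissions figures above can be tied together with a similarly rough sketch. The 10% utilization and 75% idle-draw values plugged in at the end are arbitrary points inside the ranges the text cites, chosen only for illustration.

```python
# Rough sketch of the waste described above, using the figures cited in the text:
# 5-15% utilization, 60-90% of full power drawn while idle (Kumar 2010), and
# roughly four tons of CO2 per server per year (Moore 2003). Illustrative only.

SERVERS_WORLDWIDE = 44_000_000
CO2_TONS_PER_SERVER_YEAR = 4

def fleet_co2_million_tons(servers: int, tons_each: float) -> float:
    """Annual CO2 output of the whole server fleet, in millions of metric tons."""
    return servers * tons_each / 1e6

def idle_waste_fraction(utilization: float, idle_power_fraction: float) -> float:
    """Share of a server's energy spent while it is doing no useful work."""
    active_energy = utilization * 1.0                      # busy hours at full power
    idle_energy = (1 - utilization) * idle_power_fraction  # idle hours at reduced power
    return idle_energy / (active_energy + idle_energy)

print(f"Fleet CO2: ~{fleet_co2_million_tons(SERVERS_WORLDWIDE, CO2_TONS_PER_SERVER_YEAR):.0f} million tons/year")
print(f"Energy spent idle (10% busy, 75% idle draw): {idle_waste_fraction(0.10, 0.75):.0%}")
```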
Virtualization background:
In response to the overwhelming necessity to improve the wasteful nature of servers, companies like Citrix and VMware are revolutionizing current methods of data storage through virtualization. Recognizing the opportunity for improvement early on, these companies set out with two goals in mind: reduce the necessity to maintain so many servers, and improve a server’s resource management. Pursuit of these goals has led to dramatic economic and environmental results; VMware’s servers have proven to be the next major milestone in IT development (VMware 2010). 

Server benefits provided by virtualization:
Targeting the tremendous monetary cost of servers, virtualization provides opportunities to dramatically reduce a company's overhead, which has worked heavily in favor of adopting these newer, environmentally friendly machines. The concept behind these improved devices is that, unlike their predecessors, they are capable of consolidating multiple systems onto one server; in essence, they significantly reduce the required number of physical machines. Traditionally, a company would have to host several separate servers to accommodate each of its database, ecommerce, web, and business applications. This led to major expenses not only in electricity and hardware, but also in building and maintaining the massive datacenters that housed them. With virtualization, that abundance of servers can be consolidated onto a single machine, with each application running simultaneously in its own completely separate environment (Chu 2008). From here, the improved hardware utilization becomes obvious. Rather than the previously noted 5-15%, consolidation boosts hardware usage to an astounding 85% while simultaneously decreasing energy consumption by 80% (Kumar 2010). Adequate use of these servers can yield consolidation ratios of anywhere from 10:1 to 20:1. The global fleet could theoretically shrink from forty-four million servers to roughly four million, drastically reducing the overall carbon footprint. With such a staggering reduction in energy and datacenter expenses, it is easy to see how businesses recoup their virtualization investments in as little as three to six months (Chu 2008). While virtualization has undoubtedly produced dramatic results in the way of a datacenter's wasted resources, its benefits are expanding to the corporate realm of networked desktops.
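As a rough illustration of what those consolidation ratios imply at global scale, a sketch along these lines can be used. The per-server wattage is the low-end figure quoted earlier, and the output is an order-of-magnitude estimate rather than a forecast.

```python
# Sketch of the consolidation arithmetic described above: collapsing lightly used
# physical servers onto virtualized hosts at 10:1 to 20:1 ratios (Chu 2008).
# The 250 W per-server figure is the low-end estimate cited earlier; all numbers
# are the text's estimates, used here only to show the scale of the savings.

SERVERS_WORLDWIDE = 44_000_000
WATTS_PER_SERVER = 250  # low-end continuous draw of a physical server

def hosts_after_consolidation(servers: int, ratio: int) -> int:
    """Physical hosts needed after packing `ratio` workloads onto each host."""
    return -(-servers // ratio)  # ceiling division

for ratio in (10, 20):
    remaining = hosts_after_consolidation(SERVERS_WORLDWIDE, ratio)
    retired_mw = (SERVERS_WORLDWIDE - remaining) * WATTS_PER_SERVER / 1e6
    print(f"{ratio}:1 -> {remaining:,} hosts remain, ~{retired_mw:,.0f} MW of draw retired")
```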
Desktop issues confronted by virtualization:
A typical corporate infrastructure may include hundreds of desktop computers requiring maintenance. Depending on the scale of the business, these desktops can be scattered hundreds or even thousands of miles apart. Traditionally managed through remote access, they demand endless man-hours to keep each and every system up to par individually. Simple tasks such as patching or updates can easily result in days or weeks of maintenance. Along with the financial impact, these requirements are the epitome of inefficiency. If an organization instead uses virtualization, it gains centralized management of all of its desktops: rather than requiring individual maintenance, they can be maintained as a whole. This approach has come to be known more commonly as "cloud" computing. Taking matters even further, virtualization combats the same idle-time power consumption issues that plague traditional servers.
Desktop benefits provided by virtualization:
For desktop computers in enterprise environments, the typical energy-saving approach is to put a machine to sleep during idle periods. A newly available virtualization system, LiteGreen, saves energy by migrating the user's desktop environment between the physical machine and a virtual machine on a server, depending on whether the machine is active or idle. This migration keeps the desktop "always on" and network connected even while the physical machine is powered down (Das 2010). Transitioning between the physical desktop and the server in this way has produced energy savings of 72%, compared to 32% with standard operating system power management (Das 2010).
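The 72% and 32% figures above come from the cited study; the sketch below is only a simplified model of why migrating idle desktops onto a server can beat ordinary sleep policies. Every wattage and duty-cycle value here is an assumption chosen for illustration, not a number from Das (2010).

```python
# Simplified model of the comparison described above. All wattages and duty
# cycles are assumptions for illustration; the measured results are in Das (2010).

DESKTOP_ACTIVE_W = 100   # assumed draw of a desktop while in use
DESKTOP_SLEEP_W = 5      # assumed draw while the physical machine sleeps
SERVER_SHARE_W = 10      # assumed per-VM share of a consolidated server's draw
ACTIVE_FRACTION = 0.25   # assumed fraction of the day the user is actually active

def always_on_wh() -> float:
    """Baseline: the desktop stays fully powered all day."""
    return DESKTOP_ACTIVE_W * 24

def os_power_mgmt_wh(sleepable_share_of_idle: float = 0.4) -> float:
    """Standard OS power management: the desktop can only sleep during the part of
    its idle time when nothing needs it to stay network-reachable (assumed 40%)."""
    idle = 1 - ACTIVE_FRACTION
    awake = ACTIVE_FRACTION + idle * (1 - sleepable_share_of_idle)
    asleep = idle * sleepable_share_of_idle
    return 24 * (DESKTOP_ACTIVE_W * awake + DESKTOP_SLEEP_W * asleep)

def litegreen_style_wh() -> float:
    """LiteGreen-style approach: the idle desktop migrates to a server VM, so the
    physical machine sleeps through all idle time yet remains 'always on'."""
    idle = 1 - ACTIVE_FRACTION
    return 24 * (DESKTOP_ACTIVE_W * ACTIVE_FRACTION
                 + (DESKTOP_SLEEP_W + SERVER_SHARE_W) * idle)

baseline = always_on_wh()
for name, wh in (("OS power management", os_power_mgmt_wh()),
                 ("Virtualized idle (LiteGreen-style)", litegreen_style_wh())):
    print(f"{name}: {wh:.0f} Wh/day, saves {(baseline - wh) / baseline:.0%} vs. always-on")
```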
Conclusion:
Ventures into virtualization have allowed companies like VMware not only to improve the IT industry's poor reputation for efficiency, but also to help organizations dramatically cut overhead. Corporations are now better able both to manage their systems and to minimize company-wide spending on hardware and electricity. From the perspective of any IT-dependent organization, virtualization is a blessing. On a larger scale, the results of virtualization should provide insight into the potential gains awaiting companies willing to confront economic and environmental issues together. While there are definite financial incentives in ventures like VMware and Citrix, the larger reward comes from the revelations such products provide. Until the poor utilization of resources is brought to consumers' attention, people will continue to blindly accept wasteful products; companies are generally content to provide merely sufficient products as long as consumers are willing to buy them, and they see no need to address a product's significant downsides until they are prodded into doing so. Virtualization has opened consumers' eyes to the flaws and inefficiency of traditional products. From this we can only hope that society's fixation on single metrics such as speed fades, and that consumers become more aware of a product's potential for improvement.


Works Cited
Book:
Central Intelligence Agency. (2009). The CIA World Factbook 2009. New York, NY: Skyhorse Publishing, Inc.
The CIA World Factbook is released annually and describes thorough statistics on many facets of the world’s nations. A primary information resource for the United States government, the CIA is a credible source of information. If our government finds it reasonable to rely on them for information, I suppose I can too.
Interview:
Michael West. (Personal communication, February 17, 2010). We spoke about the origins of server virtualization and the fact that it was geared toward improving servers' horrendous use of power.
Michael West is my cousin. He is a former Microsoft executive who left Microsoft in 2008 to work at VMware. He is highly knowledgeable in the realm of server functionality and virtualization, and that knowledge has led to a successful career in the virtualization industry.
Sites:
Dan Chu. (2008). What is Virtualization? [Video]. Retrieved February 20, 2011, from http://www.youtube.com/watch?v=MnNX13yBzAU
Dan Chu is the Senior Director of Product for VMware. He is involved in the company's inner workings and therefore knows the complete concept behind VMware and server improvement. He is at the forefront of the world's leading virtualization company; it would be preposterous to think his knowledge of virtualization was not reliable. As a side note, this is actually a pretty interesting video if you have not seen it.
Rich Miller. (May 14, 2009). Who Has the Most Servers? Data Center Knowledge. Retrieved February 21, 2011, from http://www.datacenterknowledge.com/archives/2009/05/14/whos-got-the-most-web-servers/
Data Center Knowledge (DCK) is a leading source of daily news and analysis about the data center industry. The company is staffed with numerous professionals in the realm of servers and data centers, which is the bulk of my paper’s content. Rich Miller is the company’s founder and lead editor. The company has been publishing information on data centers since 2005 and has amassed nearly 4,000 articles about nearly every facet of the industry.
Dave Nielson. (August 24, 2009). Google May Own More Than 2% of All Servers in the World. Pingdom. Retrieved February 21, 2011, from http://royal.pingdom.com/2009/08/24/google-may-own-more-than-2-of-all-servers-in-the-world/
Pingdom is a company dedicated to monitoring the news and events surrounding all major players in the world of the Internet. Owned by the former owner of Sweden's largest web hosting service, Pingdom is led by a man with unquestioned reputability in the industry. The site caters to clients in search of up-to-date news on any major hosting service or other Internet company. Their business relies on reputable information, so they are definitely a reliable resource.
Rakesh Kumar. (February 8, 2010). Energy Efficiency. VMware Green IT Energy. Retrieved February 21, 2011, from http://www.vmware.com/virtualization/green-it/
This is the website of the world's leading virtualization company. VMware has not only revolutionized the entire market, but birthed it in the late 1990s. This company knows more about virtualization than anyone else in the world; they are reputable.
VMware (NYSE: VMW), the global leader in virtualization. (2010). Company Information. VMware. Retrieved February 21, 2011, from http://www.vmware.com/company/
This is an additional citation for VMware, the world's leading virtualization provider. The citation even says it: they are the global leader in virtualization. How can you argue with that? Seriously though, they are a legitimate resource.
Katrina Moore. (2003). CO2 Emissions by Country. NationMaster.com. Retrieved February 22, 2011, from http://www.nationmaster.com/graph/env_co2_emi-environment-co2-emissions
NationMaster is a company that centralizes statistical information about nations. They cite all of their information and provide links to every cited source. The site is excellent for comparisons of national statistics, rather than a breakdown of each nation individually. Any question of whether the provided statistics are reputable can only be traced back to the cited source; fortunately, the sources they cite are generally governmental, such as the CIA and its Factbook.
Tathagata Das. (June 23, 2010). LiteGreen: Saving Energy in Networked Desktops Using Virtualization. Microsoft Research. Retrieved February 22, 2011, from http://research.microsoft.com/apps/pubs/default.aspx?id=131576
This source refers to an article on the Microsoft Research website. LiteGreen is their product, and they publicize the information surrounding its development, making note of their reasoning behind the product and its benefits. Microsoft is a leader in information technology and is highly regarded for its contributions to the industry. On top of this, they operate some of the largest and most secure server farms in the world, so I believe they know their stuff when it comes to servers and virtualization.
