Utility computing
This repackaging of computing services became the foundation of the shift to "[[Code on demand|on demand]]" computing, [[software as a service]] and [[cloud computing]] models that further propagated the idea of computing, application and network as a service.
 
There was some initial skepticism about such a significant shift.<ref>{{citation | publisher=ZD Net | url=https://www.zdnet.com/article/on-demand-computing-what-are-the-odds/296135 | title=On-demand computing: What are the odds? | date=Nov 2002 | access-date=2017-11-03}}</ref> However, the new model of computing caught on and eventually became mainstream.
 
IBM, HP and Microsoft were early leaders in the new field of utility computing, with their business units and researchers working on the architecture, payment and development challenges of the new computing model. Google, Amazon and others started to take the lead in 2008, as they established their own utility services for computing, storage and applications.
IBM and other mainframe providers conducted this kind of business in the following two decades, often referred to as time-sharing, offering computing power and database storage to banks and other large organizations from their worldwide data centers. To facilitate this business model, mainframe operating systems evolved to include process control facilities, security, and user metering. The advent of minicomputers changed this business model by making computers affordable to almost all companies. As Intel and AMD increased the power of PC-architecture servers with each new generation of processor, data centers became filled with thousands of servers.
 
In the late 1990s, utility computing resurfaced. InsynQ, Inc. launched on-demand applications and desktop hosting services in 1997 using HP equipment. In 1998, HP set up the Utility Computing Division in Mountain View, California, assigning former Bell Labs computer scientists to begin work on a computing power plant, incorporating multiple utilities to form a software stack. Services such as "IP billing-on-tap" were marketed. HP introduced the [[HP Utility Data Center|Utility Data Center]] in 2001. Sun announced the [[Sun Cloud]] service to consumers in 2000. In December 2005, [[Alexa Internet|Alexa]] launched Alexa Web Search Platform, a Web search building tool whose underlying power is utility computing; Alexa charges users for storage, utilization, etc. There is space in the market for specific industries and applications, as well as other niche applications powered by utility computing. For example, PolyServe Inc. offers a [[clustered file system]] based on commodity server and storage hardware that creates highly available utility computing environments for mission-critical applications, including Oracle and Microsoft SQL Server databases, as well as workload-optimized solutions specifically tuned for bulk storage, high-performance computing, vertical industries such as financial services, seismic processing, and content serving. The Database Utility and File Serving Utility enable IT organizations to independently add servers or storage as needed, retask workloads to different hardware, and maintain the environment without disruption.
 
In spring 2006, 3tera announced its AppLogic service, and later that summer Amazon launched [[Amazon EC2]] (Elastic Compute Cloud). These services allow the operation of general-purpose computing applications. Both are based on [[Xen]] virtualization software, and the most commonly used operating system on the virtual computers is Linux, though Windows and Solaris are supported. Common uses include web applications, SaaS, image rendering and processing, as well as general-purpose business applications.