Utility computing, or the computer utility, is a service provisioning model in which a service provider makes computing resources and infrastructure management available to the customer as needed, and charges them for specific usage rather than a flat rate. Like other types of on-demand computing (such as grid computing), the utility model seeks to maximize the efficient use of resources and/or minimize associated costs. Utility computing is the packaging of system resources, such as computing, storage and services, as a metered service. This model has the advantage of a low or no initial cost to acquire computing resources; instead, resources are essentially rented.
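The metered, pay-per-use billing described above can be sketched in a few lines. This is a minimal illustration only; the resource rates and the function name `metered_cost` are hypothetical, not taken from any real provider.

```python
# A minimal sketch of metered, pay-per-use billing. The per-unit rates
# below are hypothetical, chosen only to illustrate the model.

def metered_cost(cpu_hours, gb_stored, cpu_rate=0.05, storage_rate=0.02):
    """Charge for actual usage, like an electricity meter."""
    return cpu_hours * cpu_rate + gb_stored * storage_rate

# A light user pays only for what was consumed, whereas a flat-rate
# plan would charge the same amount regardless of usage.
usage_bill = metered_cost(cpu_hours=10, gb_stored=5)
flat_rate_bill = 100.0
print(round(usage_bill, 2))         # 0.6
print(usage_bill < flat_rate_bill)  # True: the metered model wins for low usage
```

The contrast with the flat rate is the whole point of the utility model: costs scale with consumption rather than with capacity purchased up front.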
This repackaging of computing services has become the foundation of the shift to "on demand" computing, software as a service and cloud computing models that further propagate the idea of computing, application and network as a service.
There was some initial skepticism about such a significant shift.  However, the new model of computing is becoming mainstream.
IBM, HP and Microsoft were early leaders in the new field of utility computing, with their business units and researchers working on the architecture, payment and development challenges of the new computing model. Google, Amazon and others started to take the lead in 2008, as they established their own utility services for computing, storage and applications.
Utility computing can support grid computing, which has the characteristic of very large computations or sudden peaks in demand that are supported by a large number of computers.
"Utility computing" has usually envisioned some form of virtualization, so that the amount of storage or computing power available is considerably larger than that of a single time-sharing computer. Multiple servers are used on the back end to make this possible. These might be a dedicated computer cluster built specifically for the purpose of being rented out, or even an under-utilized supercomputer. The technique of running a single calculation on multiple computers is known as distributed computing.
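The idea of distributed computing mentioned above can be illustrated with a toy example: one calculation (summing a large range of integers) is split into chunks and handed to several workers. Here local threads stand in for separate machines; the function names are illustrative only.

```python
# A toy illustration of distributed computing: a single calculation is
# split into chunks, each processed by a separate worker. Local threads
# stand in for the multiple back-end computers described in the text.

from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def distributed_sum(n, workers=4):
    """Split [0, n) into equal chunks and sum each chunk in parallel."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Combine the partial results into the final answer.
        return sum(pool.map(partial_sum, chunks))

print(distributed_sum(1_000_000))  # same result as sum(range(1_000_000))
```

In a real utility setting the chunks would be shipped over the network to rented machines, but the split/compute/combine shape of the calculation is the same.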
The term "grid computing" is often used to describe a particular form of distributed computing, where the supporting nodes are geographically distributed or cross administrative domains. To provide utility computing services, a company can "bundle" the resources of members of the public for sale, who might be paid with a portion of the revenue from customers.
One model, common among volunteer computing applications, is for a central server to dispense tasks to participating nodes, at the behest of approved end-users (in the commercial case, the paying customers). Another model, sometimes called the Virtual Organization (VO), is more decentralized, with organizations buying and selling computing resources as needed.
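The first, centralized model can be sketched as a server holding a queue of pending tasks and handing one out to each participating node that asks. The class and method names here (`Dispatcher`, `request_task`, `submit_result`) are illustrative inventions, not the API of any real volunteer-computing system.

```python
# A minimal sketch of the central-dispatch model: a server keeps a queue
# of tasks and dispenses them to participating nodes on request.

from queue import Queue, Empty

class Dispatcher:
    def __init__(self, tasks):
        self._pending = Queue()
        for t in tasks:
            self._pending.put(t)
        self.results = {}

    def request_task(self):
        """Called by a node to fetch work; returns None when none is left."""
        try:
            return self._pending.get_nowait()
        except Empty:
            return None

    def submit_result(self, task, result):
        """Called by a node to report a finished result back to the server."""
        self.results[task] = result

# A participating node repeatedly asks for work until the queue is drained.
dispatcher = Dispatcher(tasks=[1, 2, 3])
while (task := dispatcher.request_task()) is not None:
    dispatcher.submit_result(task, task * task)  # stand-in computation

print(dispatcher.results)  # {1: 1, 2: 4, 3: 9}
```

In the commercial case the dispatcher would also meter each node's contribution, which is what ties this architecture back to utility-style billing.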
The definition of "utility computing" is sometimes extended to specialized tasks, such as web services.
Utility computing means "pay and use", with regard to computing power. Utility computing is not a new concept; it has quite a long history. Among the earliest references is:
"If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility... The computer utility could become the basis of a new and important industry."

— John McCarthy, speaking at the MIT Centennial in 1961
IBM and other mainframe providers conducted this type of business in the following two decades, often referred to as time-sharing, offering computing power and database storage to banks and other large organizations from their worldwide data centers. To facilitate this business model, mainframe operating systems evolved to include security and user-metering facilities. The advent of minicomputers changed this business model. As Intel and AMD increased the power of PC-architecture servers, data centers grew to contain thousands of them.
In the late 1990s, utility computing resurfaced. InsynQ, Inc. launched on-demand applications and desktop hosting services in 1997 using HP equipment. In 1998, HP set up the Utility Computing Division in Mountain View, CA, assigning former Bell Labs computer scientists to begin work on a computing power plant, incorporating multiple utilities to form a software stack. Services such as "IP billing-on-tap" were marketed. HP introduced the Utility Data Center in 2001. Sun announced the Sun Cloud service to consumers in 2000. In December 2005, Alexa launched the Alexa Web Search Platform, a Web search building tool for which the underlying power is utility computing; Alexa charges users for storage, utilization, etc. There are more applications in the market for specific industries. For example, PolyServe Inc. offers a clustered file system for Oracle and Microsoft SQL Server databases, high-performance computing, seismic processing, and content serving. The Database Utility and File Serving Utility enable organizations to independently add servers or storage as needed, retask workloads to different hardware, and maintain the environment without disruption.
In spring 2006, 3tera announced its AppLogic service, and later that summer Amazon launched Amazon EC2 (Elastic Compute Cloud). These services allow the operation of general-purpose computing applications. Both are based on Xen virtualization software, and the most commonly used operating system on the virtual computers is Linux, though Windows and Solaris are supported. Common uses include web applications, SaaS, image rendering and processing, and also general-purpose business applications.
- Cloud computing
- Computer service bureau
- Edge computing
- Grid computing