More than 300 government agencies and 1,500 educational institutions are now using Amazon Web Services’ cloud computing platform, according to AWS Global Public Sector Vice President Teresa Carlson, speaking at a government customer and partner meeting in Washington, DC, Wednesday. AWS also announced several enhanced services for its GovCloud, a government-only cloud computing center.
The new customer milestones reflect not only the growing flexibility of service and pricing options for public sector customers, but also the growing maturity of intermediaries and brokers who are making it easier for government agencies to buy – and for Amazon to sell – an expanding assortment of on-demand computing services.
Several factors are driving adoption, according to Carlson and others who spoke at the summit.
One is the growing demand by agencies to conduct big data analytics and high performance computing, use web and collaboration applications, and archive and store data on a far greater scale than before.
A second factor is the increasing ease with which computing services can be acquired on an as-needed basis. That has been fueled in part by AWS’s decision last year to open a gated cloud computing center dedicated exclusively to government clients, one that meets a host of physical and logical security and access requirements that have left non-compliant cloud centers struggling for a foothold with federal agencies.
Amazon’s dedicated computing facility, and others such as one opened by Terremark, have received an added boost from recently issued government contract vehicles – and from the expertise of brokers such as Aquilent, which simplifies cloud procurement and eases implementation for agencies, according to Aquilent Chief Technology Officer Mark Pietrasanta.
But another factor gaining the attention of agency IT and acquisition officials is the ability to acquire cloud computing services from AWS at dramatically discounted prices in what is emerging as an increasingly attractive spot-pricing market.
Matt Wood, AWS product manager for data-intensive computing, explained that AWS and its customers benefit from lower shared expenses and pricing when AWS’s computing centers operate at full effective capacity. When there are gaps between optimal and actual usage, which is usually the case, AWS puts the unused capacity up for bid.
That can result in “computing instance” costs of as little as $0.25 per hour, or about a tenth of typical costs, Wood said, which suddenly makes high capacity computing available to researchers and others who would have otherwise been unable to afford it.
One of AWS’s partners, Cycle Computing, offered a recent example in which the Morgridge Institute for Research needed a multi-thousand-core cluster on short notice to complete an estimated 115 compute-years of analysis in about 7 days, working over 78 TB of genomic data. On-demand prices ranged from $0.45 to $1.80 per hour; corresponding spot prices ran from $0.037 to $0.15 per hour. The roughly one million hours of compute time and 78 TB of storage cost only about $116 per hour, or $19,555 total.
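Those figures hang together on a back-of-the-envelope check. A short sketch, using only the numbers quoted in the example above:

```python
# Sanity-check the Cycle Computing / Morgridge figures quoted above.
HOURS_PER_YEAR = 365 * 24

compute_hours = 115 * HOURS_PER_YEAR   # 115 compute-years of analysis
wall_clock_hours = 7 * 24              # finished in about 7 days
total_cost = 19_555.0                  # reported total, in dollars

print(f"compute hours: {compute_hours:,}")  # ~1.0 million hours, as reported
print(f"avg. concurrent cores: {compute_hours / wall_clock_hours:,.0f}")
print(f"effective rate: ${total_cost / wall_clock_hours:.0f}/hour")  # ~$116/hour
```

The arithmetic implies a sustained cluster of roughly 6,000 cores, and the reported $116/hour rate is simply the $19,555 total spread over the 7-day run.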
According to Carlson, the dynamic nature of computing costs is also fueling an options, or futures, market in which customers can lock in computing prices, for an upfront fee, for one to three years.
NJVC and Virtustream, for instance, just this month announced a new marketplace for unused cloud capacity and computing services that will soon be available to the federal government.
Customers in many cases can still end up paying less over time, said Aquilent’s Pietrasanta, because they only pay for what they use and because the price of computing services continues to drop.
Lower processing costs are helping to resolve a certain irony that agencies and organizations have confronted in recent years, said Wood. Cheaper computers and wider web and mobile usage spawned an explosion in data generation, but budget constraints limited the ability to collect, store and analyze that data.
“The irony of declining IT costs,” he said, was that while it permitted “escalating through-put, it hit a barrier of storage constraints.” The rise of affordable, as-needed computing, however, is now relieving those constraints, he said.
It’s also resulting in a shift in the role of cloud computing, from a tool for provisioning capacity to one that will increasingly provide specialized capabilities, said Paul Krein, a cloud specialist with Deloitte, who also spoke at the AWS summit.
“And we’re starting to see a subtle shift, from the capability cloud to the composite cloud, where you build new capabilities on top of cloud capabilities,” he said.
In the meantime, AWS said it is continuing to enhance its offerings for government users, announcing the following new or expanded services, to accommodate more types of workloads:
- High Performance Computing – Support for EC2’s Cluster Compute Eight Extra Large instances (60.5 GB of RAM).
- Elastic Load Balancing – Automatically distribute traffic across multiple EC2 instances.
- Auto Scaling – Automatically scale EC2 capacity up or down based on user-defined conditions.
- CloudWatch Alarms – Receive notification when a CloudWatch metric crosses a configurable threshold.
- Simple Notification Service (SNS) – Cloud-based notifications using a topic-centric publish and subscribe model.
- Simple Queue Service (SQS) – Reliable, highly scalable hosted queues for building distributed applications.
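The CloudWatch Alarms item above rests on a simple idea: an alarm fires when a metric stays past its threshold for some number of consecutive measurement periods. A minimal local sketch of that evaluation logic (this models the concept only and makes no AWS API calls; the function and sample data are hypothetical):

```python
# Local sketch of the threshold evaluation behind a CloudWatch-style alarm.
# Illustrative only -- this is not the AWS CloudWatch API.
def alarm_state(samples, threshold, periods):
    """Return 'ALARM' if the last `periods` samples all exceed `threshold`."""
    recent = samples[-periods:]
    if len(recent) == periods and all(s > threshold for s in recent):
        return "ALARM"
    return "OK"

cpu = [35, 40, 82, 91, 88]  # hypothetical CPU-utilization datapoints (%)
print(alarm_state(cpu, threshold=80, periods=3))  # prints "ALARM"
```

Requiring several consecutive breaching periods, rather than reacting to a single datapoint, is what keeps such alarms from firing on momentary spikes; in AWS, the alarm would then notify subscribers through a service like SNS.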