Federal agencies are steadily adopting cloud computing architectures for their IT services. But progress remains spotty, and considerable uncertainty about the technology persists in both the public and private sectors, said Dave McClure, assistant administrator of the Office of Citizen Services and Innovative Technologies at the General Services Administration.

McClure has a bird’s eye view of that progress. His office is responsible for overseeing and managing the GSA’s Federal Risk and Authorization Management Program (FedRAMP), a government-wide effort that provides standardized security assessment, authorization, and continuous monitoring of cloud products and services.

He and other government and industry officials shared their perspectives on the government’s efforts to procure and establish cloud computing ecosystems, and to adapt existing systems and applications to these new environments, during a symposium hosted by GTSI Corp. and Federal Computer Week in Washington, D.C., on July 18.

Security for cloud architectures remains a central concern for federal IT officials. FedRAMP’s trust-based, develop-once, use-often approach to security has helped address some of these concerns. So have efforts by FedRAMP program officials, working with the Department of Homeland Security, to deploy a continuous security monitoring system for its cloud architecture, McClure said. FedRAMP has also created standardized templates to help cloud providers develop their security processes.

McClure, however, also warned agencies that they cannot assume a cloud service provider is familiar with government security guidelines. This is a learning curve many private firms have had to climb, he said, resulting in varying levels of understanding of, and compliance with, federal regulations among commercial providers. Agencies need to be up front about “non-negotiable” security requirements with cloud service providers, including areas such as audits and personnel background checks.

“This is not faith-based computing,” McClure said.

While federal agencies are wrestling with the mechanics of moving to the cloud, the view from Congress is more segmented, said Benjamin Rhodeside, legislative assistant for Congressman Gerry Connolly, ranking member of the House Committee on Oversight and Government Reform’s Subcommittee on Technology, Information Policy, Intergovernmental Relations and Procurement.

While some regulatory committees are beginning to consider the impact of cloud computing, and there are some strong advocates for virtualization and the efficiencies it yields, cloud computing in general is not yet on Congress’s radar, he said.

What is currently lacking, and what would spark additional congressional interest and support, is solid data on federal-sector cost savings from cloud computing, Rhodeside said. “If you could bridge that gap, you’ll see interest,” he said.

What is available are practical success stories. The State Department, for instance, has created its own cloud architecture to connect the Web sites of its overseas embassies and missions. This allows the personnel at these foreign posts to focus on site content instead of worrying about the underlying architecture, said Ken Rogers, chief enterprise architect and director of the State Department’s IT Strategic Planning Office.

The department’s flagship cloud effort is the Foreign Affairs Network, which, besides supporting the organization’s embassies and missions, may soon begin offering platform-as-a-service capabilities to other agencies, such as the Department of Agriculture, Rogers said. The next step is for the network to bring other agencies onboard.

Three pilot programs are currently underway, including one with the U.S. Agency for International Development, which is using a platform as a service to support federal agencies, he said.

To help speed its transition to a cloud-based system, the State Department put together a white paper and launched a symposium on cloud computing to educate agency personnel and to change the culture, Rogers said. The department also has several additional efforts underway, including data center consolidation and the establishment of an internal clearinghouse to capture lessons learned.

The Department of Homeland Security is also aggressively pursuing cloud computing programs. It has launched 12 cloud programs with the goal of saving money, compensating for shrinking budgets, and replacing inefficient IT infrastructure, explained Keith Tripple, executive director of enterprise system development at the DHS. By moving IT processes to the cloud, the sprawling department and its constituent agencies now have more control over operating costs, he added.

IT has been traditionally viewed as a cost center in the DHS, Tripple said. With the move to cloud-based services, he noted that he now views IT as a revenue center. The DHS has recently issued a request for information from industry for cloud brokerage models. The goal is to ultimately select a firm to provide cloud-based environments and models to support the department and its agencies, he said.

In the long term, this model could extend across the federal government and fundamentally change how it consumes IT services. “This is a potential federal mechanism where we can have a dialogue about this,” Tripple said.

The DHS is also taking steps to move commodity IT services and mid-tier applications to its cloud architecture, with the goal of helping to create an ecosystem, Tripple said. Many of the Federal Emergency Management Agency’s IT services, for example, are now in the DHS’s public cloud environment. “Start small and get your mid-tier apps,” he said.