When it comes to moving apps to the cloud, the National Aeronautics and Space Administration has a distinct advantage over most other civilian agencies: in-house technology expertise.

NASA’s scientists and engineers have built the agency’s own private cloud, called Nebula, which will host most of the applications and services NASA is moving to the cloud.

To be sure, NASA has been ahead of the rest of the government in the cloud-computing game. In 2009, when the Obama administration first unveiled its cloud-computing initiative, NASA was already developing its cloud program. It began as an effort to cut costs and establish a more efficient computing model, one that would let scientists process, share and store mission-critical information more effectively.

NASA values its “engineering expertise and it is critical to nurture the technical knowledge of our in-house employees so they can make sound technical recommendations or decisions [on cloud initiatives],” Tsengdar Lee, NASA’s acting chief technology officer for information technology, told Breaking Gov.

Mixed with that in-house expertise is the knowledge that NASA’s private-sector IT contractors bring to the agency. “We have teams of people, both in-house and contractors, working on the projects,” Lee said.

Nebula is hosting about 70 projects, and more than 60 percent of NASA applications or services targeted for the cloud will migrate to Nebula, according to Lee. “We focus on the science and engineering requirement and couple our cloud efforts with the consolidation of science and engineering servers and workstations,” he said.

Many of the applications that NASA scientists and engineers use in the course of their work are too specialized to be hosted by commercial public clouds, Lee explained.

“We really focus our private cloud to satisfy the unique requirements that cannot be satisfied by the public cloud,” he said. “We have system design, engineering and data analytic challenges that require a lot of computing and storage resources.”

At the same time, NASA officials have designated some of the agency’s public-facing Web sites for the public cloud, according to Lee. One prominent example is the agency’s “Be a Martian” site, which will be hosted on Microsoft’s Azure public cloud. The site lets members of the public sign up as “Martian citizens” and explore the planet.

Under the administration’s “cloud first” policy, which last year mandated that agencies move at least three programs to the cloud, NASA has also tabbed several other programs for the public cloud.

Climate@Home will let NASA collaborate directly with people around the world to expand knowledge of weather and climate patterns, using NASA’s earth science data and modeling systems. Participants’ input will help determine the accuracy of the computer models that scientists use to predict climate change, NASA said. NASA’s Earth Science Division and the Office of the Chief Information Officer have partnered to manage the initiative.

The agency is also migrating SERVIR to the cloud. SERVIR is a visualization and monitoring system that integrates satellite observations, ground-based data and forecast models to track and forecast environmental changes and improve responses to natural disasters in developing countries.

In determining whether applications or services should be moved to the cloud, NASA officials try to develop clear requirements and a full understanding of the project’s scope and scale, Lee said. Otherwise, “a seemingly easy project can balloon into a large-scale implementation,” he said.

“A good assessment of the scope and requirements of the project is the first step in making the project happen,” he added. “To ensure success, we also use our software and systems engineering practices and follow through with strong project management discipline. We review and monitor programmatic, technical, cost and schedule risks routinely to ensure the project stays on track and within budget.”