David McClure calls the General Services Administration’s Office of Citizen Services and Innovative Technologies (OCSIT) “a little sparkplug igniting innovation all across government.”

Indeed, OCSIT’s just-released 2012 annual report, “More for Mission,” serves as a 51-page catalogue of the office’s multi-pronged push for technology innovation across the federal government.

The U.S. General Services Administration announced that a key program for approving shared cloud computing security tools has issued the first approved provisional cloud security authorization to Autonomic Resources LLC.

The authorization comes from the Federal Risk and Authorization Management Program (FedRAMP) Joint Authorization Board, which comprises the chief information officers of GSA and the Departments of Defense and Homeland Security.

It’s been a little more than a year since Admiral Thad Allen (USCG-Ret.) joined Booz Allen Hamilton as a senior vice president after a storied career with the U.S. Coast Guard and service as National Incident Commander for the Department of Homeland Security in the aftermath of the Deepwater Horizon oil spill.

Breaking Gov contributor Dan Verton sat down with Allen to discuss the importance of innovation and the challenges frontline federal government managers face when trying to put innovations into practice. He also discussed some of the priorities for the future of homeland security outlined recently by Booz Allen Hamilton, on the occasion of the tenth anniversary of the creation of the U.S. Department of Homeland Security (DHS).

Breaking Gov: How important is the concept and practice of innovation to the Department of Homeland Security as we look toward the next 10 years of the homeland security mission?

Adm. Thad Allen: I don’t think there’s any doubt that innovation has been the key to the success of this country since our revolution. The ability to innovate, create new things and bring them to market moves the country forward.

I think the real issue is how do you enable innovation in a government department or across the government? How quickly can you recognize technologies and bring them to bear on the problems you’re dealing with?

We have a whole host of regulations and federal acquisition regulations. We’re concerned about whom we award business to in the federal government. There are groups that we want to help and encourage, such as small business and the middle class. We need to figure out a better way to identify innovative capabilities that we can bring to bear in the homeland security area.

I don’t think right now the current acquisition procedures or requirements development procedures are mature to the point where we can move as rapidly as we need to.

Is it just the acquisition side of the equation, or is it the federal culture that does not encourage innovation from frontline managers?

Allen: The whole notion of innovation is a challenge across the government. What you have is a set of regulatory requirements that take time [and] they’re difficult to work through for new and challenging technologies. And then there’s a question of whether or not the people in government are technically qualified to understand those new technologies.

I think there’s a dual challenge. One is a process challenge. How do we make the process simpler? But there’s a content challenge. If you don’t understand the technology regarding cloud analytics [as an example] or what a cloud reference architecture can do, or what high performance computing can do, then you don’t make real good decisions about the acquisition of technology or make policy and budget decisions that enable that.

Are efforts such as the FedRAMP process helping agencies to innovate and adopt new technologies?

Allen: In the current budget environment, we can’t afford to have multiple stovepipe systems, multiple licensing fees, and multiple costs for software platforms. The downward force on funding is going to force the integration of software and data sets.

Then, once you get them in one place, it’s easier to make a fundamental change in how you actually manage the data. It’s something we’re going to have to do and it’s going to be required for mission execution.

More importantly, I don’t think we can keep operating the systems we run today, stovepiped in proprietary systems, in the current budget environment.


How do you see the future of homeland security changing?

Allen: I believe terrorism is nothing more than political criminality; so you’re really dealing with criminal organizations involved in criminal activity. The things all criminal organizations need to succeed are these: they need a source of financing, they have to talk, they have to move, and they have to spend money to be successful. That is a network.

When you look at our law enforcement organizational structure and how we deal with terrorism, we tend to assign particular threat streams to particular agencies, and that’s how we employ them: the Drug Enforcement Administration; the Bureau of Alcohol, Tobacco and Firearms; the U.S. Secret Service; Immigration and Customs Enforcement.

Our theory is, it takes a network to defeat a network. And if we’re going to do that properly, we’re going to have to break down the walls between those jurisdictions, particularly with regard to how we share information.

Read more from Thad Allen in an op-ed he wrote for Breaking Gov last year on the importance of separating the value of public service from the politics.

My perspective on the outlook for cyber initiatives is quite different heading into the New Year than in past years.

While there are always budgetary uncertainties and looming cuts in government IT spending, this year, we face an unprecedented financial uncertainty as our nation stands on the edge of a fiscal cliff. That will impact not only the resources we have to invest in technology, but how people work and live.

With the government’s Shared First initiative, the emergence of the Federal Risk and Authorization Management Program (FedRAMP) and ongoing budget pressures, migrating to the cloud has moved from an ideal to reality for many government agencies.

However, along with the efficiencies and cost savings associated with cloud computing come a number of information security risks that must be addressed.

The number of reported cybersecurity incidents involving federal information networks continues to increase while the posture of federal agencies to defend against them appears to be weakening in 2012, according to projected data from a Congressional watchdog agency.

The Government Accountability Office’s director of information security issues, Greg Wilshusen, in a presentation to federal and industry security officials, said that the rate of reported security incidents, which had leveled off in 2011 after a steady four-year climb, was expected to jump again in 2012.

Management and program silos within agencies that so often stymie efforts to integrate information technology and security practices are also hindering efforts to institute smarter risk management strategies at agencies, according to senior government security officials.

“Risk is still being managed at most agencies in a stovepipe manner,” said Department of Energy Chief Information Officer Bob Brese during a Government Technology Research Alliance conference on government security trends on Monday.

Federal information technology professionals are confronted with a management landscape that is perhaps as complex as any have seen in a generation.

That’s due in part to the convergence of three transformational technologies – cloud computing, mobile devices and big data analytics. The benefits of each technology are generally expected to outweigh many of the associated challenges of implementing them.

For those who follow government computing trends, the biggest story of 2012 in the U.S. has been the accelerating adoption of cloud services by federal agencies as well as by state and local governments. This growth has been fostered in large part by the admirably proactive stance in favor of cloud taken by the White House’s Office of Management and Budget (OMB).

It has also been propelled by the FedRAMP program, which streamlines the procedures used to vet the security features of commercial cloud solutions. At SafeGov we enthusiastically endorse this trend and look forward to the cost savings and improvements in citizen services it will bring to all levels of government.

This article originally appeared on SafeGov.org and is republished by permission.

But while U.S. government use of cloud services surges, U.S. regulators have paid relatively little attention to the emerging issue of data confidentiality in the cloud. The focus of Federal cloud standardization efforts such as the NIST requirements built into FedRAMP has been data security, not privacy and confidentiality.

In Europe, however, the picture looks very different. The European public sector is approaching the cloud with caution: governments are keenly interested in the potential benefits, but have not yet issued the kind of top-down mandate for rapid migration that we’ve seen in the U.S.

At the same time, European regulators are much further along the path toward a modernized regulatory regime for cloud computing. The key development looming on the horizon in Europe is the proposed new EU Data Protection Regulation. This draft legislation represents a sweeping revision of the 1995 EU Privacy Directive and is currently the subject of intense scrutiny by interested stakeholders.

Observers expect it to be passed by the EU Parliament sometime in early 2014. In the short term, the most significant event is the investigation of Google’s privacy policy that the French Data Protection Authority – the CNIL – is conducting at the request of the Article 29 Working Party (the association of European DPAs).

Before we assess the impact of European privacy regulations on government cloud computing, let’s take a step back to review the recent debate over online privacy in the U.S. As visitors to this forum may know, SafeGov contributors were among the first to identify the mismatch between the privacy policies of giant web advertising companies like Google and the requirements of safe cloud computing in a government or educational environment.

Recall that the new Google privacy policy introduced in March of this year allows the Mountain View, Calif., firm to combine all of the vast knowledge it gleans from tracking a user’s activity across its many web services (Gmail, Docs, Search, YouTube, DoubleClick, etc.) into a single “master profile.” This profile can then be intensively data mined to select the most profitable ads to serve to that user.

Scaled up to tens and even hundreds of millions of users, this profiling technique yields an extraordinarily profitable business model that has made Google the most successful advertising firm in history.

Google, by the way, is not the only web firm to use this model. Facebook does essentially the same thing, although it has less raw data about users’ behavior outside of its own site. Even Microsoft has recently adopted a unified privacy policy for its consumer services. However, a critical difference between the Google and Microsoft policies is the fact that Microsoft, unlike Google, has a specific privacy policy for enterprise and government users.

Privacy advocates on both sides of the Atlantic have objected to Google’s business model on the grounds that web users are not informed that they are being tracked in this manner and are not given an obvious opportunity to opt out. We also note that the European DPAs asked Google to delay implementation of the policy until it could be investigated, but Google declined.

SafeGov itself does not take a position on business models deployed by consumer advertising firms. We recognize that opinions on this difficult and sensitive question will differ. Web advertising (which does not necessarily require hidden user tracking) can be a healthy form of technological innovation that offers significant benefits to consumers.

However, our contributing experts have pointed out on many occasions that the kind of stealthy user profiling and systematic data mining of user content that has become the norm on the consumer web is absolutely unacceptable when performed in cloud services provided under contract to governments or schools.

I believe that our experts who have spoken out on this issue are on solid ground. Imagine for example that a cloud provider decided to apply the same data mining algorithms it uses for consumer ad targeting to the email traffic of tens or hundreds of thousands of government users or school children.

Even if no personal information of individual users was disclosed to advertisers, the power of these algorithms to identify trending topics and keywords in user content could be of immense economic value. In the case of sensitive government information, it could also represent a grave threat to the security of nations. It is for these reasons that SafeGov has called on all cloud service providers to create separate privacy policies for public sector users that expressly ban these practices.

Now what of the European regulators? As noted above, the French DPA – the CNIL – was assigned the task last February of investigating Google’s new privacy policy to determine whether it complies with existing European data protection rules. The CNIL is expected to present its initial findings on the Google policy to its European peers sometime in the coming days.

It is important to understand that although the CNIL is a French institution, it is not acting on behalf of the French government, but on that of the association of European DPAs (the Article 29 Working Party). These DPAs are national regulatory bodies whose members are appointed by their national governments, but which operate as independent authorities (in much the same way that the FTC and the FCC do in the U.S.). Their mandate is to enforce European and national laws concerning data protection and online privacy.

While nothing has yet leaked to the press regarding the CNIL’s findings (the contrast on this point with the American FTC is noteworthy), past statements of the CNIL and the Article 29 Working Party allow us to anticipate the likely direction the regulator will take.

First, it is highly probable that the CNIL will find that Google’s privacy policy indeed does not fully comply with European law. This much was already implied in the statements of Article 29 Working Party Chairman Jacob Kohnstamm last February and by the decision to entrust an investigation to the CNIL.

Second, it is unlikely that the CNIL will adopt a punitive stance toward Google, for example by imposing a fine. European law gives the DPAs the power to fine companies that violate the rules, and several DPAs (including the CNIL) have already inflicted fines on Google for that firm’s conduct in the so-called Wi-Spy scandal. But in this case it is more likely that the CNIL and the other DPAs will politely ask Google to change its privacy policy in ways that make it compatible with European laws.

What changes might the Europeans seek in Google’s privacy policy? Any answer to this question before the release of the CNIL’s report is of course purely speculative. Certainly we can expect the regulator to require that Google do more to disclose to users the extent of its data gathering and to offer them more explicit opportunities to opt out.

In recent months many web sites in Europe have begun to implement the new EU cookie rules that require increased disclosure and express consent prior to the serving of web cookies to user browsers. We might expect similar requirements to be imposed on Google and its web advertising peers (Facebook, Yahoo, Hotmail and Bing, etc.).

But at SafeGov our mission is government computing. We don’t know at this point whether the CNIL will express an opinion on the suitability of Google’s privacy policy for cloud services delivered to government customers. We note optimistically that the CNIL asked Google whether its new privacy policy applied to users of Google Apps for Education and Google Apps for Business (of which Google Apps for Government is a derivative; see Question 47 in the CNIL’s second questionnaire addressed to Google).

However, the regulator may prefer to focus its initial report on a broad outline of the changes it wishes to see in Google’s privacy policy, rather than drilling down to issues that confront specific sectors such as government or education.

In any case, observers can be confident that the debate on the topic of the confidentiality and safety of government data in the cloud is only just beginning.

The CNIL’s findings on behalf of the Article 29 Working Party, whatever they are, will be only the first step on a long road. As Europe prepares a fundamental revision of its data protection and online privacy law, that road will ultimately lead to significant changes in the privacy practices and perhaps even in the business models of all web advertising firms that wish to do business in Europe.

These changes will inevitably encompass the rules that govern the cloud services provided to European governments and schools.

We hope that Europe’s Data Protection Authorities will recognize the need for dedicated privacy policies that guarantee users in these critical sectors of the European economy protection from the user profiling and data mining practices of the online consumer advertising industry.

Jeff Gould is CEO and Director of Research, Peerstone Research, and a regular contributor to SafeGov.org, a forum for IT providers and industry experts dedicated to promoting trusted and responsible cloud computing.

MeriTalk has released a report that reveals how federal IT managers view the barriers, current status, and future plans related to moving mission-critical applications to the cloud.

The report, released this week, also reveals that government could save an estimated $16.6 billion annually if all agencies move just three mission-critical applications to the cloud.
