Intelligence Community

The intelligence community is developing a single cloud computing network to allow all of its analysts to access and rapidly sift through massive volumes of data. When complete, the effort will create a pan-agency cloud, with organizations sharing many of the same computing resources and information. More importantly, the hope is that the system will break down existing boundaries between agencies and change their insular cultures.

As in the rest of the federal government, lower costs and higher efficiency are the primary reasons for the intelligence world’s shift to cloud computing, Charles Allen, former Under Secretary of Homeland Security for Intelligence and Analysis and now a principal with the Chertoff Group, said in an interview with Breaking Defense, an affiliate of Breaking Gov.


The United Nations’ International Telecommunication Union sent shock waves across the Internet with an agreement, approved last night, that would give countries the right to access international telecommunications services, including Internet traffic.

(This story was updated at 4:30 p.m. ET to include additional reporting.)

US government agencies often face a Catch-22 in trying to adopt innovative technologies: procurement rules designed to promote fairness can preclude federal buyers from seeing – or influencing – developments that could eventually help agencies work more effectively.

The Defense Department and intelligence agencies, of course, have been fueling innovative technologies on their own for decades. But as commercial markets have exploded with new ideas, and learned to bring those ideas to market with greater speed, government agencies increasingly find themselves racing to keep up with innovations in the commercial sector.

The power of big data – like cloud computing and mobility – has emerged as a transformational technology force, but one that poses a host of planning questions for senior government agency officials. Peter Mell, a senior computer scientist at the National Institute of Standards and Technology, devoted many months to assessing the potential and the pitfalls of big data for NIST. He recently shared what he learned, and what executives need to understand about big data, in an interview with AOL Government’s Wyatt Kash.

Mell outlined some of the misunderstandings and tradeoffs associated with the large-scale data sets agencies are likely to encounter as they move beyond classic relational databases. He also talked about the important role cloud computing plays in facilitating big data analytics. And he shared with our readers a comprehensive slide presentation that puts many of the questions about big data and its security implications into perspective.


For all the progress made in advancing integrated intelligence with data from the days of Desert Storm to Operation Iraqi Freedom, Dawn Meyerriecks says the intelligence community must embrace analytics and mission-focused technology to stay on an innovative track.

She made the remarks during a keynote at a conference that brought together key players in cybersecurity, cloud computing and mobile government in Washington, D.C. on Wednesday. She also said the US government has to reach outside its borders for most of the necessary talent.

Most government agencies strive to use technology more effectively, but only a few use it to directly save lives.

Despite its small size, the Joint Improvised Explosive Device Defeat Organization develops techniques and technologies to defeat bombs and shares that information with U.S. and allied warfighters through an online training portal.

The U.S. Army is expected to open a new mobile applications store as part of a pilot program designed to offer a more flexible way to develop and buy software for the government. The online store will provide a space where users can request specific tools and where participating developers can quickly provide or create a product to fill those needs without getting bogged down in a complex and time-consuming acquisition process.

The new pilot will be a six-month effort that will support the Army’s intelligence service and, potentially, other intelligence agencies.


This is one in a series of profiles on the 2012 Samuel J. Heyman Service to America Medal finalists. The awards, presented by the nonprofit Partnership for Public Service, recognize outstanding federal employees whose important, behind-the-scenes work is advancing the health, safety and well-being of Americans, and they are among the most prestigious honors given to civil servants. This profile features a finalist for the National Security and International Affairs medal, Joyce Connery, director for Nuclear Energy Policy at the National Security Council in Washington, D.C.

A summit of 50 world leaders hosted by President Obama in 2010 resulted in important steps to prevent terrorists from obtaining nuclear materials such as plutonium and highly enriched uranium that could be used to make radiological bombs.

After six days of the 2012 International Open Government Data Conference, which concluded last week, I and others are asking ourselves this question: Is there a business case for open government data?

Clearly, more needs to be done to spread what is working with open government data.

But when it comes to making a business case for open government data, there are at least three success models – or at least examples – that I am aware of:

  • Statistical agencies that get regular funding because it is critical to governmental decisions such as establishing congressional districts;
  • Intelligence agencies and the larger intelligence community that received a big budget increase for big data because of the need to find more needles in bigger haystacks;
  • Google, Facebook, LinkedIn, and other big users of online data that learned they needed a data science team with an information platform to grow their businesses.
But the question remains: what business value can make open government data fundable and sustainable, like the three examples above?

I recently attended a meeting about something that many people are beginning to hear about but most do not yet understand: Ontology for Big Systems.

Even the words describing the outcome of the meeting are misleading: Summit and Communique. A summit is usually a global meeting of world leaders, and a communique is usually a short statement for the public and press. This year’s communique is 11 pages long, up from previous years: 1 page in 2006, 4 in 2007, 8.4 in 2008, 8.4 in 2009, 8.6 in 2010, and 8 in 2011.


Ontology has two definitions – one from philosophy and another from computer science. I won’t even bother you with those definitions, because the Intelligence Community prefers the term Knowledge Base to describe a collection of a very large number of documents that can be analyzed and searched “for more needles in bigger haystacks.”
