Big Data


The explosion of records across the government, including those generated by emerging technologies and social media, is putting new pressures on federal information managers.

The primary challenge is managing the onslaught of records in a new environment, Alan Linden, a senior technology consultant at Electronic Image Designers, said Thursday at the annual FOSE convention in Washington, D.C.


One of the many benefits of being the director of research at GTRA is that it offers the opportunity to speak candidly and off the record with countless executives from Defense, Intelligence and Civilian agencies who share what they really care about, not what mandates and initiatives tell them to focus on.

The result is a real-time snapshot of the comments federal IT executives make most often. Among the comments I've heard most frequently over the past few months, some of which may come as a surprise:

Donna Roy, Executive Director of NIEM, recently sent a ‘State-of-NIEM’ letter to the community thanking them for continuing to help develop a plan for scaling NIEM and addressing challenges facing the program.

The National Information Exchange Model is an XML-based information exchange framework from the United States. NIEM represents a collaborative partnership of agencies and organizations across all levels of government (federal, state, tribal, and local) and with private industry. The purpose of this partnership is to effectively and efficiently share critical information at key decision points throughout the whole of the justice, public safety, emergency and disaster management, intelligence, and homeland security enterprise. NIEM is designed to develop, disseminate, and support enterprise-wide information exchange standards and processes that will enable jurisdictions to automate information sharing.
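To give a rough sense of what an XML-based exchange of this kind looks like in code, the sketch below builds a small person record with Python's standard library. The namespace and element names are simplified placeholders for illustration, not an actual NIEM exchange specification (IEPD).

```python
# Minimal sketch of constructing a NIEM-style XML exchange payload.
# The namespace and element names are illustrative placeholders, not
# taken from an actual NIEM exchange specification (IEPD).
import xml.etree.ElementTree as ET

NC = "http://example.org/niem-core-placeholder"  # hypothetical namespace
ET.register_namespace("nc", NC)

def build_person_exchange(given_name: str, surname: str) -> str:
    """Return a small XML document describing a person, NIEM-style."""
    root = ET.Element(f"{{{NC}}}Person")
    name = ET.SubElement(root, f"{{{NC}}}PersonName")
    ET.SubElement(name, f"{{{NC}}}PersonGivenName").text = given_name
    ET.SubElement(name, f"{{{NC}}}PersonSurName").text = surname
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    print(build_person_exchange("Jane", "Doe"))
```

The point of the sketch is simply that both sender and receiver agree on the same element vocabulary, which is what lets jurisdictions automate the exchange rather than re-mapping fields for every partner.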

Big data science visualizations have evolved from the use of proprietary data (past), to difficult-to-obtain-and-use big data (present), to the hope that business, finance, media, and government big data will be more readily available and usable in the future.

That future appears a ways off, however, given my experience with several recent projects and judging from some of the presentations at the just-concluded O’Reilly Strata Conference.


The Strata Conference is billed as the home of data science, bringing together practitioners, researchers, IT leaders and entrepreneurs to discuss big data, Hadoop, analytics, visualization and data markets.

I was especially interested in learning more about public domain data sets and how far they've evolved in their ability to be used.

One way to look at that evolution is through an analysis of content from the past three Strata conferences, counting the number of presentations by category. I was motivated to do this in part by a story I had read reporting that government, real estate, and manufacturing hold the highest value potential for big data in the future.
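For readers curious what that kind of tally involves, here is a minimal Python sketch that counts presentations by year and category. The session list is invented for illustration; it is not the actual Strata Conference program data.

```python
# Minimal sketch of tallying conference presentations by category.
# The session list below is invented for illustration; it is not the
# actual Strata Conference program data.
from collections import Counter

sessions = [
    {"year": 2011, "title": "Hadoop at Scale", "category": "Hadoop"},
    {"year": 2011, "title": "Public Data Markets", "category": "Data markets"},
    {"year": 2012, "title": "Visualizing Open Government Data", "category": "Visualization"},
    {"year": 2012, "title": "Streaming Analytics", "category": "Analytics"},
    {"year": 2012, "title": "Hadoop in Production", "category": "Hadoop"},
]

# Count presentations per (year, category) so trends across the
# conferences can be compared side by side.
counts = Counter((s["year"], s["category"]) for s in sessions)
for (year, category), n in sorted(counts.items()):
    print(f"{year}  {category:<15} {n}")
```

Even a simple count like this makes it easy to see whether a category such as government or public data is growing or shrinking from one conference to the next.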

The White House’s recently launched “Future First” initiative marks a milestone in the federal government’s effort to invigorate the implementation of new technologies. As Federal CIO Steven VanRoekel begins to roll out new initiatives like “Shared Services First,” agencies should ask themselves “What technology will help us better manage systems amidst the current data explosion?”

The answer lies in the ability to handle large volumes of machine-generated data, also known as big data. Agencies need to automate how they manage large volumes of machine data because the growth of data is outpacing human capacity to monitor and understand its relevance.
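As a loose illustration of what automating the handling of machine-generated data can mean at the smallest scale, the sketch below scans log lines and flags error spikes so no one has to read every entry. The log format and threshold are assumptions made for the example, not a reference to any particular agency system or product.

```python
# Minimal sketch: scan machine-generated log lines and flag minutes
# with an unusually high error count, so no one has to read every entry.
# The log format ("2012-05-03T14:07:12 ERROR ...") and the threshold
# are assumptions made for this example.
import re
from collections import Counter

THRESHOLD = 3  # errors per minute that trigger an alert (illustrative)
LINE = re.compile(r"^(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}):\d{2}\s+(\w+)\s")

def error_spikes(lines):
    """Return (minute, count) pairs whose ERROR count exceeds THRESHOLD."""
    per_minute = Counter()
    for line in lines:
        match = LINE.match(line)
        if match and match.group(2) == "ERROR":
            per_minute[match.group(1)] += 1
    return [(minute, n) for minute, n in sorted(per_minute.items()) if n > THRESHOLD]

if __name__ == "__main__":
    sample = [
        "2012-05-03T14:07:01 ERROR disk quota exceeded",
        "2012-05-03T14:07:12 ERROR disk quota exceeded",
        "2012-05-03T14:07:40 ERROR disk quota exceeded",
        "2012-05-03T14:07:55 ERROR disk quota exceeded",
        "2012-05-03T14:08:02 INFO nightly batch started",
    ]
    for minute, count in error_spikes(sample):
        print(f"alert: {count} errors in minute {minute}")
```

The same idea, applied at scale by purpose-built tools, is what lets a small operations staff keep up with machine data that grows far faster than anyone could read it.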
