Brand Niemann

 

Posts by Brand Niemann


Among the many things newly appointed Federal Chief Technology Officer Todd Park is credited with from his time in that role at the Department of Health and Human Services are the Health Data Initiative (HDI) and the HealthCare.gov website.

Originally launched in 2010 by the Institute of Medicine (IOM) and HHS as the Community Health Data Initiative (HDI Forum I), it is now part of the Health Data Consortium (HDC), a new public-private collaboration that encourages innovators to use health data to build applications that raise awareness of health and health system performance and spark community action to improve health (HDI Forum II was held last June).

The goal of what is now being called the Health Datapalooza (HDI Forum III), to be held June 5-6, 2012, at the Walter E. Washington Convention Center in Washington, D.C., is to showcase the best and brightest new applications using health data from government and other sources.

I have followed the health data innovation efforts of Todd Park and of his predecessor as Federal CTO, Aneesh Chopra, culminating in the Health Data Initiative Forum II last June and the Strata 2011 New York conference last September. I like their four policy levers, which reflect their open innovation philosophy: opening up data for innovators and entrepreneurs; taking on the role of impatient convener; initiating prizes, challenges, and competitions; and attracting top talent at the intersection of technology and policy. Keep reading →

Donna Roy, Executive Director of NIEM, recently sent a ‘State-of-NIEM’ letter to the community thanking them for continuing to help develop a plan for scaling NIEM and addressing challenges facing the program.

The National Information Exchange Model is an XML-based information exchange framework from the United States. NIEM represents a collaborative partnership of agencies and organizations across all levels of government (federal, state, tribal, and local) and with private industry. The purpose of this partnership is to effectively and efficiently share critical information at key decision points throughout the whole of the justice, public safety, emergency and disaster management, intelligence, and homeland security enterprise. NIEM is designed to develop, disseminate, and support enterprise-wide information exchange standards and processes that will enable jurisdictions to automate information sharing. Keep reading →
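To make the idea of an XML-based exchange concrete, here is a minimal Python sketch of how a small exchange document could be assembled. The namespace URI and element names are simplified placeholders loosely in the spirit of NIEM Core, not drawn from an actual NIEM schema or exchange specification (IEPD).

```python
# Minimal sketch of assembling a NIEM-style XML exchange message.
# The namespace URI and element names are illustrative placeholders,
# not taken from an actual NIEM schema or IEPD.
import xml.etree.ElementTree as ET

NC = "http://example.org/niem-core-placeholder"
ET.register_namespace("nc", NC)

def build_person_exchange(given: str, surname: str) -> bytes:
    """Build a tiny person record the way an exchange document might."""
    person = ET.Element(f"{{{NC}}}Person")
    name = ET.SubElement(person, f"{{{NC}}}PersonName")
    ET.SubElement(name, f"{{{NC}}}PersonGivenName").text = given
    ET.SubElement(name, f"{{{NC}}}PersonSurName").text = surname
    return ET.tostring(person, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    print(build_person_exchange("Jane", "Doe").decode("utf-8"))
```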

COMMENTARY: Federal CIO Steven VanRoekel says in his recent White House Blog: “Through reporting of new operational metrics, the updated IT Dashboard provides unprecedented insight into the performance of each major IT investment in the federal government in order to ensure that each IT dollar is being spent most effectively.”

So why not make this one of the state-of-the-art shared services VanRoekel is trying to achieve? Why not put all the data in memory in dashboards so it can be in motion and give immediate insights?

I did this elsewhere. There was no need to download the data sets, open them in Excel, and sort, filter and graph them to gain insights into government IT investment performance. It can all be done automatically when the data is in memory, live in the dashboard and can be readily updated as the agencies update their input data.
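As a rough illustration of the point (with an assumed file name and column names, since this is not the actual IT Dashboard export schema), a few lines of Python can do the sorting, filtering, and aggregating once the data is loaded into memory:

```python
# Rough sketch of the "data in memory" idea: load an IT Dashboard CSV
# export once, then sort/filter/aggregate it programmatically instead of
# by hand in Excel. File and column names are assumptions, not the
# actual IT Dashboard export schema.
import pandas as pd

def investments_by_agency(path: str = "it_dashboard_investments.csv") -> pd.DataFrame:
    df = pd.read_csv(path)  # the whole data set is now held in memory
    return (
        df.groupby("agency")["total_cost_millions"]
          .agg(investments="count", cost_millions="sum")
          .sort_values("cost_millions", ascending=False)
    )

if __name__ == "__main__":
    summary = investments_by_agency()
    print(summary.head(10))  # ready to feed a chart in a live dashboard
```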

Since I recently wrote about the discussion draft for Federal Information Technology (IT) Shared Services Strategy “Shared First” issued by VanRoekel in December, I followed the design principles it contains and noted what I did specifically for an improved Federal IT Dashboard:

Benefits:

  • Standardization: “Shared service providers must leverage consistent standards that streamline functions across the Federal Government. This enables communication, data sharing and function use across all agencies. It eliminates the use of decentralized and inconsistent resources to create new, unique solutions throughout agencies in response to a single set of federal requirements.” (author’s note: The shared data service supports the Sitemap and Schema Protocols and a Web Oriented Architecture.)
  • Visibility: “A government-wide shared services catalog helps agencies discover the wide array of available services. This enhances the potential for service integration as some agencies will develop shared services for those functions not already being provided.” (author’s note: This is a government wide catalog of IT investments that helps agencies discover existing services.)
  • Reusability: “Shared services harness a way to support duplicated agency functions throughout the mission areas. This reduces the potential for development and maintenance costs by using repeatable services.” (author’s note: This is a government wide catalog of IT investments that helps agencies avoid duplication of services.)
  • Platform independence: “Agencies no longer need to worry about integrating with their current platforms in-house. Shared services providers ensure a stable infrastructure and can manage systems changes and updates within a controlled environment.” (author’s note: This platform imports many different data formats and makes data services that export the data in standard formats for reuse.)
  • Extensibility: “The basic shared services of a provider can be used as building blocks for other services that consumer agencies need. Services can be scaled up or down, based on demand.” (author’s note: The Amazon Cloud is elastic and so are the applications I used that are hosted there.)
  • Location transparency: “Users of shared services access the services from anywhere within the shared service network. This increases availability and end user access to strengthen SLAs between the provider and the services consumer.” (author’s note: This is noted in the Amazon Cloud with SLAs.)
  • Reliability: “Services provided are robust and stable with service continuity capabilities to minimize critical system outages to levels established by SLAs.” (author’s note: This is hosted in the Amazon Cloud with SLAs.)
Components:

  • Component #1: Requirements. This includes the strategic and tactical requirements for the type(s) of functionality that the service has to provide to consumers. The number and type of functional requirements depends on the type of service area, number and diversity of participating agencies, sensitivity of information and data being exchanged. (author’s note: I reproduced the requirements and functionality of the new Federal IT Dashboard.)
  • Component #2: Workflow. These are the business processes that function through the shared service. The design of the process must be such that the functional requirements from Component #1 are supported. (author’s note: I made the business process of the new Federal IT Dashboard more complete by putting all the data in memory and the metadata in linked open data format.)
  • Component #3: Data Exchange. This is the part of the business process in Component #2 that involves the creation, exchange, manipulation, storage, or deletion of data and information. (author’s note: The application supports the data business processes needed; a minimal sketch follows this list.)
  • Component #4: Applications. This includes the software and hardware that provide the functionality and data exchange capabilities that are identified in Components #2 and #3. (author’s note: The software and hardware provide more functionality and data exchange than the new Federal IT Dashboard.)
  • Component #5: Hosting. This is the infrastructure that the application(s) are hosted in. This includes cloud-based, client-server hosting solutions. (author’s note: This is hosted in the Amazon Cloud with SLAs.)
  • Component #6: Security and Privacy. These are the various types of logical, physical, process, and personnel controls that achieve required levels of protection and risk mitigation for the shared service. (author’s note: The applications used have received security certifications and this is hosted in the Amazon Cloud with SLAs that provide for security and privacy protections.)
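As a loose sketch of the data-exchange step referenced in Component #3, the snippet below imports one tabular data set and re-serializes it in two standard formats for reuse. The file name and the assumption of simple, XML-safe column headers are placeholders, not the actual dashboard feeds.

```python
# Sketch of a data-exchange step: import one tabular data set and
# export it in standard formats (JSON and XML) for reuse by other
# services. File name and column-header assumptions are placeholders.
import csv
import json
import xml.etree.ElementTree as ET

def load_rows(path: str = "investments.csv") -> list:
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def to_json(rows: list) -> str:
    return json.dumps(rows, indent=2)

def to_xml(rows: list) -> bytes:
    root = ET.Element("investments")
    for row in rows:
        item = ET.SubElement(root, "investment")
        for key, value in row.items():
            ET.SubElement(item, key).text = value  # assumes XML-safe headers
    return ET.tostring(root, encoding="utf-8")

if __name__ == "__main__":
    rows = load_rows()
    print(to_json(rows))
    print(to_xml(rows).decode("utf-8"))
```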
The IT Dashboard team says:

“We are always looking for ways to improve analytical capabilities and user experience. Some additional features the public should expect to see in the coming months include: visualizations for operational performance and activities, additional improvements in search capabilities, Treemap enhancements, etc. User feedback is always appreciated and can be submitted via the Feedback link at the top of each page.”

Keep reading →

The fact that the Department of Defense got its budget cut and the Intelligence Community got its budget increased in the White House’s 2013 budget request of Congress is indicative of more than the need to roll back a decade of military growth. It’s also indicative of a shift in IT focus–and a reflection that DoD’s network-centric focus is being overtaken by the IC’s big data-centric focus.

There are probably many reasons for such a shift. One is the growth of the world's population: the U.S. Census Bureau estimates the world population passed the 7 billion mark this past weekend. The rapidly growing number of people who will eventually carry smartphones with multiple sensors (your iPhone has them now for GPS position and more) promises a future of massive streams of real-time data that the IC will want to mine, looking for the lone-wolf terrorists (who are relatively easy to stop) I have written about previously.

For companies like Google and Facebook, big data is big business, and for other companies big data is becoming their business as they mine large swaths of data to improve their services and develop new business activities. The IC may not come out and say it, but it has to love the fact that Facebook will soon have 1/7th of the world's population using its platform to share what's going on. Or that Google is almost everyone's favorite search engine, because Google can keep track of what people are posting and searching for far more easily than many in government can.

The IC also has to love big data, and the rapid evolution of systems used to ingest and process it, because it helps push the technology wave, as Gus Hunt, CIA chief technology officer (pictured above), described it at the recent Government Big Data Forum.

Hunt said that in every aspect of the CIA's workflow, from sensors to finished intelligence, massive, multiple, real-time sensor data streams cause bottlenecks on current networks, swamp current storage devices, and overwhelm the current query, analytics, and visualization tools needed to produce finished intelligence.

So he wants to have his cake and eat it too: he wants real-time analytics and visualizations, which he says a few start-ups are trying to achieve. He also wants the Federal Cloud Computing Initiative to add two more services to Platform-, Software-, and Infrastructure-as-a-Service, namely Data-as-a-Service and Security-as-a-Service.

Part of the solution is emerging from Google's MapReduce, a parallel data processing framework whose open source implementation, Apache Hadoop (developed by Doug Cutting, who named it after his son's toy elephant), is commercialized by Cloudera so that one can store and compute on big data at the same time.
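For readers new to the model, here is a toy, single-process illustration of the map and reduce phases; it is a conceptual sketch in Python, not Hadoop code.

```python
# Toy, single-process illustration of the MapReduce pattern (word count).
# Hadoop runs the same map and reduce phases in parallel across a
# cluster, storing and computing on the data where it lives.
from collections import defaultdict
from typing import Iterable, Iterator, Tuple

def map_phase(document: str) -> Iterator[Tuple[str, int]]:
    for word in document.lower().split():
        yield word, 1  # emit (key, value) pairs

def reduce_phase(pairs: Iterable[Tuple[str, int]]) -> dict:
    counts = defaultdict(int)
    for word, n in pairs:  # the shuffle/sort between phases is implicit here
        counts[word] += n
    return dict(counts)

if __name__ == "__main__":
    documents = ["big data is big", "data in motion"]
    pairs = (pair for doc in documents for pair in map_phase(doc))
    print(reduce_phase(pairs))  # {'big': 2, 'data': 2, 'is': 1, 'in': 1, 'motion': 1}
```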

Amr Awadallah, founder and CTO of Cloudera, calls Apache Hadoop a data operating system, in contrast to Windows and Linux, which are essentially file operating systems (they store and manage all the files you create and are needed to run your software applications). He points out that Apache Hadoop provides the three essentials needed to handle big data: velocity, scalability, and economics.


So the IC, Gus Hunt, Amr Awadallah, and others at the Government Big Data Forum are leading the next technology wave, and they gave us a glimpse of both the technology infrastructure and the business organization, with chief data officers and data scientists, that will be needed to implement and succeed with big data.

More details about what was said can be found at CTOVision and at my wiki document, Data Science Visualizations Past Present and Future.

Big data science visualizations have evolved from the use of proprietary data (past), to difficult-to-obtain-and-use big data (present), to the hope that business, finance, media, and government big data will be more readily available and useable in the future.

That future appears a ways off, however, given my experience with several recent projects and judging from some of the presentations at the just-concluded O’Reilly Strata Conference.


The Strata Conference is billed as the home of data science, bringing together practitioners, researchers, IT leaders, and entrepreneurs to discuss big data, Hadoop, analytics, visualization, and data markets.

I was especially interested in learning more about public domain data sets and how far they've evolved in their ability to be used.

One way to look at that evolution is through an analysis of content from the past three Strata conferences, counting the number of presentations by category. I was motivated to do this in part by a previous story I read reporting that government, real estate, and manufacturing had the highest value potential for big data in the future. Keep reading →
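The tally itself is simple once a session list is in hand; the sketch below shows the idea, with made-up session data rather than the actual Strata programs.

```python
# Sketch of the tally itself: count sessions by category. The session
# list below is a made-up stand-in; the actual analysis used the
# published Strata conference programs.
from collections import Counter

sessions = [
    ("Hadoop at scale", "Hadoop"),
    ("Public data markets", "Data markets"),
    ("Visualizing health data", "Visualization"),
    ("Streaming analytics", "Analytics"),
]

by_category = Counter(category for _title, category in sessions)
for category, count in by_category.most_common():
    print(f"{category}: {count}")
```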

COMMENTARY: I keep hearing and reading that Google and Facebook are changing their policies about handling our personal information and that the White House, Congress, consumer groups, regulators, and their millions of users are concerned.

Then I heard a recent interview in which Facebook founder Mark Zuckerberg was asked whether he thinks Google is trying to compete with Facebook; his answers were evasive, so I knew the interviewer was on to something. Keep reading →


I was recently asked to comment on the evolution of XML for a story about what the Government Printing Office is doing in migrating data to XML, and about how APIs (application programming interfaces) can help agencies extract data from their systems using XML (extensible markup language).

It reminded me of a conversation I had with GPO managers about 15 years ago in which I argued that XML would give them an “author once — use many” capability that would stand the test of time, and it has! Keep reading →
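The point is easy to show in miniature: one XML source can be rendered into several outputs. In the sketch below the markup is invented for illustration, not GPO's actual document schema.

```python
# Sketch of "author once, use many": a single XML source rendered into
# two different outputs. The element names are invented for the example,
# not GPO's actual markup.
import json
import xml.etree.ElementTree as ET

SOURCE = """<document>
  <title>Sample Bill</title>
  <section num="1">Short title.</section>
  <section num="2">Definitions.</section>
</document>"""

def parse(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {
        "title": root.findtext("title"),
        "sections": [
            {"num": s.get("num"), "text": s.text} for s in root.findall("section")
        ],
    }

if __name__ == "__main__":
    doc = parse(SOURCE)
    print(json.dumps(doc, indent=2))  # rendition 1: a data feed / API payload
    for s in doc["sections"]:
        print(f"Sec. {s['num']}. {s['text']}")  # rendition 2: a print-style layout
```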


I was recently invited by email to comment on the Office of Management and Budget's discussion draft for a Federal Information Technology (IT) Shared Services Strategy, originally issued by U.S. CIO Steven VanRoekel in December 2011.

So I visited the IdeaScale site to engage with OMB in this forum. Unfortunately, it was like déjà vu, reminding me of the Evolving Data.gov with You forum, where comments submitted as far back as two years ago have yet to be addressed.

There weren't many new ideas posted when I first visited, and those that were posted appeared to be topics from the document's table of contents, so I concluded that this effort is not getting much attention. I asked myself why.

One reason could be that the idea of shared services has been around for so long that it no longer evokes much interest or excitement.

Some readers may recall when the Federal CIO Council started the Federal SOA (Service Oriented Architecture) Community of Practice some six years ago–and we have been at it ever since.

In fact, that group is having its 13th conference April 3, at MITRE in McLean, Va. Those looking for more on this subject might be interested in an article I delivered at the last conference, entitled: In Search Of Practical Ways To Share IT Services In Government.

To me the most interesting thing in the report was a table, “Potential Shared Service Opportunity Areas,” where I added the totals for investments and costs.

Area (US Government)          Number of Investments    Cost ($ Millions)
Planning and Budgeting                          222                  515
General Government                              182                1,209
Administrative Management                       197                  578
Financial Management                            311                1,650
HR Management                                 1,035               17,371
Supply Chain Management                         122                  362
Totals                                        2,069               21,685
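For readers who want to verify the shares cited below, the arithmetic follows directly from the table values:

```python
# Quick arithmetic check of the table above (costs in $ millions).
areas = {
    "Planning and Budgeting": (222, 515),
    "General Government": (182, 1209),
    "Administrative Management": (197, 578),
    "Financial Management": (311, 1650),
    "HR Management": (1035, 17371),
    "Supply Chain Management": (122, 362),
}

total_investments = sum(n for n, _ in areas.values())   # 2,069
total_cost = sum(c for _, c in areas.values())          # 21,685
hr_investments, hr_cost = areas["HR Management"]
print(f"HR share of investments: {hr_investments / total_investments:.0%}")  # ~50%
print(f"HR share of cost: {hr_cost / total_cost:.0%}")                       # ~80%
```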


HR Management leads all the areas with 1,035 investments (essentially 50% of the total) costing $17.4 billion (about 80% of the total cost). That is exactly where the Federal SOA CoP started in January 2006, with a pilot of a shared service for human resources across the government.

The Federal CIO Council said to me then, ‘Show us a SOA,’ and we did, with a model-driven architecture approach and tooling that showed both the SOA architecture and the XML messages and data flowing through the architecture diagrams. We have since gone on to pilot an open source ESB (Enterprise Service Bus) for SOA that the FAA has adopted for its SWIM Program, and to pilot SOA and business process modeling with ontologies that the DoD Deputy Chief Management Officer is using.

Keep reading →


I recently led a team of nine that made a two-hour presentation to the Department of Defense at the new Mark Center to provide perspective on the DoD Enterprise Information Web (EIW).

The EIW team is pioneering the adoption of semantic technology and approaches that can be the way forward for enterprise business intelligence and solution architectures in the DoD. Keep reading →

I saw the Tweets this morning about Aneesh Chopra “stepping down” based on a FedScoop article posted at midnight last night. Seems like a lot happens after normal business hours in this town.

I thought the most interesting words in the article were: “No information was provided on his future plans, but ongoing speculation includes running for political office to assuming an executive role leading the Washington offices of a major technology company,” writes Luke Fretwell in the article, which cites unnamed sources.

Then around noon the Washington Post broke the story: Aneesh Chopra is leaving the White House, likely to run for Virginia lieutenant governor, though the Post said Chopra did not return requests for comment. (More on the story here.)

Chopra was part of a trio of D.C.-area tech and business heavyweights tapped by Obama at the start of his term to address government management and technological concerns. In the span of a few days in 2009, Obama named Chopra, Virginia's former secretary of technology, to oversee the government's tech upgrades; Jeffrey Zients, a D.C.-area business veteran, to serve as the first White House chief performance officer (Zients is now acting director of the Office of Management and Budget); and Vivek Kundra, a former District government official, to serve as the first White House chief information officer. Kundra stepped down in June, went to Harvard briefly, and recently joined Salesforce.com.
