Knowledge Management

Competitive pressures have increased the demand for superior performance by employees in every setting imaginable. Yet with a tight budgetary environment as well as workforce shifts, the challenge for federal agencies is how to bring employees’ knowledge base up to the required levels of excellence with the least disruptive impact on operations and cost.

That is particularly true as successive generations move into positions of responsibility. Baby Boomers, born between 1946 and 1964, account for 26% of the total U.S. population. Although many are postponing retirement, some 10,000 Boomers nationwide retire each day – taking with them years of work experience and institutional knowledge.

I recently attended a meeting about something that many people are beginning to hear about but few understand – Ontology for Big Systems.

Even the words describing the meeting and its outcome are misleading: Summit and Communique. A summit is usually a global meeting of world leaders, and a communique is usually a short statement for the public and press. This year’s communique is 11 pages long, continuing its growth over previous years: 1 page in 2006, 4 in 2007, 8.4 in 2008, 8.4 in 2009, 8.6 in 2010 and 8 in 2011.


Ontology has two definitions – one from philosophy and another from computer science. I won’t bother you with either, because the Intelligence Community prefers the term Knowledge Base to describe a very large collection of documents that can be analyzed and searched “for more needles in a bigger haystack.”


Recently, there have been several articles about companies moving to cut back or somehow control email, including an item on NBC News. Organizations are beginning to rebel against email’s constant, increasing presence – and realizing that by itself, email isn’t a solution to most business challenges.

Yes, email is great for communication. But too many organizations also depend on email for collaboration – and email provides no visibility. And many organizations depend on email for execution – and email provides no tracking, no control, no auditability.


When the Senate Committee on Homeland Security and Governmental Affairs gathered last week to hear testimony about the state of information sharing across all levels of government, the committee leaders and even some of the expert witnesses pointed to the killings of Osama bin Laden and Anwar al-Awlaki as two examples of how information sharing across federal agency boundaries has improved.

Wrong.

The highlight of yesterday’s Geospatial Summit for me was mention of the National Hydrography Dataset.

Tommy Dewald (U.S. Environmental Protection Agency) and Keven Roth (U.S. Geological Survey, retired) set about developing a national surface water dataset in the 1990s with a vision of creating a solution for the 21st century, when suitable water resources would become critical. What oil was to the 20th century, water would be to the 21st.

One of the nation’s most authoritative sources for residential address data, the U.S. Census Bureau, may soon have to confront a costly legal constraint that prevents it from sharing basic street address information with thousands of county and state governments and other organizations.

The limitation means that not only state and local governments but also the Census Bureau and other federal agencies must spend more to validate address information, according to a group of data specialists speaking at a conference on the use of geographic data.

While most of NASA is looking up to the stars, scientist Michael Goodman is staring down at Earth, focusing this week on monster Hurricane Irene about to slam into the East Coast with a vengeance as soon as Friday.

Goodman, 55, NASA’s go-to guy for natural disasters and hazards, is defying the stereotypes about the space agency. He’s always focused on the ground, coordinating the space agency’s response to earthbound catastrophes. NASA has been involved in Earth research since the 1960s.

“We’re constantly imaging the earth. If a significant event occurs, that data can be processed and made available,” Goodman, an atmospheric scientist in NASA’s Earth Science Division, told AOL Government. “Our role is to provide spaceborne and airborne observations and data analyses that can assist in damage assessment and aid in the recovery.”

NASA is not chasing hurricanes, but it is using its arsenal of 14 orbiting satellites – developing new technologies and applying current ones – to better measure the characteristics of hurricanes and the conditions that produce them. The information is made available to front-line agencies such as the National Oceanic and Atmospheric Administration (NOAA) to help them develop better forecasts.


Hurricane warnings are already posted from Florida to Boston. The Category 3 hurricane is expected to make landfall in North Carolina late Friday, and the damage along the storm’s path is expected to be catastrophic. A NASA satellite is on the job, taking regular images of Irene as it barrels north from the Bahamas.

NASA’s satellites are often able to capture better and different images than NOAA’s, complementing rather than duplicating them, Goodman said. The space agency is constantly improving the computing power aboard its satellites and its optics technologies.

The satellites use a variety of remote sensing techniques to take the images, including visible, infrared, passive and active microwave, synthetic aperture radar, and lidar (an optical remote sensing technology). The images are never called pictures.

The Tropical Rainfall Measuring Mission (TRMM) satellite that is watching Irene uses an instrument known as the Precipitation Radar, which can “see” through the clouds to measure the strongest “hot towers” of convection and updrafts at the storm’s core.

As Hurricane Irene picked up steam this week in the Caribbean, the satellite began taking images of the storm on its regular north-south run over the Western Hemisphere. The satellite passes over any given location only twice a day, Goodman said.

The raw images are downlinked to tracking stations on the ground, analyzed, distributed to the proper agencies and made available online, he said.

NASA works closely with several federal agencies, including the U.S. Geological Survey (USGS), NOAA, and the Department of Homeland Security (DHS), to provide imagery and data analyses for use by first responders in any kind of a disaster.

“If we know there’s a particular area we want to focus on, we can command the satellite to take the images. In some cases, we may want to compare the post-disaster with a pre-disaster image,” he said.
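
That pre/post comparison is, at its simplest, change detection by image differencing. Here is a minimal sketch in Python with NumPy – the arrays are synthetic stand-ins for two co-registered images of the same area, and the threshold is invented for this example; real workflows add georeferencing, sensor calibration and cloud masking:

    import numpy as np

    # Synthetic stand-ins for two co-registered grayscale satellite
    # images of the same area, before and after a disaster.
    rng = np.random.default_rng(seed=0)
    pre = rng.uniform(0.0, 1.0, size=(100, 100))
    post = pre.copy()
    post[40:60, 40:60] += 0.5  # simulate a changed (e.g., flooded) patch

    # Pixelwise absolute difference highlights where the scene changed;
    # thresholding it yields a change mask for damage assessment.
    diff = np.abs(post - pre)
    changed = diff > 0.25  # threshold invented for this synthetic example

    print(f"{changed.mean():.1%} of pixels flagged as changed")  # 4.0%

The same differencing idea underlies the flood and tornado-track mapping described below, though operational products work from calibrated sensor data rather than raw pixel values.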

NASA’s fiscal 2011 Earth Science budget is $1.8 billion, covering the operation of existing satellites, Earth science research and analyses, and the development, launch and operation of new satellites and instruments.

NASA recently took images of a host of other disasters, including:

  • NASA provided full views of the Gulf of Mexico four times a day in 2010 for six months to track the evolution of the oil slick from the Deepwater Horizon oil spill.
  • NASA satellites mapped the extent of the 2011 Mississippi River spring floods. The Department of Homeland Security used the maps to plan its flood mitigation operations and to aid in flood assessment and recovery.
  • On April 27, 2011, a series of strong tornadic squall lines passed through Alabama and the surrounding states, spawning more than 50 tornadoes in Alabama alone. NASA images were instrumental in helping the National Weather Service locate, measure and evaluate the tracks of many of these tornadoes in the post-disaster phase.

NASA is the eyes for international disasters, too. Recent images included the aftermath of the Haitian earthquake and the eruption of the Icelandic volcano that spewed ash across Europe, both in 2010.

NASA imaging is used regularly by scientists to predict future disasters. But that comes with a dose of caution.

“In predicting the future, we’re only slightly better than the economists,” Goodman said.

Judi Hasson is an award-winning journalist who writes about technology and all things government.


Government workers and their contractors are intensely interested in everything being said about Gov 2.0 right now and would like a place where they can get a distillation and visualization of it. Today we can show you one solution that provides some interesting insights!

Chris Holden, Community Manager for Recorded Future, helped me use their tool to visualize Gov 2.0 events on the Internet from more than 25,000 sources dating back to May.

Last month, the Department of Homeland Security joined Mitre Corp. and the SANS Institute in providing an important service: highlighting the top 25 most dangerous software errors that lead to today’s most common security breaches.

The newly revised ranking calls out many of the mistakes made by developers while creating new code, such as SQL injection, OS command injection and buffer overflow.
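
To make the first of those concrete, here is a minimal Python sketch (the table, data and attacker string are invented for illustration) showing how SQL built by string concatenation is injectable, and how a parameterized query closes the hole:

    import sqlite3

    # Hypothetical users table, for illustration only.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
    conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

    attacker_input = "nobody' OR '1'='1"

    # Vulnerable: the input is spliced into the SQL text, so the
    # injected OR clause executes and the query matches every row.
    vulnerable = f"SELECT name FROM users WHERE name = '{attacker_input}'"
    print(conn.execute(vulnerable).fetchall())  # [('alice',), ('bob',)]

    # Safe: a parameterized query passes the input as a single bound
    # value; the quotes and OR clause never reach the SQL parser.
    safe = "SELECT name FROM users WHERE name = ?"
    print(conn.execute(safe, (attacker_input,)).fetchall())  # []

OS command injection and buffer overflow follow the same basic pattern: untrusted input crossing a trust boundary without validation or proper encoding.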


In several of my data stories published on Breaking Gov, I have talked about Recorded Future – how it mines more than 25,000 Internet sources and is backed by the CIA and Google.

Chris Holden, Community Manager for Recorded Future, and I met recently to talk about how we could mine and visualize people’s quotes about the debt ceiling debate, which is so much in the news and on people’s minds right now. The initial result is shown above.
