Big Data and shared services, arguably two of the hottest trends in federal IT, possess the unquestionable power to revolutionize our ability to share information, make informed decisions and create knowledge – all while saving taxpayers boatloads of cash. However, despite the myriad memos, initiatives and projects focused on these transformational strategies, the federal IT community does not seem to be giving much attention to one of the most critical requirements needed to truly maximize these systems: bandwidth.

Here is the problem:

For Big Data to work, agencies must be able to quickly and repeatedly transfer massive amounts of data, be it down the street, across the country or around the world. Without the proper infrastructure and networks (e.g., 100 Gbps connections) to facilitate this basic requirement, even the most robust and well-thought-out Big Data deployments will be reined in.
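To put that in perspective, here is a rough back-of-envelope sketch. The dataset sizes and link speeds below are illustrative assumptions, and the math ignores protocol overhead, congestion and storage bottlenecks, so real transfers would take longer; the point is the order of magnitude.

```python
# Illustrative transfer-time estimates. Dataset sizes and link speeds are
# assumptions for the sake of example; figures ignore protocol overhead,
# congestion and disk I/O, so they represent a best case at line rate.

DATASETS_TB = {"10 TB archive": 10, "100 TB collection": 100, "1 PB repository": 1000}
LINKS_GBPS = [1, 10, 100]

def transfer_hours(terabytes: float, gbps: float) -> float:
    """Ideal transfer time in hours: total bits divided by link rate."""
    bits = terabytes * 1e12 * 8          # decimal terabytes -> bits
    return bits / (gbps * 1e9) / 3600    # seconds at line rate -> hours

for name, tb in DATASETS_TB.items():
    row = ", ".join(f"{g} Gbps: {transfer_hours(tb, g):,.1f} h" for g in LINKS_GBPS)
    print(f"{name:>18} -> {row}")
```

Under those assumptions, a petabyte that would tie up a 1 Gbps link for roughly three months moves in about a day over a 100 Gbps connection – the difference between an analytics program that works and one that waits.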

The same general concept holds true if we are to move large, data-intensive applications into a shared services or cloud environment, especially when it comes time to run analysis, reporting, and continuous monitoring.

The potential consequences of not acting to meet these bandwidth requirements are exacerbated by the fact that we continue to create new data at unprecedented rates. What’s more, a culture that is quickly marching toward increased information sharing and silo-less environments means network administrators will be on the hot seat if they are unable to provide the bandwidth required to support these new strategies.

The scientific community, including the Department of Energy’s network of national laboratories, has already begun to tap into the power of 100 Gbps networks to support the large amounts of data produced by scientists and the medical industry. In October 2011, DOE announced upgrades to its already super-fast network connection used by scientists and researchers at more than 40 national locations and supercomputer facilities.

Just last week, Internet2 announced that it had lit a 100 Gbps connection between Chicago and Indianapolis to support medical information sharing. While these are great moves for their respective communities, the time is now for the rest of us to follow suit.

To maximize the impact of today’s transformation and efficiency efforts, it is imperative that we address the inevitable requirement to move large amounts of data and start upgrading our networks now. After all, what good is a performance vehicle without the Autobahn to drive it on?

Parham Eftekhari is co-founder and director of research for the Government Technology Research Alliance.