SBA tech transformation underway, but more work needs to be done

By Kevin Shaker, senior analyst

When Maria Roat took the tech reins at the Small Business Administration last year, she promised to transition a large portion of the agency’s systems to the cloud. Things seem to be well underway, based on the chief information officer’s recent speech at the Citizen Engagement Summit hosted by FCW.

While SBA has made great strides since Roat, the former chief technology officer at the Department of Transportation, took over, tech companies still have opportunities to shape the future of IT at the SBA.

So far the agency has rebuilt SBA.gov’s interface, making it easier for small business owners to apply for loans and giving them easier access to loan processes and information. The website is also now mobile-friendly, giving internal and external customers more flexibility in how they use SBA’s services, an early win for Roat’s leadership.


Navy Operations Fueled by Data

By Stephanie Meloni, Consultant

The Navy is launching two new ventures that will produce significant opportunities for networking, infrastructure, and ultimately big data and analytics tech companies. With Task Force Cyber Awakening, the Navy used an agile, collaborative effort to adapt quickly to cyber challenges; it now plans to use the same approach to become more interoperable.


Want to Help Government Agencies with Their Big Data Strategy?

By Stephanie Meloni, Senior Analyst

After years of hearing buzz about big data, could it be that the government is actually starting to implement its use? According to a recent survey conducted by IDC Government Insights, the answer is yes. The survey gives insight into how government is using its data, along with what industry can do to help agencies improve their current big data strategy and processes. It places most federal agencies at about the mid-point of the maturity cycle for big data adoption, which means those agencies have a defined big data strategy and are generating repeatable results. These agencies have made a business case for the use of big data, but are still figuring out how to use big data technologies and data consistently. Being only halfway to optimization also means these agencies still have work to do to overcome process inefficiencies.


Which Agencies are Spending Big on Big Data?

By Mohamad Elbarasse, Analyst

As agencies take a more data-centric approach to achieving their missions, FY 2014 is shaping up as the year of big data, with a slew of agencies funding initiatives that will set the bar for what analytics can bring to the table. Data-rich agencies like DHS are investing heavily to get their data under control. These investments, coupled with the White House’s Open Data Policy, which dictates that agencies should collect or create information in a manner that “supports the downstream information processing and dissemination activities,” signal a paradigm shift from hypothesis-driven to data-driven decision making and discovery at the federal level.

The National Science Foundation, Department of Defense, National Institutes of Health, Department of Energy, and the US Geological Survey at the Department of the Interior collectively received $200 million for research and development in the field of big data. These initiatives run the gamut from NIH’s 1000 Genomes Project, which pairs big data with the Amazon Web Services cloud to make 200 terabytes of data on human genetic variation available to the public, to the Defense Advanced Research Projects Agency’s (DARPA) XDATA program. XDATA will address challenges such as developing scalable algorithms for processing imperfect data in distributed data stores. DARPA plans to invest $25 million a year through 2016 in the program.
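To make the “imperfect data” challenge concrete: a common case is records with missing or malformed fields that would crash a naive distributed job. The sketch below is purely illustrative (it is not drawn from XDATA itself, and the field names are invented); it shows the kind of defensive normalization step a map phase might apply so bad records degrade gracefully instead of failing the job.

```python
def normalize_record(record):
    """Return a cleaned record, or None if it is unrecoverable.

    Illustrative only: field names ("id", "value") are hypothetical.
    """
    if not isinstance(record, dict) or "id" not in record:
        return None  # no usable key: drop the record
    cleaned = {"id": str(record["id"])}
    # Coerce a numeric field, tolerating strings and missing values.
    try:
        cleaned["value"] = float(record.get("value", 0))
    except (TypeError, ValueError):
        cleaned["value"] = 0.0  # fall back to a neutral default
    return cleaned

raw = [
    {"id": 1, "value": "3.5"},   # numeric value stored as text
    {"id": 2},                   # value missing entirely
    {"value": 9.9},              # id missing: unrecoverable
]
cleaned = [r for r in (normalize_record(x) for x in raw) if r is not None]
print(cleaned)  # two records survive; the keyless one is dropped
```

The point of the pattern, at any scale, is that per-record validation lets a distributed computation tolerate dirty inputs without manual intervention.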

According to Simon Szykman, CIO at the Department of Commerce, information sharing should be agencies’ first priority. Speaking at an AFFIRM & GITEC event in September, Szykman stated that one of the easiest ways to make big data investments more cost effective in the long run is to think about information sharing early on. That means agencies are going to need help managing, standardizing, and ensuring the interoperability of their data. Vendors with products positioned to help with those tasks should gear their messaging toward addressing those needs and emphasizing long-run efficiencies. Szykman went on to say that the purpose of opening up government data is not just to increase transparency, but to allow others to find value in the data. “We haven’t cornered the market on good ideas,” said Szykman, elaborating that the biggest benefits of an open data policy are the things we can’t imagine today, but that can come about by making more data available to more people. Szykman oversees Commerce’s $2.5 billion IT budget, and the agency is slated to spend over $300 million on General Purpose Data and Statistics in FY 2014.
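“Thinking about information sharing early on” often comes down to publishing datasets with machine-readable metadata so downstream users can discover and reuse them. The sketch below is an assumption-laden illustration, not an official schema: the field names loosely echo the data.json catalog style associated with the Open Data Policy, and the dataset title and URL are hypothetical.

```python
import json

def make_catalog_entry(title, description, download_url, media_type="text/csv"):
    """Build a minimal machine-readable catalog entry for one dataset.

    Field names loosely follow the data.json catalog style; treat them
    as illustrative, not as the official schema.
    """
    return {
        "title": title,
        "description": description,
        "distribution": [
            {"downloadURL": download_url, "mediaType": media_type},
        ],
    }

entry = make_catalog_entry(
    "Example loan statistics",             # hypothetical dataset
    "Illustrative aggregate statistics.",
    "https://example.gov/data/loans.csv",  # hypothetical URL
)
print(json.dumps(entry, indent=2))
```

Emitting metadata like this at publication time, rather than retrofitting it later, is the kind of early design choice Szykman argued pays off in long-run cost effectiveness.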

Ken Rogers, Chief Technology Strategist at the Department of State, also spoke at the event and said that “Data is the primary sustainable asset in an organization.” Therefore, the proper maintenance, security, and analysis of that data are paramount to the success of the organization. Along with data management, data integration, and information sharing requirements, agencies will be in dire need of data security solutions to protect the integrity of their data. Expect to see more agencies taking on a data-centric outlook and be sure to emphasize that getting big data right the first time around can lead to some big savings down the road.
