The State Department’s Data-Driven Future

By Jessica Parks, Analyst

In January of this year, the State Department made headlines when it established its Center for Analytics (CfA) to manage and analyze data across the entire department. The formation of an enterprise-level analytics center is a significant move for what has traditionally been a highly decentralized organization. It also reflects a broader goal at the department to better harness and apply its troves of data.

If you’re looking to get in on the action, read on for a couple of areas worth targeting in FY21.

Analytics to Improve Administrative Functions

Under Chief Information Officer Stuart McGuigan, IT systems at the agency are viewed in terms of business output, especially in how they support operational functions like workflows and onboarding. Speaking at an AFCEA Bethesda event in April, he described how the State Department is exploring robotic process automation (RPA) to speed up the onboarding process for new employees and further empower back-office staff.
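
RPA tools essentially script the repetitive clicks and data entry a person would otherwise perform by hand across HR and IT systems. As a rough illustration only (the department's actual workflow and tooling were not described in the talk, so every step and helper below is hypothetical), here is a minimal Python sketch of an onboarding bot chaining provisioning steps into one audited run:

    # Minimal sketch of an RPA-style onboarding flow. Every step and
    # helper here is hypothetical; a real bot would drive live HR and
    # IT systems instead of appending to an in-memory audit trail.

    from dataclasses import dataclass, field

    @dataclass
    class NewHire:
        name: str
        office: str
        completed_steps: list = field(default_factory=list)

    def provision_email(hire: NewHire) -> None:
        # Stand-in for driving the mail admin console.
        hire.completed_steps.append(f"email account for {hire.name}")

    def provision_badge(hire: NewHire) -> None:
        # Stand-in for submitting a badge request form.
        hire.completed_steps.append(f"badge request for {hire.office}")

    def provision_training(hire: NewHire) -> None:
        # Stand-in for enrolling the hire in mandatory training.
        hire.completed_steps.append("mandatory training enrollment")

    def onboard(hire: NewHire) -> NewHire:
        # The bot runs every step in a fixed order and records what it
        # did, which is where the speed and consistency gains come from.
        for step in (provision_email, provision_badge, provision_training):
            step(hire)
        return hire

    hire = onboard(NewHire(name="A. Smith", office="Consular Affairs"))
    print("\n".join(hire.completed_steps))

In a real deployment each helper would drive a live system (a mail admin console, a badge request form, a training portal); the bot's value is that it runs every step in a consistent order and leaves a record behind.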

3 Types of Technology to Sell to USAID Right Now

By Kevin Shaker, Senior Analyst

Many in the contracting community may worry that the U.S. Agency for International Development (USAID) is short on sales opportunities as it continues to face budget cuts. But the cuts could also spell opportunity, as the agency looks for new ways to increase efficiency and reduce costs.

This means that in addition to utilizing shared services, USAID has been increasingly buying automation technologies and higher-caliber virtualized hardware. The agency also devotes a slightly higher share of its IT budget to development, modernization, and enhancement than the civilian average of around 20 percent, which helps fund its data infrastructure. If you are aware of the current trends and drivers within the organization, selling to it becomes less daunting. Three of the organization's top IT priorities stand out: automation, virtualization, and data infrastructure.

Which Agencies Are Spending Big on Big Data?

By Mohamad Elbarasse, Analyst

As agencies take a more data-centric approach to achieving their missions, FY 2014 is shaping up to be the year of big data, with a slew of funded initiatives in play that will set the bar for what analytics can bring to the table. Data-rich agencies like DHS are investing big to get their holdings under control. These investments, coupled with the White House’s Open Data Policy, which dictates that agencies collect or create information in a manner that “supports the downstream information processing and dissemination activities,” signal a paradigm shift from hypothesis-driven to data-driven decision making and discovery at the federal level.

The National Science Foundation, Department of Defense, National Institutes of Health, Department of Energy, and the U.S. Geological Survey at the Department of the Interior received $200 million for research and development in the field of big data. These initiatives run the gamut from NIH’s 1000 Genomes Project, which pairs big data with the Amazon Web Services cloud to make 200 terabytes of data on human genetic variation available to the public, to the Defense Advanced Research Projects Agency’s (DARPA) XDATA program, which will address challenges such as developing scalable algorithms for processing imperfect data in distributed data stores. DARPA plans to invest $25 million a year through 2016 in XDATA.
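
XDATA’s actual algorithms are not detailed here, but the core problem it names, processing imperfect data at scale, is easy to illustrate. The Python sketch below (with made-up records and cleaning rules) shows the basic pattern: parse defensively and skip bad rows so that one malformed record cannot sink an aggregate computation:

    # A toy version of the "imperfect data" problem XDATA targets:
    # aggregating records that arrive with missing or malformed fields.
    # The records and cleaning rules are illustrative, not XDATA's own.

    from statistics import mean

    raw_records = [
        {"station": "A", "reading": "42.1"},
        {"station": "A", "reading": None},         # missing value
        {"station": "B", "reading": "not-a-num"},  # malformed value
        {"station": "B", "reading": "37.5"},
    ]

    def parse_reading(value):
        """Return a float, or None if the value is missing or malformed."""
        try:
            return float(value)
        except (TypeError, ValueError):
            return None

    def average_by_station(records):
        # Skip imperfect rows rather than letting one bad record fail
        # the whole computation.
        groups = {}
        for rec in records:
            reading = parse_reading(rec.get("reading"))
            if reading is not None:
                groups.setdefault(rec["station"], []).append(reading)
        return {station: mean(vals) for station, vals in groups.items()}

    print(average_by_station(raw_records))  # {'A': 42.1, 'B': 37.5}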

According to Simon Szykman, CIO at the Department of Commerce, information sharing should be agencies’ first priority. Speaking at an AFFIRM & GITEC event in September, Szykman stated that one of the easiest ways to make big data investments more cost effective in the long run is to think about information sharing early on. That means agencies are going to need help managing, standardizing, and ensuring the interoperability of their data. Vendors with products positioned to help with those tasks should gear their messaging toward addressing those needs and emphasize long-run efficiencies. Szykman went on to say that the purpose of opening up government data is not just to increase transparency, but to allow others to find value in the data. “We haven’t cornered the market on good ideas,” said Szykman, elaborating that the biggest benefits of an open data policy are the things we can’t imagine today, which can come about only by making more data available to more people. Szykman oversees Commerce’s $2.5 billion IT budget, and the agency is slated to spend over $300 million on General Purpose Data and Statistics in FY 2014.
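
To make the standardization and interoperability need concrete, here is a minimal Python sketch, using a hypothetical schema and records, of the kind of validation gate an agency might run before publishing data for downstream consumers:

    # A minimal validation gate for published data. The schema and the
    # records are hypothetical; the point is that agreeing on required
    # fields and types up front is what makes shared data interoperable.

    SCHEMA = {
        "dataset_id": str,
        "agency": str,
        "year": int,
    }

    def conforms(record: dict) -> bool:
        """Check that a record has every required field with the right type."""
        return all(
            name in record and isinstance(record[name], expected)
            for name, expected in SCHEMA.items()
        )

    records = [
        {"dataset_id": "doc-001", "agency": "Commerce", "year": 2014},
        {"dataset_id": "doc-002", "agency": "Commerce", "year": "2014"},  # wrong type
    ]

    publishable = [r for r in records if conforms(r)]
    print(f"{len(publishable)} of {len(records)} records ready to share")

Checks like this are the unglamorous groundwork behind information sharing: if every published record carries the same fields and types, downstream consumers can build on the data without bespoke cleanup.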

Ken Rogers, Chief Technology Strategist at the Department of State, also spoke at the event and said that “Data is the primary sustainable asset in an organization.” Therefore, the proper maintenance, security, and analysis of that data are paramount to the success of the organization. Along with data management, data integration, and information sharing requirements, agencies will be in dire need of data security solutions to protect the integrity of their data. Expect to see more agencies taking on a data-centric outlook and be sure to emphasize that getting big data right the first time around can lead to some big savings down the road.
