What you need to know about Trump’s plan for government


By Chris Wiedemann, consultant

Trying to read the tea leaves on the Trump administration’s technology priorities has been a challenge for all of us in the industry. But we got a little more clarity on how the new administration would like to manage the executive branch from a report last week outlining a new budget, including appropriations language, that President Trump plans to submit in “mid to late April.”

While we haven’t seen the budget itself (and, as always, the appropriations committees will have significant input into the process), reading the Heritage Foundation report that Trump’s budget is purportedly based on reveals some potentially dramatic changes to executive agencies, particularly in the civilian sector. Those potential changes include:

Read more of this post

New IoT Security Principles On the Way

By Tomas O’Keefe, Consultant

If you’re looking for a growing area of investment in federal IT, look no further than securing the Internet of Things (IoT).

There’s been a lot of recent talk about the IoT, with one of the latest conversations led by the National Institute of Standards and Technology (NIST) at an August 31st workshop to help industry get a grasp on the roadmap the federal government is pursuing in the coming year. IoT leaders across federal agencies will outline strategic principles that will guide near- and long-term purchasing decisions for securing internet-connected devices.

Read more of this post

What’s Next for Cloud in the Federal Government?

By Chris Wiedemann, Consultant

immixGroup’s Event Center was packed to the gills the morning of April 12 with technology companies looking for insight into what’s next for federal cloud adoption. The good news: new federal policy, renewed emphasis from government leaders, and updated acquisition methods are all creating opportunities for industry to sell technology as a service to the federal government.

So where are the cloud-specific opportunities? My colleague, DOD Manager Lloyd McCoy, and I dug into this question for nearly an hour during the Market Intelligence Briefing portion of the event.

Here are some key highlights from that discussion, showing where we’re seeing an uptick in cloud adoption across the federal IT community:

Read more of this post

Which Agencies are Spending Big on Big Data?

By Mohamad Elbarasse, Analyst

As agencies take a more data-centric approach to achieving their missions, FY 2014 is shaping up to be the year of big data: a slew of agencies have funded initiatives in play that will set the bar for what analytics can bring to the table. Agencies like DHS, which hold enormous volumes of data, are investing heavily to get it all under control. These investments, coupled with the White House’s Open Data Policy, which directs agencies to collect or create information in a manner that “supports the downstream information processing and dissemination activities,” signal a paradigm shift from hypothesis-driven to data-driven decision making and discovery at the federal level.
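As a rough illustration of what “supporting downstream processing” means in practice, here is a minimal sketch of a machine-readable catalog entry, loosely modeled on the data.json metadata that accompanies the Open Data Policy. The dataset title, publisher details, and URL are hypothetical placeholders, and the field list is abridged rather than authoritative.

import json

# Minimal sketch of a machine-readable catalog entry, loosely modeled on
# the data.json metadata associated with the Open Data Policy.
# The title, description, and URL below are hypothetical placeholders.
catalog_entry = {
    "title": "Port-of-Entry Wait Times",  # hypothetical dataset
    "description": "Hourly wait times at land ports of entry.",
    "keyword": ["border", "wait times", "transportation"],
    "modified": "2013-10-01",
    "publisher": {"name": "Department of Homeland Security"},
    "accessLevel": "public",
    "distribution": [
        {
            "mediaType": "text/csv",
            "downloadURL": "https://example.gov/data/wait-times.csv",  # placeholder
        }
    ],
}

# Emitting the entry as JSON keeps it consumable by downstream tools,
# which is the point of collecting data in open, machine-readable formats.
print(json.dumps(catalog_entry, indent=2))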

The National Science Foundation, Department of Defense, National Institutes of Health, Department of Energy, and the US Geological Survey at the Department of the Interior received $200 million for research and development in the field of big data. These initiatives run the gamut from NIH’s 1000 Genomes Project, which pairs big data with the Amazon Web Services cloud to make 200 terabytes of data on human genetic variation available to the public, to the Defense Advanced Research Projects Agency’s (DARPA) XDATA program, which will tackle challenges such as developing scalable algorithms for processing imperfect data in distributed data stores. DARPA plans to invest $25 million a year through 2016 in XDATA.
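To make the AWS piece of that example concrete, below is a minimal sketch of browsing the public 1000 Genomes archive with boto3. It assumes the data is still published as a public S3 bucket named 1000genomes with anonymous (unsigned) access allowed; verify both against the AWS public data set registry before relying on them.

import boto3
from botocore import UNSIGNED
from botocore.config import Config

# The 1000 Genomes data is published as a public S3 bucket, so no AWS
# credentials are needed; UNSIGNED tells boto3 to skip request signing.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# List a handful of objects to see how the archive is laid out.
# The bucket name "1000genomes" is the publicly documented one;
# confirm it before building anything on top of this sketch.
resp = s3.list_objects_v2(Bucket="1000genomes", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])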

According to Simon Szykman, CIO at the Department of Commerce, information sharing should be agencies’ first priority. Speaking at an AFFIRM & GITEC event in September, Szykman said one of the easiest ways to make big data investments more cost effective in the long run is to think about information sharing early on. That means agencies will need help managing, standardizing, and ensuring the interoperability of their data; vendors with products positioned to help with those tasks should gear their messaging toward those needs and emphasize long-run efficiencies. Szykman went on to say that the purpose of opening up government data is not just to increase transparency, but to allow others to find value in the data. “We haven’t cornered the market on good ideas,” said Szykman, adding that the biggest benefits of an open data policy are the things we can’t imagine today, but that can come about by making more data available to more people. Szykman oversees Commerce’s $2.5 billion IT budget, and the agency is slated to spend over $300 million on General Purpose Data and Statistics in FY 2014.
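What “help managing, standardizing, and ensuring the interoperability of their data” can look like at the code level: the sketch below maps two hypothetical agency feeds, with invented field names and inconsistent date formats, onto one shared schema. It illustrates the pattern, not any particular product.

from datetime import datetime

# Two hypothetical agency feeds describing the same kind of record
# with different field names and date formats.
feed_a = [{"site": "Denver", "reading": "42", "when": "10/01/2013"}]
feed_b = [{"location": "Boulder", "value": 37, "timestamp": "2013-10-01"}]

def normalize_a(rec):
    # Map feed A's fields and M/D/Y dates onto the shared schema.
    return {
        "location": rec["site"],
        "value": float(rec["reading"]),
        "date": datetime.strptime(rec["when"], "%m/%d/%Y").date().isoformat(),
    }

def normalize_b(rec):
    # Feed B already uses ISO dates; only the field names change.
    return {
        "location": rec["location"],
        "value": float(rec["value"]),
        "date": rec["timestamp"],
    }

# A single standardized list is what makes cross-agency sharing cheap later.
standardized = [normalize_a(r) for r in feed_a] + [normalize_b(r) for r in feed_b]
print(standardized)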

Ken Rogers, Chief Technology Strategist at the Department of State, also spoke at the event, noting that “data is the primary sustainable asset in an organization.” Proper maintenance, security, and analysis of that data are therefore paramount to an organization’s success. Along with data management, data integration, and information sharing requirements, agencies will be in dire need of data security solutions to protect the integrity of their data. Expect to see more agencies adopting a data-centric outlook, and be sure to emphasize that getting big data right the first time can lead to big savings down the road.
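One inexpensive building block for protecting the integrity of data at rest is checksumming, so corruption or tampering outside the normal pipeline becomes detectable. A minimal sketch using Python’s standard hashlib follows; the file path is a placeholder.

import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large datasets need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record a checksum when a dataset is ingested, then re-verify it later;
# a mismatch means the data changed outside the normal pipeline.
# "dataset.csv" is a placeholder path.
baseline = sha256_of_file("dataset.csv")
assert sha256_of_file("dataset.csv") == baseline, "integrity check failed"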
