Data keeps the wheels of government turning and therefore must be available to employees and the public. But the federal government is also subject to an unprecedented daily onslaught of cyberattacks, necessitating tight cybersecurity that can limit access to data critical to operations and decision-making. Agencies must therefore strike the right balance between securing data and allowing for appropriate data sharing.

To explore this topic, Sunil Pentapati, vice president for technology strategy and solutions at Maximus; Scott Beliveau, chief of enterprise advanced analytics for the U.S. Patent and Trademark Office (USPTO); and Lauren Pavlik, chief of data and software services, enterprise cloud management agency for the Department of the Army, sat down with George Jackson, vice president for events at GovExec. The discussion centered on the challenges agencies face in balancing effective cybersecurity with the demands of seamless data flow, and it surfaced new and innovative strategies to address those challenges.

Beliveau kicked off the conversation by defining the parameters of the problems facing the government. He noted that initiatives such as the Foundations for Evidence-Based Policymaking Act require federal agencies to use data-driven decision-making, so data has become even more essential to agency operations than before. This has compelled agencies to see data in new ways. As part of this new approach, the USPTO is shifting from treating cybersecurity as an afterthought to embedding security from the beginning of the development process. The agency has also put greater emphasis on training employees in data literacy, so they can better understand their role in protecting particular assets and why cybersecurity measures are in place.

To improve the customer experience, Maximus is focusing on building new architectures and wrapping them in cybersecurity. In particular, the company is integrating requirements like zero trust and the Evidence Act into new solutions. Maximus is also exploring alternatives to the traditional centralized data lake. For example, a data fabric or data mesh allows an agency to leave data where it resides and expose it through APIs, so it can be leveraged in different analytical workloads to support data-driven decision-making.
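The data fabric/mesh idea described above can be sketched in a few lines: datasets stay in their home systems, and a thin catalog exposes them through a uniform, access-controlled API rather than copying everything into a central lake. This is a minimal illustration, not any agency's actual implementation; the class, dataset names, and roles are all hypothetical.

```python
class DataFabricCatalog:
    """Registry mapping dataset names to in-place accessors plus access rules."""

    def __init__(self):
        self._sources = {}

    def register(self, name, accessor, allowed_roles):
        # The accessor reads from the system of record; nothing is copied
        # into a central store.
        self._sources[name] = (accessor, set(allowed_roles))

    def query(self, name, role):
        # Enforce access control at the API layer before serving data.
        accessor, allowed = self._sources[name]
        if role not in allowed:
            raise PermissionError(f"role '{role}' may not read '{name}'")
        return accessor()


# Two notional source systems, each keeping its own data in place.
patent_filings = [{"id": 1, "status": "pending"}]
hr_records = [{"id": 7, "clearance": "secret"}]

catalog = DataFabricCatalog()
catalog.register("patent_filings", lambda: patent_filings, ["analyst", "examiner"])
catalog.register("hr_records", lambda: hr_records, ["hr_admin"])

# An analyst can read patent data served in place...
print(catalog.query("patent_filings", role="analyst"))
# ...but the same API refuses the HR dataset, keeping controls in one layer.
```

The key design point is that both access and control live in the API layer, so data never has to move to be either shared or protected.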

Pentapati described how Maximus is addressing the obstacles that hinder data sharing within agencies and between different federal agencies. The company develops data architectures and data platforms that are loosely coupled and tightly integrated. This creates open architectures that allow federal employees and the public to access the information they need, while applying enough controls that the data cannot be misused. Such architectures must also be flexible enough to adapt to the threats government agencies will face in the future.

Pavlik agreed with Pentapati that data discoverability and visibility were critical issues for the government. She emphasized that data visibility is important not only for accessibility but also for cybersecurity—to ensure safe and accurate information within the Department of Defense. She said, “We need to set up our systems with those continuous monitoring tools for visibility into the threats, active cybersecurity monitoring, and an effective incident management to make sure that the soldiers always know when they're going into battle that their information is absolutely trustworthy.”

All of the participants agreed on the value of automation in providing data access and security. Pentapati noted that a big challenge for agencies is the “handshakes” between various systems, users, and products. When systems take time to recognize each other, it slows everything down and reduces the overall speed of government operations. To shorten these cycle times and ensure that data reaches the customer more quickly, vendors need to standardize their methodologies and automate as much as possible. He suggested that agencies adopt DataOps, which leverages DevSecOps principles for secure data analytics—solving inefficient data generation and processing problems and improving data quality by applying controls in an automated fashion.
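The DataOps notion of applying quality controls automatically, rather than through manual review, can be illustrated with a small pipeline step that validates records as they flow through and quarantines failures with their reasons. This is a hedged sketch; the field names and rules are invented for illustration only.

```python
def validate_record(record, rules):
    """Return the names of the rules this record fails."""
    return [name for name, check in rules.items() if not check(record)]


def run_pipeline(records, rules):
    """Split records into clean and quarantined sets, keeping failure reasons."""
    clean, quarantined = [], []
    for rec in records:
        failures = validate_record(rec, rules)
        if failures:
            quarantined.append((rec, failures))  # held back, with reasons
        else:
            clean.append(rec)  # passes all automated controls
    return clean, quarantined


# Hypothetical controls, expressed as code so they run on every record.
rules = {
    "has_id": lambda r: bool(r.get("id")),
    "status_known": lambda r: r.get("status") in {"open", "closed"},
}

records = [
    {"id": "A-1", "status": "open"},
    {"id": "", "status": "archived"},  # fails both checks
]

clean, quarantined = run_pipeline(records, rules)
```

Because the controls are code rather than checklists, they run on every record at pipeline speed, which is what shortens the cycle times the panel described.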

Participants also cited the critical importance of the customer experience. Maximus focuses on the user aspect of zero trust architecture because it drives all the other aspects. Since solutions can be built in many different ways—low code, no code, or custom code—a unifying governance model across all technology constructs is essential to delivering a great customer experience. That experience begins with bringing the product owners together and ensuring user participation. Customer input is crucial to informing solution design.

Beliveau observed that the discussion had repeatedly touched on prioritizing human-centered design when developing products or processes. Discussions and requirements assessments with customers are a necessary part of any process—whether that means helping make architectural decisions, getting training on technology, giving feedback, creating a proof of concept, or anything else. When customers are involved, they feel a sense of ownership and a level of trust in the technology.

The discussion highlighted the need for continued collaboration between government agencies and industry—and the importance of customers—in ensuring the accessibility and security of agency data.

Listen to the entire discussion at: