By necessity, agencies keep multiple IT projects going simultaneously. They’re upgrading infrastructure to use the latest communications protocols and cloud services, modernizing applications with artificial intelligence, deploying tools like containerization to take continued advantage of legacy systems, and adding automation and data integration to improve customer and employee experience.
Because of these competing project demands, spending time up front thinking hard about technology and requirements is the key to the success of any automation project, said Gabrielle Rivera, vice president at Maximus.
“The assessment is going to be critical with introducing new automation or the improvement of automation that’s currently in the environment,” Rivera said. “Having a clear understanding of when automation should be applied and what type of automation and what can be a facilitator is imperative.”
Several factors must align before the best approach emerges. Rivera identified three (a rough intake sketch follows the list):
- Infrastructure: Does the agency want to retire legacy networks or increase cloud use?
- Technology: Is the process suited to robotic process automation, machine learning or generative artificial intelligence?
- Mission: What is the use case? Is there an expected return on investment? Are there any regulatory or statutory requirements? Are there mission-driven deadlines or other schedule concerns?
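Those three factors lend themselves to a structured intake checklist. Here is a minimal Python sketch of what such a rubric might look like; the field names and readiness logic are hypothetical illustrations, not Maximus’s assessment methodology:

```python
from dataclasses import dataclass, field

# Hypothetical intake rubric for an automation candidate. The field names,
# categories and readiness logic are illustrative, not a published methodology.
@dataclass
class AutomationCandidate:
    name: str
    legacy_retirement_planned: bool   # infrastructure: retiring legacy networks?
    cloud_expansion_planned: bool     # infrastructure: increasing cloud use?
    technology_fit: str               # technology: "RPA", "ML" or "GenAI"
    expected_roi: float               # mission: expected return on investment
    regulatory_requirements: list = field(default_factory=list)  # mission
    deadline_months: int = 0          # mission: schedule pressure, 0 if none

def ready_for_automation(c: AutomationCandidate) -> bool:
    """All three factors (infrastructure, technology, mission) must align."""
    infrastructure = c.legacy_retirement_planned or c.cloud_expansion_planned
    technology = c.technology_fit in {"RPA", "ML", "GenAI"}
    mission = c.expected_roi > 0  # regulatory and schedule items reviewed separately
    return infrastructure and technology and mission

candidate = AutomationCandidate(
    name="invoice-intake",
    legacy_retirement_planned=True,
    cloud_expansion_planned=False,
    technology_fit="RPA",
    expected_roi=1.8,
)
print(ready_for_automation(candidate))  # True
```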
She pointed out that agencies increasingly also want to know when and how best to employ AI.
Rivera cautioned against overlooking the employee training implications of automation projects. That means training not only in the operation of the resulting systems themselves, but also in the skills employees will need for the higher-level activities that automation should free them to take on.
Because newly automated processes operate in highly interconnected environments, automation requires a team approach, she said.
“The pain points of automation [are] the same pain points that we deal with in any implementation,” Rivera said. For example, there is the potential for tension among the implementers, cybersecurity teams and the business program managers for whom the automation is designed. For that reason, she advised “having everyone come together on an agreement and understanding of what should be automated and why.”
How to manage ATOs effectively
Automation project plans should ensure that cybersecurity stays intact and that adjacent systems continue to operate without disruption, especially when the goal includes automating authorities to operate (ATOs). That’s often a side benefit of the software factory approach to development and its iterative output of new functions.
Moreover, Rivera added, project leaders should ensure they know the answers to questions such as: “Is the new tool and technology implementation going to fit within the environment? The upstream and downstream effects of what it’s automating? Also, is there a human in the loop to validate what’s being produced?”
The ATO point is particularly central to automation and continuous delivery of software products, she said. Therefore, agencies and their contractors should work collaboratively on a phased approach to adopting DevSecOps and continuous ATOs.
“I don’t want to call it low hanging fruit because these are mission critical applications,” Rivera said. “What we can do is take a small component and a small portion and conduct an assessment within a fixed environment, within the continuous ATO process, so that we can test those first before we put them into a production environment.”
She noted, “Some of our pipelines have over 160 applications that we’re moving at one time, and so there’s a need for us to automate those.”
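One way to picture that phased, continuous-ATO gating is as a promotion gate in the pipeline: a small component is assessed against automated control checks in a fixed environment before it touches production. A minimal Python sketch, with hypothetical check names standing in for real scanning tools:

```python
# Hypothetical continuous-ATO promotion gate. A component is assessed in a
# fixed (non-production) environment, and promoted only when every automated
# control check passes. Check names are stand-ins for real scanning tools.

ATO_CHECKS = {
    "static_analysis": lambda component: True,   # stand-in for a SAST scan
    "dependency_scan": lambda component: True,   # stand-in for an SCA scan
    "config_baseline": lambda component: True,   # stand-in for baseline checks
    "pii_handling":    lambda component: True,   # stand-in for a privacy review
}

def assess_in_fixed_environment(component: str) -> dict:
    """Run each automated control check against one small component."""
    return {name: check(component) for name, check in ATO_CHECKS.items()}

def promote_to_production(component: str) -> bool:
    """Promote only if the fixed-environment assessment is fully clean."""
    results = assess_in_fixed_environment(component)
    failed = [name for name, passed in results.items() if not passed]
    if failed:
        print(f"{component}: blocked; failed checks: {', '.join(failed)}")
        return False
    print(f"{component}: all checks passed; promoting")
    return True

promote_to_production("claims-ingest-service")
```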
ATO considerations beyond cybersecurity include avoiding blackout periods and properly handling and safeguarding personally identifiable information used by applications.
Where cloud and AI cross paths
Agencies across the board now think critically about where commercial cloud service providers fit into the automation and modernization equation, Rivera said. It’s the natural evolution from the government’s cloud-first to cloud-smart philosophy.
Whether newly automated functions should execute in the cloud raises several questions, she said: “Do we need any actions beforehand? Do we need to refactor? Do we need to reengineer any code? Could containerization be best for said application? How much is it going to cost to migrate the applications to the cloud?”
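Those questions map loosely onto the familiar migration-strategy triage of rehost, refactor or containerize. A rough sketch of that decision flow, with hypothetical inputs and thresholds:

```python
# Rough triage of a cloud migration strategy based on Rivera's questions.
# The inputs and decision order are hypothetical, not an agency standard.

def migration_strategy(containerizable: bool,
                       code_rework_needed: bool,
                       est_migration_cost: float,
                       budget: float) -> str:
    if est_migration_cost > budget:
        return "defer: estimated migration cost exceeds budget"
    if containerizable:
        return "containerize and replatform"
    if code_rework_needed:
        return "refactor or reengineer code before migrating"
    return "rehost (lift and shift)"

print(migration_strategy(containerizable=False,
                         code_rework_needed=True,
                         est_migration_cost=250_000,
                         budget=400_000))
# refactor or reengineer code before migrating
```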
In considering how to manage cloud costs and return on investment, agencies must also think about whether a given application can run efficiently in multiple clouds. Similarly, the components of large agencies or departments should consider how they can standardize their approaches to new application development, Rivera said.
“Can we find a consistent approach that works across the enterprise … in order to produce not just efficiencies within the procurement of automation technologies but also in how that is deployed to generate some sort of a beneficial outcome that is not realized currently?” she asked.
The AI question comes up in current automation assessments. Rivera said it’s important to understand that AI “can be a facilitator to technologies already in place” but not something that necessarily applies in every case.
Knowing “how and why we’re leveraging AI is going to allow us to produce the results that we’re looking for, as opposed to just blanketing it or layering it on,” she said.
In all cases, Rivera recommended that agencies measure their projects against established metrics and pointed out that not all metrics are strictly technical.
“It’s important for us, when we’re talking about metrics, to not just measure the health of the program, service level agreements and system uptime, but also that we’re measuring the employee experience and the customer experience.”
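A scorecard built on that advice might track technical and experience measures side by side. A minimal sketch; the metric names and targets are illustrative only:

```python
# Illustrative program scorecard pairing technical health metrics with
# employee- and customer-experience measures. Names and targets are hypothetical.
scorecard = {
    "sla_compliance_pct":        {"target": 99.0, "actual": 99.4},
    "system_uptime_pct":         {"target": 99.9, "actual": 99.95},
    "employee_experience_score": {"target": 4.0,  "actual": 3.7},  # 1-5 survey scale
    "customer_experience_score": {"target": 4.2,  "actual": 4.3},  # 1-5 survey scale
}

for metric, v in scorecard.items():
    status = "ok" if v["actual"] >= v["target"] else "below target"
    print(f"{metric}: {v['actual']} ({status})")
```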
This article and video interview were originally published on Federal News Network on September 2, 2024.