Five Robotic Process Automation Risks to Avoid
As organizations explore how software robots — or bots — can help automate administrative tasks and decisions, it pays to keep in mind some of the risks that come with the territory.
Software robots have emerged as a potential way for organizations to achieve clear cost savings. As an evolution of automated application testing and quality assurance (QA) platforms, software bots can simulate activities that humans perform via screens and applications. A software bot can be trained to perform QA tasks, such as tapping and swiping through an app like a human would, or to execute repetitive service tasks such as generating bills. It does this either by recording what humans do or by running scripts that are straightforward to construct. The result is the ability to automate rote tasks that do not require complex decision-making, without changing the underlying systems.
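To make the scripted side of this concrete, a repetitive billing task of the kind described above can be sketched as a short script. Everything here is hypothetical for illustration: the record fields, rates, and function names are invented, and a real RPA platform would drive the existing billing screens rather than compute totals directly.

```python
# Minimal sketch of a scripted software bot for a rote billing task.
# Hypothetical example: the record layout and rate are invented for
# illustration, not drawn from any particular RPA product.

def generate_bill(record):
    """Produce one bill line from a usage record, as a clerk would."""
    amount = round(record["units"] * record["rate"], 2)
    return (f"{record['customer']}: {record['units']} units "
            f"@ {record['rate']:.2f} = {amount:.2f}")

def run_billing_bot(records):
    """Replay the same fixed steps over every record -- no judgment needed."""
    return [generate_bill(r) for r in records]

records = [
    {"customer": "Acme Co", "units": 120, "rate": 0.15},
    {"customer": "Globex", "units": 45, "rate": 0.15},
]

for line in run_billing_bot(records):
    print(line)
```

The point of the sketch is not the arithmetic but the shape of the work: a fixed sequence of steps applied identically to every record, which is exactly what makes such tasks candidates for bot automation.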
Deploying bots seems deceptively simple. Robotic process automation (RPA) software vendors are actively pitching their platforms, and professional services organizations are talking up the cost savings bots offer with minimal project spending and limited transformational pain — a pitch that is generating significant corporate interest. Some forward-looking organizations are using bots and other rapid process- and data-automation tool sets to free up budget and resources to kick off large-scale reengineering programs. Others are simply using the tools to buy themselves some breathing room while they figure out where to go next with their core platforms.
Given the push toward digital agility and away from legacy systems, it’s not surprising that organizations are executing pilots with bots across their operations. But there are five major risks to consider when designing a bot strategy.
1. If bot deployment is not standardized, bots could become another legacy albatross. The way business organizations are adopting bots brings to mind an earlier wave of ad hoc application building: the workarounds created to address the Y2K software bug at the end of the 20th century. To deal with the time-clock change at the turn of the century, many organizations circumvented legacy limitations. Business users embraced the increasing power of Microsoft Excel and Access to create complex, business-critical applications on their desktops. But as those custom-made computing tools proliferated, so did problems stemming from the lack of a strong controls framework, QA, release-management processes, and other formalized IT practices. Companies then had to spend large sums of money tracking down all their wayward tools and slowly eliminating them from critical functions.
Today’s explosion of bots threatens to repeat this pattern.