There’s much chatter in insurance innovation circles about robotics. The very word “robotics” conjures up images of R2D2 or the foes of Doctor Who, but the reality is far more mundane. It’s usually just software on PCs doing the boring stuff.

While some speculate about what happens in insurance when these “bots” get so smart they make better decisions than human brains, others are getting on and implementing basic robotics solutions now. These implementations sit at the simpler end of the spectrum: getting machines to perform routine tasks that the industry has traditionally done manually, often on an outsourced basis. This is referred to as Robotic Process Automation, or RPA.

In essence, RPA is desktop software that emulates the human execution of repetitive processes. It acts as a digital or virtual worker that can either sit alongside an existing workforce (Assisted Automation), streamlining their tasks and increasing their efficiency, or replace the existing workforce (Autonomous Automation), because it can be left unattended to perform specific tasks.

The inescapable truth is that while most of the insurance community are curious about technology innovation, they really don’t think that any of the buzz topics like blockchain, AI, IoT etc. apply to them. But the business case for RPA is different. It’s not about improved data analytics or customer engagement; it is about efficiency and cost savings. Some reports suggest it can save up to 70% on manual data entry tasks. The main reasons are that robots can work 24/7 and, once fully trained, their accuracy rate is much better than the manual equivalent.

So, what is RPA being used for in insurance? The tasks where it is most often seen at the moment are:

 - Application migration – migrating data as part of a system upgrade.

 - Data entry – simulating user rekeying of data from paper or images into systems.

 - Information validation and auditing – referencing data sources to inspect and validate the information and provide compliant outputs.

 - Legacy enablement – interfacing disparate legacy systems, performing data translation and rekeying where a full API or coded integration is not possible.
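The data entry and validation tasks above boil down to rekeying records into a target structure and checking them against rules before they reach a core system, with failures routed to a human queue. A minimal sketch of that pattern in Python — the field names, policy-number format, and validation rules here are invented for illustration, not taken from any real RPA product:

```python
import csv
import io
import re

# Hypothetical policy-number format used for this example only
POLICY_RE = re.compile(r"^POL-\d{6}$")

def rekey_and_validate(csv_text):
    """Rekey rows from a raw CSV extract; return (accepted, exceptions)."""
    accepted, exceptions = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # "Rekeying": normalise the raw fields into the target structure
        record = {
            "policy_id": row["policy_id"].strip().upper(),
            "premium": row["premium"].strip(),
        }
        # Validation: malformed rows go to an exception queue for a human
        if not POLICY_RE.match(record["policy_id"]):
            exceptions.append((record, "bad policy id"))
        elif not record["premium"].replace(".", "", 1).isdigit():
            exceptions.append((record, "non-numeric premium"))
        else:
            record["premium"] = float(record["premium"])
            accepted.append(record)
    return accepted, exceptions

raw = "policy_id,premium\npol-123456,250.00\nPOL-99,abc\n"
ok, bad = rekey_and_validate(raw)
```

Real RPA tools drive existing application screens rather than parsing files directly, but the accept-or-escalate loop is the same shape: anything the robot cannot validate confidently is handed back to a person.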

So, what has the experience of the early adopters been like?

Well, it is not just a case of downloading the software and off you go. Implementation is quite difficult, and considerable upfront investment of time is required to define inbound data formats and to iterate so that the robots learn and accuracy improves. There are also complexities around the configuration of the robots: how they prioritise, their hierarchy, how they spread the work, and so on.

Accordingly, a separate industry has grown up around implementation, with the big consulting houses to the fore again. This has upfront cost implications, meaning the cost benefits are rarely achieved within a year, but they are of such magnitude in later years that it is usually worth persisting. Some complain that the levels of accuracy achieved are less than desired, but they are rarely lower than what was being achieved manually.

Improvements in the self-service features of RPA software are starting to bring down implementation costs, so that code changes are no longer required to refine an implementation. AI is also increasingly built in, so the software’s ability to teach itself is improving too. It’s probably worth checking on the maturity of these two features when reviewing potential RPA vendors.

As you would expect, many of the early adopters are themselves BPO providers, so in several cases tasks that you think are being performed manually are increasingly done by RPA. There is a useful white paper reviewing Xchanging’s adoption of RPA with Blue Prism (now over a year ago) that contains some myth-busting insights.

So, when you next read an article about robotics and dismiss it as science fiction, bear in mind that some of your competitors are using it, and probably your BPO provider too!