Other parts of this series:
- Poor data management threatens to hurt insurers in the digital economy
- As data soars in value, it’s time for insurance execs to step in and take the reins
- Data lakes and APIs provide insurers with the keys to unlock their data logjams
- These five steps will help insurers get their data governance on track
Insurers face huge challenges in their quest to improve their data management. Two key technologies – data lakes and APIs – can help them triumph.
Insurers and other financial services firms are investing heavily in data technologies in an effort to better manage their fast-expanding information resources. They spent US$6.4 billion on data-related programs last year, according to research company Technavio. And this expenditure is expected to climb by around 26 percent a year until 2019.
Data has become one of the most valuable commodities in business. Without quick and easy access to relevant, accurate data, companies will struggle to survive in the emerging digital economy. Data management, as I pointed out in my previous blog, is no longer just the concern of the IT department. Its strategic importance has elevated it onto the agenda of business leadership.
Insurers face two mammoth tasks in their quest to improve their data management. Firstly, they need to accommodate the huge, and ever-increasing, volumes of data their businesses are generating. Secondly, they have to extract critical business intelligence from vast swathes of data that are often stored in multiple locations and in many different formats.
Two important technologies are providing insurers with the tools they need to overcome these challenges. They are:
Data lakes: These scalable data repositories enable insurers to reduce costs and increase business agility. Whereas conventional data warehouses store data in specific structured formats, often creating silos of different data types and sources, data lakes can hold huge amounts of data in many different formats, both structured and unstructured. The illustration below shows how a data lake can enhance productivity and efficiency by holding data from multiple sources. Advanced big-data analytics systems enable users to quickly extract and format data pulled from the lake when it's needed. This is much faster and less costly than extracting data from conventional warehouses. Overheads can be further reduced by hosting data lakes on low-cost servers.
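The key design difference is "schema-on-read": a data lake keeps records in their original formats and applies structure only at query time. Here is a minimal sketch of that idea in Python. The lake contents, field names, and `read_policies` helper are all hypothetical, invented purely for illustration:

```python
import csv
import io
import json

# A toy "data lake": raw records kept in their original formats (here, JSON
# and CSV). No schema is enforced when data lands in the lake.
lake = [
    {"format": "json", "raw": '{"policy_id": "P-100", "premium": 1200}'},
    {"format": "csv",  "raw": "policy_id,premium\nP-101,950"},
]

def read_policies(lake):
    """Apply a common schema at read time, normalizing heterogeneous records."""
    policies = []
    for obj in lake:
        if obj["format"] == "json":
            policies.append(json.loads(obj["raw"]))
        elif obj["format"] == "csv":
            for row in csv.DictReader(io.StringIO(obj["raw"])):
                row["premium"] = int(row["premium"])
                policies.append(row)
    return policies

print(read_policies(lake))
```

In a warehouse, both records would have to be transformed into one table layout before loading; here the transformation is deferred until a consumer actually asks for the data, which is what lets a lake absorb many sources cheaply.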
Application programming interfaces (APIs): The more insurers can source critical data from multiple sources, and distribute it across a variety of applications, the more valuable that data becomes. APIs allow insurers to open their IT applications so they can source and distribute data throughout their organizations, as well as share it with third parties.
Private APIs expose the data and functionality of legacy systems and hybrid platforms to enable developers to create enterprise-wide ecosystems that accelerate the retrieval and application of business-critical data. Partner APIs open these ecosystems to allow data to flow, under tightly controlled conditions, between business allies. Public APIs, backed by robust platforms, architectures and governance models, are designed to attract support from a broad range of digital business partners and developers. The ability of APIs to closely link the data resources within and between organizations has fueled the rapid growth of “platform businesses” that share information across digital ecosystems.
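The private/partner/public distinction comes down to how much of the underlying data each audience is allowed to see. A minimal sketch of that tiering, with entirely hypothetical field names and tier rules chosen for illustration:

```python
# A sample policy record; the sensitive field is a dummy value.
POLICY = {"policy_id": "P-100", "premium": 1200, "customer_ssn": "xxx-xx-1234"}

# Each API tier exposes a progressively narrower view of the same record.
TIER_FIELDS = {
    "private": {"policy_id", "premium", "customer_ssn"},  # internal developers
    "partner": {"policy_id", "premium"},                  # vetted business allies
    "public":  {"policy_id"},                             # external ecosystem
}

def get_policy(tier):
    """Return only the fields the caller's API tier is permitted to see."""
    allowed = TIER_FIELDS[tier]
    return {k: v for k, v in POLICY.items() if k in allowed}

print(get_policy("partner"))  # premium visible, customer_ssn withheld
```

In a real deployment the tier would be derived from authentication (API keys, OAuth scopes) rather than a string argument, and the filtering would sit in an API gateway, but the governance principle is the same: one data resource, several tightly controlled views of it.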
In my next blog post, I’ll discuss why insurers need to not only improve their data management but also embrace more comprehensive data governance.
Until then, take a look at these links. I think you’ll find them worthwhile.