9th January 2022

Data quality is key to future-proofing the Specialty insurance market: DQPro co-founder Nick Mair comments
Opinion

Continued rapid growth in Specialty data-sets places the emphasis on quality and standards as the market enters 2022, says DQPro co-founder Nick Mair:
“The InsurTech hype wave that swept through the industry in 2016-2019 demonstrated the potential for new data-driven technologies, such as algorithm-driven underwriting and external sensor data (IoT), to better quantify and price insurance risk.
But after several years of discussions and exploring the available technology, many insurers have realised they need to walk before they run. More and more insurers are recognising that the hype surrounding the promise of new technologies often runs ahead of the reality of what is achievable, and how critical it is to trust the data powering these complex technologies, particularly given that AI and other forms of machine learning require large, high-quality data-sets to produce reliable outputs.
At the same time, remote working and the rapid adaptation of operations to the unprecedented market environment of the last two years have focused attention on the processes for accessing and assessing the data an insurer uses in its everyday operations, from underwriting controls to premium income tracking, claims and risk management.
At this pivotal point as we head into 2022, the digital trends around automation, data-driven underwriting and embracing complex technologies such as artificial intelligence are further highlighting the critical importance of core data quality across the market. That means getting a good foundation of daily data hygiene in place before carriers start searching for a game-changing underwriting algorithm. Getting back to these data quality fundamentals involves addressing the most common data challenges and questions that specialty insurers face on a daily basis.
For many insurers, the reality is that the way they do business is only partly digitised, so the process is often high-friction and inefficient.
Multiple intermediaries between the customer and the insurer, a lack of standards for transmitting data, a more complex data-set in specialty lines than in personal lines, and manual data entry into older systems all create a ripe breeding ground for erroneous or incomplete data that impacts carrier operations.
There are also challenges around uncertain ownership of, and accountability for, data issues. This starts with recognising that bad data is a business problem, not an IT or data team issue. And it means instilling accountability for data ingestion with business-side users like underwriting operations, and integrating this into their workflow in a way that helps them add real value.
Additionally, specialty insurers often face challenges in responding quickly to changing regulations and their associated demands on data. For example, in 2021 Lloyd’s required all insurers to evidence, as part of its Minimum Standards, that they were capturing a minimum data-set for all cyber risks by 31st December.
Updating or manually checking policy systems to ensure this data is captured would be cumbersome and expensive, so a more flexible monitoring solution is required: one that can be implemented quickly to check the data captured and notify when it falls short of the required standard.
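As a rough illustration of this kind of check, consider the minimal Python sketch below. The field names, policy references and records are hypothetical and do not reproduce the actual Lloyd’s cyber minimum data-set; the point is that a rule-based monitor can validate each risk record against a required field list and flag any shortfall to the operations team:

```python
# Minimal sketch of a rule-based data capture check.
# REQUIRED_CYBER_FIELDS is illustrative only; it does not
# reproduce the actual Lloyd's cyber minimum data-set.
REQUIRED_CYBER_FIELDS = [
    "insured_name", "inception_date", "cyber_exposure_limit", "territory",
]

def missing_fields(record: dict) -> list:
    """Return the required fields that are absent or empty on a record."""
    return [f for f in REQUIRED_CYBER_FIELDS if record.get(f) in (None, "")]

def monitor(records: list) -> None:
    """Flag every record that falls short of the required standard."""
    for record in records:
        missing = missing_fields(record)
        if missing:
            # In production this would raise an alert or a work item
            # for underwriting operations rather than print.
            print(f"{record.get('policy_ref', '?')} missing: {', '.join(missing)}")

monitor([
    {"policy_ref": "CYB-001", "insured_name": "Acme Ltd",
     "inception_date": "2021-10-01", "cyber_exposure_limit": 5_000_000,
     "territory": "UK"},
    # Second record never captured a limit or territory.
    {"policy_ref": "CYB-002", "insured_name": "Globex",
     "inception_date": "2021-11-15"},
])
```

Because the rules sit alongside the policy systems rather than inside them, a check like this can be stood up, and changed, as quickly as the regulation that drives it.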
Part of solving these problems involves the creation of market-wide data standards, which provide a common language for the transmission and receipt of data in a way that supports a digital placement process and the nirvana of straight-through processing.
ACORD is recognised as the global custodian of data standards for specialty insurance, but full standards for London Market business (currently an objective of Lloyd’s Blueprint II) have yet to be finalised.
But these standards operate at a market level and mostly relate to the form and movement of data. At a carrier level, we also need an internal standard for data quality. By also having standards that apply to their most important “operational” data, insurers are establishing a strong foundation for other, more transformative data initiatives.
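To make the idea concrete, such an internal standard could take a declarative form, as in the sketch below. The fields, checks and owner names are hypothetical; each rule pairs a core operational field with a validation and the business-side owner who is accountable when it fails:

```python
from datetime import date

def positive_number(v):
    return isinstance(v, (int, float)) and v > 0

def valid_iso_date(v):
    try:
        date.fromisoformat(v)
        return True
    except (TypeError, ValueError):
        return False

# Hypothetical internal standard: field, validation, accountable owner.
# Names are illustrative, not an actual carrier's rule set.
OPERATIONAL_STANDARD = [
    ("gross_premium", positive_number, "underwriting_ops"),
    ("inception_date", valid_iso_date, "underwriting_ops"),
    ("broker_code", lambda v: isinstance(v, str) and v.strip() != "", "operations"),
]

def audit(record: dict):
    """Yield (field, owner) for each rule the record fails."""
    for field, check, owner in OPERATIONAL_STANDARD:
        if not check(record.get(field)):
            yield field, owner

print(list(audit({"gross_premium": -100,
                  "inception_date": "2022-01-09",
                  "broker_code": "BRK123"})))
# -> [('gross_premium', 'underwriting_ops')]
```

Keeping the rules as data rather than burying them in policy systems is what makes such a standard auditable, ownable by the business and quick to change.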
The global specialty insurance market continues to evolve digitally, and it is more important than ever for providers to be able to trust and assess their operational data across the multiple systems and market platforms that support the underwriting lifecycle.
The market in 2022 looks set to continue its digital journey, but with a clearer focus on the quality of the data sources needed to get results from the complex technologies it is exploring.”
