AI Use Cases for Data-driven Reinsurers

Across the insurance landscape, one segment of the industry stands out for embracing new technologies ahead of the rest. For an industry that notoriously lags behind its banking and financial peers, reinsurance has conventionally shown a greater proclivity for future-proofing itself. In fact, reinsurers were among the first to adopt catastrophe-modelling (cat-modelling) techniques in the early ’90s to predict and assess risk. This makes perfect sense too: ‘insurance for insurers’, or reinsurance, is risk evaluation of the highest grade, which means there are hundreds of billions of dollars more at stake.

Front-line insurers typically transfer part of their risk portfolio to reduce the likelihood of paying enormous claims in the event of unforeseen catastrophe losses. In most regions of the world, wind and water damage from thunderstorms, torrential rains, and snowmelt caused the highest losses in 2019.

In the first half of 2019 alone, global economic losses from natural catastrophes and man-made disasters totalled $44 billion, according to Swiss Re Institute’s sigma estimates. Reinsurers covered $25 billion of that total. Without reinsurance absorbing most of that risk and spreading it out, many insurance companies would have had to fold. This is how reinsurance protects front-line insurers from unforeseen events in the first place.

Yet protection gaps persist, especially in emerging economies. Only about 42 per cent of global economic losses were insured, as several large-scale disaster events, such as Cyclone Idai in southern Africa and Cyclone Fani in India, occurred in areas with low insurance penetration.

Reinsurance can be an arduous and unpredictable business. To cope with a prolonged soft market, declining market capital, and shaky investor confidence, reinsurers have to devise new models that boost profitability and add value for their clients.

For reinsurers, this is where Artificial Intelligence and its family of data-driven technologies are bringing back the edge.


Source: PwC – AI in Insurance Report

AI Use Cases for Reinsurers 

Advanced Catastrophe Risk Modelling

Catastrophe models built on machine learning, trained on real claims data together with ethnographic and technographic parameters, can decisively improve the reliability of risk assessments. These models are useful tools for forecasting losses and can accurately estimate exposure for clients facing a wide range of natural and man-made risks.
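As a toy illustration of the simulation side of cat-modelling, the sketch below draws annual event counts from a Poisson distribution and per-event losses from a lognormal distribution, then estimates a 1-in-100-year aggregate loss. Every parameter here is invented for illustration; a production model would calibrate frequency and severity from claims and exposure data.

```typescript
// Toy catastrophe-loss simulator: Poisson event frequency, lognormal severity.
// All parameters below are illustrative, not calibrated to real data.

function samplePoisson(lambda: number): number {
  // Knuth's algorithm, adequate for small lambda.
  const limit = Math.exp(-lambda);
  let k = 0;
  let p = 1;
  do {
    k += 1;
    p *= Math.random();
  } while (p > limit);
  return k - 1;
}

function sampleLognormal(mu: number, sigma: number): number {
  // Box-Muller transform for a standard normal, then exponentiate.
  const u1 = Math.random();
  const u2 = Math.random();
  const z = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
  return Math.exp(mu + sigma * z);
}

function simulateAnnualLosses(years: number): number[] {
  const lambda = 2.1; // assumed mean number of cat events per year
  const mu = 15.0;    // assumed lognormal location (log-dollars)
  const sigma = 1.2;  // assumed lognormal scale
  const losses: number[] = [];
  for (let y = 0; y < years; y++) {
    const events = samplePoisson(lambda);
    let total = 0;
    for (let e = 0; e < events; e++) total += sampleLognormal(mu, sigma);
    losses.push(total);
  }
  return losses;
}

// Estimate the 1-in-100-year loss (99th percentile of annual losses).
const losses = simulateAnnualLosses(100_000).sort((a, b) => a - b);
const var99 = losses[Math.floor(0.99 * losses.length)];
console.log(`Estimated 100-year aggregate loss: $${(var99 / 1e6).toFixed(1)}M`);
```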

Mining data for behavioural risks can also inform reinsurers when adjusting and arranging their reinsurance contracts. For example, the Tianjin Port explosions of 2015 resulted in losses largely due to risk accumulation, more specifically the accumulation of cargo at the port. Accumulation risks like these can be mitigated by using sensors to tag and monitor assets in real time.

RPA-based Automation for Reducing Operational Risks

RPA coupled with smart data-extraction tools can handle a high volume of repetitive human tasks, even ones that require some problem-solving aptitude. This is especially useful when dealing manually with data stored in disparate formats. Large reinsurers can streamline critical operations and free up employee capacity. Automation can reduce turnaround times for price/quote setting in reinsurance contracts. Other extended benefits of process automation include creating a single view for documentation and tracking, faster reconciliation and account settlement, a simpler bordereau and recovery-management process, and the technical accounting of premiums and claims.
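To make one of those benefits concrete, the sketch below reconciles ceded-premium records from a cedent's bordereau against the reinsurer's booked entries and flags mismatches for review. The record shape, field names, and tolerance are assumptions for illustration, not any particular system's format.

```typescript
// Toy bordereau reconciliation: match ceded-premium records against booked
// entries by policy ID and flag amount mismatches.

interface PremiumRecord {
  policyId: string;
  premium: number; // ceded premium in dollars
}

function reconcile(
  bordereau: PremiumRecord[],
  booked: PremiumRecord[],
  tolerance = 0.01,
): { matched: string[]; mismatched: string[]; missing: string[] } {
  const bookedById = new Map(booked.map((r) => [r.policyId, r.premium]));
  const matched: string[] = [];
  const mismatched: string[] = [];
  const missing: string[] = [];

  for (const record of bordereau) {
    const bookedPremium = bookedById.get(record.policyId);
    if (bookedPremium === undefined) {
      missing.push(record.policyId); // never booked: needs manual review
    } else if (Math.abs(bookedPremium - record.premium) > tolerance) {
      mismatched.push(record.policyId); // amounts disagree beyond tolerance
    } else {
      matched.push(record.policyId);
    }
  }
  return { matched, mismatched, missing };
}

// Example run with two tiny record sets.
const result = reconcile(
  [{ policyId: 'P-001', premium: 1200 }, { policyId: 'P-002', premium: 850 }],
  [{ policyId: 'P-001', premium: 1200 }, { policyId: 'P-002', premium: 870 }],
);
console.log(result); // { matched: ['P-001'], mismatched: ['P-002'], missing: [] }
```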

Take customised reinsurance contracts, for instance, which are typically put together manually. Although these contracts provide better financial risk control, the manual administration and the complex nature of such contracts make the process prone to errors. By creating a system that connects to all data sources via a single repository (a data lake), the entire process can be automated and streamlined to reduce human error.

Risk Identification & Evaluation of Emerging Risks

Adapting to the risk landscape and identifying new potential risks is central to the functioning of reinsurance firms. For example, if reinsurers are unwilling to cover disaster-related risks, front-line insurers can no longer offer such products to their customers, because they lack sufficient protection to sell them.

According to a recent research paper, a reinsurance contract is more valuable when the catastrophe is more severe and the reinsurer's default risk is lower. Predictive modelling with more granular data can help actuaries build products for dynamic business needs, market risks, and concentrations. By projecting potential future costs, losses, profits, and claims, reinsurers can dynamically adjust their quoted premiums.
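One simple way to express that intuition is to value the contract as the expected ceded loss, discounted by the probability that the reinsurer actually pays. The sketch below is an illustrative form, not the cited paper's exact model: a more severe catastrophe pushes more loss into the layer and raises V, while a higher default probability q lowers it; P is the loaded premium a reinsurer might quote.

```latex
% Illustrative value of an excess-of-loss layer with reinsurer default risk:
%   L = gross catastrophe loss,  d = attachment point,  m = cover limit,
%   q = reinsurer default probability,  \theta = risk loading.
V = (1 - q)\,\mathbb{E}\big[\min\big(\max(L - d,\,0),\,m\big)\big],
\qquad
P = (1 + \theta)\,V
```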

Portfolio Optimization


During each renewal cycle, underwriters and top executives have to figure out how to improve the performance of their portfolios. To do this, they need to assess, in near real time, the impact of making changes to those portfolios. Because the number of possible new portfolio combinations runs into the hundreds of millions, the task is beyond the reach of pure manual effort.


To tackle a problem of this size, machine learning can shorten decision-making time by sampling selective combinations and by running multi-objective, multi-constraint optimization models instead of simpler linear optimization methods. Portfolio optimization fuelled by advanced data-driven models can reveal hidden value to an underwriting team. Such models can also predict with great accuracy how portfolios will perform in the face of micro or macro changes.

Repeated, iterative sampling of the possible combinations narrows an extremely large pool of portfolio options down to a small set of best solutions, from which the portfolio that maximizes profit and minimizes risk liability is chosen.
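A minimal sketch of the sampling idea: randomly toggle which treaties stay in the portfolio, score each candidate on two objectives (expected profit and tail loss), and keep the non-dominated, or Pareto-optimal, set. The treaty figures and the additive tail-loss scoring are invented simplifications for illustration.

```typescript
// Toy multi-objective portfolio search: sample treaty subsets, score each on
// expected profit and tail loss, and keep the Pareto-optimal candidates.

interface Treaty {
  id: string;
  expectedProfit: number; // mean underwriting profit, $M
  tailLoss: number;       // contribution to 1-in-100 loss, $M
}

interface Candidate {
  selection: boolean[];
  profit: number;
  tail: number;
}

function score(treaties: Treaty[], selection: boolean[]): Candidate {
  let profit = 0;
  let tail = 0;
  treaties.forEach((t, i) => {
    if (selection[i]) {
      profit += t.expectedProfit;
      tail += t.tailLoss; // simplification: real tail risk is not additive
    }
  });
  return { selection, profit, tail };
}

function paretoFront(candidates: Candidate[]): Candidate[] {
  // Keep candidates not dominated on (higher profit, lower tail loss).
  return candidates.filter((a) =>
    !candidates.some(
      (b) =>
        b.profit >= a.profit && b.tail <= a.tail &&
        (b.profit > a.profit || b.tail < a.tail),
    ),
  );
}

const treaties: Treaty[] = [
  { id: 'T1', expectedProfit: 4.0, tailLoss: 30 },
  { id: 'T2', expectedProfit: 2.5, tailLoss: 8 },
  { id: 'T3', expectedProfit: 6.0, tailLoss: 55 },
  { id: 'T4', expectedProfit: 1.2, tailLoss: 3 },
];

// Sample random subsets instead of enumerating all 2^n combinations.
const samples: Candidate[] = [];
for (let s = 0; s < 1000; s++) {
  const selection = treaties.map(() => Math.random() < 0.5);
  samples.push(score(treaties, selection));
}
console.log(paretoFront(samples).length, 'non-dominated portfolios found');
```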

Reinsurance Outlook in India 

The Indian non-life market, which is more reinsurance-intensive than life, is around $17.7B in size, of which nearly $4B is paid out as reinsurance premium. Insurance products in India are mainly modelled around earthquakes and terrorism, with very few products covering floods. Mass retail sectors such as auto, health, and small/medium property businesses are the least reinsurance-dependent. As the industry continues to expand in the subcontinent, an AI-backed, data-driven approach will prove to be decisive leverage for reinsurers hunting for new opportunities beyond 2020.

Also read – Why InsurTech beyond 2020 will be different


Implementing a Clean Architecture with Nest.JS

4-minute read

This article is for enthusiasts who strive to write clean, scalable, and, more importantly, refactorable code. It will give you an idea of how Nest.JS can help us write clean code and what underlying architecture it uses.

Implementing a clean architecture with Nest.JS requires us to first understand what this framework is and how it works.

What is Nest.JS?

Nest, or Nest.JS, is a framework for building efficient, scalable, server-side Node.js applications, built with TypeScript. It uses Express or Fastify under the hood and adds a level of abstraction that lets developers use an ample number of third-party modules within their code.
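As a quick taste, here is a minimal, hypothetical Nest.JS application: one controller, one module, and a bootstrap file. The names AppController and AppModule follow Nest convention but are illustrative, not mandated.

```typescript
// app.controller.ts -- a minimal controller with one GET route
import { Controller, Get } from '@nestjs/common';

@Controller()
export class AppController {
  @Get()
  getHello(): string {
    return 'Hello from Nest!';
  }
}

// app.module.ts -- the root module wires the controller in
import { Module } from '@nestjs/common';

@Module({ controllers: [AppController] })
export class AppModule {}

// main.ts -- bootstrap the HTTP server
import { NestFactory } from '@nestjs/core';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);
  await app.listen(3000);
}
bootstrap();
```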

Let’s dig deeper into what this clean architecture is all about.

Well, you all might have used, or at least heard of, MVC architecture. MVC stands for Model, View, Controller. The idea behind it is to separate our project structure into three different sections.

1. Model: It contains the object files that map to relations/documents in the DB.

2. Controller: It is the request handler, responsible for implementing the business logic and all the data manipulation.

3. View: This part contains the files concerned with displaying the data, either HTML files or templating-engine files.

To create a model, we need some kind of ORM/ODM tool, module, or library to build it with. For instance, suppose you directly use a module, say ‘sequelize’, and then use the same module to implement logic in your controller, making your core business logic dependent upon ‘sequelize’. Now, down the line, say ten years later, a better tool arrives on the market and you want to switch to it. But as soon as you replace sequelize, you will have to change a lot of code to keep things from breaking, and then re-test every feature to confirm nothing has regressed, which wastes valuable time and resources. To overcome this challenge, we can apply the last principle of SOLID, the Dependency Inversion Principle, along with a technique called dependency injection, to avoid such a mess.

Still confused? Let me explain in detail.

So, what the Dependency Inversion Principle says, in simple words, is: create your core business logic first, then build the dependencies around it. In other words, free your core logic and business rules from any kind of dependency, and modify the outer layers so that they depend on your core logic instead of your logic depending on them. That is what clean architecture is: it takes the dependencies out of your core business logic and builds the system around it so that everything else depends on the core, rather than the core depending on everything else.
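A hedged sketch of what this looks like in Nest.JS: the use case depends only on an abstract UserRepository token, and the concrete Sequelize-backed class is bound to that token in one place, the module. The names UserRepository, SequelizeUserRepository, and GetUserUseCase are illustrative; only the Nest decorators and custom-provider syntax are the framework's real API.

```typescript
import { Inject, Injectable, Module } from '@nestjs/common';

// The core logic depends on this abstraction, not on any ORM.
interface UserRepository {
  findByEmail(email: string): Promise<{ id: string; email: string } | null>;
}

// One concrete implementation; swapping ORMs means writing another class
// like this one and changing a single binding below.
@Injectable()
class SequelizeUserRepository implements UserRepository {
  async findByEmail(email: string) {
    // ... the actual sequelize query would go here ...
    return { id: '1', email };
  }
}

@Injectable()
class GetUserUseCase {
  constructor(
    @Inject('UserRepository') private readonly users: UserRepository,
  ) {}

  execute(email: string) {
    return this.users.findByEmail(email);
  }
}

@Module({
  providers: [
    GetUserUseCase,
    // The only place that knows which concrete repository is in use.
    { provide: 'UserRepository', useClass: SequelizeUserRepository },
  ],
})
class UserModule {}
```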

Let’s try to understand this with the diagram below.

Source: Clean Architecture Cone 

You can see that we have divided our architecture into 4 layers:

1. Entities: At the core, entities are the models (enterprise rules) that define what the application is about. This layer will hardly change over time and is usually abstract, not accessed directly. For example, every application has a ‘user’. Which fields the user stores, their types, and their relations with other entities together comprise an entity.

2. Use cases: This layer tells us how to implement the enterprise rules. Take the user again: now that we know what data to operate upon, the use case tells us how to operate on it. The user will have a password that needs to be encrypted, the user needs to be created, the password can be changed at any given point in time, and so on.

3. Controllers/Gateways: These are the channels that help us implement the use cases using external tools and libraries, by means of dependency injection.

4. External tools: All the tools and libraries we use to build our logic come under this layer, e.g. the ORM, emailer, encryption, etc.

The tools we use depend upon how we channel them to the use cases, and the use cases in turn depend upon the entities, which are the core of our business. This way we have inverted the dependency from outwards to inwards; that is what the Dependency Inversion Principle of SOLID implies.
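Sticking with the user example, a minimal sketch of the two inner layers might look like this. The field set and the PasswordHasher abstraction are assumptions for illustration; note that neither layer imports a framework or an ORM.

```typescript
// Layer 1: the entity -- pure data and enterprise rules, no framework imports.
interface User {
  id: string;
  email: string;
  passwordHash: string;
}

// The use case depends on an abstraction for hashing, not on bcrypt itself.
interface PasswordHasher {
  hash(plain: string): Promise<string>;
}

// Layer 2: the use case -- how the application operates on the entity.
class ChangePasswordUseCase {
  constructor(private readonly hasher: PasswordHasher) {}

  async execute(user: User, newPassword: string): Promise<User> {
    if (newPassword.length < 8) {
      throw new Error('Password must be at least 8 characters');
    }
    return { ...user, passwordHash: await this.hasher.hash(newPassword) };
  }
}
```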

Okay, by now you have got the gist of Nest.JS and understood how clean architecture works. Now the question arises: how are these two related?

Let’s look at the three building blocks of Nest.JS and what each of them does.

  1. Modules: Nest.JS is structured in such a way that we can treat each feature as a module. For example, anything linked to the user, such as models, controllers, DTOs, interfaces, etc., can be separated as a module. A module has a controller and a bunch of providers, which are injectable functionalities like services, ORMs, emailers, etc.
  2. Controllers: Controllers in Nest.JS are the interfaces between the network and your logic. They handle requests and return responses to the client side of the application (for example, a call to the API).
  3. Providers (Services): Providers are injectable services/functionalities which we can inject into controllers and other providers to add flexibility and extra functionality. They abstract away any form of complexity and logic.

To summarize,

  • We have controllers that act as interfaces (3rd layer of clean architecture)
  • We have providers which can be injected to provide functionality (4th layer of clean architecture: DB, Devices, etc.)
  • We can also create services and repositories to define our use case (2nd Layer)
  • We can define our entities using DB providers (1st Layer)

Conclusion:

Nest.JS is a powerful Node.js framework and one of the best-known TypeScript server-side frameworks available today. Now that you have the lowdown on this framework, you must be wondering whether we can use it to build a project structure with a clean architecture. Well, the answer is: yes! Absolutely. How? I’ll explain in the next part of this series.

Till then, stay tuned!

About the Author:

Junaid Bhat is currently working as a Tech Lead at Mantra Labs. He is a tech enthusiast striving to become a better engineer every day by following industry standards, with an inclination towards a more structured approach to problem-solving.


Read our latest blog: Golang-Beego Framework and its Applications
