20.10.2023 | Blog Featured Insights

Our core values will remain untouched – while the journey continues

We might wonder sometimes what life is all about – or at least I do every now and then. Some say it's meaningless, while others seize the moment and maximize it in the best possible way.

My name is Fadi Hannah, I am 41 years old, and I work as a product marketer at Evitec. I am a father of two young boys, who keep me busy. Most parents of young children will recognise themselves without me needing to explain anything further, just nodding with a smile – that is what I do, at least.

Let us rewind 50 years to give some context. My parents were born in Syria and moved to Lebanon as teenagers to work in my grandfather's small meat shop. When the Lebanese civil war started, my parents escaped it to protect us children and give us a second chance.
During my lifetime I have been shaped by my environment – but some things within me have not changed: the core values I was raised with and taught by my parents. One of those is to always be real and truthful – both to myself and towards others – since that makes me stand firm, no matter how strong the wind blows.

I started at Evitec in August 2023, and this is the fourth company I have worked at during my 15 years of professional life. I have been at a very large company much older than my grandparents – who lived to 104 before passing away – but also at a company the same age as my boy, who is four years old, as well as at companies in between (not as old as I am, but close enough).

What is it with Evitec as a company that keeps me smiling?

We work eight hours per day, we sleep on average eight hours per day, and we commute on average one hour per day – leaving roughly seven hours, less than a working day, for our spare time. Of course it is important to have a passion for your work, to enjoy the company of your colleagues, to have fruitful discussions, or whatever else you feel is important to you.

I feel incredibly lucky: I have a great coaching manager, great colleagues (both on-site and remote), and interesting, challenging work tasks – my three magic wishes are fulfilled!

Evitec is a Nordic expert in software solutions and consultation services for the finance sector, with over 30 years of history. I am grateful to be part of a company that creates social change while transforming the financial sector, offering a wide range of customized services and software solutions and partnering with professionals in banking, life and pension insurance, and asset management.

The importance of being innovative

Since day one, Evitec's core values have been to be customer-centric, to be innovative, and to build long-lasting partnerships. This was also highlighted in the outcome of our customer experience survey carried out earlier this year, where these qualities ranked among our top three advantages; some of our customers have been part of our journey since day one. This shows a solid partnership, with a strong unity growing together.

We create customer value by solving our customers' pain points through our products and services. This is an ongoing process, since today's problems will look different tomorrow and differ from customer to customer. The business landscape is evolving, and technology is moving at a breakneck pace. But some things will not change: the backbone and values of the company. They are like the DNA (the nature) of a species. It is impossible – at least as of today – to manipulate or reprogram it in an already existing creature. But there are ways to adjust to the environment (the nurture) while keeping the nature intact – adapting the nurture to foster the individual's attributes. The importance of adapting to our environment for survival was also highlighted by Charles Darwin's evolutionary theory and "survival of the fittest."

Sweden was ranked among the top three most innovative countries in the world this year – for the 16th consecutive year – according to the WIPO Global Innovation Index. But what does it mean to be innovative?

I discussed this topic with a few colleagues and consulted ChatGPT to get as many angles as possible, and concluded that it means actively seeking and implementing new ideas, processes, products, or services that create value, solve problems, and differentiate a company from its competitors – which ties back to Evitec's core values.

Innovation is the lifeblood, the DNA, of a thriving company, regardless of its age. Innovation is not a destination – it is a journey that requires commitment, adaptability, and a willingness to embrace change. But some things never change – where our core values will remain untouched – while we continue our journey.

Written by

Fadi Hannah

Product Marketing Manager

15.09.2023 | Blog Featured Insights Technology

Microsoft Fabric: Insights and Considerations from Our Customers

It has been almost four months since Microsoft Fabric entered preview at the end of May. Conversations with our customers have shown a significant level of interest and raised expectations surrounding this offering. Let us explore the reasons behind this interest based on our discussions with customers and address essential considerations for those thinking about adopting it.

Solving Traditional Problems

Interest in Microsoft Fabric primarily arises from its ability to tackle familiar challenges in cloud infrastructure management. Fabric simplifies complex cloud data infrastructures, previously composed of multiple services and platforms, into a unified SaaS service. This consolidation also extends to pricing, offering a more straightforward and clear cost structure. We will go into the pricing model in more detail shortly.

Data Science and Machine Learning Capabilities

Many of our customers are intrigued by Fabric's built-in data science capabilities. Some already have advanced analytics solutions in their current environment and are eager to explore what Fabric could add to their toolkit. Others have focused on traditional business intelligence but are now considering taking their data-driven business further. Microsoft Fabric's Synapse Data Science integration brings these capabilities to the forefront, and customers recognize the value of having these tools as part of the Fabric package.

Navigating Concerns

While the interest in Microsoft Fabric is apparent, it is not without its share of questions and concerns. One of the foremost considerations is the maturity level of the product. Given its brief time in the preview phase and the absence of an official release date, customers understandably question its readiness for production use. Moreover, there are still some rough edges in the product that need to be addressed before the official release. Fortunately, familiar tools like Data Factory, Power BI, and Synapse Tools retain their user experience within Fabric, eliminating the need to learn their usage from scratch.

Pricing and Capacity Management

There are a few licensing options in the Fabric preview, affecting Power BI and sharing capabilities, but at the base is time-based billing (per second, with a minimum of one minute) for provisioned compute – Fabric capacity. Pricing varies by region and capacity size.

Currently, only a pay-as-you-go model is available, but a reserved-capacity pricing model is in the works. Additionally, OneLake storage is billed at a pay-as-you-go rate. Since costs accrue for the time the capacity is running, efficient cost management requires pausing the capacity when it is not needed, either manually or through automation. If Fabric eventually offers built-in auto-pause functionality for capacity, this concern will be alleviated. In the meantime, at Evitec, we have developed an automated solution to pause capacity when not in use.
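To illustrate, here is a minimal sketch of how such automation could pause and resume a Fabric capacity through the Azure Resource Manager REST API. The resource provider path and api-version below are assumptions from the preview era, and the identifiers are placeholders – verify against current Microsoft documentation before use.

```python
# Hypothetical sketch: suspend/resume a Fabric capacity via Azure Resource
# Manager. The resource path and api-version are assumptions (preview era).
import requests
from azure.identity import DefaultAzureCredential  # pip install azure-identity

SUBSCRIPTION_ID = "<subscription-id>"   # placeholders: fill in your own values
RESOURCE_GROUP = "<resource-group>"
CAPACITY_NAME = "<capacity-name>"

def set_capacity_state(action: str) -> None:
    """action is 'suspend' (pause billing) or 'resume'."""
    token = DefaultAzureCredential().get_token(
        "https://management.azure.com/.default"
    ).token
    url = (
        "https://management.azure.com"
        f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
        f"/providers/Microsoft.Fabric/capacities/{CAPACITY_NAME}/{action}"
        "?api-version=2022-07-01-preview"  # assumed preview api-version
    )
    requests.post(url, headers={"Authorization": f"Bearer {token}"}).raise_for_status()

# Run on a schedule (e.g. Azure Automation or cron) outside business hours:
# set_capacity_state("suspend")
```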

Let us get back to the question of having enough capacity – computing power – for your organization. In the simplest scenario, the entire organization can use a single capacity for all their needs in Fabric. When you start using Fabric, it is a good idea to begin with the smallest capacity and increase it as necessary. As your use of Fabric grows and your organization’s needs become more diverse, you might need to scale up the capacity.

Now, it is important to understand another fundamental term in Microsoft Fabric: workspaces. Workspaces function as containers for Fabric items and can be created, for example, based on different business domains – this decision is up to your organization. Each workspace is associated with a capacity, and multiple workspaces can share a single capacity. However, as your usage and use cases become more varied, your organization might want to have multiple capacities of different sizes available for different purposes. In other words, you can have different levels of computing power at your disposal.

Changing the capacity that a workspace uses is a straightforward process. When a specific workspace temporarily or permanently needs more computing power, for example for heavy calculations, you can move it to a larger capacity. As your experience with Fabric grows, your organization's capacities and workspaces can evolve along these lines, as the sketch below illustrates.
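For example, a workspace can be reassigned programmatically with the Power BI REST API's AssignToCapacity call, which Fabric workspaces also use; the IDs and token below are placeholders, and this is a sketch rather than a complete solution.

```python
# Minimal sketch: move a workspace to a different (e.g. larger) capacity
# using the Power BI REST API. IDs and the access token are placeholders.
import requests

WORKSPACE_ID = "<workspace-guid>"
TARGET_CAPACITY_ID = "<capacity-guid>"    # e.g. a larger capacity for heavy loads
ACCESS_TOKEN = "<azure-ad-access-token>"  # token with Power BI API permissions

response = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/AssignToCapacity",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"capacityId": TARGET_CAPACITY_ID},
)
response.raise_for_status()  # success: the workspace now runs on the new capacity
```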

Security and Region Availability

Security is paramount in data solutions, and many companies have strict policies about data storage locations. Currently, the Fabric preview is not available in all regions. For example, as of writing this blog, the closest regions to Finland offering the Fabric preview are West Europe and Norway East. We hope that region availability will expand when Microsoft releases the official version. However, if a company policy dictates a specific region that does not currently support the Fabric preview, this may raise concerns about considering Fabric as an option.

Migration Strategy Depends on Current Data Environments

Customer situations regarding their current data solutions vary greatly. Some rely solely on Excel-based reporting, while others maintain on-premises data warehouses, and some have extensive experience with cloud-based data solutions. These varying starting points influence their needs and questions concerning Fabric.

For those migrating from on-premises environments to the cloud, Microsoft Fabric provides a straightforward option, albeit requiring a clean slate approach. In contrast, businesses with existing cloud environments seek to understand how Fabric complements their current stack and its potential for hybrid solutions. The starting point of each customer significantly influences the migration effort required. Due to Fabric’s one-copy data approach, development should be faster compared to previous cloud migrations.

Conclusion

Microsoft Fabric has garnered significant interest and raised expectations among our customers. Its ability to simplify cloud infrastructure management, offer powerful data science capabilities, and address traditional challenges is compelling. However, it is essential to address concerns around product maturity, pricing, capacity management, security, and region availability before making a decision.

We at Evitec Solutions are excited about the potential that Microsoft Fabric brings to the world of data and analytics and look forward to its continued evolution.

Written by

Henni Niiranen

Data Consultant

03.08.2023 | Analytics Blog Insights

What is Data Mesh?

Microsoft Fabric entered preview at the end of May. A few update releases have been made since, and the discussion around Fabric is active. Let's take a closer look at one of the topical themes: Data Mesh.

If this buzzword is completely new to you, or you have heard the term a couple of times but have not had the time to figure out what it means, here is a short primer on Data Mesh and how it connects to Microsoft Fabric. Enjoy!


Data Mesh in short

The concept of Data Mesh is relatively new: it was introduced in 2019 by Zhamak Dehghani, a pioneer in managed data decentralization. Data Mesh is an enterprise data architecture that, in contrast to traditional “monolithic” and centralized data lakes and warehouses, embraces an intentionally distributed approach to data architecture, meant especially for large and complex organizations dealing with big data.

Dehghani's key message was that the traditional way of implementing data warehouses or lakes as big, centralized structures has not been able to unleash the true value of data, and has created big, complex, and expensive data architectures full of delivery bottlenecks, especially in large organizations with rich domains, several sources, and a diverse set of consumers.

To harness the full potential of data, the Data Mesh approach advocates distributing data ownership and governance to individual domain teams, enabling them to take ownership of their data and work in an agile way. The four core principles of Data Mesh are domain thinking, data product thinking, self-serve data platforms, and federated data governance. Let's dig a bit deeper into these core principles.

Domain Thinking

Domain thinking is a fundamental aspect of Data Mesh. It involves aligning data infrastructure, tools, and processes with specific business domains rather than treating data as a monolithic entity. Each domain team becomes responsible for its data products, including data collection, processing, storage, and analytics.

This approach promotes a deep understanding of domain-specific data requirements, leading to better insights and faster decision-making. In large organizations, a single platform or data integration team becomes a bottleneck and hinders getting business value out of data. Integration work also requires data expertise within the team, which is hard for a small, centralized team to maintain in a large organization with numerous data sources. Domain thinking, by contrast, mirrors the way business domains naturally distribute within organizations. Data domains, and the teams around them, should be long-term.

Data Products

Data Mesh introduces a product-oriented mindset to data management. Each domain team treats its data as an asset and focuses on delivering data products that support the specific needs of their users. This approach encourages teams to think beyond just data pipelines and storage, considering the end-to-end data product lifecycle – data discovery, documentation, accessibility, and continuous improvement – to bring long-term value to the business. The customers of the data products delivered by the domain teams can be other data scientists, data engineers, or business users within the organization. Data products can be, for example, APIs, reports, tables, or datasets. Through data products, data can also be shared between different data domain teams when needed.
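To make this concrete, here is a hypothetical sketch of the metadata a domain team might publish alongside a data product so that it is discoverable and self-describing. Data Mesh does not prescribe any format; every field name below is invented for illustration.

```python
# Hypothetical data product descriptor -- Data Mesh prescribes no format;
# all names and fields here are illustrative only.
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str                  # e.g. "claims.monthly-paid-claims"
    domain: str                # owning business domain
    owner: str                 # accountable team
    output_port: str           # where consumers read it: API URL, table, dataset
    schema: dict[str, str]     # column -> type: the published contract
    update_frequency: str      # freshness promise to consumers
    quality_checks: list[str] = field(default_factory=list)

paid_claims = DataProduct(
    name="claims.monthly-paid-claims",
    domain="claims",
    owner="claims-data-team",
    output_port="lakehouse:/claims/Tables/monthly_paid_claims",
    schema={"claim_id": "string", "paid_date": "date", "amount": "decimal"},
    update_frequency="daily by 06:00",
    quality_checks=["no duplicate claim_id", "amount >= 0"],
)
```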

Self-Serve Data Platform

Data Mesh encourages the creation of self-serve data infrastructure as a platform within the organization for the domain teams. The domain teams have the autonomy to choose and manage the data storage, processing, and analysis tools they need to deliver successful data products, but their job is not to manage technical infrastructure. That job is done by a centralized platform team, responsible for creating a domain-agnostic infrastructure platform that supports the domain teams in creating their data products with low lead time. Automation capabilities are one of the key features of the platform.

Federated Data Governance

Data governance plays a crucial role in Data Mesh, as with the distributed domain approach it is very important to make sure we don't fall back into creating silos, duplicating data, and building a wild west of an enterprise data architecture.

As a recap, data governance is a set of processes, policies, and guidelines that ensure the effective and secure management of an organization's data assets. Instead of relying solely on a centralized data governance model, Data Mesh promotes a federated approach. Federation means that a set of data governance guidelines and standards is defined and managed centrally in the organization, and each data domain team must comply with these rules. However, each domain is free to decide how it complies with the governance rules in practice, taking into account domain-specific requirements.

It is important to make the distinction between dusty data silos and the decentralized data domains in the data mesh. Data silos refer to isolated data storage where data is stored within individual teams or departments in an organization. Each silo typically has its own data formats, definitions, and access controls, making it challenging to share and integrate data across different silos. This results in data duplication, inconsistencies, and limited data accessibility, hindering collaboration and a holistic view of data across the organization.

The key difference between data silos and decentralized data domains lies in their approach to data management and governance. While data silos isolate data within specific teams or departments, leading to fragmentation and limited data sharing, decentralized data domains emphasize a culture of collaboration within the organization: teams follow standardized common practices while keeping the autonomy to define their data products, data schemas, and access controls in the way that best supports their use cases.

Fabric & Data Mesh

OK, so now we know what a Data Mesh is, but how does it relate to Microsoft Fabric? It is important to remember that Data Mesh itself is not a technology, nor is it coupled to any technology provider; it is an architectural paradigm that can be implemented in many ways, and multiple paths can lead to a Data Mesh. Currently, the Fabric preview enables organizing data into domains, thus supporting the domain thinking of Data Mesh. Federated governance capabilities will be enabled in future releases. More broadly, Microsoft now presents a data mesh as one approach in its data architecture guidance and gives implementation instructions both for the technical side and for the concept and change management perspective. Data Mesh is here to stay as one enterprise data architecture.

Conclusion

Even though Fabric may ease the technical requirements of Data Mesh, no tool is going to do the actual groundwork of setting up the working methods, defining and organizing the teams, and generally tuning the mindset of the organization to the data mesh frequency. Changing the way people work is never an easy task. An organization does not necessarily need to start building a Data Mesh from scratch: you may have a solid existing implementation and only change how it is governed, managed, and developed, and by what kind of teams.

It is also important to remember that Data Mesh is not always the best approach, as it requires independently working, autonomous domain teams. The biggest benefits of Data Mesh are achieved in large, complex organizations with a rich data landscape. For smaller organizations, a single centralized team might be a better alternative from a team set-up perspective.

But still, it is not a waste of time to understand the concepts of product thinking, the general technical requirements of a data mesh platform, and the importance of data governance.

Written by

Henni Niiranen

Data Consultant

05.06.2023 | Blog Featured Insights Technology

Painting the future with Microsoft Fabric – data landscape in one frame

The data world is abuzz with excitement as Microsoft launched a public preview of its latest offering, Microsoft Fabric. This all-in-one analytics solution has generated significant hype across the data community, promising to revolutionize and simplify data & analytics infrastructures and bring “data into the era of AI”. What does this all mean in practice? Take a minute and let us tell you what Fabric is all about.

Microsoft Fabric is a Software-as-a-Service (SaaS) solution wrapping all the different components of the data landscape together in one package. With one licence you get everything you need for your data environment: Data Factory, Synapse, Power BI, and OneLake. You no longer need to buy the different resources separately; everything is included in a single service and managed and governed centrally.

OneLake = centralized data storage for all your analytics data

OneLake is one of the two most remarkable features of Fabric, as it aims to mitigate the need for data duplication within the whole solution. Those of you who have worked with data infrastructures probably know that data commonly needs to be duplicated across a solution's layers so that different analytical engines can support the different use cases of the data. In OneLake, data is stored in compressed Parquet format, and all the different analytical engines within Fabric can query the same data efficiently.

To put this in context, both the T-SQL engine for building a data warehouse and the Analysis Services engine for Power BI reports can use the same data equally efficiently. Microsoft promises to extend this “one copy of data” paradigm further by enabling shortcuts for the data, so that different teams can use the same data for their specific purposes by creating virtual data products. In addition, OneLake offers the possibility to extend the lake to third-party data storage, such as Amazon S3, without a need to move the data physically into OneLake. Quite impressive.
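As a small illustration of the one-copy idea, the sketch below reads a Lakehouse table from a Fabric notebook with the Spark engine; the SQL endpoint of the same Lakehouse queries the very same files. The table and column names are invented for the example, and in Fabric notebooks a Spark session is preconfigured.

```python
# Minimal sketch of "one copy of data": the same Delta/Parquet files in
# OneLake back both the Spark engine and the SQL endpoint. Table and column
# names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in Fabric notebooks

# Spark engine reading a Lakehouse table stored once in OneLake:
sales = spark.read.format("delta").load("Tables/sales")
sales.groupBy("region").sum("amount").show()

# The Lakehouse's T-SQL endpoint queries the very same files, e.g.:
#   SELECT region, SUM(amount) FROM sales GROUP BY region;
```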

Introducing AI to empower developers

The other remarkable feature of Fabric is the inclusion of AI across the solution. This means introducing Copilot into all building blocks of Fabric to assist you in your work and increase your efficiency. For example, in the future you will be able to ask Copilot to build an HR report for you in Power BI. It will be interesting to see how well this feature works. With Copilot, Microsoft aims to empower citizen developers to be a more integral part of the data development process and thus help organizations become even more data-driven. Most of the Copilot features are still in private preview, though, so we all must wait a bit longer to get our hands on these cool new features.

A more sustainable tomorrow through innovation in resource efficiency

At Evitec, we have already begun exploring the capabilities that Microsoft Fabric offers. Our own OneLake is already up and running, and we are well on our way to uncovering the possibilities of Fabric. While the service is still in preview mode, and some teething problems are expected, many of the features seem promising. We are truly impressed by its ability to eliminate the need for data duplication.

As the volume of data in the world continues to grow, so does the carbon footprint of data storage. And as we strive towards a more sustainable tomorrow, it is important that data solutions, too, are designed to be as resource-efficient as possible, and here Fabric seems to make a clear difference by keeping only one copy of the data. Provided, of course, that processing the data does not cancel out the benefits gained from the reduction in storage.

Time will tell whether Fabric can deliver on all the promises Microsoft has made for it, but if it does, we think Fabric is a real game changer in the data field. Join us on the journey to unravel the potential of your data with Microsoft Fabric!

Written by

Henni Niiranen

Data Consultant

29.03.2023 | Blog Featured Insights

Why is sustainability on the agenda of an IT company?

Not long ago, I was having dinner with some friends, one working in the energy industry, the other in banking. At some point the discussion turned to sustainability and environmental issues, and one of my friends commented, “Well, working for an IT company, you don't need to deal with these topics at work.” What? No – so wrong!

The perception that not all industries and companies need to get involved in sustainability is simply wrong. Sure enough, when working in energy production or with sustainable financing, the scale of the difference you can make might be different. But promoting sustainability isn't a question of how much or how little you can contribute. Every action counts.

Development to ensure peace and prosperity for people and the planet

It's a common misconception that sustainability is only about the environment and global warming. This is rightly the largest challenge, and as the latest climate change report from the Intergovernmental Panel on Climate Change (IPCC) highlights, there is much to do and not that much time. But, as the abbreviation ESG – environmental, social, and governance – implies, sustainability involves more and incorporates a profound human aspect.

Living in one of the most equal societies, with strong labour legislation, we easily neglect topics like diversity and equity. Therefore, documented policies like Diversity, Equity, and Inclusion (DEI) and Sustainable Procurement are important, as they clearly state the company's standpoint on the subject, what employees can expect from the employer and, on the other hand, what is expected of employees.

Coming back to the environment, the software and consultancy industry also has numerous possibilities to promote environmental aspects. There are big-ticket items, like green coding, through which we can directly influence how much energy the code we develop uses. And there are small-ticket items, like paying attention to energy consumption in our office and business travel, exploring possibilities to extend the lifetime of the equipment we use, and making sure we have recycling procedures in place.

Work in progress

Something I think we can all agree upon: sustainability work is far from done, and we still have plenty of work ahead of us. Promoting sustainability is complex and manifold. On the environmental side, the Paris Agreement sets a tough goal. And as sustainability is also surrounded by controversy, like greenwashing and questionable CO2 calculations, it will require long-term commitment from companies to scrupulously strive for genuine and credible sustainability.

During the past year and a half, I've had the pleasure of working part-time in our Sustainability team. This has been rewarding, both because the topic is of personal interest to me and because it has been even more educational than I expected. It's remarkable how a topic starts to evolve and grow when it's a regular discussion point. The more my colleagues talk about the topic, the more new and interesting views and facts are brought to the table.

Looking back at our work so far, it was with great pride that I could tell my friends that sustainability is indeed on the agenda of our company and something we work on daily.

15.02.2023 | Blog Insights

Profit Life & Pension is now Evitec Life

At the end of January, the name of our company, Profit Software, changed, and we are now Evitec.

At the same time, the name of the Profit Life & Pension (PLP) system changed, and it is now called Evitec Life (EL). At a time of change, it's good to take a look in the rear-view mirror and reflect on how, with deep dedication and professionalism, we have been developing a policy management system for life and pension insurers for more than 30 years.

To begin with, the focus was on developing a system for managing savings and pension insurance, an entity that we from now on call Evitec Life Savings (ELS). Evitec Life Savings includes all key business processes, such as product management, sales, policy lifecycle management, and claims payouts. The system supports the different types of investment forms typically used in insurance savings, such as unit-linked, interest-rate, combinations of these, and capitalization agreements. In addition to individual pension savings agreements, the system has versatile support for managing large and diverse group policies.

Evitec Life Risk (ELR) was the next major development area and forms another significant entity, managing all types of personal risk insurance. The individual needs of various personal risk covers are comprehensively considered, for example the requirements of managing underwriting decisions and different deductible options. Calculation of insurance premiums and invoicing are also features of Evitec Life Risk.

Evitec Life Claims (ELC) has been our latest development area and forms the third entity, where claims for personal risk covers are managed. Especially when it comes to claims processing, there are distinct differences between personal risk covers, for example in what information and documentation about the event is needed and whether the compensation is a lump sum or recurring. In Evitec Life Claims, you can manage the end-to-end claims process, covering all stages: registering the claims event, making the claims decision, managing all aspects of the payment – beneficiaries, compensation shares, bank accounts, taxation – and finally, paying out the compensation.

For a moment, let’s go back to Evitec Life as a whole and some of the system’s general features. 

  • A parametric product structure and easily modified business rules are Evitec Life's evident strengths. The flexibility to modify products is a significant benefit when new products are introduced to the market or when run-off portfolios are migrated to the new system.
  • API interfaces are widely used throughout the system, and their number is constantly increasing. Standard interfaces enable, for example, the development of digital services, process automation, and system integration with other applications. Relevant policy information is easily displayed where needed. The insured can see up-to-date contract information in the customer portal and can make changes that are transmitted back to the policy management system. The claims handler has relevant policy information at hand during the various steps of the process, which in turn speeds up processing and streamlines decision-making.
  • Regulatory and authority requirements are also supported in Evitec Life. The system has integrations, for example with the tax authority, and Evitec Life also supports AML, GDPR, and IDD regulations.

Over time, Evitec Life has developed into a comprehensive solution capable of handling the day-to-day operations of a life insurer's long-term savings, pension, and life risk products. Carefully thought-out functionalities, modern technology, and system adaptability elevate the efficiency of daily operations to a whole new level.

If you want to hear more about Evitec Life and get a taste of the functionalities, get in touch with us!  

Jani Boström

VP, Sales and Product management

Tel. +358 40 528 6011

When an insurer plans a system renewal, the primary focus is usually on how the new system supports needs today and in the future. However, few insurance companies start from scratch. Especially within life insurance, policies may be more than 50 years old. Therefore, the migration of run-off portfolios usually pops up at some point during the renewal project.

Older systems often have an “uncontrolled flexibility”, a feature that was originally regarded as quite handy. Individual policy details could be modified in many ways, and not all information had a designated place or format. Thus, over time, users may have entered the same information in different places and, for example, dates in different formats. Older policies also do not always have all the information required by the new structure, in which case the policy information needs to be enriched. Not to mention file formats, which have changed over the years. There are certainly many more examples. And now, 15–20 years later, when this rather mixed data has to be adapted to the structures of the new system, we are faced with a data cleaning task. The scope of a migration project can often be a bit of a surprise, but luckily there are tools available to help.

The power of collaboration

In data migration, cooperation between the insurance company and the system supplier is key. The insurance company knows its old products and can foresee some of the challenges in the data structures. The system supplier, on the other hand, knows the logic and structure of the new system inside out. When a mechanism for checking the quality and consistency of the data is created between the two, even a difficult migration becomes easier.

The three phases of migration

Data migration can be divided into three phases. In the first phase, the migration is planned and the portfolios are studied down to the smallest detail. The first steps are taken with smaller test data, and the creation of data mapping rules starts. At the same time, the insurance company often considers whether some product portfolios can be combined to simplify the management of portfolios in the future.

In the next phase, our conversion tool takes centre stage. It is used to check whether the data to be migrated is consistent and compatible with the new system. Rarely, if ever, is older data ready at once. The conversion tool provides feedback on differences and inconsistencies, such as data fields that cannot be matched to the new structure, missing data fields, or data in an inappropriate format.
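To give a flavour of the kinds of checks involved, here is a hypothetical sketch of validation rules for a legacy policy record, flagging missing mandatory fields and dates entered in unrecognized formats. The field names and rules are invented for illustration; this is not Evitec's actual conversion tool.

```python
# Hypothetical validation rules of the kind a conversion tool applies before
# migration. Field names, the mandatory set, and date formats are illustrative.
from datetime import datetime

MANDATORY_FIELDS = {"policy_no", "holder_name", "start_date"}
KNOWN_DATE_FORMATS = ("%d.%m.%Y", "%Y-%m-%d", "%d/%m/%Y")  # legacy variants

def parses_as_date(value: str, fmt: str) -> bool:
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

def check_policy(record: dict) -> list[str]:
    """Return human-readable findings for one legacy policy record."""
    findings = []
    missing = MANDATORY_FIELDS - record.keys()
    if missing:
        findings.append(f"missing mandatory fields: {sorted(missing)}")
    raw_date = record.get("start_date", "")
    if raw_date and not any(parses_as_date(raw_date, f) for f in KNOWN_DATE_FORMATS):
        findings.append(f"unrecognized date format: {raw_date!r}")
    return findings

# A legacy record with a free-text date entered by a user long ago:
print(check_policy({"policy_no": "123", "start_date": "31st Jan 1998"}))
```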

This is where the actual data cleaning begins. The same data may be run through the conversion tool several times until it can be stamped OK. Finally, policy lifecycle testing is done to ensure that everything matches in the future as well. For the work to progress promptly, the conversion tool is also made available to the insurance company. Hence, the actual experts on the portfolios and those working on data cleaning can independently test the changes and updates. All in all, a time-consuming phase, but the work is rewarded in the last phase.

The actual migration is often the fastest phase. When the old data has been processed and its compatibility verified, this is largely a technical routine in which the converted policies float smoothly into the new system. As a final check, the outcome is reconciled with the source data.

Extensive experience of migrations

In addition to the conversion tool, Evitec Life‘s accurate description of the data structure makes migration work significantly easier. The description gives the customer a clear view of which information is needed and in which format.

At Evitec we have carried out system migrations for several decades. We have converted nearly one hundred portfolios and hundreds of thousands of policies. So, it’s fair to say that our experience has built up over time and our migration process and tools have been put to the test in many demanding projects.

Perttu Heinonen, SVP Consulting Financial Services

When I watched Adam McKay's excellent film The Big Short, originally released back in 2015, the first thought to pop into my head was: “Finally, a movie that will explain to my parents why we make these systems for banks!” In the film, the actress Margot Robbie, soaking in a bubble bath, explains how the derivatives that launched the financial crisis in the U.S. in 2007 were built on mortgage-backed bonds. The scene was an ironic take on the fact that few people would normally have the patience to listen to long-winded explanations riddled with financial terminology. The product structure was so complex that it concealed as well as concentrated the underlying risks of the housing market.

In Finland, the situation has been better, but the fundamental mechanism is still the same. Our mortgages are mainly financed by foreign investors, not deposits. These investors receive mortgages as collateral for the money they lend. The interest rate on the money provided by the investors depends on the quality of the collateral: the better the collateral, the lower the interest rate. The quality of Finnish housing collateral has been good, but recently a noticeable risk has arisen, especially in regions experiencing net outflows. This calls for transparency in Finland as well, in order to ensure that the collateral meets the investors' and credit rating institutions' criteria. At best, the nearly one hundred reports targeted at different agencies are generated automatically; at worst, dozens of people type them into Excel spreadsheets manually.

The required transparency and quality of reporting is one of the reasons why the process needs a separate system. Another is daily optimisation: collateral is mobile by nature, as homes are sold and purchased and loans are paid back every day. Insolvency and credit losses are part of the lenders' daily life. Handling hundreds of thousands of collateral assets requires that the bank maintain a safety margin to ensure the availability of collateral. The smaller this safety margin can be made, the more external funding the bank is able to obtain.
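A toy calculation makes the idea of the safety margin concrete: the cover pool pledged as collateral must exceed the bonds issued against it, and every point of margin that optimisation frees up is funding capacity. The figures below are invented for illustration.

```python
# Toy illustration of the over-collateralization (OC) safety margin.
# All figures are invented; real requirements come from law, rating
# agencies, and investor criteria.
def over_collateralization(cover_pool_value: float, bonds_outstanding: float) -> float:
    """OC ratio: how much extra collateral backs the issued bonds."""
    return cover_pool_value / bonds_outstanding - 1.0

pool = 1_050_000_000    # eligible mortgage collateral in the cover pool (EUR)
bonds = 1_000_000_000   # covered bonds outstanding (EUR)

print(f"OC = {over_collateralization(pool, bonds):.1%}")  # -> OC = 5.0%
```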

Our Evitec Covered Bonds is an enterprise resource planning (ERP) system for a bank issuing covered bonds. According to our estimate, our system processes over 40 per cent of Finnish mortgages as it optimises collateral for covered bonds. Our users include OP, the recently listed OmaSp, S-Bank, and Hypo (The Mortgage Society of Finland), the only credit institution in Finland specialising in housing. The reliability of the system and service thus has great social importance, as is the case with Profit Software's products more broadly. We have comprehensive expertise in mortgage bank IT systems that meet legal requirements. We are happy to help you launch or automate your business, or modernise your existing system.

Perttu Heinonen, SVP Consulting Financial Services, Evitec

When discussing a system renewal, the hot topics include digitalization, automation, conversions, and migration. And there is nothing wrong with these: all are important factors in ensuring that the new system operates as wished and delivers the expected benefits. But will a system renewal bring the users something beyond a new interface?

Annika Karppinen, Evitec Life Product Manager

”We’ve always done it like this”

I'm sure we have all at some point come across the saying “we've always done it like this”. The same attitude can also appear during a system renewal. When the level of automation increases, the amount of routine manual work decreases. And the logic of the new system might differ from the old one. These factors automatically lead to changes in work processes as well. Therefore, a system renewal should be seen as a more holistic renewal, not just a shift in technology. For the users, this means getting used to both a new interface and new work processes and routines.

Technical and mental transformation

The project team members get to know the new operating platform step by step. Demos of partial deliveries, and particularly the testing phase, are great moments to discuss the functionalities of the new system and listen to the system vendor's viewpoints on different solutions. These are also natural moments for reviewing current processes and routines and, when needed, forming new ones.

Trust in the new system and the rationale for the new work routines also build up during the project. Project team members have plenty of time to get used to the changes and go through a mental transformation from the old era to the new.

When the launch approaches and the rest of the organisation is brought along, the newcomers will not have the same timeframe for getting acquainted with everything new. For them, the pilot phase is often their first touch point with the new system and work processes, but as the pilot is a much shorter phase than the project, the rest of the organization needs to absorb everything new much faster. Now the project team members have an important new role as ambassadors of the new era. They can support and explain the new processes and help smooth the transition. The rest of the organization will most likely have the same kinds of questions as the project team members had, and who better to answer them than those who have already been through this phase?

Adjustable standard system

If some part of the delivered system does not seem to quite fit the insurer's operations, customer-specific adjustments are a good solution. Evitec Life is a standard system developed for life insurers for administering pension, savings, and risk insurance policies and claims. Evitec Life has a parametrized product structure allowing flexible product configuration. Additionally, various system functionalities can be modified according to customer needs. Therefore, each delivery is to some extent customer-specific, although the base is the same. We are our customers' partner, and system renewals are planned, tested, and implemented in close co-operation. This way, we can deliver a solution that supports the customer's individual products, needs, and procedures.


Got interested? Contact sales@evitec.com

A few years ago, the Payment Services Directive 2 (PSD2) and Open Banking opened up a whole new world for handling payments. Now you can just flash your smartwatch at the cash register and pay for online shopping with just a few clicks. Open Banking brought new players alongside the banks, focusing on handling payment transactions. The eagerness to jump at this particular opportunity is easy to understand when you consider that, for example, Finnish payment cards were used 1.9 billion times in 2021. Even a small slice of these transactions offers decent revenue.

For the financial market, PSD2 was the prelude to sharing information more openly than before. Now the same topic is being discussed in the insurance industry as “Open Insurance” seeks its form. But what does Open Insurance mean, and what is it aiming at? There is no uniform definition yet. The European Insurance and Occupational Pensions Authority (EIOPA) published a discussion paper in 2021, “Open Insurance: Accessing and Sharing Insurance-Related Data”, which is based on a very broad definition: “covering accessing and sharing insurance-related personal and non-personal data usually via APIs”. EIOPA states increased innovation, competition, and efficiency as the main goals.

What would sharing of insurance information enable?

Considering how information-dense insurance is and how information today can be utilized for various use cases, it is clear that open sharing of insurance information would enable many kinds of product and service innovation. Still, it should be noted that insurers cannot share information freely. As in PSD2, the customer manages their information and decides what to share and with whom. This should push innovation to be highly customer-centric and is a starting point for new competitive factors.

When a customer holds insurance with several companies, sharing would enable collecting the scattered information in one place. The customer would get an overall view of their coverage, making it easier to get insurance guidance based on correct information and easier to ask for offers. Especially in Finland, collecting information on occupational pensions and other pension savings in one place could be a very useful use case, the Swedish minPension service being an excellent example of this idea.

Claims management is a critical point in the client relationship, and there are certainly many use cases for making it smoother. For example, what if my flight is delayed more than four hours, the threshold entitling me to compensation from my travel insurance? Could the information about the delay go directly from the airline to my insurer, and the compensation be paid automatically to my bank account?
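A hypothetical sketch of that flight-delay flow is below: the airline (or a flight-data provider) pushes a delay event to the insurer, which approves a payout once the policy threshold is crossed. Every name, type, and threshold here is invented for illustration; a real implementation would also verify the policy, compute the compensation from its terms, and trigger the actual payment.

```python
# Hypothetical parametric-claim flow for the flight-delay example above.
# All names and the threshold are invented for illustration.
from dataclasses import dataclass

DELAY_THRESHOLD_HOURS = 4.0  # compensation threshold from the travel policy

@dataclass
class DelayEvent:
    policy_no: str
    flight: str
    delay_hours: float

def handle_delay_event(event: DelayEvent) -> str:
    """Entry point a webhook would call when the airline reports a delay."""
    if event.delay_hours <= DELAY_THRESHOLD_HOURS:
        return "no compensation: delay below policy threshold"
    # Real flow: verify the policy is active, compute the compensation from
    # the policy terms, and trigger a payment to the insured's account.
    return f"compensation approved for policy {event.policy_no}"

print(handle_delay_event(DelayEvent("TRV-001", "AY123", delay_hours=5.5)))
```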

Interfaces in key position

Interfaces are a prerequisite for sharing and receiving information, but they might also turn out to be roadblocks. This was experienced with PSD2: differing standards for interfaces and APIs complicate the development of a fluent ecosystem. Let's hope this experience is taken into consideration as the Open Insurance directive takes shape.

Additionally, the aging IT infrastructure of insurers will set its own obstacles. With interfaces and APIs playing a central role in digitalization and already influencing many processes today, many insurers are pondering the best solution in the long run: continue building upon aging technology, or has the time come to renew core systems and start capitalizing on the benefits of digitalization?

The future of information sharing

Will Open Insurance cause the same kind of revolution as PSD2 did for mobile payments? Probably not, as within insurance there is no single transaction with even close to a similar frequency. Also, the Open Insurance directive seems to be proceeding rather slowly within the EU.

There are still many question marks attached to Open Insurance. Even so, the directive will come into force at some point. Therefore, it is good to start evaluating how, and with what kinds of solutions, to prepare for the possibilities Open Insurance offers.