15.09.2023 | Blog

Microsoft Fabric: Insights and Considerations from Our Customers

It has been almost four months since Microsoft Fabric entered preview at the end of May. Conversations with our customers have revealed significant interest and high expectations surrounding this offering. Let us explore the reasons behind this interest, based on our discussions with customers, and address essential considerations for those thinking about adopting Fabric.

Solving Traditional Problems

Interest in Microsoft Fabric primarily arises from its ability to tackle familiar challenges in cloud infrastructure management. Fabric simplifies complex cloud data infrastructures, previously composed of multiple services and platforms, into a unified SaaS service. This consolidation also extends to pricing, offering a more straightforward and clear cost structure. We will go into the pricing model in more detail shortly.

Data Science and Machine Learning Capabilities

Many of our customers are intrigued by Fabric’s built-in data science capabilities. Some already have advanced analytics solutions in their current environment and are eager to explore what Fabric could add to their toolkit. Others have focused on traditional business intelligence but are now considering taking their data-driven business further. Microsoft Fabric’s Synapse Data Science integration brings these capabilities to the forefront, and customers recognize the value of having these tools as part of the Fabric package.

Navigating Concerns

While the interest in Microsoft Fabric is apparent, it is not without its share of questions and concerns. One of the foremost considerations is the maturity level of the product. Given its brief time in the preview phase and the absence of an official release date, customers understandably question its readiness for production use. Moreover, there are still some rough edges in the product that need to be addressed before the official release. Fortunately, familiar tools like Data Factory, Power BI, and the Synapse tools retain their user experience within Fabric, so there is no need to learn them from scratch.

Pricing and Capacity Management

There are a few licensing options in the Fabric preview, mainly affecting Power BI and sharing capabilities, but at the base is time-based billing (per second, with a minimum of one minute) for provisioned compute, called Fabric capacity. Pricing varies by region and capacity size.
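To see how per-second metering with a one-minute minimum plays out, here is a back-of-the-envelope sketch; the hourly rate is a made-up placeholder, not an actual Fabric price:

```python
# Minimal sketch of pay-as-you-go billing: per-second metering
# with a 60-second minimum. The rate below is a placeholder.

HOURLY_RATE_EUR = 0.36  # hypothetical price for a small capacity

def compute_cost(seconds_running: float) -> float:
    """Cost of one capacity run, billed per second with a one-minute minimum."""
    billable_seconds = max(seconds_running, 60)
    return billable_seconds * HOURLY_RATE_EUR / 3600

print(f"{compute_cost(30):.4f} EUR")    # a 30 s run is billed as a full minute
print(f"{compute_cost(5400):.4f} EUR")  # a 90 min run is billed per second
```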

Currently, only a pay-as-you-go model is available, but a reserved-capacity pricing model is in the works. Additionally, OneLake storage is billed at a pay-as-you-go rate. Because costs accrue only while the capacity is running, efficient cost management means pausing the capacity when it is not needed, either manually or through automation. If Fabric eventually offers built-in auto-pause functionality for capacity, this concern will be alleviated. In the meantime, at Evitec, we have developed an automated solution that pauses capacity when not in use.
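For those who want to script the pausing themselves, a capacity can be suspended through the Azure Resource Manager REST API. The sketch below assumes the Microsoft.Fabric capacities resource type and its suspend action; the resource identifiers, the token acquisition, and the api-version string are placeholders to adapt to your environment:

```python
import requests

# Placeholder identifiers; fill in your own subscription, resource group,
# capacity name, and the current Fabric capacities api-version.
SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
CAPACITY = "<capacity-name>"
API_VERSION = "2023-11-01"  # placeholder, verify against current docs

def pause_capacity(bearer_token: str) -> None:
    """Suspend a Fabric capacity via Azure Resource Manager (sketch).

    The token could come, for example, from azure-identity's
    DefaultAzureCredential; acquiring it is out of scope here.
    """
    url = (
        f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}"
        f"/providers/Microsoft.Fabric/capacities/{CAPACITY}"
        f"/suspend?api-version={API_VERSION}"
    )
    response = requests.post(url, headers={"Authorization": f"Bearer {bearer_token}"})
    response.raise_for_status()
```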

Let us get back to the question of having enough capacity, that is, computing power, for your organization. In the simplest scenario, the entire organization can use a single capacity for all its needs in Fabric. When you start using Fabric, it is a good idea to begin with the smallest capacity and scale up as your usage grows and your organization’s needs become more diverse.

Now it is important to understand another fundamental term in Microsoft Fabric: workspaces. Workspaces function as containers for Fabric items and can be created, for example, per business domain; how to organize them is up to your organization. Each workspace is associated with a capacity, and multiple workspaces can share a single capacity. However, as your usage and use cases become more varied, your organization might want several capacities of different sizes for different purposes. In other words, you can have different levels of computing power at your disposal.
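To make the workspace-capacity relationship concrete, here is a toy model in plain Python (not Fabric APIs) of several workspaces sharing one capacity, with one of them later moved to a larger capacity:

```python
from dataclasses import dataclass

@dataclass
class Capacity:
    name: str
    sku: str  # e.g. "F2" (small) or "F64" (large)

@dataclass
class Workspace:
    name: str
    capacity: Capacity  # each workspace is assigned to exactly one capacity

# Several workspaces can share a single small capacity.
shared = Capacity("shared-capacity", "F2")
sales = Workspace("Sales", shared)
finance = Workspace("Finance", shared)

# A heavy workload appears in Finance: reassign it to a larger capacity.
heavy = Capacity("heavy-compute", "F64")
finance.capacity = heavy

print(sales.capacity.sku, finance.capacity.sku)  # F2 F64
```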

Changing the capacity that a workspace uses is a straightforward process. If a specific workspace temporarily or permanently needs more computing power, for heavy calculations, for example, you can scale up the capacity that workspace uses. In this way your organization’s Fabric capacities and workspaces can evolve as your experience with Fabric grows.

Security and Region Availability

Security is paramount in data solutions, and many companies have strict policies about where their data may be stored. Currently, the Fabric preview is not available in all regions. For example, as of this writing, the closest Azure regions to Finland offering the Fabric preview are West Europe and Norway East. We hope that regional availability will expand when Microsoft releases the official version. However, if company policy dictates a specific region that does not currently support the Fabric preview, this may raise concerns about considering Fabric as an option.

Migration Strategy Depends on the Current Data Environment

Customer situations regarding their current data solutions vary greatly. Some rely solely on Excel-based reporting, while others maintain on-premises data warehouses, and some have extensive experience with cloud-based data solutions. These varying starting points influence their needs and questions concerning Fabric.

For those migrating from on-premises environments to the cloud, Microsoft Fabric provides a straightforward option, albeit one requiring a clean-slate approach. In contrast, businesses with existing cloud environments want to understand how Fabric complements their current stack and what its potential is for hybrid solutions. Each customer’s starting point significantly influences the migration effort required. Thanks to Fabric’s one-copy approach to data, development should be faster than in previous cloud migrations.

Conclusion

Microsoft Fabric has garnered significant interest and raised expectations among our customers. Its ability to simplify cloud infrastructure management, offer powerful data science capabilities, and address traditional challenges is compelling. However, it is essential to address concerns around product maturity, pricing, capacity management, security, and region availability before making a decision.

We at Evitec Solutions are excited about the potential that Microsoft Fabric brings to the world of data and analytics and look forward to its continued evolution.

Written by

Henni Niiranen

Data Consultant

05.06.2023 | Blog

Painting the future with Microsoft Fabric – data landscape in one frame

The data world is abuzz with excitement as Microsoft has launched a public preview of its latest offering, Microsoft Fabric. This so-called all-in-one analytics solution has generated significant hype across the data community, promising to revolutionize and simplify data and analytics infrastructures and to bring “data into the era of AI”. What does all this mean in practice? Take a minute and let us tell you what Fabric is all about.

Microsoft Fabric is a Software-as-a-Service (SaaS) solution that wraps all the different components of the data landscape together in one package. With one licence you get everything you need for your data environment: Data Factory, Synapse, Power BI, and OneLake. You no longer need to buy the different resources separately; everything is included in a single service that is managed and governed centrally.

OneLake = centralized data storage for all your analytics data

OneLake is one of the most remarkable features of Fabric, as it aims to mitigate the need for data duplication across the whole solution. Those who have worked with data infrastructures probably know how common it is for data to be duplicated across a solution’s layers so that different analytical engines can support the different use cases of the data. In OneLake the data is stored in compressed Parquet format, and all the different analytical engines within Fabric can query the same data efficiently.

To put this in context, both the T-SQL engine used for building a data warehouse and the Analysis Services engine behind Power BI reports can use the same data equally efficiently. Microsoft promises to extend this “one copy of data” paradigm further by enabling shortcuts, so that different teams can use the same data for their specific purposes by creating virtual data products. In addition, OneLake can extend the lake into third-party data storage, such as Amazon S3, without the data having to be physically moved into OneLake. Quite impressive.
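As a sketch of what this enables in practice, the snippet below reads a Parquet file straight out of OneLake with plain pandas. The workspace, lakehouse, and file names are hypothetical, and the ABFS path pattern and authentication setup should be verified against Microsoft’s documentation for your tenant:

```python
import pandas as pd  # reading abfss:// paths also requires the adlfs package

# Hypothetical workspace, lakehouse, and file names; verify the OneLake
# endpoint, path pattern, and your credentials before relying on this.
path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyLakehouse.Lakehouse/Files/sales/orders.parquet"
)

# The point of "one copy of data": any Parquet-capable engine can read
# the same file that OneLake stores for Fabric's own engines.
df = pd.read_parquet(path)
print(df.head())
```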

Introducing AI to empower developers

The other remarkable feature is the inclusion of AI across the whole of Fabric. This means introducing Copilot into all of Fabric’s building blocks to assist you in your work and increase your efficiency. For example, in the future you will be able to ask Copilot to build an HR report for you in Power BI; it will be interesting to see how well this works. With Copilot, Microsoft aims to empower citizen developers to be a more integral part of the data development process and thus help organizations become even more data-driven. Most of the Copilot features are still in private preview, though, so we all must wait a bit longer to get our hands on these new capabilities.

More sustainable tomorrow through innovation in resource efficiency

At Evitec, we have already begun exploring the capabilities that Microsoft Fabric offers. Our own OneLake is already up and running, and we are well on our way to uncovering the possibilities of Fabric. While the service is still in preview and some teething problems are to be expected, many of the features look promising. We are truly impressed by its ability to eliminate the need for data duplication.

As the volume of data in the world continues to grow, so does the carbon footprint of data storage. And as we strive towards a more sustainable tomorrow, it is important that data solutions, too, are designed to be as resource-efficient as possible. Here Fabric seems to make a clear difference by keeping only one copy of the data, provided, of course, that processing the data does not cancel out the benefits gained from reduced storage.

Time will tell whether Fabric can deliver on all the promises Microsoft has made for it, but if it does, we think Fabric will be a real game changer in the data field. Join us on the journey to unravel the potential of your data with Microsoft Fabric!

Written by

Henni Niiranen

Data Consultant

When an insurer plans a system renewal, the primary focus is usually on how the new system supports needs today and in the future. However, few insurance companies start from scratch. Especially in life insurance, policies may be more than 50 years old. Therefore, the migration of run-off portfolios usually pops up at some point during the renewal project.

Older systems often have an “uncontrolled flexibility”, a feature that was originally regarded as quite handy. Individual policy details could be modified in many ways, and not all information had a designated place or format. Thus, over time, users may have entered the same information in different places and, for example, dates in different formats. Older policies also do not always contain all the information required by the new structure, in which case the policy data needs to be enriched. Not to mention file formats, which have changed over the years. There are certainly many more examples. And now, 15–20 years later, when this rather mixed data should be adapted to the structures of the new system, we are faced with a data cleaning task. The scope of a migration project can often be a bit of a surprise, but luckily there are tools available to help.
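As a tiny illustration of one recurring cleaning task, the sketch below normalizes dates entered in mixed formats over the years; the format list is hypothetical and would in practice come from studying the actual portfolio:

```python
from datetime import date, datetime

# The formats actually encountered vary per portfolio; these are examples.
KNOWN_FORMATS = ["%d.%m.%Y", "%Y-%m-%d", "%d/%m/%y"]

def normalize_date(raw: str) -> date:
    """Try each legacy format in turn and return a proper date."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {raw!r}")  # goes to manual review

print(normalize_date("01.02.1975"))   # 1975-02-01
print(normalize_date("1975-02-01"))   # 1975-02-01
```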

The power of collaboration

In data migration, cooperation between the insurance company and the system supplier is key. The insurance company knows its old products and can foresee some of the challenges in the data structures. The system supplier, on the other hand, knows the logic and structure of the new system inside out. When a mechanism for checking the quality and consistency of the data is created between the two, even a difficult migration becomes easier.

The three phases of migration

Data migration can be divided into three phases. In the first phase, the migration is planned and the portfolios are studied down to the smallest detail. First steps are taken with smaller test data sets, and the creation of data mapping rules begins. At the same time, the insurance company often considers whether some product portfolios can be combined to simplify portfolio management in the future.

In the next phase, our conversion tool takes centre stage. It is used to check whether the data to be migrated is consistent and compatible with the new system. Rarely, if ever, is older data ready at once. The conversion tool provides feedback on differences and inconsistencies, such as data fields that cannot be matched to the new structure, missing data fields, or data in an inappropriate format.

This is where the actual data cleaning begins. The same data may be run through the conversion tool several times until it can be stamped as OK. Finally, policy lifecycle testing is done to ensure that everything matches in the future as well. For the work to progress promptly, the conversion tool is also made available to the insurance company, so that the actual experts on the portfolios and those working on data cleaning can independently test changes and updates. All in all, it is a time-consuming phase, but the work is rewarded in the last phase.
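The conversion tool itself is proprietary, but conceptually its consistency checks resemble the following sketch, which runs each policy record against target-structure rules and collects findings for the data cleaning team. The field names and rules here are invented for illustration:

```python
from typing import Callable

# Hypothetical target-structure rules: field name -> validity check.
RULES: dict[str, Callable[[object], bool]] = {
    "policy_number": lambda v: isinstance(v, str) and v != "",
    "start_date": lambda v: isinstance(v, str) and len(v) == 10,  # ISO date
    "sum_insured": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def check_policy(policy: dict) -> list[str]:
    """Return human-readable findings for one migrated policy record."""
    findings = []
    for field_name, is_valid in RULES.items():
        if field_name not in policy:
            findings.append(f"missing field: {field_name}")
        elif not is_valid(policy[field_name]):
            findings.append(f"invalid value in {field_name}: {policy[field_name]!r}")
    return findings

print(check_policy({"policy_number": "ABC-123", "start_date": "1.2.1975"}))
# ["invalid value in start_date: '1.2.1975'", 'missing field: sum_insured']
```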

The actual migration is often the fastest phase. Once the old data has been processed and its compatibility verified, this is largely a technical routine in which the converted policies flow smoothly into the new system. As a final check, the outcome is reconciled with the source data.

Extensive experience of migrations

In addition to the conversion tool, Evitec Life‘s accurate description of the data structure makes migration work significantly easier. The description gives the customer a clear view of which information is needed and in which format.

At Evitec we have carried out system migrations for several decades. We have converted nearly one hundred portfolios and hundreds of thousands of policies. So, it’s fair to say that our experience has built up over time and our migration process and tools have been put to the test in many demanding projects.


Do your information systems contain data that should be brought within the scope of GDPR requirements, or are you unsure whether they do? Regulatory requirements demand that personal data be handled confidentially and securely. This also applies to companies’ information systems, including, for example, test environments.

Personal data must be masked or obfuscated so that it cannot be viewed directly or linked to other data in the system, while the data structures nevertheless remain intact. Depending on the use case, the masking is performed with algorithms that either anonymize or pseudonymize the data; in English this is known as data masking. Without careful protection of personal data, data from real use cases cannot be utilized in test environments.

We have developed an efficient, transparent, and agile solution for obfuscating, i.e. masking, personal data. It is also structurally lightweight and user-friendly. With our solution, protecting personal data in large systems can at best be very simple and straightforward. Our experts already have extensive experience of data protection projects. We know what the regulatory requirements mean in practice, and what they imply for systems and project management.

An agile solution that suits many situations

Our dynamic data masking solution processes the source material and delivers to its consumers data from which personal information has been removed as required by law.

When a company wants to adopt the solution, we work in close cooperation to guarantee a successful project. Our experts define, i.e. configure, together with the customer the data to be utilized, and business needs guide, for example, the selection of data sources. Depending on the case, it is also decided whether the masking is done by anonymization or whether pseudonymization is sufficient. The data structures remain identical to the original material, and you can mask only specific database fields or an entire table, as needed.
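To illustrate the difference between the two approaches (our actual solution is configuration-driven; the functions, key, and token format below are only an illustrative sketch):

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me"  # pseudonymization key, kept out of the test environment

def pseudonymize(value: str) -> str:
    """Deterministic masking: the same input always yields the same token,
    so joins between tables keep working, but the original stays hidden."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:12]

def anonymize(_: str) -> str:
    """Irreversible masking: the original value is simply discarded."""
    return "XXXXXX"

print(pseudonymize("010203-123X"))  # same token on every run -> structure intact
print(anonymize("010203-123X"))     # no way back to the original
```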

The defined rules are visible to the customer at all times, and changes are easy to make when needed. Once the definitions are in place and the source loads are running, our solution masks the data automatically and efficiently, and the finished material is ready for use. The solution suits many kinds of needs and can serve several parallel consumers, whether people or systems, such as data warehouse test environments.

The advantages of our solution are its agility, transparency, and simplicity, which also make it easy to use. Manual work is minimal and mostly limited to the initial definitions.

We are ready to help you protect the personal data in your company’s information systems as required by the GDPR.

If you would like to discuss this further, send us a contact request at sales@evitecdata.local. Our experts will respond promptly.

Also explore our other analytics and data-driven management services.

Read also:

https://profitsoftware.com/yksinkertainen-ratkaisu-suojaa-henkilotietoja-yritysten-testiymparistoissa/


Evitec and Live Foundation have been working on an enterprise-level business intelligence solution for some time now, and the partnership has been smooth sailing. Positive past experiences of cooperation and a competitive quote led Live Foundation to choose Evitec as the supplier of its new, more efficient internal order management system.

Evitec’s proven UX design and software development expertise, combined with Microsoft Power Apps technology, were the key to solving Live Foundation’s very specific needs. Power Apps, a user-friendly tool for agile software development, is part of Microsoft Power Platform, in which Evitec has invested heavily in recent years. Power Platform technology made the new order management system extremely quick to build.

“Evitec gave us everything we wanted and more, with open communication and iteration throughout the project”, says Live Foundation’s IT Manager Jouni Vesanen.

The new solution has given Live Foundation a consistent way to process internal orders, reducing the amount of manual labour and lowering the risk of human error. With integrated Power BI-based reporting, internal orders are now visible to all authorized users.

Live Foundation’s new system is an excellent example of how process digitalisation can quickly add value. Power Platform technology is especially well suited to situations where a suitable software product is not available off the shelf, but embarking on a major software development project is not a practical option either. Power Platform is ideal for developing software quickly and cost-effectively on a no-code/low-code basis with minimal programming.

“Power Platform is also great in that it allows us to keep developing the system in-house”, Vesanen explains.

Live Foundation’s mission is to enable as many people as possible to function as full-fledged members of Finnish society. Live Foundation maintains two professional units: Vocational College Live, which is the third biggest special-needs vocational school in Finland, and Live Services, a provider of vocational rehabilitation, training and consultancy to promote employment. Live Foundation was established in 1940, and it employs just over 600 experts.