14.1.2021

Artificial Intelligence: Don’t Let a Biased Algorithm Ruin Your Business

The COVID-19 crisis and recent data breaches have highlighted the importance of trustworthy data processing. In its white paper on artificial intelligence, the European Commission also emphasised that trustworthiness is a prerequisite for the adoption of digital technology. Ethical AI is one of the EU’s key goals. Algorithmic biases are a particular risk that can undermine the trustworthiness of AI, and everyone developing or adopting AI applications should be prepared to deal with such biases.

AI is Biased and Can Lead to Discrimination

AI is biased by default, as its functioning is based on data and rules that are input and defined by people. This bias can arise for a number of reasons. The data used to train AI could be factually incorrect, incomplete or otherwise likely to cause algorithmic bias. The algorithm itself could also have been designed in a way that gives weight to certain grounds for discrimination, such as gender or language. Biases are particularly significant when people are subject to automated decisions made by an AI application. If a bias is not corrected in time, such decisions can lead to discrimination.

A discriminatory AI could arise, for example, because the application has been trained primarily on data from men and therefore does not produce optimal results for women. Specific features of many AI applications, such as opacity, complexity, unpredictability and partially autonomous behaviour, can also be problematic from the perspective of the protection of fundamental rights.
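The effect described above can be made visible with a simple group-wise evaluation of a trained model. The following is a minimal Python sketch under purely illustrative assumptions (synthetic data and made-up groups "A" and "B"); it is not drawn from any case discussed here. A model is trained on data in which one group is under-represented and harder to predict, and its accuracy is then compared per group, which is often the first signal of a biased model.

# Minimal sketch (illustrative only): detecting a group-wise performance gap.
# Groups, data and numbers are synthetic assumptions, not from any real system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_data(n, share_a):
    # Group "B" is under-represented and its labels are noisier than group "A"'s.
    group = rng.choice(["A", "B"], size=n, p=[share_a, 1 - share_a])
    x = rng.normal(size=(n, 3))
    noise = rng.normal(scale=np.where(group == "B", 1.5, 0.5))
    y = (x[:, 0] + 0.5 * x[:, 1] + noise > 0).astype(int)
    return x, y, group

x_train, y_train, _ = make_data(5000, share_a=0.9)   # training data dominated by group A
x_test, y_test, g_test = make_data(2000, share_a=0.5)

model = LogisticRegression().fit(x_train, y_train)
pred = model.predict(x_test)

# Compare accuracy per group: a clear gap is an early warning sign of bias.
for g in ("A", "B"):
    mask = g_test == g
    print(f"group {g}: accuracy = {accuracy_score(y_test[mask], pred[mask]):.3f}")

In a real deployment, the same comparison would be run on actual validation data, across all relevant groups, and repeated whenever the model or its training data changes.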

Automated decision-making and the related risk of discrimination have given rise to a great deal of debate. Decisions made by AI have even been referred to the Non-Discrimination Ombudsman. One case was brought before the National Non-Discrimination and Equality Tribunal, which in 2018 issued a decision prohibiting the use of the AI in question and imposed a conditional fine to enforce it.

The case concerned a credit institution that had used a fully automated decision-making system. The system scored consumer credit applicants by comparing each applicant’s data to corresponding statistics in order to determine creditworthiness. The system was found to discriminate against applicants based on, among other things, gender and native language. It is worth noting that even small choices made by AI, such as directing a certain type of applicant to customer service, can be deemed decisions.

AI Needs Supervision and Good Agreements Are Key

In order to prevent biases, the functioning of AI applications must be regularly monitored and evaluated, whether the application has been developed in-house or acquired from an outside provider. Because AI reflects the views and values of the people who developed it, acquiring AI applications from abroad involves risks of its own that call for caution: the AI may display biases rooted in the developers’ cultural norms.
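What such regular monitoring can look like in practice is sketched below with a purely illustrative Python example: it computes the ratio of positive-decision rates between groups over a batch of automated decisions and raises a warning when the gap grows large. The groups, the example batch and the 0.8 alert threshold are assumptions made for this sketch, not a legal standard.

# Minimal monitoring sketch (illustrative only): flag large gaps in approval rates.
# Groups, the example batch and the 0.8 threshold are assumptions for this sketch.
from collections import defaultdict

def approval_rates(decisions):
    # decisions: iterable of (group, approved) pairs -> approval rate per group
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def parity_ratio(rates):
    # Lowest approval rate divided by the highest; 1.0 means equal rates.
    return min(rates.values()) / max(rates.values())

# Example batch of automated decisions (synthetic).
batch = [("women", True), ("women", False), ("women", False),
         ("men", True), ("men", True), ("men", True), ("men", False)]

rates = approval_rates(batch)
ratio = parity_ratio(rates)
print(rates, f"parity ratio = {ratio:.2f}")
if ratio < 0.8:  # illustrative alert threshold
    print("Warning: large gap in approval rates - review the model and its training data.")

A single metric like this does not prove or rule out discrimination, but tracking it over time gives an early, documentable signal that the application needs closer review.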

As there is still little to no legislation specifically concerning AI, the rights and obligations of the parties agreeing on AI applications are based on the agreements between them. Among other things, the contractual terms should provide for liability for defects and for audits. Agreeing on liability for defects is important, for example, in case the data used to train the AI is deficient or the algorithm functions incorrectly. An auditing clause should set out how the correct functioning of the AI is to be verified if the AI’s decision logic is not sufficiently transparent. Although the field still lacks actual legislation, the EU’s ethical guidelines for AI provide a framework for the responsible use of AI.
