The AI Act is still a relatively new legal instrument on the EU regulatory agenda, yet we are already witnessing how it transforms the way businesses think about technology and regulation. The AI Act is not merely a product of academic thinking; it builds on standards and best practices designed largely to support business needs. A lesson learned from highly regulated industries, such as financial services, energy or critical infrastructure, is that the AI Act may be seen not only as another regulatory obstacle for business, but also as an opportunity to view technology (not just AI) from a different angle.
Implementing new technologies always poses a challenge for businesses. The implementation process itself can prove time-consuming, budget-draining and demanding in terms of organizational capacity. Many more risks, however, arise at the stages of selecting, implementing and exploiting new solutions, and AI technologies add further pins to the implementation map that need to be factored into project navigation.
One major example is managing data compliance. Surprisingly, many businesses still think of “data” primarily in terms of personal data. Obviously, personal data protection and the potential unauthorized use of personal data for training models remain a major concern when it comes to AI adoption in business. Oftentimes it is GDPR (not the AI Act) that proves to be the main obstacle to AI deployments. In practice, data compliance goes beyond GDPR and requires businesses to ensure that the data used for AI training is relevant, sufficiently representative and free of errors. This assessment should also take into account, for example, the specific geographical, contextual, behavioral or functional setting within which a high-risk AI system is intended to be used.
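To make this requirement more tangible, below is a minimal sketch, in Python, of the kind of automated checks a deployment team might add to its data governance process. The column names, the 5% representativeness threshold and the use of pandas are illustrative assumptions for this sketch, not anything prescribed by the AI Act.

```python
# Illustrative sketch only: the AI Act does not prescribe specific checks or thresholds.
import pandas as pd

def check_training_data(df: pd.DataFrame, required_columns: list[str],
                        context_column: str, min_share: float = 0.05) -> list[str]:
    """Return a list of data-governance findings for a training dataset."""
    findings = []

    # Relevance: the features the model actually needs must be present.
    missing = [c for c in required_columns if c not in df.columns]
    if missing:
        findings.append(f"Missing required features: {missing}")
        return findings

    # Error-freeness (proxy): flag null values and duplicate records.
    for col in required_columns:
        nulls = int(df[col].isna().sum())
        if nulls:
            findings.append(f"Column '{col}' has {nulls} missing values")
    duplicates = int(df.duplicated().sum())
    if duplicates:
        findings.append(f"{duplicates} duplicate records found")

    # Representativeness (proxy): every group in the intended context of use
    # (e.g. geography) should reach a minimum share of the dataset.
    shares = df[context_column].value_counts(normalize=True)
    for group, share in shares.items():
        if share < min_share:
            findings.append(
                f"Group '{group}' in '{context_column}' covers only {share:.1%} of records"
            )
    return findings

# Hypothetical usage:
# issues = check_training_data(loan_df, ["income", "credit_history"], context_column="country")
```

Checks of this kind do not replace a legal assessment, but they give the organization documented evidence that the data-quality criteria were actually verified before training.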
Properly assigning roles based on actual competencies is essential when implementing the AI Act. Experts and departments responsible for data protection may not be sufficiently equipped to assess whether input data meets the requirements of the AI Act (such as its relevance to the goals of the AI solution). Business owners, on the other hand, may need assistance in ensuring that personal data may lawfully be used for machine learning or in defining the proper format and structure of the data.
When implementing the AI Act, defining reasonable data governance rules and roles is not just about ticking another box on a checklist. It can also protect us against unpleasant surprises such as consumer discrimination caused by biased results, or flawed business decisions based on incorrect assumptions about the logic actually embedded in the AI models behind the solution. Seen in this wider context, data governance looks less like a burdensome regulatory obligation and more like a business benefit, doesn't it?
Integrating processes and clearly dividing roles makes it possible to leverage competencies and create smooth approval flows for new projects. This is why an increasing number of businesses see the AI Act not only as a regulatory brake but also as a genuine chance to rethink processes and make digital transformation a real accelerator of business innovation and successful AI deployments.
Our experience from completed projects shows that the AI Act and AI Governance are inherently connected, and their relationship is based on four fundamental principles:
1. Without effective AI Governance, compliance will only be on paper.
AI Governance is crucial for ensuring that compliance with regulations and standards works in practice, not just in documentation.
Even though the requirements often seem excessive, establishing AI Governance remains an essential step.
2. Good AI governance does not come without cost...
At times it demands extra effort that may seem unnecessary, because regulations are often designed with the most complex scenarios in mind. These efforts are nevertheless necessary for compliance.
3. ...and requires involvement from the entire organization...
Implementing effective AI governance requires the involvement of the entire organization, not just a single function. AI-related issues cut across different units and processes, so the organization needs to handle this area in a consistent way.
4. ...but unlocks innovation and speeds up AI implementation in everyday business
Good AI governance enables risks to be identified and brought under control quickly. It improves organizational efficiency by eliminating unnecessary delays in assigning responsibility for risks.