How Legal Teams Can Master DORA, the Data Act, and the AI Act in 2025
May 2025
By Carolyn McNally

The digital regulatory landscape in Europe is undergoing its most significant transformation in years. As companies race to harness the power of data, AI, and digital operations, the EU has responded with three groundbreaking regulations: the Digital Operational Resilience Act (DORA), the Data Act, and the AI Act. Each presents unique compliance challenges, yet together they form a comprehensive framework that will define how organizations operate in the digital space for years to come.
During a recent Axiom webinar, EU Data Regulations Unpacked: Staying Ahead of the Digital Curve, Axiom lawyer Jenny Hartmann joined Alice Flacco, General Counsel, Microport Scientific Corporation; Richard Magnan, General Counsel, Rising Tide; and Colin Levy, General Counsel, Malbek and Adjunct Professor at Albany Law School, to break down these complex regulations and offer actionable strategies for in-house teams. The message was clear: proactive preparation isn't just recommended. It's essential for avoiding disruption and maintaining a competitive advantage in the European market.
The Regulatory Landscape: Understanding the EU's Digital Regulation Trio
The EU has undertaken an ambitious agenda to regulate the digital economy through a comprehensive framework of interconnected regulations, creating a new paradigm for data governance, AI development, and digital resilience. But these regulations represent more than just compliance hurdles; they're reshaping how companies approach technology development and data management from the ground up.
One of the most significant challenges facing organizations is the inherent tension between rapidly evolving technology and the deliberately slower pace of regulatory development. In other words, tech moves very fast, but legislation doesn’t.
This creates a fundamental dilemma: how can regulations remain relevant when the technology they govern is constantly changing? Legal leaders must treat new legislation as scaffolding rather than a cage, establishing principles and risk frameworks that can adapt to technological evolution without stifling innovation.
DORA: Strengthening Operational Resilience in the Digital Era
The Digital Operational Resilience Act (DORA) marks a turning point in how financial entities manage digital risk. As of January 17, 2025, banks, insurers, investment firms, and other institutions must meet harmonized EU standards designed to ensure they can withstand and recover from information and communication technology (ICT) disruptions, whether from cyberattacks, system outages, or third-party failures.
What makes DORA particularly notable is its broad scope: it applies across 20 types of financial entities and extends to the ICT service providers that serve them. This shift forces organizations to rethink their operational risk frameworks, supplier contracts, and incident response protocols in light of a more integrated, regulated approach to digital resilience.
💡 Find Axiom lawyers to navigate complex regulations.
The Data Act: Redefining Data Sharing and Access
The Data Act, a comprehensive initiative to ensure fair access to and use of data, introduces what Alice described as "a new principle of data sharing by design, even with competitors and partners." This represents a paradigm shift from the protective stance many companies have traditionally taken with their data assets.
As Alice pointed out, it creates "contractual implications we haven't faced before under GDPR, especially with industrial data where IP and trade secret concerns collide with mandated access." Companies will need to revisit their data licensing agreements, review intellectual property protections, and develop frameworks for compliant data sharing.
💡 Watch the full conversation on demand.
The AI Act: Navigating Europe's First Comprehensive AI Regulation
The AI Act is a landmark piece of legislation and is also the world's first comprehensive legal framework for artificial intelligence. At its core, it classifies AI systems by risk and applies a regulatory regime accordingly. This tiered approach creates proportionate obligations based on the potential harm an AI system could cause.
From Jenny's perspective as a privacy lawyer working with clients across multiple sectors, this risk-based framework offers both challenges and opportunities. Companies must now conduct thorough assessments of their AI applications to determine which risk category they fall into, from minimal risk systems with light requirements, to prohibited applications that cannot be deployed in the EU market.
"From a general counsel perspective," Alice shared, "the real challenge is mapping our AI inventory to this risk framework, especially when the algorithms' decision logic is not as transparent, even to our development teams," signaling a fundamental shift in legal's role.
One of the most pressing issues facing legal teams is the AI Act's extensive documentation requirements, which present a significant compliance challenge. Technical teams will need to adapt to comprehensive documentation practices, and legal teams will need to ensure these new processes become embedded in the development lifecycle. In-house counsel can no longer remain passive interpreters of regulation; they must become active investigators and translators between technical teams and compliance requirements.
Building Integrated Compliance Frameworks
In this new regulatory environment, the role of in-house legal teams is changing. Legal professionals will need to develop a deeper understanding of how their organizations use artificial intelligence and data, including technological details that would previously have been considered outside of legal's domain. This shift represents both a challenge and an opportunity for legal departments, allowing legal to position itself not just as a compliance function but as a strategic partner in technology decisions.
A critical aspect of the new regulations is the fundamental tension that legal teams will need to help resolve. How do you explain systems whose functioning may not be fully transparent even to their creators?
A recent European Court of Justice decision required a company to explain in understandable terms how its financial scoring algorithm worked after a consumer was denied a mobile phone contract based on their score. While this case involved a traditional algorithm rather than AI, it demonstrates regulators' expectations for transparency in automated decision-making.
Being able to explain what you're doing with consumers' data is going to become a unique selling proposition, and people will be even more aware of how their data is used in the future. Organizations that can balance innovation with transparency will find themselves at a competitive advantage. Focusing on liability frameworks that allow innovation to continue while providing strong protections when harm occurs is imperative. Legal leaders can do this by:
- Building in-house playbooks with room for evolution
- Reviewing and re-tagging AI use cases quarterly
- Creating audit trails for AI lifecycle decisions
- Establishing escalation paths for unexplainable outcomes
Updating Vendor Management and Contracting Strategies
As regulatory frameworks like the Data Act and AI Act evolve, vendor management strategies must keep pace. Even if your vendors operate outside the EU, your exposure does not. When offering services within the EU, data-sharing clauses must meet new standards, particularly around algorithmic explainability, data access, and portability. This means it's time to revise contract templates to include clear triggers: who has access to what data, in what format, how quickly, and at whose cost? Provisions around reverse engineering, model documentation, and decision thresholds are no longer optional. They’re core to due diligence.
This isn’t about becoming an engineer. It’s about asking smarter questions, building internal alignment with IT teams, and knowing when a vendor is offering a “black box” rather than a transparent product. Technical organizational measures, like encryption standards or drift monitoring, should be reviewed with those who know what "state of the art" really means. Building internal processes to engage IT early may feel like heavy lifting, but it prevents costly blind spots later.
For more mature or high-stakes vendors, particularly those in niche services, fallback strategies are key. This includes service-level agreements, insurance requirements, and where possible, indemnification. Not every vendor will have the muscle to offer robust contractual protections, especially smaller ones. In those cases, risk needs to be contractually mitigated and carefully documented. Consider introducing a scoring framework for AI vendors based on transparency, explainability, and compliance maturity. If a vendor can’t explain their system clearly, it’s a red flag.
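For legal teams working with technical colleagues, a scoring framework like this can be made concrete in a few lines. The sketch below is purely illustrative: the criteria names follow the discussion above, but the weights, rating scale, and thresholds are assumptions a team would calibrate for itself, not anything prescribed by the Data Act or AI Act.

```python
# Hypothetical AI vendor scoring rubric. The three criteria come from the
# framework described above; the weights and thresholds are illustrative
# assumptions, not regulatory requirements.
CRITERIA = {
    "transparency": 0.40,         # can the vendor explain the system's decision logic?
    "explainability": 0.35,       # is model documentation available, with decision thresholds?
    "compliance_maturity": 0.25,  # audit trails, certifications, incident track record
}

def score_vendor(ratings: dict) -> float:
    """Weighted average of 1-5 ratings, one per criterion."""
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

def triage(ratings: dict, threshold: float = 3.0) -> str:
    """Map a vendor's ratings to a procurement recommendation."""
    # A vendor that cannot explain its system clearly is a red flag
    # regardless of how well it scores elsewhere.
    if ratings["transparency"] <= 2 or score_vendor(ratings) < threshold:
        return "red flag: escalate to legal before contracting"
    return "acceptable: document the rationale and re-score at renewal"
```

A vendor flagged by a rubric like this would then trigger the fallback strategies described above, with the score and rationale kept on file as part of the documentation trail.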
Ultimately, success hinges on institutionalizing this approach. Train procurement to spot compliance red flags just as easily as pricing ones, establish internal escalation protocols, and above all, document everything. In AI and data governance, if it isn’t written down, it didn’t happen. And if it didn’t happen, you’ll be the one left explaining why.
Practical Implementation Steps for Legal Teams
For legal teams wondering how to operationalize these regulatory requirements in practice, the panel discussion offered several concrete strategies:
- Create a cross-functional AI governance committee that includes legal, IT, data science, and business representatives to ensure holistic oversight.
- Develop and maintain an AI inventory that classifies systems according to the AI Act's risk framework and documents their intended use, training data, and potential impacts.
- Establish documentation protocols that can be integrated into the development lifecycle to address the cultural challenge of resistance to documentation.
- Implement regular review cycles for AI applications, especially those that continue to learn and evolve after deployment.
- Design user education initiatives to ensure that end users understand the capabilities and limitations of AI tools.
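An AI inventory is easiest to keep current when it is machine-readable. The sketch below shows one way that could look: the four risk tiers are the AI Act's own, but the record schema, field names, and 90-day review window are hypothetical choices a team would adapt to its own governance process.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# The four risk tiers come from the AI Act itself; everything else in
# this schema is an illustrative assumption.
RISK_TIERS = ("minimal", "limited", "high", "prohibited")

@dataclass
class AISystemRecord:
    name: str
    intended_use: str
    risk_tier: str                        # one of RISK_TIERS
    training_data_summary: str
    potential_impacts: list = field(default_factory=list)
    last_reviewed: Optional[date] = None  # supports the regular review cycle

    def __post_init__(self):
        # Reject classifications that don't map to the AI Act's tiers.
        if self.risk_tier not in RISK_TIERS:
            raise ValueError(f"unknown risk tier: {self.risk_tier!r}")

def needs_review(record: AISystemRecord, today: date, max_days: int = 90) -> bool:
    """Flag systems that are overdue for the periodic re-classification pass."""
    if record.last_reviewed is None:
        return True
    return (today - record.last_reviewed).days > max_days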
As in-house counsel navigate this complex regulatory environment, one thing becomes increasingly clear: preparation today will determine success tomorrow. The EU's regulatory trio of DORA, the Data Act, and the AI Act represents not just a compliance challenge but an opportunity to build stronger, more resilient digital operations that inspire trust within organizations.
The organizations that approach these regulations strategically, with cross-functional collaboration and forward-thinking governance, will find themselves at a competitive advantage. Those that wait will inevitably face rushed implementation, potential penalties, and business disruption.
Axiom's lawyers are helping clients transform regulatory compliance from a reactive exercise into a strategic advantage. Their specialized legal teams work alongside in-house counsel to develop tailored approaches that align with business objectives while meeting the demands of this evolving landscape.
If your organization needs support navigating EU regulations or developing compliance frameworks that adapt to future changes, Axiom's specialized privacy and technology lawyers can help. With deep experience across industries and jurisdictions, Axiom provides the flexible legal talent you need to stay ahead of regulatory deadlines without expanding your permanent headcount.
For legal professionals passionate about data privacy and technology regulation, Axiom offers unique opportunities to work with leading global companies on their most pressing compliance challenges. Join our network of lawyers helping clients navigate the digital future with confidence.
Posted by
Carolyn McNally
Carolyn McNally is a seasoned communications professional, leveraging a passion for precision and creativity in public relations for the world's leading provider of on-demand legal talent.