Belgian guidelines for lawyers on the use of AI and what we can expect in the Netherlands
On January 20, 2025, the Flemish Bar Association (Orde van Vlaamse Balies) published guidelines in Belgium on the use of artificial intelligence (AI) by lawyers. These guidelines clarify how lawyers may use AI within their practice and which responsibilities come with it. What exactly do the Belgian guidelines entail? And how can Dutch lawyers prepare for future guidelines in our own country?
AI use: permitted, but with clear responsibilities
In Belgium, the use of AI by lawyers is neither mandatory nor prohibited. Lawyers are free to use AI tools, provided they understand how they work and maintain control over the results.
This means, among other things, that:
- The lawyer must have basic knowledge of AI and Large Language Models (LLMs).
- The lawyer must carefully read the terms of use of AI tools, paying attention to matters such as data processing and the nature of the AI system (open or closed).
- The lawyer must always check the AI output, including the cited sources.
Confidentiality and data protection: key points in AI use
Belgian lawyers must ensure that confidential information (covered by professional secrecy) does not end up in just any AI system.
This translates into the following concrete guidelines:
- The lawyer pseudonymizes personal data before it is processed by AI tools.
- Personal data may only be processed by AI when essential, provided the lawyer is transparent and requests consent from the person concerned.
- No confidential data (covered by professional secrecy) may be entered into AI systems unless the lawyer is absolutely certain that they are using the AI system in a closed environment with sufficient safeguards.
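The pseudonymization step in the first bullet can be sketched in code. The following is a minimal, illustrative example only; the token scheme, the regular expression, and the function names are assumptions of this sketch and not part of the Belgian guidelines. In practice a vetted pseudonymization tool should be used.

```python
import re


def pseudonymize(text: str, known_names: list[str]) -> tuple[str, dict[str, str]]:
    """Replace known names and e-mail addresses with neutral tokens.

    Returns the pseudonymized text plus a mapping so the original
    values can be restored after the AI output comes back.
    """
    mapping: dict[str, str] = {}

    # Replace each known client/party name with a numbered token.
    for i, name in enumerate(known_names, start=1):
        if name in text:
            token = f"[PERSON_{i}]"
            mapping[token] = name
            text = text.replace(name, token)

    # Replace e-mail addresses (simple illustrative pattern, not exhaustive).
    emails = set(re.findall(r"[\w.+-]+@[\w-]+\.[A-Za-z]{2,}", text))
    for i, email in enumerate(emails, start=1):
        token = f"[EMAIL_{i}]"
        mapping[token] = email
        text = text.replace(email, token)

    return text, mapping


def restore(text: str, mapping: dict[str, str]) -> str:
    """Put the original values back into the AI output."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text
```

The lawyer would run `pseudonymize` on a document before submitting it to an AI tool, and `restore` on the tool's output, so that the personal data itself never leaves the firm's environment.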
Liability: the lawyer remains ultimately responsible
AI may provide support, but the lawyer remains fully liable for the use of AI-generated output.
No mandatory transparency regarding AI use
A striking feature of the Belgian guidelines is that lawyers are not required to inform clients about their use of AI. This means that, as with other IT tools, a lawyer does not have to mention that AI was used when performing a legal analysis or drafting an agreement.
What does this mean for the Netherlands?
Although the Netherlands Bar (Nederlandse orde van advocaten) has not yet issued comparable guidelines, lawyers can already prepare for their arrival:
- Immerse yourself in AI and its legal applications – ensure you understand how AI works and what its limitations are. Note that since this month (February 2025), AI literacy has been a legal requirement under the AI Act: Article 4 obliges providers of AI systems and organizations deploying them to ensure that all employees working with AI are AI-literate.
- Always check the output of AI tools – do not blindly trust the generated text and verify sources.
- Ensure that confidential data does not end up in AI systems – use only anonymized data and do not use just any (new) tool.
- Remain responsible – AI is a tool, and the lawyer carries final responsibility; ensure that clients do not become dependent on automated advice.
Many thanks to Jos van der Wijst for contributing to this blog!