JournoTECH Convenes Legal Professionals to Tackle the Future of AI Governance and Security

London, United Kingdom — Legal professionals from several countries recently came together for specialised training designed to help lawyers better understand the risks and responsibilities that come with using artificial intelligence in legal practice.
The two-day online programme, “Interrogating AI in Legal Practice: Security, Privacy, and Accountability for Legal Professionals,” took place on March 3–4, 2026. It was organised by JournoTECH with funding support from SPRITE+, a consortium of five UK universities led by the University of Manchester.
Thirty-four legal professionals were selected to take part in the training. They came from different backgrounds and levels of experience, including lawyers with more than 20 years in practice as well as mid-career practitioners. The participants included senior advocates, human rights lawyers, and professionals working in both private and government institutions in Nigeria, Ghana, Sierra Leone, and Zambia.
Many participants said the training helped address an important gap in AI knowledge within the legal profession. They explained that most AI training programmes available today are very general and often do not focus on the ethical duties, legal procedures, and confidentiality issues that lawyers deal with every day.
Opening the programme, Elfredah Kevin Alerechi, founder of JournoTECH, focused on connecting AI theory with real-life legal work. She introduced participants to specialised tools that can support legal workflows while emphasising that simply knowing about the technology is not enough.
Alerechi also spoke about some of the risks lawyers must consider when using AI systems, including bias, misinformation, and inaccurate results. She showed how platforms such as NewsAssist AI and Google NotebookLM can help legal professionals with tasks like transcribing depositions and analysing legal documents. At the same time, she reminded participants that technology should always be used carefully and with professional judgement.
Offering a wider perspective on responsibility in AI systems, Lizzie Coles-Kemp, Head of Information Security at Royal Holloway, University of London, spoke about what she described as a “sociotechnical” approach to responsibility. She explained that AI systems can sometimes create “responsibility gaps” when people rely too much on automated tools to make decisions.
According to Coles-Kemp, good AI governance is not only about deciding who is to blame when something goes wrong. It is also about building shared values, clear accountability, and transparency within organisations that use the technology.
The training also looked at the technical side of protecting sensitive legal information. Rebecca Bird, founder of BixBe Tech, cautioned lawyers against assuming that paid AI tools are automatically secure. She encouraged participants to treat every AI platform like a “stranger in the room” and to carefully check how their data is handled before sharing confidential information.
Meanwhile, Soribel Feliz, CEO of Personal Algorithms LLC, spoke about the growing problem of “Shadow AI.” This happens when employees use AI tools at work without approval from their organisation. She said law firms need clear policies and governance systems to ensure AI is used safely and responsibly.
Participants described the training as a valuable opportunity to receive practical guidance designed specifically for the legal profession. The sessions went beyond simple demonstrations of AI tools, exploring deeper issues such as professional accountability, client confidentiality, and data privacy across the full data lifecycle.
As the programme ended, organisers reminded participants that while AI can help automate many time-consuming tasks, the final responsibility for legal decisions always rests with the lawyer.
Participants said the training equipped them with new strategies and frameworks to guide their firms through the growing digital transformation of legal practice while maintaining strong ethical standards and public trust.