In this insightful blog post, CEO Eliza May Austin sits down with legal expert Ryan Lisk to delve into the complex and rapidly evolving intersection of AI and Data Protection.
Ryan Lisk is the visionary founder of Hybrid Legal, recognised for fundamentally redefining the legal industry. A true trailblazer, he has successfully integrated cutting-edge technology with unparalleled business acumen to create a modern, efficient, and cost-effective legal service model.
Under his leadership, Hybrid Legal has become a strategic partner to ambitious businesses, driven by a deep commitment to client success and a philosophy that focuses on enabling commercial growth. Ryan Lisk is an architect of modern business success and a leading voice in future-proofing legal services.
AI and Data Protection: A Storm of AI, Surveillance, and Lost Privacy
Below, Eliza and Ryan discuss everything from the unseen movement of personal data in large language models to the challenges emerging technologies like deepfakes and biometrics pose to UK law, concluding with a sharp debate on government demands for backdoors into encrypted consumer devices.

Eliza May Austin
What’s the thing related to data protection law that people should be talking about, but aren't?

Ryan Lisk
Where does their personal data go when entered into a third-party large language model like ChatGPT, Gemini, or Claude? That is the thing people should be concerned about.
Data Protection Advice for Small and Medium-Sized Businesses

Eliza May Austin
If you could give only one piece of advice regarding data protection to UK small to medium-sized businesses, what would that be?

Ryan Lisk
Treat personal data with the same seriousness as a new employee on their first day. Personal data is a gift, and it is more valuable than oil now. Be clear on the law and what you can and cannot do with it. I really like to relate it to onboarding. We do it with an employee but often do not do the same due diligence with a new tool or product.

Eliza May Austin
How do you see emerging technologies like AI, facial recognition, or biometrics challenging the limits of UK data protection law?

Ryan Lisk
I think the Information Commissioner’s Office (ICO) has its work cut out because there are over 40,000 different AI systems, some of which use biometric data and facial recognition. Yes, you can do some cool things, but that is someone’s face being scraped, processed, and used. The Deutsche Telekom campaign on deepfakes is worth looking at. It makes you question whether we should trust these companies without many guardrails to hold them to account.
We have legislation today, but compliance takes time. Some businesses gamble, thinking no one will check what they are doing with the data they are scraping. The cost of that gamble is not only financial penalties but also reputational. For me, the moral piece stands at the heart of it. If you are taking that level of data about somebody, you need a very good reason, a robust cyber security infrastructure to protect it, and clarity with people on what you are doing, including their right to remove it.
The Future of AI and Data Protection in a Strained UK Justice System

Eliza May Austin
The UK judicial system is archaic and burdened by an enormous backlog. Given the problem with actual person-to-person crimes, when it comes to deepfakes involving digital sex crimes, where do you see that going in terms of an incentive to fix it?

Ryan Lisk
It is a great question that all of us should be thinking about. This needs to be part of our education systems to raise awareness of how easy it is to put data out there, and how easy it is for it to be manipulated in a predatory sense.
Deepfake technology is used not only in a sexual context but also in a political one, contributing to a lot of social instability. On the sexual predatory front, I think that is as bad as it gets. The judicial system here in the UK is a mess. It is not a 'vote spinner' like the NHS, so it does not get the prioritisation it should. The legal system is slow-moving, especially when AI speeds up the pace of these digital crimes.
The legal deterrents are there, but the volume is huge. For businesses, the deterrent is not only the reputational issue but also the availability of private prosecution for victims. Victims need the financial means for this, but with GoFundMe and charities, it is an option. People are sympathetic to these crimes and want to help, but there is so much of it.

Eliza May Austin
A growing trend for attackers is using deepfakes to call or video an employee, pretending to be a senior leader to get someone to do something. It is hard to combat. Do you have any ideas on how to avoid falling victim to this? For example, code words or hand signals?

Ryan Lisk
The risk of this has gone up since remote work became common. When you spend less time in the same environment with a colleague, you do not know their quirks and nuances, making it easier to be fooled by an impersonator.
I think your suggestion of having code words or not being afraid to challenge is right. It is about having team conversations, sharing the problem, and being aware that this is a real risk.
I recall a near miss where the CEO was spoofed, and an email was sent to a junior colleague asking them to urgently buy an Amazon gift card for a client. She believed the email but followed the procedure of checking with the finance manager about reimbursement. That process is what caught the fraud. The finance manager probed and confirmed with the CEO that no such request had been made.
I think looking at what is being asked, as usually it is something unusual, and enforcing proper processes and procedures can help catch these things.

Eliza May Austin
Quirky internal processes that are part of the culture but not documented could actually help here. Maybe the further we go down the AI attack route, the more we have to get back to humanity in order to solve the problem.
Government Demands for Encrypted Access: What It Means for UK Data Protection and Privacy

Eliza May Austin
Final question: At the time of this interview, news came out that the UK government is applying new pressure to Apple by demanding a backdoor into the tech company’s cloud storage service, targeting British users only. What are your thoughts on this?

Ryan Lisk
My initial thought is, "If you have got nothing to hide, why worry?" Can it speed up the criminal justice system by improving evidence for successful convictions? If it can, that sounds good.
However, there is a drawback regarding the rights and safeguards of the individual. I think there needs to be strong checks and balances before the government is allowed anywhere near that backdoor. It all lies on the 'what' and the 'why.'

Eliza May Austin
I am very much against it. I think it weakens security for everybody. A backdoor is essentially them wanting malware-like root access to something that we are all largely dependent on now. If Apple is the start, it will eventually be Android and every other platform.
Given that large corporations and governments have evidenced that they do not have cybersecurity completely nailed, handing over all our encryption keys and private conversations is incredibly dangerous. The amount of control is terrifying. We were sold CCTV because it "makes us safer," but crimes still happen, and no one comes to the victim's aid. This is like digital CCTV, watching everything everyone does under the guise of security, which I believe is not their primary goal. The whole thing concerns me deeply.

Ryan Lisk
I am in agreement with your concerns. That is worrying. The government is potentially removing the right for individuals to make their own decision about data privacy, not only wanting access to where you are, your spending, private conversations, and pictures, but wanting to piece it all together.
What is driving that decision? Other than improving evidence for heinous crimes, what are their reasons? It sounds like a step towards censorship, which is a dangerous slippery slope. It goes against human rights and current data protection legislation.
Apple has said, under no circumstances, absolutely not. They have been very firm, as they would lose so much money to Android. I think all consumer electronics brands need to hold the line on this. I can see this government passing legislation to allow this, but let us hope they do not. Having heard your points, while I initially thought about the evidence-gathering tool aspect, I agree that it has further-reaching consequences.
This conversation with Ryan provided critical insights into the present and future challenges of AI and data protection. From the need for small businesses to treat personal data with the same diligence as onboarding a new employee, to the moral and legal quagmire posed by deepfakes and the UK government’s push for a backdoor into encrypted services, Ryan offered candid and thought-provoking analysis. We appreciate his openness in discussing these complex and often controversial topics.
You can connect with Ryan on LinkedIn.
Worried about AI threats against your organisation?
Don’t wait for a crisis; prepare now by finding out more about our DFIR retainer, which guarantees expert help during a major incident.
Alternatively, if you want to test your current preparedness, discover our custom-crafted, immersive tabletop exercises, which will give you a solid picture of what your existing people, processes, and technology are capable of doing during a major cyber crisis. Why not reach out today to chat with a member of the team and secure your peace of mind?
