Hong Kong: Privacy Storm Brewing Over Student’s AI Project
An award-winning AI project by a secondary school student in Hong Kong has come under fire, not just over who really built it, but also over how private data may have been handled in the process.
The project, Medisafe, was celebrated for using AI to catch prescription errors by analysing patient allergies, medication records, and clinical conditions. It even bagged top honours at both local and international innovation awards. But now a full investigation is underway into allegations that the project was actually developed by a US-based company, raising serious concerns about privacy, consent, and data ethics.
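To make the privacy stakes concrete, here is a purely illustrative sketch of the kind of allergy cross-check the article describes. This is not Medisafe's actual code; the drug table, function names, and sample data below are all invented.

```python
# Illustrative only: a toy rule-based cross-check between a prescription
# and a patient's recorded allergies. Hypothetical data, not real records.

# Hypothetical mapping from drug names to allergy classes they can trigger.
DRUG_ALLERGY_CLASSES = {
    "amoxicillin": {"penicillin"},
    "ibuprofen": {"nsaid"},
    "aspirin": {"nsaid", "salicylate"},
}

def check_prescription(prescribed_drugs, patient_allergies):
    """Return a list of (drug, allergy) pairs that look unsafe."""
    allergies = {a.lower() for a in patient_allergies}
    alerts = []
    for drug in prescribed_drugs:
        classes = DRUG_ALLERGY_CLASSES.get(drug.lower(), set())
        for allergy in sorted(classes & allergies):
            alerts.append((drug, allergy))
    return alerts

alerts = check_prescription(["Amoxicillin", "Ibuprofen"], ["Penicillin"])
print(alerts)  # [('Amoxicillin', 'penicillin')]
```

Even a toy like this only works if it is fed real allergy and medication records, which is exactly why the questions below about consent and data handling matter.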
This is a real-life privacy lesson, not just a school scandal
We often treat student tech projects as harmless learning exercises. But this case shows just how real the stakes can be.
When medical data or any kind of personal data is used without proper safeguards, people’s trust, privacy, and even safety are at risk. This isn’t theoretical. This happened, and it’s unfolding right now in Hong Kong.
It’s a wake-up call for students, educators, and innovation organisers across Asia: even with the best intentions, using real-world data without proper understanding of privacy obligations can land you and others in serious trouble.
What are the privacy concerns here?
Let’s break it down.
1. Was consent obtained?
If real patient data was used in the project, the team needed clear consent from every person involved. That’s not just good practice under Hong Kong’s Personal Data (Privacy) Ordinance (PDPO); it’s the law.
2. Was the data sent overseas?
The US company linked to the project may have hosted or developed the AI tool. That brings up concerns about cross-border data transfers, which need legal safeguards to protect individuals' privacy rights.
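One common technical safeguard before any data crosses a border is pseudonymisation: stripping direct identifiers and replacing record keys with keyed hashes. The sketch below is illustrative only, an assumption about good practice rather than a claim about what this project did; the field names and secret key are hypothetical.

```python
# Illustrative sketch: pseudonymising patient identifiers before any data
# leaves the jurisdiction. Hypothetical fields and key, not real data.
import hashlib
import hmac

SECRET_KEY = b"keep-this-key-onshore"  # hypothetical; must never travel with the data

def pseudonymise(record):
    """Replace the direct identifier with a keyed hash and drop the name."""
    out = dict(record)
    token = hmac.new(SECRET_KEY, record["patient_id"].encode(), hashlib.sha256)
    out["patient_id"] = token.hexdigest()[:16]
    out.pop("name", None)  # drop direct identifiers outright
    return out

record = {"patient_id": "HK-12345", "name": "Chan Tai Man", "allergy": "penicillin"}
safe = pseudonymise(record)
print("name" in safe, safe["allergy"])  # False penicillin
```

Note that keyed hashing is pseudonymisation, not anonymisation: whoever holds the key can re-identify records, so it reduces risk rather than removing the legal obligations around a transfer.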
3. Who was accountable for data protection?
If a student used third-party tools or developers, were there any privacy contracts in place? Was anyone reviewing how the data was handled, or where it ended up?
4. Was there transparency?
The project’s website apparently redirected to a US AI firm’s site. That undermines trust and raises questions about whether stakeholders (including judges and users) were told the full story.
Why this matters for the next generation
Today’s students are tomorrow’s innovators, founders, and data scientists. Many are already building apps and AI tools with real-world applications. That’s exciting but also risky if they don’t have the right guidance on privacy, ethics, and legal obligations.
We need to instil a mindset of “privacy-first” innovation.
Just because you can access or simulate personal data doesn’t mean you should.
Just because a tool is available online doesn’t mean it’s safe or legal to use with live data.
Just because you’re a student doesn’t mean the rules don’t apply.
This is a real case — and it should prompt every aspiring coder, hacker, and innovator to ask themselves:
"If this were someone else’s private health record, would I feel okay using it in a project?"
How to move forward: Building a culture of responsible innovation
We’re not here to discourage creativity. Far from it. But innovation must come with accountability.
Let’s encourage:
Privacy education in every classroom that touches AI, app development or medical tech
Stronger privacy review in student competitions — especially those involving personal or medical data
Disclosure of all third-party developers, cloud tools, or data sources used in student submissions
Data Protection Impact Assessments (DPIAs) — even simplified ones for school projects
A new culture of “privacy by design” among young creators
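A simplified DPIA for a school project doesn’t need to be paperwork-heavy; it can be as small as a fixed list of yes/no questions a reviewer walks through. The sketch below is a hypothetical starting point, not an official PDPO checklist, with questions drawn from the concerns raised above.

```python
# Illustrative only: a simplified DPIA-style checklist for a student project,
# encoded as data so competition organisers could review submissions
# consistently. Hypothetical questions, not an official PDPO checklist.
CHECKLIST = [
    ("consent", "Was informed consent obtained from every data subject?"),
    ("minimised", "Is only the minimum personal data needed being collected?"),
    ("transfer", "Does any data leave Hong Kong, and if so, with safeguards?"),
    ("third_parties", "Are all third-party tools and developers disclosed?"),
    ("retention", "Will the data be deleted when the project ends?"),
]

def review(answers):
    """Return the checklist questions answered 'no' or left unanswered."""
    return [question for key, question in CHECKLIST if not answers.get(key, False)]

gaps = review({"consent": True, "minimised": True})
print(len(gaps))  # 3 items still need attention
```

Encoding the checklist as data rather than prose makes it easy for organisers to require the same answers from every submission and to flag gaps automatically.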
The bottom line
This isn’t just a story about a Hong Kong student; it’s a story about the future of innovation in Asia.
If we want to empower our youth to build the next big thing, we also need to empower them to protect people’s data while they do it. Because at the end of the day, data privacy isn’t a side topic; it’s a shared responsibility.
Meta Connects (Asia) Pty Ltd is proud to support privacy education, awareness, and accountability across Asia. Stay tuned for more updates in our Privacy News in Asia series.
Source Reference: This article references details from:
"HK student’s award-winning AI software under scrutiny for academic integrity violations"
The Standard, 17 June 2025
Available at: https://www.thestandard.com.hk/hong-kong-news/article/304970/