Vibe coding security
An AI-built EdTech app exposed 4,538 UC Berkeley and UC Davis student accounts. The marketplace closed the ticket without a response.
18,697 user records leaked. Minors likely on the platform. The founder built the app with good intentions and shipped before they knew what row-level security was. This is one answer to the question of whether vibe coding security risk is theoretical.
An EdTech founder built a learning platform using AI build tools. The product solved a real problem, attracted real users, and grew quickly enough that students from at least two major universities signed up: 4,538 accounts from UC Berkeley and UC Davis between them, 18,697 user records in total. Some of those accounts almost certainly belonged to minors, since the platform served undergraduates and the distance between "undergraduate" and "seventeen-year-old early enrollee" is short.
The records were not behind a paywall. They were not behind a login. They were sitting in a Supabase database with row-level security disabled, accessible to anyone who guessed the URL of the API. A security researcher found the exposure, filed a ticket with the marketplace where the founder had built the app, and waited. The ticket was closed without a response. The data remained exposed.
I want to be careful here. The founder did not set out to expose student data. They are not a careless person. They built a product they believed in, with tools that promised they did not need to be a security expert to ship it. The exposure happened because the tools delivered on the first promise (you do not need to be a developer) and silently failed to deliver on the second (you also do not need to be a security expert). Both promises were necessary. Only the first one was kept.
Why student data is uniquely high-stakes
Most data exposures are bad. Student data exposures are categorically worse, for three reasons that compound.
The first is regulatory. The Family Educational Rights and Privacy Act (FERPA) governs disclosure of education records. While FERPA primarily applies to institutions receiving federal funding, the surrounding tort and contract law applies to anyone collecting student data. A platform that exposes student identifiers, grades, course enrollments, or any of the other typical EdTech fields can be on the hook for damages, especially if those students were the customer side of the platform, not the user side.
The second is the minor exposure. The line between "we serve undergraduates, so everyone is 18 or older" and "some of our users are 17-year-old early enrollees" is blurry on university platforms. The Children's Online Privacy Protection Act (COPPA) treats minors as a protected class with separate notice and consent requirements. A vibe-coded app rarely complies with COPPA by accident. The default is to not check age and to collect everything.
The third is the reputational compounding. Student data exposures are catnip for journalists and education trade press. A breach that would get a paragraph in TechCrunch for a generic SaaS app becomes a multi-source feature when the affected users are college students at named institutions.
The marketplace closed the ticket
The detail that escalates this from a generic vibe-coding-security story to a structural one is the marketplace response. A security researcher reported the exposure to the platform where the app was built and hosted. The ticket was closed without a substantive response. The data continued to leak.
This is not unique to one marketplace. It is the pattern across the AI build tool ecosystem. The marketplace's relationship is with the founder, not with the founder's users. When a researcher reports an exposure on someone else's app, the marketplace's incentive is to forward it to the founder and close the ticket. The founder, often a non-technical operator who built the app between other obligations, may not see the notification, may not understand it, or may not have the technical ability to fix it.
The implication for founders is uncomfortable. The platform you build on is not going to catch security issues for you, and is not going to act on reports about your app on your behalf. The security responsibility is yours. The platform's terms make this explicit, even if the marketing does not.
What would have caught this before launch
The exposure in this case was the same as Moltbook's: row-level security off on a table that contained user data. Five minutes of review by anyone who knew to check would have caught it. The check is mechanical.
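The mechanical fix is two statements in the Supabase SQL editor. A minimal sketch, assuming a table named `profiles` with a `user_id` column (both names are placeholders for your own schema):

```sql
-- With RLS off, Supabase's auto-generated REST API serves every row to
-- anyone holding the public anon key. Turning it on denies all access
-- by default.
alter table profiles enable row level security;

-- Then allow only what you mean to allow: each user reads their own row.
-- auth.uid() is Supabase's helper returning the authenticated user's id.
create policy "users read own profile"
  on profiles for select
  using (auth.uid() = user_id);
```

Note the order matters conceptually: enabling RLS with no policies locks everyone out, which is the safe failure mode; the policies then open access deliberately, one operation at a time.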
For an EdTech app specifically, there are three other questions that round out the pre-launch checklist.
Are you collecting age at signup? If yes, you need a path that handles minors differently. If no, you should be, because COPPA applies whenever you have a reason to know users are under 13, and any platform that touches K-12 has that reason.
Are student identifiers (university email, student ID, transcript data) stored in their own table with stricter access policies than the rest of the schema? Mixing sensitive student data with general profile data in one table means everything has to meet the strictest policy, which usually means everything is over- or under-protected.
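Splitting the schema this way is cheap to do up front. A hypothetical sketch (table and column names are illustrative, not from the app in this story), assuming a Supabase project where `auth.users` holds accounts:

```sql
-- General profile data: low sensitivity, broader read access is acceptable.
create table profiles (
  user_id uuid primary key references auth.users (id),
  display_name text
);

-- Student identifiers live in their own table with a stricter policy.
create table student_records (
  user_id uuid primary key references auth.users (id),
  university_email text,
  student_id text
);

alter table profiles enable row level security;
alter table student_records enable row level security;

-- Profiles: readable by any signed-in user (e.g. for an in-app directory).
create policy "profiles readable when signed in"
  on profiles for select to authenticated
  using (true);

-- Student records: readable only by their owner, never by other users.
create policy "student records owner only"
  on student_records for select to authenticated
  using (auth.uid() = user_id);
```

With one mixed table, the directory use case and the transcript data would be forced under a single policy; separated, each table gets the policy its sensitivity actually calls for.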
Do you have a published incident response policy and a way to be reached when researchers find issues? Most EdTech founders do not. The fastest version is a security.txt file at the root of your domain pointing at a security email you actually monitor. Researchers will not chase a founder who has not given them a way to reach the right person.
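The file format is standardized in RFC 9116, which puts the canonical copy at `/.well-known/security.txt` (a copy at the site root also works as a fallback). A minimal example, with a placeholder address:

```text
# /.well-known/security.txt
Contact: mailto:security@example.edu
Expires: 2026-12-31T23:59:59Z
Preferred-Languages: en
```

`Contact` and `Expires` are the two required fields; everything else is optional. The whole exercise takes ten minutes and turns "researcher gives up and the data stays exposed" into "researcher emails the right inbox."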
What to do if you ship EdTech
The cheapest move is to ensure the database is locked down before the first user signs up. Row-level security on, policies written, anonymous access disabled. Test with a fresh account and try to read another user's data. If it works, the gap is there.
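The same test can be run from outside the app entirely, against Supabase's auto-generated REST API. A sketch with placeholder values (`YOUR-PROJECT`, `YOUR_ANON_KEY`, and the `profiles` table name are all substitutions for your own project):

```shell
# Probe a table using only the public anon key, with no user session.
# With RLS on and sane policies, this returns an empty list or an error.
# If it returns other users' rows, the gap from this story is in your app.
curl "https://YOUR-PROJECT.supabase.co/rest/v1/profiles?select=*" \
  -H "apikey: YOUR_ANON_KEY" \
  -H "Authorization: Bearer YOUR_ANON_KEY"
```

This is roughly what a researcher, or anyone who finds the project URL, can do unauthenticated, which is why it belongs in the pre-launch checklist and not the post-incident one.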
The next move, before any meaningful user volume, is a written audit. For EdTech specifically, the audit should include FERPA exposure, COPPA exposure, the institutional notification path if a breach occurs, and the data-minimization question (are you collecting fields you do not need). Each of those is a category that can sink a launch independently.
At Kingbird Solutions, we run free 5-point diagnostics that catch the most common exposure patterns in any vibe-coded app, including EdTech. If we find something, the next step is either a written audit or a hardening sprint where we fix the gaps and hand you a clean codebase ready for launch. The earlier you run it, the cheaper it is. The cheapest moment is before the first user signs up. The next cheapest is now.
The EdTech founder in this story did not have a free diagnostic before they launched. They had a working app and good intentions. Both are necessary. Neither is sufficient.
If this helped
You can put this thinking to work directly. Run the diagnostic on a stuck product, or book a 30-minute call to talk through your situation.