The Student Privacy Pledge, a voluntary promise to protect student data, has come to an end.
The pledge was started to convince edtech companies to adopt transparency standards for working with K-12 schools. It's an artifact of the early days of the edtech industry, when many states had yet to create laws around how these companies handle data.
Now that states are governing this area more thoroughly, the nonprofit behind the pledge recently "retired" it, according to a note on the website, which nodded toward the "changing technological and policy landscape." An archive of the companies that signed the pledge will remain up through July 31 of this year.
But this development may not be a victory for student privacy, and there may be bigger challenges ahead, especially with the rise of artificial intelligence. Some experts warn that students' privacy rights are in peril as "digital authoritarianism" rises and law enforcement uses AI-boosted surveillance to track students, a concern they say state rules do not adequately address.
A New Era for Privacy?
The pledge was an example of self-regulation, arising when the edtech industry felt pressure to safeguard student data but before AI took up so much bandwidth.
When it was created, many states didn't have laws specifically detailing how companies should handle student data, although federal rules regarding privacy were already on the books, including the Family Educational Rights and Privacy Act and the Children's Online Privacy Protection Act.
The pledge to shield student data was relatively popular in the industry. It started in 2014 as an effort by the Future of Privacy Forum and the Software & Information Industry Association. In all, more than 400 companies signed on.
In 2020, the pledge was updated with a condition that signing companies build privacy and security into the design of their products. It also expanded the kinds of data companies were vowing to protect and added guidelines on how to live up to its commitments.
But the pledge has now served its purpose, says John Verdi, senior vice president for policy at the Future of Privacy Forum.
School districts navigate students' data privacy in their rules and vendor contracts, but also in state and federal laws, which establish guardrails for what rules must be followed and provide avenues for civil rights investigations.
In the decade since the pledge came about, state laws have come to ask more of companies than the principles it set out, the retirement notice argued. At least 40 states have passed such laws to date, requiring companies to protect student data as a matter of law.
Conversations about privacy in the edtech industry have shifted from figuring out what companies need to protect to how they can comply with the law, and to privacy concerns emerging from AI, Verdi says.
The pledge simply wasnât crafted to handle the fast-moving problems presented by AI, he explains, arguing that it makes more sense to build AI approaches from the ground up.
Signatories of the pledge argue that its end will not impact student privacy.
GoGuardian, a student-monitoring-services company, told EdSurge in an email interview that the retirement "will in no way impact GoGuardian's proactive and transparent approach to student data privacy." Company representatives added: "We will continue to uphold all requirements, which are widely considered to be standard practices for the industry," noting that they will simply remove references to the pledge on their website.
Choppy Waters
Nevertheless, some observers see student rights as particularly vulnerable at the moment and have expressed concern that legal frameworks concerning emerging AI technologies blithely ignore students' civil rights.
None of the recent state frameworks on AI even mention police use of the technology to monitor and discipline students, argued Clarence Okoh, a senior attorney for the Center on Privacy and Technology at Georgetown University Law Center. And under the current administration, it's likely that there will be less strenuous policing of civil rights.
"Unfortunately, state AI guidance largely ignores this crisis because [states] have been [too] distracted by shiny baubles, like AI chatbots, to notice the rise of mass surveillance and digital authoritarianism in their schools," Okoh has previously told EdSurge.
And at the federal level, the Trump administration has so far shown less urgency in going after companies that may infringe on students' privacy.
In contrast, during the last presidential administration, Democratic senators opened an investigation into whether a number of student monitoring companies, including GoGuardian, violate student privacy.
There was also a settlement with Florida's Pasco County School District, which had allegedly used a predictive policing program that drew on student records to discriminate against students with disabilities.
But now, with the shift in emphasis at the federal level, it is harder to counter troublesome practices in states where civil liberties cases are contested, Okoh added.