
This week's issue maps the gap between what consent is supposed to mean and what it looks like in practice: from cookie banners engineered to manufacture consent, to policies nobody can read, to the technologies that could remove the need to trust platforms with your data at all.
One YouTube Video
In this video, two of the world's leading experts on online manipulation, Professors Cristiana Santos and Woodrow Hartzog, break down how consent interfaces are systematically engineered to extract maximum data while giving users the illusion of choice.
One Research Paper
Researchers scanned 14,000 websites and found that consent manipulation is now systematic and increasingly sophisticated: 58% of sites make it harder to withdraw consent than to give it, and 24% present fake opt-outs, where cookies are still set after you click reject.
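The fake opt-out finding boils down to a simple check. A toy sketch (not the paper's actual crawler): record which cookies a site sets after you click reject, and flag the site if anything beyond an assumed allow-list of strictly necessary cookies survives. The allow-list and cookie names here are hypothetical.

```python
# Hypothetical allow-list of strictly necessary cookies; a real study
# would derive this from the site's own cookie declarations.
ESSENTIAL = {"session_id", "csrf_token"}

def is_fake_opt_out(cookies_after_reject):
    """Flag a site whose post-rejection cookies include anything
    beyond the strictly necessary allow-list."""
    return bool(set(cookies_after_reject) - ESSENTIAL)

# "_ga" and "_fbp" are common analytics/ad-tracking cookie names:
# if they persist after rejection, the opt-out did nothing.
print(is_fake_opt_out({"session_id", "_ga", "_fbp"}))  # True
print(is_fake_opt_out({"session_id"}))                 # False
```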
One Project
Harvard researchers built a searchable database of 20,000+ privacy policies. They found that 86% of policies require college-level reading proficiency, most platforms have quietly removed your right to sue them, and some policies simply vanished from the internet when researchers came looking.
One Tool
Harvard's open-access archive lets you search and compare how privacy policies from 300+ platforms have changed over time, so you can see exactly what was quietly rewritten after you agreed to it.
One Guide
Under GDPR, consent must be freely given, specific, informed, and unambiguous. This EDPB guide maps what that means in practice: cookie walls are invalid, vague purposes like 'improving user experience' don't qualify, and withdrawing consent must be as easy as giving it.
One Structural Response
The alternative to relying on user trust is to create systems in which such trust isn't required. This piece covers the full landscape of privacy-enhancing technologies and maps each one to the specific regulatory obligations it addresses.
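One classic illustration of that idea (my example, not necessarily one from the linked piece) is randomized response, an early privacy-enhancing technique: each person's answer is deliberately noised before it ever leaves their device, so the platform can estimate the aggregate rate of a sensitive attribute without being able to trust, or abuse, any individual report.

```python
import random

def randomized_response(truth):
    """Report the true answer half the time; otherwise report a coin flip.
    No single report reveals the respondent's true answer."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports):
    """Invert the noise. P(report yes) = 0.5*p + 0.25,
    so p = 2 * (observed rate - 0.25)."""
    observed = sum(reports) / len(reports)
    return 2 * (observed - 0.25)

# With enough reports, the aggregate estimate converges on the true
# rate even though every individual answer is deniable.
random.seed(1)
reports = [randomized_response(True) for _ in range(10_000)]
print(round(estimate_true_rate(reports), 2))  # close to 1.0
```

The design choice is the point: privacy comes from the protocol's math, not from a promise in a policy document.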

