EFF's Deeplinks Blog: Noteworthy news from around the internet

EFF to Court: Young People Have First Amendment Rights


Utah cannot stifle young people’s First Amendment rights to use social media to speak about politics, create art, discuss religion, or to hear from other users discussing those topics, EFF argued in a brief filed this week.

EFF filed the brief in NetChoice v. Brown, a constitutional challenge to the Utah Minor Protection in Social Media Act. The law prohibits young people from speaking to anyone on social media outside of the users with whom they are connected or those users’ connections. It also requires social media services to make young people’s accounts invisible to anyone outside of that same subgroup of users. The law requires parents to consent before minors can change those default restrictions.

To implement these restrictions, the law requires a social media service to verify every user’s age so that it knows whether to apply those speech-restricting settings.

The law therefore burdens the First Amendment rights of both young people and adults, the friend-of-the-court brief argued. The ACLU, Freedom to Read Foundation, LGBT Technology Institute, TechFreedom, and Woodhull Freedom Foundation joined EFF on the brief.

Utah, like many states across the country, has sought to significantly restrict young people’s ability to use social media. But “Minors enjoy the same First Amendment right as adults to access and engage in protected speech on social media,” the brief argues. As the brief details, minors use social media to express political opinions, create art, practice religion, and find community.

Utah cannot impose such a severe restriction on minors’ ability to speak and to hear from others on social media without violating the First Amendment. “Utah has effectively blocked minors from being able to speak to their communities and the larger world, frustrating the full exercise of their First Amendment rights,” the brief argues.

Moreover, the law “also violates the First Amendment rights of all social media users—minors and adults alike—because it requires every user to prove their age, and compromise their anonymity and privacy, before using social media.”

Requiring internet users to provide their ID or other proof of their age could block people from accessing lawful speech if they don’t have the right form of ID, the brief argues. And requiring users to identify themselves infringes on people’s right to be anonymous online. That may deter people from joining certain social media services or speaking on certain topics, as people often rely on anonymity to avoid retaliation for their speech.

Finally, requiring users to provide sensitive personal information increases their risk of future privacy and security invasions, the brief argues.

Keeping the Web Up Under the Weight of AI Crawlers

Thu, 06/05/2025 - 7:13pm

If you run a site on the open web, chances are you've noticed a big increase in traffic over the past few months, whether or not your site has been getting more viewers, and you're not alone. Operators everywhere have observed a drastic increase in automated traffic—bots—and in most cases attribute much or all of this new traffic to AI companies.

Background

AI—in particular, Large Language Models (LLMs) and generative AI (genAI)—relies on compiling as much information from relevant sources (e.g., texts written in English, or photographs) as possible in order to build a functional and persuasive model that users will later interact with. While AI companies in part distinguish themselves by what data their models are trained on, possibly the greatest source of information—one freely available to all of us—is the open web.

To gather up all that data, companies and researchers use automated programs called scrapers (sometimes referred to by the more general term "bots") to "crawl" over the links available between various webpages and save the types of information they're tasked with as they go. Scrapers are tools with a long, and often beneficial, history: services like search engines, the Internet Archive, and all kinds of scientific research rely on them.

When scrapers are not deployed thoughtfully, however, they can contribute to higher hosting costs, lower performance, and even site outages, particularly when site operators see so many of them in operation at the same time. In the long run all this may lead to some sites shutting down rather than bearing the brunt of it.

For-profit AI companies must ensure they do not poison the well of the open web they rely on in a short-sighted rush for training data.

Bots: Read the Room

There are existing best practices that those who use scrapers should follow. When bots and their operators ignore these guideposts, it signals to site operators, sometimes explicitly, that the bots' access can or should be cut off. Ignoring them can also degrade a site's performance and, in the worst case, take the site down for all users. Some companies appear to follow these practices most of the time, but we see increasing reports and evidence of new bots that don't.

First, scrapers should follow instructions given in a site's robots.txt file, whether those are to back off to a certain crawling rate, exclude certain paths, or not to crawl the site at all.
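These robots.txt instructions can be honored with Python's standard-library parser; as a minimal sketch, the rules, site, and bot name below are hypothetical examples:

```python
from urllib import robotparser

# Hypothetical robots.txt content a site might serve.
rules = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

bot = "ExampleBot"
print(rp.can_fetch(bot, "https://example.com/public/page"))   # True: allowed
print(rp.can_fetch(bot, "https://example.com/private/data"))  # False: excluded
print(rp.crawl_delay(bot))  # 10: sleep this many seconds between requests
```

In a real crawler, `rp.set_url(...)` and `rp.read()` would fetch the live robots.txt before any other request to the site.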

Second, bots should send their requests with a clearly labeled User Agent string which indicates their operator, their purpose, and a means of contact.

Third, those running scrapers should provide a process for site operators to request back-offs, rate caps, and exclusions, and to report problematic behavior, using the contact information or response forms linked from the User Agent string.
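The second and third practices come together in a single request header; as a sketch, the bot name, info URL, and contact address below are all hypothetical:

```python
import urllib.request

# A clearly labeled User Agent string: operator name and version, purpose,
# an info page, and a contact address (all hypothetical examples).
ua = ("ExampleBot/1.0 (archival research crawler; "
      "+https://example.com/bot-info; contact: bots@example.com)")

req = urllib.request.Request(
    "https://example.com/page",
    headers={"User-Agent": ua},
)
print(req.get_header("User-agent"))  # the string site operators will see in their logs
```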

Mitigations for Site Operators

Of course, if you're running a website dealing with a flood of crawling traffic, waiting for those bots to change their behavior for the better might not be realistic. Here are a few suggested, if imperfect, mitigations based in part on our own sometimes frustrating experiences.

First, use a caching layer. In most cases a Content Delivery Network (CDN) or an "edge platform" (essentially a newer iteration of a CDN) can provide this for you, and some services offer a free tier for non-commercial users. There are also a number of great projects if you prefer to self-host. Some of the tools we've used for caching include varnish, memcached, and redis.

Second, convert to static content to prevent resource-intensive database reads. In some cases this may reduce the need for caching.

Third, use targeted rate limiting to slow down bots without taking your whole site down. But know this can get difficult when scrapers try to disguise themselves with misleading User Agent strings or by spreading a fleet of crawlers out across many IP addresses.
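One common approach is a per-client token bucket: each client gets up to `burst` tokens, refilled at `rate` tokens per second, and a request is served only if a token remains. A minimal sketch, keying on IP address, which, as noted above, determined crawlers can evade:

```python
import time

class TokenBucket:
    def __init__(self, rate=1.0, burst=5):
        self.rate, self.burst = rate, burst
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

buckets = {}  # client IP -> TokenBucket

def allow_request(ip):
    bucket = buckets.setdefault(ip, TokenBucket(rate=1.0, burst=5))
    return bucket.allow()
```

Requests that return `False` can be answered with HTTP 429 and a `Retry-After` header, slowing abusive clients without taking the site offline for everyone.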

Other mitigations such as client-side validation (e.g. CAPTCHAs or proof-of-work) and fingerprinting carry privacy and usability trade-offs, and we warn against deploying them without careful forethought.

Where Do We Go From Here?

To reiterate, whatever one's opinion of these particular AI tools, scraping itself is not the problem. Automated access is a fundamental technique of archivists, computer scientists, and everyday users that we hope is here to stay—as long as it can be done non-destructively. However, we realize that not all implementers will follow our suggestions for bots above, and that our mitigations are both technically advanced and incomplete.

Because we see so many bots operating for the same purpose at the same time, it seems there's an opportunity here to provide these automated data consumers with tailored data providers, removing the need for every AI company to scrape every website, seemingly, every day.

And on the operators' end, we hope to see more web-hosting and framework technology that is built with an awareness of these issues from day one, perhaps building in responses like just-in-time static content generation or dedicated endpoints for crawlers.

EFF to the FTC: DMCA Section 1201 Creates Anti-Competitive Regulatory Barriers

Thu, 06/05/2025 - 6:33pm

As part of a multi-pronged effort toward deregulation, the Federal Trade Commission has asked the public to identify any and all “anti-competitive” regulations. Working with our friends at Authors Alliance, EFF answered, calling attention to a set of anti-competitive regulations that many don’t recognize as such: the triennial exemptions to Section 1201 of the Digital Millennium Copyright Act, and the cumbersome process on which they depend.

Copyright grants exclusive rights to creators, but only as a means to serve the broader public interest. Fair use and other limitations play a critical role in that service by ensuring that the public can engage in commentary, research, education, innovation, and repair without unjustified restriction. Section 1201 effectively forbids fair uses where those uses require circumventing a software lock (a.k.a. technological protection measures) on a copyrighted work.

Congress realized that Section 1201 had this effect, so it adopted a safety valve—a triennial process by which the Library of Congress could grant exemptions. Under the current rulemaking framework, however, this intended safety valve functions more like a chokepoint. Individuals and organizations seeking an exemption to engage in lawful fair use must navigate a burdensome, time-consuming administrative maze. The existing procedural and regulatory barriers ensure that the rulemaking process—and Section 1201 itself—thwarts, rather than serves, the public interest.

The FTC does not, of course, control Congress or the Library of Congress. But we hope its investigation and any resulting report on anti-competitive regulations will recognize the negative effects of Section 1201 and that the triennial rulemaking process has failed to be the check Congress intended. Our comments urge the FTC to recommend that Congress repeal or reform Section 1201. At a minimum, the FTC should advocate for fundamental revisions to the Library of Congress’s next triennial rulemaking process, set for 2026, so that copyright law can once again fulfill its purpose: to support—rather than thwart—competitive and independent innovation.

You can find the full comments here.

The Dangers of Consolidating All Government Information

Thu, 06/05/2025 - 1:15pm

The Trump administration has been heavily invested in consolidating all of the government’s information into a single searchable, or perhaps AI-queryable, super database. The compiling of all of this information is being done with the dubious justification of efficiency and modernization. In many cases, however, this information was originally siloed for important reasons: to protect your privacy, to prevent different branches of government from using sensitive data to punish or harass you, and to preserve trust in, and the legitimacy of, important civic institutions.

Attempts to Centralize All the Government’s Information About You

This process of consolidation has taken several forms. The purported Department of Government Efficiency (DOGE) has been seeking access to the data and computer systems of dozens of government agencies. According to one report, access to the data of these agencies has given DOGE, as of April 2025, hundreds of pieces of personal information about people living in the United States, ranging from financial and tax information to health and healthcare information to IP addresses. EFF is currently engaged in a lawsuit against the U.S. Office of Personnel Management (OPM) and DOGE for disclosing personal information about government employees to people who don’t need it, in violation of the Privacy Act of 1974.

Another key maneuver in centralizing government information has been to steamroll the protections that were in place to keep this information away from agencies that don’t need, or could abuse, it. This has been done by ignoring the law, as the Trump administration did when it ordered the IRS to make tax information available for the purposes of immigration enforcement. It has also been done through the creation of new (and questionable) executive mandates that all executive branch information be made available to the White House or any other agency. Specifically, this has been attempted with the March 20, 2025 Executive Order, “Stopping Waste, Fraud, and Abuse by Eliminating Information Silos,” which mandates that the federal government, as well as all 50 state governments, allow other agencies “full and prompt access to all unclassified agency records, data, software systems, and information technology systems.” But executive orders can’t override privacy laws passed by Congress.

Not only is the Trump administration trying to consolidate all of this data institutionally and statutorily, it is also trying to do so technologically. A new report revealed that the administration has contracted with Palantir, the surveillance and security data-analytics firm, to fuse data from multiple agencies, including the Department of Homeland Security and Health and Human Services.

Why it Matters and What Can Go Wrong 

The consolidation of government records means more government power that can be abused. Different government agencies necessarily collect information to provide essential services or to collect taxes. The danger comes when the government begins pooling that data and using it for purposes unrelated to those for which it was collected.

Imagine, for instance, a scenario where a government employee could be denied health-related public services or support because of information gathered about them by an agency that handles HR records. Or imagine a person’s research topics, as recorded in federal grant files, being used to weigh whether or not that person should be allowed to renew a passport.

Marginalized groups are most vulnerable to this kind of abuse; tax records, for example, could be used to locate individuals for immigration enforcement. Government records could also be weaponized against people who receive food subsidies, apply for student loans, or take government jobs.

Congress recognized these dangers 50 years ago when it passed the Privacy Act to put strict limits on the government’s use of large databases. At that time, trust in the government eroded after revelations about White House enemies’ lists, misuse of existing government personality profiles, and surveillance of opposition political groups.

There’s another important issue at stake: the future of federal and state governments that actually have the information and capacity to help people. The more people learn to distrust the government because they worry that the information they give certain government agencies may be used to hurt them in the future, the less likely people will be to participate or to seek the help they need. And the fewer people engage with these agencies, the less likely those agencies will be to survive. Trust is a key part of any relationship between the governed and the government, and when that trust is abused or jettisoned, the long-term harms are irreparable.

EFF, like dozens of other organizations, will continue to fight to ensure personal records held by the government are only used and disclosed as needed and only for the purpose they were collected, as federal law demands. 

Related Cases: American Federation of Government Employees v. U.S. Office of Personnel Management

Judges Stand With Law Firms (and EFF) Against Trump’s Executive Orders

Thu, 06/05/2025 - 11:00am

“Pernicious.”

“Unprecedented... cringe-worthy.”

“Egregious.”

“Shocking.”

These are just some of the words that federal judges used in recent weeks to describe President Trump’s politically motivated and vindictive executive orders targeting law firms that have employed people or represented clients or causes he doesn’t like. 

But our favorite word by far is “unconstitutional.” 

EFF was one of the very first legal organizations to publicly come out in support of Perkins Coie when it became the first law firm to challenge the legality of President Trump’s executive order targeting it. Since then, EFF has joined four amicus briefs in support of other targeted law firms, and in all four cases, judges from the U.S. District Court for the District of Columbia have indicated they’re having none of it. Three have issued permanent injunctions deeming the executive orders null and void, and the fourth seems to be headed in that same direction. 

Trump issued his EO against Perkins Coie on March 6. In a May 2 opinion finding the order unconstitutional and issuing a permanent injunction, Senior Judge Beryl A. Howell wrote:  

“By its terms, this Order stigmatizes and penalizes a particular law firm and its employees—from its partners to its associate attorneys, secretaries, and mailroom attendants—due to the Firm’s representation, both in the past and currently, of clients pursuing claims and taking positions with which the current President disagrees, as well as the Firm’s own speech,” Howell wrote. “In a cringe-worthy twist on the theatrical phrase ‘Let’s kill all the lawyers,’ EO 14230 takes the approach of ‘Let’s kill the lawyers I don’t like,’ sending the clear message: lawyers must stick to the party line, or else.” 

“Using the powers of the federal government to target lawyers for their representation of clients and avowed progressive employment policies in an overt attempt to suppress and punish certain viewpoints, … is contrary to the Constitution, which requires that the government respond to dissenting or unpopular speech or ideas with ‘tolerance, not coercion.’” 

 Trump issued a similar EO against Jenner & Block on March 25. In a May 23 opinion also finding the order unconstitutional and issuing a permanent injunction, Senior Judge John D. Bates wrote: 

“This order—which takes aim at the global law firm Jenner & Block—makes no bones about why it chose its target: it picked Jenner because of the causes Jenner champions, the clients Jenner represents, and a lawyer Jenner once employed. Going after law firms in this way is doubly violative of the Constitution. Most obviously, retaliating against firms for the views embodied in their legal work—and thereby seeking to muzzle them going forward—violates the First Amendment’s central command that government may not ‘use the power of the State to punish or suppress disfavored expression.’ Nat’l Rifle Ass’n of Am. v. Vullo, 602 U.S. 175, 188 (2024). More subtle but perhaps more pernicious is the message the order sends to the lawyers whose unalloyed advocacy protects against governmental viewpoint becoming government-imposed orthodoxy. This order, like the others, seeks to chill legal representation the administration doesn’t like, thereby insulating the Executive Branch from the judicial check fundamental to the separation of powers. It thus violates the Constitution and the Court will enjoin its operation in full.” 

 Trump issued his EO targeting WilmerHale on March 27. In a May 27 opinion finding that order unconstitutional, Senior Judge Richard J. Leon wrote: 

“The cornerstone of the American system of justice is an independent judiciary and an independent bar willing to tackle unpopular cases, however daunting. The Founding Fathers knew this! Accordingly, they took pains to enshrine in the Constitution certain rights that would serve as the foundation for that independence. Little wonder that in the nearly 250 years since the Constitution was adopted no Executive Order has been issued challenging these fundamental rights. Now, however, several Executive Orders have been issued directly challenging these rights and that independence. One of these Orders is the subject of this case. For the reasons set forth below, I have concluded that this Order must be struck down in its entirety as unconstitutional. Indeed, to rule otherwise would be unfaithful to the judgment and vision of the Founding Fathers!” 

“Taken together, the provisions constitute a staggering punishment for the firm’s protected speech! The Order is intended to, and does in fact, impede the firm’s ability to effectively represent its clients!” 

“Even if the Court found that each section could be grounded in Executive power, the directives set out in each section clearly exceed that power! The President, by issuing the Order, is wielding his authority to punish a law firm for engaging in litigation conduct the President personally disfavors. Thus, to the extent the President does have the power to limit access to federal buildings, suspend and revoke security clearances, dictate federal hiring, and manage federal contracts, the Order surpasses that authority and in fact usurps the Judiciary’s authority to resolve cases and sanction parties that come before the courts!” 

The fourth case in which EFF filed a brief involved Trump’s April 9 EO against Susman Godfrey. In that case, Judge Loren L. AliKhan is still considering whether to issue a permanent injunction, but on April 15 gave a fiery ruling from the bench in granting a temporary restraining order against the EO’s enforcement. 

“The executive order is based on a personal vendetta against a particular firm, and frankly, I think the framers of our Constitution would see this as a shocking abuse of power,” AliKhan said, as quoted by Courthouse News Service. “The government cannot hold lawyers hostage to force them to agree with it, allowing the government to coerce private business, law firms and lawyers solely on the basis of their view is antithetical to our constitutional republic and hampers this court, and every court’s, ability to adjudicate these cases.”

And, as quoted by the New York Times: “Law firms across the country are entering into agreements with the government out of fear that they will be targeted next and that coercion is plain and simple. And while I wish other firms were not capitulating as readily, I admire firms like Susman for standing up and challenging it when it does threaten the very existence of their business. … The government has sought to use its immense power to dictate the positions that law firms may and may not take. The executive order seeks to control who law firms are allowed to represent. This immensely oppressive power threatens the very foundations of legal representation in our country.” 

As we wrote when we began filing amicus briefs in these cases, an independent legal profession is a cornerstone of democracy and the rule of law. As a nonprofit legal organization that frequently sues the federal government, EFF understands the value of this bedrock principle and how it–and First Amendment rights more broadly–are threatened by President Trump’s executive orders. It is especially important that the whole legal profession speak out against these actions, particularly in light of the silence or capitulation of a few large law firms. 

We’re glad the courts agree.

Statement on California State Senate Advancing Dangerous Surveillance Bill

Wed, 06/04/2025 - 5:45pm

In the wake of the California State Senate’s passage of S.B. 690, the Electronic Frontier Foundation (EFF), TechEquity, Consumer Federation of California, Tech Oversight California, and ACLU California Action issued a joint statement warning that the bill would put the safety and privacy of millions of Californians at serious risk:

“SB 690 gives the green-light to dystopian big tech surveillance practices which will endanger the privacy and safety of all Californians. SB 690 would allow companies to spy on us to get our sensitive personal information, such as our immigration status or what healthcare we’ve received. And once they have our sensitive personal information, SB 690 places no limits on how that business can use or share that information, allowing them to share it with data brokers, immigration officials, or law enforcement officials in states that restrict reproductive or gender-affirming care.

“At a time where agencies of the federal government are actively targeting individuals based on information collected from businesses about their political beliefs, religious affiliations, or health decisions, we cannot risk sharing even more sensitive information with them. The legislature should be doing all it can to protect Californians, not make it easier for the federal government to secretly obtain our sensitive information.”

Background

Coalition Floor Alert Opposing SB 690

Take Action Against SB 690

Podcast Episode: Why Three is Tor's Magic Number

Wed, 06/04/2025 - 3:08am

Many in Silicon Valley, and in U.S. business at large, seem to believe innovation springs only from competition, a race to build the next big thing first, cheaper, better, best. But what if collaboration and community breed innovation just as well as adversarial competition?


(You can also find this episode on the Internet Archive and on YouTube.)

Isabela Fernandes believes free, open-source software has helped build the internet, and will be key to improving it for all. As executive director of the Tor Project – the nonprofit behind the decentralized, onion-routing network providing crucial online anonymity to activists and dissidents around the world – she has fought tirelessly for everyone to have private access to an uncensored internet, and Tor has become one of the world's strongest tools for privacy and freedom online.  

Fernandes joins EFF’s Cindy Cohn and Jason Kelley to discuss the importance of not just accepting technology as it’s given to us, but collaboratively breaking it, tinkering with it, and rebuilding it together until it becomes the technology that we really need to make our world a better place. 

In this episode you’ll learn about:

  • How the Tor network protects the anonymity of internet users around the world, and why that’s so important 
  • Why online privacy is NOT only for “people who have something to hide” 
  • The importance of making more websites friendly and accessible to Tor and similar systems 
  • How Tor can actually benefit law enforcement  
  • How free, open-source software can power economic booms 

Isabela Fernandes has been executive director of the Tor Project since 2018; she had been a project manager there since 2015. She also has served since 2023 as a board member of both European Digital Rights – an association of civil and human rights organizations aimed at building a people-centered, democratic society – and The Engine Room, a nonprofit that supports social justice movements to use technology and data in safe, responsible and strategic ways, while actively mitigating the vulnerabilities created by digital systems. Earlier, Fernandes worked as a product manager for Twitter; as Latin America project manager for North by South, which offered open-source technology integration to companies using the expertise of Latin American free software specialists; as a project manager for Brazil’s President, overseeing migration of the IT department to free software; and as a technical advisor to Brazil’s Ministry of Communications, creating and implementing new features and free-software tools for the National Digital Inclusion Program serving 3,500 communities. She’s a former member of the board of the Calyx Institute, an education and research organization devoted to studying, testing, developing, and implementing privacy technology and tools to promote free speech, free expression, civic engagement and privacy rights on the internet and in the mobile telephone industry. And she was a cofounder and longtime volunteer with Indymedia Brazil, an independent journalism collective.

What do you think of “How to Fix the Internet?” Share your feedback here.

Transcript

ISABELA FERNANDES: If Tor is successful, the internet would be built by its heart, right? Like the elements that Tor carries, which is community, which is decentralization. Instead of having everything focused on a few small companies, it would be more distributed. I come from the free software world, so I am always excited, and I have lived through moments in my life where I saw it, where I could touch it, I could touch the moment where the source code would be shared and multiple areas of society would benefit from it. Collaboration allows amazing innovation. We are here today because of free software. If it wasn't for that, we would not be here today.

CINDY COHN: That's Isabela Fernandes, head of the Tor Project, describing the beautiful promise of collaboration, community and innovation that is instilled in the free software world – and the important role it plays as we look forward to that better future we’re always talking about on this show.
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY: And I'm Jason Kelley, EFF's Activism Director. This is our podcast series, How to Fix the Internet.

CINDY COHN: The idea behind this show is that we're trying to make our digital lives BETTER. You know, a big part of our job at EFF is to envision the ways things can go wrong online-- and then of course jumping into action to help when things then DO go wrong.
But this show is about optimism, hope and solutions – we want to share visions of what it looks like when we start to get it right.

JASON KELLEY: And our guest today is someone whose vision of getting it right is stronger than most.

CINDY COHN: Isabela Fernandes has been an important presence in the security and free software communities for a really long time now.
She's been the executive director at the Tor project since 2018, and before that she was a product manager there. And I'm happy to say that when I was on the board of the Tor project, I was one of the board members that strongly recommended Isa for the executive director role.
She was and continues to be not only a brilliant mind, but a skilled executor. With the Tor Project providing a model for an open source tool that works, is trusted and literally saves lives around the world. We are so thrilled to have her here. Welcome, Isa.

ISABELA FERNANDES: Hi. Thank you.

JASON KELLEY: We're really excited to talk to you and I wanted to start, if I could, with some basics. I think a lot of our audience, you know, has heard of Tor. Maybe they know what the Tor browser is. But some of these things pop up and I think, you know, some people don't know the difference between a torrent and a Tor browser, and, like, what the Tor Project actually works on. So what is the Tor Project? What are the tools that you’re sort of responsible for creating and maintaining there?

ISABELA FERNANDES: So the Tor Project is actually a non-profit. And our mission is to advance human rights through the technology that we build, right?
So Tor is very similar to a VPN, but much better. We have a decentralized network that is run by volunteers, that whenever you are making a connection to a service or a website, our network will route you through three servers and it's gonna encrypt it every step of the way. And because of this architecture, it's not centralized on anyone or any entity.
It's completely decentralized to thousands of servers around the world. And we also have the Tor browser, which is a fork of Firefox, and what the Tor browser does, besides making it easier for you to connect to the Tor network, is protect your privacy on the device level. So it blocks third-party cookies, and it also protects you against fingerprinting, tracking, and other ways that your device identity can leak.

JASON KELLEY: Okay, so just to dig in and make sure I understand, if I'm on a VPN I'm, you know, basically connecting to another server. And all of my connections are going through that. And usually I can, like, pick where they are from a short list of, you know, potential servers in cities and countries.
But with Tor, I don't choose where I'm going or what those three connections are, and it adds that extra layer of protection because three is better than one. But why is that? I wonder, for the audience who might not know, you know, why three? Why not five, or why not two?

ISABELA FERNANDES: Right, so, three, mainly because, so it works like this, right? Like the first server will know who you are because you're connecting to Tor –

JASON KELLEY: Sure, ok.

ISABELA FERNANDES: But it does not know what you are requesting. The middle one does not know who you are and has no idea about what you are requesting. And the exit one, the third one, only knows that someone on the internet is requesting to open a website.

JASON KELLEY: Got it.

ISABELA FERNANDES: So that count is great, because if it was only two, you would still have some way to discover that information, to understand where it is coming from and where it is going.
So three is indeed a sweet spot that gives you the level of privacy that we want without adding more latency to the connection.
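The three hops Isabela describes are the core of onion routing: the client wraps its request in one layer of encryption per relay, and each relay can peel only its own layer, so it learns only the hop before it and the hop after it. Here is a minimal Python sketch of that layering idea, not Tor's actual protocol: the keyed-XOR "cipher", the relay names, and the keys are illustrative stand-ins, not real cryptography.

```python
import base64
import hashlib
import json

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from a key (toy construction,
    NOT cryptographically sound -- for illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def wrap(key: bytes, next_hop: str, inner: bytes) -> bytes:
    """Add one onion layer: only the holder of `key` can read `next_hop`."""
    layer = json.dumps({"next": next_hop,
                        "data": base64.b64encode(inner).decode()})
    return xor_crypt(key, layer.encode())

def peel(key: bytes, blob: bytes):
    """Remove one layer, revealing the next hop and an opaque inner blob."""
    layer = json.loads(xor_crypt(key, blob))
    return layer["next"], base64.b64decode(layer["data"])

# One symmetric key per relay (real Tor negotiates these per circuit
# with public-key cryptography; these names and keys are made up).
keys = {"guard": b"key-1", "middle": b"key-2", "exit": b"key-3"}

request = b"GET /index.html"
# Build the onion from the inside out: exit layer first, guard layer last.
onion = wrap(keys["exit"], "example.com", request)
onion = wrap(keys["middle"], "exit", onion)
onion = wrap(keys["guard"], "middle", onion)

# Each relay peels exactly one layer and learns only its neighbors.
hop, onion = peel(keys["guard"], onion)   # guard sees the client; hop == "middle"
hop, onion = peel(keys["middle"], onion)  # middle knows neither client nor site
hop, onion = peel(keys["exit"], onion)    # hop == "example.com"; onion == request
```

Note that the middle relay's peeled layer names only the exit: it never sees the client's address or the destination, which is why two hops would not be enough to separate "who you are" from "what you are requesting".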

JASON KELLEY: Okay. And then I'm gonna ask one more question about the technical aspect. Over the last decade plus, most of the web has become encrypted. The HTTP level has become HTTPS, and that's something that EFF has worked on with our Certbot project and Let's Encrypt. But if I'm not super familiar with the difference between, you know, how HTTPS is encrypted versus what Tor is doing:
Why do I still need to use Tor? What is it saving? How is it protecting my privacy if, quote unquote, the web is already encrypted?

ISABELA FERNANDES: Um, let's give this example, right? HTTPS encrypts your connection to the website. So when you type your username or password, that information is encrypted. However, someone who is watching would still know who you are and which website you are visiting. With Tor, you gain that other layer of protection: nobody would know who you are and what you are requesting, except for the website. So it protects you from outside watchers who might be surveilling your connection, and it also protects you from other tracking mechanisms on the internet.
So the ideal scenario is for you to use both. Right, like it's for you to not only use Tor, but make sure that you're connected to a website that has HTTPS as well.

JASON KELLEY: Wow. Okay. That's really helpful. Thank you so much. I feel like I'm getting tech support from the literal executive director of the Tor Project, but I think a lot of people who come to us at EFF for privacy or security recommendations really don't understand some of these, you know, somewhat basic things that you're describing about the difference between proxies and encrypted sites and VPNs and Tor. And, um, I think it's just really important for people to know how these tools work, because different tools function for different purposes, right?

CINDY COHN: Yeah. And, you know, security is hard. I mean, it would be great if there was a one-size-fits-all security. And I think that if you look at all the pieces that Tor’s building, they're moving towards that.
I want us to talk a little bit more about the why of Tor, 'cause we've really outlined the how of Tor, and I wanna give you a chance to kind of debunk one of the arguments that we hear all the time at EFF, which is, you know, why do people need all this security? If you're not doing anything wrong, you know, why should you worry? Um, or is it all just hopeless and shouldn't I just give up?
But let's start with the first one, and I know that you've done a lot of work at Tor trying to really think hard about who needs these tools, who uses these tools in a way that's privacy protective. So I wonder if you could outline a little bit of kind of what you guys know about who uses Tor and why.

ISABELA FERNANDES: There is a spectrum, and I always like to give examples from the two sides of this spectrum. We collect a lot of anonymous stories from users, and let's call this one Brian. So we have Brian. He’s a father and he has two teenage kids at home.
And, uh, you know, as teenagers, they have questions about everything, right? Like about sex, gender, drugs, everything. So he recommends that his kids use Tor when they're searching for those topics on the internet. And sometimes he needs to search some topics himself, you know, when the kids bring up a topic that he had no idea about.
So they use Tor to make sure that those searches do not follow those teenage kids for the rest of their lives, right? Like, they're not tagged to them for the rest of their life. And then you have, on the other side of the spectrum, um, let's call her Carolina.
Carolina is a woman in Uganda. Uh, she's a lesbian. And in Uganda, you can face criminal charges for that. So Carolina just wants to have a normal social life online. And because it's so dangerous in Uganda for her, she really needs to make sure that she's protected and anonymous online when she's interacting with her friends or just looking for topics related to her life. So she uses Tor to be safe online, uh, to just have a normal social life on the internet. We also did a survey, which I thought was very interesting. We put a question in the browser; it was anonymous and anyone could answer.
And we had almost 55,000 people answering that question, which was: how often do you use the Tor browser? And actually more than half of them said that they use it a few times a day or a few times a week. And that for me says a lot, right? Like it's for those moments where you're like, okay, this I want to do on Tor. I don't want the rest of the internet to collect this information and store it and attach it to my behavior profile.
And that for me is what's important, right? People may think that everything is lost and there is no reason to do it, and I think the other way around. I think it is possible for you to create black spots about your behavior online. And that's what tools like Tor allow you to do, right? Like you can, uh, create some black spots about you on the internet that protect your privacy.
I think today people do care a lot about their privacy. And one example related to privacy that I always bring to people is how dangerous it is to conflate the need and the right to be anonymous with the need to hide something that you don't want others to know, or some illegal activity. Because anonymity is actually one of the pillars of our democracy. Your vote is anonymous for a reason. So for you to exercise your rights as a citizen, you need to have privacy.

CINDY COHN: Yep. I think that's exactly right. And I also think we're living in times when things are moving so fast about who's at risk and who's not at risk, that a lot of people are waking up to the fact that just because you might not need privacy in one zone of your life or in one time that we're living in, doesn't mean that that can't change really quickly.
And having the tools available and ready and working is one of the things that we can do – we meaning people in tech – to make sure that as times change, people have the tools that they need to stay safe and to stay protected, and to, to organize, you know, opposition, to organize for change.

Musical transition

CINDY COHN: I think that this is happening a lot, but I'm wondering how you think about helping people reclaim the idea that privacy isn't something we should be ashamed of, that privacy is something that we should be proud of.
I hear you say, and I think that's totally right, it's a pillar of an open and self-governing world. How do you help convince people about that?

ISABELA FERNANDES: Let me step back for a second. You might have heard this from multiple people, right? Like they complain about ads following them. To give an example: oh, I was talking with my friend about bicycles, and now all these bicycle ads are showing up; my phone is listening to me.
Right? Like, so I think those are the perfect moments for you to go deeper into the matter of privacy, right? Like, imagine if it was not bicycles. Imagine if it was a government decision that you were talking about, right? That is the moment, right? Like you need to connect with people when they are presenting the problem to you.
So it's fundamental, right? Like that makes it super, super easy for someone to understand. But the next step is that they ask you: how do I protect myself? And sometimes I feel like, uh, our work at Tor is not only to create tools, but to make them easy for people to use. They need to be friendly, they need to be familiar, right?
Like, uh, that's why the Tor browser is super nice because it's just like any other browser. People hear about Tor and they think it's like, oh, it is this hacker tool that I need to have a special excuse to use. No, it's just like any other browser that you open and you use and you can use on your phone, you can use anywhere.
So it is extremely important to bring awareness when people are identifying the problem, whether it is in an informal conversation or in a more, uh, global conversation, right? Like sometimes those problems arise in global news. Uh, we have multiple examples of that; Cambridge Analytica was one of them.
And, um, at those moments, we need to learn how to connect. But when we connect, we need to also be able to provide solutions that are easy and familiar to people, so they can have hope. They can look at it and they can say, okay, I can control this, right? Like, I can protect myself, I can protect my privacy.
And those elements all come together, right? Like it's not one catchphrase that will make it happen. You need to combine all those elements in the process, right? So it doesn't seem too hard, and people feel empowered to have agency to take action.

CINDY COHN: Yep. I think that's right. So what we try to do in this podcast is kind of flip the script and think about what the world would look like if we got it all right. So what would the world look like if Tor was immensely successful? What's your vision of the world where we get this right?

ISABELA FERNANDES: In the case of Tor, I think, uh, one thing would be that services and websites are friendly to Tor. So if a user is connecting to an application or to a website, that website would know it and would be friendly to it. This is one of the biggest problems right now, right? Like some websites are not friendly to Tor or to solutions like Tor. So that would be number one, right?

CINDY COHN: Yep.

ISABELA FERNANDES: So if Tor is successful, we would have an internet, or a world with technology, to go a little bit beyond the internet, where technology is driven by sharing and collaboration, and the business model of it is not about the data. The business model could be unique to each kind of service, but it would not necessarily be the typical one, the "easy" one, between quotes, of let's collect all the data, either to use it for advertising or sell it to, uh, data brokers so we can make some money out of it. Right?
Like, uh, I think that if Tor were successful, we would have the philosophy of Tor at the heart of how technology is built worldwide.

CINDY COHN: Yeah, I think that's great. Um, in some ways, you know, Tor wouldn't need to exist as a separate project because the Tor values would be built into everything. And what I hear there is that that also includes the way Tor has been developed: the open source, collaborative, transparent process by which the tools were developed would be part of what gets baked in. It's a good vision.

Music transition

JASON KELLEY: Let’s take a quick moment to say thank you to our sponsor.
How to Fix the Internet is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
We also want to thank EFF members and donors. You are the reason we exist. We will talk a little later in this episode about how important funding from the community is to the work that we do. You can become a member of EFF’s community for just $25.
The more members we have, the more power we have - in statehouses, courthouses, and on the streets. EFF has been fighting for digital rights for decades, and that fight is bigger than ever, so please, if you like what we do, go to eff.org/pod to donate.
And now, back to our conversation with Isabela Fernandes and the impact that free software and the community of people making it has had on her life.

ISABELA FERNANDES: I was raised by free software. Everything I know comes from this, from not only the software itself but the community. So all my skills come from it. So, uh, in the early 2000s, I joined a volunteer network called Indymedia, a worldwide network of independent news websites. Uh, I built the one for Brazil and we did everything, uh, with free software, right?
Like it was an open publishing website. When I explain it to people nowadays, like, you didn't need a username or password to publish your article on that platform. And we had hundreds of those sites around the world, and I did this work for 10 years because I really believed in the democratization of information, and I saw the internet as the root of it. And I saw how powerful that was, because it was the beginning of the internet in many parts of the world. Here in Brazil it was just starting, and not everybody had a connection, and here we were with this powerful tool to democratize, uh, communication in Brazil.
And through that experience, I was invited to work for the federal government on a, uh, series of free software initiatives, one of them in digital inclusion. We would build solutions for communities, um, a basket of solutions, from online stores, uh, where they could sell their own products online, uh, to Voice over IP. I'm talking about 2003, 2004; at the time there were no cell phones.
I arrived at a community one month after they had power for the first time in their life, and I was bringing internet, you know? And I'm like, okay, you have internet. What do you want? They didn't have a phone number, a phone line, so I set up Voice over IP. I was like, you can call anywhere in Brazil with this computer. And there was a line of 20 people right away to make phone calls.
And we were doing this everywhere. We were going to, uh, the favelas in Brazil and gathering all the teenagers and saying, ‘what do you want?’ Like, ‘oh, I wanna record my music.’ And we were recording the music on CDs using, all of it, free software. Some communities were like, we wanna document, because they have a lot of, uh, folklore stories that are only oral and they wanted to document them. We created a wiki for them to document it. So, education, going to public schools, we did a lot of that with free software.
And at the same time, why was this possible? It was because the culture was changing. The culture was: okay, we are not gonna use proprietary software anymore. We are not gonna use the money from the country, money that we barely have, to pay for these big, super expensive licenses.
Instead, we are gonna use this money to invest in the people, to invest in computer science students, to invest in conventions, free software meetups, uh, to invest in InstallFests. And we started to do that. And we had a huge technology boom in Brazil, from the private sector, like I said, from the government, from universities. Everybody was collaborating.
There were a lot of companies being created to provide different types of services or to maintain software; there were a lot of different businesses being generated out of it as well.
So I could touch it, I could see it. It is possible. We can do it, right? And right now I'm seeing a movement again in Brazil. It's not too public yet, but it is a movement like this, with hundreds of organizations debating and building a strategy to recreate that inside the country.
So it is possible to build a better world with technology, right? Like better versions of technology for us. It's not a mission impossible thing. It is totally possible.

JASON KELLEY: And it's not the distant past, really. I mean, sometimes when you talk about it, like, I was alive at the time, but not old enough to be involved in that. And it does sometimes sound like a kind of golden era that's lost forever to people. And it's really great to hear that maybe it's something that's cyclical, or something that we lost for a brief period and can get back to. How did that movement that's happening now in Brazil get sort of reignited?

ISABELA FERNANDES: We're bringing back some things from that time. Back then we had Linux InstallFests: you would bring your machine and you would install Linux. We want to combine that with anything, anytime you have an event where you're talking about the internet, or talking about regulation, have install fests. Let's install Mastodon, let's install Signal. Let's have everybody come out of the event with those set up, and open it to the population, right? Because sometimes when you offer those options to people, they don't have a network within that option, so they don't tend to stay.
But if you're doing this at an event, like, let's say, let's install Mastodon, and everybody can have their account on a different Mastodon instance, but we're all following each other, and I'm seeing the content and I can see for real what that means, then I will leave the event already with a network of people that I can follow on Mastodon. Same thing with Signal: every time I do a Signal training, I tell people, now that we're all on Signal, let's copy each other's contacts, so we have each other on our Signal accounts, right? Like, so we have a community. So we are thinking about that combination.

Music transition

JASON KELLEY: I wonder, you know, again, you talk about some of this and I feel so jealous of, you know, being in this movement. I've never really been, you know, an engineer, so I'm sort of looking at the free software and open source communities from a distance. How did you end up getting involved in them? And do you have any advice for other people today who want to be helpful, or, um, want to connect with other people to help build the kind of internet you're talking about?

ISABELA FERNANDES: I ended up in this out of necessity. When I was a teenager, I hated school, but I loved to learn. I got kicked out of, uh, school multiple times until my dad put me in a technical high school to learn computers. But at the same time, uh, in my house, my parents had to work from 7:00 AM to 11:00 PM. So, you know, the strong survived in the house; it was me and my siblings.
And, uh, I could not touch the computer because my older brother would not let me. So I had to write code with a pen and paper, and I hated it. And I started to go to my dad’s office at night to connect: when he would leave at 11, I would arrive and stay till 7:00 AM, and that's how I started to learn about Linux.
We didn't have money for a license, so the more I wanted to do with the computer, the more I had to go to free software, you know? And like I said, after that, I joined this media network where we did everything. I learned how to build websites. I learned how to build data centers. We had to have security.
We had to build new products for journalists because we wanted to use free software, but sometimes we didn't have everything, or the solutions we had were not good enough, so we had to improve them, to edit an audio or a video, right? Like things like that. So I went through this whole phase because I would not accept the technology that the normal, uh, business model wanted to offer me. I didn't accept it as it is, and I thought something else could happen. And, uh, every time I talk with young people, I tell them this: don't accept the technology that is being offered to you as it is. Don't accept it. It is possible to do something else. The reason we have free software is because people did not accept the technology that was given to them. And I think that's the spirit.

Music transition

CINDY COHN: Thank you so much, Isa, for coming, and sharing your stories with us and your hope. Um, what a, what a hopeful conversation this was.

ISABELA FERNANDES: Thank you so much, Cindy and Jason. It was great to be here. Thank you.

JASON KELLEY: Well, now I know how Tor works, which is great because I've been trying to figure that out for years. Um, three steps; I understand why there are three. This makes a lot more sense to me. And I'm honestly just a lot more hopeful than I was, which is always nice. It doesn't happen every time, but I feel like she's describing a future that not only she and the Tor folks are helping to build, but that other people can be a part of too, which is great.

CINDY COHN: Yeah, I think sometimes people envision privacy tools as the domain of people who are dark and worried and, and wanting to be self-protective all the time. And what was so refreshing about this, and refreshing about the way Isa and Tor operate in the world, is they're working with some pretty serious issues for people, but they're hopeful, they're building a future, they're very positive, and they have a vision of what the world looks like if we build privacy and security into everything. And, and in some ways it was a really light interview about something that protects people from very dangerous situations.

JASON KELLEY: Yeah. Yeah. And she talked a lot about, you know, what got her into free software. For her, it was kind of the necessity of having to write code on paper and not being able to buy software.
But I think we're coming to have that same necessity again, for some people, for a lot of different reasons: you know, the software is bloated, it's enshittified, as Cory would say. Um, it's, you know, often monopolized in some way. And not that these are good things, but if it gets people back to the point she made, where you realize that you can build things yourself, that you don't have to accept the software and the tech that you're given, that you can make your own and edit it and things like that, I think that would be a great outcome. And it sounds like that's already happening.

CINDY COHN: I think the other pieces were just, you know, really emphasizing community, the need for community and how important community is, both in terms of entry into this, but also in the supporting and maintaining and developing of things, and in how people use Tor, right? You know, the Tor network operates because of nodes all over that volunteer to carry other people's traffic. EFF has done a Tor challenge a few times where we've tried to get more people to run nodes, whether they're in the middle or at the end. But that community is kind of infused in the way Tor works, and it's infused in the vision that she has for a better future too. And that's just so consistent with what we've heard from people over and over again about how we fix the internet.

JASON KELLEY: And that’s our episode for today – thanks so much for joining us.
If you have feedback or suggestions, we'd love to hear from you. Visit EFF dot org slash podcast and click on listener feedback. While you're there, you can become a member, donate, maybe even pick up some merch and just see what's happening in digital rights this week and every week.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis
And How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
We’ll see you next time.
I’m Jason Kelley…

CINDY COHN: And I’m Cindy Cohn.

MUSIC CREDITS: This podcast is licensed creative commons attribution 4.0 international, and includes the following music that is licensed creative commons attribution 3.0 unported by its creators: Recreation by Airtone. Additional beds and alternate theme remixes by Gaetan Harris.

San Diegans Push Back on Flock ALPR Surveillance

Tue, 06/03/2025 - 4:13pm

As San Diego approaches the first annual review of the city's controversial Flock Safety contract, a local coalition is calling on the city council to roll back this dangerous and costly automated license plate reader (ALPR) program.

The TRUST Coalition—a grassroots alliance including Electronic Frontier Alliance members Tech Workers Coalition San Diego and techLEAD—has rallied to stop the unchecked spread of ALPRs in San Diego. We’ve previously covered the coalition’s fight for surveillance oversight, a local effort kicked off by a “smart streetlight” surveillance program five years ago. 

In 2024, San Diego installed hundreds of AI-assisted ALPR cameras throughout the city to document which cars are driving where and when, and to make that data accessible for 30 days.

ALPRs like Flock’s don’t prevent crime—they just vacuum up data on everyone who drives past. The resulting error-prone dragnet can then chill speech and be weaponized against marginalized groups, like immigrants and those seeking trans or reproductive healthcare.

Despite local and state restrictions barring the sharing of ALPR data with federal and out-of-state agencies, San Diego Police have reportedly disclosed license plate data to federal agencies—including Homeland Security Investigations and Customs and Border Protection.

Also, despite a local ordinance requiring city council approval before deployment of surveillance technology, San Diego police have reportedly deployed ALPRs and smart streetlights at Comic-Con and Pride without the required approval.

The local coalition is not alone in these concerns. The San Diego Privacy Board recently recommended that the city reject the Surveillance Use Policy for this technology. All of this cost the community over $3.5 million last year alone. That is why the TRUST Coalition is calling on the city to reject this oppressive surveillance system and, instead, invest in other essential services that improve day-to-day life for residents.

San Diegans who want to push back can get involved by signing the TRUST Coalition’s petition, following the campaign online, and contacting their council members to demand the city end its contract with Flock and start respecting the privacy rights of everyone who lives, works, or travels through their community.

Hell No: The ODNI Wants to Make it Easier for the Government to Buy Your Data Without a Warrant

Tue, 06/03/2025 - 3:51pm

New reporting has revealed that the Office of the Director of National Intelligence (ODNI) is attempting to create the Intelligence Community’s Data Consortium–a centralized online marketplace where law enforcement and spy agencies can peruse and buy very personal digital data about you collected by data brokers. Not only is this a massive escalation of the deeply unjust data broker loophole: it’s also another repulsive signal that your privacy means nothing to the intelligence community.

Imagine a mall where every store is run by data brokers whose goods include your information that has been collected by smartphone applications. Depending on your permissions and what applications are on your phone, this could include contacts, behavioral data, financial information, and even your constant geolocation. Now imagine that the only customers in this mall are federal law enforcement officers and intelligence agents who should be going to a judge, presenting their evidence, and hoping the judge grants a warrant for this information. But now, they don’t need evidence or to justify the reason why they need your data. Now they just need taxpayer money, and this newly centralized digital marketplace provides the buying opportunities.

This is what the Office of the Director of National Intelligence wants to build according to recently released contract documents.

Across the country, states are trying desperately to close the loophole that allows the government to buy private data it would otherwise need a warrant to get. Montana just became the first state to make it illegal for police to purchase data, like geolocation data harvested by apps on smartphones. At the federal level, EFF has endorsed Senator Ron Wyden’s Fourth Amendment is Not for Sale Act, which closes this data broker loophole. The bill passed the House last year but stalled in the Senate.

And yet, the federal government is doubling down on this very obviously unjust and unpopular policy.

An ODNI that wanted to minimize harms to civil liberties would be pursuing the opposite tack. It should not be looking for ways to formalize and institutionalize surveillance loopholes. That is why we not only call on the ODNI to reverse course and scrap the Intelligence Community’s Data Consortium; we also call on lawmakers to finish what they started and pass the Fourth Amendment is Not for Sale Act, closing the data broker loophole at the federal level once and for all. We urge all of our supporters to do the same and help us keep the government accountable.

The Right to Repair Is Law in Washington State

Tue, 06/03/2025 - 12:49pm

Thanks in part to your support, the right to repair is now law in Washington.

Gov. Bob Ferguson signed two bills guaranteeing Washingtonians' right to access tools, parts, and information so they can fix personal electronics, appliances, and wheelchairs. This is the epitome of common-sense legislation. When you own something, you should have the final say about who fixes, adapts, or modifies it—and how.

When you own something, you should have the final say about who fixes, adapts, or modifies it—and how.

Advocates in Washington have worked for years to pass a strong right-to-repair law in the state. In addition to Washington’s Public Interest Research Group, the consumer electronics bill moved forward with a growing group of supporting organizations, including environmental advocates, consumer advocates, and manufacturers such as Google and Microsoft. Meanwhile, advocates from groups including Disability Rights Washington and the Here and Now Project made the case for wheelchairs’ inclusion in the right-to-repair bill, bringing their personal stories to Olympia to show why this bill was so important.

And it’s not just states that recognize the need for people to be able to fix their own stuff.  Earlier this month, U.S. Army Secretary Dan Driscoll issued a memo stating that the Army should “[identify] and propose contract modifications for right to repair provisions where intellectual property constraints limit the Army's ability to conduct maintenance and access the appropriate maintenance tools, software, and technical data – while preserving the intellectual capital of American industry.” The memo said that the Army should seek this in future procurement contracts and also to amend existing contracts to include the right to repair.

This is a bedrock of sound procurement with a long history in America. President Lincoln only bought rifles with standardized tooling to outfit the Union Army, for the obvious reason that it would be a little embarrassing for the Commander in Chief to have to pull his troops off the field because the Army’s sole supplier had decided not to ship this week’s delivery of ammo and parts. Somehow, the Department of Defense forgot this lesson over the ensuing centuries, so that today, billions of dollars in public money are spent on materiel and systems that the US military can only maintain by buying service from a “beltway bandit.”

This recognizes what millions of people have said repeatedly: limiting people’s ability to fix their own stuff stands in the way of needed repairs and maintenance. That’s true whether you’re a farmer with a broken tractor during harvest, a homeowner with a misbehaving washing machine or a cracked smartphone screen, a hospital med-tech trying to fix a ventilator, or a soldier struggling with a broken generator.

The right to repair is gaining serious momentum. All 50 states have now considered some form of right-to-repair legislation. Washington is the eighth state to pass one of these bills into law—let’s keep it up.

The Federal Government Demands Data from SNAP—But Says Nothing About Protecting It

Tue, 06/03/2025 - 12:42pm

Last month, the U.S. Department of Agriculture issued a troubling order to all state agency directors of Supplemental Nutrition Assistance Programs (SNAP): hand over your data.

This is part of a larger effort by the Trump administration to gain “unfettered access to comprehensive data from all state programs that receive federal funding,” through Executive Order 14243. While the order says this data sharing is intended to cut down on fraud, it is written so broadly that it could authorize almost any data sharing. Such an effort flies in the face of well-established data privacy practices and places people at considerable risk. 

A group of SNAP recipients and organizations has thankfully sued to try to block the data sharing authorized by the Executive Order. And the state of New Mexico has even refused to comply with the order, “due to questions and concerns regarding the legality of USDA’s demand for the information,” according to Source NM.

The federal government has said very little about how they will use this information. Several populations targeted by the Trump Administration are eligible to be on the SNAP program, including asylum seekers, refugees, and victims of trafficking. Additionally, although undocumented immigrants are not eligible for SNAP benefits, their household members who are U.S. citizens or have other eligible immigration statuses may be—raising the distinct concern that SNAP information could be shared with immigration or other enforcement authorities.

We all deserve privacy rights. Accessing public benefits to feed yourself shouldn't require you to give those up.

EFF has long advocated for privacy policies that ensure that information provided in one context is not used for other reasons. People who hand over their personal information should do so freely and with full information about how it will be used, whether they’re seeking services from the government or a company.

It's particularly important to respect privacy in government programs like SNAP that provide essential support services to vulnerable populations. SNAP supports people who need assistance buying food—arguably the most basic need. Often, fear of reprisal and inappropriate government data sharing, such as sharing the immigration status of household members not receiving benefits, prevents eligible people from enrolling in food assistance despite need. Discouraging eligible people from enrolling in SNAP benefits is counterproductive to the goals of the program, which aims to reduce food insecurity, improve health outcomes, and benefit local economies.

This is just the latest government data-sharing effort that raises alarm bells for digital rights. No one should worry that asking their government for help with hunger will get them in trouble. The USDA must promise it will not weaponize programs that put food on the table during times of need. 

The PERA and PREVAIL Acts Would Make Bad Patents Easier to Get—and Harder to Fight

Tue, 06/03/2025 - 11:23am

Two dangerous bills have been reintroduced in Congress that would reverse over a decade of progress in fighting patent trolls and making the patent system more balanced. The Patent Eligibility Restoration Act (PERA) and the PREVAIL Act would each cause significant harm on their own. Together, they form a one-two punch—making it easier to obtain vague and overly broad patents, while making it harder for the public to challenge them.

These bills don’t just share bad ideas—they share sponsors, a coordinated rollout, and backing from many of the same lobbying groups. Congress should reject both.

TAKE ACTION

Tell Congress: Don't Bring Back The Worst Patents

PERA Would Legalize Patents on Basic Software—and Human Genes

PERA would overturn long-standing court decisions that have helped keep some of the worst patents out of the system. This includes the Supreme Court’s Alice v. CLS Bank decision, which bars patents on abstract ideas, and Myriad v. AMP, which correctly ruled that naturally occurring human genes cannot be patented.

Thanks to the Alice decision, courts have invalidated a rogue’s gallery of terrible software patents—such as patents on online photo contests, online bingo, upselling, matchmaking, and scavenger hunts. These patents didn’t describe real inventions—they merely applied old ideas to general-purpose computers.

PERA would wipe out the Alice framework and replace it with vague, hollow exceptions. For example: it would ban patents on “dance moves” and “marriage proposals,” but would allow nearly anything involving a computer or machine—even if it only mentions the use of a computer. This is the same language used in many bad software patents that patent trolls have wielded for years. If PERA passes, patent claims that are currently seen as weak will become much harder to challenge.

Adding to that, PERA would bring back patents on human genes—exactly what was at stake in the Myriad case. EFF joined that fight, alongside scientists and patients, to prevent patents that interfered with essential diagnostic testing. Congress should not undo that victory. Some things just shouldn’t be patented. 

PERA’s requirement that living genes can constitute an invention if they are “isolated” is meaningless; every gene used in science is “isolated” from the human body. This legal wordplay was used to justify human gene patents for decades, and it’s deeply troubling that some U.S. Senators are on board with bringing them back. 

PREVAIL Weakens the Public’s Best Defense Against Patent Abuse

While PERA makes it easier to obtain a bad patent, the PREVAIL Act makes it harder to get rid of one.

PREVAIL would severely limit inter partes review (IPR), the most effective process for challenging wrongly granted patents. This faster, more affordable process—administered by the U.S. Patent and Trademark Office—has knocked out thousands of invalid patents that should never have been issued.

EFF has used IPR to protect the public. In 2013, we challenged and invalidated a patent on podcasting, which was being used to threaten creators across the internet. Thousands of our supporters chipped in to help us bring that case. Under PREVAIL, that challenge wouldn’t have been allowed. The bill would significantly limit IPR petitions unless you’ve been directly sued or threatened—a major blow to nonprofits, open source advocates, and membership-based defense groups that act in the public interest. 

PREVAIL doesn’t stop at limiting who can file an IPR. It also undermines the fairness of the IPR process itself. It raises the burden of proof, requiring challengers to overcome a presumption that the patent is valid—even when the Patent Office is the one reviewing it. The bill forces an unfair choice: anyone who challenges a patent at the Patent Office would have to give up the right to fight the same patent in court, even though key legal arguments (such as those involving abstract subject matter) can only be made in court.

It gets worse. PREVAIL makes it easier for patent owners to rewrite their claims during review, taking advantage of hindsight about what’s likely to hold up. And if multiple parties want to challenge the same patent, only the first to file may get heard. This means that patents used to threaten dozens or even hundreds of targets could get extra protection, just because one early challenger didn’t bring the best arguments.

These changes aren’t about improving the system. They’re about making it easier for a small number of patent owners to extract settlements, and harder for the public to push back.

A Step Backward, Not Forward

Supporters of these bills claim they’re trying to restore balance to the patent system. But that’s not what PERA and PREVAIL do. They don’t fix what’s broken—they break what’s working.

Patent trolling is still a severe problem. In 2024, patent trolls filed a stunning 88% of all patent lawsuits in the tech sector.

At the same time, patent law has come a long way over the past decade. Courts can now reject abstract software patents earlier and more easily. The IPR process has become a vital tool for holding the Patent Office accountable and protecting real innovators. And the Myriad decision has helped keep essential parts of human biology in the public domain.

PERA and PREVAIL would undo all of that.

These bills have support from a variety of industry groups, including those representing biotech firms, university tech transfer offices, and some tech companies that rely on aggressive patent licensing. While those voices deserve to be heard, the public deserves better than legislation that makes it easier to secure a 20-year monopoly on an idea, and harder for anyone else to challenge it.

Instead of PERA and PREVAIL, Congress should focus on helping developers, creators, and small businesses that rely on technology—not those who exploit it through bad patents.

Some of that legislation is already written. Congress should consider making end-users immune from patent threats, closing loopholes that allow certain patent-holders to avoid having their patents reviewed, and adding transparency requirements so that people accused of patent infringement can at least figure out who’s making the allegations. 

But right now, EFF is fighting back, and we need your help. These bills may be dressed up as reform, but we’ve seen them before—and we know the damage they’d do.

TAKE ACTION

Tell Congress: Reject PERA and PREVAIL

The Defense Attorney’s Arsenal In Challenging Electronic Monitoring

Mon, 06/02/2025 - 4:32pm

In criminal prosecutions, electronic monitoring (EM) is pitched as a “humane alternative” to incarceration – but it is not. The latest generation of “e-carceration” tools are burdensome, harsh, and often just as punitive as imprisonment. Fortunately, criminal defense attorneys have options when shielding their clients from this overused and harmful tech.

Framed as a tool that enhances public safety while reducing jail populations, EM is increasingly used as a condition of pretrial release, probation, parole, or even civil detention. However, this technology imposes serious infringements on liberty, privacy, and due process for not only those placed on it but also for people they come into contact with. It can transform homes into digital jails, inadvertently surveil others, impose financial burdens, and punish every misstep—no matter how minor or understandable.

Even though EM may appear less severe than incarceration, research and litigation reveal that these devices often function as a form of detention in all but name. Monitored individuals must often remain at home for long periods, request permission to leave for basic needs, and comply with curfews or “exclusion zones.” Violations, even technical ones—such as a battery running low or a dropped GPS signal—can result in arrest and incarceration. Being able to take care of oneself and reintegrate into the world becomes a minefield of compliance and red tape. The psychological burden, social stigma, and physical discomfort associated with EM are significant, particularly for vulnerable populations.   

For many, EM still evokes bulky wrist or ankle “shackles” that can monitor a subject’s location, and sometimes even their blood alcohol levels. These devices have matured with digital technology, however, and EM is now increasingly imposed through more sophisticated means, such as smartwatch or mobile phone applications. Newer iterations of EM have also followed a trajectory of collecting much more data, including biometrics and more precise location information.

This issue is more pressing than ever, as the 2020 COVID pandemic led to an explosion in EM adoption. As incarceration and detention facilities became superspreader zones, judges kept some offenders out of these facilities by expanding the use of EM; so much so that some jurisdictions ran out of classic EM devices like ankle bracelets.

Today the number of people placed on EM in the criminal system continues to skyrocket. Fighting the spread of EM requires many tactics, but on the front lines are the criminal defense attorneys challenging EM impositions. This post will focus on the main issues for defense attorneys to consider while arguing against the imposition of this technology.

PRETRIAL ELECTRONIC MONITORING

We’ve seen challenges to EM programs in a variety of ways, including attacking the constitutionality of the program as a whole and arguing against pretrial and/or post-conviction imposition. However, it is likely that the most successful challenges will come from individualized challenges to pretrial EM.

First, courts have not been receptive to arguments that entire EM programs are unconstitutional. For example, in Simon v. San Francisco et al., 135 F.4th 784 (9th Cir. 2025), the Ninth Circuit held that although San Francisco’s EM program constituted a Fourth Amendment search, a warrant was not required. The court explained its decision by noting that the program was a condition of pretrial release, included the sharing of location data, and was consented to by the individual (with counsel present) by signing a form that essentially operated as a contract. This decision exemplifies the court’s failure to grasp the coercive nature of this type of “consent,” which is pervasive in the criminal legal system.

Second, pretrial defendants have more robust rights than they do after conviction. While a person’s expectation of privacy may be slightly diminished following arrest but before trial, the Fourth Amendment is not entirely out of the picture. Their “privacy and liberty interests” are, for instance, “far greater” than a person who has been convicted and is on probation or parole. United States v. Scott, 450 F.3d 863, 873 (9th Cir. 2006). Although individuals continue to retain Fourth Amendment rights after conviction, the reasonableness analysis will be heavily weighted towards the state as the defendant is no longer presumed innocent. However, even people on probation have a “substantial” privacy interest. United States v. Lara, 815 F.3d 605, 610 (9th Cir. 2016). 

THE FOURTH AMENDMENT

The first foundational constitutional rights threatened by the sheer invasiveness of EM are those protected by the Fourth Amendment. This concern is only heightened as the technology improves and collects increasingly detailed information. Unlike traditional probation or parole supervision, EM often tracks individuals with no geographic limitations or oversight, and can automatically record more than just approximate location information.

Courts have increasingly recognized that this new technology poses greater and more novel threats to our privacy than earlier generations. In Grady v. North Carolina, 575 U.S. 306 (2015), the Supreme Court, relying on United States v. Jones, 565 U.S. 400 (2012), held that attaching a GPS tracking device to a person—even a convicted sex offender—constitutes a Fourth Amendment search and is thus subject to the inquiry of reasonableness. A few years later, the monumental decision in Carpenter v. United States, 138 S. Ct. 2206 (2018), firmly established that Fourth Amendment analysis is affected by the advancement of technology, holding that long-term cell-site location tracking by law enforcement constituted a search requiring a warrant.

As criminal defense attorneys are well aware, the Fourth Amendment’s ostensibly powerful protections are often less effective in practice. Nevertheless, this line of cases still forms a strong foundation for arguing that EM should be subjected to exacting Fourth Amendment scrutiny.

DUE PROCESS

Three key procedural due process challenges that defense attorneys can raise under the Fifth and Fourteenth Amendments are: inadequate hearing, lack of individualized assessment, and failure to consider ability to pay.

First, many courts impose EM without adequate consideration of individual circumstances or less restrictive alternatives. Defense attorneys should demand evidentiary hearings where the government must prove that monitoring is necessary and narrowly tailored. If the defendant is not given notice, a hearing, or the opportunity to object, that could arguably constitute a violation of due process. For example, in the previously mentioned Simon v. San Francisco, the Ninth Circuit found that individuals who were not informed of the details of the city’s pretrial EM program in the presence of counsel had their rights violated.

Second, imposition of EM should be based on an individualized assessment rather than a blanket rule. For pretrial defendants, EM is frequently used as a condition of bail. Although under both federal and state bail frameworks, courts are generally required to impose the least restrictive conditions necessary to ensure the defendant’s court appearance and protect the community, many jurisdictions have included EM as a default condition rather than individually assessing whether EM is appropriate. The Bail Reform Act of 1984, for instance, mandates that release conditions be tailored to the individual’s circumstances. Yet in practice, many jurisdictions impose EM categorically, without specific findings or consideration of alternatives. Defense counsel should challenge this practice by insisting that judges articulate on the record why EM is necessary, supported by evidence related to flight risk or danger. Where clients have stable housing, employment, and no history of noncompliance, EM may be more restrictive than justified.

Lastly, financial burdens associated with EM may also implicate due process where a failure to pay can result in violations and incarceration. In Bearden v. Georgia, 461 U.S. 660 (1983), the Supreme Court held that courts cannot revoke probation for failure to pay fines or restitution without first determining whether the failure was willful. Relying on Bearden, defense attorneys can argue that EM fees imposed on indigent clients amount to unconstitutional punishment for poverty. A growing number of lower courts have agreed, particularly where clients were not given the opportunity to contest their ability to pay. Defense attorneys should request fee waivers, present evidence of indigence, and challenge any EM orders that functionally condition liberty on wealth.

STATE LAW PROTECTIONS

State constitutions and statutes often provide stronger protections than federal constitutional minimums. In addition to state corollaries to the Fourth and Fifth Amendment, some states have also enacted statutes to govern pretrial release and conditions. A number of states have established a presumption in favor of release on recognizance or personal recognizance bonds. In those jurisdictions, the state has to overcome this presumption before the court can impose restrictive conditions like EM. Some states require courts to impose the least restrictive conditions necessary to achieve legitimate purposes, making EM appropriate only when less restrictive alternatives are inadequate.

Most pretrial statutes list specific factors courts must consider, such as community ties, employment history, family responsibilities, nature of the offense, criminal history, and risk of flight or danger to community. Courts that fail to adequately consider these factors or impose generic monitoring conditions may violate statutory requirements.

For example, Illinois's SAFE-T Act includes specific protections against overly restrictive EM conditions, but implementation has been inconsistent. Defense attorneys in Illinois and states with similar laws should challenge monitoring conditions that violate specific statutory requirements.

TECHNOLOGICAL ISSUES

Attorneys should also consider the reliability of EM technology. Devices frequently produce false violations and alerts, particularly in urban areas or buildings where GPS signals are weak. Misleading data can lead to violation hearings and even incarceration. Attorneys should demand access to raw location data, vendor records, and maintenance logs. Expert testimony can help demonstrate technological flaws, human error, or system limitations that cast doubt on the validity of alleged violations.

In some jurisdictions, EM programs are operated by private companies under contracts with probation departments, courts, or sheriffs. These companies profit from fees paid by clients and have minimal oversight. Attorneys should request copies of contracts, training manuals, and policies governing EM use. Discovery may reveal financial incentives, lack of accountability, or systemic issues such as racial or geographic disparities in monitoring. These findings can support broader litigation or class actions, particularly where indigent individuals are jailed for failing to pay private vendors.

Recent research provides compelling evidence that EM fails to achieve its stated purposes while creating significant harms. Studies have not found significant relationships between EM of individuals on pretrial release and their court appearance rates or likelihood of arrest. Nor do they show that law enforcement is employing EM on individuals they would otherwise put in jail.

To the contrary, studies indicate that law enforcement is using EM to surveil and constrain the liberty of those who wouldn't otherwise be detained, as the rise in the number of people placed on EM has not coincided with a decrease in detention. This research demonstrates that EM represents an expansion of government control rather than a true alternative to detention.

Additionally, EM devices may be rife with technical issues, as described above, including communication system failures that prevent proper monitoring and device malfunctions that cause electric shocks. Cutting off ankle bracelets is a common occurrence among users, especially when the technology is malfunctioning or hurting them. Defense attorneys should document all technical issues and argue that unreliable technology cannot form the basis for liberty restrictions or additional criminal charges.

CREATING A RECORD FOR APPEAL

Attorneys should always make sure they are creating a record on which the imposition of EM can be appealed, should the initial hearing be unsuccessful. This requires lawyers to include the factual basis for the challenge and preserve the appropriate legal arguments. The modern generation of EM has yet to undergo the extensive judicial review that ankle shackles have been subjected to, so it is essential to build an extensive record of the ways in which it is more invasive and harmful. Only then can an appellate court be persuaded that the nature of the newest EM requires more than a perfunctory application of decades-old precedent. As we saw with Carpenter, the rapid advancement of technology may push the courts to reconsider older paradigms for constitutional analysis and find them wanting. A comprehensive record is thus critical to show EM as it is—an extension of incarceration—rather than a benevolent alternative to detention.

Defeating electronic monitoring will require a multidimensional approach that includes litigating constitutional claims, contesting factual assumptions, exposing technological failures, and advocating for systemic reforms. As the carceral state evolves, attorneys must remain vigilant and proactive in defending the rights of their clients.

The EU’s “Encryption Roadmap” Makes Everyone Less Safe

Mon, 06/02/2025 - 4:15pm

EFF has joined more than 80 civil society organizations, companies, and cybersecurity experts in signing a letter urging the European Commission to change course on its recently announced “Technology Roadmap on Encryption.” The roadmap, part of the EU’s ProtectEU strategy, discusses new ways for law enforcement to access encrypted data. That framing is dangerously flawed. 

Let’s be clear: there is no technical “lawful access” to end-to-end encrypted messages that preserves security and privacy. Any attempt to circumvent encryption—like client-side scanning—creates new vulnerabilities, threatening the very people governments claim to protect.

This letter is significant not just for its content, but for who signed it. The breadth of the coalition makes one thing clear: civil society and the global technical community overwhelmingly reject the idea that weakening encryption can coexist with respect for fundamental rights.

Strong encryption is a pillar of cybersecurity, protecting everyone: activists, journalists, everyday web users, and critical infrastructure. Undermining it doesn’t just hurt privacy. It makes everyone’s data more vulnerable and weakens the EU’s ability to defend against cybersecurity threats.

EU officials should scrap any roadmap focused on circumvention and instead invest in stronger, more widespread use of end-to-end encryption. Security and human rights aren’t in conflict. They depend on each other.

You can read the full letter here.

245 Days Without Justice: Laila Soueif’s Hunger Strike and the Fight to Free Alaa Abd el-Fattah

Mon, 06/02/2025 - 3:14pm

Laila Soueif has now been on hunger strike for 245 days. On Thursday night, she was taken to the hospital once again. Soueif’s hunger strike is a powerful act of protest against the failures of two governments. The Egyptian government continues to deny basic justice by keeping her son, Alaa Abd el-Fattah, behind bars—his only “crime” was sharing a Facebook post about the torture of a fellow detainee. Meanwhile, the British government, despite Alaa’s citizenship, has failed to secure even a single consular visit. Its muted response reflects an unacceptable unwillingness to stand up for the rights of its own citizens.

This is the second time this year that Soueif’s health has collapsed due to her hunger strike. Now, her condition is dire. Her blood sugar is dangerously low, and every day, her family fears it could be her last. Doctors say it’s a miracle she’s still alive.

Her protest is a call for accountability—a demand that both governments uphold the rule of law and protect human rights, not only in rhetoric, but through action.

Late last week, after an 18-month investigation, the United Nations Working Group on Arbitrary Detention (UNWGAD) issued its Opinion on Abd el-Fattah’s case, stating that he is being held unlawfully by the Egyptian government. Egypt’s refusal to provide the United Kingdom with consular access to its citizen further violates Egypt’s obligations under international law.

As stated in a letter to British Prime Minister Keir Starmer by 21 organizations, including EFF, the UK must now use every tool it has at its disposal to ensure that Alaa Abd el-Fattah is released immediately.

CCTV Cambridge: Digital Equity in 2025

Fri, 05/30/2025 - 4:30pm

EFF has long advocated for affordable, accessible, and future-proof internet access for all. Digital equity, the condition in which everyone has access to technology that allows them to participate in society, is an issue that I’ve been proud to organize around. So, it’s awesome to connect with a group that's doing something to address it in their community.

Recently I got the chance to catch up with Maritza Grooms, Director of Community Relations at EFA member CCTV Cambridge, who told me about the results of their work and the impact it's having on their local community.

How’s your digital inclusion work going and what's been the results within the community?

CCTV has had a year of transition and change. One of the biggest was the establishment of the Digital Navigator Pilot Program in collaboration with multiple partners, funded in part by the MassHire Metro North Workforce Investment Board through the Mass Broadband Institute. This program has already had a great impact in Cambridge since its official launch in August 2024, serving 492 community members! This program demonstrates the clear need for digital navigator services in Cambridge and beyond. Our community has used this service to get devices that have allowed them to restart their career journeys or go back to school, and to take digital literacy classes to gain new skills to help them along the way.

The Electronic Frontier Alliance works to uphold the principles of free expression, information security, privacy, creativity, and access to knowledge. What guides your organization and how does digital equity tie into it?

CCTV's mission is to nurture a strong, equitable, and diverse community by providing tools and training to foster free speech, civic engagement, access to knowledge, and creative expression. The Digital Navigator program fulfills this mission not only for the community we serve, but in the ripple effects that generate from our community members having the tools to participate in our society. The Digital Navigator Pilot Program aims to bridge the digital divide in Cambridge, specifically supporting BIPOC, immigrant, and low-income communities to enhance economic mobility.

How can people support and plug-in to what you’re doing?

We cannot do this alone. It takes a village, from partners in the work like our friends at EFF, to supporters alike. We encourage anyone to reach out to maritza@cctvcambridge.org to find out how you can support this program, or visit cctvcambridge.org/support to donate today. Follow us on social media @cctvcambridge!

Thanks again to Maritza for speaking with us. If you're inspired by CCTV Cambridge's work, consider joining a local EFA ally, or bringing your own group into the alliance today!

She Got an Abortion. So A Texas Cop Used 83,000 Cameras to Track Her Down.

Fri, 05/30/2025 - 2:36pm

In a chilling sign of how far law enforcement surveillance has encroached on personal liberties, 404 Media recently revealed that a sheriff’s office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. The officer searched 6,809 different camera networks maintained by surveillance tech company Flock Safety, including states where abortion access is protected by law, such as Washington and Illinois. The search record listed the reason plainly: “had an abortion, search for female.”

[Screenshot of the audit log entry showing the search reason]

After the U.S. Supreme Court’s 2022 Dobbs v. Jackson Women’s Health Organization decision overturned Roe v. Wade, states were given sweeping authority to ban and even criminalize abortion. In Texas—where the officer who conducted this search is based—abortion is now almost entirely banned. But in Washington and Illinois, where many of the searched Flock cameras are located, abortion remains legal and protected as a fundamental right up to fetal viability.

The post-Dobbs legal landscape has also opened the door for law enforcement to exploit virtually any form of data—license plates, phone records, geolocation data—to pursue individuals across state lines. EFF’s Atlas of Surveillance has documented more than 1,800 agencies that have deployed ALPRs, but at least 4,000 agencies are able to run searches through other agencies in Flock's network. Many agencies share the data freely with other agencies across the country, with little oversight, restriction, or even standards for accessing data.

While this particular data point explicitly mentioned an abortion, scores of others in the audit logs released through public records requests simply list "investigation" as the reason for the plate search, with no indication of the alleged offense. That means other searches targeting someone for abortion, or another protected right in that jurisdiction, could be effectively invisible.

This case underscores our growing concern: that the mass surveillance infrastructure—originally sold as a tool to find stolen cars or missing persons—is now being used to target people seeking reproductive healthcare. This unchecked, warrantless access that allows law enforcement to surveil across state lines blurs the line between “protection” and persecution.

From Missing Cars to Monitoring Bodies

EFF has long warned about the dangers of ALPRs, which scan license plates, log time and location data, and build a detailed picture of people's movements. Companies like Flock Safety and Motorola Solutions offer law enforcement agencies access to nationwide databases of these readers, and in some cases, allow them to stake out locations like abortion clinics, or create “hot lists” of license plates to track in real time. Flock's technology also allows officers to search for a vehicle based on attributes like color, make and model, even without a plate number.

The threat is compounded by how investigations often begin. A report published by If/When/How on the criminalization of self-managed abortion found that about a quarter of adult cases (26%) were reported to law enforcement by acquaintances entrusted with information, such as “friends, parents, or intimate partners” and another 18% through “other” means. This means that with ALPR tech, a tip from anyone can instantly escalate into a nationwide manhunt. And as Kate Bertash of the Digital Defense Fund explained to 404 Media, anti-abortion activists have long been documenting the plates of patients and providers who visit reproductive health facilities—data that can now be easily cross-referenced with ALPR databases.

The 404 Media report proves that this isn’t a hypothetical concern. In 2023, a months-long EFF investigation involving hundreds of public records requests uncovered that many California police departments were sharing records containing detailed driving profiles of local residents with out-of-state agencies, despite state laws explicitly prohibiting this. This means that even in so-called “safe” states, your data might end up helping law enforcement in Texas or Idaho prosecute you—or your doctor. 

That’s why we demanded that 75 California police departments stop sharing ALPR data with anti-abortion states, an effort that has largely been successful.

Surveillance and Reproductive Freedom Cannot Coexist

We’ve said it before, and we’ll say it again: Lawmakers who support reproductive rights must recognize that abortion access and mass surveillance are incompatible. 

The systems built to track stolen cars and issue parking tickets have become tools to enforce the most personal and politically charged laws in the country. What began as a local concern over privacy has escalated into a national civil liberties crisis.

Yesterday’s license plate readers have morphed into today’s reproductive dragnet. Now, it’s time for decisive action. Our leaders must roll back the dangerous surveillance systems they've enabled. We must enact strong, enforceable state laws to limit data sharing, ensure proper oversight, and dismantle these surveillance pipelines before they become the new normal–or even just eliminate the systems altogether.

California’s Cities and Counties Must Step Up Their Privacy Game. A.B. 1337 Can Do That.

Thu, 05/29/2025 - 12:25pm

“The right to privacy is being threatened by the indiscriminate collection, maintenance, and dissemination of personal information and the lack of effective laws and legal remedies,” some astute California lawmakers once wrote. “The increasing use of computers and other sophisticated information technology has greatly magnified the potential risk to individual privacy that can occur from the maintenance of personal information.”

Sound familiar? These words may sound like a recent pushback on programs that want to slurp up the information sitting in ever-swelling government databases. But they’re not. They come from a nearly 50-year-old California law.

The “Information Practices Act of 1977”—or the IPA for short—is a foundational state privacy law and one of several privacy laws directly responding to the Watergate scandal, such as the federal Privacy Act of 1974 and California’s own state constitutional right to privacy.

Now, as we confront a new era of digital surveillance and face our own wave of concern about government demands for data, it's time to revisit and update the IPA.

TAKE ACTION

The IPA puts a check on government use of personal information by establishing guardrails for how state agencies maintain, collect, and disseminate data. It also gives people the right to access and correct their information.

While the need for the law has not changed, the rest of the world has. In particular, since the IPA passed in 1977, far more data collection is now done at the county and city level. Yet local and county government entities in California are subject to no standard privacy protections. And those entities hold troves of data, whether it’s the health data collected from vaccine programs or the records held by county-administered food programs.

As demand for this type of local data grows, we need to tap back into the energy of the ‘70s. It’s time to update the IPA so it can respond to the world we live in today. That’s why EFF is proud to co-sponsor A.B. 1337, authored by Assemblymember Chris Ward (D-San Diego), with our close friends at Oakland Privacy.

Specifically, A.B. 1337, also known as the IPA Reform Act:

  • Expands the definition of covered entities in the IPA to include local agencies, offices, departments and divisions.
  • Prevents information collected from being used for unintended or secondary purposes without consent.
  • Makes harmful negligent and improper release of personal information punishable as a misdemeanor.
  • Requires that IPA disclosure records be kept for three years and cannot be destroyed prior to that period.
  • Aligns the definition of personal information and sensitive personal information with the California Privacy Rights Act to include location data, online browsing records, IP addresses, citizenship status, and genetic information.

Privacy is foundational to trust in government. That’s part of the lesson we learned from the 1970s. (And trust in government is lower today than it was then.)

We need to be confident that the government is respecting our personal information and our privacy. More than ever, California residents face imminent danger of being targeted, persecuted, or prosecuted for seeking reproductive healthcare, for their immigration status, for practicing a particular religion, for their race, gender identity, or sexual orientation—or simply for exercising their First Amendment rights.

California is a national leader on consumer privacy protections, having passed a landmark comprehensive privacy law and established the nation’s first state privacy agency. Now, its local governments must catch up.

We cannot afford to wait for these protections any longer. Passing A.B. 1337 is good governance, good policy, and just good sense. If you’re a California resident, tell your Assemblymember to support the bill today.

TAKE ACTION

The Insidious Effort to Privatize Public Airwaves | EFFector 37.5

Wed, 05/28/2025 - 1:45pm

School is almost out for summer! You know what that means? Plenty of time to catch up on the latest digital rights news! Don't worry, though—EFF has you covered with our EFFector newsletter.

This edition of EFFector explains why efforts to privatize public airwaves would harm American TV viewers; goes over how KOSA is still a very bad censorship bill, especially for young people; and covers how Signal, WhatsApp, and other encrypted chat apps back up your conversations.

You can read the full newsletter here, and even get future editions directly to your inbox when you subscribe! Additionally, we've got an audio edition of EFFector on the Internet Archive, or you can view it by clicking the button below:

LISTEN ON YouTube

EFFECTOR 37.5 - The Insidious Effort to Privatize Public Airwaves

Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression. 

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.

Podcast Episode: Love the Internet Before You Hate On It

Wed, 05/21/2025 - 3:01am

There’s a weird belief out there that tech critics hate technology. But do movie critics hate movies? Do food critics hate food? No! The most effective, insightful critics do what they do because they love something so deeply that they want to see it made even better. The most effective tech critics have had transformative, positive online experiences, and now unflinchingly call out the surveilled, commodified, enshittified landscape that exists today because they know there has been – and still can be – something better.


(You can also find this episode on the Internet Archive and on YouTube.)

That’s what drives Molly White’s work. Her criticism of the cryptocurrency and technology industries stems from her conviction that technology should serve human needs rather than mere profits. Whether it’s blockchain or artificial intelligence, she’s interested in making sure the “next big thing” lives up to its hype, and more importantly, to the ideals of participation and democratization that she experienced. She joins EFF’s Cindy Cohn and Jason Kelley to discuss working toward a human-centered internet that gives everyone a sense of control and interaction – open to all in the way that Wikipedia was (and still is) for her and so many others: not just as a static knowledge resource, but as something in which we can all participate. 

In this episode you’ll learn about:

  • Why blockchain technology has built-in incentives for grift and speculation that overwhelm most of its positive uses
  • How protecting open-source developers from legal overreach, including in the blockchain world, remains critical
  • The vast difference between decentralization of power and decentralization of compute
  • How Neopets and Wikipedia represent core internet values of community, collaboration, and creativity
  • Why Wikipedia has been resilient against some of the rhetorical attacks that have bogged down media outlets, but remains vulnerable to certain economic and political pressures
  • How the Fediverse and other decentralization and interoperability mechanisms provide hope for the kind of creative independence, self-expression, and social interactivity that everyone deserves  

Molly White is a researcher, software engineer, and writer who focuses on the cryptocurrency industry, blockchains, web3, and other tech in her independent publication, Citation Needed. She also runs the websites Web3 is Going Just Great, where she highlights examples of how cryptocurrencies, web3 projects, and the industry surrounding them are failing to live up to their promises, and Follow the Crypto, where she tracks cryptocurrency industry spending in U.S. elections. She has volunteered for more than 15 years with Wikipedia, where she serves as an administrator (under the name GorillaWarfare) and functionary, and previously served three terms on the Arbitration Committee. She’s regularly quoted or bylined in news media, speaks at major conferences including South by Southwest and Web Summit; guest lectures at universities including Harvard, MIT, and Stanford; and advises policymakers and regulators around the world. 

Resources:

What do you think of “How to Fix the Internet?” Share your feedback here.

Transcript

MOLLY WHITE: I was very young when I started editing Wikipedia. I was like 12 years old, and when it said the encyclopedia that anyone can edit, “anyone” means me, and so I just sort of started contributing to articles and quickly discovered that there was this whole world behind Wikipedia that a lot of us really don't see, where very passionate people are contributing to creating a repository of knowledge that anyone can access.
And I thought, I immediately was like, that's brilliant, that's amazing. And you know that motivation has really stuck with me since then, just sort of the belief in open knowledge and open access I think has, you know, it was very early for me to be introduced to those things and I, I sort of stuck with it, because it became, I think, such a formative part of my life.

CINDY COHN: That’s Molly White talking about a moment that is hopefully relatable to lots of folks who think critically about technology – that moment when you first experienced how, sometimes, the internet can feel like magic.
I'm Cindy Cohn, the Executive Director of the Electronic Frontier Foundation.

JASON KELLEY: And I'm Jason Kelley, EFF’s Activism Director. This is our podcast, How to Fix the Internet.

CINDY COHN: The idea behind this show is that we're trying to make our digital lives BETTER. A big part of our job at EFF is to envision the ways things can go wrong online-- and jumping into action to help when things then DO go wrong.
But this show is about optimism, hope and solutions – we want to share visions of what it looks like when we get it right.

JASON KELLEY: Our guest today is Molly White. She’s a journalist and web engineer, and is one of the strongest voices thinking and speaking critically about technology–specifically, she’s been an essential voice on cryptocurrency and what people often call Web3–usually a reference to blockchain technologies. She runs an independent online newsletter called Citation Needed, and at her somewhat sarcastically named website “Web3 is Going Just Great” she chronicles all the latest, often alarming, news, often involving scams and schemes that make those of us working to improve the internet pull our hair out.

CINDY COHN: But she’s not a pessimist. She comes from a deep love of the internet, but is also someone who holds the people that are building our digital world to account, with clear-eyed explanations of where things are going wrong, but also potential that exists if we can do it right. Welcome, Molly. Thanks for being here.

MOLLY WHITE: Thank you for having me.

CINDY COHN: So the theme of our show is what does it look like if we start to get things right in the digital world? Now you recognize, I believe, the value of blockchain technologies, what they could be.
But you bemoan how far we are from that right now. So let's start there. What does the world look like if we start to use the blockchain in a way that really lives up to its potential for making things better online?

MOLLY WHITE: I think that a lot of the early discussions about the potential of the blockchain were very starry-eyed and sort of utopian. Much in the way that early discussions of the internet were that way. You know, they promised that blockchains would somehow democratize everything we do on the internet, you know, make it more available to anyone who wanted to participate.
It would provide financial rails that were more equitable and had fewer rent seekers and intermediaries taking fees along the way. A lot of it was very compelling.
But I think as we've seen the blockchain industry, such as it is now, develop, we've seen that this technology brings with it a set of incentives that are incredibly challenging to grapple with. And it's made me wonder, honestly, whether blockchains can ever live up to the potential that they originally claimed, because those incentives have seemed to be so destructive that much of the time any promises of the technology are completely overshadowed by the negatives.

CINDY COHN: Yeah. So let's talk a little bit about those incentives, 'cause I think about that a lot as well. Where do you see those incentives popping up and what are they?

MOLLY WHITE: Well, any public blockchain has a token associated with it, which is the cryptocurrency that people are trading around, speculating on, you know, purchasing in hopes that the number will go up and they will make a profit. And in order to maintain the blockchain, you know, the actual system of records that is storing information or providing the foundation for some web platform, you need that cryptocurrency token.
But it means that whatever you're trying to do with the blockchain also has this auxiliary factor to it, which is the speculation on the cryptocurrency token.
And so time and time again, watching this industry and following projects, claiming that they will do wonderful, amazing things and use blockchains to accomplish those things, I've seen the goals of the projects get completely warped by the speculation on the token. And often the project's goals become overshadowed by attempts to just pump the price of the token, in often very inauthentic ways or in ways that are completely misaligned with the goals of the project. And that happens over and over and over again in the blockchain world.

JASON KELLEY: Have you seen that not happen with any project? Is there any project that you've said, wow, this is actually going well. It's like a perfect use of this technology, or because you focus on sort of the problems, is that just not something you've come across?

MOLLY WHITE: I think where things work well is when those incentives are perfectly aligned, which is to say that if there are projects that are solely focused on speculation, then the blockchain speculation works very well. Um, you know, and so we see people speculating on Bitcoin, for example, and, and they're not hoping necessarily that the Bitcoin ledger itself will do anything.
The same is true with meme coins. People are speculating on these tokens that have no purpose behind them besides, you know, hoping that the price will go up. And in that case, you know, people sort of know what they're getting into and it can be lucrative for some people. And for the majority of people it's not, but you know, they sort of understand that going into it, or at least you would hope that they do.

CINDY COHN: I think of the blockchain as, you know, when they say this'll go down on your permanent record, this is the permanent record.

MOLLY WHITE: That’s usually a threat.

CINDY COHN: Yeah.

MOLLY WHITE: I try to point that out as well.

CINDY COHN: Now, you know, look, to be clear, we work with people who do international human rights work saving the records before a population gets destroyed in a way that that can't be destroyed by the people in power is, is, is one of the kind of classic things that you want a secure, permanent place to store stuff, um, happens. And so there's, you know, there's that piece. So where do you point people to when you're thinking about like, okay, what if you want a real permanent record, but you don't want all the dreck of the cryptocurrency blockchain world?

MOLLY WHITE: Well, it really depends on the project. And I really try to emphasize that because I think a lot of people in the tech world come at things somewhat backwards, especially when there is a lot of hype around a technology in the way that there was with blockchains and especially Web3.
And we saw a lot of people essentially saying, I wanna do something with a blockchain. Let me go find some problem I can solve using a blockchain, which is completely backwards to how most technologists are used to addressing problems, right? They're faced with a problem. They consider possible ways to solve it, and then they try to identify a technology that is best suited to solving that problem.
And so, you know, I try to encourage people to reverse the thinking back to the normal way of doing things where, sure, you might not get the marketing boosts that Web3 once brought in. And, you know, it certainly was useful to attract investors for a while to have that attached to your project, but you will likely end up with a more sustainable product at the end of the day because you'll have something that works and is using technology that is well suited to the problem. And so, you know, when it comes to where would I direct people other than blockchains, it very much depends on their problem and the problem that they're trying to solve.
For example, if you don't need to worry about having a, a system that is maintained by a group of people who don't trust each other, which is the blockchain’s sort of stated purpose, then there are any number of databases that you can use that work in the more traditional manner where you rely on perhaps a group of trusted participants or a system like that.
If you're looking for a more distributed or decentralized solution, there are peer-to-peer technologies that are not blockchain based that allow this type of content sharing. And so, you know, like I said, it really just depends on the use case more than anything.

JASON KELLEY: Since you brought up decentralization, this is something we talk about a lot at EFF in different contexts, and I think a lot of people saw blockchain and heard decentralized and said, that sounds good.
We want less centralized power. But where do you see things like decentralization actually helping if this kind of Web3 tech isn't a place where it's necessarily useful or where the technology itself doesn't really solve a lot of the problems that people have said it would.

MOLLY WHITE: I think one of the biggest challenges with blockchains and decentralization is that when a lot of people talk about decentralization, they're talking about the decentralization of power, as you've just mentioned, and in the blockchain world, they're often talking about the decentralization of compute, which is not necessarily the same thing, and in some cases is very much different.

JASON KELLEY: If you can do a rug pull, it's not necessarily decentralized. Right?

MOLLY WHITE: Right. Or if you're running a blockchain and you're saying it's decentralized, but you run all of the validators or the miners for that blockchain, then you, you know, the computers may be physically located all over the world, or, you know, decentralized in that sort of sense. But you control all the power and so you do not have a truly decentralized system in that manner of speaking.
And I think a lot of marketing in the crypto world sort of relied on people not considering the difference between those two things, because there are a lot of crypto projects that, you know, use all of the buzzwords around decentralization and democratization and, you know, that type of thing that are very, very centralized, very similar to the traditional tech companies where, you know, all of Facebook's servers may be located physically all around the world, but no one's under the impression that Facebook is a decentralized company, right? And so I think that's really important to remember is that there's nothing about blockchain technology specifically that requires a blockchain project to be decentralized in terms of power.
It still requires very intentional decision making on the parts of the people who are running that project to decentralize the power and reduce the degree to which any one entity can control the network. And so I think that there is this issue where people sort of see blockchains and they think decentralized, and in reality you have to dig a lot deeper.

CINDY COHN: Yeah, EFF has participated in a couple of the sanctions cases and the prosecutions of people who have developed pieces of the blockchain world, especially around mixers. TornadoCash is one that we participated in, and I think this is an area where we have a kind of similar view about the role of the open source community and kind of the average coder and when their responsibility should create liability and when they should be protected from liability.
And we've tried to continue to weigh in on these cases to make sure the courts don't overstep, right? Because the prosecution gets so mad. You're talking about a lot of money laundering and, and things like that, that the, you know, the prosecution just wants to throw the book at everybody who was ever involved in these kinds of things and trying to create this space where, you know, a coder who just participates in a GitHub developing some piece of code doesn't have a liability risk.
And I think you've thought about this as well, and I'm wondering, do you see the government overstepping and do you think it's right to continue to think about that, that overstepping?

MOLLY WHITE: Yeah, I mean, I think it's that those are the types of questions that are really important when it comes to tackling problems around blockchains and cryptocurrencies and the financial systems that are developing around these products.
You have to be really cautious that, you know, just because a bad thing is happening, you don't come in with a hammer that is, you know, much too big and start swinging it around and hitting sort of anyone in the vicinity because, you know, I think there are some things that should absolutely be protected, like, you know, writing software, for example.
A person who writes software should not necessarily be liable for everything that another person then goes and does with that software. And I think that's something that's been fairly well established through, you know, cryptography cases, for example, where people writing encryption algorithms and software to do strong encryption should not be considered liable for whatever anyone encrypts with that technology. We've seen it with virus writers, you know, it would be incredibly challenging for computer scientists to research and sort of think about new viruses and protect against vulnerabilities if they were not allowed to write that software.
But, you know, if they're not going and deploying this virus on the world or using it to, you know, do a ransomware attack, then they probably shouldn't be held liable for it. And so similar questions are coming up in these cryptocurrency cases or these cases around cryptocurrency mixers that are allowing people to anonymize their transactions in the crypto world more adequately.
And certainly that is heavily used in money laundering and in other criminal activities that are using cryptocurrencies. But simply writing the software to perform that anonymization is not itself, I think, a crime. Especially when there are many reasons you might want to anonymize your financial transactions that are otherwise publicly visible to anyone who wishes to see them, and, you know, can be linked to you if you are not cautious about your cryptocurrency addresses or if you publish them yourself.
And so, you know, I've tried to speak out about that a little bit because I think a lot of people see me as, you know, a critic of the cryptocurrency world and the blockchain world, and I think it should be banned or that anyone trading crypto should be put in jail or something like that, which is a very extreme interpretation of my beliefs and is, you know, absolutely not what I believe. I think that, you know, software engineers should be free to write software and then if someone takes that software and commits a crime with it, you know, that is where law enforcement should begin to investigate. Not at the, you know, the software developer's computer.

CINDY COHN: Yeah, I just think that's a really important point. I think it's easy, especially because there's so much fraud and scam and abuse in this space, to try to make sure that we're paying attention to where are we setting the liability rules because even if you don't like cryptocurrency or any of those kinds of things, like protecting anonymity is really important.
It's kind of a function of our times right now where people are either all one or all the other. And I really have appreciated, as you've kind of gone through this, thinking about a position that protects the things that we need to protect, even if we don't care about 'em in this context, because we might in another, and law of course, is kind of made up of things that get set in one context and then applied in another, while at the same time being, you know, kind of no holds barred, critical of the awful stuff that's going on in this world.

JASON KELLEY: Let’s take a quick moment to say thank you to our sponsor.
“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
And now back to our conversation with Molly White

JASON KELLEY: Some of the technologies you're talking about when sort of separated out from, maybe, the hype or the negatives that have like, overtaken the story. Things like peer-to-peer file sharing, cryptography. I mean, even, let's say, being able to send money to someone, you know, with your phone, if you want to call it that, are pretty incredible at some level, you know?
And you gave a talk in October that was about a time that you felt like the web was magic and you brought up a, a website that I'm gonna pretend that I've never heard of, so you can explain it to me, called Neopets. And I just wanna, for the listeners, could you explain a little bit about what Neopets was and sort of how it helped inform you about the way you want the web to work and, and things like that?

MOLLY WHITE: Yeah, so Neopets was a kids game. When I was a kid, you could adopt these little cartoon pets and you could like feed them and change their colors and do things, you know, play little games with them.

JASON KELLEY: Like Tamagotchis a little bit,

MOLLY WHITE: a little bit. Yeah. Yeah. There was also this aspect to the website where you could edit your user page and you could create little webpages in your account that were, it was pretty freewheeling, you know, you could edit the CSS and the HTML and you could make your own little website essentially. And as a kid that was really my first exposure to the idea that the internet and these websites that I was seeing, you know, sort of for the first time were not necessarily a read-only operation. You know, I was used to playing maybe little games on the internet, or whatever kids were doing on the internet at the time.
And Neopets was really my first realization that I could add things to the internet or change the way they looked or interact with it in a way that was, you know, very participatory. And that later sort of turned into editing Wikipedia and then writing software and then publishing my writing on the web.
And that was really magical for me because it sort of informed me about the platform that was in front of me and how powerful it was to be able to, you know, edit something, create something, and then the whole world could see it.

JASON KELLEY: There's a really common critique right now that young people are sort of learning only bad things online or, like, only overusing the internet. And I mean, first of all, I think that's obviously not true. You know, every circumstance is different, but do you see places where, like, the ways you experienced the internet growing up are still happening for young people?

MOLLY WHITE: Yeah, I mean, as you mentioned, I think a lot of those critiques are very misguided, and they miss a lot of the incredibly powerful and positive aspects of the internet. I mean, the fact that you can go look something up and learn something new in half a second is revolutionary. But then I think there are participatory examples, much like what I was experiencing when I was younger. You know, people can still edit Wikipedia the way that I was doing as a kid. That is a very powerful thing to do when you're young, to realize that knowledge is not this thing that is handed down from on high from some faceless expert who wrote history, but it's actually something that people are contributing to and improving constantly. And it can always be updated and improved and edited and shared, you know, in this sort of free and open way. I think that is incredibly powerful and is still open to people of any age who are, you know, able to access the site.

JASON KELLEY: I think it's really important to bring up some of these examples because something I've been thinking about a lot lately as these critiques and attacks on young people using the internet have sort of grown and even, you know, entered the state and congressional level in terms of bills, is that a lot of the people making these critiques, I feel like never liked the internet to begin with. They don't see it as magic in the way that I think you do and that, you know, a lot of our listeners do.
And it's a view that is a problem specifically because I feel like you have to have loved the internet before you can hate it. You know, like, you need to really be able to critique the specific things rather than just sort of throw out the whole thing. And one of the things, you know, I like about the work that you do is that you clearly have this love for technology and for the internet, and that lets you, I think, find the problems. And other people can't see through into those specific individual issues, and so they just wanna toss the whole thing.

MOLLY WHITE: I think that's really true. You know, there is this weird belief, especially around tech critics, that tech critics hate technology. It's so divorced from reality, because you don't see that in other fields, where, you know, art critics are never told that they just hate all art. I think most people understand that art critics love art, and that's why they are critics.
But with technology critics, there's sort of this weird, you know, this perception of us as people who just hate technology, who wanna tear it all down, when in reality, you know, I know a lot of tech critics, and most of us, if not all of us that I can think of, come from a, you know, a background of loving technology, often from a very young age. And it is because of that love, and the desire to see technology continue to allow people to have those transformative experiences, that we criticize it.
And that's, for some reason, just a hard thing, I think for some people to wrap their minds around.

JASON KELLEY: I want to talk a little bit more about Wikipedia. The whole sort of organization, and what it stands for and what it does, has been under attack a lot lately as well. Again, I think a lot of it is people misunderstanding how it works, and, um, you know, maybe finding some realistic critiques of the fact that, you know, it's individually edited, so there's going to be some bias in some places and things like that, and sort of extrapolating out from where they have a good faith argument to this other place. So I'm wondering if you've thought about how to protect Wikipedia, how to talk about it, how, you know, your experience with it has made you understand how it works better than most people do.
And also just generally, you know how it can be used as a model for the way that the internet should be or the way we can build a better internet.

MOLLY WHITE: I think this ties back a little bit to the decentralization topic where Wikipedia is not decentralized in the sense that, you know, there is one company or one nonprofit organization that controls all the servers. And so there is this sort of centralization of power in that sense. But it is very decentralized in the editing world where there is no editorial board that is vetting every edit to the site.
There are, you know, numerous editors that contribute to any one article and no one person has the final say. There are different language versions of Wikipedia that all operate somewhat independently. And because of that, I think it has been challenging for people to attack it successfully.
Certainly there has been no shortage of those attacks. Um, but you know, it's not a company that someone could buy out and take over in ways that we've seen, for example, Elon Musk do with Twitter. There is no sort of editorial board that can be targeted and sort of pressured to change the language on the site. And, you know, I think that has helped to make Wikipedia somewhat resilient in ways that we've seen news organizations or other media publications struggle with recently, where, you know, they have faced pressure from their buyers, the, you know, the people who own those organizations, to be sure.
They've faced, you know, threats from the government in some cases. And Wikipedia is structured somewhat differently, in a way that I think helps it remain more protected from those types of attacks. But, you know, I am cautious to note that there are still vulnerabilities.
The attacks on Wikipedia need to be vociferously opposed. And so we have to be very cautious about this, because the incredible resource that Wikipedia is, is something that doesn't just sort of happen in a vacuum, you know, outside of any individual's actions.
It requires constant support, constant participation, constant editing. And so, you know, it's certainly a model to look to in terms of how communities can organize and contribute to, um, you know, projects on the internet. But it's also something that has to be very carefully maintained.

CINDY COHN: Yeah, I mean, this is just a lesson for our times, right? You know, there isn't a magical tech that can protect against all attacks. And there isn't a magical, you know, nonprofit 501(c)(3) that can be resistant against all the attacks. And we're in a time where they're coming fast and furious against our friends at Wikimedia, along with a lot of other, other things.
And I think the onus is on communities to show up and, you know, not just let these things slide, or think that, you know, the internet might be magic in some ways, but it's not magic in these ways. Like, we have to show up and fight for them. Um, I wanted to ask you a little bit about, um, kind of big tech's embrace of AI.
Um, you've been critical of it. We've been critical of it as well in many ways. And I wanna hear kind of your concerns about it, um, and kind of how you see AI's, you know, role in a better world, but, you know, also think about the ways in which it's not working all that well right now.

MOLLY WHITE: I generally don't have this sort of hard and fast view that AI is good or AI is bad; it really comes down to how that technology is being used. And I think the widespread use of AI in ways that exploit workers and creatives and those who have decided to publish something online, for example, and did not expect that publication to be used by big tech companies that are then profiting off of it, that is incredibly concerning. Um, as well as the ways that AI is marketed to people. Again, this sort of mirrors my criticisms surrounding the crypto industry, but a lot of the marketing around AI is incredibly misleading. Um, you know, they're making promises that are not borne out in reality.
They are selling people a product that will lie to you, you know, that will tell you things that are inaccurate. So I have a lot of concerns around AI, especially as we've seen it being used in the broadest ways, and sort of by the largest companies. But, you know, I also acknowledge that there are ways in which some of this technology has been incredibly useful. And so, you know, it is one of these things where it has to be viewed with nuance, I think, around the ways it's being developed, the ways it's being deployed, the ways it's being marketed.

CINDY COHN: Yeah, there is a kind of eerie familiarity between the hype around AI and the hype around crypto. It's just kind of weird. It feels like we're going through, like, a Groundhog Day, like we're living through another hype cycle that feels like the last. I think, you know, for us at EFF, we've tried to focus a lot on governmental use of AI systems, and AI systems that are trying to predict human behavior, right?
The digital equivalent of phrenology, right? You know, let us do sentiment analysis on the things that you said, and that'll tell us whether you're about to be a criminal or, you know, the right person for the job. I think those are the places that we've really identified, um, as, you know, problematic on a number of levels. You know, number one, it doesn't work nearly as well as…

MOLLY WHITE: That is a major problem!

CINDY COHN: It seems like that ought to be number one, right? And this, you know, especially spending your time in Wikipedia, where you're really working hard to get it right, and you see the kind of back and forth of the conversation. The central thing about Wikipedia is it's trying to actually give you truthful information, and then you're watching the world get washed over with these AI assistants that are really not at all focused on getting it right, you know, or really focused on predicting the next word, or however that works, right? Like, um, it's gotta be kind of strange from where you sit, I suspect, to see this.

MOLLY WHITE: Yeah, it's very frustrating. And, you know, I like to think we lived in a world at one time where people wanted to produce technology that helped people, technology that was accurate, technology that worked in the ways that they said it did. And it's been very weird to watch, especially over the last few years, those goals degrade to, well, maybe it's okay if it gets things wrong a lot, you know, or maybe it's okay if it doesn't work the way that we've said it does, or maybe never possibly can.
That's really frustrating to watch as someone who, again, loves technology and loves the possibilities of technology, to then see people just sort of using it to deliver things that are, you know, making things worse for people in many ways.

CINDY COHN: Yeah, so I wanna flip it around a little bit. You, like EFF, kind of spend a lot of time on all the ways that things are broken. How do you think about how to get to a place where things are not broken, or how do you even just keep focusing on a better place that we could get to?

MOLLY WHITE: Well, like I said, you know, a lot of my criticism really comes down to the industries and the sort of exploitative practices of a lot of these companies in the tech world. And so, to the extent possible, separating myself from those companies and from their control has been really powerful, to sort of regain some of that independence that I once remembered the web enabling, where, you know, if you had your own website, you could kind of do anything you wanted. And you didn't have to stay within the 280 characters if you had an idea, you know, and you could publish, uh, you know, a video that was longer than 10 minutes long, or whatever it might be.
So sort of returning to some of those ideals around creating my own spaces on the web where I have that level of creative freedom, and certainly freedom in other ways, has been very powerful. And then finding communities of people who believe in those same things. I've taken a lot of hope in the Fediverse and the communities that have emerged around those types of technologies and projects where, you know, they're saying maybe there is an alternative out there to, you know, highly centralized big tech, social media being what everyone thinks of as the web. Maybe we could create different spaces outside of that walled garden where we all have control over what we do and say, and who we interact with. And we set the terms on which we interact with people.
And sort of push away the, the belief that, you know, a tech company needs to control an algorithm to show you what it is that you want to see, when in reality, maybe you could make those decisions for yourself, or choose the algorithm, or, you know, design a system for yourself using the technologies that are available to everyone, but have been sort of walled in by one or many of the large players in the web these days.

CINDY COHN: Thank you, Molly. Thank you very much for coming on and, and spending your time with us. We really appreciate the work that you're doing, um, and, and the way that you're able to boil down some pretty complicated situations into, you know, kind of smart and thoughtful ways of reflecting on them. So thank you.

MOLLY WHITE: Yeah. Thank you.

JASON KELLEY: It was really nice to talk to someone who has that enthusiasm for the internet. You know, I think all of our guests probably do, but when we brought up Neopets, that excitement was palpable, and I hope we can find a way to get more of that enthusiasm back.
That's one of the things I'm taking away from that conversation: more people need to be enthusiastic about using the internet, whatever that takes. What did you take away from chatting with Molly that we need to do differently, Cindy?

CINDY COHN: Well, I think that the thing that made the enthusiasm pop in her voice was the idea that she could control things. That she was participating and, and participating not only in Neopets, but the participation on Wikipedia as well, right?
That she could be part of trying to make truth available to people, and recognizing that truth in some ways isn't an endpoint, it's an evolving conversation among people who keep trying to get it right.
That doesn't mean there isn't any truth, but it does mean that there is an open community of people who are working towards that end. And, you know, you hear that enthusiasm as well. It's, you know, the more you put in, the more you get out of the internet, and trying to make that a more common experience of the internet, that things aren't just handed to you or taught to you, but really it's a two-way street. That's where the enthusiasm came from for her, and I suspect for a lot of other people.

JASON KELLEY: Yeah, and what you're saying about truth, I think she sort of applies this in a lot of different ways. Even specific technologies, and I think most people realize this, but you have to say it again and again, aren't necessarily right or wrong for everything. You know, AI isn't right or wrong for every scenario. Things are always evolving, and how we use them is evolving. Whether or not something is correct today doesn't mean it will be correct tomorrow. And there's just a sort of nuance and awareness that she had to how these different things work and when they make sense that I hope we can continue to see in more people, instead of just a sort of flat, across-the-board dislike of, you know, quote unquote the internet, or quote unquote social media, and things like that.

CINDY COHN: Yeah, or the other way around, like whatever it is, there's a hype cycle and it's just hyped over and over again. And that she's really charting a middle ground in the way she writes and talks about these things that I think is really important. I think the other thing I really liked was her framing of decentralization as thinking about decentralizing power, not decentralizing compute, and that difference being something that is often elided or not made clear.
But that can really help us see where, you know, where decentralization is happening in a way that's empowering people, making things better. You have to look for decentralized power, not just decentralized compute. I thought that was a really wise observation.

JASON KELLEY: And I think could be applied to so many other things where a term like decentralized may be used because it's accessible from everywhere or something like that. Right? And it's just, these terms have to be examined. And, and it sort of goes to her point about marketing, you know, you can't necessarily trust the way the newest fad is being described by its purveyors.
You have to really understand what it's doing at the deeper level, and that's the only way you can really determine if it's really decentralized, if it's really interoperable, if it's really, you know, whatever the new thing is.

CINDY COHN: Mm-hmm. Yeah, I think that's right. And, you know, luckily for us, we have Molly, who digs deep into the details of this for so many technologies. And I think we need to, you know, support and defend all the people who are doing that kind of careful work for us, because we can't do all of it, you know, we're humans.
But having people who will do that for us in different places, who are trusted and, you know, whose agenda is clear and understandable, that's kind of the best we can hope for. And the more of that we build and support and create spaces for on the, you know, uncontrolled open web, as opposed to the controlled tech giants and walled gardens, as she said, I think the better off we'll be.

JASON KELLEY: Thanks for joining us for this episode of How to Fix the Internet.
If you have feedback or suggestions, we'd love to hear from you. Visit EFF dot org slash podcast and click on listener feedback. While you're there, you can become a member, donate, maybe even pick up some merch and just see what's happening in digital rights this week and every week.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis.
And How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
We’ll see you next time.
I’m Jason Kelley…

CINDY COHN: And I’m Cindy Cohn.

MUSIC CREDITS: This podcast is licensed Creative Commons attribution 4.0 international and includes the following music licensed Creative Commons 3.0 unported by its creators: Drops of H2O, the filtered water treatment, by J. Lang. Additional beds by Gaetan Harris.