EFF Press Releases


Hearing Tuesday: EFF Will Voice Support For California Bill Reining In Law Enforcement Use of Facial Recognition

Mon, 06/10/2019 - 15:20

Sacramento, California—On Tuesday, June 11, at 8:30 am, EFF Grassroots Advocacy Organizer Nathan Sheard will testify before the California Senate Public Safety Committee in support of a measure to prohibit law enforcement from using facial recognition in body cams.

Following San Francisco’s historic ban on police use of the technology—which can invade privacy, chill free speech and disproportionately harm already marginalized communities—California lawmakers are considering AB 1215, proposed legislation that would extend the ban across the state.

Face recognition technology has been shown to have disproportionately high error rates for women, the elderly, and people of color. Making matters worse, law enforcement agencies often rely on images pulled from mugshot databases. This exacerbates historical biases born of, and contributing to, over-policing in Black and Latinx neighborhoods. The San Francisco Board of Supervisors and other Bay Area communities have decided that police should be stopped from using the technology on the public.

The use of face recognition technology in police body cameras would force Californians to choose between actively avoiding interaction and cooperation with law enforcement, and having their images collected, analyzed, and stored as perpetual candidates for suspicion, Sheard will tell lawmakers.

Hearing before the California Senate Public Safety Committee on AB 1215

EFF Grassroots Advocacy Organizer Nathan Sheard

Tuesday, June 11, 8:30 am

California State Capitol
10th and L Streets
Room 3191
Sacramento, CA  95814

Contact: Nathan 'nash' Sheard
Categories: Privacy

Hearing Today: EFF Staff Attorney Alex Moss Will Testify About Proposed Changes to Patent Law That Will Benefit Trolls, Foster Bad Patents

Tue, 06/04/2019 - 09:41

Washington D.C.—EFF Staff Attorney Alex Moss will tell U.S. lawmakers today that proposed changes to Section 101 of the U.S. Patent Act—the section that defines, and limits, what can get a patent—will upend years of case law ensuring that only true inventions, not basic practices or rudimentary ideas, can be patented. Moss is among a panel of patent experts testifying today before the Senate Subcommittee on Intellectual Property about the state of patent eligibility in America.

The Supreme Court ruled in Alice v. CLS Bank that an abstract idea does not become eligible for a patent simply by being implemented on a generic computer. Before Alice, for example, courts upheld a patent on the basic practice of letting people access content in exchange for watching an online ad. EFF’s “Saved by Alice” project has collected stories about small businesses that were helped, or even saved, by the Supreme Court’s Alice decision.

A proposal by Senators Thom Tillis and Chris Coons, chairman and ranking member of the subcommittee, would rewrite Section 101 of the Patent Act. The proposal is aimed squarely at killing the Alice decision. It will primarily benefit companies that aggressively license and litigate patents, as well as patent trolls—entities that produce no products, but make money by threatening developers and companies, often with vague software patents.

Section 101, as it stands, prevents monopolies on basic research tools that no one truly invented. That protects developers, start-ups, and makers of all kinds, especially in software-based fields, Moss will tell senators.

Hearing before Senate Subcommittee on Intellectual Property: The State of Patent Eligibility in America, Part I

EFF Staff Attorney Alex Moss

Today at 2:30 pm

Dirksen Senate Office Building 226
50 Constitution Ave NE
Washington D.C. 20002

For more on Alice v. CLS Bank:

Contact: Alex H. Moss
Categories: Privacy

Caught in the Net: The Impact of ‘Extremist’ Speech Regulations on Human Rights Content

Mon, 06/03/2019 - 15:33

San Francisco – Social media companies have long struggled with what to do about extremist content that advocates for or celebrates terrorism and violence. But the dominant current approach, which features overbroad and vague policies and practices for removing content, is already decimating human rights content online, according to a new report from the Electronic Frontier Foundation (EFF), Syrian Archive, and WITNESS. The report confirms that the reality of faulty content moderation must be taken into account in ongoing efforts to combat extremist content.

The pressure on platforms like Facebook, Twitter, and YouTube to moderate extremist content only increased after the mosque shootings in Christchurch, New Zealand earlier this year. In the wake of the Christchurch Call to Action Summit held last month, EFF teamed up with Syrian Archive and WITNESS to show how faulty moderation inadvertently captures and censors vital content, including activism, counter-speech, satire, and even evidence of war crimes.

“It’s hard to tell criticism of extremism from extremism itself when you are moderating thousands of pieces of content a day,” said EFF Director for International Freedom of Expression Jillian York. “Automated tools often make everything worse, since context is critical when making these decisions. Marginalized people speaking out on tricky political and human rights issues are too often the ones who are silenced.”

The examples cited in the report include a Facebook group advocating for the independence of the Chechen Republic of Ichkeria that was mistakenly removed in its entirety for “terrorist activity or organized criminal activity.” Groups advocating for an independent Kurdistan are also often a target of overbroad content moderation, even though only one such group is considered a terrorist organization by governments. In another example of political content being wrongly censored, Facebook removed an image of a leader of Hezbollah with a rainbow Pride flag overlaid on it. The image was intended as satire, yet the mere fact that it included the face of a Hezbollah leader led to its removal.

Social media is often used as a vital lifeline to publicize on-the-ground political conflict and social unrest. In Syria, human rights defenders use this tactic as many as 50 times a day, and there are now more hours of social media content about the Syrian conflict than there have been hours in the conflict itself. Yet YouTube has used machine-learning-powered automated flagging to terminate thousands of Syrian YouTube channels that published videos of human rights violations, endangering the ability of those defenders to create a public record of those violations.

“In the frenzied rush to delete so-called extremist content, YouTube is erasing the history of the conflict in Syria almost as quickly as human rights defenders can hit ‘post,’” said Dia Kayyali, Program Manager for Tech and Accountability at WITNESS. “While ‘just taking it down’ might seem like a simple way to deal with extremist content online, we know current practices not only hurt freedom of expression and the right to access information, they are also harmful to real efforts to fight extremism.”

For the full report:

Contact: Jillian C. York
Categories: Privacy

Hearing Wednesday: Can Criminal Defendants Review DNA Analysis Software Used to Prosecute Them?

Mon, 05/20/2019 - 13:33

Fresno – On Wednesday, May 22, at 9 am, the Electronic Frontier Foundation (EFF) will argue that criminal defendants have a right to review and evaluate the source code of forensic DNA analysis software programs used to create evidence against them. The case, California v. Johnson, is on appeal to a California appeals court.

In Johnson, the defendant was allegedly linked to a series of crimes by a software program called TrueAllele, used to evaluate complex mixtures of DNA samples from multiple people. As part of his defense, Johnson wants his defense team to examine the source code to see exactly how TrueAllele estimates whether a person’s DNA is likely to have contributed to a mixture, including whether the code works in practice as it has been described. However, prosecutors and the manufacturers of TrueAllele claim that the source code is a trade secret and that the commercial interest in secrecy should prevent a defendant from reviewing the source code—even though the defense has offered to follow regular procedure and agree to a court order not to disclose the code beyond the defense team.

EFF is participating in Johnson as amicus, and has pointed out that at least two other DNA matching programs have been found to have serious source code errors that could lead to false convictions. In court Wednesday, EFF Senior Staff Attorney Kit Walsh will argue that Johnson has a constitutionally protected right to inspect and challenge the evidence used to prosecute him—and that right extends to the source code of the forensic software.

California v. Johnson

EFF Senior Staff Attorney Kit Walsh

Wednesday, May 22
9 am

Fifth District Court of Appeal
2424 Ventura Street
Fresno, California, 93721

For more on this case:

Categories: Privacy

EFF Project Shows How People Are Unfairly “TOSsed Out” By Platforms’ Absurd Enforcement of Content Rules

Mon, 05/20/2019 - 12:02

San Francisco—The Electronic Frontier Foundation (EFF) today launched TOSsed Out, a project to highlight the vast spectrum of people silenced by social media platforms that inconsistently and erroneously apply terms of service (TOS) rules.

TOSsed Out will track and publicize the ways in which TOS and other speech moderation rules are unevenly enforced, with little to no transparency, against a range of people for whom the Internet is an irreplaceable forum to express ideas, connect with others, and find support.

This includes people on the margins who question authority, criticize the powerful, educate, and call attention to discrimination. The project is a continuation of work EFF began five years ago when it launched Onlinecensorship.org to collect speech takedown reports from users.

“Last week the White House launched a tool to report takedowns, following the president’s repeated allegations that conservatives are being censored on social media,” said Jillian York, EFF Director for International Freedom of Expression. “But in reality, commercial content moderation practices negatively affect all kinds of people with all kinds of political views. Black women get flagged for posting hate speech when they share experiences of racism. Sex educators’ content is removed because it was deemed too risqué. TOSsed Out will show that trying to censor social media at scale ends up removing far too much legal, protected speech that should be allowed on platforms.”

EFF conceived TOSsed Out in late 2018 after seeing more takedowns resulting from increased public and government pressure to deal with objectionable content, as well as the rise in automated tools. While calls for censorship abound, TOSsed Out aims to demonstrate how difficult it is for platforms to get it right. Platform rules—either through automation or human moderators—unfairly ban many people who don’t deserve it and disproportionately impact those with insufficient resources to easily move to other mediums to speak out, express their ideas, and build a community.

EFF is launching TOSsed Out with several examples of TOS enforcement gone wrong, and invites visitors to the site to submit more. In one example, a reverend couldn’t initially promote a Black Lives Matter-themed concert on Facebook, eventually discovering that using the words “Black Lives Matter” required additional review. Other examples include queer sex education videos being removed and automated filters on Tumblr flagging a law professor’s black and white drawings of design patents as adult content. Political speech is also impacted; one case highlights the removal of a parody account lampooning presidential candidate Beto O’Rourke.

“The current debates and complaints too often center on people with huge followings getting kicked off of social media because of their political ideologies. This threatens to miss the bigger problem. TOS enforcement by corporate gatekeepers far more often hits people without the resources and networks to fight back to regain their voice online,” said EFF Policy Analyst Katharine Trendacosta. “Platforms over-filter in response to pressure to weed out objectionable content, and a broad range of people at the margins are paying the price. With TOSsed Out, we seek to put pressure on those platforms to take a closer look at who is actually being hurt by their speech moderation rules, instead of just responding to the headline of the day.”

Contact: Jillian C. York, Katharine Trendacosta
Categories: Privacy

YouTube User Fights Unfair Takedown Campaign from UFC

Tue, 05/14/2019 - 16:13

San Francisco – The creator of popular post-fight commentary videos on YouTube is demanding an end to the Ultimate Fighting Championship (UFC)’s unfair practice of sending takedown notices based on bogus copyright claims. The creator, John MacKay, is represented by the Electronic Frontier Foundation (EFF).

MacKay operates the “Boxing Now” channel on YouTube, and his videos include original audio commentary and a small number of still images from UFC events. While those stills are an obvious fair use—a lawful way to use copyrighted content without permission—UFC has sent five takedown notices to YouTube claiming infringement, and YouTube has complied with each takedown. MacKay has responded every time with a counter-notice explaining the fair and non-infringing nature of his videos, and YouTube has reposted the videos after UFC failed to respond.

“My YouTube channel is a popular source of post-fight commentary,” said MacKay. “My videos are most often viewed in the days immediately after a fight, and when UFC has them taken down for a few days with these unfair copyright claims, I lose a lot of viewers and a significant amount of money.”

UFC also produces its own YouTube videos with post-fight commentary. In a letter sent to the chief legal officer of UFC today, EFF points out that convincing YouTube to remove MacKay’s videos may benefit UFC unfairly by reducing competition for post-fight commentary videos.

“Is UFC afraid of a fair fight?” asked EFF Staff Attorney Alex Moss. “It’s time for UFC to stop sending improper takedown notices for Mr. MacKay’s videos. Independent video creators have a right to fair use of copyrighted works.”

For the full letter sent to UFC:

Contact: Alex H. Moss
Categories: Privacy