Time To Face Up To Big Brother | New Haven Independent


Contributed photos

Scenes from the protest.

Makeup styles designed to trick FRT, including the style I tried last week.

(Opinion)—More than 50 universities across the U.S. have pledged not to use facial recognition technology (FRT) on campus. That includes UCLA, which halted its plans to use the technology after a very public and protracted battle with students.

Though Yale University has offered weak statements about having no plans to use FRT, an outright ban on this unreliable, biased, and dystopian threat to privacy needs to be the official policy of not just Yale but of all schools and municipalities ― including the City of New Haven.

The dangers of FRT are well-known and easy to prove. Digital activist group Fight For The Future scanned more than 400 photos of UCLA student athletes and faculty members with Amazon’s FRT software, finding that 58 were falsely matched with images in a mugshot database.

Often, when people were matched with “100 percent confidence,” the only factor the UCLA photo and the mugshot had in common was the person’s race. Not surprisingly, the vast majority of false matches were people of color.

This kind of algorithmic bias has been known for a long time, documented extensively in Georgetown Law’s report “The Perpetual Lineup.”

Until recently, we knew very little about FRT use in law enforcement or about the software vendors that universities, agencies, and governments pay to surveil populations.

2020 has been a watershed year, however, and we now understand the incredible reach of companies like Clearview AI, Banjo, and Wolfcom in the U.S. and around the globe.

Peter Thiel-funded Clearview AI was the subject of an explosive exposé in January, which revealed a public-private partnership that matches faces from images uploaded to “millions” of websites, including Facebook, Venmo, and YouTube, against a private database of approximately three billion photos (a library of images seven times the size of the FBI’s).

At the time, we thought that Clearview AI was only offering facial recognition services to about 600 U.S. law enforcement agencies.

After the report, Clearview’s client list was breached and we learned the true extent of the company’s global penetration ― we now know there are about 2,900 organizations in 27 countries, including the United Arab Emirates, Canada, the UK, and France. Clients from the U.S. include nearly every federal law enforcement agency (FBI, DEA, ICE/CBP), city police (New York, Philadelphia, Miami), retail and events companies (Macy’s, Walmart, NBA, Eventbrite), and universities (Columbia, Alabama, FIU, Minnesota, SMU).

Some of the universities on the list had already committed to Fight For The Future that they would ban FRT on campus, often with weak or vague statements similar to Yale’s. This reveals the very real danger of not taking the issue seriously.

Rather than wait for a campus officer or administrator to abuse their position and harass a student with an app like Clearview, Yale and other schools need to send a clear message that FRT will not be implemented, encouraged, or tolerated on campus or in university facilities. If billionaire investors, perhaps even celebrities like Ashton Kutcher, can’t resist using FRT to spy on people at parties, how long will it take until New Haven’s slice of the Ivy League is implicated in these stories?

On March 2, I joined a group of Yalies and New Haven residents in protesting the use of FRT on campus, supporting this important cause as a concerned New Havener, a lecturer at Yale Law School, and the head of the Information Society Project’s Privacy Lab initiative.

Fight For The Future

False matches between UCLA student athletes and faculty with mugshots.

The event was part of a nationwide day of action on college campuses organized by Fight For The Future. Though the group at Yale was small, I had spoken to many students and faculty afraid to join because they feared retribution or blacklisting: the chilling effect of surveillance on free expression is very real.

We gathered dozens of signatures for a petition and attempted to deliver a letter to Yale President Peter Salovey, but were instead met by his staff member Pilar Montalvo. The conversation was cordial but we received no guarantees or timeline, and our message to the administration is therefore worth restating:

We challenge Yale to extend their [pledge not to use FRT] to an outright ban on FRT as official policy. We implore Yale to increase accountability around its surveillance systems in general via audits, reviews, and annual transparency reports. Transparency can only be gained when we have answers to the questions below and others that will no doubt arise during internal review of FRT at Yale:

Is FRT in use at Yale-NUS or any Yale facility or campus?

Are Yale or Yale-NUS police, safety, and security officers using FRT directly or systems that utilize FRT information obtained elsewhere (e.g. gathered by other law enforcement agencies)?

Does Yale or Yale-NUS have a contract with Clearview AI or other FRT vendors?

Does Yale or Yale-NUS utilize other biometric markers or metrics as part of its surveillance systems, such as the video cameras inside and surrounding buildings?

In an email to the Yale Daily News, Joy McGrath wrote that the Yale Police Department does not use FRT and that Yale is not a Clearview AI client. This is great news, but not nearly comprehensive enough. Will YPD never use FRT, even as it becomes a default feature in even cheap security systems? Can we expect Yale-NUS also not to use FRT, and for FRT never to make its way into any Yale facility or event?

Yale Privacy Lab

Surveillance map of New Haven.

At Privacy Lab, we started to document surveillance cameras and other devices around New Haven by taking photos and pinning locations on a map. Though this “surveillance under surveillance” map is far from complete, it shows a clear pattern: you can’t walk anywhere downtown without being recorded. For this reason, the City of New Haven needs to ban FRT as well, joining other cities around the country such as San Francisco, Oakland, Somerville, Brookline, and Cambridge.

New Haven residents have an appropriate and visceral fear of Big Brother-style surveillance, as we saw during the Harp vs. Elicker battle last year, and stopping FRT before it becomes a palpable issue in the Elm City would unite friend and foe.

We can avoid the kind of mess that Utah now finds itself in, where FRT vendor Banjo has unrestricted, real-time access to municipal and state-owned cameras, 911 systems, and police data. Banjo correlates facial profiles with data scraped from social media, apps, and satellites to detect “anomalies.” Do we want to be on the receiving end of such tech, letting it watch and judge us every day?

I have spoken privately with city officials, the mayor, and alders about FRT, and I hear the same kinds of statements that Yale has made. In general, city staff are supportive of banning this “creepy” technology and claim that neither New Haven police cameras nor traffic cameras use FRT. As we’ve seen elsewhere, however, controlling vendors can be extremely difficult and, without an official policy or ordinance, there is nothing to deter individual officers, staff, or contractors from installing an app and using FRT to scan New Haven residents. Companies like Wolfcom are now offering FRT in police body cams, boasting 1,500 clients in more than 35 countries, and we need a guarantee from the City that such technology will never be turned against residents and visitors.

Now is the time to turn the tide at Yale and in New Haven, before FRT becomes ubiquitous and a daily threat to our privacy. We will keep challenging the Yale administration to take effective action against FRT and, likewise, will bring the challenge to the City.

I’m proposing legislation to ban FRT and, specifically, face surveillance via a City of New Haven ordinance. The legislation should be an easy win for the City if the technology is truly not in use, sending a strong message to our vulnerable and underserved communities that they will not be the subject of invasive and totalitarian technology. I’m calling on all City officials to join me in passing and enforcing this legislation, which was written with the support and guidance of the Electronic Frontier Foundation.

Let’s nip FRT in the bud and keep our streets from becoming a sci-fi nightmare.
