A proposed ban is part of a broader anti-surveillance ordinance that the city's Board of Supervisors is expected to approve on Tuesday. If passed — a majority of the board's 11 supervisors have expressed support for it — it will make San Francisco the first city in the United States to outlaw the use of such technology by the police and other government departments. The ordinance could also spur other local governments to take similar action.
Facial-recognition systems are increasingly used everywhere from police departments to rock concerts to homes, stores and schools. They are designed to identify specific people from live video feeds, recorded video footage or still photos, often by comparing their features with a set of faces (such as mugshots).
San Francisco's proposed rule forbids the use of facial-recognition technology by the city's 53 departments — including the San Francisco Police Department, which doesn't currently use such technology but did test it out between 2013 and 2017. However, the ordinance carves out an exception for federally controlled facilities at San Francisco International Airport and the Port of San Francisco. The ordinance doesn't prevent businesses or residents from using facial recognition or surveillance technology in general — such as on their own security cameras. And it also doesn't do anything to limit police from, say, using footage from a person's Nest camera to assist in a criminal case.
"We all support good policing but none of us want to live in a police state," San Francisco Supervisor Aaron Peskin, who introduced the bill earlier this year, told CNN Business ahead of the vote.
The ordinance adds yet more fuel to the fire blazing around facial-recognition technology. Even as the technology grows in popularity, it has come under increased scrutiny amid mounting concerns about how it is deployed, how accurate it is, and even where the faces used to train the systems come from.
In San Francisco, Peskin is concerned that the technology is "so fundamentally invasive" that it shouldn't be used.
"I think San Francisco has a responsibility to speak up on things that are affecting the entire globe, that are happening in our front yard," he said.
Early days for facial-recognition laws
Facial recognition has improved dramatically in recent years due to the popularity of a powerful form of machine learning called deep learning. In a typical system, facial features are analyzed and then compared with labeled faces in a database.
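As a rough illustration of that comparison step, the sketch below matches a probe face against a small gallery of labeled faces using cosine similarity between feature vectors. It is not any vendor's or agency's actual system: the identify() function, the 128-dimensional vectors and the 0.6 threshold are illustrative assumptions, and a real pipeline would compute the embeddings from face images with a deep neural network rather than the random stand-ins used here.

import numpy as np

def identify(probe_emb, gallery, threshold=0.6):
    """Return the closest labeled identity by cosine similarity, or None if nothing clears the threshold."""
    probe = probe_emb / np.linalg.norm(probe_emb)
    best_name, best_score = None, -1.0
    for name, emb in gallery.items():
        score = float(np.dot(probe, emb / np.linalg.norm(emb)))
        if score > best_score:
            best_name, best_score = name, score
    # Report "no match" rather than the nearest face when similarity is too low.
    return best_name if best_score >= threshold else None

# Stand-in embeddings; a real system would derive these from labeled face photos.
rng = np.random.default_rng(0)
gallery = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = gallery["person_a"] + rng.normal(scale=0.1, size=128)  # a noisy view of person_a
print(identify(probe, gallery))  # expected output: person_a

The threshold is where accuracy trade-offs surface in practice: set it too low and the system produces false matches, which is part of the concern critics raise.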
Yet AI researchers and civil-rights groups such as the American Civil Liberties Union are particularly concerned about accuracy and bias in facial-recognition systems. They point to concerns that the systems are less effective at correctly recognizing women and people of color, in part because the datasets used to train the software may be disproportionately male and white.
The ACLU is one of many civil-rights groups supporting the ordinance. Matt Cagle, a technology and civil liberties attorney at the ACLU of Northern California, said the raft of issues posed by facial-recognition systems means the city's proposed legislation would prevent harm to community members. He also expects that, if passed, it will prompt other cities to follow suit.
"We should be able to live our lives without every movement of ours being tracked and monitored by the government," he told CNN Business.
There are currently no federal laws addressing how artificial-intelligence technology in general, or facial-recognition systems specifically, can be used, though a Senate bill introduced in March would force companies to get consent from consumers before collecting and sharing identifying data.
A few states and local governments have made their own efforts: Illinois, for example, has a law that requires companies to get consent from customers before collecting biometric information. California's Senate is currently considering a bill that would ban police in the state from using biometric technology, such as facial recognition, with body-camera footage.
In the Bay Area alone, Berkeley, Oakland, Palo Alto and Santa Clara County (of which Palo Alto is a part) have passed their own surveillance-technology laws. Oakland is also considering whether to ban the use of facial-recognition technology.
How surveillance could be harder in San Francisco
Under the proposed San Francisco law, any city department that wants to use surveillance technology or services (the police department if it wanted to buy new license-plate readers, for example) must first get approval from the Board of Supervisors. That process would include submitting information about the technology and how it will be used, and presenting it at a public hearing. Any city department that already uses surveillance technology would also need to tell the board how it is being used.
"This ordinance is really about making sure that, when there are surveillance programs, the community has a voice and a seat at the table," Cagle said.
The ordinance also states that the city will need to report to the Board of Supervisors each year on whether surveillance equipment and services are being used in the ways for which they were approved, and include details like what data was kept, shared or erased.
In a statement, the San Francisco Police Department said it "looks forward" to working with the city's supervisors, the ACLU, and others to develop laws that speak to tech-related privacy worries "while balancing the public safety concerns of our growing, international city."
"In accordance with the legislation, we are in the process of auditing our technologies and related policies," the statement said.
The vocal opposition
Some locals, including several residents' groups, have been vocally opposed to the surveillance ordinance. Frank Noto, president of Stop Crime SF, a group focused on crime prevention, said his organization recognizes the privacy and civil-liberties concerns that may have prompted the ordinance's introduction but sees it as flawed legislation, largely because it requires the police department to get approval from the city before it can obtain surveillance technology.
And while Stop Crime SF sees the faults in existing facial-recognition technology, it's also concerned about prohibiting its use entirely. Noto suggested a moratorium on using it — say, for two years — might be a better option.
"The idea of banning it forever doesn't make sense to us," he said.