CBSA wants to use facial recognition app for people facing deportation: Documents

OTTAWA – The Canada Border Services Agency plans to launch an app that will use facial recognition technology to keep track of people who have been ordered deported from the country.

The mobile reporting app would use biometric data to confirm a person’s identity and record their location data when they use the app to check in. Documents obtained through Access to Information show that the CBSA proposed such an app back in 2021.

A spokesperson confirmed that an app called ReportIn will be launched this fall.

Experts have raised numerous concerns, questioning the validity of user consent and the potential secrecy around how the technology makes its decisions.

Each year, about 2,000 people who have been ordered to leave the country fail to show up, meaning the CBSA must “dedicate significant resources to investigate, locate and, in some cases, arrest these clients,” a 2021 document said.

The agency pitched a smartphone app as the “ideal solution.”

Through regular updates submitted via the app, such as “a person’s home address, employment, marital status, etc., the CBSA receives relevant information that it can use to contact the client and monitor for early signs of non-compliance,” it says.

“In addition, automation makes it more likely that the client will feel involved and recognize how much insight the CBSA has into their case.”

The document also states: “If a client does not show up for deportation, the information collected through the app provides good investigative leads for locating the client.”

An algorithmic impact assessment for the project, which has not yet been published on the federal government’s website, said the voice biometrics technology the CBSA had been using was being discontinued due to “technical failure” and that the ReportIn app had been developed as a replacement.

It stated that “a person’s facial biometric data and location data provided by sensors and/or the GPS in the mobile device/smartphone” are recorded via the ReportIn app and then sent to the CBSA’s back-end system.

Once people submit photos, a “face comparison algorithm” generates a similarity score to a reference photo.

If the system does not confirm a facial match, it triggers a process that instructs officers to investigate the case.
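
To make that flow concrete, here is a minimal sketch of how a similarity-score check of this kind typically works. It is purely illustrative: the CBSA’s actual algorithm is a trade secret, and everything below, the embedding representation, the cosine similarity measure and the 0.8 cutoff, is an assumption for the sake of the example, not something drawn from the documents.

```python
import math

# Hypothetical illustration only: the CBSA algorithm is a trade secret.
# Assumes a face is represented as a numeric embedding vector, as is
# common in commercial face-comparison systems.

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity score between two face embeddings, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_in(submitted: list[float], reference: list[float],
             threshold: float = 0.8) -> str:
    """Compare a submitted photo's embedding against the reference photo.

    Returns "confirmed" on a match; otherwise flags the case for an
    officer to investigate, mirroring the process the documents
    describe. The 0.8 threshold is an arbitrary placeholder.
    """
    score = cosine_similarity(submitted, reference)
    if score >= threshold:
        return "confirmed"
    return "flagged_for_officer_review"

# A close match is confirmed; a dissimilar submission is flagged.
print(check_in([0.9, 0.1, 0.4], [0.88, 0.12, 0.41]))  # confirmed
print(check_in([0.1, 0.9, 0.2], [0.88, 0.12, 0.41]))  # flagged_for_officer_review
```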

“The location of the person concerned is also recorded every time they report and when they do not comply with their conditions,” it says. The document states that the people concerned are not “continuously tracked.”

The app uses technology from Amazon Web Services, a decision that caught the attention of Brenda McPhail, director of executive education in McMaster University’s Public Policy in Digital Society program.

She said that while many companies that do facial recognition submit their algorithms to the U.S. National Institute of Standards and Technology for testing, Amazon has never done so voluntarily.

A spokesperson for Amazon Web Services said Amazon Rekognition technology is “extensively tested, including by third parties such as Credo AI, a company specializing in responsible AI, and iBeta Quality Assurance.”

The spokesperson added that Amazon Rekognition is a “large-scale cloud-based system and therefore is not downloadable as described in the NIST participation guidelines.”

“For this reason, we instead had Rekognition Face Liveness tested against industry standards by iBeta Lab,” which is accredited by the institute as an independent testing laboratory, the spokesperson said.

The CBSA document said the algorithm used was a trade secret. In a situation that could have life-changing consequences, McPhail asked whether it was “appropriate to use a tool that is protected by trade secrets or proprietary secrets and that denies people the right to understand how decisions about them are really being made.”

Kristen Thomasen, associate professor and chair in law, robotics and society at the University of Windsor, said the reference to trade secrets was a signal that there could be legal barriers blocking access to information about the system.

For years, there have been concerns that people affected by errors in such systems are legally barred from accessing further information about them because of intellectual property protections, she explained.

CBSA spokeswoman Maria Ladouceur said the agency “developed this smartphone app to enable foreign nationals and permanent residents subject to immigration conditions to report without having to come in person to a CBSA office.”

She said the agency had worked “in close coordination” with the Office of the Privacy Commissioner on the app. “Registration with ReportIn is voluntary and users must consent to both the use of the app and the use of their image to verify their identity.”

Petra Molnar, associate director of York University’s Refugee Law Lab, said there is a power imbalance between the agency implementing the app and the people on the receiving end.

“Can a person really and truly consent in this situation where there is such an enormous power imbalance?”

People who do not consent can report in person as an alternative, Ladouceur said.

Thomasen also cautioned that facial recognition technology carries a risk of error, and that the risk is higher for racialized people and people with darker skin.

Molnar said it was “very disturbing that there is basically no discussion in the documents about … human rights implications.”

The CBSA spokesperson said Credo AI tested the software for bias against demographic groups and found a 99.9 percent facial match rate across six different demographic groups. She added that the app will be “continuously tested after launch to assess accuracy and performance.”

The final decision will be made by a human, with officials monitoring all submissions, but experts noted that people tend to trust the judgments of technology.

Thomasen said there is a “fairly widely recognized … psychological tendency for people to rely on the expertise of the computer system,” with computer systems being perceived as less biased or more accurate.

This report by The Canadian Press was first published August 16, 2024.

Anja Karadeglija, The Canadian Press
