this post was submitted on 03 Dec 2025
51 points (100.0% liked)


The Edmonton Police Service announced Tuesday it will become the first police force in the world to trial facial-recognition-enabled bodycams, an artificial intelligence (AI) product from Axon Enterprise.

“I want to make it clear that this facial-recognition technology will not replace the human component of investigative work,” acting Supt. Kurt Martin with EPS’ information and analytics division said during a news conference.

“In fact, the resemblances that are identified by this software will be human-verified by officers trained in facial recognition.”

Martin said the police force’s goal is to test another tool in its operations toolbox that can help further ensure public and officer safety while also respecting privacy considerations.

Axon Enterprise, an Arizona-based company, develops weapons and technology products for military, law enforcement and civilians in jurisdictions where legal.

top 9 comments
[–] MyMotherIsAHamster@lemmy.ca 26 points 7 hours ago

Cuz what could go wrong implementing this kind of tech from a company based in a hostile country?

[–] melsaskca@lemmy.ca 17 points 7 hours ago

The police state has nothing to do with nationalism, I guess. There is big money in that surveillance crap.

[–] runsmooth@kopitalk.net 10 points 6 hours ago* (last edited 6 hours ago)

Axon's rep basically says that their mass-surveillance cameras don't see colour, just people. Then she follows with the claim that the main limiting factor is skin tone (??). A problem that was essentially noted as far back as... 2019. What development in the technology is she talking about?

According to Ann-Li Cooke, Axon Enterprise’s director of responsible AI:

In response to the report, Cooke said there has been a development in the technology since 2019.

“There are gaps in both race and gender at that time,” she said. “As we did our due diligence on evaluating multiple models, we were also looking to see if there were race-based differences, and we found that in ideal conditions, that is not the case.

“Race is not the limiting factor today, the limiting factor is on skin tone. And so when there are varying conditions, such as distance [or] dim lighting, there will be different optical challenges with body-worn camera[s] — and all cameras — in detecting and matching darker-skinned individuals than lighter-skinned individuals.”

Also note that the facial-recognition technology seems to have a fatal flaw when it comes to women with darker skin.

However, Gideon Christian, an associate professor of AI and law at the University of Calgary, said the inequities attached to facial-recognition technology are too great to ignore and that he believes there is not enough recent research to suggest any significant improvement.

“Facial-recognition technology has been shown to have its worst error rate in identifying darker-skinned individuals, especially black females,” he said.

In some case studies, Christian said facial-recognition technology has shown about a 98 per cent accuracy rate in identifying white male faces, but that it also has about a 35 per cent error rate in identifying darker-skinned women.
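To put those quoted rates in perspective, here is a minimal back-of-the-envelope sketch of what they imply at scale. The 2% and 35% error rates are the figures quoted in the comment above, not Axon's published numbers, and the scan count is a made-up illustration.

```python
def expected_misidentifications(n_scans: int, error_rate: float) -> float:
    """Expected number of incorrect identifications over n_scans attempts,
    assuming a fixed per-scan error rate."""
    return n_scans * error_rate

scans = 1000  # hypothetical number of bodycam face-match attempts

# ~2% error rate quoted for white male faces (98% accuracy)
white_male_errors = expected_misidentifications(scans, 0.02)

# ~35% error rate quoted for darker-skinned women
darker_skinned_women_errors = expected_misidentifications(scans, 0.35)

print(white_male_errors)            # 20.0
print(darker_skinned_women_errors)  # 350.0
```

Same tool, same deployment: one group sees roughly 20 misidentifications per thousand scans, the other 350, which is the disparity Christian is pointing at.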

You know what was a problem with the technology back in 2019? These models are coded primarily by white men, and their idea of "normal" hard-codes bias into the models. These "AI" products essentially reflect their coders' bias by discriminating against whatever falls outside that normal.

For example, from "How tech's white male workforce feeds bias into AI", by Aimee Picchi:

The report highlights several ways AI programs have created harmful circumstances to groups that already suffer from bias. Among them are:

An Amazon AI hiring tool that scanned resumes from applicants relied on previous hires' resumes to set standards for ideal hires. However, the AI started downgrading applicants who attended women's colleges or who included the word "women's" in their resumes.
Amazon's Rekognition facial analysis program had difficulty identifying dark-skinned women. According to one report, the program misidentified them as men, although the program had no problem identifying men of any skin tone.

https://www.cbsnews.com/news/ai-bias-problem-techs-white-male-workforce/

[–] CanadaPlus@lemmy.sdf.org 9 points 4 hours ago* (last edited 4 hours ago)

You know how the police can't force you to show ID when just walking around? Yeah, this is the same thing and they know it.

[–] avidamoeba@lemmy.ca 6 points 6 hours ago (1 children)

How can Axon recognize Canadians?

[–] CanadaPlus@lemmy.sdf.org 3 points 4 hours ago (2 children)

Hopefully, the EPS has their own server they load faces into domestically. Don't we have legislation now about where biometric data has to be stored?

It's still not great, and I have little confidence Axon can't (cooperate with an agency to) slip in there and steal information somehow.

[–] avidamoeba@lemmy.ca 2 points 4 hours ago

Hopefully, the EPS has their own server they load faces into domestically.

X

[–] Typhoon@lemmy.ca 2 points 3 hours ago* (last edited 3 hours ago)

Not true. A few months ago Microsoft admitted that US law overrides privacy agreements with other countries, regardless of where the data is stored. It was in reference to a situation in France.

As an American company, Axon would be bound the same way.

[–] FaceDeer@fedia.io 2 points 4 hours ago

On the plus side, maybe this will encourage police to actually have their body cams on.