Italian Startup Clothing Line will Protect Your Identity from Facial … – The Motley Fool


Big Brother is watching you…unless you wear this wacky shirt.

Cap_able, an Italian clothing startup, is designing fashion for the digital age: every item in its new Manifesto Collection features intricate patterns that fool facial recognition software into seeing the wearer as a giraffe, a zebra, or nothing at all.

Error, Error

In a world where seemingly every device is constantly recording us – phones, computers, and yes, even Roombas – privacy has become almost a fantasy. Cap_able’s clothing uses what it calls “adversarial patterns,” developed with artificial intelligence algorithms, to confuse facial recognition systems. CEO Rachele Didero told CNN that her collection, which includes shirts, hoodies, pants, and dresses, successfully protected wearers’ identities 60% to 90% of the time.

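Cap_able hasn’t published exactly how its patterns are generated, but the general recipe for adversarial patches in the research literature is straightforward: render a candidate pattern onto photos of people, run a detector over them, and nudge the pattern’s pixels to lower the detector’s confidence. The sketch below illustrates that loop with a toy stand-in detector; the model, images, patch size, and training settings are illustrative assumptions, not Cap_able’s actual pipeline.

```python
# Minimal sketch of adversarial-pattern optimization (illustrative only).
# ToyDetector, the random "photos," and all hyperparameters are assumptions.
import torch
import torch.nn as nn

class ToyDetector(nn.Module):
    """Stand-in for a real face/person detector; outputs one confidence per image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(8, 1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

detector = ToyDetector().eval()
for p in detector.parameters():
    p.requires_grad_(False)                        # the detector is fixed; only the pattern changes

images = torch.rand(16, 3, 128, 128)               # placeholder "photos of wearers"
patch = torch.rand(3, 48, 48, requires_grad=True)  # the textile pattern being optimized
optimizer = torch.optim.Adam([patch], lr=0.05)

for step in range(200):
    patched = images.clone()
    patched[:, :, 40:88, 40:88] = patch            # paste the pattern onto the torso region
    confidence = detector(patched)                 # detector's belief that a person is present
    loss = confidence.mean()                       # push that belief down
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    patch.data.clamp_(0, 1)                        # keep pixel values in a printable range
```

In practice, such patterns are optimized against real person or face detectors and across varied poses, lighting, and fabric distortion, which helps explain why protection rates are reported as a range rather than a guarantee.
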
They’re not very subtle. The designs are loud and vibrant, looking like a cross between animals and TV static. And because the patterns may not keep up with advances in facial recognition, the price is hard to justify: the cheapest item is a $310 short-sleeve sweater.

Facial recognition itself is a mixed bag at this point. Sometimes it works well, but in other cases it has led to dire consequences:

  • Thorn is a digital defense group that has helped authorities rescue more than 9,000 victims of human trafficking and capture more than 10,000 culprits. Advocates say facial recognition is one of the best tools for suppressing child exploitation.
  • In 2019, Nijeer Parks was accused of shoplifting candy and trying to hit a police officer with a car in New Jersey despite being 30 miles away at the time of the incidents – police had identified him through a facial recognition match. He wound up spending 10 days in jail and paying $5,000 to defend himself before the case was thrown out for lack of evidence. Studies have found that facial tech is not the best at correctly identifying Black, Asian, and female faces.

A Scanner Darkly: Legislation surrounding facial tech is still hotly debated. New York doesn’t allow it in schools, and Maryland employers can’t use it in job interviews without consent. But even communities that originally prohibited facial recognition for government use have since made a U-turn. After a rise in homicides starting in 2020, New Orleans police can now use the tech to aid in violent crime investigations. There are currently no federal bans, but with all the concern around privacy, it wouldn’t be surprising to see Philip K. Dick’s scramble suits become a reality.



