Apparel designed to protect wearers’ privacy

Privacy-protection apparel is gaining momentum as brands work to shield consumers from artificial intelligence and unwanted photographs. Jessica Owen reports.

We live in a digital era. Not only do we have computers, social media and smartphones with built-in high-definition cameras, but we are watched by millions of CCTV systems – many of which are upgraded with the latest artificial intelligence (AI) technology.

While most of these systems are designed for entertainment or for our own safety, many people are uncomfortable with this level of surveillance. In response, fashion brands are developing a new form of camouflage clothing to protect consumers from AI and unauthorised photographs.

Saif Siddiqui, founder of the technology and fashion brand Ishu, says: “In a world where everyone has a high-quality camera in their phone and social media platforms, there is a set of people that value privacy and don’t want to be seen.”

For these people, Ishu, in collaboration with the luxury fashion brand Balr, has developed a line of garments that can block out unwanted flashes and ‘outsmart the spotlight’. The companies say the clothes are made from textiles containing thousands of nano-spherical crystals that reflect a camera’s flash. The result is an image in which the wearer is blacked out.

This ‘anti-paparazzi’ collection comprises hoodies, T-shirts, backpacks, trainers and puffer jackets, and has been a long time in the making owing to the complicated manufacturing process the line requires, says Siddiqui.

He adds: “Respectfully, it’s not just a T-shirt brand with a logo. The technology – and merging it into fabrics – is a complicated process. We believe over time as the companies grow, we will be able to offer an even wider range of products to the masses.”

Meanwhile, garments that confuse AI technology have been developed by the Japanese company Unlabeled. Founded in 2019, the team uses a technique called the adversarial example, which causes AI systems to fail to recognise a person.

Naoki Tanaka, founder and creative director of Unlabeled, says: “The idea came from when I took a taxi and saw an advertising sign inside. The sign had a camera to recognise the passenger’s gender and age to choose suitable ads for the passenger.”

He adds: “I was uncomfortable with being unilaterally classified by AI and there is no permission or agreement to use personal data like that. So we came up with using the adversarial example technique for clothing.”

To do this, a small amount of visual noise is added to a design, which is then printed onto the product. The team adds this noise to the original input image, measures how far the model’s recognition rate falls, and iteratively optimises the noise so that the rate keeps falling, generating an image that fools the AI classification model.
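The iterative loop described above can be sketched in a few lines of Python. This is a toy illustration, not Unlabeled's actual method or model: it attacks a stand-in linear "person detector" (logistic regression on a flattened 8x8 image) rather than a real vision network, and the weights, step size and perturbation budget are all invented for the example. The core idea is the same: repeatedly nudge the input against the gradient of the model's confidence, keeping the change small, until recognition fails.

```python
import numpy as np

# Toy stand-in for a trained "person detector": logistic regression
# on a flattened 8x8 image. Weights are random for illustration only.
rng = np.random.default_rng(0)
w = rng.normal(size=64)
b = 0.0

def confidence(x):
    """Model's confidence that x contains a person (sigmoid of linear score)."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Start from an input the model confidently labels "person".
x = w / np.linalg.norm(w) * 2.0 + rng.normal(scale=0.1, size=64)

eps = 0.5      # maximum change per pixel -- keeps the added noise subtle
alpha = 0.05   # step size per iteration
x_adv = x.copy()

for _ in range(20):
    p = confidence(x_adv)
    # For this model the gradient of confidence w.r.t. the input
    # is p * (1 - p) * w; step against it to reduce recognition.
    grad = p * (1.0 - p) * w
    x_adv = x_adv - alpha * np.sign(grad)
    # Project back into the eps-ball so the perturbation stays small.
    x_adv = np.clip(x_adv, x - eps, x + eps)

print(confidence(x))      # high: the clean image is recognised
print(confidence(x_adv))  # low: the perturbed image is not
```

Against a real classifier the gradient would come from backpropagation through the network, and the optimised noise would then be printed onto the garment; the projection step is what keeps the adversarial pattern looking like a design rather than random static.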

So far, the team says it has applied its adversarial designs to bags, sweatshirts and hoodies, which have shown a 70% success rate at deceiving AI. In future, the team would like to expand its applications to protect gender identity and children.

Tanaka says: “By wearing these patterns, people can choose if they want to be recognised. It’s a kind of awareness project and a speculative art. Raising awareness and prompting discussion about our right to protect our identity and privacy is important.”