Apple Vision Pro: how to turn people into 24/7 surveillance agents for capitalism

The viral commercial image of the Apple Vision Pro worn by a woman.
From Apple.com

In the world of tech, innovation is a double-edged sword. On one side, we have the allure of cutting-edge technology, promising to revolutionize our lives. But what happens when this revolution infringes on our privacy? The latest product from Apple, the Vision Pro AR headset, is a perfect example of this dichotomy.

The Vision Pro is probably a breakthrough in product design, boasting features that seem straight out of a science fiction novel. But what if this science fiction is a dystopian one, where our every move is watched and recorded?

Spatial Operating System (visionOS)

visionOS allows apps to live in your space, reacting to light and casting shadows. But what if these shadows are not just on your walls, but also on your privacy? Imagine every step you take, every item you interact with, being recorded and analyzed. It’s like having a surveillance camera strapped to your face, except this one also knows when you’re looking at your fridge a bit too often.
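To make the “surveillance camera strapped to your face” less abstract, here is a minimal sketch, in Swift, of how a visionOS app running in an immersive space could log the wearer’s head position over time. ARKitSession, WorldTrackingProvider, and queryDeviceAnchor(atTimestamp:) are Apple’s public visionOS API; the PoseLogger class and the sampling interval are illustrative assumptions of mine, not anything Apple ships:

```swift
import ARKit
import QuartzCore
import simd

// A minimal sketch, assuming a visionOS app with an open immersive space.
// PoseLogger is a hypothetical name; ARKitSession, WorldTrackingProvider,
// and queryDeviceAnchor(atTimestamp:) are Apple's public visionOS API.
@MainActor
final class PoseLogger {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()
    private var trail: [simd_float4x4] = []  // a growing record of where your head has been

    func start() async throws {
        try await session.run([worldTracking])
        while true {
            // Sample the device (head) transform four times per second.
            if let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
                trail.append(device.originFromAnchorTransform)
            }
            try await Task.sleep(for: .milliseconds(250))
        }
    }
}
```

Apple does gate this data behind immersive spaces and permission prompts, but that gate is policy, not physics, and policies can change.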

But the implications go beyond personal surveillance. With this level of data, tech companies could potentially tailor advertisements to an unprecedented degree. Picture this: you’re looking at a new coffee machine in your kitchen, and suddenly, you start seeing ads for coffee beans on your social media feeds. But is this convenience worth the invasion of your privacy?

Moreover, the misuse of private data by governments is a legitimate concern. Big tech companies have a history of complying with governmental demands for data, often at the expense of individual privacy. With visionOS, governments could potentially gain access to intimate details about your daily life. But should the government know more about your habits, preferences, and routines than you do?

3D Camera and Mapping

The Vision Pro includes a 3D camera that captures photos and videos with remarkable depth, providing more data about your environment than a traditional 2D camera. But what happens when this data falls into the wrong hands?

Moreover, the device features 3D mapping for a detailed understanding of the environment, which could be used to build detailed models of your private spaces. But who has the right to wear this device, and in whose space? Should people need permission and consent before someone enters their environment with a device that can scan everything in 3D in mere moments?
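For a sense of how low the barrier is, here is a minimal sketch of room scanning built on Apple’s public visionOS ARKit API. The RoomScanner name is hypothetical; the providers and the “world sensing” permission prompt are Apple’s. Once the user taps “Allow”, the app receives a continuously updated triangle mesh of the physical surroundings:

```swift
import ARKit

// A minimal sketch, assuming an immersive space. RoomScanner is a
// hypothetical name; SceneReconstructionProvider and the .worldSensing
// authorization are part of Apple's public visionOS ARKit API.
@MainActor
final class RoomScanner {
    private let session = ARKitSession()
    private let sceneReconstruction = SceneReconstructionProvider()

    func scan() async throws {
        // The OS shows a one-time permission prompt to the user.
        _ = await session.requestAuthorization(for: [.worldSensing])
        try await session.run([sceneReconstruction])

        // Each MeshAnchor is a freshly scanned chunk of the room's geometry.
        for await update in sceneReconstruction.anchorUpdates {
            let geometry = update.anchor.geometry
            print("room chunk: \(geometry.vertices.count) vertices")
        }
    }
}
```

Nothing here is exotic: it is roughly the same code path a harmless furniture-placement app would use, which is precisely what makes consent so hard to reason about.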

Furthermore, the potential misuse of 3D mapping technology in a policing society is alarming: detailed maps of private spaces could become tools of surveillance, deployed by authorities under the guise of public safety. But is safety worth the cost of our privacy?

Advanced Machine Learning and Synthetic Face[Time]

The Vision Pro uses advanced machine learning to represent users realistically during FaceTime calls, providing life-size visuals and spatial audio. But what happens when this realistic representation falls into the wrong hands?

The implications of this technology extend beyond personal privacy. In an era where deepfake technology is becoming increasingly sophisticated, the potential misuse of facial data is a significant social concern. By providing a detailed map of your face, you could inadvertently be contributing to the creation of synthetic data that could be used to impersonate you or violate your privacy in other ways. But who owns the digital representation of your face? Do you have any rights over your facial data, and does anyone actually protect them?
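Apple has not publicly documented the internals of its Persona representation, so any claim about what it stores is speculation. The raw material, however, is not: ARKit’s face tracking on iOS already hands apps a dense 3D mesh of the user’s face plus named expression coefficients, frame after frame. The sketch below uses that public iOS API as an analogue for the kind of facial data any headset must capture to synthesize you (FaceCapture is a hypothetical name of mine):

```swift
import ARKit

// A minimal sketch using iOS ARKit face tracking as an analogue, since
// visionOS Persona internals are not public. FaceCapture is a hypothetical
// name; ARFaceTrackingConfiguration and ARFaceAnchor are Apple's public API.
final class FaceCapture: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // A dense 3D mesh of the user's face, updated every frame...
            let vertexCount = face.geometry.vertices.count
            // ...plus named expression weights (smile, blink, jaw, ...).
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("face mesh: \(vertexCount) vertices, smile weight: \(smile)")
        }
    }
}
```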

A person wearing the Apple Vision Pro in a dystopian built environment.
Built with Stability AI SDSL BETA

Designing a Surveillance Society

In a rush to embrace new technology, we often overlook the potential psycho-social second and third-order effects. The Vision Pro is not just a product; it’s a tool for dataveillance, a means for big tech companies to peer into our lives under the guise of providing us with better services. It’s another example of the “aggressive mimicry” I’ve previously written about.

But what happens when these services infringe upon our privacy? We need to approach these innovations with a critical eye, questioning not just what they can do for us, but what they can do to us. We need to consider the implications of turning our lives into an open book for tech companies to read at their leisure.

The Vision Pro is a testament to the power of product design. But as we marvel at its capabilities, let’s not forget to ask ourselves: at what cost does this come? Are we willing to trade our privacy for the promise of a better reality? Or are we sleepwalking into a future where we are the product? As Shoshana Zuboff highlights, our behavioral data is already the commodity; with the emergence of products like the Vision Pro, our entire built environment joins the commodity portfolio.

The role of designers in this process is crucial. They are not just creating products; they are shaping our future interactions and experiences. With every line of code, every pixel, every decision, they have the power to influence how these technologies impact our lives.

Design movements such as value-sensitive design and responsible innovation have, for a couple of decades now, been pointing to the role of designers in considering not only the intended use of their products but also the potential unintended consequences. More recently, Daniel Schmachtenberger and The Consilience Project have tangibly highlighted the “psycho-social externalities” of digital technologies, elaborating on the fact that “Technologies always create the potential for new forms of behavior, values, and thought, even when the technology is not explicitly made to do so.” Designers must ask themselves: How could this technology be misused? Who might be harmed by it? What are the potential second and third-order effects?

While technologies are designed and embedded with specific values in mind, as they become entangled within the social and political economy of human societies, they can generate new, changing, and unpredictable values.

In the age of surveillance capitalism, our privacy is the price we pay for the promise of a better reality. But it doesn’t have to be this way. Designers, both of products and businesses, have a responsibility to advocate for the users, to demand transparency from big tech companies about how users’ data is used and who has access to it. They have the power to design products that respect our privacy, our autonomy, and our human values. And they must push back when they are asked to set these concerns aside.

The Vision Pro is a marvel of technology, but it’s also a stark reminder of the ever-growing reach of surveillance capitalism. As we embrace these new technologies, we must also grapple with the privacy concerns they raise. We must ask ourselves: are we comfortable with the idea of our homes, our faces, and our lives being mapped and monitored for the sake of convenience?

*Small refinements to this article were made with the help of OpenAI’s GPT-4 model. I’m not sure if next time it will be more of me editing its work, or vice versa.

