Apple Vision Pro: UX yay or nay | by Dr. Andrés Zapata | Feb 2024


Every year, about halfway through the semester, my students invariably ask me whether the heady UX design theory, psychology, neuroscience, and anthropology I peddle will still be relevant when they graduate.

In my 20+ years of teaching, my answer has been as consistent as inertia:

As long as input and output largely remain the same, the underpinnings of UX will largely remain the same.

Since the major push for commercial desktop computing in the late ’70s with the debut of the TRS-80, the Apple II, and the Commodore Personal Electronic Transactor (PET), the human-computer interface has been governed by the keyboard and mouse (input) and the screen (output).

Yes, it’s true that touch, gestures, mobile, and speech have nudged how we practice UX, but not enough to reinvent the UX curriculum or how we practice.

Apple’s latest product, the Apple Vision Pro, has the potential to absolutely disrupt how we teach and practice UX. It challenges the traditional way people interface with a machine.

In its simplest form, the device’s immersive audio and visuals dramatically change the output side of the equation.

And input is largely handled by combining eye tracking (akin to a mouse pointer), voice, and open-air gestures (akin to clicking and typing).

Apple Vision Pro has the potential to uproot how we work, shop, and play online. Using the device is, in fact, an immersive experience Apple is calling “spatial computing.” It offers a trove of useful, neat, and productive applications…

