DISPATCH — JUNE 2023
VISION PRO'S REAL INNOVATION IS MAKING YOUR EYES A MOUSE.
TL;DR
Apple unveiled Vision Pro at WWDC 2023: a $3,499 "spatial computing" headset, shipping in the U.S. in early 2024, controlled by your eyes, hands, and voice. The hardware is the headline, but the stakes are in the input model: gaze becomes the cursor, attention becomes the interface, and the defaults Apple sets now will shape the whole category.
—
1) WHAT HAPPENED
On June 5, 2023, Apple introduced Apple Vision Pro at WWDC and positioned it as the beginning of "spatial computing." Pricing was announced at $3,499 in the U.S., with availability "early next year" (2024). Apple also highlighted a privacy posture that felt unusually explicit for a sensor-heavy wearable.
This wasn't just another headset launch. It was Apple doing what Apple always does: taking a category that existed (mixed reality) and trying to standardize the cultural default for it.
And the default they're proposing is this:
- Your primary inputs are your eyes, your hands, and your voice.
- Your face is the device boundary.
- Your attention becomes the cursor.
—
2) THE REAL SHIFT: FROM "TOUCH" TO "LOOK"
Apple's press release spells it out: Vision Pro introduces an "entirely new input system" controlled by a person's eyes, hands, and voice.
This matters because "input" is not neutral. Input shapes what gets built.
- When computing is built around touch, the dominant problems are speed and friction.
- When computing is built around gaze, the dominant problems become intention, manipulation, and surveillance.
A gaze-based interface is powerful because it feels natural. You don't have to learn it.
That's also what makes it ethically volatile: the more natural it feels, the less you notice when it's shaping you.
The danger is not that the headset can track your eyes.
The danger is what happens if society normalizes the idea that "looking" is a form of disclosure.
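To Apple's credit, the software model they previewed keeps raw gaze away from developers. A minimal sketch, assuming the visionOS SwiftUI hover-effect API shown at WWDC23 (details may change before release): the app declares what is targetable, the system renders the highlight where you look, and app code runs only when you confirm with a pinch.

    import SwiftUI

    // The app marks a control as gaze-targetable. The system draws the
    // hover highlight out-of-process; the look itself never reaches app
    // code as coordinates.
    struct DispatchLink: View {
        var body: some View {
            Button("Read the dispatch") {
                // Runs only on the pinch "click," not on the look.
                print("Opened")
            }
            .hoverEffect(.highlight) // system-rendered gaze feedback
        }
    }

That boundary is the load-bearing wall of Apple's privacy story. Section 5 returns to whether it holds.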
—
3) EYES AS BIOMETRICS: OPTIC ID AND THE NEW BORDER CONTROL
Apple also introduced Optic ID, an iris-based authentication system. Their claim is strong:
- It analyzes your iris.
- The enrolled data is protected by the Secure Enclave.
- It's encrypted, not accessible to apps, and never leaves the device.
This is the "best case" version of biometric authentication: on-device, compartmentalized, and framed as a privacy feature.
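In practice, "compartmentalized" means an app never touches biometric data at all; it asks the system a yes/no question. A minimal sketch, assuming Optic ID is surfaced through the same LocalAuthentication framework that Face ID and Touch ID use today:

    import LocalAuthentication

    // The app requests authentication; the iris template stays in the
    // Secure Enclave and is never exposed to the caller.
    func unlockJournal() {
        let context = LAContext()
        var error: NSError?
        guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                        error: &error) else {
            print("Biometrics unavailable: \(error?.localizedDescription ?? "n/a")")
            return
        }
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock your journal") { ok, authError in
            // All the app ever receives is this boolean.
            print(ok ? "Unlocked" : "Denied: \(String(describing: authError))")
        }
    }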
But biometrics always carry a deeper truth: they convert your body into a key.
And keys invite gatekeepers.
The question you should ask isn't "Is Optic ID secure?"
It's:
- In a world where identity can be read from eyes, who demands access?
- Employers?
- Governments?
- Platforms that want "age verification" or "fraud prevention"?
- Insurance companies that want "risk scoring"?
Apple can protect Optic ID data locally. But once iris authentication becomes normal, the rest of the world will attempt to copy it badly.
And "copied badly" is where abuse lives.
—
4) THE PUBLIC SPACE PROBLEM: BYSTANDER CONSENT
Apple's Vision Pro is filled with cameras and sensors because it has to blend digital content with the physical world. Apple says camera and sensor data is processed at the system level so individual apps don't need to see your surroundings to enable spatial experiences.
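Concretely, that should mean apps receive derived geometry rather than camera frames. A minimal sketch, assuming the visionOS ARKit interfaces previewed at WWDC23 (names may shift before the SDK ships):

    import ARKit

    // The app subscribes to plane anchors: abstract surfaces with a pose
    // and an extent, not pixels of your living room. Raw camera frames
    // stay inside the system layer.
    func watchForSurfaces() async throws {
        let session = ARKitSession()
        let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
        try await session.run([planes]) // requires world-sensing permission

        for await update in planes.anchorUpdates {
            print("Plane \(update.anchor.id): \(update.anchor.alignment)")
        }
    }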
That's good engineering.
But it does not solve the public space problem.
Because the ethical unit here is not the user.
The ethical unit is the room.
If a device is worn on the face, it changes the social contract for everyone in its line of sight:
- People will wonder if they're being recorded.
- People will avoid sensitive conversations.
- People will perform differently around someone wearing a sensor mask.
Apple tries to address this with EyeSight (showing the user's eyes externally to reduce social disconnect) and a visual indicator that signals when the user is capturing a spatial photo or video.
These are thoughtful mitigations.
But they also reveal the scale of the problem: when you need external indicators to prove you're not recording, the technology has already shifted trust.
—
5) APPLE'S PRIVACY CLAIMS: IMPORTANT, SPECIFIC, AND A HIGH BAR
Apple makes unusually clear privacy promises in the Vision Pro press release:
- "Where a user looks stays private while navigating Apple Vision Pro."
- Eye tracking information is not shared with Apple, third-party apps, or websites.
- Sensor and camera data is processed at the system level so apps don't need to see your surroundings.
If Apple actually holds this line, it sets a new bar for the entire wearable computing industry.
But it also creates a new kind of ethical risk: privacy becomes a premium feature that only wealthy users can afford.
When the headset costs $3,499, "privacy by design" risks becoming "privacy by price."
—
6) WHO BENEFITS VS WHO PAYS (ACCOUNTABILITY BOX)
MOST HELPED
- People who need hands-free interaction (accessibility upside if the system is reliable).
- Creators and professionals who benefit from large, movable digital workspaces.
- Developers who build spatial tools for education, training, and assistive use cases.
MOST EXPOSED
- Bystanders in shared spaces who did not consent to be in a sensor-rich environment.
- Workers if companies adopt spatial computing for productivity monitoring and attention measurement.
- Users whose attention becomes a target for persuasion and behavioral design.
QUIET WINNERS
- Platform owners who can define the rules of what "spatial computing" is.
- Any ecosystem that can monetize attention once gaze becomes the universal input.
QUIET LOSERS
- Anyone who relies on anonymity in public spaces.
- Communities already over-policed or over-surveilled, where "new sensors" do not land equally.
—
7) WHAT WE SHOULD DEMAND (NOW, BEFORE THE DEFAULT SETS)
If spatial computing becomes normal, we need the guardrails to become normal too.
A) A BYSTANDER RIGHTS STANDARD
Clear norms for recording indicators, public-space use, and consent signals that don't depend on trusting the wearer.
B) A "NO GAZE ADS" LINE IN THE SAND
Advertising that uses eye tracking is not "personalization." It's coercion with higher resolution.
C) ENTERPRISE LIMITS
Workplace deployments must prohibit gaze analytics and attention scoring. "Productivity XR" cannot become "discipline XR."
D) AUDITABLE PRIVACY CLAIMS
Apple's promises are specific. Good. Now they should be auditable:
- What is processed on-device?
- What never leaves?
- What can apps infer indirectly?
E) ACCESSIBILITY WITHOUT EXTRACTION
If this device truly helps people interact hands-free, that benefit should not require surrendering privacy or becoming a dataset.
—
8) THE AUGMENTED HUMAN TEST
Apple is betting that the future of computing is not more screens.
It's more sensing.
If the cursor is your gaze, then the core ethical question is:
Will these systems respect attention as a human boundary —
or treat attention as a resource to be mined?
No hype. Just consequences.