- cross-posted to:
- [email protected]
Apple’s device didn’t present any major breakthroughs in technology that Meta hadn’t “already explored”
I don’t believe that. Apple have spent a long time on this and still won’t be ready for market until next year. Meta rushed an unfinished product that MZ put billions into.
You can see Apple are starting to take gaming a bit more seriously now, and with their emojis you can see the direction this is headed over the next 2-3 iterations.
Depends on how you look at it, but I would classify the smoothness of the gesture controls and camera feed to be quite a major breakthrough in usability as well.
Hasn’t foveated rendering been a holy grail for real PC gaming headsets like the Index, HTC Vive, etc., for quite a while? As far as I know, that technology isn’t supported in any current headset; if it were, it would drastically increase performance and graphical fidelity while keeping frame times much lower, under that 8/12 ms mark.
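To put rough numbers on that 8/12 ms mark: the frame budget is just 1000 ms divided by the refresh rate, and foveated rendering helps because only a small central region needs full resolution. A minimal sketch of the arithmetic (the 15% fovea size and 4x peripheral downscale below are made-up illustrative parameters, not specs from any actual headset):

```python
# Illustrative only: why frame budgets are so tight in VR, and roughly
# how much shading work a simple two-zone foveated scheme could save.
# The fovea size and peripheral downscale factor are assumptions.

def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

def foveated_shading_fraction(fovea_fraction: float = 0.15,
                              peripheral_scale: float = 0.25) -> float:
    """Fraction of full-resolution shading work remaining when a small
    central region is rendered at full resolution and the rest of the
    view is rendered at a reduced linear scale (pixel count scales with
    the square of the linear scale)."""
    peripheral = 1.0 - fovea_fraction
    return fovea_fraction + peripheral * peripheral_scale ** 2

print(f"90 Hz frame budget:  {frame_budget_ms(90):.1f} ms")   # ~11.1 ms
print(f"120 Hz frame budget: {frame_budget_ms(120):.1f} ms")  # ~8.3 ms
print(f"shading work vs. full-res: {foveated_shading_fraction():.0%}")
```

Under those assumed parameters the shading workload drops to roughly a fifth of full-resolution rendering, which is why eye-tracked foveation is such a big deal for hitting those budgets.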
I would classify the smoothness of the gesture controls and camera feed to be quite a major breakthrough in usability as well.
This is the first time in quite a while that I am legit giddy over a new piece of technology. While Apple knocked it out of the park with Apple silicon, a lot of people were wondering why Apple bothered in the first place, and why they licensed ARM just to engineer their own custom ARM chip. I don’t think it’s much of a leap to assume that Apple silicon development was driven by the Vision Pro. The M2 chips have amazing compute performance, and there’s zero way an Intel processor would meet the size, battery life, TDP, or overall heat requirements while matching the performance Apple has touted for the Vision Pro.
Another point to mention is the focus Apple put on text legibility during their keynote announcement of the thing. I have an OG PlayStation VR headset and an Index. Text smears. In every headset I’ve used, text smears. It’s decent in the Index, but playing something like Elite: Dangerous really makes it apparent thanks to the HUD elements on the edges of your vision. It’s super nice to be able to pin a web browser inside your ship’s cockpit for planning trade route jumps and such, but I would not want to use my Index for daily desktop work or continual training. I can’t imagine how miserable using a terminal emulator and vim would be in an Index. If Apple’s claims about text usability and general UI smoothness are true, which I completely believe given their obsession with UI/UX and aEstHeTic in general, that alone is game-changing.
For better or worse, certain values seem to have been ingrained into the fabric of Apple, to the point that most of Apple’s vision has remained relatively unchanged after Jobs. Zuckerberg hasn’t shown any vision other than “collect data on people and extract profit,” and no one at Facebook/Meta has done anything to suggest a different direction for the company. At least Amazon’s focus on efficiency and scale brought us AWS. The massive gap between what is essentially the same product from the two companies does a really good job of highlighting the deeper differences between them.
Maybe Facebook and Apple can work together to make a VR headset (that nobody wants) to explore the Metaverse (that nobody wants)