Well, here’s another example of the level tech journalism has sunk to.
163-inch 4K Micro-LED television that one home theater expert described as “tall as Darth Vader.” Each of the TV’s 8.3 million pixels is an independent, miniscule LED, a feat for which TCL charges over $100,000.
But here’s the real surprise: TCL’s new TV isn’t the most pixel-dense or exotic display ever produced.
No fucking shit, Sherlock. It is trivial these days to buy a laptop with a much smaller screen but exactly the same 3840x2160 = 8,294,400 pixels on it. Smaller screen, same number of pixels, more pixel-dense. The Sony Xperia Z5 Premium is a phone with that same pixel count.
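For anyone who wants the receipts, the density math is one-liner territory (diagonals here are nominal; the Z5 Premium's panel is 5.5"):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a display of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same 3840x2160 pixel grid at three sizes: TCL's TV, a 15.6" laptop
# panel, and a 5.5" phone (the Xperia Z5 Premium's screen size).
for name, diag in [('163" TV', 163.0), ('15.6" laptop', 15.6), ('5.5" phone', 5.5)]:
    print(f'{name}: ~{ppi(3840, 2160, diag):.0f} PPI')
```

Roughly 27, 282, and 801 PPI respectively. Same pixel count, wildly different density.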
Duh…?
The Vision Pro is wireless out of the box, but it’s somewhat heavy, struggles with meager battery life which, and can’t match the fidelity of Varjo or Pimax headsets.
Apparently nobody proofreads or does any copy editing anymore, either. Or maybe the whole damn thing is outsourced to ChatGPT now, who the fuck knows.
Ah yeah, because I have a frame of reference for how tall Darth Vader is! Like, shit, how do I know if he’s tall, or if he’s average and everyone else in SW is short? That doesn’t help.
Heh, Darth Vader is an imperial unit.
I chuckled
You’re a Rogue One.
Mr. Grinch
The pixel density is 7 Aladeens!
Americans will go to great lengths in order to avoid using metric
Well if you’d played Vader Immortal in VR, you would.
He’s 6’6".
It’s definitely written by someone who’s never used a VR headset. It only takes a second to realize that these screens are nowhere near the resolution of your eye. Ya know, ’cause small text that would be easily readable on my phone is blurry as fuck on a VR headset.
I can see someone who last tried VR 10 years ago putting on an Apple Vision Pro and being shocked at how high the resolution is, only to be informed it’s a modest increase over other current headsets and that they’re all pretty clear now. But really, they should know that if it were anywhere near “retina resolution”, Apple would have been all over making that claim.
I’m a bit surprised at the IEEE hosting nontechnical articles. How long have they published “news” in this capacity? Archive.org suggests 2021, but it may have been earlier. Seems a poor decision for an ostensibly professional website to branch out like this. God, I hope .gov sites never start hosting blogspam.
maybe the whole damn thing is outsourced to ChatGPT now, who the fuck knows.
I don’t understand why so many people assume an LLM would make glaring errors like this…
…because they frequently do? Glaring errors are like, the main thing LLMs produce besides hype.
They make glaring errors in logic, and confidently state things that are not true. But their whole “deal” is writing proper sentences based on predictive models. They don’t make mistakes like the excerpt highlighted.
Y’know what, that’s a fair point. Though I’m not the original commenter from the top, heh.
Ah apologies, I’m terrible with tracking usernames, I’ll edit for clarity.
No worries mate. I appreciate the correction regardless.
I’m imagining that the first output didn’t cover everything they wanted so they tweaked it and pasted the results together and fucked it up.
That could easily happen with reconfiguring their own writing as well, though.
Pretty soon glaring errors like this will be the only way to identify human vs LLM writing.
Then soon after that the LLMs will start producing glaring grammatical errors to match the humans.
I think he was being sarcastic lol. I…hope
ChatGPT writes much better than this article, here’s its response to your comment as an example:
Your frustration with the state of tech journalism is understandable, especially when it comes to covering high-end technology like the 163-inch 4K Micro-LED television by TCL. It’s true that on a surface level, comparing the pixel density of large screens to that of smaller devices like laptops or smartphones can seem straightforward. However, the engineering and design challenges involved in scaling up screen technology while maintaining image quality cannot be understated. TCL’s achievement lies not just in the number of pixels but in creating a large-scale display that maintains high-quality imagery, which is a different kind of technological feat compared to miniaturization.
Regarding the editorial errors and the critique of the writing quality, it’s clear that tech journalism, like many fields, faces challenges in maintaining high standards amid the fast pace of technology news and the pressure to publish quickly. While it’s disappointing to see, it highlights the ongoing need for rigorous proofreading and editorial oversight in publishing. However, attributing these issues to automation or outsourcing without evidence might not fully capture the complexities and pressures faced by publishers today. It’s crucial for the industry to address these issues to maintain credibility and provide the insightful, accurate tech coverage that readers deserve.
Yeah but I can’t trust that at all. It may very well be complete bullshit, it just happens to be composed in a way that appears meaningful.
Remember what the sources are: you, me, marketing talk from product webpages… certainly not the brain of a display engineer at Sony.
“Did you know that the human eye only sees in 720p at 30fps? Your computer isn’t better than my console” \s
24 fps*
I bought the 4k 120fps eyes. Sadly, not all of real life is available in HD anyway.
Call me when they hit the FOV limit too
Oh great, another round of nonsense about the limits of human vision, peddled by A) companies trying to trick you into thinking their products are great, B) fools trying to cope with their buyer’s remorse and envy, and C) people with not-so-great eyesight who, for some reason, think that’s inconceivable.
We are nowhere near the limits of human visual acuity. It is trivial to prove this by experiment.
Not sure I’d call $3500 trivial…
(/s…sorta?)
The resolution and pixel density of the Vision aren’t that much higher than the Q3’s. I wanna try one to compare after seeing that, because I can’t believe it looks so much better that it’s worth the $3,000 more it costs. At least for VR; I know the cameras for the AR are way better.
I think it’s significantly higher than the Quest 3, but it’s kind of ridiculous to compare a $3500 productivity headset to a $500 gaming headset in the first place.
It’s hard to get totally accurate numbers without independent standardized evaluation. Calculating pixel density isn’t as straightforward with headsets as it is with regular displays.
There’s an interesting analysis of a bunch of different headsets on Reddit. They put in a comparison column for equivalent viewing distance with different common monitor sizes/resolutions. E.g., they calculate that the density of the Apple Vision Pro is similar to a 32" 4K display at a mere 15"/38cm distance, which is definitely close enough to see pixels. These are only estimates, since we don’t know the per-eye FOV, or how exactly it’s warped from center to edge.
Reddit link: https://www.reddit.com/r/virtualreality/comments/18sfi3i/ppdfocused_table_of_various_headmounted_displays/
Direct spreadsheet link: https://docs.google.com/spreadsheets/d/1_Af6j8Qxzl3MSHf0qfpjHM9PA-NdxAzxujZesZUyBs0/edit?usp=sharing
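If anyone wants to sanity-check the spreadsheet’s method, here’s a rough version of the math. Heads up: the per-eye pixel count and FOV below are guesses for illustration, not official Apple specs.

```python
import math

def headset_ppd(h_pixels, h_fov_deg):
    """Crude average pixels-per-degree: per-eye horizontal pixels / FOV.
    Real optics warp density from center to edge, so this is only a ballpark."""
    return h_pixels / h_fov_deg

def monitor_ppd(width_px, height_px, diagonal_in, distance_in):
    """Pixels per degree at the center of a flat monitor viewed head-on."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    # One degree of visual angle spans ~2 * d * tan(0.5 deg) inches at distance d.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Assumed numbers: ~3660 horizontal px per eye, ~100 deg horizontal FOV.
avp_ppd = headset_ppd(3660, 100)             # ~37 PPD
monitor = monitor_ppd(3840, 2160, 32, 15)    # 32" 4K viewed at 15" -> ~36 PPD
```

Both land in the mid-30s PPD, which matches the spreadsheet’s “32" 4K at 15 inches” comparison, and is still well short of the ~60 PPD usually quoted for 20/20 vision.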
I mean, it’s still really good, don’t get me wrong. But there’s a giant chasm between “really good” and “the eye’s resolution limits”.
What do you mean it’s not that much higher? It’s well over double the resolution. That’s a lot.
And there are the other aspects: Apple has Micro-OLED panels vs. LCD, virtually zero screen-door effect, very, very good video passthrough, very low latency on the passthrough. Plus a bunch of other crap.
But it doesn’t really matter; they’re not comparable. The Vision Pro, to me, seems more like an engineering exercise on Apple’s part, mixed with a dev kit to put in developers’ hands. It’s not meant to compete against a $500 gaming and porn consumption headset.
The Vision Pro is a cool engineering marvel. But it has no real place in the market for any normal person. Nobody outside of devs banking on future Apple VR should buy it.
It’s the framerate and response lag that are going to make it a motion sickness machine for folks like me.
And sadly, it gets worse as I age, so VR is running a losing race.
We have to speed up technology so that it outpaces us humans getting older!
PPD is really the big deal with the eye so close and foveated rendering being used. Was curious to see if they mentioned the limit of the human eye’s PPD resolution but I didn’t see it. Otherwise, a good article on the technology.
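For the curious, here’s the usual back-of-the-envelope number for the eye’s PPD limit (a rule of thumb, not a hard physiological constant):

```python
# 20/20 vision resolves features of ~1 arcminute (1/60 of a degree),
# so you need roughly 60 pixels per degree before pixels disappear.
# Plenty of people resolve a bit finer than 1 arcminute, so treat
# this as a floor rather than a ceiling.
arcmin_per_degree = 60
acuity_arcmin = 1.0
retina_ppd = arcmin_per_degree / acuity_arcmin
print(retina_ppd)  # 60.0
```

Current headsets sit around half that, which is why small text still looks soft.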
But nowhere close to the human eye’s dynamic range…
I’m curious what this actually is. Yes, we can see under moonlight and also at noon in the tropics, but not at the same time. It’s somewhat akin to the dynamic range of a camera: an 8-bit B&W camera has a gigantic dynamic range if you allow the shutter, aperture, and gain settings to be adjusted.
In other words, while the dynamic range of my eye over the course of an hour is maybe 60dB*, there is no way I can use that dynamic range in a single scene/“image”.
*Just a guess from sunlight at ~1kW/m^2 to moonlight at roughly one millionth of that (super hand wavy I know).
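FWIW the guess is self-consistent: a millionfold power ratio works out to exactly 60 dB.

```python
import math

# Hand-wavy, as stated above: direct sunlight ~1e3 W/m^2 and
# moonlight roughly a millionth of that, ~1e-3 W/m^2.
sunlight_w_m2 = 1e3
moonlight_w_m2 = 1e-3
db = 10 * math.log10(sunlight_w_m2 / moonlight_w_m2)
print(f'{db:.0f} dB')  # ~60 dB
```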
A screen is not a camera. Dynamic range in a display matters so that your eyes can adjust to it, the way they adjust to the environment. Real life can obviously have comically dark and comically bright parts simultaneously.
I’m not sure if this is entirely true, but I think a YouTuber calculated that each eye is equivalent to ~500 megapixels.
Resolution is the least interesting figure, yet it’s the first thing reviews harp on and the thing manufacturers push hardest.
For presence, latency is what matters.
For immersion, FOV is what matters.
For adoption, cost is what matters.
I maintain that some absolute toy is what’s gonna break the market open. Dirt cheap, immediate, convenient, and with static specs that make current owners scoff. It can have potato graphics so long as it feels rock-solid. (And doesn’t make you sign in to a computer that’s strapped to your goddamn face.)
The trick is gonna be intermediate representation. We’re still using direct raster to bitmaps, from software. This is quite frankly insane. It’s a misunderstanding of why we have bitmaps. The refresh rate of old monitors had to be kept precise or else things got fucky. Generating pixels on-the-fly worked, but it was limited by hardware speeds. Showing a big dumb array of pixels instead simplified the technology and decoupled display from rendering.
But VR displays don’t need to refresh the same pixels every fraction of a second - they need to refresh the same scene every fraction of a second. The same surfaces should stay put while you move your head, even if updating those surfaces takes a moment. The modern equivalent of a simple video card reading off a big dumb array of pixels would be a big dumb array of colored balls floating in open space. The further, the bigger. If some very simple technology can reliably render that at 200 Hz, then it doesn’t matter how long a game needs in order to update all those balls.
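A toy sketch of what that decoupling could look like (hypothetical code, not any real headset API): a slow game loop mutates the point list whenever it can, while a fast display loop reprojects whatever is there using only the newest head pose.

```python
import math

def rotate_yaw(point, yaw):
    """Rotate a 3D point around the vertical axis (head turning left/right)."""
    x, y, z = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x + s * z, y, -s * x + c * z)

def project(point, focal=1.0):
    """Pinhole projection of a view-space point onto the image plane."""
    x, y, z = point
    if z <= 0:
        return None  # behind the viewer, not drawn
    return (focal * x / z, focal * y / z)

# The "big dumb array of colored balls": world-space position plus color.
# A slow game loop can rewrite this list at whatever rate it manages.
scene = [((0.0, 0.0, 2.0), "red")]

def display_frame(head_yaw):
    """Runs at display rate (say 200 Hz), independent of scene updates:
    reprojects the same stored scene with only the latest head pose."""
    frame = []
    for pos, color in scene:
        p = project(rotate_yaw(pos, -head_yaw))
        if p is not None:
            frame.append((p, color))
    return frame

# The ball stays put in the world as the head turns, even if the game
# hasn't touched `scene` for many frames: it drifts left in the image
# when you look right.
straight = display_frame(0.0)
turned = display_frame(math.radians(10))
```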