All fine and good. As I work with video, I'm more interested in them adding HW acceleration for HEVC 10-bit 4:2:2, a codec that a lot of cameras shoot in today: Sony, Canon, DJI and more. Intel Arc is the only GPU (and Intel CPUs with an iGPU the only CPUs) that supports it as of now. It has been supported since Intel 10th Gen, so well before the 30-series cards.
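If you want to check whether your own footage is actually in that format, here's a quick sketch (assuming ffprobe from FFmpeg is on PATH, and using a made-up file name) that reads out the codec and pixel format of the first video stream. ffmpeg typically reports 10-bit 4:2:2 HEVC as hevc / yuv422p10le.

import json
import subprocess

def video_format(path: str) -> tuple[str, str]:
    """Return (codec_name, pix_fmt) of the first video stream via ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    return stream["codec_name"], stream["pix_fmt"]

codec, pix_fmt = video_format("A001C001.MP4")  # hypothetical camera clip
# ffmpeg usually labels 10-bit 4:2:2 as yuv422p10le
if codec == "hevc" and pix_fmt == "yuv422p10le":
    print("HEVC 10-bit 4:2:2 - needs Arc / recent Intel iGPU for HW decode")
else:
    print(f"{codec} / {pix_fmt}")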
so if you already have it in hardware, why would you need another piece of hardware to do the same thing? *puzzled look*
I was under the impression HEVC was commonly supported now. Aren't we moving to AV1?
It is, for normal compressed video: 8-bit 4:2:0. However, cameras capture at higher color quality: 10-bit 4:2:2. That matters for color grading, where the camera shoots in a LOG profile and grading is needed to get a good look. 10-bit 4:2:0 is supported by many GPUs, but it carries less color information, and many cameras record only 4:2:2, not 4:2:0, chroma subsampling. The difference isn't obvious until you start color grading your videos; then the colors can get strange and create artifacts that you don't want.
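To put rough numbers on that extra color information, here's a little back-of-the-envelope sketch (my own example, counting plain uncompressed samples at UHD, not what the codec stores on disk):

def bits_per_frame(width: int, height: int, bit_depth: int, subsampling: str) -> int:
    """Raw bits for one frame: one luma plane plus two chroma planes."""
    luma = width * height
    if subsampling == "4:4:4":      # chroma at full resolution
        chroma = 2 * width * height
    elif subsampling == "4:2:2":    # chroma halved horizontally
        chroma = 2 * (width // 2) * height
    elif subsampling == "4:2:0":    # chroma halved horizontally and vertically
        chroma = 2 * (width // 2) * (height // 2)
    else:
        raise ValueError(f"unknown subsampling: {subsampling}")
    return (luma + chroma) * bit_depth

w, h = 3840, 2160  # UHD
delivery = bits_per_frame(w, h, 8, "4:2:0")    # typical delivery format
camera   = bits_per_frame(w, h, 10, "4:2:2")   # typical LOG capture format
print(f"8-bit 4:2:0:  {delivery / 8 / 2**20:.1f} MiB per frame")
print(f"10-bit 4:2:2: {camera / 8 / 2**20:.1f} MiB per frame")
print(f"ratio: {camera / delivery:.2f}x")

That works out to roughly 12 MiB vs 20 MiB of raw samples per UHD frame, about 1.67x as much data for 10-bit 4:2:2 as for 8-bit 4:2:0, which is the extra chroma detail that keeps grading from falling apart.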