Quote from: cfb on July 29, 2022, 18:47:58
I'm one of the weirdos that doesn't need discrete graphics, but I'd like a newer model chip to play 4k/60 youtube vp9 videos without dropping frames.
I have personally seen, many times, how Edge on the old M$ engine reported zero dropped frames on 4k@60fps YouTube video, while at the same time I could clearly see stutters and micro-judder that simply shouldn't be there.
Trusting YouTube's own statistics is not respecting yourself, so any test built on them is essentially worthless. Only with your own eyes can you verify how smooth and stable playback really is.
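As an aside: the dropped-frame counter people usually quote is not the only thing a browser can report. Here is a minimal sketch, assuming a Chromium-based browser that supports requestVideoFrameCallback and a hypothetical video#player element, of how one could log the spacing between frames that were actually presented, which is where the judder lives even when the drop counter reads zero:

```ts
// Sketch: watch frame-presentation pacing on a <video>, not just drop counts.
// Assumes requestVideoFrameCallback support (Chromium/WebKit); "video#player" is hypothetical.
const video = document.querySelector<HTMLVideoElement>('video#player');

if (video && 'requestVideoFrameCallback' in video) {
  let lastPresentation = 0;

  const onFrame = (_now: DOMHighResTimeStamp, meta: VideoFrameCallbackMetadata) => {
    if (lastPresentation > 0) {
      const delta = meta.presentationTime - lastPresentation;
      // At 60fps the ideal spacing is ~16.7ms; flag anything noticeably off.
      if (Math.abs(delta - 1000 / 60) > 4) {
        console.log(`frame pacing off: ${delta.toFixed(1)}ms between presented frames`);
      }
    }
    lastPresentation = meta.presentationTime;
    video.requestVideoFrameCallback(onFrame);
  };

  video.requestVideoFrameCallback(onFrame);

  // For comparison: the counter people usually quote, which can read "0 dropped"
  // even while the pacing between presented frames is visibly uneven.
  setInterval(() => {
    const q = video.getVideoPlaybackQuality();
    console.log(`dropped ${q.droppedVideoFrames} of ${q.totalVideoFrames} frames`);
  }, 5000);
}
```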
Secondly, for video to run smoothly with minimal jitter, it is not enough for the SoC (or a discrete video card) to render 60 frames per second at a stable cadence; the monitor or laptop panel must also be able to draw each of those frames completely, otherwise synchronization is visibly broken anyway. At 60fps each frame has a budget of only about 16.7ms, yet most laptop panels in the "60Hz" class have a response time of over 30ms, which means that on complex frames they simply cannot finish drawing within a stable 60fps cadence and synchronization breaks down, even while the browser brazenly reports in its statistics that there were no drops...
Today there is practically no notebook processor line that is _really_ capable, without embarrassment, of delivering a 4K@60fps picture on YouTube without breaking synchronization, and doing so over long stretches of hours of playback. Such processors simply do not exist.
Meanwhile, on my projector I personally watch perfectly smooth playback for hours from a local file at 720p/1080p and 50/60fps. Until the same level of quality is achieved on YouTube at 4k@60fps and above, there is no point in talking about high-quality video playback.
In general, smartphones have completely outclassed the x86 camp in this area, to x86's shame...