
Topic summary

Posted by Mark Rejhon
 - September 09, 2019, 00:48:23
Hello Allen,

A good twitter thread addressed to NotebookCheck:
twitter.com/BlurBusters/status/1170806444301717504
Posted by [email protected]
 - September 08, 2019, 22:42:09
Correct, humans can't see high Hz directly.

However, there are indirect artifacts caused by refresh rate limitations, and removing them has human-visible benefits. Here is some photographic proof:

Effect #1: Stroboscopic effects, easy to tell 240Hz vs 480Hz in this photo:

IMGUR:
i.imgur.com/PxvWkcK.jpg

Effect #2: Reduction of motion blur.  Double fps & Hz = Half motion blur on a non-strobed display (even OLEDs).

CHART:
blurbusters.com/wp-content/uploads/2019/04/motion_blur_from_persistence_on_sample-and-hold-displays.png

You can see for yourself with TestUFO tests and other material.
Posted by doa379
 - September 06, 2019, 18:31:01
The human brain doesn't perceive anything more than a few Hz unless you're actually practising some visualisation exercise, much less 300Hz. There seems little point to high Hz. Even playing games at anything more than 25-30Hz, the motion looks somewhat superficial. Nevertheless, experimenting with high Hz does pose a worthy avenue for research.
Posted by Mark Rejhon
 - September 06, 2019, 04:00:21
(Part 2 of 2)

One big benefit of 300 Hz is that it will help lower-Hz displays become cheaper.
It also apparently helps advertise the ASUS brand, and they offer lower-Hz displays too.

And this laptop is also compatible with external GPUs.  You can buy a Thunderbolt box and install a GeForce RTX 2080 Super inside it, and push a few games to 300fps that way too.  Other times, you want the 300 Hz because you play CS:GO.  That's a popular game, and sometimes it's the only game people play.  It's currently played in esports stadiums filled with thousands of people.  This laptop definitely manages 300 frames per second in CS:GO, and many people buy it for that.  But that's the outlier person, obviously.

Refresh rate progress is useful in so many ways -- including non-game benefits.   (e.g. scientifically, the only way to get a flickerless CRT, or blurless sample-and-hold, or strobeless ULMB, is to use unobtainium refresh rates -- but someday they will be cheap).   Someday we'd love to see CRT-clarity scrolling on our laptop screens without needing to turn on an eye-straining, image-dimming strobe backlight mode.   And things like that.

Unfortunately it does take a big jump in Hz to get quite noticeable benefits.

60Hz vs 120Hz is an 8.3ms refresh-time difference (8.3ms less motion blur).
120Hz vs 1000Hz is a 7.3ms refresh-time difference (7.3ms less motion blur).

So to get an improvement roughly similar to 60Hz->120Hz, laboratory experiments actually had to jump from 120Hz to 1000Hz just to achieve it.   We are in for a very long technological slog to 1000Hz, but the benefits are impressive even for the average layperson.
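The arithmetic behind those numbers can be sketched in a few lines of Python (a minimal illustration of the diminishing-returns math described above, not anyone's official tool):

```python
# On a sample-and-hold display, each frame is visible for 1000/Hz milliseconds,
# and that persistence is what the eye perceives as motion blur.

def persistence_ms(hz: float) -> float:
    """Frame visibility time in milliseconds for a given refresh rate."""
    return 1000.0 / hz

# 60Hz -> 120Hz removes 8.3ms of persistence...
step1 = persistence_ms(60) - persistence_ms(120)
# ...but 120Hz -> 1000Hz only removes a further 7.3ms.
step2 = persistence_ms(120) - persistence_ms(1000)

print(f"60->120Hz:   {step1:.1f} ms less blur")
print(f"120->1000Hz: {step2:.1f} ms less blur")
```

Each halving of persistence requires a doubling of refresh rate, which is why the jumps needed for a visible improvement get so large so quickly.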

The problem is manifold.  Having been hired by monitor manufacturers and headset manufacturers, I can say there are many limiting factors:

-- Pixel response interfering with refresh rate benefits.  A 3ms pixel response is measured per the VESA standard only from the GtG 10%->90% points (google "GtG vs MPRT"), so it's actually closer to 20ms for GtG 0%->100%.  There are scientific reasons for these cutoff points, but they perpetuate mistrust in screen specs, and slow pixel response genuinely degrades the motion clarity differences between refresh rates.

-- People who love CRT (tolerating flicker) versus people who love flicker-free (putting up with more blur).  It's not possible to be flickerless and blurless simultaneously except at ultra-high refresh rates, so it ends up being a pick-your-poison that many people do not realize.

-- Motion blur is dictated by pixel visibility time, whether that's unique refresh cycles packed densely, or brief flashes with long black periods between unique refresh cycles (ala ULMB strobing).   That's why 0ms GtG pixel response still has many milliseconds of MPRT motion blur -- a 0.5ms GtG OLED at 60Hz has 16.7ms MPRT!    It has never been possible to get 2ms MPRT without doing at least ~500fps at ~500Hz.  The two pixel response benchmarks are different.
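That GtG-versus-MPRT distinction can be shown with the same persistence arithmetic (a rough sketch of the point above; the numbers follow directly from the refresh period, not from any measurement of a specific panel):

```python
# Why a near-0ms GtG OLED still shows 16.7ms MPRT: without strobing,
# MPRT is set by how long each frame stays lit, which on a
# sample-and-hold display is the full refresh period -- GtG speed
# doesn't shorten it.

def sample_and_hold_mprt_ms(fps: float) -> float:
    """MPRT (persistence) of a non-strobed display driven at fps."""
    return 1000.0 / fps

print(sample_and_hold_mprt_ms(60))   # ~16.7 ms, even with 0.5ms GtG
print(sample_and_hold_mprt_ms(500))  # 2.0 ms: needs ~500fps at ~500Hz
```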

-- The incrementalness of refresh rate increases.  It's getting difficult to keep increasing refresh rates quickly: 144Hz->180Hz->200Hz->240Hz->300Hz is extremely incremental, while the benefits mainly show at doublings like 60Hz->120Hz->240Hz->480Hz->960Hz, due to the need to jump abruptly up the curve of diminishing returns.

-- Fortunately, there are experimental 480Hz thru >1000Hz displays in the laboratory, so there's a progress path this century towards eventually commoditizing ultra-Hz refresh rates, at least in the next couple of human generations.   Blur Busters does a major job of advocacy/education to clear up misconceptions.

-- Even when refresh rates stop flickering, other benefits still appear (e.g. elimination of stroboscopic effects, or the elimination of motion blur in a strobeless manner).   It's whac-a-mole on the diminishing curve of returns.  Now that displays have gone retina resolution, the focus is on other unturned stones, such as gradually marching towards retina refresh rates -- people laughed about 4K until 4K became cheap/common.   120Hz is only now becoming more mainstream (HFR, the 120Hz iPads, etc), but it isn't this century's final frontier.

-- VR scientists trying to emulate a Star Trek Holodeck have all quickly agreed that the vanishing point of diminishing returns can go all the way into the quintuple digits (for a retina-FOV, retina-resolution wraparound display, at least), though most scientific experiments run tests with weaker variables (e.g. not retina resolution) that peter out at the high hundreds or low thousands of Hz.  So there is much disagreement.  However, the agreements all converge well into the quadruple digits to eliminate the wagon-wheel effects / stroboscopic effects / blur / etc simultaneously (making it look like real-life analog motion, with no added blur above-and-beyond human vision).

It's a fascinating science!
Posted by Mark Rejhon
 - September 06, 2019, 03:42:24
Founder of Blur Busters here.

One problem is that pixel response produces a major limiting factor in quality of high refresh rates.   Pixel response needs to be a tiny fraction of refresh duration, in order to avoid interfering with motion clarity.

Once pixel response becomes a non-issue, doubling refresh rate does halve scrolling motion blur -- making web page scrolling noticeably easier to read, assuming you are using a smooth-scrolling web browser.   

The diminishing curve of returns is certainly sharp but does not disappear -- it does take roughly a doubling of refresh rate to see the difference more clearly.   e.g. 120 Hz -> 240 Hz -> 480 Hz -> 960 Hz  ...Or perhaps, 144 Hz -> 300 Hz.

At a 3ms pixel response time, things will begin to bottleneck somewhat.   300Hz at 3ms will probably have slightly more motion blur than 240Hz at 1ms.

However, 1ms laptop screens at these refresh rates currently don't exist, so even the earlier 240 Hz laptop screens are more motion-blurry than 240 Hz desktop monitors.

There are some flagship technologies that need to be pushed for high refresh rates:

-- Pixel response time that is a tiny fraction of the refresh cycle.  0.5ms GtG pixel response becomes necessary for good 480 Hz operation.   Testing found a human-visible difference between 0.5ms and 1ms GtG in certain situations.

-- Frame rate amplification technologies (less powerful GPUs producing higher frame rates).  Some VR headsets such as the Oculus Rift use such technology to laglessly and artifactlessly convert 45fps to 90fps, in a way that is superior to interpolation.

Also, MPRT and GtG are different pixel response benchmarks.  Google "GtG versus MPRT" for more information; that will explain the difference better.

If you have an NVIDIA GSYNC monitor with ULMB, you can see for yourself the human-visible difference of 0.5ms MPRT versus 1.0ms MPRT by doing these steps.   ULMB is a strobe backlight that reduces display motion blur, flickering kind of like a CRT.  ULMB is adjustable from ~0.25ms to roughly ~1ms-1.5ms using a setting called "ULMB Pulse Width".

To compare 0.5ms MPRT vs 1.0ms MPRT with your own human eyes (more than 90% of people could tell the difference doing this specific test), first switch your GSYNC monitor to ULMB (usually the 120 Hz refresh rate will automatically enable the "ULMB" setting in the monitor menus).

1. Go to TestUFO Photo Test (google 'TestUFO Photo')
2. Select Toronto Map (pulldown near top of page)
3. Select motion speed 3000 pixels/second (pulldown near top of page)
4. You won't be able to read the street name labels at default ULMB which is about 1ms to 1.5ms MPRT
5. Now open your monitor OSD menu and select "ULMB Pulse Width" adjusting down to about 30% or 50%
6. Now you can read street name labels!

That's because 1ms of persistence = 3 pixels of motion blur at 3000 pixels/sec.  That blurs the tiny map text during the panning motion.  Reducing persistence to 0.5ms fixes this.
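The arithmetic in that last paragraph is just speed times persistence; a quick sketch (an illustration of the formula above, nothing more):

```python
# Blur trail length in pixels = motion speed x persistence (MPRT).

def blur_px(speed_px_per_sec: float, mprt_ms: float) -> float:
    """Motion blur trail length in pixels for a given MPRT."""
    return speed_px_per_sec * (mprt_ms / 1000.0)

# At the TestUFO panning speed of 3000 px/sec:
print(blur_px(3000, 1.0))  # 3.0 px of blur -- small map labels smear
print(blur_px(3000, 0.5))  # 1.5 px of blur -- labels become readable
```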

As screens get bigger and reach retina resolution, even tiny milliseconds of persistence become more visible.  So higher resolution begets higher-Hz benefits.  It's hard to do both simultaneously, because performance at higher resolution is often lower.   But this guarantees technology progress will continue, because the human-visibility benefits are still there.  Much like 4K is cheap now, someday 240Hz and even 1000Hz will be cheap in a few decades (adding only a small premium), just like buying a 4K TV instead of a 720p TV.
Posted by Allen.Ngo
 - September 05, 2019, 19:20:31
Quote from: My Perspective on September 05, 2019, 07:52:34
For fast-paced 1st-person shooters, especially online multiplayer competitive ones, there are advantages to be had all the way up to 1000Hz when it comes to motion clarity (i.e. blur reduction).  This has been shown by Blur Busters (google "Blur Busters 1000Hz" to find the article).

Yes, so 300Hz on laptop displays is more than just the 'benefit' of an increased number of "VSync divisors" - but aren't most of these screens G-Sync anyway, which makes the "VSync divisor" argument irrelevant, because there is no screen tearing on G-Sync screens?  I think the benefit of higher refresh rate screens comes down to motion blur reduction; that's really the main benefit.  Decreased motion blur also means more information is being sent to your eyes/brain during fast motion at higher fps, more information on which to base in-game decisions, which is a benefit to competitive play.

A fair number of gaming laptops do not come with G-Sync. So far, Optimus appears to be more common from what we've seen. But great points!
Posted by My Perspective
 - September 05, 2019, 07:52:34
For fast-paced 1st-person shooters, especially online multiplayer competitive ones, there are advantages to be had all the way up to 1000Hz when it comes to motion clarity (i.e. blur reduction).  This has been shown by Blur Busters (google "Blur Busters 1000Hz" to find the article).

Yes, so 300Hz on laptop displays is more than just the 'benefit' of an increased number of "VSync divisors" - but aren't most of these screens G-Sync anyway, which makes the "VSync divisor" argument irrelevant, because there is no screen tearing on G-Sync screens?  I think the benefit of higher refresh rate screens comes down to motion blur reduction; that's really the main benefit.  Decreased motion blur also means more information is being sent to your eyes/brain during fast motion at higher fps, more information on which to base in-game decisions, which is a benefit to competitive play.
Posted by Redaktion
 - September 05, 2019, 04:52:09
No, you don't need a stable 144 FPS to take advantage of a 144 Hz display. For gaming laptops in particular where CPU and GPU power are limited, these 240 Hz and 300 Hz high refresh rate panels are more useful for their wide range of divisors than their native refresh rate.

https://www.notebookcheck.net/Do-you-really-need-a-240-Hz-or-300-Hz-laptop-display-The-common-misconception-about-high-refresh-rate-monitors.434392.0.html