(Part 2 of 2)
One big benefit of 300 Hz is that it will help lower refresh rates become cheaper.
It also apparently helps advertise the ASUS brand, since they also offer lower-Hz displays.
And this laptop is also compatible with external GPUs. You can buy a Thunderbolt enclosure and install a GeForce RTX 2080 Super inside it, and push a few games to 300fps via that mechanism too. Other times, you want the 300 Hz because you play CS:GO. That's a popular game, and sometimes it's the only game a person plays. It's currently played in esports stadiums filled with thousands of people. This laptop definitely manages 300 frames per second in CS:GO, and many people buy it for that. But that's the outlier person, obviously.
Refresh rate progress is useful in so many ways -- including non-game benefits. (e.g. scientifically, the only way to get flickerless CRT-style clarity, or blurless sample-and-hold, or strobeless ULMB-class clarity, is to use unobtainium refresh rates -- but someday it will be cheap). Someday we'd love to see CRT-clarity scrolling on our laptop screens without needing to turn on an eye-straining, image-dimming strobe backlight mode. And things like that.
Unfortunately, it does take a big jump in Hz to get clearly noticeable benefits.
60Hz vs 120Hz is an 8.3ms refresh-time difference (8.3ms less motion blur).
120Hz vs 1000Hz is a 7.3ms refresh-time difference (7.3ms less motion blur).
So to get an improvement roughly similar to 60Hz->120Hz, in laboratory experiments we actually had to jump 120Hz->1000Hz just to achieve that. We are in for a very long technological slog to 1000Hz, but the benefits are impressive even for the average layperson.
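To make this arithmetic concrete, here's a quick back-of-envelope sketch in plain Python (assuming framerate matches Hz with full sample-and-hold persistence, so each frame is visible for simply 1000/Hz milliseconds):

    # Back-of-envelope: sample-and-hold persistence (ms) at each refresh rate.
    # Assumes framerate == refresh rate and full persistence (no strobing).

    def persistence_ms(hz: float) -> float:
        """Milliseconds each frame stays visible (~= MPRT motion blur)."""
        return 1000.0 / hz

    for old, new in [(60, 120), (120, 240), (240, 480), (120, 1000)]:
        saved = persistence_ms(old) - persistence_ms(new)
        print(f"{old}Hz -> {new}Hz: {saved:.1f}ms less motion blur")

    # 60Hz -> 120Hz: 8.3ms less motion blur
    # 120Hz -> 240Hz: 4.2ms less motion blur
    # 240Hz -> 480Hz: 2.1ms less motion blur
    # 120Hz -> 1000Hz: 7.3ms less motion blur

Notice how each doubling saves half as many milliseconds as the previous doubling -- that's the curve of diminishing returns in action.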
The problem is manifold. Having been hired by monitor manufacturers and headset manufacturers, I can say there are many limiting factors:
-- Pixel response interfering with refresh rate benefits -- a quoted 3ms pixel response is measured only between the GtG 10% and 90% points per the VESA standard (google GtG vs MPRT), so the full GtG 0%->100% transition is actually closer to 20ms (see the GtG sketch after this list). There are many scientific reasons for these cutoff points, but it does perpetuate mistrust in screen specs, and the slow transition tails do degrade the motion clarity differences between refresh rates.
-- People who love CRT (and tolerate flicker) versus people who love flickerfree (and put up with more blur). It's not possible to be flickerless & blurless simultaneously except at ultra-high refresh rates, so it ends up being a pick-your-poison tradeoff that many people do not realize exists.
-- Motion blur is dictated by pixel visibility time, whether that's unique refresh cycles packed densely (sample-and-hold), or brief flashes with long black periods between unique refresh cycles (ala ULMB strobing). That's why near-0ms GtG pixel response still has many milliseconds of MPRT motion blur -- a 0.5ms GtG OLED still has 16.7ms MPRT at 60Hz! It has never been possible to get 2ms MPRT strobelessly without doing at least ~500fps at ~500Hz (see the MPRT sketch after this list). GtG and MPRT are two different pixel response benchmarks.
-- The incrementalism of refresh rate increases. It's getting difficult to keep increasing refresh rates quickly; 144Hz->180Hz->200Hz->240Hz->300Hz is extremely incremental, while the benefits mainly show at doublings like 60Hz->120Hz->240Hz->480Hz->960Hz, due to the need to jump abruptly up the curve of diminishing returns.
-- Fortunately, there are experimental 480Hz through >1000Hz displays in the laboratory, so there's a progress path this century towards eventually commoditizing ultra-Hz refresh rates, at least within the next couple of human generations. Blur Busters does major advocacy/education work to clear up the misconceptions.
-- Even once refresh rates stop producing visible flicker, other benefits still appear (e.g. elimination of stroboscopic effects, or elimination of motion blur in a strobeless manner). It's whac-a-mole on the diminishing curve of returns. Now that displays have gone retina resolution, the focus is on other unturned stones, such as gradually marching towards retina refresh rates -- people laughed about 4K until 4K became cheap/common. 120Hz is only now becoming more mainstream (HFR, the 120Hz iPads, etc.), but it isn't this century's final frontier.
-- VR scientists trying to emulate a Star Trek Holodeck generally agree that the vanishing point of diminishing returns can go all the way into quintuple digits (for a retina-FOV, retina-resolution wraparound display, at least), though most scientific experiments test weaker variables (e.g. not retina resolution) that peter out at the high hundreds or low thousands of Hz, so there is much disagreement on the exact number. However, the estimates converge on needing well into the quadruple digits to simultaneously eliminate wagonwheel effects / stroboscopic effects / blur / etc. (i.e. make it look like real-life analog motion, with no added blur above-and-beyond human vision).
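Since the GtG cutoff issue (pixel response bullet above) is easier to see with numbers, here's a minimal sketch of the 10%->90% measurement window. The luminance curve is entirely made up for illustration (real GtG is measured with a photodiode and an oscilloscope), and 1%->99% stands in for the full 0%->100% transition, since an exponential tail never mathematically reaches 100%:

    # Hypothetical sketch: why the GtG 10%->90% window under-reports the
    # full pixel transition. The curve below is invented for illustration.

    def crossing_time(samples, threshold):
        """Time (ms) when luminance first crosses threshold, given
        (time_ms, luminance 0.0-1.0) samples, via linear interpolation."""
        for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
            if v0 < threshold <= v1:
                return t0 + (threshold - v0) / (v1 - v0) * (t1 - t0)
        return None

    # Made-up exponential LCD-style transition: fast at first, slow tail.
    samples = [(t, 1 - 0.5 ** (t / 4.0)) for t in range(0, 41)]

    t10, t90 = crossing_time(samples, 0.10), crossing_time(samples, 0.90)
    t01, t99 = crossing_time(samples, 0.01), crossing_time(samples, 0.99)
    print(f"GtG 10%->90%: {t90 - t10:.1f}ms")  # ~12.7ms, the spec-sheet style number
    print(f"GtG  1%->99%: {t99 - t01:.1f}ms")  # ~26.5ms, slow tail included

The slow tail beyond the 90% cutoff never shows up in the spec-sheet number, yet it's exactly the part that smears motion between refresh cycles.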
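And for the MPRT bullet: a similarly hedged sketch of persistence arithmetic, assuming MPRT is approximately the fraction of each refresh cycle the pixel stays visible (full persistence for sample-and-hold, backlight pulse width for strobing):

    # Hypothetical sketch: MPRT is visibility time, independent of GtG speed.

    def mprt_ms(hz: float, duty_cycle: float = 1.0) -> float:
        """Approximate MPRT: the portion of the refresh cycle a pixel is lit."""
        return (1000.0 / hz) * duty_cycle

    print(f"{mprt_ms(60):.1f}ms")         # 16.7ms: 0.5ms-GtG OLED, 60Hz sample-and-hold
    print(f"{mprt_ms(500):.1f}ms")        # 2.0ms:  strobeless 2ms MPRT needs ~500fps@500Hz
    print(f"{mprt_ms(120, 0.25):.1f}ms")  # 2.1ms:  or strobe a 25% duty cycle at 120Hz

The strobed route gets the same ~2ms MPRT at a quarter of the refresh rate, but it reintroduces the flicker tradeoff from the pick-your-poison bullet above.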
It's a fascinating science!