Quote from: vertigo on May 14, 2021, 16:39:28I would need a relevant (geographically) address to test. It's just a question of infrastructure. Getting those servers close enough to people. I tried play.geforcenow.com and got 6-7 ms. I know that is still not a game server. But I don't think Nvidia has a local server. I remember that over 15 years ago, I had about 8-9 ms to a local search engine (that much I'm sure of) and something in the low 10s for Google, but I'm fuzzy on that (perhaps only 10-11, not entirely sure).
That's 6 ms to Google, though, not to the game server, which may be more. Still, I suspect you could probably go up to 30-50 ms before most people would notice any difference, but it's an interesting thought nonetheless. And I've been unfortunate enough to have pretty lousy internet for the better part of the past decade, ranging from terrible to OK, with less than a year in there where I actually had really good internet, so I'm more hesitant to rely on it for gaming. Last night I had a ping of ~500-700 ms, which made even browsing unbearable, so I'd imagine even 100-200 ms would make gaming all but impossible. In fact, a few years ago, with the internet I had, I was lucky to get a ping below ~60-70 ms, and I rarely played games online with a friend just because it wasn't worth the frustration, not to mention the connection was downright unstable.
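If anyone wants to put rough numbers on latency and jitter without firing up a game, here's a minimal Python sketch that times TCP connects; the host and port are just examples, and a TCP handshake is roughly one round trip, so treat the results as ballpark figures only:

    # Rough latency/jitter estimate from TCP connect times.
    # The host/port below are examples; a handshake is ~one round trip.
    import socket
    import statistics
    import time

    def sample_rtt_ms(host="www.google.com", port=443, n=10):
        samples = []
        for _ in range(n):
            t0 = time.perf_counter()
            with socket.create_connection((host, port), timeout=2):
                pass  # we only care how long the handshake took
            samples.append((time.perf_counter() - t0) * 1000.0)
            time.sleep(0.2)  # be polite; don't hammer the server
        return samples

    rtts = sample_rtt_ms()
    print(f"min {min(rtts):.1f} ms, median {statistics.median(rtts):.1f} ms, "
          f"jitter (stdev) {statistics.stdev(rtts):.1f} ms")

The stdev line is what matters for the jitter complaint: a 60 ms ping that never moves can feel better in-game than a 30 ms ping that swings around.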
I'm guessing you don't use Steam for your games, since you don't own/control them that way. I think it's ridiculous, and should be illegal, that you can buy a game but not own it.
Quote from: _MT_ on May 14, 2021, 11:51:57
I don't agonize over latency. But then I don't compete in shooters. I just find it funny. I've got something like a 6 ms ping to Google, and pretty stable. Can you tell an extra 6 ms between mouse and display? But I wonder how it works when you've got 100+ ms. I know some people do have that much, even over 200 (I've got a friend in Australia who is so affected). And then there is jitter. My case is really that I don't like being dependent if I don't have to. And I like owning and controlling things. Some people never look back as there is always something new to play. I really like returning to old favourites. Of course, I'm dependent even offline, through compatibility.
Quote from: vertigo on May 13, 2021, 17:54:39I don't agonize over latency. But then I don't compete in shooters. I just find it funny. I've got something like a 6 ms ping to Google, and pretty stable. Can you tell an extra 6 ms between mouse and display? But I wonder how it works when you've got 100+ ms. I know some people do have that much, even over 200 (I've got a friend in Australia who is so affected). And then there is jitter. My case is really that I don't like being dependent if I don't have to. And I like owning and controlling things. Some people never look back as there is always something new to play. I really like returning to old favourites. Of course, I'm dependent even offline, through compatibility.
That's something I've always wondered about cloud gaming and one of the things we talked about. I would expect it to make games borderline unplayable, but he said he didn't notice any latency and it was like playing the game locally.
Quote from: vertigo on May 13, 2021, 17:51:38I know you're talking about a different situation. And as I said, I can imagine it very well for drawing tablets, for example. Or portable monitors. The biggest barrier is probably that a computer would have to be able to supply that much power (not a problem when you need 5 W, but the more you need, the more problematic it gets).
It could draw power from the laptop's battery, since if it's a low-power display, similar to the ones in laptops, it wouldn't be a huge drain on it. More likely, though, the laptop would be plugged in and would just pass power from the AC adapter through the USB cable to the monitor. And with a desktop, that would certainly make more sense, since you wouldn't need to, or be able to, power the desktop from the monitor. So my point is that either way, you need a cable between the computer and the display, and so it makes more sense, in the example I gave where you're using a portable touchscreen display to control the computer without having to sit at it, to have just that one cable supplying video, USB, and power from the computer, vs. having two cables attached to it, one going to the computer and one plugged into an outlet. And again, with a low-power display, this absolutely should be possible within the 100 W limit of USB (heck, 100 W should be enough to power even a ~30" gaming display). The bottom line is that it just doesn't make sense to have two cables plugged into it going to different places, seriously limiting its portability, vs. just one. That would make sense if you're actually on the laptop and just using the other display as a second monitor, and maybe that's what you're thinking, but remember, I'm talking about an entirely different situation.
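To sanity-check that 100 W claim, here's a trivial back-of-envelope budget in Python; the wattages are my own rough assumptions, not measured figures:

    # Back-of-envelope check against the 100 W USB PD ceiling.
    # All wattages below are rough assumptions, not measurements.
    PD_LIMIT_W = 100

    scenarios = {
        "portable touchscreen panel + USB peripherals": 10 + 10,
        "~30-inch gaming display": 60,  # assumed typical draw
    }

    for name, watts in scenarios.items():
        verdict = "fits" if watts <= PD_LIMIT_W else "exceeds budget"
        print(f"{name}: ~{watts} W -> {verdict} (of {PD_LIMIT_W} W)")

Even with generous estimates, a low-power panel leaves plenty of headroom; the limit is less the spec than whether manufacturers actually implement supplying that much.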
Quote from: _MT_ on May 13, 2021, 09:33:39
It's funny to see gamers agonize about latency - mouse latency, display latency, touchscreen latency. And then decide to add the Internet into the mix, which has significant latencies and jitter to boot.
Quote from: _MT_ on May 13, 2021, 09:17:13
Well, if the display isn't powered from the grid and there is only one cable connecting the laptop with the display, where does the power come from? The laptop's battery? Instead of charging it at home while it's "docked," you'd be discharging it? That's why you need two ports. And that's why I think it makes more sense (in the general case) to have the display powered from the grid and use it to charge your laptop over the same cable that is used to send video to the display.
My point was that USB is already there. As long as you can live with the 100 W limit.
Quote from: vertigo on May 13, 2021, 01:19:27It's funny to see gamers agonize about latency - mouse latency, display latency, touchscreen latency. And then decide to add the Internet into the mix, which has significant latencies and jitter to boot. We have the technology to build incredibly fast networks. But we are not using it. Take InfiniBand. At one point, it was projected to replace Ethernet, at least within datacentres (although it can be used even across long distances, between continents). It was cheaper, had higher bandwidth and much lower latency. It didn't. It established itself in supercomputers, but that's pretty much it. This demonstrates that even in large businesses, IT isn't entirely rational. There is quite a lot of conservatism. It gets only worse for wireless (and worse still if it goes via satellite).
Another consideration, and something a friend and I were discussing last night, is that gaming is going more online, toward a Netflix-type model, i.e. cloud gaming, which could eventually make much of this local hardware obsolete.
Quote from: vertigo on May 12, 2021, 16:03:42Well, if the display isn't powered from the grid and there is only one cable connecting the laptop with the display, where does the power come from? The laptop's battery? Instead of charging it at home while it's "docked," you'd be discharging it? That's why you need two ports. And that's why I think it makes more sense (in the general case) to have the display powered from the grid and use it to charge your laptop over the same cable that is used to send video to the display.
a) But that's the point, that in theory it should only require one port and one cable, at least if the technology improved enough.
b-c) I realize all that, and realize my example is a pipe dream at the moment, but I'm hopeful that with an increase in the capabilities of USB4 and in other technologies...
Quote from: _MT_ on May 11, 2021, 19:42:38Quote from: t4n0n on May 11, 2021, 14:22:49A significant problem is that a GPU can require a lot of power. A laptop would have to get it from somewhere. And it gets only worse as a laptop (and its battery) gets smaller. You can't expect a laptop that was never designed to have a dGPU to power a dGPU. Even if the box is built around a mobile chip. Even a measly 65 W mobile chip is too much for a typical ultrabook.
Accommodating GPUs that were designed for a desktop environment immediately introduces two constraints that cripple the original premise: a bulky form factor that is not amenable to portability and the requirement for mains power.
Quote from: _MT_ on May 12, 2021, 11:30:48
a) It would take up two ports on the laptop (one for a display and one for a power supply). Not necessarily a deal-breaker, but worth mentioning.
b) The laptop's power supply would have to be sized to accommodate a monitor, making it bigger, heavier and more expensive.
c) The connector used for the power supply would have to be able to carry enough power for both a laptop and a monitor. USB PD is not there - officially, it tops out at 100 W, which is plenty for ultrabooks, but not enough for an ultrabook plus a desktop monitor.
d) And when it comes to power, it's not just a monitor, but also all the peripherals connected to it.
I can't see an advantage compared to the other way around, where a monitor essentially acts as a power supply. So, a desktop monitor is permanently connected to the grid, potentially also to a wired computer network, speakers, camera, whatever, powering all that. And by connecting a single cable, you get both power for the laptop and data for all the peripherals. Thunderbolt is superior for this, but USB can work for anything that can be made as a USB device. You take up only one port on the laptop and you can keep a charger in your bag.
Yes, USB4 can be used to connect e.g. a drawing tablet to a desktop with a single cable. Assuming it can fit within a 100 W budget. But your computer has to be able to supply that much power on a USB port. Technically, USB allows it. But the motherboard manufacturer has to implement it. And similarly, the tablet's manufacturer has to build it so it supports USB PD. The thing here is that they can't assume a computer will be able to supply enough power, so there has to be an alternate means of powering it, and so they might not bother with USB PD on the video/data port at all (especially if only a few computers can supply enough power). If it's USB4, I think USB + video on one cable is a given. The potential complication here is that USB-C is not that common on desktops and that they will probably want backward compatibility and some flexibility. Another complication is that video output on a desktop is often on a video card. You have to get video and USB on the same port, not to mention PCIe for Thunderbolt. This is easier in a laptop with its higher integration.
Quote from: _MT_ on May 12, 2021, 11:30:48I forgot to add that what matters is not how much you use, but how big a budget the monitor has for peripherals. I don't want to rain on your parade, but the point is that this largely rests with device manufacturers, to take advantage of what is possible. The main limitation of USB that you come up against is the 100 W limit. I can imagine it going up, to accommodate more powerful laptops, potentially increasing its utility for other applications. But there is a limit. USB-C exists primarily for "ultraportable" devices. That won't be compromised.
d) And when it comes to power, it's not just a monitor, but also all the peripherals connected to it.
Quote from: vertigo on May 11, 2021, 19:52:46a) It would take up two ports on the laptop (one for a display and one for a power supply). Not necessarily a deal-breaker, but worth mentioning.
I was thinking more along the lines of a laptop passing power through from AC to the monitor, i.e. powering the monitor when the laptop is plugged in. But also for desktops, to be able to have just one cord going from the desktop to the monitor for everything. An example use case for either situation, and something I've wanted to do before, is to get a touchscreen monitor and have it connected via one USB cable providing power, video, and USB, so it could be used as a tethered tablet. This would be great for using a computer while in bed or on the couch.
Quote from: _MT_ on May 11, 2021, 19:30:25
USB4 will do that. As I wrote, you can have USB 3.2 and DisplayPort streams on the same cable. And Power Delivery works as well. Although you'd power a laptop from the display, not the other way around (laptops are not designed to supply a lot of power to external devices, as they are battery-powered). That way, you just connect one cable and the laptop charges, the display is connected, and USB peripherals as well. But that's it. You're limited to USB. You can't connect PCIe-based devices like GPUs, network cards or storage (if you want to go above USB 3 speeds). That's what you need Thunderbolt for.
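As a rough sanity check on why video plus USB data fits on one cable, the arithmetic below estimates the raw bandwidth of an uncompressed 4K60 stream against a 40 Gbps USB4 link; it ignores blanking intervals and encoding overhead, so it's a ballpark, not a spec calculation:

    # Ballpark: does uncompressed 4K60 video leave room for USB traffic
    # on a 40 Gbps USB4 link? Ignores blanking/encoding overhead.
    LINK_GBPS = 40  # top USB4 link rate; some links are only 20 Gbps

    width, height, fps, bits_per_pixel = 3840, 2160, 60, 24  # 8-bit RGB
    video_gbps = width * height * fps * bits_per_pixel / 1e9

    print(f"uncompressed 4K60 video: ~{video_gbps:.1f} Gb/s")
    print(f"left over for USB data:  ~{LINK_GBPS - video_gbps:.1f} Gb/s")

That works out to roughly 12 Gb/s for the video, leaving most of the link for USB, which is why the combination works in practice.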
Quote from: t4n0n on May 11, 2021, 14:22:49A significant problem is that a GPU can require a lot of power. A laptop would have to get it from somewhere. And it gets only worse as a laptop (and its battery) gets smaller. You can't expect a laptop that was never designed to have a dGPU to power a dGPU. Even if the box is built around a mobile chip. Even a measly 65 W mobile chip is too much for a typical ultrabook.
Accommodating GPUs that were designed for a desktop environment immediately introduces two constraints that cripple the original premise: a bulky form factor that is not amenable to portability and the requirement for mains power.