I think you may be confusing the function of the dongle with the hardware actually running the games, because you, like all of us, are used to having local hardware such as a PC, laptop, tablet, or console. Cloud is not the same thing. The cloud is a computer you connect to remotely from a device running locally. The local device sends inputs to the cloud computer and displays images from it, but the cloud computer is the one doing all the heavy lifting and compute.

Basically, the server renders the graphics of the game and transmits them to the dongle. So the GPU on the server side can send (1) native 4K, (2) checkerboard-upscaled, or (3) AI-upscaled 4K/8K images to the dongle. The dongle then decodes/decompresses the images and sends them to your TV. So with that being said, AI upscaling will be done by the GPU at the cloud server level, not locally at the dongle level. What the dongle does is decompress and output the images it receives to a local screen. Where you could get improvements at the dongle is image decoding/decompression (H.264, H.265, VP9, AV1, etc.).

Aside from that, AI upscaling would be interesting, assuming Google can do it. There is AI-based upscaling currently available in Vulkan, but few games use it, just like few games use Nvidia's DLSS. That may change with next-gen consoles and whatever AMD are going to announce on Wednesday. However, seeing that the guys at Stadia are not talking about features or a future vision for the platform, we can assume that Stadia is still using AMD's Vega 10 GPU architecture, which is limited to FP16 compute and will not be capable of doing that upscaling in anywhere near an efficient way. Remember, Vega 10 is from 2016/2017. AMD have since released Vega 20 (Radeon VII) and Navi 10/RDNA 1 (the RX 5000 series), and on Wednesday will be announcing Navi 2X/Big Navi/RDNA 2 (the RX 6000 series, and what is in the PS5 and the Xbox Series X and S). At this stage we can only say that Stadia is previous gen.
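To see why the dongle's job is decoding rather than rendering, it helps to run the numbers. A rough sketch (the 35 Mbit/s stream figure is an illustrative assumption for a 4K60 game stream, not an official Stadia spec):

```python
# Back-of-the-envelope: raw 4K60 video is far too big to stream,
# which is why the server encodes and the dongle decodes.
# The 35 Mbit/s compressed-stream figure is an assumption.

width, height = 3840, 2160      # 4K resolution
bytes_per_pixel = 3             # uncompressed 8-bit RGB
fps = 60

raw_bits_per_second = width * height * bytes_per_pixel * 8 * fps
print(f"Raw 4K60: {raw_bits_per_second / 1e9:.1f} Gbit/s")  # ~11.9 Gbit/s

# A modern codec (H.265/VP9/AV1) can bring a 4K60 stream down to
# tens of Mbit/s, i.e. a compression ratio of several hundred to one.
stream_mbps = 35
ratio = raw_bits_per_second / (stream_mbps * 1e6)
print(f"Compression ratio needed: ~{ratio:.0f}:1")  # ~341:1
```

That several-hundred-to-one ratio is the whole reason codec support (and hardware decode efficiency) is where the dongle could actually improve.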
I know Google have custom tensor compute (i.e. AI/ML compute) processors, but there is no mention of those being used on Stadia from what I remember. This is why I keep saying the guys at Stadia need to get beyond just releasing games; we trust them that the games are coming. What they need to do instead is tell us about upgrade paths and features. Since the rollout of Stadia is in countries where PC and console purchases are typically high, Stadia needs to give us a good reason to buy games on Stadia rather than on PC or consoles. With SSDs dropping in price and internet speeds improving, not having to wait for downloads really is not a good reason to buy on Stadia. Added to this, there are other cloud services like Shadow.tech that give you access to your PC library, also do 4K, and, since they use Nvidia GPUs, already have or will eventually get ray tracing and DLSS. I may be an exception, but I actually don't care about the games on Stadia. I believe the games will come, and with 113 games so far either already live on Stadia or coming in the next month, Stadia has more than a lot of games for its first year. Right now, what I want to see from Stadia, and this covers the AI upscaling, is Stadia talking more about the server-side hardware and the potential behind the tech they have at their disposal.