The Digital Society blog raises the question of bit rates for computer games.
The study assumed that computer games were effectively compressible to 100 Mbps, which the researchers say is 8 times higher than HDTV. But I don’t know how this number came about, since computer games (even the most realistic) are not as detailed as live video. This is why even Hollywood has a hard time convincing us we’re looking at live shots instead of computer graphics. Compression is an arbitrary number because we can choose any compression level we want, depending on how much data we are willing to discard.
Actual 1920×1080 resolution gaming requires 3000 Mbps of data going from the video card to the display, and at no time is it ever compressed.
We spent a lot of time investigating this issue; as the post says, it has a big effect on our total byte estimate. I’ll sketch the process we used. We are looking for better estimates – ideas and data welcome! The first step was to divide game-playing devices into about 10 categories, such as PS2-level consoles, Xbox 360-level consoles, low-end computers, and high-end gaming computers. Second, for each category, we calculated the “raw” bit rate they are theoretically capable of, based on screen size, frames per second, and bits per pixel. For example, a 1280 x 960 pixel screen running at 60 frames per second with 24 bits per pixel is 1.769472e9 bits (about 1.77 gigabits) per second, abbreviated Gbps. 1280 by 960 is modest resolution for a gaming computer. The 60 frames per second depends on the game and the power of the video processor.
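For readers who want to check that arithmetic, here is a minimal sketch of the raw-rate calculation; the two resolutions are just the examples mentioned in this thread, not the full list of device categories from the study.

```python
# Raw (uncompressed) video bit rate: pixels per frame x bits per pixel x frames per second.
def raw_bitrate_bps(width, height, bits_per_pixel=24, fps=60):
    return width * height * bits_per_pixel * fps

for name, (w, h) in {"gaming computer, 1280 x 960": (1280, 960),
                     "full HD, 1920 x 1080": (1920, 1080)}.items():
    bps = raw_bitrate_bps(w, h)
    print(f"{name}: {bps:,} bits/s  (~{bps / 1e9:.2f} Gbps)")

# gaming computer, 1280 x 960: 1,769,472,000 bits/s  (~1.77 Gbps)
# full HD, 1920 x 1080: 2,985,984,000 bits/s  (~2.99 Gbps)
```

The second figure is where the roughly 3000 Mbps for uncompressed 1920×1080 gaming mentioned above comes from.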
The third step was the hard one. TV programs are generally compressed about 50:1, meaning that the actual bandwidth is 1/50 of the theoretical bandwidth. 50:1 compression of the 1280 x 960 example works out to 35 Mbps, compared with 4 Mbps for conventional TV. But bandwidth is a scarce resource for TV programs; it is NOT a scarce resource for computer games, which only have to send the video stream a few feet to a monitor. Furthermore, 2008 mid-level Graphics Processing Units have theoretical speeds of more than 10 gigabits per second (Gbps)! (I will try to look up some exact numbers and publish a link, but see Tom’s Hardware and similar reviews of cards.)
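As a quick check of how those figures relate, dividing the raw rate from the sketch above by a chosen compression ratio gives the delivered rate; the 4 Mbps figure for conventional TV is the one quoted in the paragraph above.

```python
def delivered_mbps(raw_bps, ratio):
    """Delivered bit rate in Mbps after compressing a raw stream by ratio:1."""
    return raw_bps / ratio / 1e6

raw_1280x960 = 1280 * 960 * 24 * 60   # ~1.77 Gbps, from the example above
print(f"50:1 on 1280 x 960 @ 60 fps: {delivered_mbps(raw_1280x960, 50):.0f} Mbps")
# -> 35 Mbps, versus roughly 4 Mbps for conventional TV
```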
Why not just use the raw bandwidth of these games? When you play a game, even a first-person-shooter or a basketball game, there is considerable continuity from one frame to the next. A target runs across a mostly fixed background, for example. In the Shannon sense, only the new parts of the image should be counted as “information,” and that is what we attempted with our measurements.
I ran some experiments on a couple of games, and came up with compression ratios around 10:1. But my experiments were very crude, and limited by time. I’ve now hired a grad student to do more. I had hoped that Nvidia or others could provide this information, but so far, no such luck. There are also issues of play style and game type. If I play a casual game like Plants versus Zombies, no matter how big the screen and how fast my GPU, the true resolution is low.
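The post doesn’t spell out how those crude measurements were done, so the sketch below is purely illustrative: it estimates frame-to-frame redundancy by zlib-compressing successive frame differences, roughly in the Shannon spirit described above. zlib is far weaker than a real video encoder, and the synthetic frames, array shapes, and function name are all assumptions of mine, so any ratio it prints should not be read as the study’s number.

```python
import zlib
import numpy as np

def crude_compression_ratio(frames):
    """Rough compression ratio from zlib-compressing inter-frame differences.

    frames: list of uint8 arrays with identical shape (H, W, 3), e.g. captured screenshots.
    Returns raw_bytes / compressed_bytes.  A real video codec would do much better.
    """
    raw_bytes = sum(f.nbytes for f in frames)
    compressed_bytes = len(zlib.compress(frames[0].tobytes()))      # first frame sent whole
    for prev, cur in zip(frames, frames[1:]):
        diff = cur.astype(np.int16) - prev.astype(np.int16)         # mostly zeros if frames are similar
        compressed_bytes += len(zlib.compress(diff.tobytes()))
    return raw_bytes / compressed_bytes

# Synthetic example: a small bright "target" moving across a fixed background.
rng = np.random.default_rng(0)
background = rng.integers(0, 256, size=(240, 320, 3), dtype=np.uint8)
frames = []
for t in range(30):
    frame = background.copy()
    frame[100:120, 10 + 5 * t : 30 + 5 * t] = 255
    frames.append(frame)

print(f"crude ratio: {crude_compression_ratio(frames):.1f}:1")
```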
So at this point, I think our numbers are reasonable, but they could be off by a factor of 2 in either direction. This would affect the byte total (3.6 ZB in 2008), but computer games and TV would still be the biggest sources of bytes, by far.
1. Games are not compressed.
2. 3D-rendered graphics contain less information than live video. Even the most advanced computer graphics still don’t look real. So if we were to compress game video (for remote rendering), it would need an equal or lower bit rate than live HD video.
3. What does it matter what the bitrate is? Are we actually consuming it?
Compression of 1080-60p Blu-ray is 75:1, since 3000 Mbps gets compressed to 40 Mbps, which is relatively low compression for video. Broadcast 1080-60i is 100:1 compression, because 1500 Mbps is compressed to 15 Mbps, which is still relatively low compression compared to what you get via digital satellite “fake HD”.
I don’t know where you get the 50:1 compression levels unless you’re talking about standard definition television.
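For what it’s worth, those ratios do follow directly from the rates quoted above; a two-line check:

```python
# Compression ratio = raw bit rate / delivered bit rate, using the figures quoted above.
for label, raw_mbps, delivered_mbps in [("1080-60p Blu-ray", 3000, 40),
                                        ("broadcast 1080-60i", 1500, 15)]:
    print(f"{label}: {raw_mbps / delivered_mbps:.0f}:1")
# 1080-60p Blu-ray: 75:1
# broadcast 1080-60i: 100:1
```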
Pingback: HMI Bonus Material: Video Game Screenshots « Roger Bohn's Blog