The Digital Society blog raises the question of bit rates for computer games.
The study assumed that computer games were effectively compressible to 100 Mbps, which the researchers say is 8 times higher than HDTV. But I don’t know how this number came about, since computer games (even the most realistic) are not as detailed as live video. This is why even Hollywood has a hard time convincing us we’re looking at live shots instead of computer graphics. Compression is an arbitrary number, because we can choose any compression level we want, depending on how much data we are willing to discard.
Actual 1920×1080 resolution gaming requires 3000 Mbps of data going from the video card to the display, and at no time is it ever compressed.
We spent a lot of time investigating this issue; as the post says, it has a big effect on our total byte estimate. I’ll sketch the process we used. We are looking for better estimates – ideas and data welcome! The first step was to divide game-playing devices into about 10 categories, such as PS2-level consoles, Xbox 360-level consoles, low-end computers, and high-end gaming computers. Second, for each category, we calculated the “raw” bit rate they are theoretically capable of, based on screen size, frames per second, and bits per pixel. For example, a 1280 x 960 pixel screen running at 60 frames per second with 24 bits per pixel produces 1.769472e9 bits, or about 1.77 gigabits (Gbps), per second. 1280 x 960 is modest resolution for a gaming computer. The 60 frames per second figure depends on the game and on the power of the video processor.
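The raw bit-rate arithmetic in step two is simple enough to show directly. This is an illustrative sketch, not the study's actual code; the function name is mine.

```python
def raw_bitrate_bps(width, height, fps, bits_per_pixel):
    """Theoretical uncompressed video bandwidth in bits per second."""
    return width * height * fps * bits_per_pixel

# The 1280 x 960 gaming-computer example from the text:
print(raw_bitrate_bps(1280, 960, 60, 24))    # -> 1769472000, ~1.77 Gbps

# The same arithmetic at 1920 x 1080 gives ~2.99 Gbps, which matches
# the "3000 Mbps" figure quoted above:
print(raw_bitrate_bps(1920, 1080, 60, 24))   # -> 2985984000
```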
The third step was the hard one. TV programs are generally compressed about 50:1, meaning that the actual bandwidth is 1/50 of the theoretical bandwidth. 50:1 compression of the 1280 x 960 example works out to about 35 Mbps, compared with 4 Mbps for conventional TV. But bandwidth is a scarce resource for TV programs; it is NOT a scarce resource for computer games, which only have to send the video stream a few feet to a monitor. Furthermore, 2008 mid-level Graphics Processing Units have theoretical speeds of more than 10 gigabits per second! (I will try to look up some exact numbers and publish a link, but see Tom’s Hardware and similar reviews of cards.)
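For concreteness, here is the 50:1 arithmetic applied to the 1280 x 960 example (illustrative only):

```python
# Raw bandwidth of the 1280 x 960, 60 fps, 24-bit example.
raw_bps = 1280 * 960 * 60 * 24        # 1.769472e9 bits/s

# Apply TV-style 50:1 compression and convert to megabits per second.
compressed_mbps = raw_bps / 50 / 1e6
print(round(compressed_mbps, 1))      # -> 35.4, vs ~4 Mbps for broadcast TV
```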
Why not just use the raw bandwidth of these games? When you play a game, even a first-person-shooter or a basketball game, there is considerable continuity from one frame to the next. A target runs across a mostly fixed background, for example. In the Shannon sense, only the new parts of the image should be counted as “information,” and that is what we attempted with our measurements.
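The idea of counting only the "new" parts of each frame can be sketched with a toy experiment: compress each frame as a delta from the previous one and see how much a mostly static scene shrinks. Everything here is invented for illustration (the frame sizes, the moving "target", and the use of zlib as a stand-in compressor); it is not the measurement method we actually used.

```python
import zlib

def delta_compression_ratio(frames):
    """Compress each frame as a byte-wise XOR delta from the previous one,
    and return raw size / compressed size over the whole sequence."""
    raw_bytes = 0
    compressed_bytes = 0
    prev = bytes(len(frames[0]))          # all-zero reference frame
    for frame in frames:
        delta = bytes(a ^ b for a, b in zip(frame, prev))
        raw_bytes += len(frame)
        compressed_bytes += len(zlib.compress(delta))
        prev = frame
    return raw_bytes / compressed_bytes

# Toy sequence: a fixed background with one small target sliding across it,
# like the first-person-shooter example above.
W = H = 64
frames = []
for t in range(30):
    frame = bytearray(W * H)              # static background (all zeros)
    for x in range(8):                    # 8-pixel-wide moving target
        frame[(H // 2) * W + (t + x) % W] = 255
    frames.append(bytes(frame))

# Only the target's movement is "new" information each frame, so the
# ratio comes out far higher than the ~10:1 seen with real game footage.
print(round(delta_compression_ratio(frames), 1))
```

A real game produces far less redundant frames than this toy scene, which is why measured ratios are much lower.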
I ran some experiments on a couple of games, and came up with compression ratios around 10:1. But my experiments were very crude, and limited by time. I’ve now hired a grad student to do more. I had hoped that Nvidia or others could provide this information, but so far, no such luck. There are also issues of play style and game type. If I play a casual game like Plants versus Zombies, no matter how big the screen and how fast my GPU, the true resolution is low.
So at this point, I think our numbers are reasonable, but they could be off by a factor of 2 in either direction. This would affect the byte total (3.6 ZB in 2008), but computer games and TV would still be the biggest sources of bytes, by far.