How compressed are computer games?

The Digital Society blog raises the question of bit rates for computer games.

The study assumed that computer games were effectively compressible to 100 Mbps, which the researchers say is 8 times higher than HDTV. But I don’t know how this number came about, since computer games (even the most realistic) are not as realistic as live video, due to the lack of detail. This is why even Hollywood has a hard time convincing us we’re looking at live shots instead of computer graphics. Compression is an arbitrary number because we can choose any level of compression we want, depending on how much data we are willing to discard.

Actual 1920×1080 resolution gaming requires 3000 Mbps of data going from the video card to the display, and at no time is it ever compressed.
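That 3000 Mbps figure is plausible arithmetic. Here is a quick sketch of where it comes from, assuming 24-bit color and a 60 Hz refresh rate (the quote specifies neither):

```python
# Uncompressed video bandwidth from the video card to the display.
# Assumptions (not stated in the quote): 24-bit color, 60 Hz refresh.
width, height = 1920, 1080
bits_per_pixel = 24
frames_per_second = 60

bits_per_second = width * height * bits_per_pixel * frames_per_second
print(f"{bits_per_second / 1e6:.0f} Mbps")  # ~2986 Mbps, i.e. roughly 3000 Mbps
```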

We spent a lot of time investigating this issue; as the post says, it has a big effect on our total byte estimate.

The Myth of Dick Fuld « The Baseline Scenario

Wall Street defenders like to point to Dick Fuld, who supposedly lost $1 billion by holding on to Lehman Brothers stock that eventually became worthless. You don’t get more of a long-term incentive than that, the argument goes.

Lucian Bebchuk, Alma Cohen, and Holger Spamann have exploded this myth in a Financial Times op-ed and a new paper.

via The Myth of Dick Fuld « The Baseline Scenario.

From Art to Science: what it means

Most of my research right now is about the evolution of technologies. They go from crafts, requiring skilled experts, to “engineering science,” i.e. mostly automated and very precise. For example, firearms manufacturing took 200 years to undergo this shift. Flying took about 100 years to go from the Wright Brothers to autonomous aircraft (not just unmanned, but self-directed). How does this happen? Is it a good thing?

Here is a talk I gave on this topic. (Caution: 5 MB PDF file) The subtitle is Why old tasks get easier, but everything gets more complex.

Bohn knowledge evolution 2007

I’m working on a book on this subject, which does side-by-side comparisons of:

  • Flying
  • Medical care – several kinds
  • Firearms manufacturing (from Napoleon to 1980)
  • Semiconductor manufacturing

Each of them has undergone major transformations, with similar patterns.

How 3.6 Zettabytes of Data Get Consumed – hmi – Gizmodo

You probably already saw that the average American tears through 34GB of data per person per day. Here’s how the media has evolved these last few decades (sorry print), and below a way to compare your consumption with Joe Average.

This chart breaks down each activity by hours, bytes, and words for the total population, average per user, and average per American in 2008. There’s a lot to process here, but my first reaction is: that many people still watch TV in standard def?


via How 3.6 Zettabytes of Data Get Consumed – hmi – Gizmodo.
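For scale, the two headline numbers are roughly consistent with each other. A quick back-of-the-envelope check (the population figure is my own round assumption, not from the post):

```python
# Rough cross-check: does 3.6 zettabytes a year square with
# ~34 GB per person per day? (US population figure is my assumption.)
total_bytes = 3.6e21          # 3.6 zettabytes, reported US total for 2008
population = 305e6            # approximate US population in 2008
per_person_per_day = total_bytes / population / 365
print(f"{per_person_per_day / 1e9:.0f} GB/person/day")  # ~32 GB, close to the 34 GB headline
```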

————————————-

Response

What surprised me is not the amount of standard definition TV, but the large amount of live TV overall. It’s hard for me to find people who watch TV live – they either TiVo it, watch something on the Internet, or rent DVDs with the shows they want. Unfortunately, the Nielsen TV data that we used lumps anything seen within 72 hours of original broadcast into its “live audience” report. Also, our data is for 2008; by 2010 I expect to find a lot less truly live TV. Finally, it is possible that the people I interact with are not typical. But even my computerphobic father watches recorded C-SPAN.

RB

Intel-AMD case

The lawsuit between Intel and AMD was settled a few weeks ago, five years after it was first filed by AMD. This was a private civil lawsuit; various government investigations are still going. According to reports, Intel agreed to pay $1.25 billion ($1,250,000,000) and to extend various license agreements with AMD.

I was an expert witness on the case, which was an interesting experience, to say the least. As an academic, I wish there were a way to use the data from civil lawsuits for academic research. Of course, the confidential data itself has to be kept secret, but in principle it should often be possible to find ways to provide high-level analysis without revealing anything private. As far as I know, though, this is never done. Certainly the standard agreements for using the data don’t allow it, and in practice it would probably take an agreement by both sides. The companies don’t seem to have much incentive to allow this. (I’m not speaking of AMD or Intel, or anything about this case in particular.)

There’s one situation where some of the data does come out, namely if the case actually goes to trial. An example was the US v. Microsoft case in the late 1990s. But this is very rare – most cases settle before trial, and even when they don’t, I gather that agreements to seal the data are standard.

The Economist praises a dangerous and obsolete management concept

The Economist just published a short article in praise of the experience curve. Even the first sentence is wrong. Here’s their lead-in.

The more experience a firm has in producing a particular product, the lower its costs

The experience curve is an idea developed by the Boston Consulting Group (BCG) in the mid-1960s.

Actually, no. The experience curve, also known as the learning curve, goes back to the aircraft industry before World War II. (An excellent review of the history, and of its application to management up to 1980, is J.M. Dutton, A. Thomas, and J.E. Butler, “The history of progress functions as a managerial technology,” Business History Review 58 (2) (1984).)
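For readers who haven’t seen it, the standard formulation says that unit cost falls by a fixed percentage every time cumulative output doubles: C(n) = C(1) · n^(−b). A small sketch of the claim (the numbers are illustrative only, not from any real industry):

```python
# The classic experience/learning curve: unit cost falls a fixed
# percentage with every doubling of cumulative output.
# C(n) = C1 * n**(-b), where b = -log2(progress ratio).
import math

def unit_cost(n, first_unit_cost=100.0, progress_ratio=0.8):
    """Cost of the n-th unit on an '80% curve' (20% drop per doubling)."""
    b = -math.log2(progress_ratio)        # ~0.322 for an 80% curve
    return first_unit_cost * n ** (-b)

for n in (1, 2, 4, 8, 1024):
    print(f"cumulative unit {n:>4}: cost {unit_cost(n):6.1f}")
# 100.0, 80.0, 64.0, 51.2, ... each doubling cuts cost by 20%.
# Note what the theory implies: whoever has the most cumulative output
# always has the lowest cost - exactly the claim my comment below disputes.
```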

Here’s my comment on the Economist article:

It’s sad to see such an obsolete and downright dangerous theory get this favorable write-up. BCG (and later Bain) ruined numerous businesses by persuading them to blindly follow “the experience curve.”

The danger in the Experience Curve concept is that it claims that improvement is _inevitable_ and _the same for everyone in an industry_. Neither of these is remotely correct. If it were correct, the biggest firm would be able to reduce its costs faster than everyone else, and would become unassailable. This was exactly the theory behind BCG’s matrix, and it’s WRONG. General Motors was bigger than Toyota until 2008, but Toyota had lower costs, and faster declining costs, since at least 1965 or 1970. For decades GM claimed this was due to lower labor costs, but that was refuted in the book The Machine That Changed the World, which showed that Toyota (and others) were much more efficient than US auto makers per labor hour.

It’s certainly true that, properly managed, experience can facilitate improvement. But there’s been 25 years of research now showing that improvement requires deliberate effort, and that the improvement process takes careful management. Toyota, through JIT and the Toyota Production System, essentially invented a system for making more rapid improvement – hence it surpassed GM and everyone else, while a fraction of their size. The semiconductor industry had its own epiphany about the folly of the experience curve, when a major research project run out of Berkeley surveyed a variety of fabs and found vastly different performance that had little to do with scale or cumulative experience.

Even BCG no longer claims the experience curve is valid, as far as I know. (I’d be happy to hear from others who have encountered BCG’s views in the last 5 years.)

I could go on and on (and I did, in stuff I wrote 20 years ago on this topic)! We need to drive a stake through the heart of this idea. It’s not that it’s totally and utterly wrong, because the learning curve has some ex post validity. But it has little predictive power, and even less as a normative theory of how to manage learning!