The Obama administration's "Internet freedom" agenda — already tarnished — is on the line, and at least this time, officials seem to realize that their actions will have a direct effect on their foreign policy. …

There are signs, however, that the Obama administration is learning that it can't have a "do as I say, not as I do" policy when it comes to Internet freedom. During the SOPA debate, the State Department refused to comment on the bill despite virtually the entire tech industry complaining that it would amount to mass censorship. A spokesperson even released a statement at the time saying, "The Department of State does not provide comment on pending legislation," despite a provision that would have made much of the circumvention software the department is funding — to the tune of tens of millions of dollars — illegal.

In stark contrast this time around, Secretary of State Hillary Clinton's senior advisor for innovation, Alec Ross, was the first U.S. official to definitively say, "The Obama administration opposes CISPA," as he matter-of-factly told the Guardian on Monday. Prior to that, the administration had only released a broad statement saying that "privacy and civil liberties" should be preserved in any cybersecurity bill.
I’m in Grading Hell right now, which makes me open to alternatives. The idea of “gamifying” university courses is intriguing.
I attended a conference, Overloaded 2012, last week in San Francisco. Here's a summary from one of the directors of the Information Overload Research Group (IORG). IORG is small, but it is addressing deep questions about how constant deluges of information affect our working and thinking.
My "immediate action" conclusion was to start using #hashtags in my email. Examples: #invite, #meet, #teach (for messages to colleagues about teaching). I also plan to write longer subject lines in my emails – "6 words" is one suggestion.
And, I’ll take a look at a book by Jonathan Spira, an IORG director, who I’ve talked with but never met.
I spoke yesterday at The Economist conference, Information: Making Sense of the Deluge. The speakers were very interesting, and I was excited to go. The stated and sincere intent was to get a discussion going, including the audience. But the format was like watching TV news (and I don't mean PBS): toss someone on stage, let them roll a 6-minute video, answer a few questions, and on to the next. This format was good for Twitter-bytes, but not for thinking or reflecting or building ideas.
One of the topics was Nick Carr on how “pseudo-multitasking” is hurting our brains. The conference itself seemed to follow the same format. This morning, for example, the schedule for the first 50 minutes has 7 different people on stage, in four sessions:
Act II: Bottom up: Information for people

8.35 am – Flash of genius: How to translate the internet. Luis von Ahn, A. Nico Habermann Associate Professor, Carnegie Mellon University

8.45 am – Flash of genius: Turning information into knowledge. Amit Singhal, Engineer, Google

8.55 am – Data exhaust: The intersection of search and big data. Luc Barthelet, Executive Director, Wolfram Alpha; Arkady Borkovsky, Chief Technology Officer, Yandex Labs. Moderator: Kenneth Cukier, Business Correspondent, The Economist

9.15 am – Flash of genius: The information entrepreneur. Scott Yara, Vice-president, Products and Co-founder, Greenplum
We recently completed another major "How Much Information?" report. This one measures how much information servers processed, worldwide. Answer: we estimated 9.57 ZB in 2008. The coverage was broad, although we got less general press than for our previous report on information consumption. I was impressed to see us reach 130,000 Google hits in a few days (try searching on the string 9570000000000000000000), until I realized that most of them are spam sites – they just duplicate hot items in hopes that Google will point to them instead of to the original.
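For scale, that search string is just the report's total written out in bytes. A minimal sketch of the conversion, assuming the SI definition of a zettabyte (10^21 bytes, not the binary zebibyte):

```python
# Convert the report's 9.57 ZB estimate to raw bytes, assuming the
# SI zettabyte (10**21 bytes). Integer arithmetic keeps the result
# exact, avoiding floating-point rounding in 9.57 * 1e21.
ZETTABYTE = 10**21

total_bytes = 957 * ZETTABYTE // 100   # 9.57 ZB
print(total_bytes)                     # 9570000000000000000000
```

Written out, that is 957 followed by 19 zeros — the same 22-digit string people were searching for.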
As always, we had to make a lot of judgment calls in deciding what to measure and how. See the report, and the forthcoming technical appendix.
The report is available here.
Arguments over what information to measure, and how, are fundamentally irresolvable. There are too many good answers, and in any case, for most purposes measurement problems make it impossible to measure exactly what we want. The WSJ's "numbers guy" collects some views on this, related to the new USC/Science article on the world's computation.
My print column this week examines a recent paper in Science and other research efforts that attempt to quantify the world’s information. The studies generally compile dozens of pieces of data, such as hard-drive production and sales, and put all information into a single unit, such as bytes or words. “We can say things like ‘a 6 square-cm newspaper image is worth a 1000 words,’ ” the Science study’s authors, Martin Hilbert and Priscila López, wrote.
Our How Much Information? 2008 Consumer report continues to generate discussion. The comments that follow this blog entry in Language Log are quite interesting. I added my own comment – we’ll see how long it takes to go through their moderation process.
Matt Richtel, one of the leading current peddlers of the "technology is eating our brains" meme, is fond of this assertion:

"The average person today consumes almost three times as much information as what the typical person consumed in 1960, according to research at the University of California, San Diego."

That version is the lead paragraph of the online site for his appearance on Fresh Air, "Digital Overload: Your Brain On Gadgets", 8/24/2010. I was curious about what this sentence could mean, and more specifically, I wondered which UCSD researchers did the measurements, and what they measured.