Big data and AI are not “objective”

AI, machine learning, and similar technologies only appear to be objective. In reality, they reflect the worldview and prejudices of their developers.

Algorithms have been empowered to make decisions and take actions for the sake of efficiency and speed … the aura of objectivity and infallibility cultures tend to ascribe to them. … [This report illustrates] the shortcomings of algorithmic decisionmaking, identifies key themes around the problem of algorithmic errors and bias, and examines some approaches for combating these problems. This report highlights the added risks and complexities inherent in the use of algorithmic … decisionmaking in public policy. The report ends with a survey of approaches for combating these problems.

Source: An Intelligence in Our Image: The Risks of Bias and Errors in Artificial Intelligence | RAND

Google/Alphabet continues toward Total Person Awareness: tracking every vehicle + person.

Secretive Alphabet division aims to fix public transit in US by shifting control to Google (from The Guardian)

Documents reveal Sidewalk Labs is offering a system it calls Flow to Columbus, Ohio, to upgrade bus and parking services – and bring them under Google’s management.

The emails and documents show that Flow applies Google’s expertise in mapping, machine learning and big data to thorny urban problems such as public parking. Numerous studies have found that 30% of traffic in cities is due to drivers seeking parking.

Sidewalk said in documents that Flow would use camera-equipped vehicles … It would then combine data from drivers using Google Maps with live information from city parking meters to estimate which spaces were still free. Arriving drivers would be directed to empty spots.

Source: Secretive Alphabet division aims to fix public transit in US by shifting control to Google
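
In effect, the quoted description is a data-fusion and ranking problem: estimate which spaces are free, then route each arriving car to the nearest one. Here is a minimal Python sketch of that idea; the data structures, field names, and logic are invented for illustration and bear no relation to Google's actual Flow system.

```python
# Minimal sketch: fuse parking-meter status with GPS/camera observations
# to rank likely-free spaces and direct a driver to the nearest one.
# All structures here are illustrative, not Google's API.
from dataclasses import dataclass
from math import hypot

@dataclass
class Space:
    space_id: str
    x: float  # planar coordinates, for simplicity
    y: float
    meter_occupied: bool   # live feed from the city parking meter
    cars_seen_nearby: int  # recent GPS/camera observations at the space

def probably_free(space):
    """A space is a candidate if its meter is unpaid and no car lingers there."""
    return not space.meter_occupied and space.cars_seen_nearby == 0

def nearest_free_space(spaces, car_x, car_y):
    """Direct an arriving driver to the closest space estimated to be free."""
    free = [s for s in spaces if probably_free(s)]
    return min(free, key=lambda s: hypot(s.x - car_x, s.y - car_y), default=None)

spaces = [
    Space("A1", 0.0, 0.0, meter_occupied=True, cars_seen_nearby=1),
    Space("B2", 1.0, 2.0, meter_occupied=False, cars_seen_nearby=0),
]
print(nearest_free_space(spaces, 0.5, 0.5))  # -> space B2
```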

Notice that this gives Google/Alphabet a legitimate reason to track every car in the downtown area. Flow can be even more helpful if Google knows the destination of every car AND every traveler for the next hour.
The next logical step, a few years from now, will be to track the plans of every person in the city. For example, Mary Smith normally leaves her house in the suburbs at 8:15 AM to drive to her office in downtown Columbus. Today, however, she has to drop off her daughter Emily (born Dec 1, 2008, social security number 043-xx-xxxx) at school, so she will leave a little early. This perturbation in normal traffic can be used to help other drivers choose the most efficient route. Add together thousands of these perturbations, and we get real-time re-routing of buses and Uber cars, as in the sketch below.
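
To make the speculation concrete, here is a toy sketch of how thousands of such schedule deviations might be aggregated into a re-routing signal. Everything in it (the field names, the data, the threshold) is invented for illustration; no one outside Google knows what Flow would actually compute.

```python
# Toy model: aggregate individual departure-time deviations into predicted
# extra load per route, then flag routes that may warrant extra transit.
# All data and thresholds are hypothetical.
from collections import Counter

# (traveler, usual route, deviation in minutes) -- Mary leaves ~20 min early
deviations = [
    ("traveler_001", "route_33", -20),
    ("traveler_002", "route_33", -15),
    ("traveler_003", "route_7", 0),
]

def predicted_extra_load(deviations, window=(-30, -10)):
    """Count travelers shifting into an earlier departure window, per route."""
    load = Counter()
    for _traveler, route, delta in deviations:
        if window[0] <= delta <= window[1]:
            load[route] += 1
    return load

# Re-route or add buses where the early-shift count crosses a threshold.
for route, extra in predicted_extra_load(deviations).items():
    if extra >= 2:
        print(f"consider an earlier bus on {route} ({extra} early departures)")
```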
For now, this sounds like science fiction. Such a system certainly has the potential to improve transit efficiency and speed, and to “make everyone better off.” But it comes at a price. Yet many are already comfortable with Waze tracking their drives in detail.
Check back in 10 years and tell me how I did.

How did the Ukrainian govt. know who was demonstrating against it?

[edits Jan. 31] A poli sci friend recently blogged about the Ukrainian government’s “text that changed the world,” a mass text message sent to thousands of anti-government demonstrators in Kiev. She asked: 1) How did the government know who was in the main square of Kiev that day? (Cell phone location.) And 2) How did it send the same message to everyone at once? (Mass SMS.)

Demonstrators in Kiev. From CNN 
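
The first question comes down to carrier records: the network logs which tower each phone registers with, so anyone with access to those logs can list the phones near a given place at a given time. A simplified Python illustration, with invented tower names, numbers, and timestamps:

```python
# Select subscribers whose phones registered with towers near the square
# during the protest window. All identifiers and data here are invented.
from datetime import datetime

SQUARE_TOWERS = {"tower_maidan_1", "tower_maidan_2"}
WINDOW = (datetime(2014, 1, 21, 12, 0), datetime(2014, 1, 21, 18, 0))

# (phone number, tower id, registration timestamp) events from carrier logs
events = [
    ("+380-xx-0001", "tower_maidan_1", datetime(2014, 1, 21, 13, 5)),
    ("+380-xx-0002", "tower_suburb_9", datetime(2014, 1, 21, 13, 7)),
]

present = {
    phone
    for phone, tower, ts in events
    if tower in SQUARE_TOWERS and WINDOW[0] <= ts <= WINDOW[1]
}
print(present)  # every number whose phone registered near the square
```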

The second question is easy: phone companies routinely provide mass-SMS services to large customers. For example, I’m on the “emergency alert” texting service of UC San Diego’s campus police. It was designed for earthquakes, but it has been used for other kinds of messages “between earthquakes.” The same message goes out to every phone number on their list.
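
For a sense of scale, a bulk blast scripted against a commercial gateway (here Twilio's Python client, purely as an example; a carrier acting for a government would use its own internal bulk interface) is only a few lines:

```python
# Send one message body to every number on a list via a commercial
# SMS gateway. Credentials and numbers are placeholders.
from twilio.rest import Client

client = Client("ACCOUNT_SID", "AUTH_TOKEN")  # placeholder credentials

recipients = ["+15550001111", "+15550002222"]  # the subscriber list
for number in recipients:
    client.messages.create(
        body="Emergency alert: this is a test of the campus alert system.",
        from_="+15559990000",  # the sender's registered number
        to=number,
    )
```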

What can you do to avoid tracking? Short version: leave your phone at home. Second best is to shut it off or switch to airplane mode, but those work only if the government is not making an effort to target you specifically.

“Anonymized” data frequently isn’t

An in-the-closet lesbian mother is suing Netflix for privacy invasion, alleging the movie rental company made it possible for her to be outed when it disclosed insufficiently anonymous information about nearly half-a-million customers as part of its $1 million contest to improve its recommendation system.

The suit known as Doe v. Netflix (.pdf) was filed in federal court in California on Thursday, alleging that Netflix violated fair-trade laws and a federal privacy law protecting video rental records, when it launched its popular contest in September 2006.

via Netflix Spilled Your Brokeback Mountain Secret, Lawsuit Claims | Threat Level | Wired.com.

(As the article goes on to make clear, this problem has been known for a while. Netflix ignored it at its peril.)
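
Researchers Arvind Narayanan and Vitaly Shmatikov showed just how weak the anonymization was: by cross-referencing the released (movie, date, rating) tuples with public IMDb reviews, they could re-identify subscribers. A toy Python version of that linkage attack, with invented records:

```python
# Toy linkage attack: match "anonymized" (movie, date, rating) tuples
# against a public profile to re-identify a subscriber. Records invented.
anonymized = {
    "user_7421": [("Brokeback Mountain", "2005-12-20", 5),
                  ("Capote", "2006-01-03", 4)],
    "user_1099": [("Cars", "2006-06-12", 3)],
}

public_profile = ("jane_doe", [("Brokeback Mountain", "2005-12-20", 5),
                               ("Capote", "2006-01-03", 4)])

def reidentify(anonymized, profile, min_matches=2):
    """Return anonymized IDs whose ratings overlap the public profile."""
    _name, ratings = profile
    target = set(ratings)
    return [uid for uid, recs in anonymized.items()
            if len(target & set(recs)) >= min_matches]

print(reidentify(anonymized, public_profile))  # -> ['user_7421']
```

A handful of distinctive ratings is enough; the real attack even tolerated noisy dates and still succeeded.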