
Weapons of Math Destruction

There has been seemingly endless news about computational systems that manipulate massive amounts of data (Big Data). Almost every user and discipline is affected by technology companies that process Big Data at massive scale.

The faculty book group met in early February to discuss Cathy O’Neil’s book Weapons of Math Destruction, a groundbreaking book in the sense that it fully brings to the fore, using many examples, how our lives are substantially affected by networked data-gathering and analytical machines -- programmed and run by humans, of course. A quote from the book:

“Ill-conceived mathematical models now micromanage the economy, from advertising to prisons. [They] are opaque, unquestioned, and unaccountable, and they operate at a scale to sort, target, or ‘optimize’ millions of people. By confusing their findings with on-the-ground reality, most of them create pernicious WMD feedback loops.”
It is a worthy read for all. Below are related resources with brief descriptions.

Related Resources

What Happened to Big Data (the one to read if you are pressed for time)

This is a great article from Slate that discusses, in clear and readable language, the hype around big data, companies’ need for MOAR (more + roar) data, and the danger of proxies, especially when they are magnified.

"But perhaps the bigger problem is that the data you have are usually only a proxy for what you really want to know. Big data doesn’t solve that problem—it magnifies it.”


Machine Bias, by ProPublica, dives into the detail discussed in the book on how inherent bias in machine algorithms can affect risk scores and, as a consequence, the lives of formerly incarcerated people.

In an eye-opening talk, techno-sociologist Zeynep Tufekci details how the same algorithms companies like Facebook, Google and Amazon use to get you to click on ads are also used to organize your access to political and social information. And the machines aren't even the real threat. What we need to understand is how the powerful might use AI to control us -- and what we can do in response.
The program’s underlying assumption, common in the world of “big data,” is that data is good and more data is better. To that end, genuine efforts were made to gather as much potentially relevant information as possible. As such programs go, this was the best-case scenario.

