
Big Data’s Big Image Problem

This week the White House completes its big data policy review. Improving public perception of “big data” requires more than another report: Organizations need to improve transparency, be more accountable and better educate the public.



In Captain America’s latest big screen adventure, his arch-enemy isn’t a diabolical super villain or space alien. It’s big data. “The 21st century is a digital book,” the captain is told. “Your bank records, medical histories, voting patterns, emails, phone calls, your damn SAT scores! [Our] algorithm evaluates people’s past to predict their future.”

Since rocketing into the public imagination just a few years ago, big data has been portrayed as being able to do anything. Some claim it holds near magical potential, while others call it the biggest civil rights threat of our generation. Computer scientists, meanwhile, see “big data” as nothing more than a marketing buzzword that gets in the way of legitimate data analysis. There is little question, however, that big data has developed an image problem.

This image has only worsened as one headline after another has detailed how the NSA and other intelligence agencies use data. As part of an effort to respond to these stories and find a path forward, the White House announced in January that it would conduct a comprehensive review of how big data affects privacy, with a report due at the end of April. While the report has been described as a “scoping exercise,” it will likely set the tone for future conversations about how to use big data and weigh privacy concerns.

Making society – and Captain America – more comfortable with big data is a multistep process. The big data bogeyman will only be exorcised with more transparency, more accountability and, ultimately, a lot more public education.

More transparency is an important first step to restoring public trust. As the NSA revelations have confirmed, the inner lives of individuals have become more transparent even as the workings of government have become more opaque. The same concern extends to businesses, which have become skilled at profiling consumers in order to market to them. This dynamic must be flipped if individuals are to have meaningful choices about their privacy. It is impossible to determine whether using big data to improve national security, public health, safety and education is worthwhile if no one knows what is being done.

As Justice Louis Brandeis once said, sunlight is the best of disinfectants. Transparency also doubles as a vital accountability tool. No ordinary person reads privacy policies, but the Federal Trade Commission does. Although the commission can police against practices it deems unfair, its authority to protect privacy rests on challenging false claims made in public privacy policies. Privacy advocates also comb through long, detailed government reports to figure out how data is being used.

At the same time, more internal accountability is needed to police big data. While the very notion of a “privacy officer” in a company or government agency was novel just a dozen years ago, today privacy is something that every responsible organization considers. But big data is not just about privacy: It raises broader ethical and social concerns that may require having new voices at the table. Leading thinkers have suggested the creation of consumer review boards or “algorithmists” that could help organizations weigh the benefits and risks of data use.

In one infamous example, a reporter discovered that big-box retailer Target was able to “predict” that one of its teenage customers was pregnant – before her family even knew about it. What makes this example so challenging is that it may not be a privacy issue per se, and using predictive analytics to send consumers coupons may not be harmful in itself. Yet it poses a real ethical conundrum about how to deploy big data predictions, and it raises the question of what Target’s approval process for this project looked like.

While transparency and accountability may improve public trust in big data, the public still needs to figure out what big data is. There is a huge gap between what individuals think they know about how their data is used and the reality. This is compounded by the fact that the vast majority of people believe they are personally responsible for protecting their privacy, yet do very little to actively protect it. A little misinformation can not only create a public relations fiasco; it can also sour the public on beneficial uses of big data.

Hopefully, the White House’s report will serve as a call to action for government and businesses to look at how they use big data and to open their digital books to a public that badly needs to be educated.
