A senior Microsoft researcher issued a stern warning about the negative consequences of the current mania for data harvesting, saying that a kind of “fundamentalism” was emerging around the utility of what’s been termed “Big Data” that could easily lead to an Orwellian future of ubiquitous surveillance and diminished freedom.
Speaking to an audience of around 300 technology industry luminaries at the Massachusetts Institute of Technology’s annual Emerging Technology (EMTECH) conference, Kate Crawford, a Principal Researcher at Microsoft Research in Boston, said that the technology industry’s fetish for “Big Data” had blinded it to the limits of analytics and the privacy implications of wholesale data harvesting.
EMTECH is a high-gloss event that throws entrepreneurs, venture capitalists, and academics together to talk ‘big ideas’ on TED-inspired sets. Crawford’s speech, coming on the heels of a talk by Mitchell Higashi, Chief Economist of GE Healthcare, on transforming healthcare with big data and the industrial internet, set a more skeptical tone for the event.
Starting with a thought experiment that imagined a future society in which data analytics and ubiquitous surveillance tools were used to manage individuals’ every movement and interaction, Crawford went on to ask whether the industry wasn’t slipping into what she called a kind of “fundamentalism” about so-called “Big Data”: seeing it as purely useful and “objective,” while missing the complex factors that influence what data is collected and how it is “massaged” to produce a certain result.
“We’re treating data like it’s a natural resource that we can pull out of databases like you pull oil out of the ground,” she said. Not so. Data is “a function of human creativity and thought,” she said, warning that care was required when handling it. Crawford said that the “anonymity” of metadata used in Big Data analytics is an illusion. To the contrary: Crawford pointed to a 2013 study of 60,000 Facebook users, published in the Proceedings of the National Academy of Sciences, which suggested that an individual’s pattern of “Likes” on Facebook could be analyzed to reveal a wide range of personal traits, from their gender to their political leanings and sexual orientation.
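The Likes-to-traits analysis Crawford cites works, at bottom, like any supervised classifier: fit a model on users whose traits are known, then apply it to everyone else. A minimal sketch of that idea, using an invented toy dataset and plain logistic regression standing in for the study’s actual methods (the PNAS paper combined dimensionality reduction with regression models on a much larger users-by-Likes matrix):

```python
# Toy illustration (not the study's real data or model): predict a
# private trait from binary "Like" vectors via logistic regression
# trained with simple stochastic gradient descent.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Fit logistic regression weights w and bias b by SGD."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability that a user with Like-vector x has the trait."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Columns: three hypothetical Facebook pages; rows: six users.
# y marks a private trait we pretend correlates with the Likes.
X = [[1, 0, 1], [1, 1, 1], [0, 1, 0], [0, 0, 1], [1, 0, 0], [0, 1, 1]]
y = [1, 1, 0, 0, 1, 0]

w, b = train(X, y)
# A new user whose Likes resemble the trait-positive group scores
# a probability near 1; one resembling the negatives scores near 0.
print(predict(w, b, [1, 0, 1]))
print(predict(w, b, [0, 1, 0]))
```

The point of the sketch is not the model but the pipeline: none of the input features is the trait itself, yet the trait is recoverable from correlated behavior, which is exactly why “anonymized” Like data offers so little anonymity.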
That kind of data analysis is already being used by e-commerce firms and online advertisers to create profiles from supposedly “anonymous” data, Crawford said. And, while the possible beneficial applications of such “Big Data” analysis shouldn’t be understated, neither should the possibility that such data will merely reinforce existing social trends and economic divisions.
Specifically, Crawford warned those assembled about so-called “data redlining,” a modern twist on the 20th-century scourge that concentrated people of color and the economically disadvantaged in neighborhoods that were then denied equal access to public and private services.
“If my buying history suggests that I may not have the best credit, maybe I won’t see the same interest rate offer as someone else. If my Facebook posts suggest that I like to drink wine, maybe I won’t be offered a job,” Crawford warned. But, like an invisible hand, those analytics wouldn’t be accessible to those who are affected by them. “The data is effectively discriminating – but what are the discriminations?” she asked.