Purdue faculty reflect on the state of data privacy both in and outside of academia

Ten years ago, give or take, those with access to technology began uploading photos of themselves to social media. Depending on the generation, we were looking to connect with our current high school friends or with high school friends from decades past. We wanted to see what each of us was up to, sometimes obsessively. But ultimately, ten years ago we were looking for some kind of connection – personal, digital or both.

Fast forward to today: Facebook has monetized its free social media platform by selling advertising spots, specifically targeted ads. These ads are shown to people who may have a particular interest given their age, lifestyle, political affiliation or one of the hundreds of other traits that Facebook tracks about its users. Yet 74 percent of adult U.S. Facebook users did not know that Facebook tracked their interests and traits, according to a new Pew Research Center study, and more than half said they are not comfortable with Facebook collecting this information.

So how did the divide between social media users and companies deepen? More simply, where did the confusion begin about which of our data was available for consumption and which wasn’t? For Data Privacy Day (Monday, Jan. 28), three Purdue faculty discuss privacy issues within and outside academia, and how our data are uploaded and consumed today.

Lindsay Weinberg, postdoctoral teaching fellow in Innovative Studies with the Honors College and Purdue Polytechnic Institute

In May, Weinberg will lead a class of students to Toronto for a study abroad program, “Smart Cities: Rethinking Urban Space in the Digital Age.” Alphabet Inc., Google’s parent company, brokered a deal to construct a smart city in Toronto, with a plan that promises to use technology to create a sustainable, data-managed urban neighborhood.

How do “smart cities” and internet-connected devices, such as thermostats and lightbulbs, affect our privacy?

“Certainly, it’s important to think about how smart devices extract profit from users’ personal data, and whether there is accountability for how these devices collect, store, and sell it, but these luxury technologies are only the tip of the iceberg,” Weinberg says.

“We’re trusting technology to inform institutions like the police, and big data is becoming enmeshed in the governing process,” she says. “Second, are smart cities being designed for all? How many of these innovations account for the needs of women, for example? We build high-tech bathrooms without changing tables, or redesign ‘smart’ streets with poor lighting.”

There is an economic element as well. In many ways, the struggle for privacy is an economic struggle.

“Whether it’s privacy to have more space from others or being able to have control over your domestic space, or the use of your data, in many cases, if you don’t have the means, or if you receive support from the state, you effectively don’t have a right to privacy,” Weinberg says.

Her study abroad class will end with a discussion of smart versus wise cities.

“What makes a city wise, I think, is its ability to account for history and power, so if smart cities don’t do that then maybe wise cities would,” Weinberg says.

Natalie Lambert, assistant professor in the Brian Lamb School of Communication

Lambert’s research focuses on organizations, especially in health care. She’s also interested in communication within online spaces such as message boards, and uses computational methods like network analysis to analyze the huge amounts of data she collects.

Her most recent work focuses on people with breast cancer who seek support and advice on online message boards. Although the data is publicly available, it is still her duty as a researcher to uphold the forum participants’ right to privacy. The information may be readily available online, but to what degree do participants think of their information as “public”?

“It is vital for researchers to have access to online data since so much of social life is now happening online,” Lambert says. “But as a researcher, it’s important to think about how these online spaces feel to the people who are there participating. Small discussion boards can feel very private, so while I believe it’s important we study these spaces, if I do collect data, I have a responsibility to anonymize the data so that it can’t be traced back to one user.”

Under Purdue policies, if a social media platform or website requires someone to create a login to access the information, it is not considered public. But anonymizing identities in an online world overflowing with personal data is a difficult problem.

“If someone posted something on Reddit and you click on their user name, they might not have a whole lot of information connected to that user name,” Lambert says. “But their user name might be used in other places online, which could lead you to figure out who that person is offline.”
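Anonymizing forum data of the kind Lambert describes often starts with pseudonymization. The sketch below, in Python, is purely illustrative and not drawn from Lambert’s actual workflow: it replaces usernames with salted hashes so the working dataset never stores raw handles. The salt handling, field names and sample posts are all assumptions for demonstration.

```python
import hashlib
import secrets

# Illustrative only: a random salt, generated once per project.
# In practice the salt is kept secret or discarded after processing,
# so pseudonyms cannot be recomputed from guessed usernames.
SALT = secrets.token_bytes(16)

def pseudonymize(username: str) -> str:
    """Return a stable, hard-to-reverse pseudonym for a username."""
    digest = hashlib.sha256(SALT + username.encode("utf-8")).hexdigest()
    return digest[:12]  # truncated for readability in the dataset

# Hypothetical posts standing in for collected message-board data.
posts = [
    {"user": "quilter_42", "text": "Has anyone tried this treatment?"},
    {"user": "lakeside_mom", "text": "My oncologist suggested it."},
]

# Keep the text, drop the identifying handle.
anonymized = [{"user": pseudonymize(p["user"]), "text": p["text"]} for p in posts]
print(anonymized)
```

Even this is only pseudonymization: as the Reddit example above suggests, a verbatim quote can be searched online to re-identify its author, which is one reason researchers sometimes paraphrase quoted text as well.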

She suggests that, in addition to following Institutional Review Board requirements, researchers new to online research should think through privacy scenarios – what are members’ expectations of privacy on specific websites? Within online social groups, what is formally and informally considered “public”?

Kendall Roark, assistant professor with Purdue University Libraries 

Roark, whose research interests are critical data studies, privacy and research ethics, is part of a team of academics forming a multidisciplinary Critical Data Studies Collaborative and a cohort of the Data Mine Learning Community.

Critical data studies examines how data is collected and used, especially when it is human data being used to make assumptions about human behavior – an algorithm’s bias, for instance. The collaborative’s lecture series will bring Virginia Eubanks, associate professor of political science at the University at Albany, SUNY, to address this question in her lecture “What If AI Tools Punish the Poor?” at 4:30 p.m. Feb. 13 in Stewart Center’s Fowler Hall.

Roark says one of the goals of critical data studies is to bring the ethical and societal implications of big data, algorithms and automated decision-making to the forefront. This is why, she says, it’s important to invite speakers such as Eubanks, who can translate abstract socio-technical issues into real-world examples.

“Many users of social media are surprised that so much of their personal information is collected and shared with third parties. They find this sort of ubiquitous data collection and use ‘creepy,’” Roark says. “Eubanks asks us to see that privacy and surveillance have differential impacts. For instance, those who are experiencing poverty and find themselves in need of accessing public services have been experiencing this sort of invasive data collection and tracking as a condition of receiving those services for quite some time.”

Writer: Kirsten Gibson, technology writer, Information Technology at Purdue, 765-494-8190, gibson33@purdue.edu 

Last updated: January 24, 2019