
Veronica Barassi: Data Rights are Human Rights

When anthropologist and TEDxMileHigh speaker Veronica Barassi was asked to sign a hospital form before giving birth, she hesitated. Buried deep in the form was a clause that allowed the umbilical cells from her soon-to-be-born daughter to be used in any future research. Any future research. The nurse reassured her that it was “just a form, you just sign it.” This experience sent her on a quest to find out who is collecting data on her children and how they are using it. Learn who is tracking your online data and why data rights are human rights. 

“That day [at the hospital] I had the perfect example of how natural and how accepted it has become to agree to terms and conditions without giving it a second thought,” Barassi says. “Every day, every week, we agree to terms and conditions. When we do this, we give companies the lawful right to do whatever they want with our data and the data of our children.”

Barassi advocates that “data rights are our human rights,” and it’s time they were treated that way.

You’re Being Tracked

Think of every time you have uploaded information about yourself into an app or website. This includes all of the selfies you’ve added to social media and the tweets you’ve sent out for everyone to read. The times you’ve given your phone number to reserve a table for brunch. Or, when you’ve shared your email for 10 percent off your first purchase on a retail site. If you’ve agreed to the terms and conditions of these platforms, you have given them the right to track your data and create profiles based on data traces.

Implications for Children

The problem is, social media is just the tip of the iceberg, according to Barassi. “For the very first time in history, we are tracking the individual data of children before they are born,” she says.

When a couple wants to have a child, they search for ways to get pregnant. When they receive the good news, they post photos of their ultrasounds on social media and consult online medical platforms for all kinds of questions. When their baby is born, parents track every contraction, nap, and feed through health apps. 

The child is then subject to data tracking for the rest of their life, through educational portals and technologies at school and online health portals at doctors’ offices.

The Role of Data Companies and Brokers

All of this tracking is used to create profiles of individuals that are then sold to data companies or data brokers for profit. “In 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties,” Barassi says. “These third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector.”

This process can explain how you constantly see advertisements for iPhone cases after searching for a new one on Amazon. Or why you’re bombarded with promotions for a relaxing beach vacation after you googled photos of the beach. Or, after a conversation with a friend about wine from Argentina in front of Alexa, you’re targeted online the next day for… wine from Argentina. 

More companies have access to you and your data traces than you realize, and this infringement on individual data rights can have major implications.

The Real Problem: Bias

While some of the third and fourth parties that receive individual data profiles are simply trying to profit from your online behavior, others use these traces to make far bigger decisions.

Banks can receive an individual’s profile and base their willingness to grant a loan on that person’s online financial traces. Insurance companies use profiles to set premiums, and college admissions boards can decide whether to accept you to your dream university without ever meeting you in person.

The danger in allowing online profiles and algorithms to make decisions for us is that they don’t account for human experience, and they are always biased. Because algorithms are created by humans, they are products of real human biases. “Data traces are not a mirror of who we are,” says Barassi. She explains that algorithms are sets of rules designed to reach a specific outcome. However, those rules were designed by a human shaped by particular biases, contexts, and cultural values.

So, when a machine that is used for predictive policing is trained on biased data, you can see the problem. We cannot allow biased machines and algorithms to make decisions for us based on data that will always, in one way or another, be biased. 

We Need Political Solutions

“What we need now are actually political solutions. We need governments to realize that our data rights are our human rights,” says Barassi. We cannot protect ourselves or our data rights on our own. We need legislation in place that prevents companies from sharing personal, individual data with third and fourth parties looking for profit. If you’d like to learn more or get involved, check out the work of Hu-manity.co.

When we allow machines to make decisions for us based on biased rules and data traces, we lose our individuality. A company looking to hire a new employee should make their decision after a face-to-face meeting, not based on a profile built from everything a person posted on social media back in high school.

Your data rights are your rights. Stop checking the “I have read and agree to these terms and conditions” box without reading. Stop clicking “I accept.”
