Social media’s major issue is data mining

Illustration by Nicole Oliver


IT IS TIME TO demystify social media use. The problem is not how much we use it, but that the private companies that own these platforms profit from our labor and our data.

We are using social media more than ever, and while there are ramifications to consider on a personal level, as with any technology, many interpretations of its prevalence in our lives, especially as college students, are simply reactionary. This is the same line of thinking that emerged when radio, television and even electricity were invented.

The effects of social media on our attention spans, thinking and other concerns are still being investigated, with few definitive conclusions. One thing that is certain, however, is that social media sites use our data. Although I use social media myself, I believe this relationship is inherently exploitative and perpetuates harmful surveillance of marginalized groups.

For example, TikTok has been heavily criticized for censoring users it deems ugly, overweight, disabled or poor, according to an internal memo for content moderators that was released in March 2020. The memo specifically advised moderators to limit content from people with “eye disorders, crooked mouth disease, and other disabilities.” The company’s mass surveillance of its content has serious ramifications for such users, especially when those seeking to monetize popular content are unable to do so because of platform guidelines.

Policies like these are clearly problematic and have tangible effects on people’s lives and livelihoods, but surface-level surveillance is just the beginning. Some insurance companies are legally allowed to use social media posts, such as those on Instagram or Twitter, to determine whether to offer someone life insurance and at what cost. An article in The Wall Street Journal advises those buying life insurance not to post photos of themselves smoking or engaging in high-risk athletics. Instead, it recommends using fitness-tracking devices, purchasing food from healthy meal prep services and visiting the gym with a location-tracking device.

The surveillance of our online presence does not just put us at risk of losing out on a job or limit the visibility of marginalized people. It can also lead to losing health insurance, being deemed too high a risk by creditors or being located and detained by the government. It is not innocuous that this information is collected by private companies and, according to many sites’ terms of service, legally owned by them.

Our information is constantly being collected, and it is valuable, as evidenced by Facebook’s Cambridge Analytica scandal, in which data from up to 87 million profiles was obtained by the British consulting firm and used to target voters in the 2016 U.S. presidential election. Not to mention, there may be more legally and morally dubious data practices that have yet to be exposed.

Much of the culture and many of the affordances of Facebook’s platform affirm that the site’s main purpose is to mine users’ data in order to profit from them. The “About” section encourages users to include very sensitive information, including their exact age, residential history and an entire resume of work experience and education. Facebook also implemented a real-name policy under which accounts not registered under someone’s legal name were at risk of being taken down. While the policy was later revised after much criticism, it was clearly implemented to more easily connect users’ identities with their data, which makes that data far more valuable. In my eyes, Facebook and many other social media services are data-mining sites that the public happens to use for other reasons.

Even much less sensitive data is used by companies to turn a quick buck. In a study by Nicholas Carah and Daniel Angus on algorithmic brand culture, the two digital media professors assert, “In the brand culture of social media, the creative narration of cultural experience doubles as data that trains platform algorithms.” They argue that branded elements are algorithmically recognizable and are used to target advertisements to users.

This means that keywords, hashtags, images and locations with recognizable elements are all data that can be automatically extracted. By using the hashtag for Coachella or posting a picture at the Absolut Vodka booth at Pride, you tell Instagram what kind of food you like, where you hang out, who your friends are, what kind of shirt you would buy and more, just by association. Even our seemingly meaningless, or less vulnerable, data can have larger ramifications when paired with algorithms and machine learning technologies.
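To make extraction by association concrete, here is a minimal, hypothetical sketch in Python. The interest labels and the mapping from hashtags and locations to those labels are invented for illustration, and real platforms rely on far more elaborate machine learning models, but the basic logic of turning posted signals into targetable categories looks something like this:

from collections import Counter

# Invented lookup table associating recognizable signals with interest labels.
# These categories and mappings are illustrative, not any platform's actual data.
SIGNAL_TO_INTERESTS = {
    "#coachella": ["live music", "festival fashion", "travel"],
    "absolut vodka booth": ["spirits", "nightlife"],
    "pride": ["LGBTQ+ events", "community"],
}

def infer_interests(posts):
    """Count interest labels suggested by a user's hashtags and tagged locations."""
    interests = Counter()
    for post in posts:
        signals = [tag.lower() for tag in post.get("hashtags", [])]
        if post.get("location"):
            signals.append(post["location"].lower())
        for signal in signals:
            for label in SIGNAL_TO_INTERESTS.get(signal, []):
                interests[label] += 1
    return interests

posts = [
    {"hashtags": ["#Coachella"], "location": None},
    {"hashtags": [], "location": "Absolut Vodka booth"},
]
print(infer_interests(posts).most_common())

Every post adds a few more counts to a profile like this one, and it is that accumulating profile, not any single photo, that advertisers pay to target.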

Instead of having the same conversations about screen time and shortening attention spans, we need to focus on how our data is being used. It is well overdue to start considering the ramifications of private companies not only having access to so much of our data, but owning it. When our lived experiences and our bodies are perceived through the internet, we all become data points, which can have severe consequences when profit is the motive.

This article presents opinions held by the author, not those of The Pioneer Log, its editorial board or those interviewed for background information.
