Epistemic Analysis of Algorithmic Information Delivery
by Jake Lynn

Introduction

As the internet has spread across the country and the world, one technology has come to dominate the information market. Social media has become the primary source of information for many people and now plays some role in how nearly everyone in the modern world gathers information. It is not the only technology that has reshaped the kind of information to which we have access: the analysis of previously incomprehensible amounts of data yields insights that may have been beyond the reach of human theorizing [1]. Today’s information landscape is defined by these two forces working together, informing potential knowers while making copious amounts of money for the people who own the platforms.

To evaluate the epistemic effects of algorithm-driven social media, I will use a variation of Goldman’s framework concerned with the power, speed, fecundity, and reliability of information-sharing practices. Efficiency is left out of this evaluation because it is largely captured by the other indicators, particularly speed and power. In my opinion, social media is the most extreme vehicle for the spread of knowledge by these measures. I will first consider how social media itself fares on each indicator, then consider the effect algorithmic information delivery has had on each.


Power

The power of social media is defined in an epistemic context as its ability to provide its users with true beliefs.

Social media is an incredibly powerful tool for sharing information. With the touch of a button, users of a given site can send whatever is on their mind to millions of other potential knowers, or even billions in the case of Facebook. Part of this power comes from social media’s unrestricted nature: with few exceptions, anything can be posted and shared with the world. The power of social media is not all rosy, however. How we consider and vet information found on these platforms also shapes that power, and often little consideration is given to a poster’s motivation. When a post contains true information, this works to everyone’s benefit: people quickly gain true beliefs. When the information is false, it is very dangerous: people quickly gain false beliefs and accept them as fact. For malicious actors, it is very easy to influence the behavior and decisions of users [4], which can be very harmful and contributes to the spread of misinformation.

Algorithms that deliver content to users increase the power of the platform, for better or worse. People are shown what the algorithm determines they want to see, which may be good or bad information. In general, people are more likely to believe and accept information that supports their current worldview, so confirmation bias plays a large role in the power these algorithms provide: they are optimized to show users things they already agree with. The algorithms have, in effect, learned that the way to keep users engaged and make the parent company more advertising revenue is to appeal to a user’s preexisting beliefs, not to challenge them with opposing information [5].
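
To make this incentive concrete, the following is a minimal, purely illustrative sketch in Python, not a description of any platform’s actual system. The post structure, the topic weights, and the scoring rule are all invented for the example; the point is only that when the objective rewards predicted engagement, measured here as agreement with what a user has already engaged with, an opposing view never wins the ranking.

    # Toy illustration (not any platform's real system): rank a feed by a
    # predicted-engagement proxy, defined as how closely a post's topics
    # match what the user has engaged with before. All names and numbers
    # are hypothetical.

    def dot(a, b):
        return sum(a[k] * b.get(k, 0.0) for k in a)

    def rank_feed(user_profile, candidate_posts):
        """Return posts ordered by predicted engagement, highest first."""
        # Agreement with existing interests is rewarded; nothing in the
        # score rewards showing the user an opposing view.
        scored = [(dot(user_profile, post["topics"]), post) for post in candidate_posts]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [post for _, post in scored]

    # Example: a user who mostly engages with topic "a".
    user_profile = {"a": 0.9, "b": 0.1}
    posts = [
        {"id": 1, "topics": {"a": 1.0}},  # agrees with existing interests
        {"id": 2, "topics": {"b": 1.0}},  # challenges them
    ]
    print([p["id"] for p in rank_feed(user_profile, posts)])  # -> [1, 2]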


Speed

Speed in this context describes how fast users of a platform can gain new information. It is closely related to efficiency and, in some sense, power. As noted in the discussion of power, users quickly gain beliefs, true or false, partly because they give so little consideration to the value of the thoughts they consume.

Social media produces the fastest spread of information we have seen in history. Beliefs, true or false, are transmitted across the world instantly. Many platforms also let users repost others’ ideas; take, for example, the Twitter retweet. This greatly speeds up the spread of information: users can pass something along to their followers with the touch of a button rather than typing out their own post (which is already very quick).

Algorithms play a smaller role in speed than in any of the other qualifiers discussed in this paper. Information is already shared about as fast as is conceivably possible, so the method by which it is selected has little effect on how fast it travels. However, algorithms do tend to show viral items to many users in a short period of time, which speeds the spread of some information, though not all. More interesting, in my opinion, is the speed at which viral content is left behind: as soon as the algorithms find the next viral post, the previous ones are replaced and often forgotten.


Fecundity

Fecundity is the area in which social media has pushed furthest past earlier practices. Fecundity concerns the number of people who can be exposed to a belief from a given epistemic source.

Social media provides fecundity on a scale never seen before. At the time of writing, Facebook has 2.7 billion registered users. Because of social media, people who would never have met before the advent of the internet can connect and discuss anything they like. Information can be shared with an audience of a size that, just 20 years ago, only governments and corporations could reach. This fecundity greatly increases the power of the platforms, leveling the playing field and allowing any user’s ideas to be seen by all others.

We must also consider how many beliefs a single person can gain. Social media tends to keep users engaged for long stretches, in part because of its addictive nature [2]. Users spend enormous amounts of time on these platforms and are exposed to a wealth of information in small chunks, each consumed for only a few moments. Whether or not those beliefs are true, the number of beliefs gained by a single person can be enormous.

Fecundity is greatly diminished by content-delivery algorithms. These algorithms tend to group people into smaller communities centered around certain interests or views, which is not bad on the surface. However, they also tend to flood a user’s timeline with things the user already knows and agrees with, reducing the time spent consuming content produced by a diverse set of sources. Algorithms thus subject the user base of these platforms to harsh community division: people are not challenged by others’ ideas, and their own ideas reach only those with similar views, doing little to change the viewpoint of the public as a whole.
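
As a rough illustration of how such narrowing might work, consider the following hypothetical sketch. The profiles, the similarity threshold, and the select_audience function are all invented; the point is that when delivery is gated on similarity to the author, a post’s reach rarely extends beyond the author’s own community.

    # Hypothetical sketch of audience selection: the post is delivered only
    # to users whose interest profiles resemble the author's, so it rarely
    # crosses community boundaries.

    def dot(a, b):
        return sum(a[k] * b.get(k, 0.0) for k in a)

    def select_audience(author_profile, all_users, threshold=0.5):
        """Deliver the post only to users sufficiently similar to the author."""
        return [u["name"] for u in all_users
                if dot(author_profile, u["profile"]) >= threshold]

    author = {"politics_left": 1.0}
    users = [
        {"name": "ana",  "profile": {"politics_left": 0.9}},
        {"name": "bo",   "profile": {"politics_left": 0.8, "sports": 0.2}},
        {"name": "cass", "profile": {"politics_right": 1.0}},  # never sees the post
    ]
    print(select_audience(author, users))  # -> ['ana', 'bo']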


Reliability

The greatest concern, in my opinion, is the reliability of information shared on social media. Reliability in an epistemic context describes the likelihood that a piece of information taken from an epistemic source is true.

Social media has always had issues with reliability, as have other mostly unfiltered sources. Many platforms have implemented ways to verify the identity of an account and differentiate it from imposters [3]. However, these same checks on reliability can be used against the common user by malicious actors. By posing as a reputable source, epistemic adversaries can quickly spread false beliefs. People tend to believe the accounts that look the most reputable, but reputation is easy to fake on social media.

Reliability is even harder for algorithms to determine than it is for a human. Algorithms pay very little mind to the probability that a statement is true, only to the chance that you will click on the post. As it turns out, false beliefs spread six times faster on social media than true ones [5], in large part because of the role algorithms play in deciding what we see. Conspiracy theories run rampant on social media simply because they get so many clicks: people are drawn to sensationalized claims whether they are true or false. The algorithms have picked up on this and continue to deliver the content they predict will keep us coming back for more.
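
The following toy sketch, with invented field names and weights, illustrates the underlying problem: the score that decides what we see can be computed without ever consulting whether a post is accurate.

    # Illustrative only: a click-through score driven by how sensational and
    # novel a post looks. The post's accuracy is present in the data but is
    # never an input to the score.

    def predicted_click_probability(post):
        # Nothing here reads post["is_accurate"]; truth is simply not a factor.
        return 0.7 * post["sensationalism"] + 0.3 * post["novelty"]

    posts = [
        {"headline": "Shocking secret THEY don't want you to know",
         "sensationalism": 0.95, "novelty": 0.9, "is_accurate": False},
        {"headline": "Agency releases routine quarterly report",
         "sensationalism": 0.10, "novelty": 0.3, "is_accurate": True},
    ]

    ranked = sorted(posts, key=predicted_click_probability, reverse=True)
    print(ranked[0]["headline"])  # the false but sensational post ranks first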


Conclusion

At its inception, social media was seen as a tool to connect our world more closely than ever, and for a long time this held true; as a society, we have never been more connected than we are today. With the introduction of content-delivery algorithms optimized with only profit in mind, however, the modern wonder that social media once was has crumbled into a dangerous weapon of information warfare. It is becoming increasingly difficult to determine what is true, or whether truth even exists in our world. We are being intentionally misled by governments, corporations, and individuals with the worst intentions. Is there anything we can do to reverse this descent into information darkness?

One thing is certain: no single change will solve our problem. We could get rid of the algorithms and return to the early days of social media and its sort-by-recent content delivery. That might quell the problem, but it would not make it go away. We could get rid of social media entirely, but that would rob us of the greatest communication tool humankind has ever seen. The issues with misinformation on the internet run deeper than the technologies we use. It is human nature to trust each other. Is it also in our nature to abuse that trust? Social media has made it seem that way, but there is much more to this issue than meets the eye.



Works Cited

[1] Crawford, Kate, Mary L. Gray, and Kate Miltner. “Big Data | Critiquing Big Data: Politics, Ethics, Epistemology | Special Section Introduction.” International Journal of Communication, vol. 8, 2014, p. 10.

[2] Hong, Fu-Yuan, et al. “Analysis of the Psychological Traits, Facebook Usage, and Facebook Addiction Model of Taiwanese University Students.” Telematics and Informatics, vol. 31, no. 4, Nov. 2014, pp. 597–606, doi:10.1016/j.tele.2014.01.001.

[3] Lin, Hsin-Chen. “How Political Candidates’ Use of Facebook Relates to the Election Outcomes.” International Journal of Market Research, vol. 59, no. 1, 1 Jan. 2017, p. 77, doi:10.2501/ijmr-2017-004.

[4] Moreno, Megan A., et al. “The Facebook Influence Model: A Concept Mapping Approach.” Cyberpsychology, Behavior, and Social Networking, vol. 16, no. 7, 13 Apr. 2013, pp. 504–511, doi:10.1089/cyber.2013.0025.

[5] Orlowski, Jeff, director. The Social Dilemma. Netflix, 2020, www.netflix.com.