
IP 3: Algorithms & Everyday Life

Option II: Everyday Life


Informed Consent & Fair Use: Informed consent has traditionally meant that the individual in question understands the rules and implications of what they are agreeing to, and how those terms may affect them. This has changed in the age of algorithms. Because of their complexity, and because they “are said to be opaque, their content unreadable. A closely guarded and commodified secret, whose very value depends upon retaining their opacity. All we get to see are their results” (Neyland, 2019, p. 3), can an individual truly give informed consent when the processes and decisions applied to them are kept so thoroughly secret? Neyland observes that algorithms, despite being a set of rules and decision-making matrices, remain unpredictable to the average person: “I do not anticipate that it is about to jump off the page (or screen) and act. It is not mute, but it also does not appear to be the bearer of any great agency. The notion that this algorithm in itself wields power seems unlikely. Yet its ordered set of instructions does seem to set demands. We might then ask for whom or what are these demands set?” (Neyland, 2019, p. 7). The same applies to fair use: how can an individual who lacks an intricate understanding of the “everyday life of an algorithm” (Neyland, 2019, p. 9) begin to comprehend what counts as fair use to an algorithm? Conversely, how can an algorithm decide what constitutes fair use of the data it is given? It operates within the parameters set for it, and so everything it generates or applies must be treated as fair use, regardless of how ludicrous that may seem to an outside observer.


Discrimination & Net Neutrality: Net neutrality and discrimination are slowly becoming depressingly intertwined thanks to the ever-increasing popularity of algorithms. Backed by the supposedly efficient and inhumanly accurate strength of predictive data, discrimination is becoming an integrated part of how algorithms are implemented in the real world. A prime example is how “in the USA, we are told of algorithmic policing that sets demands for police officers to continually pursue the same neighborhoods for potential crime” (Neyland, 2019, p. 3). This is a cardboard cut-out of discrimination against a specific population, yet thanks to a lack of net neutrality and the predictive nature of algorithms, these practices can be hand-waved away as simply following the most efficient, algorithmically proven course of action. It does not seem to matter that the data being fed into the algorithm is biased. Without net neutrality, we continue down a feedback loop of self-fulfilling predictions that discriminates against more and more communities, because the system is trained on data produced by its own outputs: if the algorithm states that a region is more likely to be crime-ridden, more police are dispatched to that area, which leads to more arrests, which produces more data confirming that yes, this area is crime prone, so more officers are sent there, and the cycle continues to the point of collapse. In this situation, “the algorithm is presented as a new actor in these forms and relations of power” (Neyland, 2019, p. 7).
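To make that feedback loop concrete, here is a minimal toy sketch in Python. It is not Neyland's model and not drawn from any real policing system; the neighborhood labels, the starting numbers, and the rule of allocating patrols in proportion to past arrests are all illustrative assumptions.

```python
# Toy simulation of a predictive-policing feedback loop.
# All numbers and the allocation rule are illustrative assumptions,
# not drawn from Neyland (2019) or any real policing system.

import random

random.seed(42)

# Two neighborhoods with the SAME underlying crime rate.
true_crime_rate = {"A": 0.10, "B": 0.10}

# Historical records start with a small bias against neighborhood A.
recorded_arrests = {"A": 12, "B": 8}

TOTAL_PATROLS = 100  # patrols dispatched each round

for year in range(1, 6):
    total_recorded = sum(recorded_arrests.values())
    for hood in recorded_arrests:
        # "Predictive" step: allocate patrols in proportion to past arrests.
        share = recorded_arrests[hood] / total_recorded
        patrols = round(TOTAL_PATROLS * share)

        # More patrols mean more opportunities to record arrests,
        # even though the underlying crime rate is identical.
        new_arrests = sum(
            random.random() < true_crime_rate[hood] for _ in range(patrols)
        )
        recorded_arrests[hood] += new_arrests

    print(f"Year {year}: recorded arrests = {recorded_arrests}")
```

Because patrols are allocated from the system's own records, the initially small gap between the two neighborhoods tends to widen over time even though their true crime rates are identical, which is exactly the self-fulfilling dynamic described above.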


Personalization: In the context of algorithms, personalization is everything. Beyond how we style our hair and what posters we put up on the walls of our childhood bedrooms, we are extremely personalized animals. How we dress, speak, behave, interact with, and move through the world is an entirely personalized experience. Not to get too philosophical, but if we look at existence through the lens of Cartesian skepticism, personalization is all we can be certain of; the brain-in-a-vat thought experiment suggests there is a very real possibility that everything we see or experience is merely being projected to us, and that we are actually just a brain in a vat somewhere, hooked up to electrodes that simulate existence. To put it another way, as Descartes would say, “I think, therefore I am.” The only thing we can be certain of is our own personalized experience; without personalization there can be no unique sense of self, and we cannot help but see the world through a personalized lens. This changes, however, when we approach algorithms. Suddenly this sense of personalization is quantifiable and can be rendered down into data sets. “We are now data subjects or, worse, data derivatives. We are rendered powerless. We cannot know the algorithm or limit the algorithm or change its outputs” (Neyland, 2019, p. 3). Instead of defining our existence through our own personalizations, our existence is defined by what a system arbitrarily sets out as our quantifiable personalizations, and we are presented with a curated reality (or, at the very least, some frighteningly well-targeted advertisements). Personalization, in regard to algorithms, is no longer how we define ourselves and our experience; it is how we are in turn defined, and how our experiences come to be set out for us.


Friend: The term friend typically denotes an individual with whom one shares mutual affection. It is usually reserved for someone human, or at the very least something living (dogs are man’s best friend, after all). However, when considering the world through an algorithmic lens, sociologists “are constantly looking [...] for social links sturdy enough to tie all of us together” (Neyland, 2019, p. 10). Neyland goes on to stipulate that “here, the non-humans should not simply be listed as part of an inventory of capitalism. Instead, their role in social, moral, ethical and physical actions demands consideration” (Neyland, 2019, p. 11). On this basis, “the algorithm might thus require study not as a context within which everyday life happens, but as a participant” (Neyland, 2019, p. 11). If we begin to consider algorithms as active participants in our existence, especially as we move further and further online, perhaps the term friend can be extended to a non-living entity, and potentially all the way to an algorithm.

References:

Neyland, D. (2019). The everyday life of an algorithm (1st ed.). Springer International Publishing.

