Moronization of the Masses – Bowing to the Digital Deities
ARTIFICIAL EMOTIONAL INTELLIGENCE (p. 156)
….. The term artificial emotional intelligence refers to the following kinds of abilities:
• Predicting individual behavior by modeling emotional patterns. Artificial Intelligence can develop emotional profiles of individuals that enable a machine to evaluate someone’s psychological state.
• Substituting for human contact by providing emotional interaction. Artificial Intelligence is becoming adept at reading and responding to emotions like a human.
• Influencing moods and shifting people’s choices toward a product or idea with emotional value. With the ability to masquerade as human, AI can make people feel good about themselves, boost their self-esteem, and reinforce specific ideas. It can make them feel happy or sad or convince them to choose a certain movie, buy a specific product, fall in love with someone, start hating someone or something, and so forth.
In performing emotional functions, the machine is not expected to achieve perfection—but neither can human beings perfectly perform such tasks. If the machine’s emotional performance is sufficiently on par with that of humans, it will replace humans at some point or at least augment the emotional work of humans.
….. The branches of AI dealing with artificial emotional intelligence are galloping ahead because machines are no longer limited to well-structured tasks and can now deal with ambiguous situations. Ad hoc tasks involving instincts, intuition and creativity are also subject to automation. While the extent to which AI will be able to perform such tasks is uncertain, some cognitive functions are already becoming automated.
….. The broad goal of all these fields and subfields is to understand human cognition, replace or augment humans with machines, and influence people’s choices. These functions are already being widely used for clinical medicine, political analysis, customer service, market research, and business strategy. Considerable research, however, is still needed before models can understand and replicate human common sense, which is implicit knowledge and often unconsciously ingrained in human interactions.
EMOTIONAL HIJACKING (p. 161)
Dumbing Down the Masses (p. 161)
People’s memories are atrophying because they constantly depend on online searches and intelligent devices for information. As memory atrophies, attention span shortens, leading to a decline in study habits. At the same time, digital users artificially inflate their egos through social media platforms like Facebook and Twitter, with instant popularity measured by the number of likes or followers, sometimes running into millions. While these activities enhance social status—indeed, some social media stars consider themselves a new class of celebrities and intellectuals—they also contribute to a greater dependency on, and addiction to, social media. Ultimately, such users become dependent on social media for their self-esteem and psychological well-being. This cognitive reengineering is not a passing fad but the likely future being driven by the latest AI technology. I use the term ‘moronization’ to refer to this dumbing down of large portions of humanity.
This is unlikely to reverse because, contrary to the popular belief that human cognition is somehow sacrosanct, algorithmic modeling of emotions, psychological characteristics and mental faculties is already delivering practical applications. Such applications, of course, render humans highly susceptible to emotional seduction by digital systems.
Artificial Pleasures and Emotions (p. 165)
By manipulating hormones, neurotransmitters, neural networks, and eventually artificial memories, machines are rigging our human physiology to produce pleasure and avoid pain. Certain kinds of private experiences are already being technologically engineered to alter individuals’ emotional states.
One active area of research is the modeling of human weakness and vulnerability. Machine learning systems score the likelihood of users being diverted from reading something on their screen. When a pop-up appears on the screen, the machine learning system tracks the messages that are most successful in grabbing a given user’s attention. Various kinds of cognitive stimuli are devised and tested, and the responses are recorded and stored in a database that can be accessed by AI systems and used to construct a detailed map of an individual’s psychology.
This map provides insights into psychological behavior patterns. How likely are users to be diverted by, for example, an ad for a product for which they recently searched? Or by pornography? Or by a specific political conspiracy theory, or news of an impending alarming event? Models identify which individuals are fickle, which are susceptible to flattery or to techniques that feed their hunger for attention, and which respond to the types of entertaining diversions that make their humdrum lives more exciting.
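The test-and-record loop described above can be sketched as a simple explore-and-exploit experiment. This is a minimal illustration, not any platform's actual system: the stimulus categories, the epsilon-greedy strategy, and all class and method names are assumptions made for the example.

```python
import random
from collections import defaultdict

# Hypothetical stimulus categories tested on a user; names are illustrative.
STIMULI = ["recent_product_ad", "alarming_news", "flattery", "entertainment"]

class AttentionProfiler:
    """Builds a per-user map of which stimuli most reliably grab attention,
    using an epsilon-greedy test-and-record loop."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.shown = defaultdict(int)    # stimulus -> times shown
        self.clicked = defaultdict(int)  # stimulus -> times engaged

    def score(self, stimulus):
        # Estimated probability this user is diverted by the stimulus.
        n = self.shown[stimulus]
        return self.clicked[stimulus] / n if n else 0.0

    def choose_stimulus(self):
        # Mostly exploit the best-known stimulus, occasionally explore.
        if random.random() < self.epsilon or not any(self.shown.values()):
            return random.choice(STIMULI)
        return max(STIMULI, key=self.score)

    def record(self, stimulus, engaged):
        # Store the response; over many rounds this becomes the "map".
        self.shown[stimulus] += 1
        if engaged:
            self.clicked[stimulus] += 1

    def profile(self):
        # The psychological map: each stimulus with its engagement rate.
        return {s: round(self.score(s), 2) for s in STIMULI}
```

The key design point is the feedback cycle: every stimulus shown is also an experiment, so the map sharpens the longer the user stays on the platform.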
The cognitive mapping of hundreds of millions of people’s emotions, likes, dislikes, preferences and vulnerabilities is taking place in a very scientific manner. Their activities are recorded in a variety of formats including voice, text, images, handwriting, biometrics, buying habits, interpersonal communications, travel options and entertainment preferences. Machines have become extremely clever at not only capturing private information but also understanding the meaning and purpose of human activities.
….. Researchers are experimenting with physical implants that will take VR and AR systems to new heights of sensory gratification. Just as talkies replaced silent movies, a new generation of movies is predicted in which feelings will be transmitted directly to viewers through implants. Virtual Reality can be used, for example, to provide the sense of walking around the neighborhood even when one is physically confined at home.
Addictive Behavior Programming (p. 168)
Numerous books and consultants specialize in teaching AI companies how to capture users through their emotions. Hooked: How to Build Habit-Forming Products by Nir Eyal examines human desires and weaknesses to make so-called sticky apps. The intent is to map out users’ emotional characteristics, especially their vulnerabilities, and then tap into that map to create a customized AI intervention that manipulates a specific desire. Sticky apps provide outlets for suppressed desires, such as the urge to watch pornography, go on an exotic journey, or indulge the fantasy of being a popular public figure. Once someone’s hidden desires are identified, content is selected to satisfy them. Those who long to travel can do so via AR goggles that transport them to the place of their dreams. Designers of online hooks exploit people’s tendency to seek relief from stress. Based on the idea that people prefer excitement to boredom and contentment to anxiety, digital marketing companies intervene with artificial gratification to manipulate users’ emotions.
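The habit-forming pattern that Hooked popularized runs through four phases: trigger, action, variable reward, and investment. The loop can be sketched as below; all names, rewards, and thresholds are invented for illustration and describe no real platform.

```python
import random

# A minimal sketch of the four-phase habit loop (trigger -> action ->
# variable reward -> investment). Everything here is illustrative.

class HabitLoop:
    def __init__(self):
        self.investment = 0  # accumulated data, content, followers
        self.visits = 0

    def trigger(self):
        # Early on, an external trigger (a notification) starts the cycle;
        # once enough is invested, an internal trigger (boredom, anxiety)
        # takes over and the habit is self-sustaining.
        return "notification" if self.investment < 5 else "boredom"

    def action(self):
        # The simplest possible behavior performed in anticipation of reward.
        self.visits += 1

    def variable_reward(self):
        # An unpredictable payoff keeps users checking back compulsively.
        return random.choice(["new likes", "new follower", "nothing", "viral post"])

    def invest(self):
        # Each visit deposits something the user would lose by leaving,
        # which loads the next trigger and deepens the dependency.
        self.investment += 1

    def cycle(self):
        t = self.trigger()
        self.action()
        reward = self.variable_reward()
        self.invest()
        return t, reward
```

Note how the investment step is what makes the loop ratchet: it converts each gratification into switching costs, which is exactly the dependence described in the next paragraph.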
Some manipulative systems contrive scarcity as a gimmick to enhance the perceived value; online retailers often state “only three items left” to create a sense of urgency and play on the user’s fear of missing out. Other systems encourage users to invest in experiences which deepen their dependence on the system. For example, the exciting conversations held on a social media platform could become a precious part of one’s social relations, making it difficult to abandon them and start all over again on a new platform.
….. The process of psychological manipulation is designed to change behavior. The initial hook offers users a perceived benefit that the target group cannot resist; the system then makes it progressively harder for them to disengage. The resulting transfer of power is both gradual and unconscious. Facebook, YouTube and Twitter freely deliver a wide range of user experiences that consumers find difficult to resist. Artificial Intelligence systems have figured out the most powerful, irresistible desires for all kinds of individuals, and fulfill them. An entire field of research specializes in designing systems of instant gratification and addiction.
….. The playbook of the AI giants is eventually to have the maximum number of humans go through life on autopilot. People find comfort in automatic behavior that demands little or no conscious thought. Delegating one’s agency to a machine is like trusting a friend. This frees up the conscious mind to pay attention to more important activities. One day, consumers—and voters—will make very few free choices and will be rewarded for living mostly in autopilot mode.
Yet few people ever consider the ramifications of this transfer of power because they are blinded by the fulfillment of their desires. Most people are not balancing—nor even conscious of—the trade-off between the gain in gratification and efficiency and their loss of free will.
Digital Slavery (p. 177)
A common technique for training AI systems is to throw a variety of stimuli at people simply for the purpose of measuring their emotional response. By tracking and analyzing responses, machines develop ever more sophisticated psychological maps of people, and in the process become emotionally savvy.
….. The system monitors users and creates a personalized predictive model, or map, of their private psychology. The right side of the figure illustrates how digital platforms use these personalized predictive models to create what I call happy morons. Exploiting these models, machines offer inducements (or threats) to drive behavior and addict users to the platform and its alleged benefits.
….. Facebook uses tens of thousands of factors such as clicks, likes, shares, comments and personal interests to determine each user’s news feed. Its marketing material solicits advertisers by boasting about how well it can influence the emotions of users through such manipulations. Depending on what is in the best commercial interests of Facebook, its algorithms decide how to filter the information presented to each individual user. It is important to note that there is no such thing as an objective choice of content being made on our behalf.
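The filtering described above amounts to ranking content by predicted engagement rather than by any objective measure. A toy sketch of engagement-weighted ranking follows; the handful of features and weights are purely illustrative assumptions, whereas the real system reportedly draws on tens of thousands of signals.

```python
# Illustrative engagement weights -- invented for this example only.
WEIGHTS = {
    "clicks": 1.0,
    "likes": 2.0,
    "shares": 4.0,
    "comments": 3.0,
    "matches_interest": 5.0,
}

def score(post, user_interests):
    """Predicted-engagement score for one post, for one user."""
    s = sum(WEIGHTS[k] * post.get(k, 0)
            for k in ("clicks", "likes", "shares", "comments"))
    if post.get("topic") in user_interests:
        s += WEIGHTS["matches_interest"]
    return s

def rank_feed(posts, user_interests):
    # The feed is ordered by predicted engagement, not by importance or
    # accuracy -- this is the non-objective "filtering" described above.
    return sorted(posts, key=lambda p: score(p, user_interests), reverse=True)
```

Because the score is tuned per user, two people see entirely different orderings of the same content, which is why no shared, objective feed exists.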
….. Social media has become the newest opium of the masses. Digital platforms distract and control the masses with addictive content to keep them mesmerized. Users’ reactions are then analyzed, and the responses incorporated into the system in a never-ending cycle that makes algorithms ever more effective at manipulating behavior. In effect, people surrender their agency and willingly enter a system of digital slavery.
THE BATTLE FOR AESTHETICS (p. 185)
The strategy of aestheticized power is a brilliant method to deceive people and give them a false sense of pride. It pushes emotional buttons that influence people’s psychology and override their pragmatic interests.
The latest aestheticization of power is now being implemented by the digital platforms—the delivery of customized user experiences that machine learning has identified as those to which given individuals are most susceptible. Dumbing down users and addicting them to sensual gratification and intense emotions makes them more prone to aestheticization as a method of exploitation.
The use of aesthetics can be an effective means to capture power in a pragmatic sense. A crude example would be winning over someone’s heart and using the emotional attachment to siphon off their money. A more sophisticated example is the diplomatic offer of military support to another country to achieve the pragmatic goal of getting troops into that country. We are also familiar with the way missionaries win over poor people by giving them gifts at a time of vulnerability, only to convert them and turn them into a political vote bank. The sequence of events is depicted in the figure.