Written in a lucid, conversational style, Jaron Lanier’s book makes an essential argument: smartphones carry ramifications that could spell disaster for the human species. The smartphone is an invention we take for granted, but we should remember how recent it is. Because we carry this device everywhere, we are always being tracked and measured. We are being hypnotized little by little by technicians we can’t see, for purposes we don’t know. We’re all lab animals now.
Algorithms collect data on you each second.
What kinds of links do you click on? What videos do you watch all the way through? How quickly are you moving from one thing to the next? Where are you when you do these things? Who are you connecting with in person and online? What facial expressions do you make? How does your skin tone change in different situations? What were you doing just before you decided to buy something or not? Whether to vote or not? All these measurements and many others have been matched up with similar readings about the lives of multitudes of other people through massive spying.
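To make that surveillance concrete, here is a minimal sketch of what a single tracked “engagement event” might look like. The field names are hypothetical, not any real company’s schema, but each one corresponds to a signal listed above.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class EngagementEvent:
    """One hypothetical record of the kind of measurement described above."""
    user_id: str                      # who you are
    timestamp: datetime               # when you did it
    location: tuple[float, float]     # where you were (latitude, longitude)
    item_id: str                      # the link, video, or post involved
    action: str                       # "click", "watch", "scroll_past", "purchase"
    dwell_seconds: float              # how long you lingered
    watched_to_end: bool              # did you finish the video?
    inferred_mood: str | None = None  # e.g. from facial expression, if available
    companions: list[str] = field(default_factory=list)  # who you were with

# A single afternoon of browsing produces thousands of these records, and the
# records of multitudes of people are then correlated with one another.
```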
Please don’t be insulted. Yes, I am suggesting that you might be turning, just a little, into a well-trained dog, or something less pleasant, like a lab rat or a robot. That you’re being remote-controlled, just a little, by clients of big corporations.
A scientific movement called behaviorism arose before computers were invented. Behaviorists studied new, more methodical, sterile, and nerdy ways to train animals and humans.
Behaviorism worked, which was scary enough. Now, though, it is possible to shape people’s behaviors wherever they are, and you don’t even need their consent. In the past, you had to step into a psychology building to run an experiment; today, all you need is a smartphone.
Even employees of social media companies have admitted that something is very wrong.
Smart people should delete their social media accounts until a safe and less toxic version of social media is available.
Seems like a good moment to coin an acronym so I don’t have to repeat, over and over, the same account of the pieces that make up the problem. How about “Behaviors of Users Modified, and Made into an Empire for Rent”? BUMMER. BUMMER is a machine, a statistical machine that lives in the computing clouds. To review, phenomena that are statistical and fuzzy are nevertheless real. Even at their best, BUMMER algorithms can only calculate the chances that a person will act in a particular way. But what might be only a chance for each person approaches being a certainty on the average for large numbers of people.
Since BUMMER’s influence is statistical, the menace is a little like climate change. You can’t say climate change is responsible for a particular storm, flood, or drought, but you can say it changes the odds that they’ll happen. In the longer term, the most horrible stuff like sea level rise and the need to relocate most people and find new sources of food would be attributable to climate change, but by then the argument would have been lost.
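A toy simulation makes the statistical point concrete. Suppose, purely for illustration, that a nudge raises each person’s chance of taking some action from 50 percent to 51 percent. For any one person the effect is invisible; across a million people it is close to a certainty.

```python
import random

def simulate(n_people: int, p: float) -> int:
    """Count how many of n_people take the action, each with probability p."""
    return sum(random.random() < p for _ in range(n_people))

random.seed(0)
n = 1_000_000
baseline = simulate(n, 0.50)  # without the nudge
nudged = simulate(n, 0.51)    # with a one-point nudge (an invented number)

print(f"Extra actions attributable to the nudge: {nudged - baseline:,}")
# No individual can feel a 1 percent shift, but across a million people it
# reliably produces on the order of ten thousand extra actions.
```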
With nothing else to seek but attention, ordinary people tend to become assholes, because the biggest assholes get the most attention. This inherent bias toward assholedom flavors the action of all the other parts of the BUMMER machine.
The most curious feature of the addict’s personality is that the addict eventually seems to seek out suffering, since suffering is part of the cycle of scratching the itch. A gambler is addicted not to winning, exactly, but to the process in which losing is more likely. A junkie is addicted not just to the high, but to the vertiginous difference between the lows and the highs. Similarly, a BUMMER addict eventually becomes preternaturally quick to take offense, as if hoping to get into a spat.
While people who work at Twitter might, on an emotional or ethical level, prefer that their platform was bot-free, the bots also amplify the activity and intensity of the service. Massive fake social activities turn out to influence real people.
Feedback is a good thing, but overemphasizing immediate feedback within an artificially limited online environment leads to ridiculous outcomes.
Here’s a non-geeky framing of the same idea: What if listening to an inner voice or heeding a passion for ethics or beauty were to lead to more important work in the long term, even if it measured as less successful in the moment?
A common and correct criticism of BUMMER is that it creates “filter bubbles.” Your own views are soothingly reinforced, except when you are presented with the most irritating versions of opposing views, as calculated by algorithms. The pattern has become so clear that even research published by social media companies shows how they make you sad. Facebook researchers have practically bragged that they could make people unhappy without the people realizing why. Why promote something like that as a great research result? Wouldn’t it be damaging to Facebook’s brand image? The reason might have been that it was great publicity for reaching the true customers, those who pay to manipulate.
Recently, Facebook researchers admitted the truth of claims that other researchers had made: their products cause real harm. The papers in the footnotes show that there are many hypotheses for why social media makes people sad: unreasonable standards for beauty or social status, for instance, or vulnerability to trolls.
Why the variety? Wouldn’t one way of bumming people out be enough? Since the core strategy of the BUMMER business model is to let the system adapt automatically to engage you as much as possible, and since negative emotions can be utilized more readily, of course such a system is going to tend to find a way to make you feel bad.
It will dole out sparse charms in between the doldrums as well, since the autopilot that tugs at your emotions will discover that the contrast between treats and punishment is more effective than either treats or punishment alone. Addiction is associated with anhedonia, the lessened ability to take pleasure from life apart from whatever one is addicted to, and social media addicts appear to be prone to long-term anhedonia.
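Here is a minimal sketch, not any platform’s actual code, of the kind of adaptive loop being described: an epsilon-greedy bandit that measures only engagement. It has no concept of whether a reaction feels good or bad; it simply drifts toward whichever category of content keeps people reacting, and the intermittent arrival of those reactions is exactly the treat-and-punishment contrast described above.

```python
import random

# Hypothetical content categories with made-up engagement probabilities.
# The system never sees labels like "positive" or "negative"; it only sees
# which posts draw a reaction.
ENGAGEMENT_RATE = {"calm": 0.05, "cute": 0.10, "outrage": 0.30}

counts = {c: 0 for c in ENGAGEMENT_RATE}
rewards = {c: 0 for c in ENGAGEMENT_RATE}

def pick(epsilon: float = 0.1) -> str:
    """Epsilon-greedy choice: mostly show whatever has engaged best so far."""
    if random.random() < epsilon:
        return random.choice(list(ENGAGEMENT_RATE))
    return max(ENGAGEMENT_RATE,
               key=lambda c: rewards[c] / counts[c] if counts[c] else 0.0)

random.seed(0)
for _ in range(100_000):
    category = pick()
    counts[category] += 1
    rewards[category] += random.random() < ENGAGEMENT_RATE[category]

print(counts)  # "outrage" ends up shown the vast majority of the time
```

Nothing in the loop is trying to make anyone miserable; it simply converges on whatever keeps people reacting.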
Gig economy workers rarely achieve financial security, even after years of work. To put it another way, the level of risk in their financial lives seems to never decline, no matter how much they’ve achieved. Meanwhile, a small number of entrepreneurs—who always turn out to be close to some kind of computation hub—have become fantastically wealthy, creating an ever-widening gap between rich and poor, reminiscent of the nineteenth century’s Gilded Age.
Risk has been radiated out to ordinary people; those close to the biggest computers are locked in to wealth, like casino owners.
There was a time when people were happy about business titans like Steve Jobs becoming so successful. At the same time, the people building the internet wanted everything to be free.
There was a lot of hedging and fudging on this point around the turn of the century. Ultimately, only one method of reconciliation was identified: the advertising business model. Advertising would allow search to be free, music to be free, and news to be free. (That didn’t mean that musicians or reporters got a piece of the pie, for the techies considered them replaceable.) Advertising would become the dominant business in the information era.
This is how BUMMER was born. Now we feel helpless, but we made the choices before, and we can make them again.
The business plan of BUMMER is to sneakily take data from you and make money off it. Look at how rich BUMMER companies are and remember that their wealth is made entirely of the data you gave them. I think companies should get rich if they make things people want, but I don’t think you should be made less and less secure as part of the bargain. Capitalism isn’t supposed to be a zero-sum game.
Art might be created automatically from data stolen from multitudes of real artists, for instance. So-called AI art creation programs are already practically worshiped. Then, robotic nurses might run on data grabbed from multitudes of real nurses, but those real nurses will be working for less because they’re competing with robotic nurses.
BUMMER only supports stars. If you are one of those rare, rare people who are making a decent living off BUMMER as an influencer, for instance, you have to understand that you are in a tiny club and you are vulnerable. Please make backup plans! I hate raining on dreams, but if you think you are about to make a living as an influencer or similar, the statistics are voraciously against you, no matter how deserving you are and no matter how many get-rich-quick stories you’ve been fed.
The problem isn’t that there are only a few stars; that’s always true, by definition. The problem is that BUMMER economics allow for almost no remunerative roles for near-stars. In a genuine, deep economy, there are many roles. You might not become a pro football player, but you might get into management, sports media, or a world of other related professions. But there are vanishingly few economic roles adjacent to a star influencer. Have a backup plan.
Google’s director of engineering, Ray Kurzweil, promotes the idea that Google will be able to upload your consciousness into the company’s cloud, like the pictures you take with your smartphone. He famously ingests a whole carton of longevity pills every day in the hope that he won’t die before the service comes online. Note what’s going on here. The assertion is not that consciousness doesn’t exist, but that whatever it is, Google will own it, because otherwise, what could this service even be about?
I have no idea how many people believe that Google is about to become the master of eternal life, but the rhetoric surely plays a role in making it seem somehow natural and proper that a BUMMER company should gain so much knowledge and power over the lives of multitudes. This is not just metaphysics, but metaphysical imperialism.
The big tech companies are publicly committed to an extravagant “AI race” that they often prioritize above all else. It’s completely normal to hear an executive from one of the biggest companies in the world talk about the possibility of a coming singularity, when the AIs will take over. The singularity is the BUMMER religion’s answer to the evangelical Christian Rapture.
The weirdness is normalized when BUMMER customers, who are often techies themselves, accept AI as a coherent and legitimate concept, and make spending decisions based on it. This is madness. We forget that AI is a story we computer scientists made up to help us get funding once upon a time, back when we depended on grants from government agencies. It was pragmatic theater. But now AI has become a fiction that has overtaken its authors.
AI is a fantasy, nothing but a story we tell about our code. It is also a cover for sloppy engineering. Making a supposed AI program that customizes a feed is less work than creating a great user interface that allows users to probe and improve what they see on their own terms—and that is so because AI has no objective criteria for success.
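To see the contrast, here is a purely illustrative sketch of what letting users probe and improve a feed on their own terms might look like: a ranking whose weights are explicit and editable, rather than an opaque learned score. The field names and numbers are hypothetical, not a description of any real product.

```python
# Hypothetical, user-visible ranking weights. Nothing is hidden or learned;
# the user can inspect every term and change it.
DEFAULT_WEIGHTS = {
    "from_close_friend": 3.0,
    "hours_old": -0.1,          # older posts rank lower
    "contains_news": 0.5,
    "provokes_argument": 0.0,   # set this negative to mute flame bait
}

def score(post: dict, weights: dict) -> float:
    """A transparent score: just a weighted sum the user can inspect and tune."""
    return sum(weights[k] * float(post.get(k, 0.0)) for k in weights)

def rank_feed(posts: list[dict], weights: dict) -> list[dict]:
    return sorted(posts, key=lambda p: score(p, weights), reverse=True)

# Example: a user who is tired of arguments dials that weight down directly,
# instead of hoping an opaque "AI" eventually infers the preference.
my_weights = dict(DEFAULT_WEIGHTS, provokes_argument=-5.0)
```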