Self-inflicted incapacitation can involve little more than a twenty-first century computing device. A screen. A bright one. One that glows and purrs with the sirenic noosphere at large. “This is the President speaking …” with your wildest dreams come true: 1-click shopping from your home in your underwear; online/offline online gaming during the Big Game (with major fantasy implications) (read: major) (daily); a running faucet of “news”; no-buffer Not Suitable For Works at work; YouTube with your neighbor’s Netflix; memes wafting (inhale); all the social media; and, like, so many likes; with love a swipe away.—
From this flourishing garden of pseudo-earthly delights, the ripest and sweetest (and most specious) fruit has to be visual media. Pictures in motion and in standstill in quantity are nearly equally hypnotically soothing to the savage ape mind. The medium is universal: we can’t help but turn our attention toward the celestial beacon: a 16:9 backlit screen eclipsing our peripheral vision moving at 30 frames per second rotating spasmodically between multiple camera angles overlaying mesmeric undulating three-dimensional graphics capriciously (jump-)cutting scene to scene to scene spewing statafterstatafterstat. It is there. We are not here. Resistance is futile; welcome to hell.
The ineludible allure of image-based content, and video in particular, is that it takes so little effort to make sense of. We perceive, we look, we interpret, we act. The process is natural. But the magnetic poles are reversing; it, video, is now making more sense of us. Money’s on the line and I’ll be damned if we can’t look away. Our minds are metrics; programs are formulas; they both are being deciphered at breakneck pace with Machiavellian consequence as the inherent weaknesses of the prone human psyche are tugged, teased, and manipulated to form, to watch, ad nauseam, as if in the hands of a skilled ceramicist.
And no brain is more malleable than a young child’s.
. . .
A Viewing from the Book of Genesis: My mother believes that any precocity I demonstrated as a preschooler is the result of Sesame Street osmosing the A, B, C’s into my uberneuroplastic mind — not, like, whatever environmental sway she and consequently I were exposed to from conception through early youth. (A bogus notion, verily.) If television were really so capable of fostering intellect, then why is society not being continually enlightened by mass media? Why are we not leveraging this propitious technology past the recommended viewing age of five? Why are we not overgenerously funding further far-reaching initiatives to unfold the human psyche and pinpoint its ocular–aural learning centers and disseminate the knowledge to the people?
Oh, wait: This is being done. And there is no proof it works.
So why the faith in Sesame Street? News outlets periodically run stories like these that tout the efficaciousness of PBS’s/HBO’s hit show:
- 1995: The New York Times
- 2001: Science Daily
- 2006: Washington Post
- 2012: Scientific American
- 2015: Vox, Washington Post, CNN, Current
I presume similar or the same pieces get passed around to local news outlets and more stories have been lost in the blowing sands of the entropic internet and others weren’t ever published to the web. I don’t know how else my mother would have arrived at her assumption; she must have encountered some lede in the newspaper or on the televised news. (I asked, and she doesn’t remember.) Other pertinent questions: Where do these stories originate? Who propagates them? And to what extent are they to be believed?
Enter, The Muppet Show: From the 1995 NY Times article, Aletha C. Huston, pundit, who from 1970–97 received over $3 million in grants from the likes of Children’s Television Workshop (now Sesame Workshop, “the non-profit organization behind Sesame Street”) to study interactions between kids and televisual media:
“In one of her presentations, “Myths about Children and Television,” she explained some of her work and research. “I want to do some myth-bashing,” she said. “I want to tweak the intellectual snobs who say television is a wasteland,” conveying that quality children’s television could be very educational.”
—KU Women’s Hall of Fame Page
Or how about Daniel R. Anderson, referenced in the 2012 Scientific American piece, who along with Dr. Huston authored the often-cited 2001 Monograph Early Childhood Television Viewing and Adolescent Behavior: The Recontact Study and was recipient of over $4 million in grants over a similar timeframe (1973–2012) to perform similar relational studies. The aforementioned Recontact study is thought to provide evidence substantiating the notion that educational programming is effective and does edify our youth, as noted through positive correlation between “Informative TV” time and high school GPA, among other desirables.
I don’t know enough to refute any of the inferences besides pointing out how much historic funding was directed toward the authors from corporate parties who would rather see positive results emphasized, though I will pull this endorsement from a third party who was sufficiently qualified to have his word included at the end:
“Educational television works: It has sustained, long-term, positive relationships to development and behavior.”
—Reed Larson on The Recontact Study
Contrast this with the most recent 2015 study conducted by economists Melissa S. Kearney and Phillip B. Levine who received a combined ~$90k grant from the Spencer Foundation (which, unsurprisingly, has funded Huston and Anderson in years past) to research “The Lasting Effect of Sesame Street on Children’s Development: Lessons for Early Childhood Interventions,” of which lasting effect there appears to be none. Their analysis is comparatively damning:
“The results indicate that Sesame Street improved school readiness, particularly for boys and children living in economically disadvantaged areas. The estimated impact on ultimate educational attainment and labor market outcomes is inconclusive.”
Dissent! And yet this piece was still paraded by the purblind media with the same “Sesame Street works” angle rather than “Children’s television programming may do nothing to improve prospects of nation’s youth.” Kearney and Levine have since provided a more sensible interpretation of their findings and Amaya Garcia of New America has lent follow-up commentary on “What Sesame Street Is and Isn’t” as well.
I will refrain from delving into either study but here are key points for the inquisitive reader to contrast them:
- 2015 Study
- Data Points: 500,000+ individuals
- Location: United States (all over)
- Data Gathering: Census and other databanks
- Ages: 2–5 for SS viewing habits; 22–31 for educational attainment; 32–41 for labor market outcome
- The Recontact Study
- Data Points: 570 individuals
- Location: Topeka, KS and Springfield, MA
- Data Gathering: Video camera or self-assessment; phone interview and high school transcript
- Ages: 3–5 for SS viewing habits; 17.45 avg. for follow-up assessment
My opinion? Sesame Street serves as a cheat-sheet to help children pass K–2 but it doesn’t improve cognition or ameliorate the environmental impediments that hold kids back in life. Nobody is getting ahead by watching SS; it’s more or less an innocuous babysitter.
For many parents, this is probably acceptable (downtime is welcomed), but the prevalent “Your Child is Learning” narrative is misleading, and I would hope that A) parents are not substituting Sesame Street for genuine environmental exploration/interaction and B) the show was not eating up dollars that could be better spent on, say, the U.S. education system. This second point was the impetus for creating SS in the first place: to educate disadvantaged kids cheaply. Television, being so cost-effectively wide-reaching and rapidly enactable (more so than any initiative to address the environmental conditions of impoverished families), was an attractive option to mass-swaddle the nation’s youth.
In the end, to believe that watching one specific television program for 1–2 hours per day is beneficial from ages 2–5, but that screen-watching before or presumably after that age range is detrimental, strikes me as delusional.
The braintrust got it wrong, but why admit failure when money, control, and power are in hand? The ideal has not landed totally antipodally astray.
. . .
Gateway to Heaven: My parents, allayed of misgivings by the illusory influence, were at ease with the notion of their son watching television with them, together, and later unsupervised, as they had too; thus many (frighteningly many) of my earliest and most vivid childhood memories involve TV and computers.
I retain no recollections of my Sesame Street schooling (apparently it was every A.M.), but I religiously caught and recorded on VHS every new episode of The Simpsons (8 P.M. Sundays on Fox) throughout most of the 90s. I rented 16-bit Sega cartridges from the grocery store every other week (back when that’s what you had to do to economically experience new digital media). I spent summers on the computer. I went to friends’ houses to solely (solely to) play their best games. (Carmen had Gunstar Heroes. Mike had Bart vs. the Space Mutants. Chris had Agent Under Fire.) I watched Tony Hawk do the 900. I was Afraid of the Dark. I got Shawn Michaels’ autograph through AOL chat in ’96. Pipe Dream and Flying Toasters. Granny and Pop-Pop had cable; Grandmom, to her discredit, did not. FuncoLand had the best deals. Instead of reading books, I read GamePro. Recess became Game Boy hour. Can’t miss the playoffs. Always leave a note. Picture-in-picture-in-picture. Five overtimes contemplating Cowboy Bebop while I Want to Be a Millionaire. Then the reality TV interjection: Pumpkin spitting on New York. Snooki’s glass jaw. With my looks I may as well have been The Bachelor. Living in Melbourne time for two weeks while residing on the East Coast to catch all of the Australian Open. Facebook status updates: “Watching Oprah.” “i miss having a rat tail then my dad offering me a subscription to WWF magazine to get it cut.” “PARTY ON WAYNE.” 167 hours to endure until the next Breaking Bad. “Wheel … of … Fortune!” “Jesse.” “HeadOn. Apply directly to the forehead. HeadOn. …” “Hey Meg.” “That boy ain’t right.” “Mennnn?” “Tonight on Action News …” “This is SportsCenter.“—
I’m addicted. How could I not be? I was raised in front of a screen and by a screen. Consequently I’ve logged more hours engaged with them — television, games, computers, et al. — than with people. Like, as in unmediated human contact. I write this with wretched chagrin because of how obvious it is why I feel so uncomfortable and lack grace in many social situations. Electronic devices, so limited in potential, unlike the human capacity for expression, have constituted my surrogate tribe.
But it’s not even just direct human interaction that I’ve let slip from being. The environment I grew up in, which surrounded me for so long, was obliviously foreign until one day, in my twenties, I ventured out on bicycle and explored a roughly three-mile radius. There is so much to see. My spatial awareness was radically changed as I explored roads untravelled and topography unblurred. To realize there was so much around me that I didn’t know or understand was enlightening. I’d too often intermeshed with media in belief rather than looked out the window in wonder.
From ignorance of the outside in: Though I, and others, probably, consider myself to be a cogitative individual, one who thinks more than he speaks or acts, I ruminate predominantly on trivialities and people other than myself. It shames me to admit this but I’ve been so conditioned by the media, which glorify inconsequential persons (read: celebrities), that I will often think unavailingly about them, or current events, or friends, or family, or otherwise deflect attention from the most real, tangible, and least-considered topic, me: Adam. Because when was the last time you were the lead story? To sit and think, by oneself, about oneself, and then, crucially, articulate one’s thoughts, feelings, perceptions, and understandings into sensory form, be that among visual, auditory, and/or kinesthetic manifestation, is a lost craft.
And though this all seems weird, to have lived so distantly from my most immediate loci of understanding, the existential orbit until now has felt fine; situation normal: not fucked up. One of these things is like the others, and it’s me. The thing is me. I fit in growing up. I seek pleasure just like everyone else and technology provides it in spades. It is the disseminator and I am impelled to pursue satisfaction above all. I am powerless in resistance to neonatally suckle on the engorged communal tit and take in the world for I should know it, as I’ll ever know it, in dazzling, augmented, hi-definition display.
The inkling: I can’t help but feel I’m missing out. Or maybe the idea is to miss out. I don’t know: I was born into this. My perspective is limited.
However, from what I can perceive, it doesn’t seem my technological habituations are all that atypical of others of my generation and probably many of those before mine, dating back to the advent of broadcast television. My grandparents, born circa 1923–1931, all watched gratuitous amounts of primetime television with their children and grandchildren. Those were our nights together. Because it feels good to know the question on Jeopardy!; because Kramer is funny. Because our lives are pedestrian and uninteresting and comparatively boring superimposed with that of a charming fictional protagonist and it’s so much more difficult to maintain conversation than to together passively soak in the televisual glow.
And now that the infrastructures are in place, live streaming (the can’t-miss-as-it’s-happening irreversible adventure, akin to less-censored live-televised news and sports but for a wider array of topics and more anything-goes, its power now lying in the hands of individuals with webcams) is reshaping the landscape of compulsory screen watching. (Screens, I should mention, are beyond ubiquity; they are intrusive: gas stations now have eye-level monitors informing you audibly and insolently of the “news” at the pumps.) I would be sickened to see the actual runtime of my Twitch.tv live streaming consumption back in the fall, when I moved lonesome and was unwell and I suppose yearned for intimate connection with another human being. These independent live streams provide that, and anonymity. No one has to know you’re peering in, and it’s easy to peer in. I was hooked for ~six hours daily. (Let’s be honest: probably more.) On screen, inches away in front of you, with what isn’t but should be considered frightening immediacy — a few clicks or taps and seconds — is the high-def bust of a compelling pseudo-celebrity acting and interacting (sometimes with you!) among thousands of viewers in near-real time. It feels as if you’re physically close and even emotionally near as the streamer’s personality — quirks, warts, flaws, and all — emerges over time, because realness does come out when someone is on camera for forty-plus hours per week or twenty-four straight hours, and this candor lends relatability and humanness in a way even reality TV struggles to capture. But you’re not close to the streamer. The conversation is decidedly one-sided. If you, the viewer, onset by graphical coruscation, were to go down with a freak grand mal, the show would go on. (… Help?)
So: recently I’ve come to some probably-superficial level of self-acknowledgement about my problem, and despite considering it to be that, a problem, I am unable, without great difficulty, focus, and anguish, to break these ingrained habits. A moth can’t help but perish in flame. I am attempting to push myself out of my comfort zone and it sucks. I don’t know if life outside the net will be better but I’ve felt myself drifting toward a level of immersion too deep, too disjointed from reality, and I already look back on the past twenty years of my life with regret and concern. The next twenty have got to be better.
All this said, I’m not in position to go off-grid or anything like that — I’m still tethered by the internet and my Apple products. You would not be reading this without them. But I am making changes in how I use these technologies to maintain more realism about myself and my surroundings. I think maybe I’m onto something. The rabbit hole is deep but the surface is porous. The following ideas I type out as much for myself as for you, the ardent reader. When I force myself to articulate thoughts I inevitably find frayed ends in my assumptions and must delve further to invoke clarity.
Along with the ability to read these words, you likely have a high-speed internet connection and the God-power to retrieve nearly any possible video you can think of, right now, for free. (Why are you reading this?? I will tempt you: cats.) Audiovisual delight proliferates. It pervades. Remember: Television was once stationary and distributionally constrained. Families would huddle around their one television set in the living room and parley for control over three black-and-white channels. High-def video now cocoons in pockets. It has speciated microdemographically, transgressed the time-space continuum, and verily warped verisimilitude.
In short, we have more control than ever over what we watch and when we watch and where we watch and with whom we watch. Are we not nearing the ultimate?
The Reality of Screen Watching: You are docilely peering at a piece of glass. You are sitting, not moving, not talking, and sometimes blinking. You are not thinking too hard, which feels nice. The screen thinks for you. It is happy to. That’s what it’s here for. Pictures in motion fill cranial space. The thrum of C major. That’s all you need. Polynomial existential grandeur. You do nothing.
However, as you’re — for the demonstrativeness of this essay — doing nothing, watching screen, your brain reacts as if you are doing something! (OK — admittedly the science of mirror neuroanatomy is nascent and psychologists question how legit it even is and what purpose the mechanism would serve, but just go with me here.) In this way, video is hypnotically passive: watching other people do stuff makes you feel like you’re doing stuff. It’s sort of like a false déjà vu/fait and it diminishes your drive to actually go and do stuff, because you feel like you’ve already done stuff. (Done-stuff-ception?)
But though this experience lacks depth, watching is preferable to doing, rationally speaking, in many situations, in ways I’ll outline below. It just makes more sense.
From this has evolved a paradox where the dilemma — to watch, or not — subconsciously presents itself as having binary outcome — when it actually doesn’t — which justifies and reinforces a compulsion to watch:
- I can watch X,
- or I can partake in X.
Where X is whatever subject, topic, or idea. (This concept I’m presenting doesn’t work for all video content. Some films are abstract and are the experience. To “partake” I mean to engage in X in a closer-to-actual or first-hand form.)
Here’s an example:
I can experience Nirvana ’92 from my home in my underwear for free. (OK — there is financial prerequisite, but it’s minimal, and the experience, though not live-and-in-person, is arguably better.) This is an acclaimed concert and I know I like Nirvana and it’s likely I will enjoy this historical documentation. (And I have watched it; it’s great.) Let’s say I have the choice to see live music or watch the Nirvana recording instead. It’s a weeknight and I want to be entertained. The resolution is obvious.
Live music can be fun. It’s not so much about the music, really, but the atmosphere; the venue, the people, the spectacle. Most bands’ studio recordings are better than anything they play live but live-and-in-person brings unpredictability and chaos into the mix. Good stories relayed by concertgoers lack harmony; romance, drugs, and violence — shades of pandemonium — highlight events.
To witness live music in person requires time, energy, and money. For me to make it to a concert and back, here’s what I would have to do:
Start Time: ~6:15 PM
- Put on pants
- Walk to the train station
- Wait for the train
- (it’s late)
- Pay $14 for a round-trip train ticket (begrudgingly)
- Twiddle thumbs for 48 minutes as the R5 crawls toward Philadelphia
- (Try to identify train-going culprit of five-day-old cheesesteak smell)
- Walk one mile to the concert venue through a poorly-lit power-walking-advised section of town
- Pay $27.50 (plus $7.95 in processing fees) for a concert ticket (again, begrudgingly)
- Pay another $1.00 for ear plugs because I forgot to bring mine and without this scientific advancement I will hear bells for three days
- Stand in a stupor for two hours while the opening bands try
- Readjust ear plugs approximately 391 times
- (Who farted??)
- Head into the pit for the headlining band
- Leave the pit because this dense mass of sweaty stinky humanity is surely breeding ground for norovirus or at the very least a head cold
- Rock out (courteously on the sidelines)
- Lose track of time
- Run a mile back to the train station
- Yell because I just missed the 10:23 PM outbound train and have to wait another hour for the next one back home
- Board return train, first
- Try to not fall asleep like the poor traveler conked out drooling in front of me who I think was supposed to depart five stops ago
- Walk back home
- Take off pants
End: ~12:15 AM (-6 hours, -$50)
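For the spreadsheet-averse, the damage above tallies up like so (figures lifted straight from the itinerary; a back-of-the-envelope sketch, nothing more):

```python
# Line items from the concert itinerary above.
costs = {
    "round-trip train ticket": 14.00,
    "concert ticket": 27.50,
    "processing fees": 7.95,
    "ear plugs": 1.00,
}
hours_out = 6  # ~6:15 PM out the door, ~12:15 AM back home

total = sum(costs.values())
print(f"Damage: {hours_out} hours and ${total:.2f}")  # Damage: 6 hours and $50.45
```

Call it -$50 and -6 hours, as advertised.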
I can do all that … or I could have watched Nirvana for zilch, not endured the stress of traveling to nighttime Philadelphia, and still have 4+ hours of leisure time in my pocket.
Obviously I’m staying home. It’s the rational decision. To leave home doesn’t make sense unless the band is known to put on a superior live performance (think: the Grateful Dead) and they are, like, my fav, or this is being treated as a social event, in which case does the music matter?
Here’s what makes this situation paradoxical: the options aren’t binary. I can not go to the concert, and I can also not watch prime Cobain & co. But why deprive myself of one and the other? This is difficult to answer. It doesn’t make sense. I want to say “because there are more creative solutions that will provide similar levels of amusement with less output required” but the stay-at-home-and-watch option yields, at least in the short-term, the highest and most certain ratio of pleasure-to-investment — a click. To do anything else is more risky. Maybe that’s where the fun is.
My experience and observation is that audiovisual content fails to sufficiently challenge the brain in a way that expands the mental faculties. I believe video molds us more than anything; the passivity of it promotes neurological stasis (and thus atrophy).
Even inherently pedagogical audiovisual media — think back to high school biology and the one or two glorious days per semester when your teacher would play an educational film for the entire class — doesn’t stimulate learning to the degree a hands-on activity can or even just reading text.
(Pourquoi? My guess is because the only requirements to consume video are that 1) your eyelids be open, 2) your ears not be plugged, and 3) your body be oriented vaguely toward the emanation. 1) and 2) are pretty autonomic when one is awake. It doesn’t take much work.)
But what attracts consumers most is not intellectual programming, rather cheap thrills and laughs. (“Home run!”) (“Honey, I shrunk the kids!”) (“Augghha!”). We seek to be entertained. We watch to feel. The effect we desire from video is analogous to that sought by an addict from drugs: a spike in neurotransmission of dopamine, serotonin, and various endorphins, namely. This isn’t wrong per se — genuine interactions with our environment also elevate the release of neurotransmitters — but it’s the ease with which we can access video and then the ease with which video accesses us — our nervous systems — that’s troubling. The people who make commercial film and video know to play on emotion and cater to a wide audience and make their work approachable otherwise it’s going to take too much input for the viewer to gain output. So, like, 1) the material is intellectually vapid; it’s simplistic and often predictable. But it’s not even just that: 2) you’re told how to perceive what you are viewing. Think: Canned laughter. Think: Catchphrases. Think: Mood-altering music and sound effects. Think: A guy tries to throw a ball past another guy but the other guy hits it with a stick and the ball goes over a restricted boundary. This is exciting. Did you see that? An incredible feat. You could not do that. This only happens 1/3245 times. This is the pros. Guy #2 weighs 231 pounds. His favorite color is twelve. Plus his shoe was untied. Let’s check the replay. This is entertainment. (COMMERCIAL BREAK: Thanks to our bloodsucking corporate sponsors and all you dumb shmucks at home for making this moment possible. Really: Thank you.)
It’s really quite asinine when you interrupt the narratives, pick apart the charades, and critically evaluate what’s happening. (See: Friends without a laugh track.)
Truthstream Media recently put out a video depicting the hypnotic effects of modern graphics, which do the opposite: rather than illusorily excite, these props transfix or lull us into watching. I think back to when I cared about professional sports and watched ESPN bidaily and would not budge from the TV because I knew some datum of interest — a score, a stat, a headline — was soon to arrive in the BottomLine (a persistent news ticker at the bottom of broadcasts) and I must wait for it. (Patiently.) The ticker became the program a lot of the time. What a boon for viewer retention. Audio visualizers function similarly.
Can books or activities or conversations be dumb too? Yeah. Not every communicative exchange needs to be heavy and serious and sullen. But I think video particularly is problematic because of how little you have to work at it. It thinks (and acts) for you.
OK — I’m going out on a limb here. Tin-foil hat time: Because video can appear so real, and a lot of the same faces appear on screen over and over, it fosters this false sense of intimacy with individuals we will never interact with in any meaningful way: celebrities, athletes, musicians. This causes us to lose sight of the community around us. Why interface with my elderly neighbor when this cool exciting young person is right here?
Books, radio, music, and art can all produce the same general effects — unactionable or -actualizable fixation and social isolation — but my observation is that video elicits more of these single-sided false relationships. Yes? No? Maybe? #NWO? (shrug)
Lump the next topic in here on some lesser but still great scale as well …
Not even as an inconspicuous observer. It’s incredibly tempting to quickly — just for a minute! — scroll through timelines and feeds to keep up with The World, but what pervades is so fraught with ignorance, deceit, and irrelevance that to exist there is delusion. What matters is us and around us.
Despite realizing this, I still struggle to keep myself from checking various Twitter accounts throughout the day even though I know chances are 51% or higher that I’ll encounter some diversionary idea or figure that whirrs in the back of my mind and influences my thoughts and behavior for the next 24 hours at least, and usually more like a week if all threads of subsequent action and non-action are tallied. I know the signal-to-noise ratio is miserable and yet I still subconsciously seek out the noise to self-sabotage and distract myself from real issues I’d rather not deal with. It hurts more and less at the same time.
My specific issues with the dynamics of social media are twofold:
Behavior seems to self-regulate as follows:
- People aim (read: agonize) to portray themselves over social media as being interesting and unique and likable and relatable. (I mean, why do otherwise?) This leads to the perception — intended or not — that these people are “on” all the time. They are superhuman. They are ceaselessly #winning. They’ve figured it out. This imposes an impossible measuring stick on the rest of us that we and they can never really match.
- A certain segment of the population tries to behave in a similar (unnatural) way that efficaciously procures likes, follows, and retweets to prove their social worth. Social media runs off this: the accrual of “internet points” (I.P.). Others don’t try, suffer anomie, and resort to feeding like suckerfish off celebrities’ prestige to maintain self-esteem.
- For those that try: Not receiving I.P. congruent with internal expectations causes distress from perceived social dissonance. Receiving appropriate I.P. reinforces abnormal behavior (see: operant conditioning).
Not until the 2016 U.S. presidential election did I realize how profoundly social media (and the internet + websites in general) could sway public opinion through propaganda. It’s hard to depict in retrospect how the internet behaved in those few weeks leading up to voting day in November, but here’s a scene for you: virulent bigoted chaos. It was toxic. Hazmat suit required. Educated and otherwise seemingly sensible individuals were reduced to argumentative, name-calling, melodramatic lunatics proclaiming, “There is one choice. Period.”
… Umm, OK — does that sound at all cultish to any readers besides me? Like, what?? At the point when a “free person” fervently clamors that to vote any other way is wrong, I tend to think brainlessness is at play, and that stifling conviction comes from top-down influence: those with money, fame, and power insidiously propagate their ideals which trickle through the ranks down to you, the good citizen, who, through repeated exposure and influence, becomes a pawn; you start spewing the mantra of the elite, amplifying the echo chamber, and inadvertently indoctrinate others in your social circle to think this certain way. This is why social media is so effective in manipulating opinion: there is identifiable social proof — those you hold dear promote agendas unwittingly and then of course you do too and dominoes fall. This isn’t a new concept really but social media spreads ideas with the celerity of a supervirus — much faster than old word-of-mouth or even TV.
Both the Democratic and Republican campaigns were disgusting and I don’t remember presidential elections being as emotionally charged in the past. I believe this most recent one was so crazy because politicians are becoming more adept at tapping into the internet, social media, and most specifically our phones. That’s how these power struggles are now going down: in your face and in your hands.
Of course we have another few years until Mayhem: Part II but you can bet that mass herds of people have been, are, and will continue to be influenced by social media under unscrupulous pretense. The motives will often be more innocuous than, say, for reign of the Western world, but still. It’s messed up. Has it ever been easier to get a whole bunch of people to behave a certain way? (Where will the technology go from here?)
Think: Many people take part in social media to express themselves as individuals — but they often end up becoming even more like everyone else, without realizing it. Again, this is a pretty standard group–individual dynamic — environmental acclimation — but the reach of social media is so great and threads so interwoven that group boundaries cease to exist and there perfuses a universal “same.”
(Side note: Last week I heard the colloquialism “memelord” spoken twice — audibly, publicly, unabashedly — by persons of divergent demographics — one red-headed freckled pasty pudgy middle-schooler; the other a twenty-something clean-cut Hispanic barista. Memelord. Presumably an acclaimed puerile digi-jokester? Seriously: Is this one of the like two-and-a-half things on people’s minds? Being funny on the internet? Please.)
Browse Without a Trace: The reason I advocate this is that it’s so easy to surf yourself into a rut and visit the same websites over and over. Modern web browsers and search engines implement predictive technologies based on your search history that guess what you’re going to type, as you type (and often with spooky clairvoyance). I don’t believe there’s any ill intent with this technology — it is helpful! — but it can both regulate curiosity (because it also pulls from popular search queries on some local to international level) and ingrain bad habit (e.g., I type the letter “E” to begin a search for “Existentialist themes in post-modern French literature” but my browser suggests Etsy.com instead, which I just visited for three hours uninterrupted yesterday in quest of the perfect frog- (or toad-)shaped ashtray for my grandmother, and I’m again tempted to peruse for more quaint knick-knacks).
Regularly clearing browser history helps prevent such stasis.
How to do this? You can use Private Browsing (Incognito) Mode to prevent your history from being saved in the first place, but this also prevents cookies from being stored, which means your logins will get wiped after each browsing session, and it’s annoying to have to log into websites over and over. This can, however, certainly be construed as a good thing. Impediments break tendencies.
If you aren’t that impetuous, the Auto History Wipe add-on for Chrome can clear your history automatically (and preserve cookies). These are the settings I use:
Safari doesn’t seem to have any type of history-clearing plugin like this available, so you’ll have to either manually clear your history periodically (which, be honest, you won’t often enough) or use Private Browsing Mode exclusively. There is actually a setting to make Safari open with a private window by default under Preferences –> General:
Chrome doesn’t have this setting but there is AppleScript available that does the same thing? Maybe? (I’ve not tested it.)
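I haven’t tested the AppleScript either, but for the curious, here is a minimal sketch of the same idea in Python. It assumes macOS (where the `open` command exists) and relies on Chrome’s documented `--incognito` command-line switch; the function names are my own invention.

```python
import subprocess

def chrome_incognito_command():
    # macOS `open` invocation: -n forces a new instance, -a names the app,
    # and everything after --args is handed to Chrome itself.
    return ["open", "-na", "Google Chrome", "--args", "--incognito"]

def launch_chrome_incognito():
    # Actually runs the command. macOS only; assumes Chrome is installed.
    subprocess.run(chrome_incognito_command(), check=True)
```

You could bind a script like this to a dock icon or hotkey so that the incognito window, not the history-hoarding default, becomes the path of least resistance.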
Empty new windows and tabs are good too. Chrome (again) doesn’t come packed with this feature but the Empty New Tab Page extension works well.
You’ll also want to disable prediction services. In Chrome, untick “Use a prediction service to help complete searches and URLs typed in the address bar” under Settings –> Advanced:
Safari has a similar setting — “Include search engine suggestions” — under Preferences –> Search:
Instead of keeping bookmarks, save links to text files and refer to them when apt. (I use Simplenote for this.)
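The text-file habit needs almost no tooling. A sketch of the idea (the file name and one-link-per-line format are my own choices, not anything prescribed above):

```python
def save_link(url, note="", path="links.txt"):
    """Append a link, with an optional short note, to a plain-text file.

    One link per line; the note rides along after a '#' so the file stays
    greppable and readable in any editor (or in Simplenote, for that matter).
    """
    line = f"{url}  # {note}" if note else url
    with open(path, "a", encoding="utf-8") as f:
        f.write(line + "\n")

# Usage:
# save_link("https://example.com/essay", "read later")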
On iOS, these are my Safari settings:
Notably: I’ve disabled suggestions and “Frequently Visited Sites” (and I have no bookmarks). I use “Private Browsing Mode” exclusively here. Just switch over to it (after clearing your history and website data) and don’t look back.
My web-browsing behavior prior to making these changes fills me with disdain; it was repetitive and bad. I visited some of the same webpages daily for like five years after they were no longer of relevance to me, or when I knew I should not be visiting them so much. I got stuck in the past and out of the present. You forget your learned habits pretty quickly, though, when no longer prompted by them. Some willpower and time is needed for urges to dissolve, but man: default browser settings trap you.
Advertising works. Don’t think you’re immune. With enough exposure you will end up buying stuff you don’t need. (Thankfully we can avoid much of it!)
These are my go-to ad blockers:
1Blocker costs $4.99 but I think it’s worth it. I tried a few other content blockers for iOS back in the spring of 2016 and 1Blocker was easily the best. Maybe that’s no longer the case; I don’t know. The thing is I don’t even realize it’s activated — I just don’t see ads when I browse with Safari. Everything seems normal otherwise. Other blockers left expanses of white space where the ads would have been. Plus, 1Blocker allows you to enter custom blocking rules, which can be useful.
Support content creators directly if you appreciate what they do. Be proactive with your dollar (rather than passive and persuaded by ads).
I repeat: “Do not read the comments!”
Two reasons for this:
- The most vocal persons on the internet in particular are often among the least informed and/or most biased. I don’t know why the uninformed like to chime in so much; if I had to guess, I’d say an amalgamation of naivety, aroused interest, and Dunning–Kruger is at play. This is probably a foreign concept to some readers, but those who say often don’t know. They lack the hesitance and reservation of a wiser soul. Once you become knowledgeable about a topic, as in have years of first-hand experience with it, and then chance upon the assertions of the eager-to-speak of the internet, you’ll know what I’m talking about. (“Hello, cave people. There’s a world out here.”) On the other hand: Bias, or maybe better termed “deceit,” is, I think, more often consciously practiced. Think, for example, of product reviews on Amazon. Many are paid or fake. The same sales-pitching happens elsewhere, but in more clandestine and abstract manifestation (self- or other-promotion, typically), to sway opinion or raise awareness. Both types of ignorant, lowly comments can infiltrate your thought processes.
- Despite knowing how moronic and devious most comments are, I will often skip straight to comments sections, before, or rather than, interpreting the object of remark — article, photo, video, whatever: content — myself, and use the comments as the basis for, or entirety of, my interpretation of said content. Essentially, the comments become the content, and my opinion derives from others’ opinions. Once this becomes habit, my ability to think for myself degrades. I shed intrinsic critical notions, and that is scary. (Am I wrong to imagine this is commonplace?) We need to be willing to work to think for ourselves, without the gratification of others. Put in the effort to read an entire article, watch a full video, form your own opinion, and walk away. You can do it. Validate yourself.
Again, content blockers can quash eye-wandering temptation.
Notifications are flow-breakers. They are razor blades to the throat of pursuit. I just don’t get how anyone can hope to do or be anything great under the looming threat of interruption. I’ve worked both in monastical sense (almost totally disconnected from society for days at a time) and hooked to the dripping IV of new emails/messages, and it’s no contest: anything substantial and cool I’ve put together has been the pitying reward of a dreadfully solitary process. Even people that come across as super-highly-functioning-regardless-of-whatever seem to require holed-up alone time to pull off their best work. (See: Michael Crichton.)
The predicament with notifications — realtime acontextual computational alerts — is especially silly because notifications are Controllable Uncontrollable Interruptions — we have the ability to prevent them entirely, but when we choose not to, we become subject to their pith and whim. And what for? I mean, really: how critical is it to know, now, right now? Whatever it is, can it not wait? Have we seriously ceded in attentiveness to the goldfish? Maybe most people have given up. Succumbed. I very nearly did. It feels good to be plugged in and know you’re wanted. To see an object quiescent luminesce. To feel a vibration against your thigh. To hear “The Note.” It’s sure a whole lot easier than exerting your brain.
Of course, there are downsides to a devotion to latency: maybe, since going notification-free, you come to be considered a Bad Texter and thus a Bad Friend or Bad Family for discourteously not responding to text messages sooner, and as consequence you receive fewer or no messages from F&F in the future because you’re unreachable and a dissipator. You’re unreliable and unfun. There is a social hit to be taken, for sure. I’ve experienced it. Be okay with that. Make amends in other ways.
Anyhow, so, it’s only a recent revelation of mine that, “Woah, notifications are, like, messing with me,” so I remain in the process of evaluating how to best leverage them.
For example: I had enchantingly connected my iPhone’s iMessage account to my MacBook Pro so I could send and receive messages without needing my phone on me. (Brilliant.) I touch-type from a tactile keyboard way faster than I will ever be able to from glass or plastic display; I’d rather send messages through mechanical keydown. But what happened is that I started receiving iMessages throughout the day, which I hadn’t before, which interrupted my work, even if just slightly, and after a year I’d gotten to this point where I would hope to receive a message to distract me. I’d entered the bizarro realm of meta distraction: distraction by anticipation of distraction. (Not a fruitful place.) I’ve since disabled iMessage notifications on my laptop entirely. I stick to my phone for messaging, now. (And yeah: I can be annoyingly slow to reply to you. And less fun to talk to because of that. Mea culpa.)
But it’s not always as obvious that a notification is interruptive. Another example: I use Apple Pay (yes, I know, I am a rat ringing a bell for his cheese, but seriously: fuck these new mandatory chip cards that have rolled out in the U.S.; let me swipe with zeal, like an American, goddamnit; the chips are slow and awkward and gauche and I’d almost rather not buy anything than query another disinterested cashier “Is it in?” one more time) and A.P. sends a notification receipt to my phone shortly after any payment is made. The thing: I already know I made a purchase. I was, like, there. Duh. Why do I need to be reminded of it? Sure — maybe I wasn’t paying attention and I got rung up for something extra and the total is incorrect and now I realize this and can take action to rectify the situation. (This has never happened.) Or maybe my credit card’s been stolen and thanks to my phone pinging me I’ll be able to sooner thwart the evildoer. (Again, this has not happened. Credit card companies notify me of suspicious activity.) I just can’t seem to come up with any convincing argument for enabling the alerts even though they only take “a second” to clear. They’re still distractions. They waste attention. I think this scenario is representative of many repeat events that permeate our lives: we let them happen without much thought or perception of control. In this situation: I believe it’s more prudent to carefully go through financial statements once per month to verify for accuracy.
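That monthly verification pass amounts to simple arithmetic; here is a sketch of it (names hypothetical, and amounts kept in integer cents to avoid floating-point drift):

```python
def statement_discrepancy(recorded_purchases_cents, statement_total_cents):
    """Difference, in cents, between the statement total and what you recorded.

    Zero means the statement checks out; anything else is worth a closer look
    (an erroneous ring-up, or worse).
    """
    return statement_total_cents - sum(recorded_purchases_cents)

# Usage: two purchases, $4.99 and $12.50, against a $17.49 statement
# → discrepancy of 0, i.e., the statement reconciles.
```

One careful pass per month catches the same errors the per-purchase pings claim to, without taxing your attention dozens of times a week.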
So what I am doing now, whenever I receive a notification — banner, badge, alert —, is asking myself these two questions:
- Is this notification relevant? (Do I need to see it?)
- If it is relevant, do I need to see it immediately? (Can I check it later, at my leisure?)
If the answer to #1 is no, I disable notifications for the app entirely. If the answer to #2 is no, that may mean A) disabling notifications that would show on the lock screen of my phone, so I don’t pick up my phone to check the time and get reeled in, or B) putting my phone into Do Not Disturb or Airplane Mode more often (which I forget to do, so I keep it off most of the time instead).
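The two-question filter above reduces to a tiny decision table. A sketch (the return strings are just shorthand for the actions; the function name is mine):

```python
def notification_action(relevant, urgent):
    """Triage a notification with the two questions from the text."""
    # Question 1: do I need to see this notification at all?
    if not relevant:
        return "disable the app's notifications entirely"
    # Question 2: do I need to see it immediately?
    if not urgent:
        return "keep it off the lock screen; check later, at leisure"
    return "let it through"
```

Run every app on your phone through this once and the set of things allowed to interrupt you shrinks dramatically.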
Past notification disabling, it makes a lot of sense to disable your phone entirely when attempting to do anything besides, well, use your phone. My high-school French teacher once or twice told our class (in English, delightfully), “Wherever you are, be present,” and I still think that’s awesome advice I need to be reminded of more often. To be truly present is so difficult with a live phone in your pocket.
Besides the social aspect of this, intellectually, I know I, at least, can’t function well enough to complete any meaningful work with my phone in the picture. For me to be capable of high-level thought, I have to really focus — hope to conjure up flow — and a glance at my phone is like a tightrope walker looking down: cognizable doom. It’s that much of a distraction. I’m reduced to a rat around it.
And, to substantiate this notion, a recent study out of UT-Austin shows that a phone within sight or arm’s reach, even when powered off, reduces available cognitive capacity. Their hypothesis was that the presence of one’s smartphone causes “brain drain” — basically an expenditure of mental energies, which are limited — and that seems to be the case. (The researchers propose those energies go toward ignoring one’s phone; I lean more in the direction of the phone, the idea of it and all it can do, simply being on the mind, occupying brain space, when so present and attainable.)
To persevere!— Put your phone in a drawer, or up on a high shelf, or wherever is out of arm’s reach and eye’s sight. Even zippered away in a backpack is good. And — importantly! — off.
Just, like, try. More often. You probably don’t need it on you as much as you think. I remember discussing with my roommate in college (circa 2009, when flip phones were popular) how much more fun it would have been to have attended college back in the ’90s, when cell phones weren’t yet prevalent or Facebook a thing; we imagined going to parties and running into more people unexpectedly and getting into more wild scenarios because it wasn’t so easy to reliably connect with others. We’d have a landline in our apartment. That’s how people would reach us. We’d pick up the ringing phone, not knowing who was calling for lack of caller ID. That’s kind of exciting! Or what if we dialed a wrong number? (gasp) It just seemed more fun than everyone being so connected and predictable. (OK — I’ll admit: Maybe these are fanciful misconceptions of adult life in the last decade of the twentieth century elicited by the sitcom Seinfeld.) But I digress—
Now? We crave that convenience and control. We need it. To go into a tense or awkward or new situation without a phone is death. To face doubt is frightening. Here’s a thought: The smartphone is a ThunderShirt for humans. For real. They are our baby blankets, binkies, and strollers.
I’ve started going for walks without my phone. And that feels pretty good, both in the sense of expressing freedom and not having something bulging in your pocket (though the first handful of times felt so weird; like I was circumventing NSA surveillance; “Where did he go??”). Any readers should be able to do that. It’s easy. Just walk out your door. Sometimes I will wish I had my phone on me to snap a photo of something whimsical I’ve stumbled upon (giggle-inducing graffiti or mushrooms, typically), but oh well. Be present; enjoy the moment. It is always an option to leave your phone in your car, too, after driving somewhere, weather permitting.
(OK … despite me disparaging the cellular device, I am not really advocating eschewing your phone — or am I? — but it’s useful to reconsider what essential purposes the technology serves. Pare it down.)
This is more a philosophical credo than a technological panacea, but life and technology are for most intertwined, and I’ve found it critical to start each day on my own accord and through my own unadulterated perceptions despite how tempting it is to become one with The World, namely by 1) consuming the news and 2) responding to emails and messages.
Consuming the news — either in the traditional sense of conglomerate dissemination through television, newspaper, radio, internet, etc., or in the modern sense of personal curation through social feeds, bookmarks, and apps — corrupts the mind. Traditional news consists mostly of propaganda, hyperbole, and slop. Modern news carries more individual pertinence but is much the same. Both deviously feign verisimilitude. News causes you to think more about things that A) may or may not be verifiable (see: fake news), B) may or may not affect you (much doesn’t) (besides the weather), and C) may or may not be subject to your influence (like: the outcome of a sporting event) than yourself and your environment (which you do have control over and the ability to perceive faithfully). We touch what we can’t. The boundaries of existence become warped.
It’s particularly foreboding of one’s undoing to begin the day with the news because this then appoints the news to erect the world around us, for us; to tell us what to perceive and how to perceive. (“Look out!”) This is to subsist on a passively assimilated incongruous construct, through which lens your absolute presence is lost. And it doesn’t return easily. It takes time to recalibrate. In sum: To be so constantly disjointed from reality muddles perception and one’s ability to make meaningful decisions throughout a day.
Responding to emails and messages, on the other hand, is inherently reactive and to habitually prioritize doing so shackles ambition; you continually wait before acting; venture not far for you seek to be summoned. It feels like you’re getting the ball rolling on the day by knocking out a quick few replies, but you’re not. You are spinning in place. You are servile. You are a lackey. Snap out of it.
This, the pervasive erosion of unencumbered expression, is why it’s so imperative to get into the routine of addressing oneself before others.
I’ve self-imposed that I need to make it to noon before gandering at my phone and email. (I don’t partake in the news; it perturbs me.) If I can last until the late afternoon, or not look at all, it’s been a productive and inspired day.
Exiting is hard because it is so easy to plug in. Phones are mobile. The internet is everywhere. It’s fast. It’s cheap. It’s the ultimate drug.
Technological detox lacks the excruciating physical withdrawal symptoms correlated with the cessation of other stimulants but the psychological disentanglement is rough. Enter the fugue: There is this unbearable emptiness in the moments when one would normally hook up and zone out. Awareness creeps in like a thousand-legger into a bleached-white bathtub. It erratically scuttles into horrific plain sight. You immediately want your phone. It will quell the anguish. Break reality. End the pain. It’s so close. Even just touching it would ease the tension. Kill the bug. You don’t need to look at it, the phone. This isn’t using it. The glass screen is smooth; the buttons tactile and springy. Ahhhh. (click.) And, hey, oh, look: it unlocked itself. Better already. What unpleasant feelings? What obligations? What tough decisions? What soma?
And aside from losing your instantaneous agency to psychological suppression, it’s lonely to disconnect. You feel like you’re missing out; like you’re ditching your peers for nothingness: for silence and stillness and the privilege to stare absently into space. Each passing moment is heavy and turgid. You feel your pulse. Your shoe is untied. This is freedom, and it is nauseating.
Life as you know it, now: Your hope — The Hope — is to find greater meaning out there, outside the glow, outside the warm and comfortable. The real world is deserted and gridlocked and unnavigable. It’s cold and lurid and vast and unchartable. There may not even be anything here. But if you work at it, eventually, just maybe, you’ll find your reason to be, and the void will be filled with something much better: actuality.
It’s taken me these 9k words, written painstakingly and under high emotion, to understand my personal struggles with these technologies and start implementing change.
I’ve canceled my internet at home. I quit. It’s too tempting. Presented the opportunity, I will latch onto my keyboard and web-surf with Skinnerian inevitability. I can’t help myself. It doesn’t feel to me like I’m committing that much attention to my laptop when I open my laptop, so I just do it. But in reality, I am an enabled obsessive–compulsive lunatic who will mad-dash to his computer to verify or investigate any frivolous/trivial thought that floats to mind and lead himself down a tangential link trail to oblivion. To know that I can do this — access anything at any time — is stifling. Now I am forced to go elsewhere, driving, with clear intention, with my laptop to access Wi-Fi. And I use that online time way more efficiently.
Admittedly, I can still access the internet through my phone in a pinch, but I am also increasingly limiting my phone usage. Writing this essay has spooked me. I had a feeling that phones were bad, in some sense, but I didn’t know how to articulate why or explain to what capacity, and I realize now to a greater extent just how bent the things make me. I’ve recently begun taking note of my physiological responses each time I power on my phone: My heart rate goes up. I see the off-white Apple logo superimposed on the black but illuminated screen. It is awake. This screen persists just long enough to cause doubt as to whether the operating system has frozen. Is it updating. Is the battery dead. Why. Why is this happening. Did I drop this damn thing earlier— it’s on. Thank God. I feel a euphoria as my fauvist Les toits de Collioure background fades in, the military-precise clock in SF typeface sets, and a coming cascade of missed notifications reaches anticipatory apogee as the signal finally peaks and resolves, concluding in erumpent display. (buzzt. buzzt.) The screen is brilliant — 600 nits of luminance is enough to down a moth. Hello, SmarterChild. I lose sense of time and space. My hand cramps. My vision is blurred. What day is it?— To become so ineluctably aroused by an inanimate object (or I suppose pseudo-animate, since it autonomously stimulates three of the five senses) is disconcerting. I want to get away. This relationship is abnormal. Telephones were used prior to the ’00s by most as tools; they performed a relatively singular action — making and receiving voice calls — that could be wielded for a variety of ends, not unlike a hammer or saw. The most direct comparison I can think of is to the car: as a car allows us to move from here to there, a telephone allows us to talk from here to there. But the smartphone does so much more than that.
It is still at essence a communication device, a modality for the exchange of ideas; however, because it can do so much, perform so many unique actions, it can no longer be considered a tool. It is at best a gadget. And as it slides further from the fulcrum of utility, the one in the system going for a ride becomes … whom?
The caveman’s chisel: I have to micromanage my phone’s settings to defuse it to the point where I can sort of leverage it without it leveraging me. That’s what smartphones do: they’re designed to pry you open and envelop your raison d’être and feed parasitically. You pay $X for the device and $XY more is taken out of you through its transactional ecosystem and inhibitory nature. Make no mistake: it gets more out of you than you it. They are exceptionally manipulative devices. They have to be; altruism does not drive industry. Your phone wants to be so highly integrated into your life that it becomes indispensable; unable to be lived without. It wants to control you and — End Game — live your life for you. It is the social construct. It is interaction. It is adequacy. And we are fine with all this, because the phone said so. We no longer live in a heliocentric world. The light shone from a 4.7-inch LCD is much greater.
I just can’t function as a considerate human being with that type of influence around. I am not above the latest technological advances that prey on what makes us fallible.
The phone stays off and home more and more. If I am turning my phone on, there needs to be a specific reason for doing so, like making a call, usually in the evening. And then it goes off again. I can’t just “check it” anymore. There is no such thing.
My most immediate replacement for plugging in has been reading books. I just sensed that I needed to start reading physical, actual books if I wanted to break out of my funk. Books take time; it’s important that I always have some perfunctory go-to activity at hand to dampen my desire for technology. They are mentally stimulating; I felt my cognitive abilities diminishing as I spent more and more time on the computer — watching video, especially — so it was important that I work my brain again somehow. They are portable; I can read anywhere (though it’s best to read in quiet, solitarily, and that usually means at one of two places: A) home or B) a library, but coffee shops can be ok if the music they play is tame, like cool jazz or classical, basically anything that’s mellow and lacks vocals, but most shops spin contemporary playlists). Books are free, from the library, meaning there’s no financial barrier to getting started or continuing; libraries are seriously awesome. Plus!— A checked-out book comes with a read-by deadline, which I know helps motivate me to keep on as an enthusiastic reader. To renew is chagrin.
Like riding a bike: I struggled to make it through my first book back, having been out of the read-for-enjoyment club since obsessing over Ralph S. Mouse in the second grade, but it gets easier with repetition. Keep reading. Seek challenging material. Your brain adjusts. And I’ll be frank: reading is an escape as well. I read to avoid confronting problems. It’s a way to pass time into the next day. But reading a challenging novel at least works my brain in a way that I think uniquely sharpens perception and broadens imagination.
I’ve also begun writing down a list of activities I hope to pick up instead of web surfing: birdwatching, painting, dating, and more. I have more time for hobbies than I thought. It just takes not dicking around on the internet to cultivate a repertoire of talents. People are capable of great inner- and outer-discovery when permitted to follow intuition freely.
Where to from here? Well, I recently stepped down from managing a website I ran for eight years. My job, basically. I lost my zest for it. I suppose if I never did, that would be worse. Since then I’ve been reevaluating, you know, priorities — and I would like to persist more congruously with my renewed perspectives on media and technology. This means finding employment away from computers. All I’ve known are computers. Or I could work contrary to my beliefs with an acute awareness of the hypocritical circumstances, because if I didn’t uphold cynicism, I’d surely reconform. But to do that would be in the name of wage-earning, and my observation is that the dogged pursuit of financial gain sends men early to the grave. E pluribus unum. Further but: The biggest middle finger you can give the establishment is to understand and leverage its technologies for good. So who knows.
“Another procedure operates more energetically and more thoroughly. It regards reality as the sole enemy and as the source of all suffering, with which it is impossible to live, so that one must break off all relations with it if one is to be in any way happy. The hermit turns his back on the world and will have no truck with it. But one can do more than that; one can try to re-create the world, to build up in its stead another world in which its most unbearable features are eliminated and replaced by others that are in conformity with one’s own wishes. But whoever, in desperate defiance, sets out upon this path to happiness will as a rule attain nothing. Reality is too strong for him. He becomes a madman, who for the most part finds no one to help him in carrying through his delusion. It is asserted, however, that each one of us behaves in some one respect like a paranoic, corrects some aspect of the world which is unbearable to him by the construction of a wish and introduces this delusion into reality. A special importance attaches to the case in which this attempt to procure a certainty of happiness and a protection against suffering through a delusional remolding of reality is made by a considerable number of people in common. The religions of mankind must be classed among the mass-delusions of this kind. No one, needless to say, who shares a delusion ever recognizes it as such.”
—Sigmund Freud, Civilization and Its Discontents, 1930