Rotten.com was a website that collected pictures of dead people. Nearly as old as the web itself, it survived lawsuits, DDoS attacks, and being featured on The Howard Stern Show. In 2017, it finally went offline. The website that harvested death became death. If it were a human body, it would have decayed to bones by now.
Imagine a website, Botten.com, that collects pictures of dead websites. It wouldn’t be exciting: just screenshots of 404 and ERR_NAME_NOT_RESOLVED messages. Dead people rot; dead websites cease to exist. At least it’s not the other way around. I’m glad my body will hang around as a foul-smelling mass instead of vanishing peacefully into the ether. No matter how unloved and ignored I was in life, someday it’ll end, and then someone – if only the county medical examiner – will pay attention to me.
Rotten.com had a FAQ page, and one of the questions was “Are they real?” They, meaning the pictures. As if car-crash victims were freakish Bigfoot sightings instead of something that could happen to any of us this afternoon.
The webmaster’s reply was characteristically blunt. “Pictures of this nature aren’t particularly rare; they are merely hidden from the public in most cases.”
Hidden by whom? Traditionally, the media and the government stopped you from seeing upsetting photos. It used to be easy (and common) for a state actor to control and prohibit the release of a photograph, and there are photos that we know exist and which we’ll never see. Diana, Princess of Wales, shattered like a bisque doll against the asphalt of a Parisian tunnel. Rudolf Hess, post mortem after what was either suicide or extrajudicial execution by MI6 agents at Spandau Prison. Photos can die, but they can also be imprisoned and serve life sentences.
In 2020, social media is the primary way people view images, and the sheer volume of digital data overwhelms traditional state censorship. Some 95 million photos are shared on Instagram every day: far more than anyone could ever look at. When you scroll a feed, you’re rolling the dice that the next picture won’t be of an amputated penis.
Hiding atrocities now falls to contractors for Facebook and Twitter, typically located in the Philippines or India. These business process outsourcing (BPO) companies provide human content moderation at scale. They’re the thin brown line separating Facebook from 4chan. Scrolling social media all day might not seem like an especially demanding job, but the psychological toll is well documented.
In his first few weeks on the job, Rahul felt shocked by the graphic videos he encountered of car crashes and child abuse. Eventually, he grew desensitized.
“It gets to a point where you can eat your lunch while watching a video of someone dying. … But at the end of the day, you still have to be human.” Rahul said he didn’t see a therapist — it wouldn’t have been useful to him, he said.
…it was a graphic video of a child being abused that stuck with him. After seeing the video, he began to notice a change in his own behavior that worried him. “I am not a bad person,” he told Rest of World. “But I’d find myself doing little diabolical things, saying things I would regret. Thinking things I didn’t want to.”
This reminds me of a six-year-old Wired article outlining the same problem.
Eight years after the fact, Jake Swearingen can still recall the video that made him quit. He was 24 years old and between jobs in the Bay Area when he got a gig as a moderator for a then-new startup called VideoEgg. Three days in, a video of an apparent beheading came across his queue.
“Oh fuck! I’ve got a beheading!” he blurted out. A slightly older colleague in a black hoodie casually turned around in his chair. “Oh,” he said, “which one?” At that moment Swearingen decided he did not want to become a connoisseur of beheading videos. “I didn’t want to look back and say I became so blasé to watching people have these really horrible things happen to them that I’m ironic or jokey about it,” says Swearingen, now the social media editor at Atlantic Media. (Swearingen was also an intern at WIRED in 2007.)
Some content moderators end up traumatized by their experiences, and some are now suing the companies they used to work for. Others, like Swearingen, have the opposite problem: they aren’t traumatized at all. Looking at horrible things becomes far too comfortable for them.
Is there a solution?
Some people enjoy seeing this content, or are at least stimulated by it in some way. Robert Ripley’s Believe It or Not! newspaper column documented the bizarre and unfortunate, and became an American institution. Rotten.com got millions of hits a month in 1997. In recent years, subreddits like /r/watchpeopledie have replaced it. Whether this appetite is normal is up for debate, but it’s conceivably useful.
Someone on Hacker News had the idea of outsourcing content moderation to /r/watchpeopledie.
It’s kind of brilliant. Supply (disturbing pictures, and people who like looking at them) and demand (content moderation, and boredom alleviation) line up neatly. People would do this job for free, or for MTurk-level wages.
I can think of only two problems with this idea:
1) Everyone has different triggers. Perhaps I enjoy beheading videos but am upset by animal abuse. A /r/watchpeopledie user can selectively avoid links containing disturbing content, whereas a content moderator has to view everything.
2) Doing something recreationally doesn’t mean you’ll succeed at it as a job. Game development studio Ion Storm hired level designers who had created mods for Doom and Quake, on the theory that the skill would translate to a work environment. Often, it didn’t. Doing something for fun is a radically different experience, because you have agency and can choose the shape of your task. At work, the task’s shape is imposed on you by management. It’s not the activity that’s fun; it’s the freedom.