CONCENTRATION

edited by Ingo Niermann

 

A collection of texts on the present and future of personal, social, technological, and literary concentration by Dirk Baecker, Nina Bußmann, Charis Conn, Kenneth Goldsmith, Boris Groys, Ingeborg Harms, Arthur M. Jacobs / Raoul Schrott, Sophie Jung, Quinn Latimer, Ingo Niermann, Amy Patton, Emily Segal, Jenna Sutela / Elvia Wilk, Alexander Tarakhovsky, Johannes Thumfart, Ronnie Vuine, and Jacob Wren

Contents

Ingo Niermann
Introduction

 

Emily Segal
Concentration

 

Nina Bußmann
Sensitivity and Calm

 

Quinn Latimer
Congress, Her

 

Jacob Wren
All Profound Distraction

 

Dirk Baecker
Diminished Concentration

 

Charis Conn
The Executive Chair

 

Johannes Thumfart
From the Rhizome to the Heavy Tail

 

Boris Groys
Fiction Defictionalized

 

Kenneth Goldsmith
Yes #tweeting is real #writing.

 

Ingeborg Harms
Blackberries

 

Alexander Tarakhovsky
Signs of Concentration

 

Jenna Sutela / Elvia Wilk
When You Moved

 

Arthur M. Jacobs / Raoul Schrott
Captivated by the Cinema of the Mind

 

Ronnie Vuine
Future Reading Systems

 

Ingo Niermann
Literature’s Victory

 

Sophie Jung
X-Examination <3

 

Amy Patton
Extracts from an Outline for a Memoir Written Abroad

 

 

INTRODUCTION

Human labor is increasingly being replaced by machines. When it comes to tasks of calculation, observation, and recollection that demand a high level of accuracy, computers are also vastly superior to humans. At the same time, the demands made on humans to offer up their concentration—in order to believe, choose, buy, love, or shape whatever it is they are experiencing—are greater than ever. In a society premised on humanism, every social act is validated by people paying attention to that act. Or, in the vocabulary of systems theory, symbiotic mechanisms of social communication must be experienced to be valid. Tactics that induce unconscious decisions—in advertising or love, for example—are considered unethical.

Practice and medication can do only so much to enhance concentration, and so it has become a fiercely contested commodity, hotly pursued by an endless stream of new strategies and media. It isn’t enough to have a useful or stimulating idea and to develop a practical, affordable, and pleasing product on that basis. Social significance comes only to the things people manage to concentrate on, if only for an instant.

One activity that requires a particularly high level of concentration is the reading of non-trivial texts. In light of this, Fiktion—a pilot project exploring the digital distribution of challenging literature—asks first and foremost how this kind of reading can be possible at all in the future. Fiktion has developed its own e-reader for this purpose, and in this anthology, nineteen German- and English-speaking writers and scholars tackle the topic of concentration in previously unpublished essays, stories, and poems.

Ingeborg Harms, Quinn Latimer, Arthur Jacobs, and Raoul Schrott write about the circumstances under which a text or activity can completely draw us in. Dirk Baecker and Amy Patton speculate about the shifting nature of concentration, while Jenna Sutela and Elvia Wilk write about concentration that carries us into outer space, and Charis Conn about concentration violently induced. Nina Bußmann writes about not being sure whether she is currently concentrating or pursuing distractions, while Sophie Jung, Emily Segal, and Alexander Tarakhovsky take attention deficit disorder as their subject. Kenneth Goldsmith argues that avant-garde art and literature result precisely from reduced attention. Jacob Wren wonders if concentrating on one thing at the cost of all others is not the exact reason our world is in such a terrible state. Johannes Thumfart looks at the relationship between concentration of the mind and concentration of the digital market, Boris Groys observes a defictionalization of literature through the Internet, while Ronnie Vuine and I cast speculative—i.e., decidedly fictional—literature as the enduring and/or recurring main driver of social change.

 

Ingo Niermann

Emily Segal
CONCENTRATION

When I try to think about concentration, I think about not concentrating. It’s like the game whose only goal is not to think about the game. But you can think about concentration after the fact. It was amazing, I was so focused. The book just wrote itself. The sun was shining. The concentration came naturally. In such formulations we have the past tense; a sense of wonderment or retroactive self-surprise; the specification of the object of concentration (the task); the insinuation that concentration has since eluded you or become unnatural.

If concentration comes up at all, it’s always already a problem. Nobody’s sitting there self-describing as someone who can “always concentrate.” It’s not something to brag about—the lack of an object voids the statement. Similarly, one can’t really describe ambient concentration, like a warm fog of concentration filling a room. A hush of concentration fell over the library? Sure, but an element of compulsion is implied, an offstage challenge. People don’t, like, randomly concentrate for no reason in a group. It’s not a “vibe” in that sense. It’s more likely a group of people would break into song than unexpectedly begin to focus.

Of course, you can definitely brag about “not having problems” concentrating.

In which case you are probably already high.

 

WHAT’S THE ENERGY OF YOUR ENERGY DRINK?

 

Amphetamine is very masculine, historically. Fighter pilots, killers, science fiction writers, Hitler. But it all depends on how your speed is branded. Men take the chemical amphetamine. Artists shoot speed. Fags snort meth; scum and hicks smoke it. High schoolers crush and blow Ritalin. Desperate ones, like me, stack their Sudafed. College kids buy Adderall. Surgeons, pilots, and venture capitalists are on Provigil. Brainhackers and life extensionists take nootropics. Ladies munch diet pills. I used to take Vyvanse, a drug whose name is only a couple vowels away from the present active participle of the Latin verb to live, to be alive.

 

SPEED

 

For me, concentration is synonymous with prescription speed. Until Adderall, there was no such thing as concentration proper—the word indicated only its own loss or elusiveness. With Adderall came the genuine article, the proper noun. Take this pill, and you can concentrate. To get a prescription, you need to adequately describe yourself as someone with problems concentrating, at which point you may begin to believe it.

Right now, America’s in the middle of an unresolved cultural freakout about the effects of prescription speed and other smart drugs, particularly on young people. There’s a sense that our comeuppance is on the way. Timothy Ferriss, the cartoonish hyperefficiency expert and author of The 4-Hour Workweek: Escape 9-5, Live Anywhere, and Join the New Rich, is an experienced user of stimulants and smart drugs. But don’t think he feels good about it. On a recent podcast, Ferriss describes the “equal and opposite effect theory,” which is basically a zero-sum concept of drug use: you’ll pay for it later, or you’re paying for it now in ways you don’t know.

This aligns with the common narratives of doping and hitting rock bottom. Lance Armstrong paid for his exceptionalism with disgrace. Daniel Keyes’s Flowers for Algernon is the scariest no-free-lunch drug story: the protagonist’s medical enhancement makes him supernaturally smart, which first alienates him from everyone he knows, and then, when he suddenly starts degenerating, gives him a scarily hi-res view of his own decline, right back down to where he started.

Even more basic is the version in the movie Empire Records, when Liv Tyler’s beautiful, high-strung character snaps. Her “diet pills” scatter all over the ground, revealing the secret to her success. She stops doing drugs and gets a boyfriend. In both cases the truth comes out, and resolution follows. Doing drugs can make you too special.

As Cat Marnell, the beauty journalist and author of the short-lived Vice column “Amphetamine Logic,” explains: “I got more attention than other people. It’s like the same term they use to describe narcissistic people, which is ‘conspicuous existence’ and it’s the same thing on speed. You have a conspicuous existence. I have never not been on speed since. If anything, that’s what you are addicted to: you become a little more special than other people. I’ve always been an enhanced version of a human being. Of myself. I’m addicted to that. When I went off of it, you know what happened? I became normal. I looked normal. My ideas were normal.”

 

NON-ZERO-SUM

 

Of course, there’s also the possibility that brain doping is not a zero-sum game—that being smarter actually makes you smarter.

I love the moments in superhero stories when the adolescent’s powers jump out unexpectedly for the first time. Harry Potter hears a snake talk at the zoo; in the first X-Men film, Magneto bends a metal gate. In William Gibson’s Pattern Recognition, the protagonist Cayce Pollard’s coolhunting ability is out of control—she gets rich off her instant and accurate reactions to corporate branding, yet she has “no way of knowing how she knows.”

 

FAST GROK

 

Young brains are getting rewired. Digital Native discourse is all about neurons moving faster through more seamless translations. Douglas Rushkoff, discussing his book Present Shock, gives a weird example of a kid reading Hamlet so fast:

 

The other main difference is that kids who no longer appreciate narrative try to get the “gist” of things by looking at them rather than immersing in them. They will “take a look” at Hamlet on Sparknotes (an online summary service) and then attempt to get the gist of it by reading a couple of paragraphs or quotes. In some cases, I am almost embarrassed to say, they do seem to “get it.” I spoke with a boy who seemed to have gotten the gist of Hamlet this way, and was able to explain how, amongst other things, “To be or not to be” encapsulated the essence of Hamlet—a man who wasn’t sure whether or not to take action. So this generation is actually really good at looking at things, getting a sense of them without ever moving through them in a sequential, narrative way. But the challenge then becomes interesting them in actually delving into something. Helping them see the value in immersion, and to realize that not all immersive experiences are to be avoided.

 

At first I thought it seemed so dumb, journalistically, to take this kid’s word for it, or to give a shit that a teenager managed to cram and get enough of the point to get by—teenagers and other disadvantaged peoples have always done that. But there’s a non-dumb part, potentially, if we can answer the question of whether Rushkoff has identified a new form of intelligence, or just articulated new respect for an old kind. Either bullshit—Ferris Bueller bullshit, not Tim Ferriss bullshit—is becoming fashionable again, or we have a new wave of mutants on our hands.

 

X-MEN

 

This is where the X-Men come in. In X-Men: The Last Stand, a major pharmaceutical company has developed a way to suppress the mutant X-gene permanently, turning X-Men into regular people. The character Storm, played by Halle Berry, lays down the law. “There’s nothing to cure. Nothing’s wrong with you. Or any of us!” she says.

But what if Halle Berry failed to stop the non-mutant pill, they made it and enforced it on everyone, and instead of eradicating mutants, it just made everyone a million times smarter and more powerful?

That’s my fantasy for the effect of prescription speed: that a drug intended to make young people “more normal” has inadvertently created a new self-reinforcing form of intelligence, a kind of Adderall singularity.

And maybe this is happening because they concentrate less.

 

ATTENTIONAL DECONTROL

 

Attentional decontrol is about dismantling the figures in the field of perception and transforming it into a uniform background. As a child I used to do this in my bedroom, flattening the wall into the ceiling until I fell asleep. It turns out this may have been an accidental Kegel for my brain.

AD is induced by distributing your attention to the periphery, which requires effort because your brain thinks you’re supposed to just look at what’s in front of you. One method of AD makes you hang everything on an imaginary screen. It aspires toward a uniform distribution of attention on the whole perception field—the opposite of attention concentration, which can be defined as certain objects being distinguished from the environment, singled out as special.

Deconcentration of attention as a technique was developed in Russia as a Cold-War strategy for reacting to “complex, uncertain, and extreme” conditions. The US acronym for that is VUCA (volatility, uncertainty, complexity, and ambiguity).

Post-normal times call for post-normal measures. Deconcentration of attention is a psychological extreme sport, or a technique for being, when being aware and staying alive is itself an extreme sport—like freediving, hunting, or being a mutant. Instead of concentrating, you become concentration. You cease to conspicuously exist.

In his essay Nature, Emerson writes about retiring to uninhabited places like the woods where “we return to reason and faith. There I feel that nothing can befall me in life,—no disgrace, no calamity, (leaving me my eyes,) which nature cannot repair. Standing on the bare ground,—my head bathed by the blithe air, and uplifted into infinite spaces,—all mean egotism vanishes. I become a transparent eye-ball; I am nothing; I see all; the currents of the Universal Being circulate through me; I am part or particle of God.”

In AD, you no longer have inner dialogue. You stop hearing voices. Like being in flow, AD “is like meditation without detachment from reality.” You’re alive, but just barely. You’re alive, but only technically…

AD rarely arises involuntarily—it’s not that “natural”—and it’s also the opposite of specialization, the opposite of “focusing on your task” and keeping your eyes on your own paper. Yet systemic stress actually induces AD. It breaks down awareness, perforates your field, and makes you vulnerable.

AD achieves the same objectives as flow but through totally different processes. Flow state is the achievement of full focus, enjoyment and involvement—and like the existence of any truly unburdened lovers of life, it’s hard to believe.

 

FLOW

 

Flow comes from artists getting lost in their work. Lovers getting lost in their love. Me getting lost in the supermarket: that’s where AD comes from. Getting lost is like getting confused. I depressed the shit out of Michael the other night, because I was laughing, saying I saw this old lady on the U-Bahn, and isn’t it so funny that sometimes cops pick up old women on the street and report that they’ve found them, and they’re “confused”? Michael was like, are you talking about Alzheimer’s? Because this is depressing. Why is this funny? And I just meant that, it seems funny that sometimes confusion becomes an emergency, when everyone is always so confused.

Flow is, in a way, the loss of the ability to concentrate. You are no longer granted the choice over what you want to see, it just goes in. Your brain becomes slutty and the world becomes neon. In flow theory, this occurs because all your attention is occupied—you have no more attention free to allocate. I kind of see what they mean. In the club the other night, I felt my smartest, because I was finally having enough fun. My brain was finally satisfied, and free to think. Basically, a desire for flow state is the desire to be a baby.

These are the conditions for flow:

 

·Clear set of goals and progress

·Clear and immediate feedback from your environment (obviously impossible)

·And “good self-esteem,” i.e., the sense that you’re up to the task, that you perceive your skills as capable of meeting the perceived challenges.

 

Apathy, boredom, and anxiety are considered enemies of flow: flowblockers. Of course they’re also the mothers of invention. Flow theory thinks anxiety comes from the perception of impossibility, the sense that there’s no possible way to achieve the task at hand. I disagree. I think anxiety comes from the sense that it’s impossible to find out what success is. Flow favors autotelic personalities—selfish people—personalities that are ends in themselves.

Both flow and AD are about being able to process a distinctly larger volume of information. Sometimes the amount of information we can process changes radically without us even knowing. Michael, in Mauerpark, heard a guy rifling through the garbage three hundred meters away. That’s how he found out he was tripping.

Nina Bußmann
SENSITIVITY AND CALM

Collecting one’s thoughts… There’s someone I know who practices Chinese vocabulary for two hours every morning to calm his mind and be able to write. Others claim that they do their best writing at night, with a looming deadline, or when they are high. Still others set an alarm clock and don’t leave their desk until it’s time. Some people log their hours of work in Excel spreadsheets, and others set themselves targets: from one to four pages per working day. There are people who sit there for ten, twelve hours or more at a time, who say they don’t even notice how long they’ve been sitting, don’t know if they’ve been sitting at all, etc. There’s no doubt at all we could talk about a lot more important things than whether we’ve been lazy or hardworking, or how we manage our time. We could talk about beauty, for example, or about what’s relevant and radical, or: what was this supposed to be about again?

I can never concentrate long enough. And I can never reliably judge whether the state I’m getting into is one of constructive confusion or exciting anxiety—or whether it’s merely a state of absence, be it listlessness, laziness, or genuine exhaustion. I can never really tell whether I’m actually working or doing something just as important, whether I’m gathering indispensable information—or whether, as is the case right now, I’m avoiding my real work on a piece of long fiction. I can never tell why I’m still trying to divide everything into real and distracting, and why I think it’s so urgently important to write a novel. That’s something I must have picked up somewhere.

Maybe. Or it could simply be that I can’t yet imagine doing anything else. For the time being, though, I know that beautiful, relevant, radical thoughts—such things, to return to my point, can only emerge and be recognized, if at all, under certain conditions, and one necessary condition for this is concentration, a state of heightened attentiveness. The decision to direct one’s gaze onto one particular thing, and therefore not onto anything else. But what does it mean to speak of a decision? Under pressure, our field of attention can contract, whether out of fear of failing an exam or simply because we’re at the theater, where—at least in the Western world since the late nineteenth century—we sit quietly and stare raptly at the stage. All other things can lose their charm in the face of love or being in love, hunger or fascination. This is where those states of sensitivity and calm that interest me are to be found—on the far side of subjection to a regime of attentiveness. What does it mean to really look at something?

Where are we supposed to be directing our attention? What is considered important, what isn’t, and who has decided? These questions are discussed far less frequently than the pathologies of absent or reduced or dysfunctional attentiveness, of fidgeting schoolchildren and exhausted adults and a general unease in the face of always-on electronic devices, an incessant flood of information. It is widely assumed that a connection exists between chronic fatigue illnesses, difficulties in concentration, and technologization. Imaging processes can show what the brain of a person with normal levels of awareness looks like, and how this brain changes as a result of using digital media. Cultural critics worry about the decline in the ability to concentrate as we used to know it, regarding it as a self-inflicted incapacitation, an increase in childish impatience.

However, the claim that technology and media have robbed us of the Enlightenment is nonsense even in its terminology. What such critics really mean is probably a diminishing power of judgment, and this isn’t something people’s gadgets take away from them either. If people seem to give up this power willingly, they do so under specific conditions. There are not only the commands not to be lazy and to pay attention learnt at school, but also standards of flexibility and availability and of staying up to date, the fear of being out of touch and the desire to be well-dressed and not to miss out on anything exciting. It isn’t email and electronic devices that trouble our thinking and immersion in a particular task, but the compulsion to react. Anyone who cannot or will not do this, who takes concentration too seriously, is as recalcitrant as an undisciplined child. Anyone wanting to stay connected with what’s going on cannot afford to take concentration too seriously.

The student magazine of a German weekly newspaper recommends that its readers stop pulling all-nighters, and “sometimes turn off Facebook.” There is software that makes it easier to turn off social media or harder to turn it on. What seems more revealing to me is the question discussed but never conclusively answered in the comments section: “how much work do other people actually do?” If no one worried about this question, if how much the competition was doing simply wasn’t that important, then it would be easier to concentrate on the tasks you’ve set yourself.

Studies show that children lose interest in an activity as soon as they are praised for it or paid for it. They learn that anything that is compensated with a reward cannot really be fascinating and exciting in itself. Though they may continue to carry out their tasks, their interest shifts from doing sums or drawing pictures to being praised for doing these things. Something similar happens when people have to study for exams, or work on something to earn money. The more precarious your working conditions, the more easily your attention can wander from the task at hand to the question of where your next paycheck will be coming from. By this point it should be clear that not every distraction is necessarily a good thing. It is, of course, more fun looking at eye-catching pictures or responding to friend requests than devoting yourself to tasks that are boring or demanding or just seemingly pointless. But then again, almost anything is more fun than devoting yourself to things that are boring and pointless.

Any progressive business culture aims at minimizing grievances about work. A productive workplace doesn’t need employees performing a function; it needs people utterly dedicated to the projects they are working on—not merely working hard, but genuinely enthusiastic, in the way artists are, and for whom, as for artists, the problem of lack of concentration shouldn’t arise. When you’re utterly enthused by something, you’ll never get tired. When individuals are supposed to get and keep their jobs by bringing to them not (only) time and energy, but also passionately focused attention and passion itself, then the resources businesses can draw on are going to be pretty extensive, not to say totalitarian.

There are various ways of attempting to wrest at least one’s own emotional economy from the reach of this culture. Withholding enthusiasm, for example, or developing an art of procrastination. Not exploiting yourself in performing the role of a Romantic creative genius, and feeling instead a little more flexible and a little more independent of control mechanisms and conventions of behavior. Such an approach would doubtless help you remain competitive without being driven crazy by the question of whether you’re working hard or being lazy at any particular moment. But although it may be useful, it isn’t enough. Keeping this kind of inner distance basically just means looking after the state of your soul. But we would need resistance and refusal, we would need something to happen, for conditions to change. Not only for overworked, privileged, precarious freelancers in rich countries, but for everyone.

It has to be possible to think beautiful and radical and important thoughts, and there has to be time and space to do this. How is that supposed to work? For now I have no answers, but just a few strategies for specific cases. Perhaps individuals should break ranks and shut themselves away with their own weirdnesses and commercially unviable projects. Persevering with this could perhaps be an act of resistance, or at least perhaps a way of being a thorn in the system. What then, would be a genuinely effective practice of resistance? Are we actually dissatisfied enough for something like that? Or are we too nervous ever to think all this through to the end?

Quinn Latimer
CONGRESS, HER

The Congress would not listen to the people.

People surrounded the Congress and set it on fire.

 

The Congress would not / People surrounded the Congress.

To the people / and set it on fire.

 

Something I read on my little screen—its rhythm

Moved me. I was in Oslo, I couldn’t do anything.

 

I wanted a Congress but not my country. But not my—

I wanted a Congress of feeling. Or political recovery.

 

My anxiety a small city, clouded by cloud

Or smoke. My North a moneyed Riviera North

 

Riding its blond banality

Below a truer, higher North.

 

Sharing a border with Russia / sharing a border with Russia.

A theater of war bonded them / also herds of reindeer, shepherds.

 

Soon a kunsthalle would be built there. Still men with guns

Accompanied you among (or fear) the icy streets. Pale bears, you know.

 

Pale bears / pale bears / pale bears / pale bears.

Slim guns / slim guns / slim guns / their trucks.

 

I was not high. I hadn’t been in a long time.

Anxiety fog enough. My suburban Northern-European. Each image

 

Come through my screen a painting of a digital image.

O god. I could not make myself care about the art arguments

 

On the social. I could not care. I had too many cares, as Frau Doktor

Elegance might say in some European early or mid-something.

 

Send me to a sanatorium, mountain-cool, or to here.

Whatever. I was called a Valley girl in print by a Scottish critic.

 

It sounds better, that line, than the feeling it produced.

Revulsion, etc. Also: error. I was from the Westside, the coast.

 

But West no more I was of the North. I was in a kind of North

Of the mind and of the map and one of the nerves. They were shot

 

As some modernist might throw off, write down, publish.

What a cover. A North of the mind—I could feel my scalp.

 

Nothing else. Just its heat and tremor and oil. Pale bears

Pacing its cool white avenues, slippery with armory.

 

I hope they protect me / I hope they protect me.

Jacob Wren
ALL PROFOUND DISTRACTION

 

All profound distraction opens certain doors. You have to allow yourself to be distracted when you are unable to concentrate.

Julio Cortázar

 

If something is boring after two minutes, try it for four. If still boring, then eight. Then sixteen. Then thirty-two. Eventually one discovers that it is not boring at all.

John Cage

1.

 

I don’t think I know what I’m doing. I don’t know what I think I’m doing. I stare at these two sentences. I know they each have a distinctly different meaning but for a long moment can’t intuit which means which, or which one I mean. Either way, I don’t know what I’m doing and haven’t for a very long time. This “not-knowing” is something I tell myself I believe in, and might be reformulated as a fairly specific kind of concentration. I even find myself searching for a “more real” not-knowing, while at the same time experiencing anxiety that I’ve accidentally fallen into a false one: that I do actually know what I’m doing, only pretending I don’t in service of some half-articulated ideal of how an artist should or shouldn’t proceed.

Directly next to any not-knowing I perform or attempt to conjure while creating work, there is another, perhaps more honest, not-knowing that keeps me awake at night, and that, more often than not, makes me almost unbearably sad. This awake-at-night not-knowing has something to do with all the injustice and suffering in the world. Why don’t we simply just know how to reduce it, fight it, undermine it? This must be pure naiveté on my part, but I cannot believe it would be so impossible or so difficult. Yet apparently it is all that and more. I can think about these problems endlessly, read about them endlessly, turn them over and over in my mind, and get virtually nowhere, back around in circles to things I already know and seem so obvious that there was little need to give them any thought in the first place.

So what I now find myself wondering is: what is the connection between these two aspects of my not-knowing? Between not-knowing as a longing for artistic breakthrough, as desire to leave behind both acknowledged and unacknowledged habits, and not-knowing as not knowing how to save or even slightly improve the world?

2.

 

When I write I often listen to hip-hop. On a line-by-line basis, I have to admit that my comprehension of what any given track is getting at is, to say the least, somewhat limited. Some things are of course clear; others I’ve listened to hundreds of times and they remain, for me, in the realm of multiple possible meanings. As a writer (at least I think it’s because I’m a writer), when I listen to music I focus on the lyrics. So listening to hip-hop while writing is often a distraction that almost completely prevents me from actually writing, focusing on lyrics I’m perpetually unable to fully decipher instead of on the blank screen in front of me I’m supposed to be filling with words. My solution is to turn down the volume until the track is barely a murmur. This hip-hop murmur pulses in the background as I type and somehow gives me a feeling that somewhere in the world there is an energy greater than the dull silence in the room that surrounds me.

My computer is full of hip-hop that I mainly listen to on shuffle. Often when a track comes on that seems too sexist or homophobic I simply delete it. I don’t know if this is the right thing to do, but I’m nervous about sexism and homophobia seeping into my subconscious through tracks I listen to sometimes hundreds of times. It might be stating the obvious to say that this hip-hop also exists as an artistic otherness, completely removed from anything I immediately identify as part of my daily life or experience. Many tracks speak of socioeconomic experiences I haven’t had: life-threatening poverty or almost comically conspicuous wealth (or both at the same time). I also listen to a lot of hip-hop that has nothing to do with either of these things yet the modality of the language itself is mostly enough to separate me from relating it to my own experiences too directly. (It just occurred to me that I delete tracks that are too sexist or homophobic, but don’t delete tracks that are too capitalist, which might be equally important.)

When I ask myself why I like hip-hop so much there is one aspect to the pleasure that is fairly straightforward. I am a writer with a certain faculty for language. In many ways my writing is performative; it asks to be spoken aloud. However, even mediocre hip-hop displays a virtuosity of spoken language that I could never approach or aspire to. It is simply something I can’t do. The pleasure I get from it might be analogous to the pleasure I assume others get from watching sports, seeing someone do something that you could never possibly do that well yourself.

3.

 

I remember something I said in a recent interview and go looking for it in my computer. When I find it I’m disappointed; it doesn’t quite say what I had hoped. What it does say is: “I’m searching for breakthroughs, if one is still allowed to think in such romantic terms. At the given juncture of any breakthrough one momentarily feels there is no precedent. It is only later that one might see how everything fits (or doesn’t fit) into various histories and narratives.” This feeling, this momentary feeling that there’s no precedent, must in another sense be a kind of concentration, almost tunnel vision. A radical openness combined with an equally intense focus on a few key aspects of a current endeavour. Do artists still want to have breakthroughs? Do people? Is it something we can still imagine having at every stage of our life, right until the end, or is it only for the young?

4.

 

Thinking about the many ways my love of hip-hop is problematic, I begin to think about Descartes as one of the foundations for white Western thought. How he decided to sit in front of that fireplace and simply concentrate on the core philosophical problem, get rid of all distractions, all assumptions, and begin again. Descartes wanted to know, to get to the truth of the matter, while when I concentrate on a given artistic question I claim to want to not-know. But either way, isn’t there something a bit anaemic about this idea of what it means to concentrate—to block out distractions and focus—when another word for distractions might be life: other people, the sensual world that surrounds us?

My thinking takes place in dialogue with so many things, texts, and people, and yet I most often feel I’m working in almost complete isolation. I regularly complain about this isolation but now also wonder if it is a sort of Cartesian ideal that I claim not to want but perhaps actually do. What does it mean to actually want something you claim not to want? I know relatively little about Descartes but he is an unquestioned stand-in for something in the daily habits of my thought. He is a stand-in for a mode of scientific thinking that focuses on certain questions at the expense of everything else. To give a cartoon example: that focuses on how to get the oil out of the earth as efficiently as possible at the expense of all the repercussions involved in doing so. This also has something to do with a desire for certainty, often connected to domination of things and/or people. Within a certain theoretical framework, much of this has also become, over time, a cliché.

5.

 

I’ve made 20,516 posts on Tumblr but only two have gone viral. The first was from a Rwandan speaking as part of the Moth podcast “Notes on an Exorcism”:

 

We had a lot of trouble with Western mental health workers who came here immediately after the genocide and we had to ask some of them to leave.

They came and their practice did not involve being outside in the sun where you begin to feel better. There was no music or drumming to get your blood flowing again. There was no sense that everyone had taken the day off so that the entire community could come together to try to lift you up and bring you back to joy. There was no acknowledgement of the depression as something invasive and external that could actually be cast out again.

Instead they would take people one at a time into these dingy little rooms and have them sit around for an hour or so and talk about bad things that had happened to them. We had to ask them to leave.

 

The second was from Walter Benjamin:

 

Mankind’s self-alienation has reached such a degree that it can experience its own destruction as an aesthetic pleasure of the first order.

 

In the space between these two quotations lies almost the entirety of the problem.

6.

 

In 1953, Mohammad Mosaddegh, the democratically elected prime minister of Iran, was overthrown in a CIA-sponsored coup. His crime was his desire to nationalize the Iranian oil industry. I often wonder what that part of the world would look like today if he had successfully managed to build a working socialist-democratic precedent in the region. When I lie awake at night, more and more often it is history that fills my thoughts: Jacobo Árbenz in Guatemala, Patrice Lumumba in the Republic of Congo, Salvador Allende in Chile. There is another moment I often think of: shortly after Mussolini came to power, he apparently managed to either kill or jail almost every single card-carrying member of the Italian communist party.

When you start reading the history of the left, stories like these pile up one atop another. (As I’m writing this I chance upon a New York Times piece about Operation Condor.) It is stories like these that form the background for Margaret Thatcher’s “there is no alternative.” The more I read the clearer this background becomes: over the course of the twentieth century, all attempted alternatives have been systematically undermined using money, propaganda, dirty tricks and, whenever necessary, extreme violence.

Of course, we can’t change the past; things that are done cannot be undone. But how to effectively think of possibilities for the present and future while at the same time keeping this history at the front of one’s mind? How to actually feel the fact that the world we currently live in didn’t just happen, that battles were fought, won and lost, and in so many ways we are living the desired outcome of the victors? Through posing these questions, I am attempting to walk myself towards activism. In most of what I have witnessed, single-issue activism has the greatest chance of success. But I always fear this is little more than blocking out larger realities in favour of short-term gains. Is it possible to have a genuine overview and still effectively fight? This fight might resemble the familiar slogan: think globally, act locally. In this sense I can always get behind the hope of setting a precedent: success in one context can create a sense of possibility elsewhere, a sense that there is, in fact, an alternative.

If I concentrate my energies on the specific activist battle at hand, it does not necessarily mean I am ignoring the global history that has brought us to this point. But I do feel there is something painful, almost enervating, in attempting to focus on both levels of reality at the same time, both on the devastations of history and on possible gains in the present moment. So many battles have already been lost that the playing field feels almost nihilistically askew.

7.

 

There is a most likely apocryphal story I’ve seen mentioned in various forms over the years. In it, there was a secret meeting of all the major record labels at which they decided to work together to promote gangsta rap, to make gangsta rap the dominant form in hip-hop. Whether or not this meeting actually took place, any hip-hop fan can’t help but notice that the lyrical content of early hip-hop was considerably more varied, often more sunny, and generally more political than it is today. When one form dominates, other perspectives fall by the wayside. Even if marginalized, however, they never completely disappear. When concentrating on the things directly in front of us, our peripheral vision remains rife with every possibility not currently pursued. The peripheral might be seen as a distraction or it might, perhaps as effectively, be seen as our only hope.

Dirk Baecker
DIMINISHED CONCENTRATION

Diminished concentration is a sign of a diffusing form. Some distraction that was formerly kept at a distance becomes so attractive that one finally turns one’s attention to it—only to find it, too, soon supplanted by further distractions. Attention is no longer focused, but dispersed.

This is connected to an initial insight. Diminished concentration does not arise from a lack of attentiveness, but from a wandering and dispersed attentiveness, an attentiveness that does not settle upon anything. Otherwise the problem itself would be wrongly worded, and we should instead be speaking of diminished attentiveness.

A further insight also emerges here. Poor concentration not only has to battle against what may be the decreasing attractiveness of the task at hand, but also deals, at the same time, with the attractiveness of one or another distraction.

These two insights help us understand the phenomenon under discussion as one in which the negative connotations of the idea of a difficulty concentrating—and the challenge this poses to one of the highest values of our culture, concentration itself—are accompanied by two positive connotations. On the one hand, difficulties concentrating are a positive development in that one’s attention isn’t rigidly focused on a single object, but instead moves its focus across several different objects. And on the other hand, it is also positive in that other objects remain or even become attractive alongside the increasing or decreasing attractiveness of the task at hand. Thus a lack of concentration involves not only a loss of the priority of one thing over others, but also an increase in variety.

Nevertheless, most observers are likely to find that the negative connotations outweigh the positive ones. Diminished concentration is not seen as a reclamation of human vitality from its subjugation to restricted circumstances, but merely as a symptom of the failure to sufficiently understand the advantages of devoting oneself to a task. This is interesting because it reveals that hidden in this negative value judgment of poor concentration is the belief in a concentric view of the world, which may have several centers, but they are centers that are apparently worth focusing one’s attention, efforts, work, and energy on for some time without letting oneself get distracted. These centers stand out from their peripheries—which, although it does not make them invisible, does mean they can at all times be considered less important than the center. At the center is always the task at hand, while at the periphery are, at best, additional resources and, at worst, seductive distractions. Both the resources and the distractions merit only a sideways glance, though, since the value of the task at hand overshadows all other considerations.

Diminished concentration offers evidence that these centers exert a dwindling persuasive power. Work and attentiveness can no longer easily be concentrically organized, because the difference between center and periphery is becoming indistinct. The centers are becoming more peripheral and the peripheries are becoming central, although neither will persist as such indefinitely. You can allow concepts and judgments about the dichotomy of center and periphery to change places for a while, but very soon both sides of the coin lose all value.

Enlightened educationalists have on the whole reacted to this tendency by saying: let’s remove all centralized control, political protection, economic separation, institutional endowments, and cultural support from center and periphery, and leave it all instead to individual decision-making. Diminished concentration may then be a transitional phenomenon. The old centers are no longer as attractive as they once were, and individual decisions as yet remain weak. The concentric organization of work, investments of time and money, and attention is being deconstructed, is becoming obsolete, and remains only a pale shadow of what it was. But the materials gained from this deconstruction—including the energies it has set free, and the surviving memories of one or another source of attraction that wasn’t necessarily dependent on institutional support and cultural maintenance—are all lying idle because no one can find the right tools to shift them back to the middle of another, new form of attentiveness.

What does this new kind of concentration look like? How is it rewarded? And by whom? Which investments of time or money would make it sustainable, and which kinds of attention does it deserve?

These questions are all theoretical, but they may still be helpful. What kind of concentration is worthy of the name when the center it revolves around is no longer recognized and lacks the praise and support it once enjoyed? What kind of periphery deserves to be shifted into the center? And what concept of a center could we then still defend?

Is there such a thing as a distracted concentration? Outside the realm of art, that is, where it has been accepted for decades? We might try to imagine what such a form of concentration might look like: perhaps, instead of retreating into mere idiosyncrasy, it would remain active in society, aiming to achieve an objective difference, while aware that it had a beginning and will come to an end—possibly even interrupting its own work before it’s finished, so as to distract itself from itself? And what would work look like when we haven’t had it assigned to us, but have chosen it ourselves? Can anything of this kind exist outside of academia and its freedom to research and teach?

An unease is palpable in the phenomenon of diminished concentration—which, from this perspective, emerges as a symptom not of a lack but of an excess. The concentric movement around old kinds of work is no longer attractive, movements around new centers are not yet in sight, but people have already left behind their habitual ways of doing things.

Diffusing forms are characteristic of a modernity that insists upon its contingency. Diffusing forms are not to be confused with disintegrating forms. Diffusion means that the elements of these forms are separating from one another and are ready for new combinations. You just need to know how to gather them together. And you need to realize that the conventional models for doing this are no longer effective.

Diminished concentration exists along a vector produced by the reformatting of center and periphery. It is forcing a reorganization of how we understand social roles, which formerly tended to be distributed between center and periphery. It is dissolving the disciplinary technologies that once instilled in young people the habit of appearing to concentrate, even when their minds had drifted elsewhere. It is bracketing off institutional practices and cultural activities that were once deployed in the form of sanctions, rituals, and routines as soon as anyone’s concentration faltered. And it is refusing to judge diminished concentration as something unambiguously negative, arguing instead that it could be both symptom and symbol of a shift in society.

Diminished concentration means we are bidding farewell to a social formation that had shaped us in the mold of various tasks, social roles, and institutional recognition. We can now see points of light breaking through the shadow of diminished concentration. What’s happening in them and around them? In our distractedness, we are reorganizing ourselves into something new and perhaps a little different.

Charis Conn
THE EXECUTIVE CHAIR

Each night, I scrawl new questions for my student, and each day he spits on them, sometimes literally, but always figuratively. Those that most draw him out enter the curriculum. Here is one:

Wouldn’t it be nice to work for a company where you knew every employee?

Naturally I leave out love; we’re not there yet. As it is, “work for” alone is enough of a face squincher.

“Run! RUN, RUN…!” he shouts straight through the rest of the question, and often beyond, pounding the few surfaces at his disposal, gradually catching the pleasure of the word’s shiny facets of menace in this strange present tense of his. It is important in such moments not to betray the emptiness of such lively imagined vistas here in our little unpainted room, where the near, sun-twinkled woods murmur through a window of breeze; this close and endless space where he is running nothing and I am running nowhere, at least not to any nowhere he’s in charge of.

In any case, I’m staying put. And armed with the menace of his love, I ask him,

Wouldn’t it be nice to run a company that makes an object people can fix on their own for a lifetime with natural logic and their hands?

Silence.

A product requiring no additional purchases, services, upgrades, or trade-ins in order to work reliably for generations?

His howl at this, though all his own, sounds like the roar with which even the masses these days would greet such a proposal: that dense, starving, over-fed chorus of empty certainty: No! Such a product would be foolhardy and no one would make any money!

What a loud sound out there, considering that within the flicker of film’s memory—and for the few hundred preceding millennia—such simple objects and ingenuities lived together gently, embracing and supporting every home, shanty, tent, and cave like air. And though the first thieves of such air arrived on the human scene extremely late, in just a few generations, their trashy, trashing, dissatisfied ways became an equal-opportunity mental illness; those still untouched now confined to jungle tribes, infants, animals, the most unsinkable of the desperately poor, and other dismissibles: those shamblers in the spooky realms of nature, 3-D eye contact, and laundry lines; or, worse, in worlds in which the very idea of laundry is inconceivable.

And since no one will suck up the wisdom of such types, in the name of trickle down, we are working from on high here, in our secluded dens, one cripple—or as they like to call themselves, executive—at a time.

It’s slow going, and has to be, though out there, beyond us, the workers such topsters used to own still fiercely carry the executive mindset like a plague, still “know” they would be sorely wanting if some boss didn’t get his “share.” And since our little kidnapped students are themselves always being replaced by newer brands—though one hopes fewer and fewer—their ex-slaves still believe that such tricky merchandising and pretended sharing is immune to challenge; the leap of “challenge” itself somehow neatly stripped of any clear meaning or significant outcome while we somehow weren’t looking. And the broad-shouldered, verb-muscled action hero “demand a share” has been similarly lost, soaked and shrunk by all the new kool-aid into the royal, niggardly size of the early Middle Ages.

But even those kings had to give in a little, eventually, though “little” is another word now broken at the knees and plastered with a solid gold wig, a pricey facsimile of which every citizen must order in order to labor beneath.

Another thing my student hates hearing said aloud, though it is meant to appear soothing, is this:

Products that last are dangerous.

His brief flaring up at this surprises me at first, and maybe even him, though perhaps only in terms of strategy, as I begin to see once I admit:

Not according to what anyone says or writes in public, of course.

Here he nods voraciously, softens his face, suddenly remembering the good-student trick.

But—oh how he hates my “buts”—according to the solid cement of the money trail, it is clear that lasting products and a hardy, self-sufficient populace have been branded as problems. And if you demanded that economic leaders admit to posting this flashing warning sign on their precious trail, they could eventually be forced to defend it.

“Just think!” they might say, if tied to a chair, “what a half-century of free unbreakable lifetime phones cost us back then in telephone executives, let alone perks and sales!”

Yes, children, there was once such a terrifying half-century, preceded by a few thousand similar ones. And yes, it is rude to lecture a man with examples of folks tied to chairs when he himself is literally tied to one in front of you. And it’s possible that by now he’s infected me with his rage, though I naturally prefer my own and think of it as logical. And I don’t tie him tightly enough to elicit the spitting and the silences. Or the feces that keep appearing in various places.

So I go on:

It seems that many services are perilous to our economy too, yes?

He makes that little face and finger wiggle that means he wishes to be seen as wishing he could take some notes at this point. But I never fall for that, and today, for the first time, drag out the blackboard, catching the trail of his disappointed, voiceless whiteboard…? across the lips and eyes, then his jumping gaze across the floor and walls for plugs and wires, and my hands for a rectangle with a clicker.

Tell me this is not true: The new math renders “worthless” such things as:

On the board, I start a list:

Free, inspiring, gentle education for all; enough local social workers and medical care to go around without typing or middle men or rush; the solid infinity of homegrown food and rags and wicks and light. And though less infinite, universal clean air and water are also an economic drain, forcing our fresh, booming executive classers to lunch each other daily in search of how to grind such waste into the ground and suck up the juice.

“I want juice!” he yelps here, almost pitifully, but when I give him some made this morning by our little orchard he spits it out. I don’t ask, but I can only assume it’s not the flavor or color he wants at the moment, or not The Juice of the Moment; or the jam jar offends, and the service. All he asks, carving each word like a jewel, is “What. Is. This!?” And I don’t know if he means what brand or which fruit or how it was made. And I am tired, so I give him no answer. Which is sometimes good for them, I am told, in situations involving literal objects. This is how they might learn.

The next day, I take a different approach, a story. Halfway through, it comes to this conclusion:

Clearly that wily garden snake won, and now we are our own devil. Transformed by more than a reptile’s bad advice into carriers of his globally disabling genetic kink. Still, not every genetic kink makes it through to everyone, even if it lets the rest infect the entire planet to produce blind android spawn. At least I like to think some have been spared, and do, which means you might have to kill me.

But he can’t, though I’m sure he dreams of his brethren still running amuck out there, selling cobra oily escape and desire like bandits. And of course they are. But we have only so many comfy chairs and undisclosed locations and rope to go around. And only so many volunteers with enough time and patience and garden produce and solar panels for two.

To thin the mass of consumer arrows entering our home each day, I’ve often considered telling The still-existing Man out there that I do not and will not purchase any thing or service he ever wants to sell me, and therefore not to waste his precious monetizing time. Yet is it not another symptom of the times, dear reader, that most of us understand exactly what would then ensue? “The last holdouts!” I hear systems everywhere screaming even louder in their silent clicking way, and the hunt would be on for this dwindling edge of the pie, this stubborn strip of cocaine frosting; the first shot being its rapid whipping into soufflés of reality coverage and a hundred temptations Including Brand New elmer fuddish trap invites to “Come in, enjoy the rewards of our special advisors [sic] club! Help us improve! Your [sic] the boss! Win cash and prizes, and a free chance to join the My Asteroid! ™ Club! ™”

Exploding this last acre of us, in and out, and now The New Universe ™ and beyond.

But it would create jobs, my man in the chair will point out, once this household’s stubborn non-purchasing, fix-it conspiracy is explained to him. He means this advanced hunting down of us, the non-buyers, with our dangerous indifference. It will create jobs! Yes, I would and will tell his writhing body in the chair: insect-sized jobs for trusting youth who live on free beer to seek the honor of building poison tools to gather, know, cull, distract, and control everyone on earth, day after day.

Even fantasies are hard these days. In keeping The no-longer-existent Man tied to a chair, there are so many decisions to make. And he has so many needs one must waste hours a day explaining are not needs.

“My fingernails are a mess!” he complains regularly, unable to handle a bar of soap, running water, and his body’s own little edges to take care of such disasters. For one thing, he says the soap is not “nail soap.” Here is where we understand it is counterproductive to lie, to say for instance that it is in fact produced only for nails, by Himalayan virgins for two cents a day who bless the bar with ancient manicure hootchie cootchie, naked, or show him some sort of silken wrap or label. Neither is it advised this early on to tell the truth: that I carried this year’s dozen bars through the woods in my apron from the hands of another teacher down the way who makes it from moldy yam peelings and ashes, and offers this bounty each spring in exchange for the careful darning of both pairs of his sweet socks. It is hard to know, though worth exploring someday, which parts of this truth would disturb my captive more.

But for now, with none of this soap news available to assuage the nail crisis, I finally offered him my great-grandfather’s most delicate rasp. This was instantly, wordlessly refused as if it had a bad smell. He also refuses to use the woods for exercise or pleasure without any items in tow or installed. I watched him out there the first day, checking his pockets and ears in vain for machinery, looking for manufactured seating, unnerved as a modern urban toddler at every insect and falling leaf, though with half the attention span. Finally, he simply stood still, eyes closed, pressing the crown of his head against a trunk. I don’t know what she said to him, but he was a little better the next afternoon in the chair. And I was pleased then that my training allowed me to resist the powerful temptation to steer him toward some thanking of the woods for this.

We figured that each of us having one of these men was the best way to do it, since they require so much care and listening and questioning. And if cared for together, they might get away from us and back out there infecting others and reproducing. But dealing with just one, even if tied up most of the time, is exceptionally hard. When they demand to document and prove in print that their craziness is sanity they just break the pencils and tear up the legal pads and pages of data we provide and throw them across their nice little bedrooms. Mine has tripped on such debris often enough that an entire day had to be spent on the concept of responsibility, followed by another afternoon I had to devote to tenderly removing splinters. Still, in his room now, he shuffles around almost hip-deep in all he’s broken, demanding a maid who will never arrive. I’ve even dug him a little hole just outside his barred window to throw it in, piece by piece, assuring him needlessly that all of it will rot good and proper in the earth there, or, dragged elsewhere, could make warm water for bathing or cooking his dinner, though he hates both the shower and whatever I feed him. But he’s only been here a few weeks, and I’ve heard from others that after a few months they often start glorying in any and all falling water, including rain, and gratefully eating whatever they’re given, without a tablecloth or bonded chef or barbeque or blonde.

Johannes Thumfart
FROM THE RHIZOME TO THE HEAVY TAIL

CONCENTRATION OF THE DIGITAL MARKET, CONCENTRATION IN THE INDIVIDUAL, AND THE REDISCOVERY OF SOVEREIGNTY

 

I. TOPOGRAPHY OF A HOMEOPATHIC COLLISION

 

Homeopaths believe in the principle of similitude, according to which illnesses can only be cured by minuscule doses of the same substance that in larger doses would cause the illness to begin with. Even if the medical efficacy of homeopathy is unproven, it is nonetheless widely used—probably because it is based on a dialectic so intuitive that it can still function as a placebo.

At first sight, concentration of the digital market and concentration in an individual person seem to be two entirely distinct phenomena. Although both are designated by the same term, they appear to be far too different for their collision to produce the placebo effect, which is the only effect that theory is capable of.

Concentration of the digital market has gone hand in hand with the increasing overall tendency toward monopolization that has characterized the world economy over the last twenty years. It is, however, even clearer here than in the non-digital realm. Google’s share of the US search-engine market, for example, is at almost seventy per cent. In Europe, where neither Bing nor Yahoo has a significant number of users, it is closer to ninety per cent. Amazon holds a similar monopoly position in the United States, with almost forty per cent of all book sales, sixty-five per cent of all online book sales, and almost seventy per cent of the e-book market. Amazon’s share of print sales in Europe is substantially smaller, since bricks-and-mortar bookshops still play a bigger role there. Facebook has gained about sixty per cent of the social network market in the United States, and a couple of percentage points more in Europe.

Similar figures apply to the Indian market, though it is linguistically challenging for Western service providers: here, ninety per cent of all search requests are made through Google and almost ninety-five per cent of all Internet users have a Facebook account. It is only in China that things are different, where the state is constructing its own digital monopolies, generally seen in the West as a simple case of political censorship. The bottom line is that while there are certainly people who do not drive a car and never eat at fast-food chain restaurants, there is virtually no Internet user outside of the Middle Kingdom who doesn’t click onto one of the global digital monopolies at least once a day.

What concentration means for the individual, on the other hand, cannot be expressed with anywhere near such quantitative clarity. Despite this, studies based on voxel-based morphometry have shown that concentration is not only something experienced subjectively, but also produces physical changes in the brain. These studies showed that test subjects who had learnt to juggle, speak several languages, or navigate London as a taxi driver exhibited corresponding increases in the density of the brain’s gray matter, a substance composed largely of neuronal cell bodies. Meditation produces similar neuroplastic changes, principally through the control of breathing and pulse that one learns from practicing it.

So despite their superficial differences, from a structural point of view, concentration of the digital market and concentration in the individual are really quite similar phenomena: both involve networks centripetally crystallizing around one or more centers—a neuronal network in the case of the individual, and multiple digital networks in the case of the Internet. Since digital monopolies, and the filter bubbles they produce, essentially determine the content that can be accessed by individual concentration, market concentration in the digital realm has long-term neuroplastic effects, meaning that it will generate neuronal patterns.

However, one antagonism that should be mentioned right away is that concentration in the individual is something that’s considered desirable (and is often artificially induced using Adderall or Ritalin), while concentration of the digital market is often demonized, especially by pre-digital elites. Alongside their sense of being disempowered by the new digital overlords, the dismantling of the basic distinction between public and private, while fundamental to digital monopolies, inspires a great deal of fear in pre-digital elites, because it is exactly the maintenance of this distinction that constitutes bourgeois identity.

Not least, however, the demonization of digital monopolies by pre-digital elites stems from the confusion of digital with pre-digital monopolies. While market concentration and the resulting concentration of power does represent a serious problem in principle, digital monopolies, unlike pre-digital ones, do not produce a rise in the price of their products—at least, not for end users—since their broad acceptance is itself a result of their offering their products for free. There is also little need to fear a decline in the quality of their products, as frequently happened with pre-digital monopolies. When algorithms learn from user behavior, quantity—freed from production cost overheads—generates quality, with the result that a decrease in the latter offers no cost-saving advantages to the provider. Given how far they have expanded in quantitative terms, monopolies have even become necessary to the smooth functioning and high quality of digital products. As will be explained in more detail in what follows, the dangers associated with digital monopolies are more fundamental than this.

One thing is certain: the increasing interest in the connection between cognitive concentration in humans and concentration of the digital market is a sign that the era of the idea of a network society structured around a rampantly proliferating “rhizome” is over. We have entered a new epoch where network theory needs to be expanded to include notions of concentration, restructuring, and hierarchization, in stark contrast to the laissez-faire libertarianism that characterized the net thinking of the 1990s.

Though the pattern of re-concentration first became evident in the recent substantial market concentration in the digital sector, the ubiquity of digital technology implies that it has long also affected the non-digital economy, the state, the secret services, and the military, and has extended the reach of these traditional centers of sovereign power. This process also has consequences for individual concentration, since the latter represents the point of origin of power in the digital age—power that is itself based on an economy of attention.

As far as the attempt to achieve a homeopathic cure is concerned, the collision between market concentration and human concentration, the collision of like with like, will only have a homeopathic, rather than a toxic, effect if the enormously impactful structure of the digital monopolies is understood in its potential for individual concentration. This means that on the one hand we need to depoliticize and de-demonize the digital monopolies, and, on the other, that we reconceive the relations between the human neurological system and the networks commonly designated “the Internet.”

 

II. FACEBOOK AT WORK: PURE SURPLUS-VALUE EX NIHILO AD NIHILUM

 

As the survivors of digital market concentration become ever more powerful and ever fewer in number, avoiding them, or at least making oneself unrecognizable when dealing with them, requires the almost inhuman concentration of a cypherpunk ascetic. Anyone trying hard to prevent Facebook, Google, Yahoo, and Amazon from collecting their data—or having their online behavior predetermined by those companies’ ready-made filter bubbles—must be prepared to sacrifice more than most people would consider rational.

For the most part, we act against our better judgment. Protected data transmission is still far rarer than protected sex, even if it is certainly more necessary than the latter. While you can only catch physical diseases from sex, unprotected data transmission has consequences well beyond its apparently harmless limitation to the symbolic order, since it extends into the realm of being that concerns our species-specific, neuroplastic essence.

The digital monopolies confront us with the dilemma of either accepting defeat by ignoring their growing power, or using all our concentration to protect ourselves from them—and the latter means drifting off into a peculiar realm that is both utterly paranoid and yet more rational than the way most people are behaving: a condition otherwise known only to citizens of totalitarian states.

To put it less polemically, we cannot fundamentally assume that power accumulated in the digital realm doesn’t corrupt. We could already reach the opposite conclusion from the anecdote that, right after he founded Facebook, Zuckerberg collected the email passwords of his fellow students who’d unsuccessfully attempted to log onto his page and used this knowledge against them.

Above all, the Internet’s zero-cost economy is based on diffusing concentration at the level of the individual. We don’t just pay with our private data, but with our state of constant distraction—something that was already the case in the era of TV and print media, both of which were largely funded by advertising. Anyone who doesn’t want to have their concentration corrupted by advertising must pay for a premium account—though in most cases, this option isn’t even available. In this economy, concentration has become the actual starting point of capital accumulation. When we surrender it in return for the services of the digital monopolies, we are also signing a release declaration concerning our right to produce anything that could create competition for them. At the same time, it is precisely when we’re most idly and distractedly surfing the net that we’re helping the digital monopolies extend their vast reserves of psycho-behavioristic power-knowledge.

Thus concentration accumulates in the digital monopolies as cognitive capital, meaning that we have to think beyond Debord, who at the birth of the classical media industry thought that capital in its highest form of accumulation would eventually become an image. The current degree of economic concentration goes beyond the image and involves the accumulation of concentration itself. Curiously, this means that, at least terminologically, a purely tautological liminal state has been reached: the concentration of concentration.

The phenomenon of “Facebook at Work,” which aims at creating a work-specific version of the social network, has a particularly toxic effect on individual concentration. “Funemployed playbor” requires you to offer your concentration to the digital monopolies for free. It means participating in the postmodern office routine, which increasingly consists in inhabiting a state of distractedness during the prevailing phases of net-working/not-working. In these, people send each other jokey personality tests for the upcoming product campaigns of our digital cloud overlords, or indulge in some other kind of symbolic production whose only real use lies in optimizing algorithms for market analysis, the prevention of terrorism, and facial recognition. Today, the only realistic preparation for the average contemporary office routine is offered by a former avant-gardist like Kenneth Goldsmith, who teaches a course at an Ivy League university entitled “Wasting Time on the Internet.”

But why are companies allowing digital monopolies to use their offices as distraction-farms? The main reason is their sheer lack of power: the former giants of glass and steel lost control over their means of production when it simply stopped being possible to work without Google. The result is that they are willing to surrender to Google a substantial part of their employees’ cognitive capital.

The digital monopolies’ exploitation of free time in particular has a diabolical effect. In a Pavlovian reversal that is all too human, every minute employees spend distractedly surfing and being exploited by the digital monopolies at work feels like time off, which they’re even prepared to fight for. Since the workforce paradoxically considers using Google and Facebook to be generally in its own interest, the employer, caught between identical demands from both the workforce and the digital monopolies, has no alternative other than to permit the use of Facebook and Google at work, and—so as not to seem utterly emasculated—even to encourage it.

With his concept of “cognitive surplus,” Clay Shirky was one of the first to understand that the revolutionary characteristic of the digital industries was that they actively exploited surplus intellect, surplus energy, and surplus time—in contrast, for example, to television, which merely made passive use of its viewers’ free time, producing no surplus value. The result has been a reversal and liminalization of the categories of Marxist analysis of the real economy, on which the digital economy feeds parasitically. It begins by accumulating and concentrating users’ free time—time which was hitherto external to processes of capital valorization—and then it makes this the starting point for an economy based on a magical or purely illusory cycle of pure surplus-value creation, ex nihilo ad nihilum. For Marx, human subsistence set the limits to capitalist exploitation, and failing to provide for it was the necessary and perhaps even sufficient condition of any historical revolution; for the digital economy, on the other hand, this is something for someone else to worry about.

As Jaron Lanier has correctly pointed out, this peculiar approach can lead to a wholesale hollowing-out of the economy, resulting in a situation where wages can no longer cover food and rent, but Facebook and Google continue to be free. Meanwhile, the price increases typical of pre-digital monopolies remain merely symbolic, in that the digital monopolists demand that increasing quantities of concentration and private data be rendered unto them in return for their supposedly free products.

But these fault lines between the concentration of the digital market and the concentration in the individual have become so widely known that they are now even being explored by trivial television series indulging in a superficial critique of the Internet, such as Selfie. Given the trivial nature of our private data, it is worth noting that the lack of concern most of us have toward the digital monopolies does make a certain kind of sense: as long as we aren’t pop stars or enemies of the state, our data is really of no interest to anyone except in the context of de-individualized Big Data. It brings to mind alchemical processes for turning lead into gold, optimized to a point where excrement is used as raw material. In their aggregate mass, the semiotic waste products of our aptitude as a zoon logon echon (an animal which has language), which are often as unusable as they are unwanted, have become something like a philosopher’s stone.

Unlike other sectors of industry, the digital monopolies have so far not raised the price of anything. The monopolies’ products remain free. The monopolies deliver. Indeed, the very concentration of the market into a few data-gathering protagonists means that their learning algorithms function better with every input of data.

Amazon is better than any librarian at finding related authors and books, particularly in its range and its freedom from ideological constraints. For the same reasons, Netflix is preferable to a video rental shop, just as Yelp is to any restaurant critic or Google Maps to any cartographer or guide. And who seriously mourns the demise of the Encyclopaedia Britannica, when in its place we get Wikipedia, the only one of the Internet’s big monopolies that is still non-commercial, and whose unquestionable democratic legitimacy makes all the resentment stirred up against digital monopolies seem like the jealous whining of deposed literati? The time seems to have come for humans to concentrate on different tasks from those that can be carried out just as well or better by algorithms.

In a reciprocal movement, outsourcing much of today’s dull, disagreeable but necessary intellectual labor to digital monopolies could result in a reshoring of human production capacity as a haven for qualitative, creative worldmaking. As in the Renaissance, when people were working out the consequences of the rise of the printed word and developing mnemotechnics as a kind of anthropotechnics to cope with the new conditions, we now stand upon the brink of a rediscovery and redefinition of human concentration. It will be characterized less and less by calculating, ordering, and mathematical operations, and more and more by the teleological control of automated algorithms—the determination of their direction and their purpose.

 

III. SIX DEGREES OF SEPARATION: MARKET CONCENTRATION, RHIZOME, AND HEAVY TAIL

 

A comet’s plasma tail doesn’t trail behind it, but rather always points away from the Sun. Thus the tail can trail from the side of the comet or even, if it’s moving away from the Sun, trail ahead of it.

In a similar way, the intellectual history of the commercial Internet appears retrospectively as a fait accompli long before it emerged from the Arpanet, its governmental/military forerunner, at the beginning of the 1990s. Two short but highly influential texts formed the basis of the particular commonplaces that still define discussion of so-called network society today.

The older of these two texts, published in 1969 by the experimental psychologists Jeffrey Travers and Stanley Milgram, dealt with the so-called “small-world problem,” and came to the famous conclusion that everyone knows everyone else through six degrees of separation. The second was written in the early 1980s by the philosopher Gilles Deleuze and the psychoanalyst Félix Guattari, and aims to apply the structure of the predominantly underground root and shoot systems known as rhizomes to social and cultural phenomena.

Almost everything there is to say about digital monopolies can be said by comparing these two texts. It is best to begin with Deleuze and Guattari’s text, which, though historically the later of the two, is really the other’s intellectual precursor, precisely because it denied the possibility of digital monopolies before their very existence.

The reception of the rhizome metaphor by early network theorists such as Stuart Moulthrop, Brian Massumi, Jamie Murray, Geert Lovink, or Antonio Negri and Michael Hardt remains influential. Although today it is primarily understood in terms of the “anything goes” principle of the network society, Deleuze and Guattari’s text actually formulated a fairly authoritarian imperative—precisely that of a structurelessness that was, tellingly, obligatory: “n’importe quel point d’un rhizome peut être connecté avec n’importe quel autre, et doit l’être.” To paraphrase: Every point of the mega-network to be formed not only could be connected with every other point, but had to be connected with every other point.

It is no coincidence that this structural a priori seemed to apply so well to the civilian Internet that was to evolve soon afterwards. As far as Deleuze and especially Guattari are concerned, they essentially just translated into French ideas that had come from American cyberneticists such as Gregory Bateson, who had initially sketched out the concept of a network society at the underrecognized Macy conferences during the 1940s and 50s. Deleuze and Guattari borrowed some of their most important concepts, including the “plateau” and the “double bind,” from Steps to an Ecology of Mind, Bateson’s work of popular cybernetics.

What is notable is that today, in an age of extreme concentration of the digital market, it has become clear that talk of network society as a purely horizontally organized rhizome applies as little to the actually existing Internet as the utopia of the classless society applied to actually existing socialism.

In theory, every point in the actually existing Internet may be linked to every other, allowing a glorious chaos of equality of the kind celebrated in cyberpunk culture or Hakim Bey’s idea of the temporary autonomous zone. And in practical terms this has long been the case, too, because of the radically egalitarian and reciprocal nature of the initial structure of the Transmission Control Protocol. Today, however, on a conservative estimate, half of all the connections established through the Internet run between individual users and mega-hubs such as Amazon, Google, and Facebook, so that in this regard they are hardly any different from the connection ratio for traditional mass media like radio and television.

The surrender of net neutrality is cementing this tendency toward concentration, but initially it developed quite spontaneously. It is not an accident of history, nor does it have anything to do with the dynamics of capitalism, but is rather a general characteristic of networks themselvesas becomes clear from a careful reading of the other great text that laid the intellectual groundwork for the Internet.

It is correct to say that, in their essay on the “small-world problem,” Travers and Milgram showed that everyone knows everyone else through six degrees of separation. Using 296 randomly selected test subjects in Nebraska and Massachusetts, the two scientists discovered that, starting with an individual they knew personally, each subject needed on average 5.2 intermediaries before they could establish a mediated acquaintance with a randomly determined individual in Boston.

To this extent, Travers and Milgram’s results appear to conform to the metaphor of the rhizome: we are all part of the same global network, which is not even very large. Social distances cannot be measured in geographical terms. This was proved by the fact that the chains starting in Massachusetts, for example, were only marginally shorter than those starting in Nebraska.

However, the egalitarian aspect now frequently associated with these results is actually something that was read into them later. For the scientists’ principal discovery was that almost half of these chains of acquaintance functioned only because of an enormous inequality among human beings, that is, they did not function because every point is linked with every other, but rather because there are grotesque quantitative differences in the social connections existing among different individuals.

In almost half of all cases, close to forty-eight per cent, the chains of acquaintances reached the test subject only because they led through three people who were extremely well-connected. Travers and Milgram, for want of a better word, described these people as “sociometric stars.” An example of such a person would be the owner of a clothes shop near where a target person lived, who would naturally know the entire local community, and through whom it was therefore easy to establish acquaintances.

Thus a precise reading of the small-world essay shows that the idea of a completely egalitarian rhizome doesn’t correspond to the social structure of actual human beings. Further development of mathematical network theory made clear that accumulations and concentrations around a few nodal points are the rule in naturally occurring networks.

In mathematical network theory, this structure is known as “heavy tail,” a term that reflects how, in a function graph of a coordinate system, the few nodal points representing “sociometric stars” lie far to the right of a Gaussian normal distribution curve, when y = “number of agents” and x = “number of connections.” It occurs in all known model trials of this kind, such as the parlor game Six Degrees of Kevin Bacon, which measures the degrees of separation between actors in a film and Kevin Bacon, or the Erdős number, which functions in a similar way by measuring the distance in terms of collaborative associations between any given mathematician and the mathematician Paul Erdős. In this and similar experiments, there are always a very small number of individuals whose connections so far exceed the test group’s Gaussian average that it’s possible to say that it is these unusual individual cases, and not the average, that are crucial to the understanding of the network in question.

The existence of such heavy tails can be explained by the fact that social relations grow exponentially and recursively, since every new connection increases the probability of further new connections. Heavy tails are therefore not only limited to social structures, but also occur in many inanimate networks, such as rail transport networks or electricity grids, which tend to function by way of a few, very densely interconnected nodal points. With every new connection that, for example, Chicago O’Hare airport gains over Chicago’s far smaller Midway airport, the former becomes a more likely destination for additional scheduled flights than the latter, because its importance for transporting additional passengers increases with every new connection. This process of accumulation reaches its limit in airport capacities and in increasing airport charges, which in turn make such destinations less attractive.
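The recursive mechanism described in the preceding paragraph can be made concrete with a small simulation. The sketch below is written in Python and follows the familiar preferential-attachment scheme associated with Barabási and Albert; it is an illustration of the general principle rather than anything proposed in this essay, and all names and parameter values in it are arbitrary assumptions. Each new node attaches to an existing node with probability proportional to that node’s current number of connections, and a handful of hubs reliably accumulates a grossly disproportionate share of all links.

    import random

    def preferential_attachment(n_nodes=10_000, seed=1):
        """Grow a network in which each new node attaches to an existing
        node with probability proportional to that node's degree."""
        random.seed(seed)
        degrees = [1, 1]    # two founding nodes, linked to each other
        endpoints = [0, 1]  # each node listed once per connection it has
        for new in range(2, n_nodes):
            # picking a random endpoint weights targets by their degree
            target = random.choice(endpoints)
            degrees.append(1)
            degrees[target] += 1
            endpoints += [new, target]
        return degrees

    degrees = sorted(preferential_attachment(), reverse=True)
    share = sum(degrees[:10]) / sum(degrees)
    print(f"the ten best-connected nodes hold {share:.0%} of all link ends")
    # Most nodes never get beyond one or two connections: the long tail.

No central planner appears anywhere in this loop; the “sociometric stars” emerge from nothing but the recursion itself.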

It is no surprise that the Internet, as a form that has evolved without any central organization, and whose virtual immateriality means that it is largely unconstrained by any limits imposed by scarcity, should be characterized by a heavy tail. So the success of Facebook, Amazon, Wikipedia, and Google doesn’t necessarily have anything to do with conflicts over market dominance or a malign struggle to succeed among agents motivated by profit (as many have concluded from the similarity between pre-digital and digital monopolies), but rather with a general tendency among naturally occurring networks toward a certain degree of concentration.

The exponential and recursive basis for Google’s growth seems to be that, with every new user it gains, the search engine—since it learns from user clicks—becomes more efficient and therefore more attractive to other users. Facebook’s growth, for its part, rests on its nature as a medium of communication: every additional user makes it more attractive as a means of communication for others.

Of course, it is quite likely that Google, for example, maintains its position in the heavy tail by means that are either illegal or questionable under antitrust law, in that it gives preferential treatment to its own products. But at the end of the day, an entirely centerless rhizome wouldn’t be in the users’ best interest, since the latter will only be able to get the best available products if the digital realm contains a certain concentration. Monopolization in the digital realm also improves the product intrinsically, since lowering its quality would offer no cost advantages to the supplier; this is especially true for the algorithms that are optimized through a high frequency of user interactions.

Popularity is also a prerequisite for the functionality of any medium of communication. A digital communication product that is used only by a minority is not cool and exotic like a rare brand of jeans or a handmade hamburger, but useless, like a highway to nowhere or a flight that lands on an open field without connections to any other form of transport.

These intrinsic tendencies of digital products toward concentration are particularly effective because digital monopolies are not beholden to the constraints of mass production. While Coca-Cola and Pepsi stock supermarket shelves with millions of identical but individually distinct cans, whose success and value ultimately depend on individual decisions of buyers, Facebook and Google in a certain sense offer one and the same virtual product on billions of computers—a product whose value increases with every individual decision to use it. It increases because these decisions—whether in the form of the popularity of the product in question as a medium of communication, or in the algorithms improved by a high frequency of user interactions—accumulate within a single virtual object.

These qualities mean that it is precisely in the digital realm that market concentration exceeds even the irregularity usually associated with heavy-tail networks. This is evident not only in the speed of concentration of the digital market, but also in the isolation of the reverse, dark side of the heavy tail, the “long tail,” which comprises all those innumerable places on the net for which there is almost no demand, which are almost never searched for, and which, in the case of the so-called darknet, do not appear on search engines on principle.

By contrast, if we take experiments following the template of the small-world problem, which are conducted in the social world of everyday life, we find what is in fact a relatively stable interconnection among those links of the chain that aren’t “sociometric stars,” and which lie outside the anomaly of the heavy tail and within a Gaussian normal distribution. Fifty-two per cent of the chains in Travers and Milgram’s study—that is, more than half—don’t run through the three “sociometric stars” who are the nodal points of the network under examination.

If we look at the typical surfing behavior of any Internet user, we find that it is a far smaller percentage of Internet sessions that don’t at some point use Google or Facebook. It is estimated that ninety per cent of all Internet sessions involve at least one search request, and of these almost ninety per cent are made through Google, meaning that about eighty-one per cent of all sessions run through Google and nineteen per cent do not. Those sessions that use Facebook but not Google would then have to be deducted from this remainder, which on a very conservative estimate would give a result of about fifteen per cent for all sessions that do not run through one or the other of these digital monopolies—monopolies that are frequently described as the “strongly connected core” because of the essential function they perform in Internet connections.
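For readers who want the arithmetic spelled out, here is the same estimate as a short Python sketch. The two ninety per cent figures are the essay’s own estimates; the share of sessions that use Facebook but not Google is not given in the text, so the value below is a hypothetical one, chosen merely to reproduce the “about fifteen per cent” conclusion.

    # A worked version of the session estimate above.
    with_search = 0.90   # share of sessions containing at least one search
    via_google = 0.90    # share of those searches made through Google

    google_sessions = with_search * via_google  # 0.81 of all sessions
    remainder = 1 - google_sessions             # 0.19
    facebook_only = 0.04  # assumed: sessions using Facebook but not Google
    neither = remainder - facebook_only
    print(f"sessions touching neither monopoly: roughly {neither:.0%}")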

This enormous concentration is even more dramatic if measured not by the number of sites visited in Internet sessions, but by the number of incoming links to a given website, which is the essential criterion that decides whether the thesis that the Internet is a medium organized more horizontally than vertically can still be upheld.

In their much-cited 2000 paper on the graph structure of the Internet, which at the time was far less extensively monopolized, Andrei Broder, Ravi Kumar, et al. found that, for any number of links i greater than 1, the probability of a website having i incoming links follows an inverse power law with the exponent 2.1, which produces the formula 1/i^2.1. When there are two links leading to the same website, this becomes 1/2^2.1, which in decimal terms gives a figure of only about 0.23. This means that for sites lying outside the concentrated “star” regions of the Internet—that is, for more than seventy per cent of all sites—the average number of links leading to a given website is less than one.
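As a quick check on those figures (a sketch: the exponent 2.1 is the one reported by Broder et al., and the rest is plain arithmetic), the probability collapses rapidly as the number of incoming links grows, which is precisely what empties out the long tail:

    # P(a site has i incoming links) ~ 1 / i ** 2.1  (Broder et al., 2000)
    for i in (2, 5, 10):
        print(f"P(in-degree = {i}) is roughly {1 / i ** 2.1:.3f}")
    # i = 2 gives ~0.233, the 0.23 quoted above; by i = 10 the
    # probability has already fallen below one per cent.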

In contrast to the social world offline, there is no room here for any romanticization of the niche, or for any arguments that quantitatively minoritarian communication processes can also change the world. Since the average number of links leading to a site in the unpopular long-tail region is close to zero, in most cases no communication happens here of any kind; rather, postings remain limited to various kinds of personal journaling and communication with bots, a narcissistic or therapeutic mirroring of the self with no recipient at all, a perfect Lacanian setup in which the other really doesn’t exist. The utopia of unrestricted and nonhierarchical communication on the Internet, shaped by the metaphor of the rhizome, shatters on the rocks of its ever-growing and ever more pronounced division into an extremely unpopular long tail and an extremely popular heavy tail.

 

IV. PALIMPSEST, NOT ARCHIVE: WHY THE MONOPOLIES WILL (AND SHOULD) REMAIN

 

To imagine that monopolies in this accelerated system would supersede one another as quickly as they emerge is to overlook a crucial difference between physical and digital networks. While the media technologies of the past five hundred years were characterized by both an ever-increasing speed and a corresponding ever-increasing ephemerality—one thinks of printing, newspapers, television, and radio—the Internet brings back the durability and immobility, long thought to have had its day, of parchment, stone, and marble.

Such media, which Harold Innis described as “time-biased,” did not historically tend toward democratization and liberalization, as people repeatedly claim when they compare the Internet with printing. The relative costliness and durability of such media, which cannot easily be transported from the specific power structures in which they are embedded, means that they tend instead toward the consolidation of hierarchies.

Historically there has been only one instance of a turn away from a space-biased to a time-biased medium, and that was at the end of antiquity, when the ephemeral and constantly circulating papyri of the Roman imperial bureaucracy were replaced by the carefully stored parchments of the early Christian monasteries. The Internet is not a publicly accessible archive composed of printed materials that can be easily transported into another, neutral context—one of the most important characteristics of any democratically accountable governmental administration since the French Revolution. The data storage centers of the digital elite, often located in remote places, more closely resemble the chronicles that were kept hidden from the public behind monastery walls, where, accessible only to priests, power-knowledge resided precisely in its inaccessibility.

On the level of the user interface, a comprehensive historiography of digital signs is impossible, since they are too numerous and their existence too fleeting to be remembered. In this respect, they resemble the largely forgotten oral vernacular culture of the Middle Ages. Unlike newspapers or even broadsheets, the collective state of the publicly accessible Internet as it changes from day to day is not archived. Its history survives only underground, within the institutional context of the digital monopolies, and far from the eyes of scientists or the public.

The very form of this purely private data, which is intelligible only to the companies’ in-house analysts, means that it will never be available for a genuinely archival public review—which, when one considers all the searches and emails that have been made or sent through Google, would be as banal as it would be legally problematic. Before a seductive and theoretically possible total history becomes reality, all the relevant and exclusively privately owned user data will have been deleted and overwritten by new data to save energy, which is also standard procedure for the publicly accessible Internet. The paradigm for digital storage is not the archive of printed media, but the palimpsest of medieval parchment.

In concrete terms, the inaccessibility of these collections of data means that the hierarchies that have formed in the digital world over the past twenty-five years will continue to exist for a very long time to come. Contrary to a widely held misconception, demographic factors alone make it unlikely that users, at least in the Western world, will ever again throw themselves into the world of the net with the same youthful naivety of the past twenty-five years. The numbers of users and interactions on Wikipedia are already falling. Facebook will never again grow at over 150 per cent a year, as it did in its launch phase. Ironically, another factor favoring the digital monopolies could be precisely the growing awareness of legal restrictions on the use of data, which could already make data collection more difficult in the very near future. The very critique of digital monopolies could therefore have the effect of favoring those monopolies that have so far been able to collect data virtually unhindered.

It is for these reasons that digital accumulation has today reached a critical magnitude, whose effect will be that the history of digital technology as a succession of disruptions is coming to an end. The quantities of data that Google, Facebook, Amazon, and Yahoo now control as a dividend from the golden years of net hysteria will in all probability always be larger than those of any newcomer. Since the nature of digital networks is to strengthen the heavy tail, we shall have to come to terms with the digital monopolies. Those who still think that the Internet is a rhizome will have to adapt to the facts on the ground. And, as people like Yochai Benkler have emphasized, we need not assume their political consequences will be entirely negative. An Internet with strong centers must pay increasing attention to its editorial responsibility, and hence may well be preferable to an Internet-as-rhizome, which would produce the “Balkanization” of the public sphere so feared by Habermas—and whose emergence would mean the end of any conversations in which the whole of society participated, since the lack of shared points of reference would make it impossible to talk about the same things.

If, as Jodi Dean writes, “the” imaginary Internet is not a classical liberal-democratic public sphere, but rather constitutes the condition of possibility of any global political discourse as a “zero institution,” then this is only thanks to the fact that it is precisely not a rhizome but a concentrated form. The digital monopolies have only prepared the infrastructural and epistemic ground on which all further political questions can be debated. Rather than adversaries, they are giants whose shoulders their critics have still to climb.

 

V. NORMCORE: CONCENTRATED LIFE UNDER THE GAZE OF THE BIG OTHER

 

The concept of neuroplasticity describes the changes in neuronal connections in the brain in relation to adopted habits. Thus for example, a study of test subjects who had taken up a sport showed changes in the structure of their brains after just two weeks. The subjectively experienced increase in one’s ability to concentrate by practicing meditation can also be objectively confirmed by observing an increase in the density of the brain stem’s gray matter. The concept of concentration is therefore also apt in a physical sense, as a description of a centripetal movement.

From the point of view of neuroplasticity, the distinction between human and machine is, as far as the Internet is concerned, becoming increasingly arbitrary. A complex feedback effect develops between the neuronal patterns in the brain and the increases in density in the digital network. The latter’s physical dimension means that the term “noosphere” only superficially describes it, since the Ancient Greek term νοῦς designates something like the mind in an immaterial sense, rather than a physical context.

Even merely passive activity on the Internet, such as liking, sharing, and clicking, creates new densifications in the digital network and, with them, new cognitive realities. In this sense, it is possible to say that market concentration, which uses your previously ascertained individual preferences to shut you in a filter bubble of Buzzfeed, Facebook, and Google, is equivalent to individual concentration, insofar as it emphasizes some contents while blocking out others—the only difference being that one involves the centripetal structuring of nerve cells, and the other the centripetal structuring of circuits.

Nevertheless, there is no denying that there is an antagonism here. Google, and even the news Google brings you from tabloid newspapers, is finding a spot in your daily routine at precisely the moments when you are concentrating the least, at precisely the moments when you give in to a certain mental inertia and, rather than clicking on a piece of in-depth reportage on your workplace rights, you click on the latest news about Kim Kardashian’s bum, for example, or the latest listicle about the seven most dangerous types of pasta in the world. At some point, Google will be spewing out nothing but trivialities, reinforcing this unconcentrated inertia. When this becomes noticeably unpleasant, statisticians speak of “overfitting.”

However, there is an important difference between ignoring the effects of filters and watching a conventional TV program. It doesn’t matter how exactly data on the behavior of digital consumers is gathered: in contrast to classical media, algorithmic filters are influenced by the decisions consumers actually make (regardless of why), and not just by the user profiles conjectured by editors and market researchers.

The same is true collectively for the dominance of Google, Facebook, and their associates, who in a certain sense represent both the inertia of the masses and a decision that they make every day, albeit one that they surprisingly always make the same way; this decision is as legitimate as the outcome of any democratic election, and, in contrast to the decisions people make about their electricity or water supplier, involves far less coercion by monopolies.

One could therefore go one step further and claim that it also isn’t difficult for the average user to ascribe every instance of concentration and densification they encounter, such as advertisements sent to them over the Internet, to conscious or unconscious patterns in their own behavior. The average web user has already known for a long time that if she posts pictures of babies on the Internet she will be inundated with advertisements for baby food, and that if she makes frequent use of the word “bodybuilder” even in private Facebook messages or on Gmail, she will be bombarded with protein shakes. Similarly, everyone knows that Facebook on principle favors posts that contain words such as “congratulations,” as well as those that come from permanently active accounts.

What’s crucial here is that the results of filter algorithms don’t only show us how algorithms function but also the structure of our own behavior—such as, for example, a tendency to click on articles about Kim Kardashian’s bum—which we would mostly not even have been aware of before the monopolies started personalizing all content. Such preferences, after all, come precisely from unconscious, automated, and instinctive regions of our minds. The structure of what we are presented with on the Internet isn’t simply an “inversion of life [and] the autonomous movement of non-life,” as Debord once more or less rightly described the classical mass media, but a feedback loop of the subject’s own behavior that is thoroughly and indeed horrifyingly alive.

Of course, this feedback loop also won’t be accurate: it will be subject to all the commodifying distortions typical of capitalism’s apparatuses of self-knowledge. This is evident, for example, in Facebook’s aforementioned preference for posts containing the character sequence “congratulations,” and for especially active user accounts.

Nevertheless, the first thing to be noted is the obvious: the increasingly popular tendency to mentally pre-categorize future situations on Facebook, Instagram, and Twitter using the predictable images, comments, likes, status updates, and hashtags is only the clearest symptom of a fundamental change in how we perceive ourselves in the digital age.

Put briefly, the total feedback loop signifies the emergence of a digital ego from the digital id, opening up a Heideggerian “clearing of being.” It marks our reaching a point where surfing becomes self-observation, where the gnothi seauton, for Carolus Linnaeus, the brand essence of Homo sapiens, senselessly and perhaps also irritatingly follows us everywhere, like a fly around the breakfast table, in the form of a selfie-mania—in a manner that has finally become entirely secular, that puts paid to any notion of mystical self-awakening. Commercial television has made us both more cynical and more ironic. The commercial Internet ensures that we learn to see ourselves with the all-seeing eye of the Leviathan-cum-panopticon we have ourselves created.

Just as TV per se hasn’t made literature, the arts, or the viewing public any worse than they used to be, so the digital panopticon also only seems to be a pure dystopia. In his early and unparalleled queer reading of it, the net-theory pioneer Howard Rheingold proposed the Benthamite panopticon as a desirable condition, so long as it wasn’t controlled by the state. According to Rheingold, the possibility of permanent surveillance would eventually discipline us, turning us into more ethical and considerate human beings. He was the only one of the early net theorists not to unreservedly share the laissez-faire libertarianism typical of this group, turning out instead to be a clear-sighted realist in the tradition of Hobbes. It was precisely for this reason that he could predict the success of platforms based on mutual observation and discipline and punishment by rating, such as Airbnb and eBay, before they even existed.

Just as television completed the project of Romanticism by popularizing universal irony as the foundation of art, the Internet is popularizing absolute concentration as permanent consciousness of one’s self in the form of a constant background noise of selfie-taking. The Être suprême, under whose all-pervading gaze humanity was to convalesce during the Enlightenment, has materialized in the digital monopolies’ apparatuses of consumption with precisely the horrifying banality that usually signifies real historical progress rather than a heroic, utopian anticipation of it. The first time as tragedy, the second time as farce. The accomplishment of digital monopolies is that they have developed a mechanism—which they have brought into every living room and even every bedroom, not to mention every pants pocket—that combines idleness with self-observation, and actually enables the former, in the form of a listless clicking onto links, to become a necessary condition for self-knowledge.

This permanent certainty of the possibility of a feedback effect is now acting almost as an antidote to the permanently distracted link-clicking that it emerged from. It is giving digital natives and subsequent generations an apparently asynchronous and often downright puritanical capacity for focusing and concentrating their attention, which is wholly unknown to earlier generations, and for which the term “normcore” is an apt description even as it downplays a neuroplastic mutation into a mere fashion trend.

It may be that anyone who begins almost all their relationships on the Internet, and who grows up with the awareness that any social faux pas they make, of whatever kind, could end up recorded in the web giants’ global chronicles, will outwardly display a permanent distractedness, also as a result of their everyday lives as “funemployed playbor”; but inwardly they have no choice but to maintain a permanent watchfulness, one that is wholly comparable with that of a deeply religious Quaker who believes himself to be constantly under the gaze of his censorious God.

Though the digital avant-garde, inspired as it was by Ayn Rand, may have been predominantly atheist, people’s brains in the age of digital monopolies display a structural sanctimoniousness. Fundamentalist Islam, which flourishes on social networks, and Hindu nationalism fueled by holographic avatars are only the clearest symptoms of this structurally serious fact, which lurks behind all the supposed hedonism of the age of the meme. The age of the Internet is the age of the big Other, which may be imagined as an idea of God, of the swarm, of one’s competitor in popularity on Instagram, as the penumbra of the shadow of the coming singularity, or as an impersonally sublime algorithm—just as it was already imagined in Lacan’s seminar on anamorphosis as the starting point of the gaze in the form of a cold-blooded, intrinsically mathematical, and above all utterly inhuman point of view: the big Other is never another actual human being. Except for this qualification, though, it can assume any form.

Just as in classical Freudian psychoanalysis, where patient and analyst never look each other in the eye, the specter of the big Other flourishes especially well on the Internet because it is never clear whether you are really being watched or whether you just imagine that you are—and the fact that you can never know is the whole point of the panopticon.

 

VI. RES EXTRA COMMERCIUM: CONCENTRATION IN THE STATE OF LEGAL EXCEPTION

 

Before we can meaningfully address the unsettled question of the autonomy of the individual as a nodal point of the network society, we need to examine a more easily identifiable concentration of nodal points, namely those of the economy and the state, and its effects on the autonomy of those domains. For, in a sense, concentration of the digital market seems to represent only the tip of the iceberg, in that it bolsters concentrations of power in a range of different fields:

 

·Edward Snowden’s revelations have shown that digital monopolies have strengthened the power of the secret services to a historically unprecedented degree, because they can be relatively easily controlled by individual governments, and especially the United States.

·The enormous concentration of financial markets over the last twenty years is also partly a result of digital interconnectedness, in that it was this interconnectedness that made high-speed trading possible, and this kind of trading was only accessible to, and therefore only offered an advantage to, very large investors.

·Similarly, digital communication made possible the offshoring of economic production on an enormous scale, in that it enables real-time coordination between the home country and the site of production. Since it requires substantial investment, offshoring is especially profitable for larger companies, allowing them to produce goods more cheaply than their smaller market competitors.

·Since the crises that began in 2008, the unprecedented economic concentration of our times has also led to concentration in more technocratic power centers such as the European Commission and the central banks, whose rescue and bailout policies show them to be increasingly acting in a realm of pure sovereignty beyond the law and democratic control.

·The Islamic State shines like a dark star over this new epoch marked by an agglomeration of powerful sovereign nodal points. Detaching itself from the structures of the ultra-postmodern rhizomatic network of al-Qaeda, it aims simply to assert itself as a sovereign state with its own social and financial system. Given the agglomeration of fanatical groups that has been made possible by digital networks, it merely represents an attempt, typical of its time, to cement them into a heavy tail of terror.

·United States military operations in AfPak offer an example of states using drones not as a way of waging war against another sovereign state, but rather as a form of policing (from πόλις, “city-state”) that operates below the threshold of actual war. It is an experiment carried out in airspace that is de jure controlled by other sovereign states, and is only a preparation for total police control over the constantly growing slums of global conurbations. The minimal human resources required to operate this control mechanism are indicative of how there are no longer any effective natural restrictions on the power of the elite, and economic inequality is allowed to increase without limit.

 

The fact that digitization has caused or bolstered this wide variety of shifts towards greater concentration and fewer but stronger sites of sovereign power clearly refutes those kinds of technological determinism that paint a picture of history in which the development of technological means of production forces political actors to decentralize, communicate, dissolve, and soften up hegemonic structures. As in Marx and Engels, the melting into air of everything solid is merely a narrative behind which power struggles become clearer and all the more entrenched. The reality of the long tail refutes all equations posited by technological determinism that follow the pattern of “more Internet = more equality,” “more bloggers = more democracy,” “more digitization = less power at the centers,” or “more Internet = more free enterprise.”

The form of technological determinism that characterizes a brand of subversive affirmation known as accelerationism is equally naïve. It is based on the assumption, taken from Marx though applied with apocalyptic coquetry, that there is a structural tendency within capitalism toward speed and progress, which merely needs to be directed along socially responsible lines. This may have been true twenty years ago. But not only Piketty’s formula r > g tells us otherwise. The fact is that today’s monopoly capitalism fundamentally operates counter to innovation and speed.

This is evident in Disney’s huge commitment to endlessly extending copyright, which effectively aims at preventing this corporation, one of the most important producers of global culture, from ever having to perform any creative work again, as well as ensuring it an income exclusively based on economic rents for the rest of eternity. Capitalism’s tendency to resist progress is also evident in the comparatively slow Internet speed prevailing in the virtually monopolized American ISP market, in phases of stock market volatility that are deliberately caused by speculators and which have a negative effect on the growth of the real economy, and in efforts by manufacturers to incorporate planned obsolescence into every one of their products, keeping their technical quality at ridiculously low levels in light of what would be possible—a development that began in 1924 when the Phoebus cartel deliberately reduced the life expectancy of its light bulbs, and continues with Monsanto’s patenting of genetically engineered types of grain which are unnaturally sterile. Monopoly capitalism doesn’t want to accelerate economic, technological, political, and biological development; it wants to freeze it.

Contrary to their current rhetoric, the digital monopolies will, like any powerful agents in the course of history, act to preserve the existing structures of power. The monopolization of the Internet therefore means that its development can already almost be regarded as over. Contrary to all the wild ideas often bandied about, we aren’t going to see any more “disruptive” quantum leaps in this field, but only developments of the kind that the monopolies can easily incorporate into their existing product range and use to consolidate their power. These days Google may announce one of its notorious “moonshot” projects to inflate its share price, but at the end of the day it will do everything it can to ensure that the world remains exactly the same as it was when Google became powerful. Nothing in this entrenched situation is going to change until the next industrial revolution.

The question remains: what consequences does concentration in the digital and non-digital economy, and in the domain of the state, have for the individual? It seems, although utterly counter-intuitive, at least on the vague and uncertain terrain of conclusions drawn by analogy, that human beings, who, collectively at least, also form a nodal point, perhaps the most fundamental one of all the digital networks, could actually have their sovereignty strengthened as unambiguously as the other nodal points discussed above.

The same market concentration that prompts the trite but correct warning about a black virus of digital totalitarianism maturing within it could thus also lead to a quite different outcome. Generally speaking, concentration in heavy-tail distributions, which are characterized more by local agglomerations than by uniform dispersal, poses the question of decision-making power in a new form. This much becomes clear when one considers that, though the extreme notion of the digital panopticon could indeed amount to totalitarian repression, it nevertheless differs from the repressive mechanisms of the past, in that the digital monopolies can only function as apparatuses of control by reacting to users’ decisions, and the users for their part use these apparatuses to develop some form of self-knowledge which, though distorted, also brings about concentration of some kind.
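
A toy illustration may help here; it is ours, not the essay’s, and every name and parameter in it is invented. The Python sketch below contrasts attention dispersed uniformly across sites with attention allocated by a simple “rich get richer” rule (preferential attachment), the standard generative mechanism behind heavy-tailed distributions:

```python
# A minimal, purely illustrative sketch: uniform dispersal of attention
# versus a "rich get richer" rule. All names and parameters are invented.

import random

def simulate_uniform(n_sites, n_clicks, seed=0):
    """Every click lands on a site chosen uniformly at random."""
    rng = random.Random(seed)
    counts = [0] * n_sites
    for _ in range(n_clicks):
        counts[rng.randrange(n_sites)] += 1
    return counts

def simulate_preferential(n_sites, n_clicks, seed=0):
    """Every click lands on a site with probability proportional to the
    attention it has already received (plus one seed ball per site)."""
    rng = random.Random(seed)
    counts = [0] * n_sites
    urn = list(range(n_sites))  # one "seed ball" per site
    for _ in range(n_clicks):
        site = urn[rng.randrange(len(urn))]  # draw a ball from the urn
        counts[site] += 1
        urn.append(site)  # the chosen site becomes more likely next time
    return counts

def top_share(counts, fraction=0.01):
    """Share of all clicks captured by the top `fraction` of sites."""
    top_k = max(1, int(len(counts) * fraction))
    return sum(sorted(counts, reverse=True)[:top_k]) / sum(counts)

uniform = simulate_uniform(1000, 100_000)
heavy = simulate_preferential(1000, 100_000)
print(f"top 1% share, uniform dispersal:       {top_share(uniform):.1%}")
print(f"top 1% share, preferential attachment: {top_share(heavy):.1%}")
```

On typical runs, the top one percent of sites captures roughly one percent of all clicks under the uniform rule, but several times that share under the preferential rule: this is the local agglomeration described above.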

Given the starting position we find ourselves in, it would be unlikely, though not inconceivable, that a heavy-tail concentration could form at the other end of the spectrum, that of the individual, and that its opposition to the digital monopolies might bring with it that leap in consciousness that not only Marx but also Heidegger in his late writings imagined as the sublation of reification in the liminal states of industrial society. If users’ concentration is really the dominant form of capital today, and if it is its accumulation in the hands of the digital monopolies that makes possible the concentration of power in all other domains, then this must also mean a shift in the balance of power away from the monopolies and institutions and toward the individual, a process that in principle has already happened, but whose consequences the individual has yet to act upon.

Should this take place, it would mean transcending both the arbitrary distinction between digital and neuronal networks and the distinction between the individual and the vast, and now materially existing, global mind, which for lack of a better word could be described as the “noosphere.” In this huge corpus mysticum that goes beyond any naturalist essentialism, post-humanity would constitute itself as an autonomous subject comprising both digital and neuronal networks, that is, as a single super-monopoly that determines all other monopolies. This act thus harbors the potential for a post-human species both to shape new neuroplastic patterns and to determine how they will be used, as consciously and deliberately as an individual concentrating on her own self.

Though they may sound trivial next to such an apparently fantastical vision, the first signs of such an autonomous mass concentration could already be visible in today’s increasingly important consumer boycotts, virtual sit-ins, and the interventions of Anonymous, which, in the light of a Marxist and Heideggerian historical teleology, appear as precursors of the active self-knowledge of an autopoietic posthumanity.

In a further development, individuals could recognize themselves as the true political sovereign of history, a sovereign who is merely represented in the power of the digital monopolies, and who controls them far more directly than they ever controlled the states of the past, whose power derived not so much from the individual as from military infrastructure.

As long as it is clear that it is the masses who control the digital monopolies, the latter’s continued existence becomes a question of secondary importance. Should concerns arise about the digital monopolies getting out of control or developing their own agenda, then individuals have no choice but to pursue concrete sovereignty, which proves itself in the state of exception outside of positive law. Here, too, it is possible to say that this state of exception has already become everyday reality, in that today almost everyone has stolen material on his computer, and convicting individuals for such “crimes” has long since become an exercise of arbitrary state power, causing the rule of law to shrink to a mere fiction.

The situation in which the concept of property now finds itself seems to have proved Marx’s prediction that it would become obsolete once the development of technical infrastructure had reached a certain stage. Indeed, something similar happened at the beginning of the last century, when the development of civil aviation profoundly changed the prevailing conception of the law, which had held that ownership of land included the air above it, an issue that led to heated debates at the time.

Today digitization has already hollowed out the concept of property on so many levels, especially with regard to software, book, and music piracy, that even conservative groups are asking to what extent the traditional notion of property is still viable. It has become such an impediment to the educational and progressive interests of our species that every visit an individual must still make to a physical library could be said to sacrifice science and progress on the altar of a fetishized concept of property; in these circumstances, its absurdity becomes comparable to changing a flight path because individual owners object to its passing over their land.

In the same way, it is not difficult to see the neuroplasticity that has been trustingly, not to say irresponsibly, outsourced to the digital monopolies as a common good which, rather like the human organs it actually is equivalent to, or even, from a posthuman point of view, is an instance of, has to be thought of as standing outside existing property relations; as a good that by its nature cannot be traded, that must be a res extra commercium, since it contains the vital essence of the species.

If this is right, the global neuroplasticity that manifests itself today in digital monopolies should be entrusted to the species, in a process that could involve transforming the entire noosphere into a single inalienable global digital fund, one in which every user would be a shareholder with full voting rights. Since this would be ownership of a nonexclusive resource, and since in the realm of swarm intelligence the quantity of use actually produces quality, there is no need to fear the tragedy of the commons; rather the opposite.

The issue of whether such an outcome can be justified, given the amount of labor and investment that the digital monopolies have put into their business, is irrelevant to those who have understood the ontological depths of the neuroplastic question. Since humanity’s neuroplasticity is the formlessly forming origin of all concepts of justice, it is impossible to set limits on its legitimacy using these same concepts. Rather, it is a sign of true concentration to be able to think beyond the limitations of such normative ideas. The relentless dichotomy of zeroes and ones in digital space means that the questions surrounding it are now dominated by a pure decisionism which, unlike parliamentary discourse, allows no cautious compromise.

Boris Groys
FICTION DEFICTIONALIZED

ART AND LITERATURE ON THE INTERNET

 

Over the past few years, the Internet has become the primary site for the production and distribution of writing, including literature, artistic practices and, more generally, cultural archives.

Obviously this shift is experienced by many cultural workers as liberating, because the Internet is not selective—at least much less selective than a museum or a traditional publishing house. Indeed, the question that traditionally troubled artists and writers was: What are the criteria of choice—why do some artworks get into the museum and other artworks not? Why do some texts get published and others not? We know the, so to speak, Catholic theories of selection according to which artworks deserve to be chosen by a museum or publishing house: They should be good, beautiful, inspiring, original, creative, powerful, expressive, historically relevant—one can cite hundreds of similar criteria. However, these theories collapsed historically because nobody could persuasively explain why one artwork was more beautiful, original, etc., than another. Or why a particular text was better written than another text. So other theories succeeded that were more Protestant, even Calvinist. According to these theories, artworks are chosen because they are chosen. The concept of a divine power that is perfectly sovereign and does not need any legitimization was transferred to the museum and other traditional cultural institutions. This Protestant theory of choice, which stresses the unconditional power of the chooser, is a precondition for institutional critique: The museums and other cultural institutions were criticized for how they used and abused their alleged power.

This kind of institutional critique does not make much sense in the case of the Internet, however. There are, of course, examples of political censorship of the Internet practiced by some states, but those are a different story. Another question arises instead: What happens to art and literary writing as a result of their emigration from traditional cultural institutions to the Internet?

Traditionally, literature and art were considered fields of fiction. I would argue that the use of the Internet as the main medium of production and distribution of art and literature leads to their defictionalization. Traditionally, institutions like a museum, a theater, or books presented fiction as fiction by means of self-dissimulation. Sitting in a theater, the viewer was supposed to reach a state of self-oblivion—and forget everything about the stage, everything about the space he or she was sitting in. This enabled the spectator to spiritually leave everyday reality—and to immerse him- or herself in the fictional world presented on the stage. One had to forget that the book was a material object like every other object to be able to truly follow and enjoy the literary narrative it contained. And one had to forget that one was inside the art museum to become spiritually absorbed in the contemplation of art. In other words, the precondition for the functioning of fiction as fiction is the dissimulation of the material, technological, and institutional framing that makes this functioning possible.

Since at least the beginning of the twentieth century, the art of the historical avant-garde has tried to thematize and to reveal the factual, material, nonfictional dimension of art. The avant-garde did so by thematizing the institutional and technological framing of art—by acting against this framing and thus making it visible, experienceable by the viewer, reader, visitor. Bertolt Brecht tried to destroy theatrical illusion. The Futurist and Constructivist art movements compared artists to industrial workers, to engineers who produce real things—even if these things can be interpreted as referring to a fiction. The same can be said about writing. At least since Mallarmé, Marinetti, and Zdanevich, the production of texts was understood as the production of things. Not accidentally, Heidegger understood art precisely as a struggle against the fictional. In his late writings, he speaks about technological and institutional framing (das Gestell) as being hidden behind the image of the world (Weltbild). The subject who contemplates the image of the world in an allegedly sovereign manner necessarily overlooks the framing of this image. Science also cannot reveal this framing because it depends on it. Heidegger believed that it was art alone that could reveal the hidden Gestell and demonstrate the fictional, illusionary character of our images of the world. Here Heidegger obviously had in mind the art of the avant-garde. However, the avant-garde has never fully succeeded in realizing Heidegger’s quest for the real because the reality of art—its material side that the avant-garde tried to reveal—was refictionalized, by being put under the standard conditions for art representation.

This is precisely what the Internet changed. Internet data is virtual but it is not fictional. The Internet functions under the presupposition of its nonfictionality, of its having a reference in reality offline. One speaks about the Internet as a medium of information, as a space of information flows—but information is always information about something. And this something is always placed outside the Internet, namely offline. Otherwise all economic operations on the Internet would become impossible. Or military operations. Or security surveillance. The Internet is, by definition, the place of truthfulness; being virtual, having virtue (virtus) means, among other things, being truthful. Of course, there is the possibility of creating a fiction—for example, a fictional user of the Internet. But in this case, the fiction becomes a fraud that can be—and even must be—revealed. (In the case of fictional identity, the film Catfish shows how the real story behind such an identity was revealed.)

But most importantly, on the Internet, art and literature do not get a fixed, institutional framing, as was the case in the analog world: here the factory, there the theater; here the stock market, there the museum. On the Internet, art and literature operate in the same space as military planning, the tourism economy, capital flows, etc.: Google shows, among other things, that there are no walls in Internet space. Of course, there are specialized websites and blogs for art. But accessing them means a user has to click them, to frame them on the surface of his or her computer, or iPad, or mobile phone. Thus, the framing becomes de-institutionalized and the framed fictionality becomes defictionalized. The user cannot overlook the frame because he or she has created this frame in the first place. The framing—and the operation of framing—becomes explicit and remains explicit throughout the experience of contemplating and writing. The dissimulation of the framing that defined our experience of the fictional throughout centuries here finds its end. Art and literature can still refer to fiction and not to reality. As users, however, we do not immerse ourselves in this fiction; we do not travel, like Alice, through the looking-glass; but, rather, we perceive art production as a real process, and the artwork as a real thing. One could say that on the Internet there is no art or literature but only information about art and literature—alongside other information about other fields of human activity. For example, literary texts or artworks by a particular author can be found on the Internet when I google his or her name, and they are shown to me in the context of all the other information that I find about this author: biography, other works, political activities, critical reviews, and details of his or her personal life. Here the “fictional” text becomes integrated into information about its author as a real person. Through the Internet, the avant-garde impulse that has driven art and writing since the beginning of the twentieth century finds its realization, its telos. Art is presented on the Internet as a specific kind of reality: as a working process or even life-process taking place in the real, offline world. This does not mean that aesthetic criteria do not play a role in the presentation of data on the Internet. In this case, however, it has to do not with art but with data design—with the aesthetic presentation of documentation of real art events, and not with the production of fiction.

The word “documentation” here is crucial. Over the course of recent decades, the documentation of art has been increasingly included in exhibitions and museums alongside traditional artworks. But this proximity has always seemed highly problematic. The artworks are art; they immediately demonstrate themselves as art. Hence they can be admired, emotionally experienced, etc. And such artworks are fictional; they cannot be used as evidence in a court of law; they do not guarantee the truth of what they represent. That is the role of documentary photography. (Sure, one could use a painting as a document in the absence of photography; one could fall in love with a beautiful lady by looking at her painted image.) But art documentation is not fictional: It refers to an art event, or exhibition, or installation, or project which we assume has really taken place. Art documentation refers to art but it is not art. That is why art documentation can be reformatted, rewritten, extended, or abridged. One can subject art documentation to all these operations that are forbidden in the case of an artwork, because these operations change the form of the artwork. And the form of the artwork is institutionally guaranteed because only the form guarantees the reproducibility and identity of the fiction that this artwork is. Documentation, on the contrary, can be changed at will because its reproducibility and identity are guaranteed by its “real” external referent and not by its own form. But even if the emergence of art documentation precedes the emergence of the Internet as an art medium, only the introduction of the Internet has given art documentation its legitimate place (as Benjamin noted regarding montage in art and cinema).

Meanwhile, cultural institutions themselves began to use the Internet as their primary space for self-representation. Museums put their collections on display on the Internet. And, of course, virtual depositories of art images are much more compact and much cheaper to maintain than traditional art museums. Thus, museums gain the opportunity to present the parts of their collections that are usually kept in storage. The same can be said about the publishing houses that continually expand the electronic component of their publication programs. And it can also be said about the websites of individual artists, where one can find the fullest representation of what they are doing. During studio visits, artists now usually simply put a laptop on the table and show documentation of their activities, including the production of the artworks, participation in long-term projects, temporary installations, urban interventions, political actions, etc. The Internet makes it possible for the author to make his or her art accessible to almost everyone all around the world—and, at the same time, to create a personal archive of his or her own art.

The Internet thus leads to the globalization of the author, of the person of the author. Here, I again do not mean the fictional, authorial subject allegedly investing the artwork with intention and meaning that should be hermeneutically deciphered and revealed. This authorial subject has already been deconstructed and proclaimed dead many times. I mean a real person existing in offline reality to which Internet data refers. This author uses the Internet not only to write novels or to produce art, but also to buy tickets, make restaurant reservations, conduct business, etc. All of these activities take place in the same integrated space of the Internet—and all of them are potentially accessible to other Internet users.

Of course, the authors, like other individuals and organizations, try to escape this total visibility by creating sophisticated systems of passwords and data protection. Today, subjectivity is a technical construction: The contemporary subject is defined as the owner of a set of passwords that he or she knows and that other people do not know. The contemporary subject is primarily a keeper of secrets. In a certain way it is a very traditional definition of the subject: The subject was always defined as knowing something about him or herself that maybe only God knows but other people cannot know because they are ontologically prevented from “reading another’s thoughts.” However, today, we do not have to deal with ontologically protected secrets but, rather, with technically protected ones. The Internet is a place in which the subject is originally constituted as a transparent, observable subject, who only afterward begins to be technically protected to conceal the originally revealed secret. Every technical protection can be broken, however. Today, the hermeneuticist has become a hacker. The contemporary Internet is a place of cyber warfare in which the secret is the prize. To know the secret means to put the subject that is constituted by this secret under control; cyber wars are wars of subjectivization and desubjectivization. But these wars can take place only because the Internet is originally a place of transparency and referentiality.

Nevertheless, so-called content providers often complain that their artistic production drowns in the sea of data that circulates through the Internet. Indeed, the Internet functions as a huge garbage can in which everything disappears rather than emerges, never granting the degree of public attention that one hopes to achieve. Ultimately, everyone searches the Internet for information about what has happened to their own friends and acquaintances. One follows certain blogs, information sites, e-magazines, websites—and ignores everything else. So the standard trajectory of a contemporary author is not from the local to the global, but from the global to the local. Traditionally, the career of an author—be it writer or artist—moved from the local to the global. One had to become known locally to be able to establish oneself globally later. Today, one starts with self-globalization. To put one’s own text or artwork on the Internet means to directly address the global audience, avoiding any local mediation. Here the personal becomes global—and the global becomes personal. At the same time, the Internet offers the opportunity to quantify the global success of an author because the Internet is a huge machine for equalizing readers and readings. It works according to the rule that one reading equals one click. However, to be able to survive in contemporary culture one has to draw the attention of local, offline audiences to one’s own global exposure—to become not only globally present but also locally familiar.

Here, a more general question arises: Who is the reader, or who is the spectator, of the Internet? It cannot be a human being, because a human being’s gaze does not have the capacity to grasp the whole of the Internet. But nor should it be a god, because the divine gaze is infinite, and the Internet is finite. Often enough we think about the Internet in terms of infinite data flows that transcend the limits of individual control. But, in fact, the Internet is not a place of data flows—it is a machine to stop and reverse data flows. The unobservability of the Internet is a myth. The medium of the Internet is electricity. And the supply of electricity is finite. So the Internet cannot support infinite data flows. The Internet is based on a fixed number of cables, terminals, computers, mobile phones, and other units of equipment. The efficiency of the Internet is based precisely on its finite nature and, therefore, on its observability. This is demonstrated by search engines such as Google. Today one often hears about the increasing extent of surveillance, especially through the Internet, but surveillance is not external to the Internet; nor is it some specific, technical use of the Internet. The Internet is, in essence, a machine of surveillance. It divides the flow of data into small, traceable, and reversible operations and, thus, exposes every user to surveillance—real or possible. The Internet creates a field of total visibility, accessibility, and transparency. And it allows the behavior of all Internet users to be tracked. The gaze that reads the Internet is the algorithmic gaze. And, at least potentially, this algorithmic gaze can see and read everything that has been put on the Internet.

But what does this original transparency mean for artists? It seems to me that the real problem is not the Internet as a place for the distribution and exhibition of art but the Internet as a place of work. Under the traditional, institutional regime, art was produced in one place—the atelier of an artist, the room of a writer—and shown in another: the museum, or a published book. The emergence of the Internet has erased this difference between the production and the exhibition of art. The process of art production insofar as it involves the use of the Internet is always already exposed from its beginning to its end. Earlier, only industrial workers operated under the gaze of the others—under the permanent control that was so eloquently described by Michel Foucault. Writers and artists worked in seclusion—beyond panoptic public control. However, if the so-called creative worker uses the Internet, he or she is subject to the same or an even greater degree of surveillance than the Foucauldian worker.

The results of surveillance are sold by the corporations that control the Internet because they own the means of production, the material-technical foundation of the Internet. One should not forget that the Internet is privately owned. And profit comes mostly from targeted advertisements. Here, an interesting phenomenon occurs: the monetization of hermeneutics. A classical hermeneutics that sought the author behind the work was criticized by theoreticians of structuralism and close reading, who thought that it makes no sense to chase ontological secrets that are inaccessible by definition. Today, this old, traditional hermeneutics is reborn as a means of additional economic exploitation of the subjects operating on the Internet in which all secrets are originally revealed. The subject is here no longer concealed behind his or her work. The surplus value that such a subject produces and that is appropriated by Internet corporations is the hermeneutic value: The subject not only does something on the Internet but also reveals him- or herself as a human being with certain interests, desires, and needs. The monetization of classical hermeneutics is one of the most interesting processes that we have been confronted with in the course of the last decades.

At first glance, it seems that for artists this permanent exposure has more positive than negative aspects. The resynchronization of art production and art exposure through the Internet seems to make things better, not worse. Indeed, this resynchronization means that, as an artist, one does not need to produce any final product, any artwork. Here, the documentation of the process of artmaking is already an artwork. Art production, presentation, and distribution coincide. The artist becomes a blogger. Almost everyone in the contemporary art world acts as a blogger: individual artists, but also art institutions and in fact even museums. Ai Weiwei is paradigmatic in this respect. Balzac’s artist, who could never present his masterpiece, would have no problem under these new conditions: Documentation of his efforts to create a masterpiece would already be his masterpiece. Thus, the Internet functions more like the Church than the museum. When Nietzsche wrote his famous “God is dead,” he continued: we have lost the spectator. The emergence of the Internet means the return of the universal spectator. So it seems that we are back in paradise and, like saints, do the immaterial work of pure existence under the divine gaze. In fact, the life of saints can be described as a blog that is read by God and remains uninterrupted even by the saint’s death. So why do we need any secrets anymore? Why would we reject radical transparency? The answer to these questions depends on the answer to a more fundamental question concerning the Internet: Does the Internet effectuate the return of God or of the malin génie, with its evil eye?

I would suggest that the Internet is not paradise but, rather, hell—or, if you like, paradise and hell at the same time. Jean-Paul Sartre has already said that hell is other people—life under the gaze of others. (And Jacques Lacan said later that the eye of the other is always an evil eye.) Sartre argued that the gaze of others “objectifies” us and, in this way, negates the possibilities of change that define our subjectivity. Sartre defined human subjectivity as a “project” directed toward the future; this project is an ontologically guaranteed secret because it can never be revealed in the here and now, but only in the future. In other words, Sartre understood human subjects as struggling against the identity that was given to them by society. This explains why he interpreted the gaze of others as hell: In the gaze of the other, we see that we have lost the battle and remained prisoners of our socially codified identity.

Hence, we try to avoid the gaze of the other for a while in order to reveal our “true self” after a certain period of seclusion—to reappear in public in a new shape, in a new form. This state of temporary absence is constitutive for what we call the creative process—in fact, it precisely is what we call the creative process. André Breton tells a story about a French poet who, when he went to sleep, put a notice on his door saying, “Please be quiet—the poet is working.” This anecdote summarizes a traditional understanding of creative work: It is creative because it takes place beyond public control, and even beyond the conscious control of the author. This time of absence could last days, months, years—even a whole life. Only at the end of this period of absence was the author expected to present a work (maybe found in his or her papers posthumously) that would then be accepted as creative precisely because it seemed to emerge, as it were, out of nothingness. In other words, creative work is the work that presupposes the desynchronization of the time of work from the time at which its results are exposed. Creative work is practiced in a parallel time of seclusion, in secrecy, so that there is an effect of surprise when this parallel time gets resynchronized with the time of the audience. That is why art practitioners traditionally wanted to be concealed, to become invisible. The reason they wanted to keep away from the gaze of others is not that artists have committed crimes or have dirty secrets to conceal. The gaze of the others is experienced as an evil eye not when it wants to penetrate our secrets and make them transparent (such a penetrating gaze is rather flattering and exciting)—but when it denies that we have any secrets, when it reduces us to what it sees and registers. In this sense, one can also suffer under the algorithmic gaze—even if the algorithmic gaze, unlike the human or divine gaze, does not judge us.

Of course, today we discuss the Internet as we know it. But I expect that the present state of the Internet will be radically changed by the coming cyber wars. These cyber wars have already been announced, and they will destroy or at least seriously damage the Internet as a means of communication and as the dominant marketplace. The contemporary world looks very much like the nineteenth-century world. This was a world defined by the politics of open markets, growing capitalism, celebrity culture, the return of religion, terrorism, and counterterrorism. World War I destroyed this world, making the politics of open markets impossible. In the end, the geopolitical and military interests of individual nation-states showed themselves to be much more powerful than their economic interests. A long period of wars and revolutions followed. Let us see what awaits us in the near future.

But I would like to close with a more general consideration of the relationship between the archive and utopia. As I have tried to show, the utopian impulse has always had to do with the desire of the subject to break out of its own historically defined identity, to leave its place in a historical taxonomy. In a certain way, the archive gives the subject the hope of surviving its own contemporaneity and revealing its true self in the future, because the archive promises to sustain and make accessible this subject’s texts or artworks after his or her death. This utopian or, at least, heterotopian—to use Foucault’s word—promise that the archive gives to the subject is crucial to the subject’s ability to develop a distance from, and critical attitude toward, its own time and its own immediate audience.

Archives are often interpreted merely as a means of conserving the past, of presenting the past in the present. But in fact archives are, at the same time, and even primarily, the machines by which the present is transported into the future. Artists do their work not only for their own time but also for their archives—and that means for a future in which their work remains present. This produces a difference between politics and art. Artists and politicians share a common here and now of public space and they both want to shape the future; this is what unites art and politics. But politics and art shape the future in different ways. Politics understands the future as the result of its actions, which take place in the here and now. Political action has to be efficient, to bring results, to transform social life. In other words, political practice shapes the future, but it disappears in and through this future: It becomes totally absorbed by its own results and consequences. The goal of politics is to become obsolete, and to make space for the politics of the future.

But artists work not only inside the public space of their time but also for the heterogeneous space of art archives where their works are placed among those of the past and the future. Art, as it functioned in modernity and still functions in our time, does not disappear after its work is done. Rather, the artwork remains present in the future. And it is precisely this anticipated future presence of art that guarantees its influence on the future, its chance to shape the future. Politics shapes the future with its own disappearance. Art shapes the future with its own prolonged presence. This gap between art and politics was demonstrated often enough throughout the tragic history of the relationship between leftist art and leftist politics in the twentieth century.

It’s true that our archives are structured historically, and our use of these archives is still defined by the nineteenth century’s tradition of historicism. Thus, we tend to posthumously reinscribe artists into historical contexts from which they actually wanted to escape. In this sense, art collections that preceded the historicism of the nineteenth century—the collections that wanted to be collections of examples of pure beauty, for example—seem naïve only at first glance. In fact, they are more faithful to the original utopian impulse than their more sophisticated historicist counterparts. Now it seems to me that we are becoming more and more interested in a nonhistoricist approach to our past; more interested in decontextualisation and the reenactment of individual phenomena of the past than in their historical recontextualisation; more interested in the utopian aspirations that lead artists out of their historical contexts than in those contexts themselves. Maybe the most interesting aspect of the Internet as an archive is precisely the possibilities for decontextualisation and recontextualisation it offers its users through cut-and-paste operations. In a certain way, the Internet, and especially Google, is the fulfillment of the program to liberate words that Marinetti famously proclaimed at the beginning of the twentieth century. Asking for specific words and word combinations using Google, the user is able to create his or her own contexts, breaking through historically established narratives and discourses.

And this, to me, seems like a good development, because it strengthens the utopian potential of the archive and works against the risk of its betrayal, which is inherent to any archive, in whatever way it is structured.

Kenneth Goldsmith @kg_ubu, 28 October 2014
YES, #TWEETING IS REAL #WRITING.

Uncreative Writing @UncreativeWriti, 11 September 2012

Hunter S. Thompson retyped Hemingway & Fitzgerald novels. He said, “I just want to know what it feels like to write these words.”

 

Uncreative Writing @UncreativeWriti, 11 September 2012

Jonathan Franzen is America’s greatest novelist… of the 1950s.

 

Uncreative Writing @UncreativeWriti, 18 September 2012

The New Literature: Words now function less for people than for expediting the interaction and concatenation of other machines.

 

Uncreative Writing @UncreativeWriti, 22 September 2012

Alphanumeric code, indistinguishable from writing, is the medium by which the internet has solidified its grip on literature.

 

Uncreative Writing @UncreativeWriti, 28 September 2012

The future of reading is not reading.

 

Uncreative Writing @UncreativeWriti, 28 September 2012

The future of writing is pointing.

 

Uncreative Writing @UncreativeWriti, 2 October 2012

Contemporary writing is the evacuation of content.

 

Uncreative Writing @UncreativeWriti, 3 October 2012

Individual creativity is a dogma of contemporary soft capitalism, rather than the domain of non-conformist artists: fiction is everywhere.

 

Uncreative Writing @UncreativeWriti, 10 October 2012

An article in China Daily refers to a young worker who copied a dozen novels, signed his name, and published a collection of “his works.”

 

Uncreative Writing @UncreativeWriti, 10 October 2012

Short attention span is the new Silence.

 

Kenneth Goldsmith @kg_ubu, 2 December 2012

I used to be an artist; then I became a poet; then a writer. Now when asked, I simply refer to myself as a word processor.

 

Uncreative Writing @UncreativeWriti, 3 December 2012

Writing should be as effortless as washing the dishes -- and as interesting.

 

Uncreative Writing @UncreativeWriti, 12 December 2012

There’s nothing that cannot be called “writing” no matter how much it might not look like “writing.”

 

Uncreative Writing @UncreativeWriti, 29 December 2012

Creativity’s not creative.

 

Uncreative Writing @UncreativeWriti, 4 January 2013

‘I’ve finally decided what I need to do is make work that’s insincere.’ -- Marcel Broodthaers

 

Uncreative Writing @UncreativeWriti, 15 January 2013

From producer to reproducer.

 

Uncreative Writing @UncreativeWriti, 29 January 2013

Appropriation is literary communism.

 

Uncreative Writing @UncreativeWriti, 30 January 2013

Passion leads you astray.

 

Uncreative Writing @UncreativeWriti, 11 February 2013

Creativity is a jail.

 

Uncreative Writing @UncreativeWriti, 16 February 2013

‘Word processors’ can replace writers. -- Vilem Flusser, 1983

 

Uncreative Writing @UncreativeWriti, 25 February 2013

The avant-garde is not popular because it is democratic.

 

Kenneth Goldsmith @kg_ubu, 24 March 2013

The new memoir is our browser history.

 

Kenneth Goldsmith @kg_ubu, 5 July 2013

A child could do what I do, but wouldn’t dare to for fear of being called stupid.

 

Kenneth Goldsmith @kg_ubu, 12 July 2013

“The telling of a true story is an unnatural act.” -- @RichardPrince4

 

Uncreative Writing @UncreativeWriti, 16 August 2013

Today’s writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, executing & maintaining a writing machine.

 

Uncreative Writing @UncreativeWriti, 25 August 2013

Words are not written to be read but rather to be shared, moved and manipulated.

 

Uncreative Writing @UncreativeWriti, 30 September 2013

Imagine the writer as meme machine.

 

Kenneth Goldsmith @kg_ubu, 13 December 2013

Hunter S. Thompson retyped Hemingway & Fitzgerald novels. He said, “I just want to know what it feels like to write these words.”

 

Kenneth Goldsmith @kg_ubu, 14 December 2013

Writing should be as effortless as washing the dishes -- and as interesting.

 

Uncreative Writing @UncreativeWriti, 14 December 2013

Individual creativity is a dogma of contemporary soft capitalism, rather than the domain of non-conformist artists: fiction is everywhere.

 

Kenneth Goldsmith @kg_ubu, 16 December 2013

If you’re not making art with the intention of having it copied, you’re not really making art for the twenty-first century.

 

Uncreative Writing @UncreativeWriti, 22 December 2013

Writers who function more like programmers than traditional writers.

 

Uncreative Writing @UncreativeWriti, 2 January 2014

An ethos in which the construction or conception of a text is as important as what the text says or does.

 

Kenneth Goldsmith @kg_ubu, 15 February 2014

Short attention span is the new avant-garde.

 

Kenneth Goldsmith @kg_ubu, 18 February 2014

The internet is the greatest poem ever written, unreadable mostly because of its size.

 

Kenneth Goldsmith @kg_ubu, 24 February 2014

The human entity formerly known as “the reader.”

 

Kenneth Goldsmith @kg_ubu, 25 February 2014

The new reading is not reading.

 

Kenneth Goldsmith @kg_ubu, 28 August 2014

Authenticity is another form of artifice.

 

Kenneth Goldsmith @kg_ubu, 21 September 2014

The internet is destroying literature (and it’s a good thing).

 

Kenneth Goldsmith @kg_ubu, 2 October 2014

Citation is the new close reading. #Infrathin

 

Kenneth Goldsmith @kg_ubu, 26 October 2014

From producer to reproducer.

 

Kenneth Goldsmith @kg_ubu, 27 October 2014

Distracted electronic multitasking is the new surrealism.

 

Kenneth Goldsmith @kg_ubu, 27 October 2014

#Distraction is the new #concentration.

 

Kenneth Goldsmith @kg_ubu, 3 November 2014

The future of #writing is #pointing.

 

Kenneth Goldsmith @kg_ubu, 9 November 2014

Content no longer matters. The way in which we distribute ideas is more important than the ideas themselves. Citation trumps creation.

 

Kenneth Goldsmith @kg_ubu, 20 November 2014

In the digital age, the current avant-garde is the mainstream. We are all poets now.

Ingeborg Harms
BLACKBERRIES

It started when my mother showed me round her garden. We went into its furthest corner, where there were red and yellow plum trees growing, overshadowed by tall pine trees. I had all but forgotten about them. She insisted that we go through a gate I’d never used before to the empty lot next door. The topsoil there was firm and overgrown with grass. Crossing it was like walking on stones. I noticed a plant with a red stalk and red leaves. Several such growths were sprouting out of the hard ground. On the Internet I learned that it was a purple-leafed plum. The next day I came back to dig it up. Some of the stalks grew out of a thick stem running crosswise under the ground that I’d initially taken for an Ethernet cable. I got my mother to help and she pulled up the root with a single heave. We cut it into three pieces, each of which had a red shoot growing out of it, and put them in water to allow the roots to grow. I dug around another young shoot and levered it out of the surrounding soil. We put it in a pot half-filled with rich earth, filled the pot to the top and gave it lots of water. During the day the plant’s leaves drooped to save strength, but at night it recovered. On our tour I’d also noticed the ripe berries on the blackberry bushes that grew on the dead wood piled along the fence under the pine trees. But I did not realize how much of an art there was in picking them. My shoes sank into the dry pine needles, my bare legs were burning from nettle stings, and I ended up with scratches and splinters of thorns.

And yet it was impossible to go back and get changed. There was always a berry I had overlooked. Many of them were small, stunted, or shriveled, or they fell apart when I grasped them. Their juice ran over my hands, it soaked into my fingers and nails like ink into a blotter. I didn’t eat a single one; I didn’t even know what they tasted like. Picking them was enough. In the afternoon I went out again, this time to the other side of the fence. I’d dressed more sensibly this time, in an anorak, boots, and denim trousers. But the branches still clutched at my head, pulled at my hair, clung to my nylon jacket and caught on my trousers. I spent half my time fending off thorns. I held them back by bending my elbows, or by taking high steps to trample down as many of the branches as I could. Again and again I would scratch the back of my hand when reaching for a tempting berry hanging deep in the bushes. It never occurred to me to give up on trying to pick one. I proceeded automatically, like a bee buzzing from flower to flower. As soon as I’d fought my way to a new footing, I was determined not to miss a single fruit. It was only after I’d scoured all the young bushes on the empty lot for their first berries that I felt able to stop.

Then my mother pointed out the brambles at the side of the house opposite. They ran past the neighbor’s yard, up the Postberg, an asphalt road once used by mail coaches as they descended into the valley of the Elbe, sounding their horns to warn the ferries moored at the riverbank to wait for them. It wasn’t entirely clear whether these brambles belonged to the neighbor or were common property. I picked them as silently as I could, so as not to wake the collies that the neighbor bred. The blackberries grew more abundantly here; the bush stood in the sun all day long, and its fruit were plump and hung in clusters. But the hedge was also treacherous, its branches interwoven into a matted tangle. To make any progress, you had to disentangle them like in a game of pick-up sticks. When the branches sprang back I had to step on them to stop them scratching me. This, finally, alerted the dogs. But the hedge was so wide and thick that they couldn’t have got through. The thicket muffled the sounds, and also hid me from sight. When I’d picked as many as I could reach, I noticed more blackberries growing on the other side of the road. In the end, I gave up and rode off on my bike to cross the river and get an ice cream.

On the eastern side of the Elbe, I saw more brambles by the side of the path. But I had nothing to collect them with. The next day I furnished myself with several containers. I rode to the ice cream parlor the other way around, passing a wood first and using a different ferry to get to the other side of the stream. As I came through the wood I spotted blackberries growing under the trees close to the road: tender, low-lying plants with plump fruit. In the end their yield was small, though I spent a long time on it. Continuing on my bike, I saw taller brambles, deeper in the woods, rounded like megalithic barrows, but my fear of snakes prevented me from approaching them.

For my next visit to the country I wore pants and sneakers. The blackberries in the garden had ripened again after just a week. I stocked up on plastic containers for the bike ride. Before reaching the wood I discovered blackberry bushes by the side of the road, growing in a ditch. I filled all the containers with fruit from just one of the bushes. I adapted to the difficult terrain, to the overgrown ditches that I had to jump over, to the tall weeds that needed to be trampled down before I could get to the bushes. Even in the garden, it had seemed as if the blackberry hedges were coquettishly fending me off. It occurred to me that there was something erotic about picking fruit. Every time I pulled a berry from a stalk, I was touching the bush in an intimate way, I was making contact, was communicating with it. Evidently there were more forms of sexuality than science knew of. When I set the containers down in my mother’s kitchen, the blackberries turned out to be covered with frantically crawling tiny creatures. I put the berries in a sieve on the terrace to let them make their getaway. And yet, why should they escape, since the blackberries were their home?

At night, while drifting into sleep, I started seeing blackberries. I was inside their realm. Magnificent clusters of violet berries chimed like bells to guide my way; there were no more thorns, only this deep domain in which they swayed quietly like pearl earrings. The more blackberries I picked in this dreamlike state, the more came into view, and the more deeply their domain unfolded. I clutched at them like a robber in Ali Baba’s cave. As I did, they were transfigured into pieces of jewelry: no longer perishable fruit, but works of art. Only once before, when flocks of swallows had darted around me on my bicycle, did nature follow me into sleep.

The next morning the day was overcast. I set off on a short bike ride, taking the containers with me. When falling asleep the night before, I’d thought of a bush by the side of the road just before the wood which I hadn’t yet inspected. Since I had forgotten to put on contact lenses instead of reading glasses, the bush, at first, seemed barren to me. I was about to turn away, but took a few more steps and noticed the first berries. After that, I was sucked in, to reemerge only hours later. More and more blackberries came into view. It was like in the dream. By now, I understood how to avoid the thorns. My hand passed through the tangle of briars without catching on anything. My fear of snakes had disappeared. I worked my way further and further into the dark recesses of the hedge, and in every stage of its depths there were more clusters of berries. I had never encountered such a vast bounty. I began to tread very carefully, since the fruit also grew on the ground, almost hidden in the grass. The whole hedge was stuffed with stalks of dried grass. This naturally occurring hay made the bush look like a strawberry patch. At some point, I finally gave up on exhausting this vast bounty. The blackberries had won.

While I was standing there motionless like a hunter, with one foot pressed on some branches and one arm outstretched to fend them off, my eyes adjusting to the darkness of the thicket, cars, buses, and trucks were racing by. I had the surreal feeling of existing on two radically different temporal planes at once. I was closer to a butterfly or the small-celled organisms balancing on the clusters of berries than I was to those unsettled human beings, rushing from place to place. I had left behind the world of horsepower, my senses followed a different rhythm, my needs were dictated by this bush. I had entered its universe; it called to me before I fell asleep and I found myself under its protection when I woke up. It no longer hurt me; after all, there was no one else it could have chosen in my stead.

We made jelly out of the fruit. We brewed it up with an equal amount of water, let it drip through a linen cloth, boiled the juice with preserving sugar, and poured the jelly into jars, filling them to the brim. Everything was contained in its taste. The jelly was an elixir. For days I was pulling thorns out of my hands and arms; they didn’t really hurt, it was more like a game of skill. I was still carrying some of the thorns in my flesh the next spring, by which time municipal maintenance workers had torn up all the blackberry bushes in the ditches. I took one of the jars of jelly to a birthday party in Berlin. “You simply have to talk to Till,” said the host: “there’s a woman in Alsace who boils jelly without using any preserving sugar. It takes three days, but it works.” I felt ashamed of my blackberries and would have liked to take the jar back home with me.

Alexander Tarakhovsky
SIGNS OF CONCENTRATION

The most attentive things in the world are transcription factors, or TFs. TFs are proteins that cross the barrier between the cell protoplasm and the nucleus to control the activity of various genes. To do this successfully, a TF must land precisely on a specific stretch of DNA that determines gene activity. The landing strips differ significantly between genes, and there are thousands of TFs controlling gene activity at the same time. A precise landing requires a TF to focus its attention on a few letters of the DNA alphabet. TFs are essential for life on the planet, and many of us die because TFs stop working properly.
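
A schematic aside, not from the text: the exactitude of this landing can be caricatured in code as the search for a short motif in a long string. The sequence and motif below are invented, and real TF binding is probabilistic, described by weight matrices rather than exact matches.

```python
# Toy model of a transcription factor's "attention": scanning DNA for the
# few letters of its landing strip. Sequence and motif are invented.

DNA = "ATGCGTATAAAGGCTTACGTATAAAATCCG"
MOTIF = "TATAAA"  # a hypothetical six-letter landing strip

def binding_sites(sequence, motif):
    """Return every position where the motif occurs exactly."""
    return [i for i in range(len(sequence) - len(motif) + 1)
            if sequence[i:i + len(motif)] == motif]

print(binding_sites(DNA, MOTIF))  # -> [5, 19]
```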

 

As Ryan started to move his body in a tangential trajectory toward the space between the multi-dimensional fold of Lynn Mendelmann’s anemic pelvis and the exponential enlargement of her pubic bone, his attention was focused exclusively on an inner bremse, a brake, that his therapist had told him to place firmly inside his penis to prevent premature ejaculation. The bremse was supposed to have the shape of a solid hard olive, which Ryan would force to swell in step with his arousal to block the flow of seminal fluid and to spare him from witnessing the futile precipitation of his raw DNA outside the frustrated ovaries of Lynn. Attention to detail and focus on the bremse were supposed to give Ryan the self-confidence that was essential to his career.

 

A tiny chameleon was limping through the folds of the painted ceiling. Lying in bed, as was appropriate for a man in his circumstances, Tarum, who had lost his job as an accountant with Goldman Sachs two weeks earlier, was watching the slow change of the chameleon’s skin as it camouflaged itself against the background with the precision of a Samsung copy machine. The chameleon’s dotted attention to details was enjoyable until Tarum stood up and, with all the power of his Equinox arms, threw an attentively folded newspaper at the spot where the ceiling and the animal shared a split identity. A tiny drop of greenish liquid from the squashed chameleon immediately became part of the well-decorated wall, adding to the playfulness of smartly placed wallpaper.

 

What does it mean to be precise?

Just to know the right time to open a sealed order.

 

It took Or a second to realize that he was a very dull and untalented sharp-shooter. The man was running away, his limbs already tacked into smart exoskeletal jabs or knabs, whatever the name of the brand was. Or couldn’t hold his concentration on the trigger because the escaping dirtbag was using a self-made remote debrator that relieves concentration in proportion to the concentration being applied. It would have been a sure shot if Or had been able to take a single bite of the weed marshmallow that was left from the soldiers’ gathering earlier in the day. The shot would have been intuitive and not concentrated, but that is the highest skill of killing. “Shooting post-human is thinking post-human” was a slogan the app-ridden idiot sergeant from Belorussia was always whispering into the sharp-shooters’ ears. “Fuck you, Mr Sergeant, and fuck you, post-humans! Except, perhaps, Dora with herhis rubbery curves and jittery motion, softened by myelin degeneration around the spinal cord.”

 

Mr McGuire: I want to say one word to you. Just one word.

Benjamin: Yes, sir.

Mr McGuire: Are you listening?

Benjamin: Yes, I am.

Mr McGuire: Plastics.

Benjamin: Exactly how do you mean?

Mr McGuire: There’s a great future in plastics. Think about it. Will you think about it?

 

Myelinated robots. Sounds like child mauled by dog. What was the story about the Borscht Belt?

“A bear [very concentrated] snatched a Brooklyn infant [not concentrated] from her stroller at a Catskills vacation spot and mauled her to death [both concentrated] yesterday, officials said. Horrified onlookers [not concentrated], including the girl’s father [concentrated], chased the animal [concentrated] away before a police officer [concentrated] shot it dead. The bear attack on 5-month-old E. came as her mother, R., rushed her two other children—a son, J., 4, and daughter C., 2 [autistic]—to safety in their cottage at the Machne Ohel Feige colony in Woodridge, 70 miles northwest of the city.”

Now I will randomly erase the subsequent text and see whether it makes sense anyway. I just need to concentrate like an egg that holds within it the image of a bird in a Magritte painting.

 

“The baby had been sleeping in a stroller in front of the cottage when the rogue black bear suddenly came out of the woods and took her in its jaws. The child’s father was injured as he and friends—waving IIIIIIIIIIIIIIIII and yelling—chased the bear into the IIIIIIIIIIIIIIIII the little girl. But it was too late. Officials said the baby, whose head and neck had been mauled in the bear’s mouth, was pronounced dead on IIIIIIIIIIIIIIIII Regional Hospital. IIIIIIIIIIIIIIIII had traveled to the Sullivan County vacation spot—a place frequented by New York Hasidic Jews—from their home in Williamsburg. State IIIIIIIIIIIIIIIII said it was believed to be the first IIIIIIIIIIIIIIIII—the only type native to the state—IIIIIIIIIIIIIIIII anyone in New York.

“Fallsburg Police Chief Brent Lawrence said the bear, a young 150-pounder, unexpectedly wandered IIIIIIIIIIIIIIIII about 55 families, which was IIIIIIIIIIIIIIIII off a panic. ‘The baby’s mother heard people IIIIIIIIIIIIIIIII Bear! Bear!’ and ran her children into the IIIIIIIIIIIIIIIII said. ‘When IIIIIIIIIIIIIIIII baby was gone.’

“Isaac Abraham, a spokesman for the Williamsburg Satmar Hasidic community, said, ‘IIIIIIIIIIIIIIIII the child out of the stroller and IIIIIIIIIIIIIIIII her to the woods. People started IIIIIIIIIIIIIIIII rocks and other IIIIIIIIIIIIIIIII at it. Everybody was absolutely in panic.’

“State Department of Environmental Conservation spokesman Mike Fraser said E.’s father, P. S. S., was injured when he ran up to the bear IIIIIIIIIIIIIIIII its way.

“Lawrence said one of his officers got within 50 feet of the bear, IIIIIIIIIIIIIIII turned and IIIIIIIIIIIIIIII. Officer David IIIIIIIIIIIIIIIII one shot, killing the animal on the spot. Ward IIIIIIIIIIIIIIIII, the state’s wildlife pathologist, said the bear’s carcass was being trucked to his lab last night for a necropsy. ‘The big question,’ he said, ‘is going to be: Does the bear have anything abnormal going on, like rabies?’ IIIIIIIIIIIIIIIII

 

The wedding party had been brought to the hospital in several emergency cars. A pink-colored bride was already dead when the nearly dead groom arrived with the second load of formerly cheerful peasants who had been celebrating the wedding of one of their own to the city doctor. Bottles eroded by turbid moonshine were piled up on the table covered with pre-excretory raw produce of the local collective farm; in the middle, the vagina-shaped jars were filled with pickled mushrooms. The party was in full swing when nearly 19 micrograms of the highly concentrated mushroom toxin ergotamine were evenly distributed among the guests (on the Y-axis) plotted against the time (on the X-axis) of the toasts for long-lost sons and absent or departed friends. The toasts extended into the sirens of emergency cars and then, later, the car horns accompanying the funerals.

 

At school I liked to think about osmosis. The sound reminded me of something soft, perhaps a baby snail going through my lips (os), peristalting down the esophagus (mo), and settling down inside my stomach (sis). The passivity of the sounds was perfectly aligned with the nature of the process itself, where things were moving fatefully through the invisible phlegm of time just to be diluted to bosons.

 

High concentration generates a lack of concentration.

 

Signs of poor concentration:

Open hosenstall (only in German)

Liquid salo (only in Ukrainian)

 

Signs of excellent concentration:

Giant black holes

Hard salo

Jenna Sutela / Elvia Wilk
WHEN YOU MOVED

Isolated in a spherical, cave-like room, the narrator, a young woman, stands still on a treadmill. After a pause, she reaches a finger into the space before her to operate an invisible interface by which she controls the treadmill through small gestures. The treadmill begins to run. Each stage of her exercise program is announced by the interface, whose echoing voice reverberates through the space. The narrator speaks in chapters, articulating a story of the past, the future, and the permanent now. As she runs, her increasing physical exhaustion can be heard in her voice.

 

PROFILE: ILL-SUITED PRIMATE

 

The concept of the cyborg was originally developed as a potential means of surviving in space, but it became increasingly applied to survival on earth. To sur-vive was not just to live but to surpass living, to live in surplus, with a permanent oversupply of hemoglobin and antigens. The threshold for bare life was constantly rising. Pupils dilate.

 

ADD SKILLS

 

Likewise, magnifying glasses were invented to be aimed at the cosmos, but almost immediately we flipped them around and aimed them at ourselves. The telescope became a microscope. We discovered blood cells. We discovered spermatozoa.

Later, we spent about $20,000 to have ourselves attached to proprietary, eight-channel EEG amplifiers with custom brain-training protocols. In seven days, we learned to put the brain into a mental state that, under normal conditions, takes people between 21 and 40 years of daily Zen meditation to achieve. Along with that state came stress-management tools resulting in higher IQ and creativity levels.

 

SELECT LEVEL: BULLETPROOF EXECUTIVE

 

There were only those of us with sanctioned disabilities, and those of us who hadn’t found ours yet. Emboldened by advances in Artificial Intelligence and medical science, we fantasized about abandoning the wetware of the human state altogether.

Yet dominant narratives of progress remained arthritic: stories of rigid joints and weak circulatory systems, exercise programs and paleo diets. We focused on architecture rather than infrastructure, stiffening and crystallizing the flow.

 

LIFE SUPPORT/LIFESTYLE

 

Humans, as essentially a tropical mammal species, have always had to subjugate their habitats. Technologies like clothing and shoes, Gore-Tex and Neoprene, houses, rooms, and atmospheric chambers, were necessary mediums that allowed us to skip millennia of evolutionary adaptation to freezing and boiling temperatures. Air conditioning allowed humanity to progress beyond the accidents of climate.

 

ECO CHALLENGE

 

The microscope discovered populations of others living within our bodies. We cleaned ourselves and our environments to become safe. But extreme sanitation produced extreme allergy. Children growing up in disinfected homes developed asthma. We colonized our bodies while we colonized the world. Disinfectants leaked from decomposing bodies into the soil. The earth developed an allergy. The host rejected the implant.

 

CONTROL PANEL

 

Complex, emergent systems such as body/planet/city became cybernetically inflected inputs/outputs/controls. Architecture became information, heat, weather, air, sex. Architecture could be downloaded, eaten, sniffed, installed, copied, grafted, transferred, genetically modified, transplanted. Like desire, architecture existed without object.

 

PROCESSING…

 

The information found in veins was not so meaningful by itself anymore. Insight came not just from one’s own cell count, calorie count, step count, but from combining these numbers with the findings of others. The microscope became the macroscope, a lens turned on whole populations with the goal of mass data acquisition. The macroscope allowed us to see not with optics but with digital algorithms, transforming myriad bits of information into a larger, readable pattern.

 

DRIP COUNT: 3

 

We were together. Omniscient and impotent. The daily weather forecast was collected from a cloud of collective consciousness data overlaid on a map in real time, tracking the influence of the group mind on the physical environment. Chaotic minds were random number generators and wrists were unable to trick wrist monitors. The advancement of civilization was measured not in energy but in data. We had not only a surplus of hemoglobin but a surplus of ways to count the cells.

 

SPERM COUNT: -1

 

How to put our bodies to use was the chief ethical concern. Overwhelmed with the amount of information to process—coordinates, shuttle manuals, ornate procedures—humanity was a global firm of junior law associates trying to keep up with an ever-increasing workload.

 

The powers of spatial design suspended the body in an ever more passive relation to its environment. Skin was sealed off. Homes were disaster-proofed. The planet developed a calcified shell of satellites. The body was a cursor performing a choreography of clicks.

 

ERROR

 

The crash was the only real experience the computer had been through for years. For the first time it was in physical confrontation with its body, an inexhaustible encyclopedia of pains and discharges, under the hostile gaze of other machines, and with the undeniable fact of the dead. The computer discovered blood cells. It discovered spermatozoa.

 

AUTO-MODE

 

Apatheia: freedom from all passions.

Aponia: suppression of physical pain.

Ataraxia: elimination of mental disturbance.

 

What use is sperm in outer space? Anti-gravity ejaculate spells a language in search of objects, the beginnings of a new sexuality divorced from any possible physical expression. The elegant aluminized air vents in the walls of the X-ray department beckon as invitingly as the warmest organic orifice.

 

LEAVING THE DEMO AREA

 

The space of outer space is the opposite of the space of the architect, because it is a space that humans cannot actually encounter without dying, and so must enter exclusively through a dependence on technology.

The compacted cabin of the spaceship, like some bizarre vehicle modified for an extreme cripple, is the perfect module for all the quickening futures of our lives. One giant, luxurious spaceship. Eventually it will become a sarcophagus, and the containment of the body will be brought to its logical conclusion.

Space provides the perfect occasion and excuse to quantify the body within a hermetic environment, either completely isolated as one or fully fused together as one.

 

The narrator rises in the air, becoming weightless—a pause before her program resumes. As the treadmill begins to run again, the scene zooms out and a view of the asteroid in which she travels is revealed.

 

From the script for the video essay When You Moved (2014). Referencing the works of J.G. Ballard, Michael Crichton, Nicholas de Monchaux, and Beatriz Preciado.

Arthur M. Jacobs / Raoul Schrott
CAPTIVATED BY THE CINEMA OF THE MIND

ON TOGGLE SWITCHES, MADELEINE EFFECTS, AND DON QUIXOTE SYNDROME DURING IMMERSION IN TEXTUAL WORLDS

 

It starts spontaneously, and it keeps on as long as I keep reading. … I have to concentrate and get involved. … I immediately immerse myself in the reading, and the problems I usually worry about disappear. … It starts as soon as something attracts my attention particularly, something that interests me. … It can start wherever there is a chance to read undisturbed. … One feels well, quiet, peaceful. … I feel as if I belonged completely in the situation described in the book. … I identify with the characters, and take part in what I am reading. … I feel like I have the book stored in my mind.

Fausto Massimini, Mihaly Csikszentmihalyi, and Antonella delle Fave, “Flow and Biocultural Evolution,” in Mihaly Csikszentmihalyi and Isabella Selega Csikszentmihalyi, eds, Optimal Experience: Psychological Studies of Flow in Consciousness. Cambridge: Cambridge University Press, 1988

 

So it was that as I read my point of view was transformed by the book, and the book was transformed by my point of view. My dazzled eyes could no longer distinguish the world that existed within the book from the book that existed within the world. It was as if a singular world, a complete creation with all its colors and objects, were contained in the words that existed in the book; thus I could read into it with joy and wonder all the possibilities in my own mind.

Orhan Pamuk, The New Life, translated from the Turkish by Güneli Gün. New York: Farrar, Straus and Giroux, 1997

 

Sitting at my desk, elbows on the page, chin on my hands, abstracted for a moment from the changing light outside and the sounds that rise from the street, I am seeing, listening to, following (but these words don’t do justice to what is taking place within me) a story, a description, an argument. Nothing moves except my eyes and my hand occasionally turning a page, and yet something not exactly defined by the word ‘text’ unfurls, progresses, grows and takes root as I read. But how does this process take place?

Alberto Manguel, A History of Reading. London: HarperCollins, 1996

 

INTRODUCTION

 

These “readers’ testimonies” tell of a world of images and feelings, of figures and objects, that appear to be as real as the world around us; they tell of the magic of a story that so captivates and draws us in that we forget everything around us. When people are asked about how reading (fictional) narrative texts affects them, they speak of experiences of “diving in,” “immersing oneself,” and “losing oneself” in a textual world. What becomes clear from the quotes above is that concentrated attention and emotional involvement play just as important a part in this process as interest, the desire to forget oneself, escapism, identification, empathy, and happiness. On the other hand, they raise the question of the how of the process, and the fact is that empirical science knows very little about this utterly ordinary and universal phenomenon.

In his 1988 work Lost in a Book, Victor Nell gave this phenomenon the name absorption (in the sense of absorbing attentiveness, demanding complete concentration), and associated it both with enjoyment in reading, which he calls ludic reading (see Anz 1998), and—like Freud—with fiction as a form of play. Other literary scholars, sociologists, and reading researchers such as Hakemulder, Green, and Gerrig speak of transportation (into textual worlds). In our book Gehirn und Gedicht (Brain and Poetry, Schrott & Jacobs 2011), we called this capacity of a particular type of reading to hold our attention immersion. The term is borrowed from Bela Balazs’s film theory, as our eyes move in a similar manner to a movie camera while reading. According to Balazs, the camera “takes my eye along with it. Into the very heart of the image. I see the world from within the filmic space. I am surrounded by the figures within the film and involved in the action, which I see from all sides. … What does it matter that I remain seated for a two-hour period in exactly the same way as in the theatre? … My gaze, and with it my consciousness, identifies with the characters in the film. I see what they see from their standpoint. … I travel with the crowd, I fly up, I dive down, I join the ride.”

Similarly, the literary scholar Wolfgang Iser describes the act of reading as a Mittendrin-Sein, a state of being in the midst of things. In contrast to the normal process of perception, where we relate to an object by standing in front of it, the reader occupies a vantage point as he moves through the realm of objects presented to him (Iser 1978). It is this, according to Iser, that constitutes what is specific to understanding the nature of aesthetic objects in fictional texts. The fact that a text, in contrast to many other objects of visual perception, can never be grasped as a whole but always only as a series of distinct moments of reading—as a wandering point of view—has cognitive consequences. What has already been read fades from memory; only what has just been read is present to the mind, while what has not yet been read is anticipated in terms of what is remembered and currently being experienced. In Iser’s words: “every moment of reading is a dialectic of protension and retention, conveying a future horizon yet to be occupied, along with a past (and continually fading) horizon already filled; the wandering viewing point carves its passage through both at the same time and leaves them to merge together in its wake.”

 

TYPES OF IMMERSION

 

Transportation, absorption, presentness, and flow are terms used to describe immersive phenomena that have barely been examined by experimental science, and which have yet to be properly understood. Despite this, a wide variety of disciplines offer a whole series of theories that can be used to provide a hypothetical explanation. In the humanities, for example, popular theories include Lipps’s classic theory of empathy (1903–1906), Ryan’s virtual reality theory (2001), and Iser’s triadic model of real, fictive, and imaginary fictional reception (1993) (cf. Voss 2008).

Lipps’s aesthetic theory in particular has given rise to the hypothesis that reader and text “qualitatively merge into each other.” The reader’s empathetic investment in the text thus creates the impression that the latter is alive and has strength and energy flowing through it, so that the self, as it were, “empties itself out into a fiction” (Voss 2008). For the film and theater scholar Robin Curtis, immersion is also an aesthetic effect, which can lead, through the animating impulse of Lipps’s concept of empathy, to various possibilities of involvement, including those that lie beyond a strategy of naturalist representation. For her, it makes sense to see immersion and empathy as synonyms.

By contrast, the Dutch reading researcher Rolf Zwaan describes the processes that take place during immersive reading as extending well beyond straightforward empathy (for example, processes of inference, or of forming mental models of situations): “When readers read stories, they construct a rich mental image of the story-world. They have an idea of how it looks to the protagonist, and can move with him through this world (assuming he is familiar with this environment). In addition, the reader imagines what the protagonist’s aims are, and keeps a mental account of his successful and unsuccessful attempts at achieving them. The reader often also makes causal inferences about physical events, for example when he mobilizes his knowledge about fire and water to conclude that the fire went out because someone poured water over it. Moreover, the reader draws on his rich emotional knowledge to infer that the protagonist is frustrated when he does not achieve his aim. The reader is caught in a temporal series of events in such a way that events that are nearer to us in the story-world are also easier to remember than those that happened further back in the past. Nevertheless, the phenomenological experience of immersion in a story-world extends well beyond this. When reading a story, we can ‘experience’ a cold wind blowing in our face, the smell of stale beer, a kiss on our lips or a hot slice of pizza in our mouth.”

Ryan’s book Narrative as Virtual Reality defines immersion as an imagined relationship with a meaning-universe in a textual world, a window onto something that exists beyond language and that extends both spatially and temporally far beyond the frame of this window. According to her, a text must be familiar and “mimetic” to have an immersive effect, that is, it must create a virtual space with individual characters, objects, or events that a reader can relate to and participate in. This imagined world must contain temporal and spatial contours that also enable these imagined objects to be concretely visualized. Such “mimetic texts” can involve readers being “taken prisoner,” being plunged into artificial worlds (immersion), traveling in foreign lands (transportation), or losing contact with all other realities. Drawing on Gerrig’s book Experiencing Narrative Worlds (1993), Ryan’s analysis distinguishes between two forms of transportation: one minimal and weak, the other rich and strong. The former merely involves representing an object located concretely in space and time (for example, if you read the word “Texas,” you cannot, according to Gerrig, help but be mentally transported to Texas), while the latter involves not only thoughts about a concrete object but also about its environment, the world that surrounds it—including the idea of being inside that world oneself, in the presence of the object. Like Iser, Ryan also characterizes the strong form of transportation as “aesthetic immersion,” because it is dependent upon features of the text such as plot, narrative presentation, quality of the imagery, and style. According to Iser, in a perspectivally constructed text the reader’s wandering point of view is located in one of four perspectives: narrator, character, plot, or a fictionalized version of the reader himself (an ancillary perspective, which reflects the reader’s own frame of reference and largely serves to delineate his attitude to the narrated events). The quality of these features and (shifting) perspectives makes a crucial contribution to the enjoyment of reading, and with it to the aesthetic aspect of immersion. According to Iser, changes of perspective can follow the Gestalt psychology concept of the law of good continuation if the “felt and expected relationship” between successive sentence correlates is a given. However, they can also interrupt fluent reading’s “effortless stream of sentence thinking” if an unexpected twist in the plot upsets the immersion of the moment.

Moreover, Ryan distinguishes between four different degrees in intensity of transportation during reading: i) concentration, that is, the type of attention given to nonimmersive texts, which is still highly vulnerable to the distracting stimuli of external reality; ii) imaginative involvement, a form of “split subject” attentiveness which transports the reader into the textual world while still allowing a detached contemplation of it from an aesthetic or epistemological point of view; iii) entrancement, a nonreflexive enjoyment of reading which completely absorbs the reader, causing him to forget the aesthetic qualities of the author’s work as well as the (logical) truth-value of its statements; it allows him to forget he is reading a text, without forgetting that the world of the text is not reality; and iv) addiction, a kind of compulsive reading aimed at escaping reality, which can also lead one to lose touch with reality itself, something Ryan calls the “Don Quixote Syndrome.”

In the triadic model of fictional reception outlined in his 1993 book, Iser gives special emphasis to the imaginary, which he sees as a capacity to make manifest the latent structures of meaning available in the text, and central to which is an active process of filling out the imagined meaning gestalt (by including what is systematically left out). In this process, the reader uses imaginative divergences from the original text’s material constellation to transform his experiential knowledge of reality into an imaginary form representing his own reality and intentionality (Voss 2008).

In short, we can say that there is no shortage of definitions and theoretical approaches to immersive phenomena. Nevertheless, for an experimental scientist this still raises the question of the how of immersion. How does immersive reading function, not only on the verbally communicable level of subjective experience, but also on the levels of cognitive and affective processes and their neuronal bases, which cannot be directly observed? The processes of reading that can be consciously communicated, which are those that most literary critics and psychologists refer to, and which are generally measured using questionnaires and psychometric scales (for example, Appel et al.’s transportation scales 2002; Busselle & Bilandzic 2009; Green & Brock 2000), only amount to the tip of the iceberg. The iceberg itself consists of many unconscious and affective processes which reading researchers aim to illuminate using methods such as measuring eye movement and brain activity (Jacobs 2006a). In Gehirn und Gedicht (Schrott & Jacobs 2011), we postulated two neuronal bases for the phenomenon of immersion, components of a general model of neurocognitive poetics that involves further neuronal bases (Jacobs 2011; 2014a,b; 2015a,b; Jacobs et al. 2013; Lüdtke et al. 2014): “symbol grounding” on the one hand, and “neuronal recycling” on the other. In what follows, we will attempt to outline both of them.

 

NEUROCOGNITIVE BASES OF IMMERSION

 

What is the origin of reading’s astounding capacity to hold our attention? How can it be that such supposedly abstract symbols as words—cultural objects that in evolutionary terms are extremely recent—should be able to induce “sensory delusions” and “quasi-real feelings” in us, captivating us in the “cinema of the mind”?

A glance back to the early years of psychology suggests several possible answers. Thus Sigmund Freud, writing in 1891, claimed the brain treated words in much the same way as any other kind of object, and saw no reason for coding them in any other terms than their perceptual and motor features—that is, principally in terms of their vocalized form and articulatory operations. Karl Bühler’s studies of children led him to realize in 1934 that words have a “spheric fragrance”: if, for example, the word “radish” appears in a text, the reader is immediately transported to the dining table or the garden, that is, to an entirely different sphere than the one associated with a word such as “ocean.” If we generalize Bühler’s remarks about speech, a reader is “transported to the things that are spoken of, and lets his inner constructive or reconstructive ability be guided in great part by the object itself, which he either already knows or which the text has already arranged and constructed.” This is why a reader actually “hears” the crunching of the radish in her head, “sees” the red and white colors, and perhaps even “smells” the earthy smell when she reads the word “radish.” According to Bühler, words therefore have a “substance.” They are embodied cognitions, and the activities they are used for—speaking, reading, thinking, feeling—are themselves determined by their being as material substances (cf. Jacobs 2014b; Jacobs & Kinder 2015; Jacobs et al. 2015).

Imagine a child hearing the sentence: “Lisa bumped into the table and cried.” Could she understand this sentence if she couldn’t imagine bumping into a table herself or hadn’t seen someone else do it? Building on the ideas of Jean Piaget and Ludwig Wittgenstein, we can even surmise that the meaning of the word “table” consists of nothing other than a (neuronal) pattern of actions relating to this object. The embodied meaning—the motor-sensory concept “table”—is composed of the experiences of a table a person has already had and the judgments they’ve been able to form as a result. Examples of the questions to be answered would be the following: how far is it from me; where is its edge; what do I need to do so as not to bump into it; what shape does it have; how big and heavy is it, and what kind of material is it made of; how much strength do I need to get round it; what is it like to touch? Empirical studies at the Dahlem Institute for Neuroimaging of Emotion (DINE) at the Freie Universität Berlin have established that, as well as being distinguishable by their linguistic and affective features, words can be differentiated by the attributes “body-object-interaction” and “sensory experience”: words such as “sea” or “honey” exhibit a high “embodiment index,” while “purpose” or “accident” have a low one (Jacobs et al. 2015). From time immemorial, poetry has used this knowledge of the motor-sensory and affective associative potential of words, skillfully linking these with their phonetic qualities (Jakobson 1960; Schrott & Jacobs 2011).

Put simply, the hypothesis of symbol grounding claims that the memory images evoked by words and sentences we hear or read are similar to those evoked by the objects they refer to. This phenomenon, described by Ryan as the “madeleine effect” in reference to Proust, points to the fact that, when reading or when listening to language, the processes involved are based on the same or similar neuronal mechanisms as those used in direct experience. This mental simulation of situations described verbally or in writing is therefore, under certain circumstances, capable of holding our attention with an intensity comparable to real perception, and sometimes even greater. This hypothesis hence contradicts the traditional understanding of these matters in cognitive psychology, which postulates a strict division between language on the one hand and perception or action on the other, because there language is considered to be based on the manipulation of abstract symbols. However, this view overlooks the fact that the visual appearance of words and sentences constitutes the same kind of sensory stimuli as objects or faces. They are also automatically associated with their auditory form. Light and sound waves, transformed into neurochemical signals, affect our brains in a way that transforms these waves through complex intermediary stages into (multi-modal) “symbols”: into letters/graphemes on the one hand, and into their corresponding sounds/phonemes on the other. A word is therefore symbolically grounded by those learnt motor-sensory activities connecting its reception (seeing, hearing) with its production (speaking, writing). What at first sight appears as an abstract, amodal object composed of letters of the alphabet acquires its familiar, almost obvious meaning only after many laborious years of learning—and anyone who has watched children or adult patients with brain lesions learning to read and write will know how hard this process is. Today neuroscience is able to actually prove the existence of Bühler’s “spheric fragrance.” Reading the sequence of letters that makes up the word “radish” causes various sensory-response areas of the brain to become active, while “ball” also causes movement centers to be active, and “kiss” activates those that deal with emotions. The brain actually experiences events it is only reading about, and this power of simulation (mimesis, reliving) is an important basis of immersion, the neuronal substratum of the “cinema of the mind.”

The second hypothesis, that of neuronal recycling, postulates that structures in the brain eventually adapt so well to their environment that culturally determined processes such as reading end up operating through them, even though they had not evolved for this purpose. This means that cultural inventions such as writing have occupied brain networks that are older in evolutionary terms by taking over, at least in part, their general structural framework, and forming a kind of “neuronal niche.” In the six thousand years since the development of writing, evolution has hardly had time to develop completely new, reading-specific structures, capable of specializing in the construction of such amodal symbols. Since neuronal networks in all four lobes of the brain, as well as the cerebellum and other subcortical structures, play a part in recognizing just a single word, we may assume that structures are being used here that performed comparable functions among our ancestors (for example, recognizing patterns, objects, and faces).

The paleontologist Stephen Jay Gould proposed the term “exaptation” for such processes. It means a kind of creative evolutionary misappropriation: the utilization of a characteristic for a function it wasn’t originally intended for. Analyzing one of the greatest achievements of human civilization and one of the most complex functions of the human brain—namely reading—the neuropsychologist Stanislas Dehaene claimed in his 2009 theory of neuronal recycling that a particular part of the left cerebral hemisphere’s fusiform gyrus—a structure in the lower temporal lobe—represents just such an exapted region of the brain. The process of learning to read, which often takes years, recycles the circuits of this region, reshaping structures that had initially served to recognize objects and faces: a classic example of how the actual form of the brain can react to new cultural inventions. This so-called visual word form area includes a series of neuronal circuits, which on the one hand are reasonably close to the original function of recognizing patterns, objects, and faces that the other parts of the fusiform gyrus specialize in, but on the other are also malleable enough to be able to muster considerable resources for tasks that are culturally determined, such as recognizing letters and words. We can therefore claim, along with the reading researcher Maryanne Wolf (2007), that “The brain’s design made reading possible, and reading’s design changed the brain in multiple, critical, still evolving ways.”

A third approach, the so-called Panksepp-Jakobson hypothesis (Jacobs 2015b; Jacobs & Schrott 2013; Jacobs et al. 2015), is based on the notion that evolution did not have enough time to develop emotional circuits and “pleasure centers” for the specific enjoyment of art or even literature. Rather, as the neuroscientist Jaak Panksepp demonstrated in 1998, the evidence is that the feelings experienced during reading, whether “vicarious fear” (for the protagonists) or the aesthetic enjoyment of a beautiful metaphor (i.e., Jakobson’s famous poetic function of language), are based on the ancient circuits of affect that we share with all other mammals (e.g., the so-called limbic system).

If we try to describe these processes in terms of neuronal activity, the majority of studies show that fluent reading primarily draws on the left brain hemisphere’s reading system, especially the “fast” lower (ventral) route. In normally developed, proficient readers this system covers large parts of the left hemisphere and can be roughly divided into three constituent parts: a posterior region in the brain’s parietal and temporal lobes, consisting of two networks, and an anterior region in the frontal lobe. The lower route, which runs from the visual areas through the inferior and middle temporal lobes to the frontal areas, contains the visual word form area, which is associated with fluent, highly automated reading. The anterior part includes the inferior frontal gyrus, which appears to play a special role in recoding the phonology and articulation of words. The superior (dorsal) reading circuit, which runs from the visual areas in the visual cortex through the superior areas in the temporal lobes to the frontal area, is associated with the relatively slow, rule-based decoding of less familiar words that requires intensely focused attention.

Complex processes of interpretation and comprehension, requiring a bilateral activation of the brain, depend on the ability of the left hemisphere’s rapid reading system to efficiently decode written information. Maryanne Wolf puts it vividly and succinctly: “With its decoding processes almost automatic, the young fluent brain learns to integrate more metaphorical, inferential, analogical, affective background and experiential knowledge with every newly won millisecond. For the first time in reading development the brain becomes fast enough to think and feel differently. This gift of time is the physiological basis for our capacity to think ‘endless thoughts most wonderful.’ Nothing is more important in the act of reading.”

An area in the anterior temporal lobe seems to be important for Iser’s aforementioned theory of the completion and closing of meaning gestalts. Since this area contains multimodal associative areas, it is likely that it integrates semantic, syntactic, and episodic sources of information, transforming textual input into meaningful representations. The least complicated assumption is that the right anterior temporal lobe is responsible for propositionalization: it probably translates words into larger semantic units of content, which could correspond to Iser’s “meaning gestalts.”

Using the DINE’s magnetic resonance tomograph (scanner), we looked at one-word metaphors, that is, composite terms that make a single word out of two nouns (noun-noun composites, or NNCs), to investigate how the brain works on simple meaning gestalts; the two dimensions of familiarity (known vs. unknown) and visualizability (literal vs. metaphorical) were manipulated in this process. Handschuh (“glove,” literally “hand-shoe” in German) or Angsthase (scaredy-cat) are typical examples of familiar one-word metaphors, which can be described as “dead” or “sleeping” metaphors to express the sense that as a rule these words are understood and used not “visually” but “literally.” The NNCs from our study were divided into four groups: conventional metaphors (CM) such as Flughafen (“airport,” literally “flight-harbor”) or Rampensau (literally “stage-sow,” one who hogs the limelight, though not always in a pejorative sense); conventional, “literal” NNCs (CL) such as Lehrjahr (“academic year,” literally “teaching year”) or Reisepass (“passport,” literally “travel-pass”); new metaphors (NM) such as Neidfieber (literally “envy fever”) or Mensakoma (literally “canteen coma,” in reference to students being tired after lunch); and finally new, “literal” NNCs (NL) such as Stahlhemd (“steel shirt”) or Sofaladen (“sofa store”). In doing so, we kept the semantic relations between the two words of each NNC constant by means of an algorithm that calculates the high-dimensional semantic distances (i.e., the dissimilarity of the semantic features of two words) using the latest computer-linguistic methods, thereby preventing any confusion of the possible effects of “metaphoricity” with those of semantic relatedness. In the left inferior frontal gyrus, a region of the brain that is systematically associated with language processing and meaning construction, clear differences emerged between the groups, showing a ranking order of graduated semantic processing and meaning gestalt construction: NM > NL > CM > CL. As we had suspected, the activity in the left inferior frontal gyrus reflects the relative “neuronal work” needed to work out what the NNCs mean; the newer, more unusual and more striking an NNC is, the greater the semantic effort required to construct a meaning gestalt. The brain evidently finds this process easier with words like Reisepass than with neologisms such as Mensakoma.
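
The essay does not specify the algorithm behind these distance computations. A common computer-linguistic approach, offered here purely as an illustrative sketch under that assumption, represents each noun as a distributional word vector and takes cosine dissimilarity as the semantic distance, admitting a stimulus pair only if its distance falls inside a fixed band (the band limits below are invented for illustration):

    import numpy as np

    def semantic_distance(vec_a: np.ndarray, vec_b: np.ndarray) -> float:
        # Cosine dissimilarity between two word vectors: 0 means identical
        # semantic-feature profiles; values near or above 1 mean unrelated.
        cosine = np.dot(vec_a, vec_b) / (np.linalg.norm(vec_a) * np.linalg.norm(vec_b))
        return 1.0 - cosine

    def admissible_nnc(noun1_vec: np.ndarray, noun2_vec: np.ndarray,
                       lo: float = 0.4, hi: float = 0.6) -> bool:
        # Hypothetical selection rule: keep a noun-noun composite only if
        # the distance between its constituents lies in the same band for
        # all four stimulus groups (CM, CL, NM, NL), so that effects of
        # "metaphoricity" are not confounded with semantic relatedness.
        return lo <= semantic_distance(noun1_vec, noun2_vec) <= hi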

Moreover, the dorsolateral prefrontal cortex, the so-called posterior cingulate cortex and the so-called temporoparietal junction or TPJ seem to be important in coherence formation and logical examination during reading, as well as in constructing more complex meaning gestalts, sometimes extending over several sentences or paragraphs. All these regions also play a part in empathy and theory of mind, the particular ability to guess what is going on in other people’s minds, recognize similar events in one’s own mind, and imagine the feelings, needs, ideas, intentions, expectations, and opinions of others.

Apart from symbol grounding, neuronal recycling, and the other neuronal processes mentioned here, a whole series of additional factors play a role in immersion, which is also a phenomenon determined by genre; these include interest, curiosity, surprise, suspense, enjoyment, and aesthetic processes (see Jacobs 2011; 2015b).

 

NEUROSCIENTIFIC STUDIES ON IMMERSION POTENTIAL

 

The immersion potential of an encounter between text and reader is determined both by the qualities of the text and by the reader’s own characteristics. Thus the term “immersion potential” denotes the encounter between a story’s setting (spatial aspects), plot (temporal aspects), and characters’ emotions, on the one hand, and the reader’s personality traits, on the other. The latter determine whether one reader, thanks to a powerful visual imagination, can successfully imagine herself in the setting, whether another reader is more interested in the plot, that is, in the storylines and the question of what happens next, or whether someone successfully identifies with the protagonists, with their inner world (thoughts, feelings, aims), and can sympathize with their emotional conflicts (empathy). These three aspects of immersion, spatial, temporal, and emotional (see Ryan 2001), are also important components of a mental situation model. Our thesis is that an immersive text is one that offers the reader a strong likelihood of encountering (familiar) situation models conforming to the cognitive and affective schemata he has acquired in the course of his life, and allowing him to read fluently within a “familiar textual world.” Such mental scripts are five-dimensional representations, formed dynamically in an automatic and implicit reading process, and consist of spatial, temporal, causal, motivational/intentional, and person- and object-related information. They deal with questions of the where, when, why/how, who, and what of individual events, and represent embodied cognitions based on psychosomatic experiences and motor-sensory, kinesthetic, and affective sensations which are automatically associated with words. Immersive reading therefore involves the construction of a series of situation models, which interrelate more closely the more strongly these five dimensions overlap with each other. Every time a rupture occurs in one of these dimensions—if, for example, the protagonist changes his location—then the situation model must be newly updated (Zwaan 1993).
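
Read simply as a data structure, such a script can be sketched as follows. This is our own minimal illustration in Python, not a model taken from the reading research it summarizes: one slot per dimension, and an update that re-indexes only the ruptured dimension:

    from dataclasses import dataclass

    @dataclass
    class SituationModel:
        # One slot per dimension of the mental script:
        space: str        # where
        time: str         # when
        causation: str    # why/how
        motivation: str   # who wants what
        entities: str     # which persons and objects

        def update(self, **event: str) -> None:
            # A rupture in a dimension (e.g., the protagonist changes
            # location) forces only that slot to be rebuilt.
            for dim, value in event.items():
                if getattr(self, dim) != value:
                    setattr(self, dim, value)

    # The protagonist moves: only the spatial slot of the model changes.
    model = SituationModel("cottage", "morning", "", "hide-and-seek",
                           "father, children")
    model.update(space="cornfield")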

Thus an immersive text must play with the reader’s situation models, arousing curiosity likely to induce immersion, springing surprises, and inducing suspense (Brewer & Lichtenstein 1982), such as by causing unexpected ruptures in one or another of the five dimensions. We studied this phenomenon at the DINE by examining people’s reading reactions to so-called black stories, a collection of macabre short stories whose protagonists find themselves in dangerous or distressing circumstances, and at the end of which they generally die, as in the following example:

“A farmer drove his combine harvester into a field of corn where his children were playing hide-and-seek. When the machine came to a stop, he got out to see what was wrong. When he realized that he’d run over his children, he took his own life.”

Our hypothesis was that empathy for the protagonists of a story, and the emotional immersion this is likely to cause (the so-called fiction feeling), would be greater for stories with strongly negative content than for control stories with neutral (unemotional) content. The findings support this thesis, and show that a whole network of brain regions is active (centered around the medial prefrontal cortex) when readers experience empathy and fiction feeling (Altmann et al. 2012). This particular brain activity was dependent on the test subjects’ capacity for empathy, which was measured on a psychological scale. In order to test the toggle-switch theory of fictional reading, half the test subjects were told that the stories were pure fiction before reading them, while the other half were made to believe that they were facts (newspaper reports). Gerrig’s toggle-switch theory claims that Coleridge’s reception-theory thesis of a “willing suspension of disbelief” is an illusion. This theory holds that the principle of toggling the switch that suspends disbelief when you’re reading fiction and toggling it back when you’re reading facts is not an option for our brains. Nevertheless, we did actually find clear differences in the brain activities of both groups: the brain activity patterns in the “fact group” indicated that the mind-brain attempts to reconstruct the events related in the stories, while in the “fiction group” it was primarily networks associated with fantasizing and the mental simulation of future events that were active. Since the subjects were neither aware of these activities nor claimed to have willingly caused them, and since, moreover, the fiction feelings did not differ between the two groups, it is at least the case that the findings do not directly support Coleridge’s thesis. They do, however, support Oatley’s principle that “fiction could be truer than fact,” because the mental simulation processes that fictional literature requires enable individuals to gain a deeper understanding of their own emotions (cf. Green et al. 2012).

At the DINE we also studied the literary production of the fiction feeling using passages from the Harry Potter novels (Hsu et al. 2014). Test subjects reported the highest levels of subjective immersion in passages that induced fear through their descriptions of pain or emotional stress. On these occasions a brain region in the mid-cingulate cortex was particularly active, which is a region that plays a central role in physical and psychic feelings of pain, and is associated with the motor components of affective empathy (Craig 2009). A comparison of bilingual test subjects’ brain activity while reading passages from the novels in both German and the original English showed that fiction feelings were not only more intense in their mother tongue but also appeared to be more differentiated in neuronal terms (Hsu et al. 2015a). In the same study we also examined the role played by surprise and reading enjoyment in immersion, by using passages featuring magical content that contradicted our knowledge of the world. These descriptions of supernatural events primarily activated parts of the brain’s amygdala, which are systematically related to the discovery of striking and emotionally important aspects of the world around us, and which here presumably correlated with the novelty, surprise, and reading enjoyment that these descriptions produced (Hsu et al. 2015b).

Empathy and emotional immersion are associated with another factor that facilitates immersion, namely suspense. We studied this in readers of E.T.A. Hoffmann’s short story “The Sandman.” Subjectively, the feeling of immersion strongly correlates with that of suspense (a high statistical correlation coefficient), which in turn correlates with the reader’s degree of subjective excitement (Jacobs 2015). The heart rate increases in suspenseful and immersive passages (Auracher 2007), something that is attributable to plot, or more precisely the density of narrative developments per passage, measured by the number of verbs (Jacobs & Schrott 2013). In addition, neuronal activations in certain brain areas (medial-prefrontal, inferior-frontal, and posterior-temporal) suggest the influence of processes of empathy and of future event prediction during the reading of suspenseful passages (Lehne et al. 2015). At least one of the five key factors of current personality theories, namely conscientiousness, is also linked to suspense and immersion and probably has an indirect effect on the ability to concentrate (Jacobs & Schrott 2013).
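
To make the reported relationship concrete: once immersion and suspense have been rated passage by passage, the reported link is an ordinary correlation coefficient. A minimal sketch, with invented ratings rather than data from the Hoffmann study:

    from scipy.stats import pearsonr

    # Invented per-passage ratings on 1-7 scales (illustration only).
    immersion = [3.1, 4.6, 2.8, 5.9, 4.2, 6.3]
    suspense = [2.9, 4.4, 3.0, 6.1, 4.0, 6.5]

    r, p = pearsonr(immersion, suspense)
    print(f"r = {r:.2f}, p = {p:.4f}")  # a high r mirrors the reported finding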

In sum, it can be said that a growing number of neuroscientific studies addressing the phenomenon of immersion have largely supported the three hypotheses we’ve outlined above (symbol grounding, neuronal recycling, Panksepp-Jakobson), in a manner entirely in line with the theories of Freud and Bühler, who sadly did not have the DINE’s methods in their repertoire. The evidence makes it clear why literary reading is both a sensory and emotional experience, and why it is capable of becoming a kind of seventh sense, reliant on sensory experiences of sight and hearing, the limbic system’s affective responses, and countless memory images.

 

GENRE EFFECTS AND SUBLEXICAL FACTORS

 

It is often argued that—in contrast to novels—poetry or other kinds of literature that encourage self-reflection do not produce any immersive phenomena, or at least very few (Ryan 2001). We tested this thesis empirically using the so-called Stimmungsgedichte (mood poems) from Meyer-Sickendieck’s anthology (2011), which includes poems by Eduard Mörike, August Stramm, and Jürgen Becker on subjects such as “The City,” “Space,” “Morning,” and “Silence” (Lüdtke 2013; Lüdtke et al. 2013). It turned out that test subjects experienced both mood empathy and emotional involvement—two important aspects of immersion—not only while reading “Romantic” poems but also while reading “abstract” (post-)modern ones, as long as these poems described familiar phenomena, experiences, situations, moods, and atmospheres. This supports Max Kommerell’s claim that “In it the poet was harmonized, the poem is harmonized and the reader is harmonized” (1985). We can therefore speculate that poems which—however subtly—address familiar situation models and enable the reader to mentally enter their poem-world and empathize with its mood/atmosphere do indeed exhibit an immersion potential. What may also be important is the initial mood that readers find themselves in, and how well this suits the mood of the poem or its basic affective tone (the hypothesis of mood management, on which more below).

The reception of poetry can of course be a playful, pleasant, and concentrated process, related to the reception of music or painting, which transports readers into an artificial world and thus enables them, through partial or total absorption in the text, to partly or completely forget the world around them. The poet can facilitate this immersion by poetically imitating endogenous brain rhythms in his verses, which we have elsewhere described as follows:

“By using its three-second intervals to occupy the timeframe in which we experience our audio-temporal present, the typical line of poetry creates an artificial psychic space in which—divorced from everything around us—we can concentrate exclusively on the poem. And this in turn leads to the pleasant and utterly harmless side-effects produced by reading and listening to poems: poets such as Emily Dickinson and Robert Frost have spoken of how, when reading poetry, they have had goose bumps or hot and cold shivers running up and down their spine; the muscles relax, while the mind can focus and concentrate; one finds one can laugh or cry more easily, draws deeper breaths, and is pervaded by a slight feeling of intoxication—Raymond Roussel compared it with a sober high, and Coleridge with the effect of drinking a couple of glasses of spirits during a conversation…” (Schrott & Jacobs 2011).

Thus poetry doesn’t only operate on the lexical and supralexical level of words and verses, but also leads to subtle sublexical effects that depend on meter, rhyme, and rhythm, on the one hand, and affective phonological iconicity, on the other.

 

We can, dear reader, only guess what effect the following couplet by Wilhelm Busch has on you:

 

Oft ist das Denken schwer, indes
Das Schreiben geht auch ohne es.
(Take heart if thinking leaves you chary,
For writing it’s not necessary.)

 

At any rate, an empirical study by Menninghaus et al. (2014) claims that the typical reader should find these two lines funny: a humorous effect that is produced not by semantic incongruities but by rhyme and meter. This becomes clear when we defamiliarize the couplet’s form while leaving it semantically unaltered, either by destroying its rhyme (“Oft ist das Denken schwer, jedoch / das Schreiben geht auch ohne es”) or its meter (“Oft ist das Denken schwierig, indes / das Schreiben geht auch ohne es”). Notably, the test subjects in the study by Menninghaus et al. (2014) found the defamiliarized versions not nearly as funny as the original. The “pleasure” is lost. We explain this pleasure by the fact that when we read a line of poetry, each word activates a neuronal stimulus pattern that resonates as a sound gestalt, and this is still reverberating mentally, albeit subliminally, when the next word triggers its own neuronal pattern. Even when the first word has already been conceptualized—and the reader has already begun reading the third word in the series—the first word’s stimulus pattern remains present. Poets make use of these sliding transitions by constructing lines of poetry whose letter sequences repeat themselves, making them easier to memorize and recite with the help of meter and rhyme. Using the available residual stimulus patterns makes memorizing them less work. This is true not only of assonances within a verse, but also of alliteration at the beginning of words. It is precisely the ease with which rhyming lines can be recited that constitutes their pleasure—they trip as easily off the tongue as they impress themselves upon the memory (Schrott & Jacobs 2011).

Along with meter, rhyme, and rhythm, poets also make use of onomatopoeia, which to a greater or lesser extent plays with the phonological iconicity of words, something that influences the affective basic tone and determines the overall emotional mood of how a poem is received. We have studied this phenomenon by means of both textual analysis, using the so-called EMOPHON program (Aryani et al. 2013), and psychometrics, using a ratings scale for readings of Hans Magnus Enzensberger’s 1957 collection Verteidigung der Wölfe. The idea for this arose because Enzensberger himself had already attempted an intuitive classification of his fifty-seven poems into friendly, sad, and nasty ones. Using EMOPHON, which quantifies phonological salience (the significant incidence of particular phonemes in any text), and a standardized database of words—which among other things makes it possible to quantify the affective features of phonemes, syllables, and words (Jacobs et al. 2015; Võ et al. 2006; 2009)—we discovered that up to twenty per cent of the variation in test subjects’ emotional assessments of the poems could be explained by basic affective tones, as calculated by EMOPHON (Aryani et al. 2015). These pre-attentive, and probably unconscious, effects serve to support others produced at the level of words and individual lines, leaving it an open question whether they were deliberately intended by the poet.
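
How EMOPHON computes phonological salience is not detailed here; as a toy stand-in for the idea, one can compare a poem’s phoneme frequencies with language-wide baseline frequencies and flag phonemes that occur markedly more often than expected. The function below is our illustration only, not the program’s actual method:

    from collections import Counter

    def phonological_salience(poem_phonemes, baseline_freq):
        # Ratio of each phoneme's relative frequency in the poem to its
        # baseline frequency in the language. Ratios well above 1 mark
        # over-represented phonemes, which may carry the poem's basic
        # affective tone.
        counts = Counter(poem_phonemes)
        total = sum(counts.values())
        return {
            ph: (counts[ph] / total) / baseline_freq[ph]
            for ph in counts
            if baseline_freq.get(ph, 0) > 0
        }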

 

RESEARCH QUESTIONS

 

In conclusion, we would like to elaborate some of the exciting research questions and methodological challenges that research into textual immersion raises. These largely concern the necessary and sufficient conditions for immersive experiences and how these are to be measured. Given the digitization of the world of reading and the enormous importance of immersion, the question urgently arises of whether immersive phenomena depend on the reading medium (and if so, how), as well as whether, for example, the use of Fiktion’s own Reader inhibits or encourages immersion in texts (a private study carried out together with Ingo Niermann and Mathias Gatza). As part of the European Union’s E-READ research program (COST action IS1404 Evolution of reading in the age of digitization), the first author of this paper is investigating this issue with a large international group. Moreover, it is also interesting to ask where and when immersion is at its strongest, and to what extent audio books, song lyrics or spoken poetry, analog and digital books, silent reading vs. reading aloud are inherently different from film, music, painting, or sculpture, or only by degree. Initial data from this survey indicates that books are considered to be by far the most immersive medium, followed by film and music (Hakemulder 2013). Hakemulder’s study also showed that respondents believed that this was primarily because of literature’s successful and empathetic depiction of its protagonists’ inner worlds, and only secondarily because of plot-related effects such as curiosity, surprise, and suspense.

The question of whether immersion in works of fiction operates as powerfully as in works of fact is also still largely an open one (Altmann et al. 2012; Green et al. 2012), as is the question of the role of genre (for example, novel vs. poem). According to a pioneering study by Zeman et al. (2013), the first to use a scanner to compare the reception of prose and poetry, prose and poetry activate more common neuronal networks than separate ones. Chief among the latter in the reception of poetry are regions that are associated with theory of mind and the mental simulation of the future, such as the right temporal lobe and the anterior right temporal lobe, which is associated with propositionalization. Interestingly enough, a study at the University of Greifswald also found the right temporal lobe to be a possible neuronal correlate of creative writing (Shah et al. 2012).

It is well known that poets like Brecht opposed the effects of empathy and immersion by alienating their readers in order to enhance their capacity for critical thinking. Despite the “mood empathy” referred to above, there is little doubt that lines such as “Schwarze Milch der Frühe wir trinken sie abends” (“Black milk of daybreak, we drink it at sundown”; the first line of Celan’s Todesfuge [Death Fugue], here in Michael Hamburger’s translation) have an alienating effect on the reader, forcing him to adapt (among other things) his thought schemata and mental affect scripts, as well as his (self-)reflexive processes, which in theory can only attain the weakest of Ryan’s four degrees of immersive intensity (concentration). In Gehirn und Gedicht (Schrott & Jacobs 2011), we discuss alienation effects in reading as an instance of foregrounding (van Peer 1986)—that is, the deliberate use of rhetorical and poetic stylistic means, such as the oxymoron in the opening line of Celan’s poem. We consider the question of whether immersion and alienation effects mutually exclude or interact with one another to be primarily an empirical one, which, despite the methodological problems we outline below, is likely to be resolved in the future using methods of empirical literary criticism, experimental psychology, and cognitive neuroscience.

Finally, it is an open question to what extent immersion itself is a uniform phenomenon, even within a medium such as the analog book. The three-way division into spatial, temporal, and emotional immersion is a purely theoretical one, which has yet to be experimentally researched (Ryan 2001). It is also an open question whether all three components are equally necessary or sufficient for immersion, whether they interact, and in what respects they depend on the reader’s personality traits, such as their empathy or conscientiousness score, their curiosity, or their spatial imagination.

A general methodological problem in measuring immersion lies in the fact that test subjects cannot provide any data on their experiences during the act of reading itself without interrupting the immersive process or rendering it impossible. Immersion can be pre-attentive—so much so that, in cases of complete absorption, readers are not even aware of being in this state. As soon as test subjects notice or report that they are immersed, it is clear that they can no longer be so (Hakemulder 2013). An assessment of the “immersivity” of entire texts or passages made after the act of reading and based on a scale of immersion does not, however, constitute a neutral measure. Nevertheless, by simultaneously collecting data on personality, subjectively felt suspense, familiarity, valence, degree of excitement, and concentration of action in the content—that is, on constructs theoretically linked to immersion—this “non-neutral” measure can be “cross-validated.”

Furthermore, people can give false answers to questions about the possible causes of immersive states on the basis of what they personally believe those causes to be. That is why the subjective methods of empirical literary criticism need to be supplemented with the more objective methods of measurement that we use at the DINE (for example, oculo- and pupillometry, peripheral physiological measures of heart rate, skin-conductance level, or corrugator muscle activity, electroencephalography, or functional magnetic resonance imaging). These methods, however, are not only expensive and supply—despite some opinions to the contrary—only correlational and not causal information (Jacobs 2006b); they also fail to clearly link particular measurement parameters to immersive states. In other words, there are no biomarkers for immersion, even though initial findings from the DINE have identified certain regions of the brain as possible neuronal correlates of particular immersive processes; one of these is the mid-cingulate cortex, mentioned above (Hsu et al. 2014). On the other hand, neuronal correlates for attention processes (the frontoparietal network), spatial imagination (the so-called parahippocampal gyrus), theory of mind (the temporoparietal junction, TPJ), emotional involvement (the so-called limbic system), surprise (the amygdala), and suspense (the dorsolateral prefrontal cortex), all of which are important components of immersion, are relatively well known, and can serve as “objective” evidence for immersive processes. An example of this would be to first have the test subjects read a text in the scanner and then ask them to mark those parts that they considered particularly immersive, suspenseful, poetic, etc. (Speer et al. 2007). If the brain activity coinciding with those parts of the text corresponds to one or more of the “correlates” mentioned above, then this would be evidence for immersive processes.
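To make the logic of this cross-referencing concrete, here is a minimal sketch in code. It illustrates the reasoning only and is not an actual analysis pipeline: it assumes, purely for demonstration, one activation score per text passage for a single candidate region, together with a reader’s post-hoc marks, and all numbers are invented.

```python
# Minimal illustration of the cross-referencing logic described above:
# compare a region's activity time-locked to passages readers marked as
# "particularly immersive" against unmarked passages. All data invented.

from statistics import mean

# One activation score per passage for a candidate region
# (e.g., the mid-cingulate cortex); invented numbers.
activation = [0.2, 0.9, 1.1, 0.3, 0.8, 0.1, 1.0, 0.2]

# Reader's post-hoc marks: True = "particularly immersive".
marked = [False, True, True, False, True, False, True, False]

marked_mean = mean(a for a, m in zip(activation, marked) if m)
unmarked_mean = mean(a for a, m in zip(activation, marked) if not m)

print(f"mean activation, marked passages:   {marked_mean:.2f}")
print(f"mean activation, unmarked passages: {unmarked_mean:.2f}")
# A reliably higher mean for the marked passages (here 0.95 vs 0.20)
# would count as converging, though still only correlational,
# evidence for immersive processing.
```

In a real study, of course, the comparison would be made statistically across many readers and passages, not by eyeballing two means.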

In the future, these processes, both analog and digital, will—irrespective of whether or not they use mental toggle switches—undoubtedly continue to captivate people in reading’s cinema of the mind, offering them a broad spectrum of possibilities for enjoyment that neither film nor music can provide, from madeleine effects to Don Quixote syndromes.

 

BIBLIOGRAPHY

 

Altmann, U., I.C. Bohrn, O. Lubrich, W. Menninghaus, and A.M. Jacobs. 2012. “The power of emotional valence—from cognitive to affective processes in reading.” Frontiers in Human Neuroscience 6, 192. doi: 10.3389/fnhum.2012.00192.

Anz, T. 1998. Literatur und Lust: Glück und Unglück beim Lesen (Literature and Pleasure: Happiness and Unhappiness When Reading). Munich: Beck.

Appel, M., E. Koch, M. Schreier, and N. Groeben. 2002. “Aspekte des Leseerlebens: Skalenentwicklung” (Assessing Experiential States During Reading: Scale Development). Journal of Media Psychology 14, 149–154. doi: 10.1026//1617-6383.14.4.149.

Aryani, A., A.M. Jacobs, and M. Conrad. 2013. “Extracting salient sublexical units from written texts: ‘Emophon,’ a corpus-based approach to phonological iconicity.” Frontiers in Psychology 4, 654. doi: 10.3389/fpsyg.2013.00654.

Aryani, A., M. Kraxenberger, S. Ullrich, A.M. Jacobs, and M. Conrad. 2015. “Measuring the basic affective tone in poetry using phonological iconicity and subsyllabic salience.” Psychology of Aesthetics, Creativity and the Arts, in press.

Auracher, J. 2007. “… wie auf den allmächtigen Schlag einer magischen Rute”: Psychophysiologische Messungen zur Textwirkung (“…as though moved by the omnipotent beat of a magical rod”: Psychophysiological Measurements of Textual Effects). Baden-Baden: Deutscher Wissenschaftsverlag.

Brewer, W. F. and E. H. Lichtenstein. 1982. “Stories are to entertain: A structural-affect theory of stories.” Journal of Pragmatics 6, 473–486.

Bühler, K. 1934. Sprachtheorie (Language Theory). Stuttgart: Lucius und Lucius, 1982.

Busselle, R. and H. Bilandzic. 2009. “Measuring narrative engagement.” Media Psychology 12, 321–347. doi: 10.1080/15213260903287259.

Craig, A. D. 2009. “How do you feel—now? The anterior insula and human awareness.” Nature Reviews Neuroscience 10, 59–70.

Csikszentmihalyi, M., and I. Csikszentmihalyi, eds. 1988. Optimal Experience: Psychological Studies of Flow in Consciousness. Cambridge: Cambridge University Press.

Curtis, R. 2008. “Immersion und Einfühlung” (Immersion and Empathy). montage AV 17(2), 89–107.

Forgacs, B., I.C. Bohrn, J. Baudewig, M.J. Hofmann, C. Pleh, and A.M. Jacobs. 2012. “Neural correlates of combinatorial semantic processing of literal and figurative noun–noun compound words.” Neuroimage 63, 1432–1442.

Freud, S. 1891. Zur Auffassung der Aphasien (On Aphasia). Vienna: Deuticke.

Enzensberger, H. M. 1957. Verteidigung der Wölfe: Gedichte (Defense of the Wolves: Poems). Frankfurt: Suhrkamp, 1981.

Gerrig, R. J. 1993. Experiencing Narrative Worlds: On the Psychological Activities of Reading. New Haven: Yale University Press.

Green, M. C. and T.C. Brock. 2000. “The role of transportation in the persuasiveness of public narratives.” Journal of Personality and Social Psychology 79, 701–721.

Green, M. C., C. Chatham, and M. Sestir. 2012. “Emotion and transportation into fact and fiction.” Scientific Study of Literature 2(1), 37–59.

Hakemulder, F. 2013. “Travel experiences: A typology of transportation and other absorption states in relation to types of aesthetic responses.” In Wie gebannt: Ästhetische Verfahren der affektiven Bildung von Aufmerksamkeit (As if Under a Spell: Aesthetic Processes for the Affective Production of Attentiveness), edited by J. Lüdtke et al., 163–182. Berlin: Freie Universität.

Hsu, C. T., M. Conrad, and A.M. Jacobs. 2014. “Fiction Feelings in Harry Potter: Hemodynamic Response in Mid-Cingulate Cortex Correlates with Immersive Reading Experience.” Neuroreport 25(17), 1356–1361. doi: 10.1097/WNR.0000000000000272.

Hsu, C. T., A. M. Jacobs, and M. Conrad. 2015a. “Can Harry Potter Still Put a Spell on us in a Second Language? An fMRI Study on Reading Emotion-laden Literature in Late Bilinguals.” Cortex 63, 282–295. doi: 10.1016/j.cortex.2014.09.002.

Hsu, C.T., A. M. Jacobs, U. Altmann, and M. Conrad. 2015b. “The magical activation of left amygdala when reading Harry Potter: an fMRI study on how descriptions of supra-natural events entertain and enchant.” PLoS ONE 10(2): e0118179. doi: 10.1371/journal.pone.0118179.

Iser, W. 1978. The Act of Reading: A Theory of Aesthetic Response. Baltimore/London: Johns Hopkins University Press.

Iser, W. 1993. The Fictive and the Imaginary: Charting Literary Anthropology. Baltimore/London: Johns Hopkins University Press.

Jacobs, A. M. 2006a. “Was passiert beim Lesen im Gehirn?” (What happens in the brain when reading?). Süddeutsche Zeitung, August 18.

Jacobs, A. M. 2006b. “Messung der Hirnaktivität” (Measuring Brain Activity). In Handbuch der Allgemeinen Psychologie, Kognition (Handbook of General Psychology, Cognition), edited by J. Funke and P. Frensch, 697–704. Göttingen: Hogrefe.

Jacobs, A. M. 2011. “Neurokognitive Poetik: Elemente eines Modells des literarischen Lesens” (Neurocognitive poetics: elements of a model of literary reading). In Gehirn und Gedicht: Wie wir unsere Wirklichkeiten konstruieren (Brain and Poetry: How We Construct our Realities), edited by R. Schrott and A. M. Jacobs, 492–520. Munich: Hanser.

Jacobs, A. M. 2014a. “Affektive und ästhetische Prozesse beim Lesen: Anfänge einer neurokognitiven Poetik” (Affective and aesthetic processes in reading: beginnings of a neurocognitive poetics). In Sprachen der Emotion (Languages of Emotion), edited by G. Gebauer and M. Edler, 134–154. Frankfurt: Campus.

Jacobs, A. M. 2014b. “Metaphern beim Lesen in Gehirn und Geist” (Metaphors during reading in the mind/brain). Psychosozial 37(3), 27–38.

Jacobs, A. M. 2015a. “Towards a neurocognitive poetics model of literary reading.” In Towards a Neuroscience of Natural Language Use, edited by R. Willems, 135–195. Cambridge: Cambridge University Press.

Jacobs A. M. 2015b. “Neurocognitive poetics: methods and models for investigating the neuronal and cognitive-affective bases of literature reception.” Frontiers in Human Neuroscience 9, 186. doi: 10.3389/fnhum.2015.00186.

Jacobs, A. M. and R. Schrott. 2013. “Gehirn und Gedicht: Wie Wörter emotional wirklich werden” (Brain and Poetry: How Words Become Emotionally Real), keynote lecture given at the 55th Annual Conference of Experimental Psychologists (TeaP), Vienna, Austria, March 24–27, 2013.

Jacobs, A. M., J. Lüdtke, and B. Meyer-Sickendiek. 2013. “Bausteine einer Neurokognitiven Poetik: Foregrounding/Backgrounding, lyrische Stimmung und ästhetisches Gefallen” (Elements of a neurocognitive poetics: Foregrounding/backgrounding, lyrical mood, and aesthetic pleasure). In Stimmung und Methode (Mood and method), edited by B. Meyer-Sickendiek and F. Reents, 63–94. Tübingen: Mohr Siebeck.

Jacobs, A. M. and A. Kinder. 2015. “Worte als Worte erfahren: Wie erarbeitet das Gehirn Gedichte” (Experiencing words as words: how the brain constructs poems). In Kind und Gedicht (Child and Poem), edited by A. Pompe, 57–76. Berlin: Rombach.

Jacobs, A. M., M. L. H. Võ, B. B. Briesemeister, M. Conrad, M.J. Hofmann, L. Kuchinke, J. Lüdtke, and M. Braun. 2015. “10 years of BAWLing into affective and aesthetic processes in reading: what are the echoes?” Frontiers in Psychology, in press.

Jakobson, R. 1960. “Closing statement: Linguistics and poetics.” In Style and Language, edited by T. A. Sebeok, 350–377. Cambridge, MA: MIT Press.

Kommerell, M. 1985. Gedanken über Gedichte (Thinking about Poems). Frankfurt: Klostermann.

Lehne, M., P. Engel, M. Rohrmeier, W. Menninghaus, A.M. Jacobs, and S. Koelsch. 2015. “Reading a Suspenseful Literary Text Activates Brain Areas Related to Social Cognition and Predictive Inference.” PLoS ONE 10(5): e0124550. doi: 10.1371/journal.pone.0124550.

Lipps, T. 1903. Ästhetik: Psychologie des Schönen und der Kunst (Aesthetics: The Psychology of Beauty and Art), volume 1: Grundlegung der Ästhetik (Foundations of Aesthetics). Hamburg & Leipzig: Voss.

Lipps, T. 1906. Ästhetik: Psychologie des Schönen und der Kunst (Aesthetics: The Psychology of Beauty and Art), volume 2: Die ästhetische Betrachtung und die bildende Kunst (Aesthetic Perception and the Visual Arts). Hamburg & Leipzig: Voss.

Lüdtke, J. 2013. “Eine Frage der Empirie: Zum emotionalen Erleben bei der Rezeption von Stimmungsgedichten” (A Question of Empirics: On emotional experience in the reception of mood poems). In Stimmung und Methode (Mood and method), edited by B. Meyer-Sickendiek and F. Reents, 119–140. Tübingen: Mohr Siebeck.

Lüdtke, J., B. Meyer-Sickendiek, and A.M. Jacobs. 2014. “Immersing in the Stillness of an Early Morning: Testing the Mood Empathy Hypothesis in Poems.” Psychology of Aesthetics, Creativity and the Arts 8(3), 363–377. doi: 10.1037/a0036826.

Manguel, A. 1996. A History of Reading. London: HarperCollins.

Menninghaus, W., I. C. Bohrn, U. Altmann, O. Lubrich, and A. M. Jacobs. 2014. “Sounds funny? Humor effects of phonological and prosodic figures of speech.” Psychology of Aesthetics, Creativity and the Arts 8, 71–76.

Meyer-Sickendiek, B. 2011. Lyrisches Gespür: Vom geheimen Sensorium moderner Poesie. (Poetic Instincts: On the Secret Sensorium of Modern Poetry). Munich/Paderborn: Fink.

Nell, V. 1988. Lost in a Book: The Psychology of Reading for Pleasure. New Haven: Yale University Press.

Oatley, K. 1999. “Why fiction may be twice as true as fact: Fiction as cognitive and emotional simulation.” Review of General Psychology 3(2), 101–117.

Pamuk, O. 1997. The New Life. Translated by Güneli Gün. New York: Farrar, Straus and Giroux.

Ryan, M.-L. 2001. Narrative as Virtual Reality: Immersion and Interactivity in Literature and Electronic Media. Baltimore/London: Johns Hopkins University Press.

Schrott, R. and A. M. Jacobs. 2011. Gehirn und Gedicht: Wie wir unsere Wirklichkeiten konstruieren (Brain and Poetry: How We Construct Our Realities). Munich: Hanser.

Shah, C., K. Erhard, H.J. Ortheil, E. Kaza, C. Kessler, and M. Lotze. 2013. “Neural correlates of creative writing: An fMRI Study.” Human Brain Mapping 34, 1088–1101.

Speer, N. K., J.K. Zacks, and J.R. Reynolds. 2007. “Human brain activity time-locked to narrative event boundaries.” Psychological Science 18, 449–455.

Turner, F. and E. Pöppel. 1983. “The neural lyre: Poetic meter, the brain, and time.” Poetry 142(5), 277–309.

Van Peer, W. 1986. Stylistics and Psychology: Investigations of Foregrounding. London: Croom Helm.

Võ, M.L.H., M. Conrad, L. Kuchinke, K. Hartfeld, M.J. Hofmann, and A.M. Jacobs. 2009. “The Berlin Affective Word List reloaded (BAWL-R).” Behavior Research Methods 41 (2), 534–539. doi: 10.3758/BRM.41.2.534.

Voss, C. 2008. “Fiktionale Immersion” (Immersion in fiction). montage AV 17(2), 69–86.

Wolf, M. 2007. Proust and the Squid: The Story and Science of the Reading Brain. New York: HarperCollins.

Zeman, A., F. Milton, A. Smith, and R. Rylance. 2013. “By heart: An fMRI study of brain activation by poetry and prose.” Journal of Consciousness Studies 20 (9–10), 132–158.

Zwaan, R. A. 1993. Aspects of Literary Comprehension: A Cognitive Approach. Amsterdam: Benjamins.

Ronnie Vuine
FUTURE READING SYSTEMS

The Fiktion reading app includes an autoscroll feature. Text streams upward from below, from the depths of an invisible wellspring beneath the edge of the screen. Reading, we enter the stream of text; to avoid being driven over the top edge, we have to master the stream, finding a balance and keeping ourselves afloat so as not to be swept away by it. In the text’s system of time, the invisible wellspring beneath the screen produces the future: in our writing system, the future in the text lies to the right, until the end of the line, then down, and then, in a printed text, to the right and up again on the next page, or—in a text that scrolls—to the right and further down.

Recently an idea with the—to German ears, at least—rather silly name Spritz has been making a splash on social media. Spritz is a reading application that doesn’t even scroll; instead it flashes text at the reader, word by word, forestalling every eye movement at a headlong pace, without leaving her the chance to digress, pause, or check back on an earlier passage. Speed-reading techniques have long been proposed as solutions to a problem that even those concerned with improving efficiency have, on closer inspection, found to rest on mistaken premises, since the corresponding techniques of speed-thinking aren’t quite so easy to develop. But the idea of pressurized fueling with text seems, at least for now, to have excited a lot of enthusiasm.
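To make the mechanism concrete: the principle behind Spritz is rapid serial visual presentation, in which words are flashed one at a time at a fixed point, so that the pace is set by the machine rather than by the reader’s eyes. The following toy sketch illustrates the principle only; it is not Spritz’s actual implementation, and the default pace is an assumption.

```python
# Toy sketch of machine-paced, word-by-word text presentation
# (the principle behind Spritz); illustrative only.

import sys
import time

def flash_words(text: str, words_per_minute: int = 300) -> None:
    """Flash one word at a time at a fixed position, at a fixed pace."""
    delay = 60.0 / words_per_minute
    for word in text.split():
        # The carriage return keeps every word at the same spot,
        # so the eye never moves -- and can never check back.
        sys.stdout.write("\r" + word.ljust(24))
        sys.stdout.flush()
        time.sleep(delay)
    sys.stdout.write("\n")

if __name__ == "__main__":
    flash_words("Text streams upward from the depths of an "
                "invisible wellspring beneath the screen.")
```

Run in a terminal, the words overwrite one another at a single spot; there is, by design, no way to scroll back.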

What are these reading apps doing to us? One obvious idea comes to mind, a sort of vulgar McLuhanism: the days when we accessed the textual future from texts ourselves, at a speed set by our meandering thoughts while reading, are over. These new reading applications force the future into us with the obstinacy of technology and at its own pace. As this vulgar McLuhanism would have it, what we are really reading when we read like this is that resistance to the future—a future that emerges, ready-written, from a dark unknown—is pointless.

Even if we avoid the McLuhanism, autoscrolling reading apps and Spritz remain at least metaphors for technology’s ongoing, far-reaching, and unstoppable transformation of the cultural technique of “reading.”

Or at least that’s how it seems. Because what hidebound bibliophiles like to overlook is that when they’re actually reading their cloth-bound fetishes they forget the object they’re holding in their hands. Even the book-lover’s brain isn’t interested in books: when reading undisturbed, it switches into a strange and wonderful special mode. Neuroscientific lore tells of the brain being assailed by an input, entering via the sensors, of many hundreds of megabytes per second. Most of this data is filtered out as irrelevant, receiving no attention and not even entering our consciousness. In immersion mode—that is, when lost in a book—the brain becomes extremely selective about this data: out of the hundreds of megabytes of sensory data, only a thin stream of glyphs reaches consciousness. (A Latin letter can be encoded in a single byte.) Using these symbols, the brain calls up sensory data from the past to forge new scenarios on a stage in the mind’s eye, which holds the reader’s full attention. With the exception of a few alarm systems running in the background, all the reader’s cognitive resources are focused on a synthetic world, made out of barren symbols and long-established associations: if the conditions for immersion are present, then literature bends the mind back onto itself.
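The selectivity described here can be put into rough numbers. A back-of-the-envelope sketch, assuming a silent-reading speed of about 250 words per minute and an average of five letters per word (both figures are assumptions for illustration, not taken from the text):

```python
# Back-of-the-envelope estimate of how selective immersion mode is,
# using the one-byte-per-Latin-letter encoding mentioned above.
# Assumed figures (not from the text): ~250 words/min silent reading,
# ~5 letters per word, ~300 MB/s of sensory input ("many hundreds
# of megabytes per second").

words_per_minute = 250
letters_per_word = 5
sensory_input_bytes_per_s = 300e6

glyph_stream_bytes_per_s = words_per_minute * letters_per_word / 60
ratio = sensory_input_bytes_per_s / glyph_stream_bytes_per_s

print(f"glyph stream: ~{glyph_stream_bytes_per_s:.0f} bytes per second")
print(f"sensory input outweighs it by a factor of ~{ratio:.0e}")
# Roughly 21 bytes/s of text against 3e8 bytes/s of input:
# a selectivity of about seven orders of magnitude.
```

On these assumptions the conscious glyph stream runs at roughly 21 bytes per second, some seven orders of magnitude below the sensory input; that gap is the filtering that immersion mode performs.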

When it became clear, in the late 2000s, that in the near future novels would also come to be read on digital reading systems, and not just by a marginal group of technology fanatics, the figure of the hidebound bibliophile emerged, telling his story of the book as a bulwark against the speed of the digital world. What we were supposed to be doing of an evening was sitting back in a comfortable chair with a good book, as well as, needless to say, a good glass of wine, and enjoying the look and feel of the book as an object: its special aroma, and the gentle rustling of its pages. Here, with us, among the books, so the story went, you are protected from a threatening future. Wearied by the toils of the day, and mildly inebriated, here you can sniff the binding and finger the paper. The future can’t hurt you any more.

If it’s true that the old distinction between fiction and science fiction no longer applies, and that we would now, in recognition of how things have changed, do better to speak of fiction and denial fiction, then it’s easy to imagine what kinds of good books the hidebound bibliophiles think we should be reading, once we’ve converted our armchairs into bulwarks.

Anyone who reads knows that none of these things are relevant. The cult of sitting down with a good book, no matter how ridiculous it may sometimes be, or how likeable it almost always is, has nothing to do with reading. Immersion wins out against the glass of wine and the smell of the binding. Immersion wins out against a tram full of phlegm-hawking jerks as it jolts through the rush-hour traffic on a freezing wet November evening. In the first years of the addiction, when the drug grips you in a way it never will again, immersion wins out against your parents’ tendency to switch off the light too early. The true reader demands only one thing from her surroundings—and that includes her reading system, whether digital or not—and that is that they not get in the way of immersion.

Texts in the immersion loop use symbols to call up data onto an interior stage. It is not mere abstractions that prose and poetry call up, but sensory data which are linked or can be linked to our desires, and which are given more attention on this immersive interior stage than they could ever get in the world outside: good prose is a machine for producing intensities. Whether it streams up to us from the wellspring of its inner time beneath the edge of the screen, or whether it emerges with the turning of a page, a text is good when it contains things that we will desire.

In 2008, a small team from the newly founded Berlin company txtr travelled around Germany. On visits to towns like Poing or Kiel, they demonstrated a prototype product to the editors of gadget and computer magazines: a circuit board connected via a loosely hanging cable to an e-ink display. What we’re bringing to you, the Berliners told the editors, is a piece of the future, and you can be the first in this country to touch it. No one doubted what they were told. Why was the young Berliners’ story so credible, why weren’t they dismissed as fantasists and sent home? Why did they imagine they knew what the future looked like, these editors of magazines that weren’t exactly based in Silicon Valley? The answer is: they knew what the future looked like because they’d seen it before.

As editors of computer magazines, they belonged to a global culture in which it was widely assumed that e-readers or tablets—in 2008, the difference didn’t really yet exist—would be part of everyday life in the future. They knew this future from the afternoon TV shows they’d avidly watched as twelve-year-olds: like many of the inventions of Apple’s supposedly brilliant inventors, tablets are Starfleet standard issue. What we recognize as the future was long ago to be seen on Star Trek: The Next Generation. And better still: almost all the people who are actually producing the future tend naturally to be Trekkies or ex-Trekkies, whether they are researchers in Cambridge, employees in Cupertino, excessively self-confident young entrepreneurs in Berlin, or the editors of computer magazines in Kiel. They’ve all read science fiction or still read it. They aren’t just prepared for the futures of their youth, they are working on them. For the hidebound bibliophiles, by contrast, these futures are an imposition. They feel that the enthusiasms of others, of the Trekkies and the nerds, are being forced upon them.

What if they’re right?

The standard account, of course, has it the other way round: new technologies emerge on the basis of what is already possible. When the time is ripe, they are invented by brilliant inventors. The prophets of science fiction anticipate these inventions and use technology to explore their own time under slightly altered conditions.

Perhaps this standard story is right. Often the more boring stories are the correct ones. But perhaps it really is the other way round. At the end of the day, science fiction has, with few exceptions, never been terribly good at predicting the future. And oddly enough it is precisely those texts that provoke an especially intense enthusiasm for their mises en scène that have offered the better predictions.

So it may be that William Gibson did not so much predict cyberspace as create it. For three decades, his text, an intensity machine without equal, has been producing young men and women who want to live in his world—a nihilistic and brutal world, but one that is also mercilessly cool. Anyone who read Neuromancer at fifteen can still close their eyes today and, wherever and whenever they may be, become Molly or Case or both, and glimpse into a world of data and freedom; a world that’s shining, pure, and abstract, desirable and sliding brightly into an infinite darkness.

We stubbornly call our pitiable Internet “cyberspace,” as if conjuring it to become something better, and try to ignore that instead of offering visually exciting opportunities to transcend our limitations (that Oedipal ballast of body, sex, and origin) it largely contains only dreadful advertisements for diet products, howling packs of mindless idiots, and, at best, cute pictures of cats and otters. But we’re working on it. All over the world, people are sitting at their computers and working on it, often unpaid, with compilers and soldering irons and CAD software and pick-and-place robots. They are building a net that resembles cyberspace more than its current pathetic imitation does; a net where we have a chance against the Oedipal ballast, where everything’s more exciting and smarter and the sex is better. As we do so, we ignore what the ballast protected us from. We are setting everything free. And we’d even accept the world inhabited by Case and Molly, if it glistened and shone just as brightly…

Literature, when it’s good, produces intensities that turn us into agents of its ideas. This isn’t only true of the old science fiction, but of all texts that invoke the violence of progress. Literature isn’t only metaphorically a stream flowing from the future—and not only in the vulgar McLuhanist sense of a cultural technique that’s being accelerated by Spritz or the Fiktion reading app, or bound up with speed in some other way—it is also literally such a stream, code from the future that is executed, letter by letter, on readers of the present to produce new time. Now that we’ve learnt to welcome this dangerous intrusion by alien code, there’s no stopping it: the loop of text, capital, and technology draws us deeper into this vortex of liberation, whose sinister origin we don’t understand but which, as beings programmed by literature, we ecstatically desire.

Ingo Niermann
LITERATURE’S VICTORY

Even twenty years ago, I had friends who thought that they read too many novels and hardly had time for anything else. These days, however, everyone I know laments how little fiction they read and how much less than they used to. Writers are no exception. When we were launching the digital publishing project Fiktion, I got a note from Douglas Coupland: “I am finding that the one thing I no longer have is discretionary reading time. It almost feels like some weird twenty-first-century version of poverty. I read half of what I used to read and have to work ten times as hard to do so. It really is worrying me. I think this is common to anyone who is busy since the year 1900, but in 2013, it is sick and extreme and is possibly really fucking up our culture.”

This could, of course, merely be an effect of getting older. But that is not borne out by how most people, no matter their age, say that it only got really bad in the middle of the noughties, with the emergence of social media. Since then, they haven’t necessarily been reading less, just fewer works of fiction. They just aren’t getting round to them any more. Other texts are always getting in the way—because they’re more topical, because they’re shorter, or because friends and colleagues have sent a link recommending something. Reading novels has become like sex in a long and happy marriage: it’s still great, but it’s harder and harder to find the time for it.

Although more or less everyone says they’d like to read more novels, they’re increasingly at a loss to say which ones. Sure, everyone can think of a couple of classics they’d like to read—but contemporary fiction? It would be too hasty to blame this on a decline in quality because people also no longer keep themselves as thoroughly informed as they used to. What people do still read, at most, is books that receive a lot of media attention. Not the chart toppers, but the one book of the season that newspapers and magazines consider relevant or portray as scandalous. Only then is there a chance of friends and acquaintances reading it as well and so of talking about it with them.

The usual explanation for the shrinking importance of literary fiction goes something like this: we can only take in so much, and the amount of information available is not only increasing very quickly but also armed with ever smarter strategies and ever more seductive content. In this context, literature appears to be at a hopeless disadvantage. The bombardment of the senses by blockbuster cinema, television, computer games, news feeds, and social media is comparable to the effect of industrially produced sugar, flavor enhancers, and saturated fats in our food. Faced with this competition, vegetables and wholegrain bread have it tough. And so does literature. To stay with the metaphor, one could find solace in how, although obesity may be increasing, so too is the number of people who have the minutiae of their nutrition, fitness, and weight fully under control. Could a similar process lead to the bolstering of an elite that abstains from multimedia entertainment on principle and is committed to reading intellectually demanding literature for many hours a day?

Maybe—but maybe not. Literature is, after all, fundamentally different from fresh fruit and vegetables. People have lived on a diet of meat and fruit since prehistoric times, while reading and writing are cultural technologies that are only a few thousand years old, and their reach has only extended beyond a small minority of priests, monks, aristocrats, and the bourgeoisie over the last couple of centuries. This process is still very far from over: from 1970 to 2005, the global illiteracy rate halved to just under twenty percent of the world’s population, and it is still falling. Children from all kinds of backgrounds are reading novels in series that are thousands of pages long, and they are privately and lengthily corresponding with each other every day in a manner that was earlier confined to a tiny elite—and which only two decades ago seemed in danger of disappearing altogether as a result first of telephones and then of cellphones. Today the objects we continue to call phones are primarily used for reading and writing.

This daily practice of writing has made it easier for people to write poems, stories, and novels themselves. Thanks to the Internet, you no longer have to worry that your own literary works will never be published and will at best only be read by friends and family. On the Internet, as Bertolt Brecht envisioned in the “radio theory” he developed in 1932, everyone can be a broadcaster. Everyone can make their own texts available to the whole world on their homepage or on platforms that disseminate them for free or offer them for sale, and they can discuss them in online forums specializing in particular genres. Anyone who does particularly well with their self-published books has a chance of having traditional publishing houses take them on afterward. The fastest-selling novel and novel series of all time—E.L. James’s Fifty Shades of Grey (2011)—initially appeared as a self-published book.

When, with the rise of the bourgeoisie, wider sections of the population started reading fiction, it was still considered to be extremely dangerous. Novels were suspected of distracting readers from their work, estranging them from their families and their communities by stimulating their imagination in a pathological way and putting morally reprehensible nonsense into their heads, such as notions of romantic love or living in the lap of luxury. When, as a child in the early 1980s, I started reading a great deal, my mother, who had grown up on a farm, found it deeply suspect. When I was watching television she could easily check what I was up to (the television was right next to the kitchen). But my mother could hardly keep up with the blurbs on the backs of the one or two books I was devouring every day—though many of the books I borrowed from the library had none.

There seems to be broad agreement today that people never read enough. Every minute that a child spends reading a book instead of playing computer games—regardless of what kind of book—is considered beneficial. Reading novels is generally considered more valuable than reading nonfiction, precisely because novels stimulate the imagination—which was once so frowned upon—and thereby create a zone of protection from the flood of multimedia stimuli. At the same time, more than ten percent of all children in the United States are diagnosed with ADHD in the course of their time at school and are prescribed stimulants so they can concentrate better. However, it is hard to know whether television, computer games, and multitasking have really reduced people’s ability to concentrate, or whether the complaint about a lack of concentration isn’t rather similar to the one about too much violence: it is not violence that has increased, but society’s intolerance of violence. There is probably nothing that has helped the publishing industry more than television, yet the fear that we could “amuse ourselves to death” (to paraphrase the title of Neil Postman’s 1985 book) has given rise to a permanently guilty conscience which we can find no better way to assuage than by buying more books. And even if the amount of time spent reading novels is actually stagnant or falling today, it is an open question whether this might be compensated for by more people writing literary texts than ever before. While we don’t know the number of literary titles among them, 148,424 printed books and 87,201 e-books were self-published in the United States in 2011 alone. This represents an increase of over 300 percent over five years—and that’s not including the rapidly growing number of books without ISBNs.

Hope of fame and financial success may play a role here, but it cannot be the decisive factor. If I mention that I’m a writer in conversation with strangers, their first reaction is generally to remark that I can probably barely make a living from it. The second thing they say is how much they would like to write a book themselves; in some cases also that they have already tried to do so. These days a good piece of music or an artwork can be produced in a day. Only exceptionally talented writers are capable of finishing a novel in less than several months, and it can easily take several years. In a world in which personal growth through creativity is considered of central importance, writing a successful novel is treated with the deepest reverence. Society seems so sensitized to the danger of failing in this endeavor that romantic comedies devoted to the struggling writer have turned into a subgenre.

The more the realms of work and private life become scientifically rationalized and quantified, the more literature is needed as a humanizing buffer in which there is still space for stories, metaphors, myths, legends, and rumors. In America, lines hundreds of meters long form outside collective storytelling events, and in business, storytelling has long since become a popular tool in the arsenal of rhetoric. We can anticipate that psychoanalysis, which used to be criticized for fabricating imaginary memories from early childhood, will now be deliberately used to strengthen the arts of storytelling. Bibliotherapy is developing into its own branch of psychotherapy. Annette Simmons, author of The Story Factor (2001) and Whoever Tells the Best Story Wins (2007), writes: “the missing ingredient in most failed communication is humanity. This is an easy fix. In order to blend humanity into every communication you send, all you have to do is tell more stories and bingo—you just showed up.”

The creative professions are also employing literary techniques ever more directly. A short while ago I was invited to a symposium at Columbia University’s faculty of architecture, which advertised itself with the words: “Fiction allows us to imagine new possibilities, new politics, new modes for living, new ways of understanding the world. Indeed, fiction enables and encourages the elaboration or extrapolation of the present and proves history and memory are mutable and contingent. What para-fictional possibilities open up when architectural production purposely blurs truth and how might this problematize relationships to the real?” (Interpretations: Discerning Fictions, April 20, 2013). If the visual arts of the past few decades tended to draw on film and television as the new leading forms of media production and attempted to reflect the multimedia spectacle critically, art is now teeming with installations containing books and literary references, while increasing numbers of artists are themselves writing literary texts.

Fiction has become a panacea enabling us to better cope with our lives and our work. Characteristic of this is the success of David Shields’s book Reality Hunger (2010), which promotes literature as a zone of particular authenticity, especially through the use of quotations. Shields aims to unify writer and reader in a prosumer who relates everything they read to their own life by continually asking “what does that have to do with me?” in order to use it in their own writing if it’s a close enough match. It is enough to highlight your favorite passages while reading a text in digital form; the archive this produces turns into a running commentary. Everything that is discussed in chatrooms and posted on websites is potentially material for a novel.

Even if you never write a novel, you may still find you’re constantly asking yourself how far your own life resembles one. This way of seeing things is corroborated by how people expect the novels they read to contain considerable biographical or, even better, autobiographical content. But it is also precisely why people assume these novels lack the great imaginative range of the classics, from the Odyssey to Ulysses.

The literary flowering of the late nineteenth century was already decidedly realist, and every new literary movement of the twentieth century has distinguished itself by its heightened claim to reality. The same was true, in an inverted fashion, of aestheticism, whose extreme inauthenticity sought to make life itself resemble a work of art, thereby ultimately unifying the two. Literature competed with the sciences, challenging their claim to objectivity by portraying subjective perceptions that were not scientifically verifiable. Literary writing was understood as a kind of applied philosophical skepticism, in that it declared everything we perceive to be a form of fiction.

However, the idea that there are no facts that we can be absolutely sure of has only encouraged the analysis of our lives according to scientific principles, and recording technologies increasingly enable us to acknowledge the fallibility of our senses and our memories. The opportunities for making up stories about one’s own life that cannot be fact-checked are shrinking.

Only twenty years ago, being able to use a search engine in the middle of a conversation to check whether what you were being told was true would have sounded like a piece of dystopian science fiction. Now people are not just using it to check the truth of what the other person is saying, but their honesty, if in a more or less playful manner. People approach novels with the same kind of interrogational desire. No sooner is a book published than everyone starts pestering its author about how far it resembles real events—even if he or she had expressly written it to trump those events—and again readers may find this additional information more or less plausible. Conclusions about the author’s fears and desires are drawn precisely from what he or she invents.

In the past, people turned to fiction as a way of writing about difficult truths without putting themselves or others in danger. In some cases, it was a way of outwitting the censor; in others, a way of protecting real people—including the author himself or herself—from having their reputations damaged. Today, however, people are writing more openly about political issues and people’s private lives than ever before. The flowering of the novel since the nineteenth century owes a lot to this zeal to confess—Foucault’s “will to know.” The more radical this will becomes, and the more technology advances, the more the novel becomes merely one more way of processing reality among many others, such as confessionals, diaries, various forms of documentation, psychotherapy, self-help groups, reality TV, or social media. Anyone can feel an urge to write such books; at most it takes some instruction and assistance. If a full-time writer hasn’t suffered particularly cruel strokes of fate in his own life—and doesn’t want to stake out a reputation by mercilessly exposing his ordinary joys and sorrows in the manner of Svende Merian, Karl Ove Knausgård, or Tao Lin—he or she still has the option of writing genre fiction as a commodity for entertainment, or serving as someone else’s ghostwriter or mentor.

While visual artists have endeavored for more than a hundred years to be understood as inventors rather than as craftsmen, novelists look more like craftsmen than ever. Unlike the visual arts, the novel didn’t have to contend with one major competitor, as art did with photography, and the result is that the literary avant-garde movements of the twentieth century fizzled out pretty much without a lasting effect. Today most novels are written according to the same principles as nineteenth-century realist fiction: in the main they deal with individuals and their psychological development, which they seek to describe and explain empathetically and without passing judgment. At least since the emergence of postmodernism, one of these people can also be the narrator, who is allowed, as proof of his humanity, to turn out to be somewhat unreliable. Furthermore, there is no longer any need to draw a veil of silence over intimate physical acts such as sex and defecation. This is about all that most novels—including the most renowned—have managed in terms of innovation in the course of a hundred and fifty years.

Yet fiction being seen as only one confessional mode among others also offers an opportunity to consider its particular strengths. Reading involves an enormous narrowing of attention. No other medium conveys less information per unit of time than a text, and no other medium allows for less distraction. During a film you can talk to your neighbor in the pauses between dialogue, but when you’re reading any voices are inherently distracting. If you listen to music while reading, you might forget it is even playing. There is no other medium where the consumer is so alone with himself. Even when writing you don’t necessarily have to be so focused, and this isn’t only a question of speed: reading more slowly doesn’t make it easier to concentrate on something else at the same time. When you read very slowly it becomes difficult to understand anything at all.

Writing, like watching films or listening to music, always involves anticipating what will happen next: someone has taken a step and a moment later will take another. Fiction, by contrast, describes the action of walking itself, and only describes a single step when it is different from the ones before. Each word and sentence can evoke more or less clearly developed ideas which—even if the author isn’t aiming for an unexpected turn of phrase—can be supplanted a moment later, because even if the text is speaking of a table in general terms you already have a specific one in mind. To be as predictable as a scene from a film, novels have to be excessively banal and hence boring. Compared with other stimuli, text is the most intolerant of mediums, but at the same time it allows the most room for the reader’s own imagination. This is especially true of literary texts, which are particularly good at describing specific details very vividly, as well as at stimulating the imagination in their more abstract passages.

Realist writing does not make full use of this potential; rather, it vies with photography in trying to describe everything that happens with the same degree of detail. In the beginning, this was sensational enough as a way of focusing on a range of new subjects that society did not otherwise discuss. But in a society where everything can be said, consuming realist novels has come to seem harmless.

Should fiction take refuge instead in the world of fantasy? Like painting and sculpture, fiction is by nature not a reproductive medium; it is capable of describing the fictitious just as effectively as the real. But when what it describes becomes too divorced from reality, our interest wanes. Alternate-history fiction is knocked down ever more ruthlessly by the stores of knowledge freely available on the Internet, and the real and the fictitious can very quickly be told apart.

When it comes to the future, things are different. What is written about the future can become reality (or become it more quickly) precisely because it has been written about: lunar landings, atom bombs, mobile telephones, exoskeletons, and Leninism all existed first as fiction. Fictitious inventions became real ones. In his 2011 book Science Fiction Prototyping: Designing the Future with Science Fiction, Intel’s in-house futurist Brian David Johnson sees this from the other side, promoting science fiction as a cheap and harmless intellectual game that can help develop new technologies and explore their effects on society.

In the past, science fiction was generally considered literary trash. The few exceptions were dystopian novels which could be read as exaggerated critiques of the present—especially in countries where literature was censored. Otherwise the scenarios they depicted were either taken to be untrustworthy, or the attempt to explain them as plausibly and completely as possible made them appear dull. Nothing about the future is self-evident, so everything has to be explained in greater detail than in the realist novel of the present—if not directly to the reader, then to a limited number of sentient, humanoid beings. The reader is aware that such a set-up is only a pretext, and of how the writer inevitably reaches the limits of what he or she is able to conceive and imagine. Science-fiction writers such as Stanisław Lem in his novel Solaris (1961), or the Strugatsky brothers in Roadside Picnic (1971), got around this dilemma by having their heroes come across something that utterly transcends their comprehension; others have presented the future world as a hallucination, like Philip K. Dick in The Three Stigmata of Palmer Eldritch (1965). By contrast, the science-fiction novels in which today’s society sees itself most accurately reflected—Brave New World (1932), 1984 (1949)—are primarily read to see where their prognoses were accurate and where they were wide of the mark. Aspects such as plot mechanisms or how characters are described are generally ignored.

Nevertheless, the problems that science fiction has to struggle with are only exaggerated versions of the problems facing conventional realist fiction. In his 1963 collection of essays Pour un nouveau roman, Alain Robbe-Grillet pointed out that any novel that follows the principles of plausible plot and wholly explicable characters is a relic of bourgeois society. The more society outgrows its faith in the idea of an individual in complete control of his actions who is at most affected by adverse circumstances, the less self-evident and readable a novel structured in this way will appear.

I notice from my own reading that books that are considered easily readable by common criteria are precisely those that I find excruciatingly boring and annoying. I stop reading many novels after twenty or thirty pages because I like them only for their beginnings, where a certain mood, setting, or conversation is described and everything still appears possible. The further the plot unfolds and the more the characters are defined, the duller the book becomes. Something similar must have happened to those seventies hip-hop DJs who started mixing only the intros and instrumental solos from disco and soul records. Yet, to continue with this analogy, the literary establishment is stuck in the prog-rock phase. With cleverly constructed, exhaustively researched four-hundred-page novels dealing with pressing social issues, it attempts to maintain its significance in the face of electronic media and DIY writers.

If a novel were to be exhaustively realist, then it would, for example, have to describe how you manage to maintain physical and virtual contact with a hundred, two hundred, three hundred people. In this respect, film has it a great deal easier, since faces, voices, and characteristic movements impress themselves upon the mind in a matter of seconds. Literature can only ever fail when it tries to portray everything that is perceivable and imaginable as precisely as possible.

This is a failing you can make yourself at home in. While film and music are, as if in the last stage of an addiction, threatening to choke on their own spectacle—action films primarily consist of chases and explosions, porn films of fucking and coming, pop songs of reverbs and sound effects—literature has become a refuge of quiet contemplation. At the same time, language is more modern and more radical than any other medium. Its strength is that it is so abstract that it is capable of representing things that are imperceptible to the human senses.

What is happening in the world is an experiment, better documented year by year, demonstrating what is still possible. The quantities of data available are growing exponentially, yet what the individual is able to know remains limited. Trapped like hoarders in ever-greater quantities of data, scientists struggle to come up with major new theories. As data harvesting becomes ever more accurate and comprehensive, it may on the one hand diminish fiction’s proprietary space for speculation; but on the other, it leaves fiction an ever larger arena of speculation that was formerly either the preserve of the sciences or—like legends and annunciations—competed with them. Even when it shows itself to be demonstrably false, fiction is capable of greatness.

Novels that deal explicitly with ideas are accused of instrumentalizing their plot and characters by making them into mere vehicles of these ideas. But this criticism is only valid if they still present themselves as, or are compared with, the realist fiction of the nineteenth century, and plot and characters are automatically understood in human terms. Why not write a novel in which the main characters are ideas, and the plot is their development? The ideas would just need to be approached as sensitively as people—whom we also understand by means of ideas. The same, moreover, is true of our understanding of technology, animals, plants, buildings. Human thought—like animal thought—involves a constant balancing of signals and patterns, of impressions and ideas.

In its explicit recourse to ideas, art is decades ahead of literature. While fiction still very much finds itself at the stage of “isn’t that a nice drawing?” there is practically no contemporary art deemed worthy of the name today that is not conceptual—that is, which does not understand itself as an original engagement with specific ideas.

When you’re concerned with ideas, there is no compelling reason to limit yourself to a single image, a single text, or a single film: ideas have no beginning and no end. This is why it is often said of writers that they basically spend their whole lives writing the same novel over and over again. In art you can go even further and allow the work to extend beyond specific objects. It is enough to give directions which can be carried out by someone else, or whose implementation needs only to be imagined. This expanded notion of art is appreciated and welcomed in a society in which people increasingly think of themselves as creative. The prosumer appears again here, and he will also be receptive to literature’s attempts to reach out into the world.

Art today seems rather gestural and ephemeral, which is a consequence of its traditional restriction to individual pictures and sculptures. What, by contrast, could be more marvelous than to live as if in a novel? Not only to feel reminded of Proust’s remembrance of the madeleine, but also, like Marcel, to taste it and to remember its taste. To be finally offered the prospect of a fiction you can actually walk into—not just through advertising (as Klaus Streeck argues in Management der Fantasie [2006]), but through literary writing itself. Would you actually prefer to live inside a film? You’d get the film for free, since most popular films with a livable story are still based on novels. The easier it becomes to film—and soon it will be possible to film your entire life from your own POV—the more urgently documentary footage is going to need a script. Even if we are in future able to record our dreams and experiences and transmit them to other brains, we will still need a narrative to prevent these from merely descending upon us like a multisensorial storm. If the ongoing importance of literary writing is not itself understood by the literary world, then it may be understood by the other arts. Conceptual art, for example, requires explanatory texts by its very nature, and can also itself take the form of texts.

Starting in March 2011, the artist Erik Niedling followed a drill I had developed for him, living for a whole year as if it were his last. He wrote down his thoughts and experiences in a diary, and thus produced, in real time, a work of literature that vividly dramatized his euphoria and despair (The Future of Art: A Diary [2012]). For those who find this too risky a prospect, there is always the alternative of meeting people who have been drawn to a particular idea and listening to their stories. This is what I did for a volume of transcripts titled Minusvisionen – Unternehmer ohne Geld (Visions in the Red: Entrepreneurs Without Money, 2003), for which I interviewed fifteen people whose business models failed because they didn’t take economic realities into account.

A work starts to acquire a political dimension when the ideas it presents lead to widespread imitations. This is something that could also happen to a conventional novel, one that dealt, for example, with wealth, love, or suicide. From Goethe’s Sorrows of Young Werther (1774) to Hermann Hesse’s Steppenwolf (1927), entire youth movements have repeatedly grown up around novels—where people understood all too well that they could walk right into them. But a good writer had to behave as if things like this did not really concern him. In the future, by contrast, it will be entirely up to the writer to determine where a text begins and where it ends. It can easily extend far beyond the beginning and end of a book or of a published story, and change over the course of time.

Regardless of whether you consider the reader’s thoughts and actions part of the work itself, most readers will in future think of themselves as writers, and when it comes to how you are understood, part of it is always how you influence other people’s writing. These days budding authors can be some of the laziest readers ever—knowing how many excellent books have already been published is simply intimidating. But when increasing numbers of people—and now even computer programs—are also writing, you have to constantly question your own writing to stay ahead; and the most reliable way of doing this is to read widely and attentively.

Digitization means that literary writing can be made permanently available across the world without any fixed costs. More and more books whose copyright has expired are being digitized and posted for free on the web. More and more self-published books are available for free from the beginning. However, this abundance of books also presents a problem. Panic at the prospect of never being able to read all of them might make us prefer not even to try. It was also true in the past that people never read nearly as much as they bought or borrowed. The bookshelf was an externalized good and bad conscience.1 Virtual space lacks the presence of such read and unread books—and compared to the unread books that are worth reading, those that have been read are in a hopeless minority.

The expansion of the concept of art gave rise to an enormous growth in the amount of information produced by the resulting works. Their reception by the art world is usually a kind of random sampling. In the process of selection, it relies, like a scientific proof, upon experts—in this case, curators and gallery owners—who comprehensively certify the work in question as art. Resources such as Wikipedia entries and text search functions make it easier to read literary texts, too, in extract form. This leads to writers publishing intellectually demanding novels above all in order to elevate their status in discussion groups, talk shows, and funding programs. But while the reception of artworks has always taken a relatively short time, literary works have been produced for millennia that take many hours to read from beginning to end and offer some of the most exciting experiences to be had anywhere. With the approach of immortality and the technological singularity, people’s need to express themselves at length in literary terms will only increase.

In order to keep their audiences transfixed, films are enhanced with special effects and 3D, as music is by surround sound. Digital literature has nothing with which it can counter this. Multimedia supplements and social networks linked to the reading of books (so-called social reading) are merely so many distractions. Nor is it enough to cut off the Internet or other programs for a while. Rather, what it needs is a new reading format that draws the reader into the spell of a text as into a hypnotic trance. What’s decisive in determining the success of such reading formats is how effectively they hold the reader’s attention, not how fast they enable one to read or how much one retains of what one has read. The reader developed by Fiktion, which is based on the principle of a teleprompter, amounts to an initial attempt.
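To make the teleprompter principle concrete, here is a minimal sketch of such a reading format, written in Python for the terminal. It is an illustration only, not a description of Fiktion’s actual reader; the sample text, the reading pace, and the line length are all assumptions chosen for the example. The point is that the format, not the reader, sets the pace: the text advances by itself, so the eyes rest in one place while the lines move past.

import time

# A minimal teleprompter-style reader for the terminal (a sketch only).
# The program, not the reader, decides when the next line appears.
# Sample text, pace, and line length are illustrative assumptions.

TEXT = (
    "When you are really concentrating on the fiction that you are "
    "reading, it is like dreaming according to a scheduled program."
)

WORDS_PER_MINUTE = 180   # assumed steady reading pace
WORDS_PER_LINE = 6       # short lines keep the gaze fixed in one place

words = TEXT.split()
for i in range(0, len(words), WORDS_PER_LINE):
    line = " ".join(words[i:i + WORDS_PER_LINE])
    print(line.center(72))  # centered, like a prompter screen
    # Pause just long enough to read this line, then advance automatically.
    time.sleep(len(line.split()) * 60 / WORDS_PER_MINUTE)

Even in so crude a form, the sketch embodies the criterion named above: what would be measured is how long the scrolling holds attention, not how fast one reads or how much one retains.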

It is also worth rethinking public readings as collective events. Although it is debatable whether the white cube presents art as neutrally as possible, there has never even been an attempt to create the most neutral conditions possible for readings. The listener’s gaze is drawn to bookshelves, to the backs of people’s heads and shoulders, and to the reader’s involuntary movements. Closing your eyes seems affected, and once the reading has started you can no longer discreetly get up and walk out. Even before it has started, you feel trapped in your chair. Instead, listeners could be encouraged to walk slowly around during the reading, as people used to do in the promenade halls of antiquity, keeping their gaze fixed on the floor and noticing only the feet or stockings of those around them. Since everyone is moving, anyone who wants to can disappear whenever they like. Anyone getting tired can sit down on the floor or on one of the few chairs dotted around the edge of the room. The walls and floor would dampen the echo.

When you are really concentrating on the fiction that you are reading, it is like dreaming according to a scheduled program. You can imagine any kind of nonsense and it will hurt no one, while the more technology makes possible, the more of a burden its possible consequences become. In future no one will need to worry about reading too much, because the processes of production will have been automated to such a degree that value creation will have become largely independent of human labor. Machines will be so much better at regulating everything, even themselves, that we will need to worry less and less about the real world (whatever we take that to be). If everyone lives a modest life, then there will be more than enough for everyone and the only people who’ll have to work will be those who want to. Books will basically be free, because no one should be unable to read a book because of its price, any more than anyone should read a book to the end just because they’ve paid for it. Publishing literature will be understood to be something that serves the public good, just as it was in the Roman Empire, and doing it to make money would be considered shameful. If anybody is singled out to give money, it will be the rich, whose vanity could be exploited by letting them pay for the equivalent of baroque dedications, or by selling strictly limited signed editions, while everyone else is “only” allowed to use the book. Should it turn out there aren’t enough people who want to edit, translate, and sell the many, many books being written, barter clubs will be established for this purpose.

Of course, this scenario is also a fiction.

 

1 Herr F, a novel by Momus published by Fiktion in 2014, contains the following passage: “This guilt is much more wonderful than the contents of the books themselves could ever be, and spiritually much more uplifting. The unreadness of books outstrips their readness in beauty and in utility. It’s tremendously important to believe that there are heights which we’ve failed to attain, mountains we can glimpse in the distance but not climb. It’s almost like believing in heaven. … An unread book is a mirage which enchants the whole world. As in the paper money system that Goethe rails against in Faust, Part 2, the legitimacy of the humanities is based on vapour, glamour, lies and lacunae. Money relies on notional values, debts and obligations which, if called upon, would prove fictional and collapse the whole system.”

Sophie Jung
X-EXAMINATION <3

Someplace cosy, 11.10.2014

 

It was about 11 o’clock in the morning, mid-October. 
I was wearing my powder-blue suit, with dark blue shirt, tie and display handkerchief
(I just heard: it is also called, rather amputatedly: kerchief!) black brogues (Pumas, actually. #winning. You can’t strike through a hashtag) black wool socks with dark blue clocks on them. I was neat, clean, shaved and sober. And I didn’t care who knew it.

 

I start like this for a reason. Cold start. Cold sober. Stone cold sober. And none of these will be a leitmotiv, or one, just a bit. Hazard a guess. If I manage to make it that, that is. Time will tell. (You off for thinking this could be anything but NOT chronological. “Above all: footnote.”)

I’m not trading in motifs, though. Just to get that out of the way. “Fashion police! Out of the way!

No false expectations. Someone I once knew once said: Your How beats My Why.

I stuck with it.

 

So we’re in California, obvs. LA I guess –

And now I’ll drive down to some place in the Midwest. My geography is beyond all good and evil (as is anybody’s, to be fair. But what’s fair? In nature all’s fair. In nature it’s all beyond GE).

 

I’ve never been to the United States, so it’s all the same to me. All my states (sober, fluid, cast in 1st stone, unturned) are temporarily united into one Big Idea. US – Us two. We’re not married (I’ll repeat) but we’re very much, you know, on the same page. (“It’s not a matter of size.”)

It’s a matter of size. A1 is what I like best cos it’s an A and a 1 and I like beginnings. Neat, clean-shaven and sobered. Stubble is for 4h later. A4 o’clock-on-sock shade.

 

This textUR (E for the Engl.)’s like the …-kerchief. Fine wool, too. But white, so as to see the traces better. You know. Evidence n all that.

Kerchief is like a sneeze.

A sneeze is like an orgasm for minors.

That’s what they say in sexual education.

“Oh no, Ma!” (To P.: “Aahhh”)

And that’s a major reason for tryna contract those colds.

 

UR text will begin now:

A little sth I wrote earlier (like last week sometime when I was on the road again):

 

Music:

“You with your rings on your fingers and time on your hands”

Is what he sings as we’re driving down a semi- and why not country road.

And why not have a country song matching a country road sometimes.

And I never listened to his lyrics and now I realise he is a wordsmith.

Because I thought about it and a good writer is something different. A good songwriter means a lot of things but somebody who’s capable of making taste full/less twists of tongue is what I decided is a wordsmith. I make twists of tongue and he’s so very into that, but not while we’re driving. My linguistic turn puts him off, he said. He said that once after a near death experience. But what a thing to have had. It’s pretty priceless. I am a WHSmith as I sell a lot of things. Tell me what you like: I’ll make it. That’s the kind of artist I am. Always acutely aware of my audience. Always a cute and a ware. Wolf Girl, like Mowgli. And I, too, was taught by the Big aka Baloo. It’s no joke, you know the bit where he almost drops the tree on the kid cos he’s just too into that musty pit of ants? My dad almost (or rather not almost but certainly did, yes) dropped a heavy piece of furniture on me and hence – but hey.

He let go. But my mum will not let it go. Blames it on everything that’s ever gone wrong in my life.

My lack of focus during chemistry.

I blame that on chemistry itself tbh.

We are one big fat bundle of hormones.

“It’s what I call assemblage,” she said. It’s what a lot of ppl b4 u called assemblage, bitch. But I nod and take notes.

At least I am. Don’t know bout you. Don’t wanna assume things. This is what I’ve learned from you English. To assume. Always assume and then decide post-factum you actually don’t want to assume. And you take one big step back, as if stepping away from your imaginary assumption, pushing it away with both hands. There you go, Assumption. I disassociate myself from your never having existedness.

I have one ring on my finger and that finger does not push any imaginary things away. Just the real ones, like marriage, which makes it a constant paradox right in the middle of my index.

What’s your happiness index?

I heard on BBC4 there is no such thing.

I could have told you that but I haven’t got the broadcasting authority.

I do have broad casts but use them for other things. Like making any kind of art you like. Any. You name it.

You name it and I will rename it, as I am in charge of titles, thankyouverymuch.

If you think you can get into my designation you’re barking up the wrong tree. You’re staring at:

The Wronk Bark. (bark (not), me. Dimensions vary, 1982)

I am a bit of a sniffer dog. Not a wolf, more a sniffer. You know, the classic. Don’t know much about breeding but I found I’d like to call myself a mongrel. I detect things, but it’s not up to me to say what. Or I’m not up to it. Or none of the above or anyone’s business.

Eye am an i. A private i, as would you take me seriously if i was publicly funded? Taxpayers’ investigator?

I think not.

I think I’ll go private and find out for myself and my pecunious clients. As that’s what artists do, let’s call things by their proper name.

Index to Title: I haven’t found what I’m looking for. And I don’t know what I’m looking for, what’s more. But I’ve got a lead. Got some role to play. I called it “Cop Out! Dick In! #Unleashed.”

To quote every detective that’s ever been on late-night telly: Sth sth remembered sth in that conversation was important but sth sth can’t remember for the life of me what it was sth sth. And by the end of the show somebody says sth unrelated: Cut to and on his stone-cold-sober-for-now-yet-informed-by-a-lifelong-business-partner-liquor-obvs facial structure twitchingly follow his memory back to that then-not now-so relevant place and he’s like: YES! And all the pieces come together like in a magnetic jigsaw puzzle where the magnetism had been switched off forever due to rising electricity costs etc. And then some rogue comes and switches it back on. Do they exist? A passive B-Move. Maybe I shd patent it. That’s the kind of work I do. I stumble across stuff. I sniff it out. I am following a trail.

(Once I know Final Cut I’ll make a trailer so you get the idea without paying that fee.)

As we’re following a VW trailer that says peace and love on it and I think “wake up, mate” and yet I stick both fingers up to show: yes, me too. Peace. But he doesn’t see me. Or it’s a she but I assume the man is driving. Cos that’s 60s hippie sexism for you. I could stick 3 fingers up with my other hand and then it’s VW but then again I could be more economical and just stick up one. As I haven’t got that many to spare, what with all that time on my hands.

 

Sth we shd all drop asap is our Oh D’O. Ur (I’m encoding in case there’s sm1 else reading) if we care about privacy. Listen to me, I read this in the Daily Mail: Forget fingerprints – Detectives will soon SNIFF out criminals: Technology identifies people by the odour of their hands.

 

Mag Pi (as I said I sort-of find stuff) / Mag Net: Is ohm not something? I wasn’t that present during 8 years of physics, mainly due to physic-all other interests. But I seem to remember there is sth called ohm?

That could be incorporated. Body and sole (of my “brogues” stepped into sth smelly. I sniff dog but that’s a conceptual inversion.)

Ohm shanti my jig saw the light. Felt the power of passive assemblage. DIY into a/the (index/title) big picture. Oh that’s what you look like. Never would have guessed. Didn’t want to assume.

(U know the drill.)

Here: Hole-E

3 Ohms a day have improved my timber’s intonation somewhat.

And we seal the practice with 1 Ohm.

May I just say: Shant i (say?)

I will say one thing though: Non-stealing is sth we chant a lot, so I’d like to take a minute to clarify. If you google the following sentence you’ll understand:

Magpies do not steal trinkets and are positively scared of shiny objects, according to new research.

And as he opens his battered raincoat the rays of the VW headlights just catch a glimpse of the happy-to-see-me in his pocket which in turn blinds me with fear and longing (or sth along those lines).

 

Now, you’ve got to locate your reader in time/space is what sm famous female just said, and for the middle bit to be held comfortably btw two sections I shd go back to California, y’-i-know. But since I’ve never actually been there and comfort is not what I’m after in this piece & even better: I’ll go to Qualifornia.

I’m good at some things, you’ll see.

 

Down and out: “ever have a tune in your head and you can’t remember the words?

(necessary footnote)

 

Yes. I think I mentioned this before but then that was yesterday. I tell you yesterday is a wind gone down, a sun dropped in the west. Yesterday is a cancelled check.

And yesterday may well live in an altogether different era (I do still believe in) before and after rehab Re: Fab 4 all I’m concerned. And today I am all new. Stone fresh clean cut sober etc. etc. So.

 

The hunch. (Back to that issue. I carry the cross of a repetitive inclination. Word. Cross my heart and examine my chest. It’s all the paint fumes, probs.) I’ve been thinking about this A Lot. My thesis, how about it: The artist as “con/temp.” sleuth. Not sloth no longer as we’re in the business. Big Business BB. “Success? I following my instincts.” Up to a point when clean cut analysis kicks in. Out of the blue October morning. You can’t predict timelines.

Out of my hands.

Onto your forehead.

Sloths, on the other hand, follow insects. Or rather simply hang out and wait. So in that respect we might have sth in .com…/ (I just said that to make you feel comfortable in your assumtions regarding your average artist’s productivity. What do I do all day? Oh, you know. Breathe f.e. Breathe fumes f.e. Breathe deep: OHM.)

One thing though. I don’t mind being charged and I do appreciate attention but most days I simply can’t pay. The doctor troped it for me: Your headlamp flickers. And I assume the bulb’s got a slack joint or sth. Before you know it the light’s gone out. (See and sloths have very strong joints. They stay hanging off branches for weeks. Hanging. Not A Move. Never drop never stop. Tailing is their middle name. Middle man. Neither. No man’s name no man’s strain. Right in the middle of their cute little bodies. Anyway, point is: My focus is too weak to sustain writing a master piece on said theme. (& anyway peace can only be sustained once the concept of master is dropped. Talk about dropping & throwing cold cuts / sober stones: Just sort of threw that in here. It’s debatable. Challenge me. Find me at luncheon. In the glass house, covered with dusty blinds. For the effect. Sharp blades of light. Ring the bell with the lying down 8 on it. It’s an insider. xx J.O.B.)

 

So I’ll go sniffing for quotes cos that’s a dumb job any assistant can do and as it is I’m assisting my own brain right now. I mean that flash of pure genius I had to disassociate myself from so as not to be held responsible. I can’t be held accountable. I can’t even count most days as my hands are temporarily busy. I count one cunt and one private dick.

 

(Enter: Maestro)

 

TBH with you, rn I am less of a wolf or a bear and not even the funny monkey with their trumpet but The Pantheist. In this case. I got many-a-role. And a colour like: I’ve just made myself Pink. Like, as I said: private dick. Private 1-eyed clock (off – in/form./informal 4 face) Got many a ring to myself, like a private bell (end the cat now)

Just like that.

And why not be a little bit of a gender fluid. Fluidity is my belief.

Religious views: (fluidity)

Description: Fuck off

Anywho

dunnit/dammit. Get to the point. Blank. That’s my state and I / USe my statUS to remind myself of myself. Don’t forget, Sophie Jung is dotdotdot. Enter.

My status is thusly: I find quotes in detective novels. I already found one.

See above.

And above all it’s about proving my point which is this:

Space out and see the whole picture.

In outer space all is fluid. I think. Though I wasn’t that composed, myself, when Dr. Prof. Challenger talked me through it. Right to the other end. And back.

To the moon and back if you b – if you b my bb –

And you’re gonna drop a beat.

And I’m gonna drop the link.

It’s gonna happen.

But it’s for the greater good.

And the trouble is finding, amongst all links. ALL links. That one that gets you to the next one. The so-called solution and you know anyone will do but then only one does do in the end and that was the only one that was gonna do. The chosen one. By me. The right one, though you know there’s more to spatial taxonomy than just those 2 of which you get 1.

AB solution

But chemistry aint got nought (excuse the accent. I’m in cognito ergo sum of all pieces more than so on) to do with this, piss off bb.

Plan A

Plan B

Because there are infinite ones and not just 26 but you gotta fix sth somewhere.

As an artist (heavy breathing – pregnant pause) I am not searching, I am finding. (That’s a quote by a very famous man – you know him – you’d know if I told you his initials. BB. PP. Good shape, no belly. Not circular, he’s cubist. lol.)

 

I am finding in the depth of my “dark matter.”

I am a miner. I was a minor but now it’s ok. No pressed charges. No investigation into this dark matter as we both… u know. “doesn’t matter.”

Ok. So then I need that torch. Casual stroll. Not looking for anything. Patrolling my subconscious while keeping the Big Picture.

The Big Sleep is what I’m trying to evade but bits of me are always snoozing. Depending on what I stumble across I might wake one or the other memory and then. Crooks and dealers beware.

B-Where? (Beware wolf man. Freud of pretty Götterfunken will try and diagnose you. Resisters and brothers. No. Do not let him, he comes and chops down all the pretty maybes in your backrooms. All the dark I’s looking down from that tree onto your SN-eezing/oozing self. Word of advice – a word is a vice and pre-linguistic darkness is UR beauty. Oh the Joy.)

 

But also CK Be. Wear them goods. Good. Ware them black blue socks.

COS face it: you are what you wear. I was told. Navy is The New R-Me. R-We dressed alright? Uniform but always morphing.

 

I read about 4h of facebook a day. A good book. An open book. Open to all. And now they’ve apologised to the transgender community, so that’s ok, too and I won’t feel too bad for contributing.

See, we’re only really allowed one identity, anything more is just self-indulgent.

iDentity. Its sharp teeth only run on and over the new operating system milklamb.

I made a sculpture out of the “first page” of a detective novel. Text UR – (a mountain of magnetic potential form-re-form-re-form-re-form in infinite variations. Magnetberg Variations. LoL. Infinite precision at the tip of a digit. C’mon bb Kindle my “fire” this book won’t burn. You melt its <3 make grown men cry.)

Her eyes were slate-grey and had almost no expression when they looked at me. She came over near me and smiled with her mouth and she had little sharp predatory teeth. As white as fresh orange pith and as shiny as porcelain.

The trouble was my palette got mixed up, I couldn’t decide whether to use orange as well because, you see, the word is mentioned. And that’s how easy it is to bi-dentify. I shd give this as an example in my feedback form to FB. I can say: hey, fb you private dick! Lol. It’s your job to sniff around. So sniff out who I am and tell me after. Cos my private aye ain’t so private anymore, lol, so I sort of got lost.

E-value ate my profile. No good? I’m always best full frontal. Edit profile l8er with that nose job. But it does worry me as I rely on my scenting ability. I take this at face value cos it’s what it’s called and sometimes that’s enough of a reason. I come across much. And some things stick and that’s how I collect evidence.

Magnetic movement still captures the minutes I’m in. That’s a sticky note (I can’t read notes but this one is part of a good good tune and I once heard it hummed back in 68).

It’s like the private dick. He shd b dirty though, cos then it sticks better. That’s why we artists always walk around looking for gold looking like shit looking for gold. I mean our clothes. Dirty Filthy Nomads. This way we don’t even have to bend down to pick up the eventual evidence. Keep our heads up high: Stuff just sticks. Like bushy stuff on dogs. U know the ones. Cocklebur. (Superintendent Cock Le Bur – i don’t trust his face.)

That’s how Velcro (i must capitalize as it’s a trademark you know) was invented. Superintendent Cock Le Bur (without no super intention or any intention other than the one at hand: dog needs piss) walked with his dog in a forest and all those funny sticky things got caught in its fur. He deduced if A and B etc. and was #winning. He really got his sharp lil teeth into it, he did and 8 years later he patented Velcro. Light on: The Velvet Hook. (Though there are rumours that Velcro is a Vulcan invention but I won’t go into it now. Magmetic herring. Molten red. No deep space on my screen. U know the English dresscode: Red and Green shd not be seen.)

 

At times and as we’re near the colour green, what with forests n that I want to say that I was gonna get a ring from my godmother. An emerald. X-actly the same one Faye Dunaway sports in Chinatown. She got it from her Parisian relative. And I was always gonna inherit it. Value! (I got a quote.) “Forget it JaCK, it’s Chinatown.” Cos somewhere along the line sth got lost. Hand Cuff Link dropped. And so I got a pair of bifocals, for what it’s worth. With a chain on them with loads of links, super pretty, and I wouldn’t give it/them away for the world. But so I’m not the one with rings on fingers as that role went to another member of my family. The Ringleads’er down a long line of Jungs. And Jung wd have sth to say about my dream involving lacks of liquid and me – with my dried-up mouth chewing on a piece of my grandmother’s jigsaw cos “in my dream” it was gonna be the best artwork EVER and I was gonna #win. I did as I was told after waking and it’s on show as we speak but nobody’s said anything congratulatory about it. It’s a piece of a larger image of some Cali Cali foothills. Bathed in purple evening light. But good/evil is beyond all that – over the hills and everywhere. Cos I’m ok with it. And talkabout finding stuff:

It’s from a jigsaw that had 13 pieces too many. That’s half an alphabet. And then we, my mum, my gran and I composed 1 letter. The packaged response was a brand new one. Same one. Composition in Blue, Purple and Orange. Uncomposed, so we spent the better part (we couldn’t tell which, what with 1000 parts a part) of the day/week/year assembling it. And it had 2 pieces missing. And one of those two is the one I used. Go Figure. 2 fingers. One Peace.

Etc.

Etc.

 

(Repeat Markers)

 

EPILOGUE or whatever this is called amongst you literary ppl.

Of course I do have sth to say on the matter.

And if this was a BBC 4 program like, say: Artists in Their Own Words (not quite so possessionless after all. After 6 S’s & before After 8. Wait 4 the T – to get the general & infinite setting.)

 

I’d make sure that:

 

A)I’d get myself a sophisticated studio as background, which would necessarily have to be shot with a short depth of field so that I, for once in my life, am in focus.

 

B)I’d be shot with a long depth of track. I mean back on track. I mean I’d get my words all lined up and back on track ready to perform the straight and efficient run. You may say linearity is not my cup of tea but then I’d say that linearity is not a cup of tea at all as I’ve abandoned metaphors to the viewers’ ranks for now and anyway. You may run 100m but the track is still circular so the potential to come back loopy is there.

 

C)But I wouldn’t actually be shot because the detective genre is an analogy and though we try to stay on track, nobody said anything about analogue. We are still all about digits, I haven’t given up on the ring, honey.

 

D)π

 

So I start. Short and sweet: (one spoon, please, not a whole zuckerberg. – Bad for the figure <3.14159265359etc.etc.

 

E)Ah, yes. That’s a splendid question. You see, I try to detect relevance. I fancy myself as a bit of a detective, you know, the classic 1940s noir type, you know. BogArt, say. I walk the streets at dusk and ask a lot of questions. Or I sit and listen carefully. I know when something I see or hear has got relevance, when something will be of importance later on in the plot. I shall most likely not be able to place it, but I infallibly sense what will come to play a role. Think me queer, but, you see, I just, for the life of me, can’t tell you why. Or where. Or when, for that matter. I simply have this compulsion to collect evidence, as it were. Yes, I certainly should say that I am a born artist just as a snoop is a born snoop. Do people still use that term, yes?
Though I’d say that is only a role one assumes and roles belong to the main body of the text and not to this epilogue sequence to be aired later on in the year (11th of October, I bet) role-ex pelled of this body back onto the socks. Rewind. Back to the beginning. Flash back and here we go again:
I intuitively sense when something has got potential, let me call it potential, that radiates outside of its current status. It is a thing in itself and a piece of a larger ensemble,
a bigger www.hole. (enter: private dick)

 

F)“I’m sure I don’t know what you mean.” “I’m sure you do, missy.”

 

G)The core difference, and this is important, is that the PI finds evidence to get to the solution. Whereas in my profession there is simply the resolution to compose a good riddle. The raw riddle. Crudities and cold cuts. And if you cut deep enough you’ll find that my eye is essentially public. And I find clues to get to the bottom of a cast. No, let me put it differently. And that’s the challenge. To continuously put it differently and yet stay infinitely spot-on. Switch on the spot light while trusting in the ghost light. Precision within vastness. You must think me rather vague, but that is the only way I can put it. Or one of infinite ways. Look at this infinity of nos, not-like-thats. And then this expansive potential of yeses. I have to nose my way towards the right poise of potentials. Does that make sense? Oh dear, I must sound frightfully airy-fairy to you. Please do cut this out. That’s what I do. I work on pliable casts for you to put whatever paste you use in. I cut my losses, I lose my cut links and yet. This is the basic scenario for the perfect case itself. Puzzle your X-Examiners.
Secret letters from the bottom of the ABC will, if cunningly placed, resurface just in time for you to crack & heal the well-heeled crossword in the Times. Yes, I’d say The Perfect Case Itself, 1968, is a precisely arranged assemblage that allows for an infinity of finite solutions.

 

H – Physics) Time on my hands and clocks on my socks. Now you see, this is a sentence I once made. Out of found material. And – well. I think it’s rather beautiful. Don’t you? Though I must admit, I’ve no clue as to what it means.

 

CUT

 

Thank you

Thank you

(Thumbs Up)

Amy Patton
EXTRACTS FROM AN OUTLINE FOR A MEMOIR WRITTEN ABROAD

 

CHAPTER 1.

 

The Author arrives at Work Chair; queries, her correspondence with the Mayor of that Enterprise. Paralyzed & befogged, she attempts to Focus, queries for twenty minutes, suddenly grows very sick; the Author panicks & vomits, a very meticulous account of all circumstances thereupon: her languid descent down the staircase & arrival upon the Street below, into the Deutsche Post for a Short Break & Fresh Air: how the inhabitants of that territory—descendants of Vandals, Goths, Huns & Lombards, all—register her presence; the Author’s reflections thereupon. Author accidentally faints while standing in Line.

 

CHAPTER 2.

 

How they flock around & gently stroke her limbs until she stirs, & feed her Venison Stew by a fire hewn of burning Mailbags & castoff Packages, until she has returned to her faculties. How, after a few tentative days & nights with the Fellows at the Post Office, she conjugals with a young, Deutsche-Post-Fellow, becomes pregnant & is taught to subsist on mostly Milk & Flesh & Figs, & to bathe in the sink/toilet for Employees Only. How she becomes accustomed to no employment, or discipline, & doing nothing at all unless she feels like it.

 

CHAPTER 3.

 

How, six to eight weeks later, the Author is already settling into life there, & has adopted their customs & costume of Beaver-Skins & Special Garbs lined with shredded envelopes, decorated with staples & multicolored adhesive dots. Description of edible plants & animals found in this habitat, with words explaining all their anatomy & uses; along with the reasons why they would not live outside the Post Office & the Methods used to promulgate them here. Cross-section diagram of the Interior of a Deutsche Post Sorting Station; its Structure wonderfully adapted to animal husbandry & minor crops cultivation, sustained by a small Aquaponics System that also contains, inter alia, Catfish, Carp, Watercress, Radishes, Beets, Squash, Peas, Cauliflower, Edible Flowers & Cucumbers; a short aside about more menacing Creatures, i.e. Screech-Owls & Foxes, & other Predators that sometimes access the building, their plunder a source of constant anxiety & a major Resource Drain. Terrifying real-life anecdotes about attacks in the night; darktime dangers in General; the Time the Author slumbered while guarding the Four Cash Registers, plants & animals, & her rightful humiliation & violent publick upbraiding afterwards.

 

CHAPTER 4.

 

How the Author takes unto her a Supervisor, a Mentor. Describing Supervisor & Her Character. How Supervisor visually asserts Herself in space, flaunting garish, ill-fitting Skins, intentionally widening Her figure with Lavish Amounts of Food that is Actually For Everyone; pale flesh spilling from the bottom of her Bear-Skin Waistcoat, & Breeches so taut, she can neither walk, nor perch. How Supervisor belittles & distresses the Author with lewd remarks about Postal “Slots,” introducing paper clips into the Author’s vagina as she slumbers & defiling the Author’s Bear-Skin with Gold Stickers, & Thumbtacks, & Neon Bookmark Stickies, & One Time almost pushed the Author’s neck onto the Package-Evicerator(!) How the Author could not stir for Petrification for days, & yet eventually regained her Confidence, Triumphed, & Righteously Murdered Supervisor as She was Attending a Customer & what Customer said on that occasion. Paragraph on the disposition of corpses.

 

CHAPTER 5.

 

How the Author—exulting, bloodied from Battle, having annexed Supervisor’s entire flock of animals, two Husbands, & several valuable decorative & medicinal plants—advances from the Front Counter Area deep into the Inner Sanctums of the Package Unboxing & Eviceration Station, where she glories in the Approbation & Respect of her Fellows. Short dissertation on the Nature of Power. Bales of collapsed cardboard Package-Boxes set ablaze, then doused with water from the Aquaponics Tank. Description of rejoicing & music into the night, an eruptive symphony of deep Hum Strums, Shrill Trumpets, Bagpipes, Salt-Boxes, Pipes & Kettle Drums. How the Author’s belly is covered & wrap’d in Deer’s Hides & Taffety & hands are painted, & a Cap of Neon Posterboard is placed upon her head, Brown Paper on her legs, & one High-Value Postage Stamp over each of her eyes. How the Author is placed on a Bier, & is carried with great Vogue to the Supply Closet, where she awaits the birth of her Child.

 

CHAPTER 6.

 

Child is born: a Son! How the Author gives birth in the Supply Closet, in a squatting position, attended by Crones of the Post Office who massage & knead the Author’s abdomen, burn bundles of Sage & Schmierpapier to purify the Supply Closet; & the Placenta is tossed onto a cactus. How the Child is Packing-Taped to a piece of plastic bent to fit the Child’s body, leaving only his head free, so that the Author can pick up the bundle & hold it to her breast to nurse. How the Child is kept swaddled in a small cage just outside the Supply Closet, so the Author can also sleep, & is removed from its cage thrice to four-times daily, with shredded paper & fresh moss placed between his legs to absorb the natural discharges. How the Author slumbers in the Supply Closet on a slightly raised platform of Padded Envelopes draped in Skins, & remains there until after her next menstruation (8–10 weeks). How her diet is confined for some days to a broth made of Boiled Roots & Berries, & chewed coffee grounds. How the Child becomes very ill.

 

CHAPTER 7.

 

How ennui sets in. How the Author resumes regular leisure activities outside of the Supply Closet, but the Blight of Crops & Animals means basically she is forced to perch & perform Counter Tasks for Food & Packages. Mortal antipathy for Customers develops, leading to certain incidents. A Confederacy forms, & Deutsche Post falls into a vile state of brutishness, with striking fists, nightly beatings, limbs & heads severed in great anger. Author’s Child perishes. Author mourns Desperately, keeping a fire day & night, & harms self with Envelope Slicer in paroxysm of Grief. She is also eating Lavish Amounts of Food that is Actually For Everyone, little that there is & growing fat as a Hog. Confederates conspire to Expel her.

 

CHAPTER 8.

 

How this basically amounts to the Author being denied All Food for Days and Weeks on End. How the Author, starving & emaciated, finally slips defeated through the Automatic Double Sliding-Glass Doors, bidding farewell to No-One, & onto the Street. Poverty-Stricken, Emotionally Unstable, & Riddled with Sexually Transmitted Diseases, she survives for 6 days in the Vastness of the Great City. Observations on how rich it is, & how it stinks.

Credits

 

First published by

Fiktion, Berlin 2015

www.fiktion.cc

ISBN: 978 3 944818 88 7

 

Project Directors

Mathias Gatza, Ingo Niermann (Publishing Program)

Henriette Gallus (Communications)

Julia Stoff (Management)

 

Translations from German

Nathaniel McBride (Dirk Baecker, Nina Bußmann, Ingeborg Harms, Arthur M. Jacobs / Raoul Schrott, Ingo Niermann, Johannes Thumfart, Ronnie Vuine)

Amy Patton (Introduction)

 

English-Language Editor

Alexander Scrimgeour

 

Proofreader

Tess Edmonson

 

German-Language Editor

Mathias Gatza

 

Design Identity

Vela Arbutina

 

Web Development

Maxwell Simmer (Version House)

 

The copyright for the texts remains with the authors.

 

Fiktion is backed by the nonprofit association Fiktion e.V. It is organized in cooperation with Haus der Kulturen der Welt, Berlin, and financed by a grant from the German Federal Cultural Foundation.

 

Fiktion e.V., c/o Mathias Gatza, Sredzkistraße 57, 10405 Berlin

 

Chairs

Mathias Gatza, Ingo Niermann

 

Registered association VR 32615 B

(Amtsgericht Charlottenburg, Berlin)

 
