Modern Life

The Hummus Wars


I used to hate hummus. I mean, like, really, really abhor hummus. Considering my Middle Eastern heritage, my intense dislike of the beige stuff was pretty much sacrilegious, and intensely baffling to everyone around me. And I guess that was sort of how I wanted it.

Hummus was available by the vatful at every family gathering we ever had when I was a kid. And I summarily shunned it. I’d turn my nose up at my Lebanese grandmother’s ministrations, preferring instead to dip my carrot sticks in nasty-ass ranch dressing; to smear my Syrian bread with spicy mustard, or cheese dip, or anything, really, so long as it wasn’t that vile, lemon-tinged spread.

But it wasn't personal with me and hummus. Looking back, I suppose both my initial hatred for and eventual veneration of it had more to do with being contrary than it ever did with the actual dip itself. Cause honestly, what's not to like? It's a cheaply made and entirely unassuming concoction: toss two bucks' worth of beans, tahini and garlic into a blender and you're in easy meals for a month.

But I hated the smell of hummus. I hated the texture of hummus. And most of all, I hated being constantly strong-armed into “just giving it another try” by my garbanzo bean-gobbling brethren.

“You’re crazy,” they’d announce as they loaded heaping spoonfuls onto black olives, onto cold lamb, onto green radish salad, onto anything in reach, really. It didn’t matter what the food was: it functioned merely as a delivery mechanism for the great globs of hummus.

But I cared not a whit. I’d glare back, reach for the French’s and defiantly splooge a watery dollop onto my flatbread.

They’d merely gaze over at me like I was an alien, then continue with their garlicky repast.

By the end of such a night, my grandparents' entire house would begin to reek of reconstituted garlic. And as I was the only one abstaining, I was also the only one who could smell it. The windows would fog up and the afterglow of the hummus glut would seep through everyone's pores and out of their slackened mouths as they dozed on the flower-patterned davenports, platefuls of olive pits quivering atop their distended bellies. The smell would drive me out to the darkened backyard garden. There, I'd station myself among the bowling trophies on metal stakes that my grandfather had driven into the dirt at irregular intervals. And I'd pout. These people were crazy. I was sure.

I grew. Things changed. In the mid-1990s, hummus fell into sudden fashion in urban pockets across the U.S., including Portland. The freewheeling and ultra-polarized culinary landscape of the 80s was in its death throes: McDonald's, Easy Cheese, and Twinkies were déclassé, even trashy. But so was the lofty egocentrism of haute cuisine: people wanted to eat well, but they certainly didn't want it to make them fat or broke.

Hummus made that dream attainable by acting as a sort of great equalizer. Unlike sushi or snails, it came cheap. And unlike veal or caviar, eating it didn't make you feel like an asshole. Spoonfuls of cilantro hummus spread onto a plate of saltine crackers for din din allowed broke hipsters to maintain both their (questionable) ethics and their (immutable) dignity. What's more, the opposite sex found hummus charming, bad breath be damned: out on a date, ordering a hummus platter appetizer in place of chicken fingers or crab cakes made a person seem worldly, health-conscious, adventurous. And bringing a tub of kalamata olive hummus and a bag of tortilla chips to a party was a popular and economical solution for any cool kid short on coin, or creativity, or both. Just like that, everybody was eating it, because they wanted to. And because they could.

Everybody except me.

The meteoric rise of this gustatory giant coincided neatly with the slow burnout of my childhood, although I suppose at the time I only vaguely noted that either of those particular constellations had begun its irrevocable shifting. I was too busy being rash and impressionable and entirely infatuated with the New Guard of Older and Wiser scenesters I'd begun to spot out and about in Portland. I was a comet's tail, chasing after a set of re-imagined ideals that both perplexed and intrigued me. The city was changing. And I wanted to change, too. Sort of.

Smart little tubs of hummus appeared in the deli section of the grocery store, next to the brie and the prosciutto. Friends started insisting we eat out at an ethnic-without-being-too-ethnic Hawthorne Street café called "Garbanzo's." I began to reconsider. Perhaps the nougaty, fibrous pabulum was more than just a foul-odored throwback to the Old Country. Perhaps it was something worth rethinking.

So, come 1998, I grudgingly set aside the curio of excuses I'd been hurling at relatives for a decade-and-a-half (It's too mooshy, it gives you butt-breath, it tastes like the inside of a man's dress loafer smells) and I purchased a package of Athenos-brand roasted red pepper hummus from the Safeway near my house. Ounce for ounce, that round little tub cost a good ten times more than my grandmother would have shelled out to make the homemade version. I brought it home and spread a thick layer onto a piece of toast. Standing alone in the kitchen, I scrunched my eyes and took a bite. And … something was different. I ate and I ate, and as my belly expanded, all those nights sulking alone in the gardens of my malcontented youth fell away and I rose up, a (kind of) new person.

Of course, my grandmother was overjoyed when I informed her of my culinary deconversion. She started bringing me quarts and quarts of her version of the stuff, delivered always in a repurposed cottage-cheese container, and made extra salty, just how I liked. In the early aughts, when I was a destitute, vegetarian university student, she also taught me how to make it. Our family recipe, I discovered, was runnier, lemonier, and imagined from a more spare spread of ingredients. My grandma insisted it was more "authentic." It was a vague word, crowded with opposing possible interpretations, but I'd learned by then, from my close observations of Bigger Kids, that it connoted something of immeasurably high worth. I ditched the French's and the ranch. I did not look back.

Today, I'm almost 30. My grandma will turn 85 this October. We still make hummus together. It's now on the starter menus of at least 80,000 Portland restaurants, and a covey of competing labels do battle with Athenos in the deli case. Scads of culinary trends have come and gone, but this particular one has endured, at least in the Pacific Northwest. And why not? The stuff is simple, inexpensive and healthy, with just a touch of the exotic. Eating it is sort of like a vacation in a tub, and shoveling down a humble meal of pita and hummus in front of the TV some Thursday night with the lights dimmed, a person could imagine himself fantastically transported: seated upon a mirrored pillow in some Bedouin tent in the Moroccan desert, perhaps, where the hookahs bubble well past midnight, where dancing girls gyrate their browned bellies in the shadows of a low-burning fire, where endless silver platters of hamos lie spread across low, rough-hewn tables and some friendly, dark-eyed stranger with garlic-breath crouches nearby, whispering, "Yahalla! Eat, eat, habibi!"

I recently had the misfortune of spending two years in the Midwest. As cultural trending tends to incubate on the coasts and bleed slowly inward, hummus was just coming into fashion there. Same for Pabst. At the time, I was a vegan, but I was also extremely poor and had no car, so I subsisted largely on hummus. In spite of or perhaps because of my checkered past, I took to the task of seeking converts with great zeal.

I taught a roommate my family’s recipe and we’d spend hours concocting new takes: sundried tomato and feta hummus. Green-olive hummus. Extra garlic and chives hummus. Salsa hummus. Our fridge was packed with the little Tupperware containers. Everything began to smell like garlic, including the soy milk. One of the recipes even ended up in the local paper.

Not everyone was sold. I often found myself stationed by the snack table at boring Midwestern birthday parties, coaxing the faint of heart with the self-same kit of proddings that had so enraged me during my childhood:

Just try it. Have one more bite! You’ll get used to the texture.

Sometimes they did. And sometimes they merely wrinkled their noses and turned back to their platefuls of catsup and curly fries. I couldn't blame them. A little hummus is a healthy thing, life and value-affirming, even. But so, too, is the free exercise of a sturdy incertitude. I get that.

I really, really get that.

____________________________

Why I Can’t Cry at Movies

I've never been one of those people who can cry at a movie. In fact, I can count on a single hand the number of films that have reduced me to genuine, unabashed tears: "The Truman Show," "Lord of War," "Avatar" and, semi-humiliatingly, "Rudy."

It’s not that I don’t enjoy a good cry as much as the next person. In fact, I am a big crier. A HUGE crier! I cry when I’m angry, I cry when I’m overwhelmed, I cry, sometimes even, when I’m merely bored. (Not kidding.)

But the brain is a lazy and selfish mechanism, and it's as if my particular brain simply cannot abide the thought of crying for something or someone else. The pure self-indulgence of lying on my bed sniffling away over some half-tragic and decades-gone love affair strikes my brain as a totally legit use of a Saturday afternoon. But when that emotionality is turned outward, my brain protests so noisily that the moment is inevitably ruined.

And it's not like I don't try. For example, yesterday I was watching a movie about John Lennon. I knew it was going to be a tearjerker, and I was so down with that. So we get to the flashpoint moment of the whole sad story and they're running all these old film reels of John and Yoko and Sean as they're celebrating birthdays and dancing crazy and snuggling in their pajamas looking so ridiculously happy and you just know it's coming. And I felt my chest start to bubble up into my throat.

As I watched a now-old Yoko struggling to retain her composure, as the sad music kicked on, as the old home movie footage rolled unstoppable toward its too-soon ending point, a pleasantly deep chill kicked on in my chest and started its slow spread down my arms and legs. I squeezed my fingers tight around the stem of my wine glass like it was some rare and doomed flower.

And then: BOOM! No more John.

The spaces behind my eyes got hot as I watched the ever-retiring, ever-striking Yoko raise a gnarled hand to her forehead and stare blank-eyed into the heart of her greatest disappointment. And I wanted to cry, because it was a tragedy so senseless it seemed somehow preordained, like something she'd always known, he'd always known, something we'd all always known.

My face smooshed up and the tears started to drip out from the corners of my eyeballs as if they were a pair of leaky faucets.

And my brain started in with its incessant grumbling.

Brain: Oh, seriously? Now you’re going to cry?

Me: Shhhhh! I’m watching.

Brain: C’mon. Do you have any idea what your face is doing right now?

Me: I can’t help it! Look at how sad Yoko still is. They were true loves.

Brain: PFAA! You don't even KNOW them! They probably would've split up within the decade anyway. And what about ME? Try listening to MY problems for a change.

Me: SHUT THE FUCK UP! This is the BEST PART and you are RUINING IT!

Brain: I’m just saying. You look really ridiculous right now.

Me: Fine. Just forget it.

A couple of rogue tears rolled on down to my chin, now set in a square grimace. But the moment was ruined.

It's like my brain is this grumpy sentry resigned to its tireless circular patrol. Like those sad and paunchy security guards who wander around the outsides of shopping malls after twilight time: underworked, eternally ambulatory, and at the ready to quash frivolity of any, any kind, if for no other reason than to break up the endless hours of Nothing Much.

You’d think an educational documentary would be enough to keep my brain amused, but, no. I actually have enormous trouble finishing a lot of movies because my brain starts searching around for something more useful to do. Serious. Ask anyone. Watching a movie with me is rarely enjoyable, as I’m either talking on the phone, working on my computer, asking totally irrelevant questions about the characters, or sometimes all of those things at once.

The whole issue really raises the question of who is truly in the driver's seat, here. I'm seriously, seriously beginning to wonder.

____________________________

On Groundhog Day, on Groundhog Day


The occasion of Groundhog Day seemed a fortuitous moment to finally sit down and do a little write-up of an all-time favorite film, Groundhog Day. Like the best Big Lesson films, its presentation is unassuming. So unassuming, in fact, that it took cultural critics and religious zealots almost a decade to begin trumpeting its genius in numbers loud enough to be audible. Likewise, it took me a good dozen viewings over almost 20 years to arrive at the revelation that there is a veritable wealth of postmodern thought packed into this film.

Groundhog Day is invariably relegated to the romantic-comedy genre in video stores. Its movie poster features the brow-beaten visage of Bill Murray peering, bewildered, from behind the smooth glass face of an alarm clock. And it is rescreened ceaselessly on cable movie channels, often late at night.

It was precisely during one of these nocturnal screenings a few months back that the idea for this essay came to me. There in the darkened living room of my sister’s house, bathed in the ethereal, flickering glow of the television screen, my creative license bolstered outrageously by the ingestion of a few chemical products, I saw the light. This movie was pure genius! And postmodern, through and through.

My enthusiasm was dampened somewhat when I set out to do some research and discovered that a small army of bona fide philosophers and religious types had already churned out an abundance of mealy-mouthed tomes on this very subject. And, to my surprise, many argued that the film summarily rejected the central premises of postmodernism. It was an allegory, they said. A religious trope, or perhaps a modernization of Nietzsche's doctrine of eternal return. But postmodern? Hardly.

I'd like to disagree. But, first, let's work backward a minute.

Groundhog Day, for the three of you who haven't seen it, tells the tale of Phil Connors, a rumpled, road-weary meteorologist who is brought, against his will, to Punxsutawney, PA, to film a cutesy news segment. He grumbles and snarls his way through the trip, sneering at the exuberant locals and throwing a man-sized hissy fit when a blizzard traps him in Punxsutawney overnight. Then, time hiccups, and Phil awakens the next morning to find that he is trapped in the past—doomed inexplicably to relive the exact same day over and over.

Predictably, antics ensue. Phil becomes drunk with his own power and decides to use his unique knowledge of the day’s events to play out a host of dark fantasies: he hops into bed with a bevy of local gals, he steals, he lies, he uses his knowledge of the townsfolk to trick them into seeing him as a deity. When this gets dull, he grows terrified by the concept of his own immortality and attempts a multitude of suicides: he electrocutes himself, he drives off a cliff, he sets himself on fire. When these attempts fail, he resigns himself to performing good deeds and developing his latent intellectual abilities. Unknowable eternities pass in clipped montage. Eventually, Phil’s fancy alights on Rita, his producer, and the only being who seems immune to his manipulations. And, well, you can guess the ending: Phil decides to use his powers for good, not evil, he gets the girl, and the spell breaks. He is free.

What does all this have to do with an obscure critical theory, right? Well. Writ broadly, postmodernism can be understood as a genre of literature, art, architecture and general thought that rejects a fundamental premise of modernism—that the world can be explained by all-encompassing truths.

Postmodernism asserts that we are incapable of understanding the nature of reality. On its own, this is a depressing notion. But, and here's the cool part, because reality cannot truly be understood, we as individuals are freed up to draw from a variety of narratives (Christianity, Buddhism, Democracy, Science, Hollywood, Greek Mythology, etc.) as we construct our own individual pictures of what reality is. And as more and more people construct these unique pictures of what constitutes the human experience, society is destined to fragment into a noisy sea of narratives.

Further, in a postmodern world:

• Objectivity is seen as impossible to achieve

• The part is seen as more interesting and more important than the whole

• Emotion trumps reason as a guiding social and personal force for decision-making

• Linearity in all forms is questioned or outright rejected

• There exist no originals; only copies

Postmodern societies are marked by several aesthetics: pastiche, collage, juxtaposition, simulations, reflections and copies, and nonsense in any form. Las Vegas, the band Radiohead, and wax museums are a few commonly referenced expressions of postmodern thought.

If you ask me, postmodernism gets an unfair rap. It is often dismissed as a mere parlour riddle, some piece of philosophical fluff scratched out for the singular pleasure of cabbage-faced academicians and pompous cultural critics, or worse.

If postmodernism stopped with the assertion that our lives are colored by meaninglessness, if it wiped its hands and left it at that, then yes, we could interpret its assertions as essentially nihilistic or existentialist. And we could, by extension, reasonably assume that Groundhog Day is arguing noisily against the primacy of the postmodern framework of understanding, and is promoting in its stead any one of a variety of philosophies that see all things as infused with deep and tangible meaning. Because, really, take your pick: the Buddhist belief in reincarnation, the Christian premises of purgatory and redemption, the Jewish belief in salvation through good deeds. That’s the beauty of this film: they’re all in there.

But there is a bigger lesson at play, here. One that spans the longer story of our civilization’s emotional and mental development.

“What if tomorrow never comes,” Phil moans at one point in the film. “It didn’t come today!”

Keep this line in mind as you consider this: Phil’s understanding of himself, and of his nonsensical plight, evolves as the film progresses. He starts out as an insufferable prick who hates his life. When he falls down the rabbit hole, he is shepherded through a series of transformations. First, he is giddy with his newfound power. Then, he grows desperate to rid himself of it. And, eventually, he merely resigns himself to it.

Keep in mind, also, that postmodernism has never asserted that life is meaningless, and anyone who tells you it does has failed to understand it properly. Quite the opposite is true.

Postmodernism asserts that humans have failed to grasp life's true meaning, yes. And postmodernism also asserts that said true meaning is, at our current point of individual and collective development, incomprehensible. There is simply too much data in our store of collective knowledge to sort out. There are too many possible interpretations, and too many possible interpretations of those interpretations. In short, as we make life decisions, we are picking our way through a mess of code, a jumbled welter of sensory data that we lack the time or the art to detangle. We are all of us groping in the dark, every day of our lives.

But. Postmodernism rejects the notion that it or any framework of understanding can ever be the endgame of cultural, social and moral progress. Instead, it merely describes a stage of understanding humans find themselves passing through in this present moment, as they move beyond the modern.

I’d like to argue that this film uses the subtext of postmodernism to work out the potentialities tied up in the concept of infinite possibility: What would happen if we could do and have whatever we wanted, without consequence? And what if we were given an infinite amount of time with which to decide just what it was we really, really wanted? And, finally, what if we were permitted the luxury of changing our minds about what it was we really, really wanted, over and over?

In short: what would we choose for ourselves if we knew that all choices were reversible?

Groundhog Day essentially amends postmodernism by theorizing that, in the post-postmodern world, meaning will be restored. But it asks us to ponder what might happen when that curtain is yanked suddenly back. What will we find there?

A whole heap of trouble, if Phil Connors's misadventure is any indication.

When we eventually come face to face with the troubling concepts of infinite power and possibility, when we finally navigate the leap beyond the postmodern, we may well be forced to abide more power than we know how to handle. Like Phil, we'll be forced to take account of all eventualities. And we'll have to make a choice, or perhaps a thousand choices, over and over, until we've exhausted all those possibilities and settled on just one. And at that point, the postmodern experiment will be finished.

With knowledge comes power. And with power comes, well, a lot of power. Whether we asked for it or not.

Happy Groundhog Day.

________________________________

The Student of the Opposite

There is a moment in drunkenness, marked somewhere between the vague mileposts of sloppy and rip-roaring, when both time and truth come temporarily untethered from their stations. It is the first sip of a third Cuba Libre, the carafe of red wine hitting the empty stomach, the quick-spreading warmth of a molten double shot: A joyous vapor that sets the body reeling as it fast recedes from the reaches of the brain.

It is perhaps best understood as a Moment of Obsolescence, for in this brief span of time, the irrelevance of All Things crystallizes flawlessly. Suddenly, every word is the right word. Suddenly, all one's heavy unease flies away so completely and effortlessly that a person could nearly just float up off the barstool. Powerful, immortal. Clocks bang out the tremulous hour and all is right in this huge, rotting earth. The world, as it should be.

As the senses are impaired, they become receptive to subtler machinations: the tinny ring of a child’s laugh, the rash impossibility of a purple-orange twilight, the heady glow brought on by a long pull from the first cigarette of the evening. The colors shade themselves into the softest hues of rose and violet and beauty escapes from the filthy undersides of every single thing.

Drunks are intimately acquainted with this moment, but remain powerless to grasp its temporality. Teetotalers who care not to see the world tilt on such an ecstatic and unlikely axis call it a pathetic hallucination, a parlour riddle that indicates nothing. Less than nothing, even.

Me, I find it fascinating that a feeling so mercifully life-affirming can also get away with being so cruelly brief. What is the meaning of such a lark? What is being communicated to us by the universe through this ostentatious and short-lived display of obsolescence? We are afforded a glimpse into a world where everything that matters too much suddenly matters not at all, and then the curtain is snatched back down and we’re staring bleary-eyed into the indifferent heart of Things As They’ve Always Been, seltzer water in hand.

I've had ample time to consider the implications: I'm only 28, but already, my drunken times have been legion. Irish blood allows me to consume copious amounts of hard liquor without becoming ill, while a penchant for self-indulgence and a low-grade inadequacy complex conspire to verily guarantee that I will be drunk in the company of others more often than I am not.

A particular Moment of Obsolescence I feel compelled to share: I was 21, and at the butt-end of a long, difficult semester studying French in Paris. In a quintessentially Parisian gesture, my teachers, classmates and I celebrated the fin du semestre by laying out a lunchtime repast of cheese and bread and pastry across our abandoned desks and cracking open a long row of wine bottles.

As the party kicked into gear, I stationed myself near the wine. I'd been cast out from the group for heckling their shopping trips and expatriate café klatches, insisting instead upon Heavy Drinking and Meeting Boys as essential extra-curriculars. I'd sassed my teachers. I'd skipped classes to explore monoliths perched at the edges of German hamlets and to embark on three-day clubbing benders in London. But the ends of things tend to soften our assessments of What Is, and as the party progressed, I began to feel somewhat shamed by the memory of my behavior during the five months previous.

And so I drank. Nervous and giddy, I drank with a bold, exhausted impunity. Blame it on the nosedive my personal life had taken (Life lesson: don’t fall in love with your host brother) or perhaps on the months of rash indiscipline that lay both behind and before me, but as the wine took hold, I got loud and joyous.

I toasted my classmates, chiding them for boringly refusing to drink with me at 12:30 on a Thursday. I toasted my professors by apologizing for being "The Student From Hell," only I jumbled the words and instead called myself "The Student of the Opposite," to the utter confusion of everyone around me. I even sidled up to the program director, Ed, and asked after his kids, although it was clear to both of us by this late hour that the deep well of disgust and disdain bubbling inside of each one of us when we came face to face was entirely mutual.

The rest of the party is a blur in my memory. The next moment I recall clearly is me sitting on a park bench on the Boulevard Saint Michel, just outside the école, digging into a fresh pack of Gauloises. It was close to 2 p.m. Rivers of cheap red wine were flowing through my arms and legs. My blood felt purple and entirely shot through with the stuff. The world read blurry and narcotic as I lit my first cigarette. I thought for a moment of the very first time I'd sat on my smoking bench. It had been the summer previous, on the first day of my arrival in Paris, in fact. I'd flitted down to the street in a long brown skirt and sandals, so full of hope and fear. Now I was bundled in a ski jacket. My hair was ratted into dreadlocks. But I was still Me, wasted on this park bench in the Latin Quarter puffing a cigarette, a ridiculous kid far from home with too much cash to burn and too little sense to scare herself sober. I felt such perfect freedom.

Within an hour, the world would go back to its aching. My eyeballs would swell and my throat would shrivel in a thirst so interminable only three cans of Orangina could sate it. My host brother would still spend dinner tracing his finger along the pattern of his monogrammed napkin instead of meeting my eyes.

The litany of cigarettes would ooze formaldehyde into my blood until the stuff was coursing through me, a yellowy elixir that spoke of everything wrong just as loudly and unmistakably as the red wine rivers spoke to me of everything right. But I wasn't thinking of that there, then. And I'm not thinking of it here, now. In fact, I don't recollect a second of it.

It's a funny thing. I can always remember with astonishing clarity the way it felt to be me, drunk, in some particular Moment of Obsolescence, all the sensory bits and the thoughts that went spinning through my body, the way the world pitched and rolled in pure joy at my feet. And yet I can never remember the way it felt to be me, atrociously hung over, in the moment that came inevitably after. I think that's telling.

On second thought, “The Student of the Opposite” has a nice ring to it. In fact, it sounds just about dead on.

________________________________

After the flood: social connectivity in the credulous age

Last September, I attended my 10-year high school reunion. I’d felt a powerful and unexpected surge of icy dread in the weeks leading up to that inveterate ritual of passage into mid-life. The whole thing seemed so painfully overblown, so likely to disappoint.

I was secretly desperate to skip it entirely, but a handful of best friends propped up my courage with a few strong drinks. Arm in arm, we strolled giddily into a flashy and very grown-up-looking event hall at the designated hour. We donned nametags boasting our senior pictures and we screwed up our collective courage and we headed for the bar.

And you know what? It was actually fun. We all stood around a crudités table, drinking whiskey-cokes and giggling into our shirtsleeves as we compared the battle scars of our first decade of adulthood: dog tags from tours of duty in Iraq, ill-advised yin-yang tattoos, tarnished wedding bands. Out in a courtyard attached to the building, the guys shed their jackets and passed around cigarettes, while the girls took off their shoes and danced with each other.

Someone lit a joint. And as I gazed around me, I was suddenly struck by an overpowering sense of semantic disorientation.

Was it the monumental ordinariness of the event? Sure, we were the lot of us only slightly fatter and (sometimes) wealthier versions of our eighteen-year-old selves. And, yeah, it felt like playing at grown-up, somehow. But that wasn't the thing that so shook me.

It wasn't the company, either; people were clever and engaging and boisterously enthusiastic about re-encountering each other.

It was only that I found myself rendered speechless by the dawning of an odd and ignoble truth: I hadn't spoken to the bulk of these people in 10 years, but I knew them intimately: their jobs, their political inclinations, their relationship statuses. We were Facebook buddies, after all.

Knowing that, and, further, knowing that they knew it, too, made it feel unbearably disingenuous to flash a toothy grin and toss out some overfed line:

“So. What have you been up to?” (Because I already know about your frat-tastic college boyfriend, your honeymoon in Vietnam and your quick-and-dirty divorce.)

“Oh! You have kids?” (Because I have already noted the color of their eyes and how they celebrated their first birthday, which seems completely normal when I’m sitting at home on the Internet, but somehow feels creepy to admit to your face.)

What could I possibly say?

*    *    *

In the heady early days of Myspace and Facebook, everything was different. Nobody had 800 friends. Potential employers didn’t dress you up or down based on your virtual comportment. Instead, logging in was really just like being transported to a seedy, raucous bar containing 100 of your closest acquaintances. People cursed and spat, freely posting photo evidence of their indecorous weekend exploits. They bitched about their jobs with total impunity. And they needled each other endlessly without care for who might see. Because, way back in the dark ages of 2006, the answer was almost nobody. At least, nobody you cared about caring.

After that, the flood. Parents joined. Teachers joined. Your entire fourth-grade class joined. And they all found you and wanted to know how the heck you’d been.

During grad school, a favorite professor who’d recently set up a Facebook account described the experience this way: “It’s like going to heaven—everybody’s there!”

He's right, you know. Suddenly, Everybody was there. And this raised some problematic and very interesting implications. It became impolite to tell anyone "No" when they frisbeed a Friend Request your way. Even if you never talked during whatever brief period your non-virtual lives intersected. Even if you found them sort of grating. That virtual bar suddenly began to feel a bit stuffy and overcrowded. The bouncer had been trampled into a heap at the front door. And things were about to get much, much louder.

*    *    *

As a kid, or even as a departing high school senior, the concept of a direct line to every single person I’d ever met was inconceivable. Goodbye was goodbye. We donned our red gowns, hugged each other hard, and got on with the general business of living. We pursued relationships and college degrees, we had adventures or retired to the leafy suburbs, and we bookshelved that era of our lives. We banked on staying close to our best friends, sure, but we were happy to settle for infrequent and short-lived run-ins with everybody else. The world was as it should be.

Cut to the high school reunion. Ten long years later, all of us gritting teeth beneath a vapor of pot smoke and half-conceded commonality.

And (!) the lot of us practically neighbors. We’d shacked up in a noisy and querulous virtual neighborhood whose fences never seemed quite high enough to contain the secret messiness of our lives, a fact that both fascinated us and filled us with mutual terror.

Cause nowadays, gossip travels at the speed of sound. And the ends and beginnings of our relationships are things of public record.

Dana is engaged.

Erin is now single.

There were simply just… no words.

We are, in short, a generation incapable of surprise. But social decorum hasn't caught up with the velocity at which we now access information about each other, and so it asks us to feign surprise nevertheless. We still feel somehow obliged to interact with each other in the old-fangled ways, to perpetuate the ruse.

In the absence of precedent, I’ve had only my gut and my guilt to guide me. And so far, those two have proven rather questionable barometers when it comes to navigating the tricky, argumentative terrain of social-networking.

Exhibit A: My Friend Request folder is littered with a mess of potential "friends" that I don't want, but feel, for various reasons, too guilty to turn down: acquaintances of my parents, elderly relatives, grade school classmates who'd slighted me in some way or another. But saying "No" feels too direct, too harsh. So I just never look in the folder.

Exhibit B: I have an ex-boyfriend. His profile is private. His new girlfriend’s profile, however, is not. She appears to be unemployed, and she is what social-media researchers would call a “high-volume user.” And she is, to put it charitably, not exactly an intellectual. I know that cruising her wall in search of inanity and spelling mistakes is inappropriate, and pointless, and slightly pathetic. But I can’t resist! Somehow, I crave the affirmation.

Which raises the question of whether the aforementioned direct line will ever be used for good and not for evil. I know where I'm placing my bets.

I have a (real-life) friend who regularly creates and deletes social-networking profiles. The endeavors always start out well: she uploads a cute, quirky picture, she sends out requests, she updates her status. But always, within a matter of days, she finds herself stalking the pages of old rivals, of new love interests, of anyone, really, that she wouldn't be granted intimate access to in the Real World. Things devolve predictably from there. She feels crushed and furious when friend requests are denied. In fits of pique, she compulsively deletes people. And eventually, she trashes her profile altogether.

“I get too freaked out,” she told me after a recent deletion. “I just can’t handle it. I feel like these people aren’t really my friends, even the ones that really are my friends.”

And that word… Anymore, "friend" comes prepackaged with a sizeable cache of contradictory meanings. It's even a verb, now. So what does it mean when Friendship is no longer a passive, gradual endeavor, a seed you water and watch grow? What does it mean when Friending becomes, instead, something you do to someone else? Like in the same way you would hit someone or hug someone or hang someone, you friend them. It's active and direct, but the weird part is that it often signifies little more than passive curiosity.

Further, what does it mean when who you aren’t friends with begins to explain more about you than who you are friends with? My whole life, I’ve chosen friends on the basis of shared interests. Now I need a reason not to be someone’s friend. This makes it hard to tell anybody apart, to pick out who my real tribe is among the mess of People-I-Only-Kind-Of-Know-Or-Care-About.

Navigating the shift from late adolescence to the grown-up world has never been easy. But the Facebook Era has made early adulthood feel uncomfortably like some sort of virtual tribunal. Everybody's judging. Or, at least, everybody's watching.

And in the semantic bubble of my high school reunion, I finally figured that out.

We’re all of us up for execution every single day of the rest of our lives.

Of course, there is a way out. It involves deleting your profile and resigning yourself to the cold comfort of the analogue world, where little has probably changed, where your friends are the only ones within shouting distance, where your secrets are safe.

But making a break for it also requires that you resign from your station on the firing squad.

Be honest. Do you really want that?

________________________________

Undescribable unthings: Franz Kafka and Tiger Woods get a dressing down

“Leopards break into the temple and drink to the dregs what is in the sacrificial pitchers. This is repeated over and over again; finally it can be calculated in advance, and it becomes a part of the ceremony.” – Franz Kafka

This quote has been bouncing around my head ever since the Tiger Woods scandal reached its thundering crescendo a few weeks back. At first, though, I wasn't sure why this mini-parable — coming as it does from such a different age — so resonated with the way I was beginning to understand the flashpoint event that sealed the demise of a longtime American hero.

Let’s start with the general. Kafka’s parable seems to offer commentary on the way Men deal with Dissent and Difference. We’re groping at two key truths about humans, here: one, that we are inclined to favor the predictable, the orderly. Two, that we place our bets, consistently, on unstable entities.

And, Kafka suggests, this fundamental paradox leads us to act in some rather odd ways, to create all sorts of bizarre archetypes.

It doesn't take a sour-faced academician to pin down Tiger Woods as a Fallen Hero. In fact, it's so obvious that it's been vastly underanalyzed, if you ask me. Let's review for a second.

If human history can tell us anything, it's that those we idolize and deify are bound to disappoint us. It is a tale as old as time itself: we hold humans imbued with certain Magical Qualities up for deification. We give them power. Lots of power. More, it seems, than they often have the maturity to handle. And then we wait for them to betray us.

That waiting is a way of neutralizing unorthodoxy, I'd wager, by integrating threats to our social structures into the fabric of traditions. But it seems sort of unfair, because, working forward from this assertion, Tiger Woods begins to read less as a glad-handing slimeball and more as a victim of an oblique but inveterate social ritual of deification and divestment.

Here, the ritual entails three principal transformations: first, the transformation from human to superhuman (See: Tiger Woods Playing Golf at Age 2). Second, the ceremonial divesting (See: Tiger Woods Apology Speech). Last, the symbolic restoration (On the way, in short order, I'd guess: in his apology, Woods cryptically promises a return to golf "one day," then slightly less cryptically adds that it may well be "this year.")

At first blush, Tiger Woods and Franz Kafka couldn't seem more different: One is a contemporary sports star who'd notched billions in endorsements before a series of figurative and literal car wrecks sent him tumbling into disrepute. The other was an early-twentieth-century writer and European Jew who died of tuberculosis young, virtually unknown, and begging his friends to burn everything he'd ever authored.

One would live to see a man of similarly diverse ethnic background assume his country's highest office; the other would die before all three of his sisters perished in World War II concentration camps.

But. Using the Aryan nightmare of the Holocaust as a dividing line from which racial hatred and racial obsolescence snaked outward in opposite directions, one man, Kafka, can be seen as pre-racial, and the other, Woods, as post-racial. And I’d argue that the differences between the two men belie an equally broad swath of similarities, as it is with two sides of a coin, as it is with so many other things.

Let’s consider a few examples:

Both were caught in culturo-ethnic netherworlds: Kafka was born in the Prague of the late 1800s, then an enclave of the Austro-Hungarian Empire. His upbringing was wrought in a mess of tongues: Yiddish at home, Czech in polite society, German for all matters bureaucratic. And he was Jewish in a time when Judaism was associated with filthiness, weakness, when centuries of anti-Semitic thought were heating to a rolling boil.

Woods, too, is something of a neither-nor: one-quarter each of Chinese, Thai and African-American, plus one-eighth each of Native American and Dutch. So variegated is Woods’ ethnic makeup that he coined his own word for it: “Cablinasian,” a synthesis of Caucasian, Black, Indian and Asian.

(At the risk of mixing metaphors, it is also interesting to note that the protagonist of Kafka's most famous work, The Metamorphosis, collapses under the weight of a jumbled identity; traveling salesman Gregor Samsa awakens one morning to discover that he has been transformed from a human into an "ungeheueres Ungeziefer," which translates roughly as an "undescribable unthing." It is usually vernacularized into "insect" in English translations of the novel, but this isn't quite what Kafka meant.)

Both had daddy issues: Kafka never married, fearing above all else the potentiality of becoming his father, a harsh, exacting man who'd pulled himself up by his bootstraps and sheer stubbornness into the ranks of respectable society. He expected his children to follow suit without argument, but the younger Kafka had other ideas, and his works consistently portray authority figures as cold, illogical and unreasonable.

Papa Woods, too, was larger-than-life. He was driven. He was exacting. He had big plans for his offspring. And, like his son after him, he was unfaithful to his wife, a revelation which is said to have horrified the adolescent Woods, who had idolized his father.

Both had weird sexual hangups: Kafka pursued parallel sexual lives: he had a documented taste for whores, but consistently maintained relationships with respectable women. Likewise, The Metamorphosis is often read as an allegory about being torn between the powerful social urge to please those around us and the equally powerful primitive urge to satisfy our own, darker desires.

Woods married a nice girl and started a family. But he also cultivated a (purportedly insatiable) taste for women of the evening; two porn stars and a Playboy model are named among the running tally of his conquests, and some even claim he also pursued sexual relationships with other men.

Both put on extensive airs: Kafka maintained a demeanor of charm and cultivation that made him popular with everyone, especially women. Deep down, though, he was irrationally terrified that he'd be unmasked for what he truly saw himself as: a filthy, sickly Jew.

Woods was famous for his reticence and decorum when it came to interacting with the public: he wore "power colors"; his answers during press conferences were clipped and consistently glib; his handlers rarely allowed him to interact casually with fans or other golfers.

*    *    *

Similarities aside, the fates of the two men also read as the flipsides of a single coin: one did not see success in his life, the other did—seemingly more than he could easily abide. One was terrified of rejection and failure, the other came to believe he was immune to such things. And both battled mightily against the revelation of an inner ugliness, of the soul's darkness bleeding outward into physical form: early on, Kafka had peered into his own heart and hated what he found there. Likewise, we peered into Tiger Woods' heart and recoiled in disgust.

One man died before he had the chance to join a million of his brethren in martyrdom; the other seemed immortal, and thus required our assistance in his own destruction.

Before his death, Kafka penned a lengthy and damning missive to his father. In the last paragraph of this letter, he wrote:

“Naturally things cannot in reality fit together the way the evidence does in my letter; life is more than a Chinese puzzle. But with the correction made by this rejoinder—a correction I neither can nor will elaborate in detail—in my opinion something has been achieved which so closely approximates the truth that it might reassure us both a little and make our living and our dying easier.”

If you really stop and think about it, the concept of any confession being an approximation of the truth, and not the truth itself, is sort of mind-blowing. Indeed, in the end, Kafka and Woods were each forced to concede that the truth is never simple, no matter how smart you were, no matter who your dad said you were—or weren't.

In the end, it's a coin toss: some people become martyrs, others become heroes. But either way, there's a penance to be paid. It's history's vow made good. It's the critical and exacting eye turned selfward, then outward, then selfward again, over and over. It sees only weakness. And it demands in recompense nothing short of total destruction.

________________________________

How I Learned 견디다

We dissect nature along lines laid down by our native language. The categories and types that we isolate from the world of phenomena we do not find there because they stare every observer in the face; on the contrary, the world is presented in a kaleidoscopic flux of impressions which has to be organized by our minds — and this means largely by the linguistic systems of our minds. We cut nature up, organize it into concepts, and ascribe significances as we do, largely because we are parties to an agreement to organize it in this way — an agreement that holds throughout our speech community and is codified in the patterns of our language […] all observers are not led by the same physical evidence to the same picture of the universe, unless their linguistic backgrounds are similar, or can in some way be calibrated.
-Benjamin Whorf

견디다.
It was a word I’d come to abhor intensely before the end of my 16 months teaching English in South Korea. Gyon-di-da. I’ve never known the precise translation, but it corresponds roughly to the English verb “endure,” and seems to suggest perseverance in the face of suffering, approaching life’s twists and turns with a martyr’s grace.

Working where I worked demanded gyon-di-da in spades. I quickly discovered that I had none. In spite of the fact that they were running 10 after-school English programs, neither of my bosses spoke English. Mr. Kim and Mr. Khang were also bankrupt and laboring under massive gambling debts. There was never enough money, enough supplies. Things went askew, constantly. I’d lose patience, and inevitably, someone would try to placate me with that word. The boss has decided that you will now be working on national holidays for no extra pay? Gyon-di-da, came the soothing refrain. A child has vomited in your classroom and the janitor refuses to clean it because you “don’t count” as part of the elementary school? Gyon-di-da. You’ve been standing in the detergent aisle for 10 minutes and still can’t figure out which one is the laundry soap? Gyon-di-da.

I quickly came to hate the sound of the word, and all I understood it to symbolize. In some ways, I suppose I blamed everything that didn’t make sense about my new life, all the little failures and miniscule tragedies, on it. I couldn’t tell a taxi driver my own address. I couldn’t control my first graders. I couldn’t find any shoes that fit. I felt muted, tired.

I'd never realized how full of pleasantly mindless chatter life is, how, "Mostly, we just talk" (Johnson, 112). We chirp and twitter incessantly at each other, passing on and picking up vital information, aware of the fact that "There are more things to be spoken of than there are words with which to speak them" (Johnson, 115), but determined nevertheless to have our say. Language gives us comfort. It lets us learn from each other, allows us to feel important, and heard.

We chart our territories in imperfect ways, yes, but when you venture outside of your linguistic bubble, when you fall off the map entirely and all your pretty talk becomes meaningless, when the comfort of mindless chatter disappears, how can you possibly be expected to endure?

The Sapir-Whorf hypothesis suggests that language shapes culture, that we are only capable of understanding those things for which we have words. In Korea, I came to see reality as a social construction, a club you get to join only when you consent to its collective agreement to call this sort-of-ugly bunch of leaves and blooms a weed, and this much prettier one a rose. It exists because we've imagined it into being. We construct it, and reconstruct it, daily: "We do not merely discover facts; in some degree we fashion them" (Johnson, 96). It is necessarily exclusive, and tailored to the needs of its creators. You have to take reality on its own terms. And you'll never get past the bouncer until you learn what in the heck those terms even are.

Me, I got to Korea, and I had no words any more. Or perhaps I had far too many, and none of the right ones. I was lost amid an alphabet soup of symbols and sounds that had no correspondents in my personal reality. Streets lined with twisting mazes of fluorescent gibberish, unintelligible newspapers, impatient shouts of Odi-ga on the bus, all of it was nonsensical and hard. Coherence left me, and with it went my identity. I thought constantly of the way my mother used to coax me from my shyness by cooing, "Use your words." But my words were getting stuck, reversing themselves. I was choking, and all they could say was gyon-di-da. I didn't want to gyon-di-da! I wanted to scream. I felt muted, powerless. There was no back door into understanding, and so I'd stand around instead like a big, dumb beast, waiting for meaning to reveal itself in some lesser way, cause, really, what else could I possibly do?

And the strangest part was, it was language that had brought me there in the first place. In Korea, as in so many parts of the earth, the ability to speak English is a source of great power and prestige. By being born in the United States, I’d acquired, without effort, a valuable blueprint for modernity, for success. Each day, I stood in front of rows and rows of uniformed schoolchildren and tossed out my precious words. Get, got, gotten. I watched my students go scrambling after them, snatching them up. Myself, yourself, herself, himself. They spilled out from inside of me, like little gold coins rattling across the floor, rolling into the corners. From the doorway, the mothers clapped or frowned, and life moved forward.

Months passed, filled with mishaps and small victories, my fumbling attempts to master Hangul. I bought a poster of the Korean alphabet and worked on the sounds with a clumsy tongue, scratched out the symbols onto pieces of paper, over and over. One day, a Korean friend looked over my work and commented that I wrote like a six-year-old. I was flattered enormously. I’d become intelligible. Maybe only just barely, but still.

And then, too quick, it was all over, time to go back home. My last night in Korea, I wandered alone down my street and thought of that first, strange night when my boss had picked me up at the airport and dropped me off there, at my tiny little apartment. Jet-lagged and in a vague state of shock, I’d snuck out for a cigarette. My heart racing, I’d ventured down the alley behind my building, staring up at the rows of glittering signs, mapping out a fluorescent trail of breadcrumbs in my head. Straight past the pile of garlic bulbs, left at the neon chicken. I made it 300 feet, and then scampered back to my apartment in fear. I was afraid to go any farther. I had no coordinates to map, no map at all, in fact, and was forced to rely solely on fleeting visual referents. If I lost my way, how could I ever get back?

By that final night, I couldn't even count the hundreds of times I'd trudged up and down my street. I could read every sign, knew where to find chicken, or beer, or ginseng tea. And it seemed almost inconceivable to me by then that I'd worried so desperately about finding my way back that first night, to the place where I'd started. It had taken me 16 months to realize that you can't go back, ever.

“Use your words.” It seemed like such a simple idea. From my time in Korea, I learned that sometimes there are no words, no trail of breadcrumbs to lead you back to a meaning you can rest your head on. Truth is, sometimes you just have to keep moving through the void, keep waiting for the next corner of the puzzle to reveal itself to you. Because it will, most of the time, if you can stand the wait.

Using language is a graceless dance, equal parts waltz and stumble, and “under the spigots of cultural evolution the old categories bulge, spill over, and collapse” (Johnson, 119). But living as we do in an “infinite-valued reality” (Johnson, 117), we must learn nevertheless to position ourselves within the overlap, shared language or none. We must have grace with each other, and ourselves. We must be patient.

견디다. We must endure.

________________________________

The Unfinished Project of Aggregated Intelligence

"The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age."

-H.P. Lovecraft, "The Call of Cthulhu"

The systems which drive and define humans as a species have been engaged since time immemorial in a process of adaptation and increasing sophistication known in its sum as evolution. Moving either backward or forward from the individual as a single unit, one marvels at the complexities of our organic systems. At the cellular level, an intricate ecology of networks comprises the human body. An old Buddhist koan reads, “In every speck of dust are Buddhas without number,” and indeed, the blueprints mapped onto each of our cells contain universes within universes.

Likewise, at the collective level, humans have learned to organize themselves into efficient and complex systems of "social cohesion" (Hall, 30). We call them cultures, and although they mimic the growth and complexification of the biological systems, they differ in one crucial aspect: while our organic systems favor variety, our cultural systems favor homogeneity. They function as large systematic units of aggregated intelligence aimed at pooling physical and mental resources, and they are inefficient unless they succeed in expressing and enforcing norms. Such uniformity is achieved easily enough within a single system, but humans have struggled mightily to achieve, or reconcile, any sort of inter-cultural homogeneity.

The aggregation and correlation of knowledge, which is informed and reinforced by an awareness of passing time, is the conspicuous earmark of modern civilization. As the biological components of organisms adapt and change over passing time, so do cultural systems evolve. This evolution has sped up exponentially during the last million years. Perhaps one of the most fascinating manifestations of our increasing sophistication has been the emergence of complex language systems, which began as systems of verbal communication and have developed into elaborate systems of gesture, speech and writing which “arose at the same time as tool-making, sometime between 500,000 and 2,000,000 years ago” (Hall, 56).

The catalytic pairing of tools and words has “made possible the storing of knowledge” (Hall, 57), and has nurtured the aggregation of a sizeable collective knowledge, especially during the last 10,000 years. Armed with the intelligence to “symbolically store their learning against future needs” (46), humans have become obsessed with creating a record which can be accessed and utilized by future generations.

But why the obsession with manipulating future environments? It seems that our increasing sophistication has forced upon us some troubling revelations. The construct of rational time forces each human being to concede to mortality on the individual and species levels. Intelligence demands an acknowledgement of the chronology of existence, and if time is understood as a commodity of limited proportions, the recognition of mortality becomes inevitable. In short, intelligence led us to the construction of time, and time and death are directly linked.

Perhaps this revelation is what drove and continues to drive us to make a record, to carve our names into rocks, and to copy the words of great prophets into precious holy books, and to build amphitheaters that will stand long enough to contain our great grandchildren’s great grandchildren’s great grandchildren. We seem obsessed with the question of what we will leave behind when we go, both as individuals and as a species. Humans have made immense efforts to scratch out a lasting record, something more true and definite than a yellowed pile of bones and teeth gathering dust in a cave. Something that will survive us, something we can leave behind that will say, definitively, we were.

Armed finally with intellectual complexity and fast heading in the direction of a universally intelligible linguistic and numerical system with which to construct and correlate this record, the various tribes of humanity now face the Herculean task of a larger aggregation of understanding. Our cultural systems spent the bulk of their history evolving in isolation, separated by mountains and oceans that were, for most of human history, impassable, uncrossable, impenetrable. Perhaps the most astounding consequence of our advanced ability to manipulate our environments has been the uneasy joining together of these discrete cultures, for humankind’s highly developed capacity for self-awareness nurtured an overwhelming curiosity to know what lay on the other side of the far mountains, beyond the edge of the sea. When these disparate cultures met, lines of evolution came face to face, and their separate systems of intelligence could be pooled together.

The best consequence of these meetings has been a more coherent and complete understanding of how our environment functions, a wide-scale pooling of physical and intellectual resources that allows for ever-greater efficiency. Russians may now eat pineapples in the dead of January. Americans can power their big, noisy automobiles with oily black goo from the hot high deserts of a land most of them will never even see. Mathematical equations scratched out on tiny slips of paper in a laboratory in Delhi can be passed on instantaneously to scientists in Sweden, where those intelligible marks might be turned into a Tupperware container, or an atomic bomb, or a cure for cancer.

However, there has been scant agreement as to how we as humans should forge that record, and these sometimes graceless meetings between tribes have often sparked in us an impulse to recoil at the “shock of contrast and difference” (Hall, 30). When human cultures meet, it seems, the best and worst instincts of our species engage.

But why is it so hard? Why are human cultures so threatened by the appearance of opposing truth systems? Hall explains that our cultural systems guide our behavior and shape our values in ways that we often don’t understand: “Culture controls behavior in deep and persisting ways, many of which are outside of awareness and therefore beyond conscious control of the individual” (Hall, 25). Cultures can be understood as providing human tribes with coherence and security, and when a culture is faced with a competing or irreconcilable version of the truth, it is often interpreted as a threat to that culture’s understanding of how it came to be, and why.

Such dissonance was once easier to ignore, but our shrinking world demands more and more insistently that we acknowledge, and react to, that which is foreign to us, that we find a way to place it within our own cultural contexts. Sometimes an opposing truth claim can be integrated, or absorbed, or tolerated, and sometimes it is seen as simply too dangerous, and then it must be destroyed, symbolically or physically. For the non-dominant cultural system, these deaths can be enormously painful to abide. Those who cried when the library at Alexandria burned were mourning far more than the sooty bits of papyrus floating in the air like gray snowflakes. They mourned the death of history, the destruction of a precious record of who they were.

All of this compulsive and sometimes violent aggregation is, at core, an expression of humankind’s deep longing for perspective. Sadly, for all of our extensional wizardry, we are at the end of the day forced to concede the disarming fact that we still have absolutely no idea what it all means. We have a cursory understanding of how our species came to be, but the question of why is as yet unanswerable, and immeasurably troubling. We long for true perspective and we cannot, for all of our compulsive aggregating and correlating, correlate our way to a deeper understanding of who we are. In our quest to alleviate this uncertainty, we’ve developed within our cultural systems elaborate religious constructs and political ideologies; we’ve funneled our money into government-funded space exploration programs. We battle and rage over whose truth is the truest. Then the winners colonize the losers, who subsequently drink themselves to death on parcels of crappy, unarable land until they die out or are simply absorbed into the dominant cultural system. The books burn, and the temples collapse, and the process of aggregation becomes a process of elimination.

Humans seem to lack the perspective to approach aggregation in a more humane way, but perhaps there is hope. Piers Sellers, a Space Shuttle astronaut who has been to outer space twice, called the experience of viewing the earth from on high “mind shattering,” an experience that overloaded his senses and completely altered his understanding of himself, and the planet he was born on. In a sense, what he achieved was that fabled, long-sought “perspective from a hill.” Likewise, many credit the visual revelations brought back from the Apollo 8 mission in the 1960s with sparking the global conservation movement. We’d invented microscopes with which to view the minuscule complexity of our organic components, we’d clawed and scratched our way to the top of Mount Everest and to the salty, pressurized depths of the Mariana Trench, but there had before that moment been no hill high enough from which to gain a true perspective of our collective makeup, no mirror which could reveal to us a true reflection of who, and what, we were in the process of becoming. The picture was astonishing.

We go deep and we go high, and we return with snapshots of the things we found, precious bits of knowledge to share and aggregate. The questions of how this knowledge will be divided and with whom it will be shared, whom it belongs to, cause us enormous unrest. And yet the impulse to set out across the dark ocean, to meet up face to face with those other lines of evolution, to find out what they know and tell them what we know, is irresistible, and now, unavoidable.

The Bible tells the story of a time when humans shared a single culture and language. After the Great Flood had washed away the tainted beginnings of civilization, the remaining humans congregated in the city of Babel, where they lived as a united people with a shared tongue. The Babylonians longed to create something larger and lasting, longed to know God more directly. “Let us make a name for ourselves,” they told each other, “lest we be scattered abroad upon the face of the whole earth.” And so they set about building a great tower, whose top, they hoped, might graze the heavens. But when God spied the mighty obelisk reaching up into the clouds, he grew angry. He destroyed their tower and scattered the citizens of Babel across the earth, confusing their languages and condemning them to live apart, disunited, without the benefit of a collective intelligence or the drive of a collective longing.

According to the big bang theory, the universe began as a smoldering, densely packed dot, infinitesimally smaller than the period at the end of this sentence. Then, some unnamable force set the boundaries of that dot expanding, and expanding, and expanding, and all of that compressed matter scattered with it in a process of galactic diffusion that continues to this day. Some scientists theorize that this process of dispersal will continue indefinitely, until the components of our universe are hopelessly diluted, all the little bits drifting off into obscurity, becoming unrecognizable, too big even to retain a notable shape.

Either way you slice it, the history of humankind, of cultural systems, is writ as a protracted struggle of aggregation and correlation, a frenzied and sometimes violent attempt to get to the truth and to wrest it into something lasting, to trace the pattern backward to its source, and forward to its ultimate completion, to reach a hill high enough that it might grant us some kind of meaningful perspective, from where all of what we are will lie spread out before us like some giant patchwork, from where the true shape of things might emerge, finally. We have not yet succeeded.

______________________________________

Toward a Narrative Ethics of Journalism

For decades, the relationship between news gathering and news consumption has been described in metaphors of direct transmission, in terms of acts done by those who create news to those who use news. As with an injection, journalists distill the influx of daily information into a neat, concentrated dose and shoot it into the bulging vein of a passive populace. As with a bank transaction, journalists gather up the golden coins of meaning and deposit them into the heads of the masses. Plink. Everyone’s wiser, happier.

However, such metaphors belie the reality of reporting within and about a complex world. They fail to see news transmission as an active rather than passive endeavor, one that is informed perhaps as much by recipients as by deliverers. They fail to take into account the complexity and imprecision of the language acts that shape journalistic products. And, worst of all, they fail to empower news consumers.

Certainly, the ability to construct and decipher complex spoken or written codes is what sets Homo sapiens sapiens—the double-knowing human—apart from the rest of the beasts. However, problems arise when those codes are seen as precision systems, when words spoken about a thing are mistaken for the thing itself, for the map is not and can never be the territory it represents (Korzybski, 1933, p. 747). This is even truer in contemporary society, which has grown increasingly “super charged with information” (Christians et al., 1993, p. 122) and “fragmented by linguistic games” (p. 122). To see journalism produced in such an environment as simply “the deposit of truth into the minds of listeners by learned authorities” (p. 104), or to see journalists laboring in such an environment as mere conduits through which transmission occurs is to grossly oversimplify the situation, and to risk seeing news consumers as passive data receptacles.

The post-postmodern age calls for a more nuanced, more empowering and more reciprocal approach to doing journalism, one that acknowledges the complexities and limitations of language-as-a-conduit and works to cultivate conversation, to grow knowledge. This can be accomplished by pushing an agenda of narrative ethics. Such an agenda eschews metaphors of direct transmission and focuses instead on journalism as an interactive endeavor. And as it’s “less moved to persuade than to understand” (Christians et al., 1993, p. 104), it aims to create dialogue, not transmit monologue.

An emphasis on narrative ethics in journalism empowers the public in several crucial ways. First, as it acknowledges complexity, it provides a useful framework for understanding a dynamic reality. Second, as it invites conversation, it promotes higher levels of mass critical consciousness. Third, as it values input, it encourages communities to develop critical languages with which to interpret their worlds.

If narratives can be understood as “linguistic forms through which we think, argue, persuade, display convictions, and establish our identity” (Christians et al., 1993, p. 114), then narrative ethics, by extension, is concerned with developing codes that move those linguistic forms forward. It demands that journalists ask the question, “‘[w]hat norms or institutions would the members of an ideal or real communication community agree to as representing their common interests after engaging in a special kind of argumentation or conversation?’” (p. 116). Implied here is the crucial importance of argument and dialogue. Assumed here is the fact that such conversations don’t happen nearly as often as they should.

Such a view aligns itself with the broader theory of communitarian journalism, a model which sees news as “a narrative construct that enable[s] cultural beings to fulfill their civic tasks” (Christians et al., 1993, p. 14). Communitarian journalism emphasizes “the dialogic self, community commitment, civic transformation, and mutuality in organizational culture” (p. 13) and expects journalists to create news that encourages such world views and catalyzes such events. The emphasis here is on the proper functioning of the whole and not the discrete operations of the individuals within the whole, and on the view of that whole as an organic, shifting entity that must be re-correlated and re-understood constantly, by journalists and by the community at large.

By acknowledging dynamism at the individual and societal levels, narrative ethics provides a useful framework for interpreting events. It takes conversation, not consensus, as its goal, and embraces the metaphor of journalist-as-translator over the antiquated metaphor of journalist-as-transmitter, out of the belief that, inevitably, “in composing their reading of events, the news media selectively form a coherent translation” (Christians et al., 1993, p. 120). At core, narrative ethics recognizes that knowledge takes on the shape of the conduit it passes through, much as water fills the contours of the vessel it is poured into. The contents may retain their essential elements, but they are bound by external forces to recorrelate those elements endlessly.

Certainly, acknowledging such complexity piles a heavier burden of responsibility onto the journalist, for when he begins to see communities as “woven together by narratives that mediate their common understanding of good and evil” (Christians et al., 1993, p. 115), his task of describing and interpreting the world takes on great significance. Thus, “communitarian ethics challenges those storytellers whom we call reporters to specialize in truthful narratives about justice, covenant, and empowerment” (p. 116). A complex world demands explanation, for only “recomposing events into narrative ensures that humans can comprehend a reality at all” (p. 117).

However, the task of meaning-making isn’t left to the journalist alone, for the emphasis narrative ethics places on empowerment encourages the development of a critical mass consciousness. The journalist who views stories as symbolic frameworks that shape the human experience (Christians et al., 1993, p. 114) is compelled to write stories that raise community consciousness. And raised consciousness leads to mass engagement, which in turn allows humanity to prosper enormously (p. 107). Thus, “[n]arration gives order to social life by inducing others to participate with us in its meaning” (p. 114).

Because narrative ethics focuses on the revelation of complexity as an essential step in the journalistic process of meaning creation, it is summarily suspicious of the language of false binaries. While it acknowledges that the asking and answering of questions about good and evil or right and wrong is central to the project of communal meaning construction, it also pushes journalists to question this-or-that dichotomies, to mine the seams of a neglected middle in search of more nuanced perspectives. No longer must citizens searching for meaning drink to the dregs the watery stews of a two-valued orientation. Instead, they’re encouraged to think more critically, to abandon default binary thinking and address the gray territories of their public and private lives.

But critical consciousness remains little more than a dogma until it is tethered to a concrete means of application. Thus, a journalism based in narrative ethics, recognizing that language shapes reality and that “to label the world is to transform it” (Christians et al., 1993, p. 105), also focuses on helping communities develop the critical language skills that will allow them to construct a more precise alphabet of meaning.

As a starting point, narrative ethics recognizes the limitations of language by acknowledging its imprecision. Because “language abstracts, just as a map does” (Christians et al., 1993, p. 117), and the sounds and symbols humans use to describe the ideas and objects they come up against in daily life are by nature inadequate to describe experience completely, journalists must work to avoid high-level abstractions. They must aim instead to force the particulars of a story down the ladder of abstraction and into the realm of the concrete whenever possible.

Further, they must insist on precision, and aim for the inclusion of a variety of perspectives. Citizens, for their part, are encouraged not only to participate in the construction of new frameworks, but also to question the validity of existing frameworks, to ask questions of journalists and of each other. This is empowering, for “through literacy and social activism, the oppressed learn to name their world” (Christians et al., 1993, p. 105).

In a many-dimensioned world, the old direct transmission metaphors of journalism read as flat, simplistic. Conversely, a brand of journalism emphasizing narrative ethics advocates a process of doing and consuming journalism that is more mutual, more dialogic. Perhaps our maps will never become precise, perhaps language will always hide more than it reveals, but if nothing else, communities empowered to participate in that imperfect science of plotting coordinates, in that difficult process of attaching words to experiences, are far better equipped to navigate the difficult terrain of human experience.

References

Christians, C. G., Fackler, P. M., & Ferre, J. P. (1993). Good news: Social ethics and the press. New York: Oxford University Press.

Korzybski, A. (1933). Science and sanity. Lancaster: The Science Press.

______________________________________

Holes

I’ve always had this problem with letting things go.

I can’t quite explain it. It’s like my brain is hardwired to cling to sensory data.

All data, good or ungood, absorbed into the frenetic, pulsing jumble that is my head.

I remember everything. Seriously, it’s almost creepy. I remember shitting my diapers. I remember the names of my tablemates from preschool and entire conversations we had over tiny cups of Mott’s apple juice. I can recite lengthy, tepid monologues from movies I haven’t seen or thought about in 20 years (which makes me really, super (not) fun to watch films with … just ask my family). It’s fucking endless.

It’s not that I’m particularly proud of this personality quirk of mine. You see, it gets mighty hard to make room in your head for new, good things when your mind is constantly spinning its way around an elaborate matrix of loops.

Alas, it’s just how I’m bent.

I suppose on some deep level I hate the thought of all the things in this world that are gone away. I hate that no one remembers the baby worms wriggling in a pile of dirt outside our kindergarten classroom, or the singular beauty of this one moment when we were 15 and we smoked a joint and sat together on my living room couch and the sun came pouring through the blinds and just set the room afire with perfect light.

I feel compelled to rescue such defunct moments from the clutches of time. So I categorize and inventory them ceaselessly. Then I hit rewind, over and over.

Some people amass collector coins or VHS tapes or porcelain birds. I hoard memories.

My unwillingness to let go has extended, also, to the physical. Goodbyes of any kind feel, to me, like ripping off a limb. My packrat tendencies have mostly been reformed by a string of unindulgent roommates and more than 25 moves across 14 cities in a single decade, but I still don’t forgive or forget easily. And I still mourn for lost things – a brown finch that mysteriously vanished from its cage one September morning during my 8th year, a cool shoe purse that I lost at the dollar store when I was nine, a couple of almost-full passports, a few boys I loved too hard, or perhaps not hard enough.

I’ve tended to avoid loss at any cost.

In the end, though, I’m coming to realize that this way of being comes with its own set of less tangible but equally mighty costs.

* * *

Considering my propensity toward limerence, it’s probably unsurprising that I never got my wisdom teeth out when I was a teenager.

It’s a fairly routine dental surgery, and a rite of passage as time-tested as crashing your mom’s car or pulling off your panties come prom night.

I did crash a car my senior year of high school, but most other traditions I bucked vehemently, including the ceremonial tooth extractions that mark so many young Americans for adulthood.

My mom needled me about making an appointment for a few years, but she eventually gave up.

Throughout my 20s, I was plagued by vague, insistent toothaches as a result of my refusal to give the teeth up.

I started to grind my molars together at night. As time passed, I noticed with some dismay that one of my bottom front teeth was being slowly and unceremoniously crowded out of its rightful spot.

There was no denying that those rogue teeth were on the move, but I was too busy traveling and job-hopping and fucking up to bother with things like dentistry.

I did actually schedule the surgery – twice – during my 20s, but in both instances, I canceled at the last minute. I was always too broke, or in a strange foreign country with no one to drive me home from the clinic. So it went.

As I approached 30, I kept at the grasping and clinging. To people. To lost places. To ideas, both about myself and others. I changed careers, cities, again and again.

I turned 30 last summer, and soon after that landmark birthday, I decided it was finally time.

My mouth ached incessantly by then. A tiny tooth edge had broken through my back lower left gum, right on top of my 12-year molar, the tip of a surfacing iceberg for which there was nowhere near enough room.

A fissure had appeared too, in the rest of my life. I couldn’t get up in the morning, no matter how much I slept. When I wasn’t working, I was thinking about hating working. I’d wake up in the night wincing at the gnash of my teeth. The little front tooth had turned practically sideways.

I retraced my steps, trying to figure out where I’d gone wrong, my mind awhirl with recalling and recasting.

Then, one day last September, it all became too much.

I scheduled the dental surgery and I decided to quit my job.

On my last day of work a few weeks ago, I woke atop a pillowful of tears, my jaw clenched tight.

I’d been dreaming hard. I’d gathered into my arms a little girl I knew years ago. In carelessness, I’d hugged her so hard her tiny back broke.

In the dream, I’d gone to visit her mother, to beg forgiveness and to discuss reparations.

“This is going to get ugly,” I told her as we cried together.

Sitting on the edge of my bed, I tried to get hold of myself, but it felt as if this great flood had come bursting up from somewhere way deep down. My mouth and the side of my head ached for release.

I knew it was silly, but I wanted to keep crying for what my dream self did, for my carelessness, for how I’ve always needed too much and grasped at everything so hard.

Instead, I accepted a hug from Morgan, I dragged my hands down my face and I rose, dressed, and headed to the office for a final time.

I finished my work that afternoon and I carried out my things in an old Skyy Vodka box. Sort of a sad amalgamation – a few old folders full of newspaper clippings, a tube of mint lotion, a Chapstick, a notebook, my favorite silver mechanical pencil. It seemed so little to show for such effort.

“When a hole appears in your life, be careful not to fill it too quickly,” my coworker advised me.

That night, I met Morgan for Mexican food at a divey little place on U.S. Highway 101.

The waiter was a shy, old hombre with a bloodshot eye. He spoke to us softly in Spanish and we flexed our rusty language skills back at him. I ate a taco salad and I worked at being easy and ready. But I couldn’t stop gritting my teeth. It was like my mouth was clamping shut of its own accord.

* * *

Before bed that night, I took two heavy-duty sleeping pills, as prescribed. They would sedate me, make me forgetful, finally.

I remember little of what came afterward.

An alarm. A bowl of lentils. More sedatives. A car ride. An improbably huge needle.

Me, in the dentist’s chair, clenching a tiny Buddha figurine in my hand so hard I break the skin on my palm. Opening and closing my eyes to puzzle over the sudden and strange doubling of everything in sight. So many pairs of hands.

The pulling and cracking, the teeth refusing to give way. Commotion. A grunt of frustration from the dentist. The sides of my mouth tearing and bleeding.

Once, they hit a nerve, and angry energy raced through my bones, making me jump – something thumping and zinging and adamant inside of me that protested at the prospect of giving up even just one hunk of the matter that proved, certifiably, that I existed.

Hours later, I stood, or was stood up.

I asked to see the teeth. The nurse guided me to a bloody little pile of bones on a blue napkin in the corner of the room. She picked one up and pointed to a series of raised root ridges.

“See those bumps?” she asked. “Those are what made your teeth so hard to pull out.”

A genetic quirk. A lifelong battle. Every part of me fighting extraction of any kind.

I asked to keep one of the teeth, or perhaps I only thought I’d asked. Morgan led me, empty handed, to the car.

On the drive home, he tells me, I kept pulling wads of bloody gauze from my mouth. My head lurched backward and forward as I mumbled incoherently and pointed to the colorful whoosh of cars and passing billboards.

For two days, I cried, I slept, and I dreamed my dreams. I forced down pudding and eggs and watched movies I don’t remember. I slept more. I itched. I ate pain pills until the world turned soft pink and yellow and my voice thundered in my ears.

I rested. I twitched and I dreamed and I remembered and I forgot again.

As I emerged from the fog, my head felt airy and empty. So did my future. So many gone-away things. So many holes begging to be filled back in.

I knew I wasn’t supposed to, but I tongued at the big spaces in the back of my mouth. I discovered a stitch and a strange landscape of swollen ridges and crevices. I healed.

* * *

And from the ordeal, one Big Adjustment to the way I understand my own earthly allotment of joy and misery.

If there is such a thing as Hell, I think it exists in the heaviness we consent to carry inside of us, mile by mile, when we fill our hands too full of dust and blood and old slivers of bone. When we just won’t let go.

I’m trying not to be so afraid of the hellish bits anymore, of the partings and the losses and the forgettings. But I’ve also resolved to be less cowed by the beautiful bits, too.

I am not a religious person, but I’m learning to place my faith in a lesser brand of redemption. I call them my three minor commandments: be kind, try hard, and make reparations when you fuck up.

Because, trust me, you will fuck up. Things will get ugly. Backs and hearts and jaws will break in equal measure.

I had been stumbling through life with a head so stuffed full of memories and sharp bits of bone that it was turning me to marble, cold-white and implacable.

Now I am hollowed out. I am jobless, and three teeth lighter, and hard at work shedding some of that massive storehouse of remembrances.

While I was hopped up on pain pills, I threw away half of my photographs.

And now, whenever I feel a memory rising, I pause to reconsider its worth. If I really need it, or it really needs me, I’ll tuck it back to bed. But more often, I loose the knot and let it float away instead. Out my ear, or my mouth, or wherever else.

In a single month, I’ve been punched so miraculously full of holes.
