Thursday, January 29, 2009
In retrospect, George W. Bush’s presidency could be viewed as a science-fiction disaster movie in which an alien force seizes illegitimate control of a nation, saps its wealth, wreaks devastation, but is finally dislodged and forced to depart amid human hope for a rebirth.
There was even a satisfying concluding scene as a new human leader takes power amid cheers of a liberated populace. The alien flees aboard a form of air transportation (in this case, a helicopter), departing to the jeers of thousands and many wishes of good riddance.
After Bush’s departure on Jan. 20, 2009, the real-life masses actually had the look of survivors in a disaster movie, dressed mostly in ragtag clothing—ski caps, parkas, boots and blankets—bent against the cold winds as they trudged through streets largely devoid of traffic.
My 20-year-old son, Jeff, and I made our way home from the Mall to our house in Arlington, Virginia, by hiking across the 14th Street Bridge, part of the normally busy Interstate 395, except that only buses and official vehicles were using it on Inauguration Day.
So, the bridge became an impromptu walkway with clumps of half-frozen pedestrians straggling across it, over the icy Potomac. Jeff and I picked an exit ramp near the Pentagon, clambered over some road dividers, and worked our way to Pentagon City where we’d parked the car. It took much of the afternoon and evening for the cold to work its way out of our bodies.
Everyone I’ve talked to who attended Barack Obama’s Inauguration had similar tales of transportation woes—standing in long lines in freezing temperatures, frustrated by jammed subway stations, walking long distances—but no one was angry. Remarkably, police reported no Inaugural-related arrests.
Despite the grim economy and other havoc left behind by Bush and his associates, Inauguration Day 2009 was filled with a joy that I have rarely seen on the streets of Washington, a city that even at its best is not known for spontaneous bursts of happiness.
But there was more than joy that day; there was a sense of liberation.
An estimated 1.8 million people braved the frigid temperatures and the transportation foul-ups to witness not only Obama’s swearing-in, but Bush’s ushering-out. They not only cheered Obama and other favorites, but many booed those considered responsible for the national plundering, especially Bush and the wheelchair-bound Dick Cheney.
Wednesday, January 28, 2009
From WorldHum via Arts & Letters Daily.
Visit the Museo de la Revolución in central Havana, and two things about the museum’s photo displays will immediately capture your attention. First, it’s clear that the battle to control Cuba in the late 1950s was ultimately won by the cool guys. Young, bearded and ruggedly handsome, the rebel warriors of Fidel Castro’s 26th of July Movement look like Beat hipsters and rock stars—Fidel tall and imposing in his fatigues; Camilo Cienfuegos grinning under his broad-brimmed cowboy hat; Ernesto “Che” Guevara looking smolderingly photogenic in his black beret. By contrast, the U.S.-backed dictator Fulgencio Batista and his cronies look bloated, balding and unquestionably corrupt in their stubby neckties and damp armpits and oversized paunches. Even without reading the captions, it’s easy to discern the heroes from the villains.
Look closer, however, and you’ll notice that the triumphant photos of Fidel and Che are faded and mildewed, their corners curled by age and humidity. The photo captions are spelled out in a clunky die-cast typeface that hasn’t been used in a generation, and contain glowing present-tense references to the magnanimity of the Soviet Union—a country that hasn’t existed since 1991. Despite the grungy glamour of the young men who toppled a tyrant all those years ago, the anachronism and decay of the museum’s exhibits reveal just how tired and toothless Cuba’s revolutionary myths have become in Havana. In many ways, the building is a museum of a museum—a yellowing relic of how the communist regime chose to portray itself in the 1970s.
Step outside the Museo de la Revolución into the humid Havana air, and the glamorous sheen of the bygone Cuban revolution seems to have been distilled into a single image—Alberto Korda’s famous 1960 photo of a bearded Che Guevara looking steely and determined in his beret. In a city where few buildings outside the restored Habana Vieja district have seen a new coat of paint in half a century, freshly retouched renderings of Che’s mug adorn countless walls and billboards. Moreover, in a country largely devoid of public advertising and religious iconography, Guevara’s ubiquitous image appears to fill the role of both Jesus Christ and Ronald McDonald—a sainted martyr of unwavering purity who also happens to promote a meticulously standardized (if not particularly nutritious) political menu.
Tuesday, January 27, 2009
Monday, January 26, 2009
What does the contemporary self want? The camera has created a culture of celebrity; the computer is creating a culture of connectivity. As the two technologies converge — broadband tipping the Web from text to image, social-networking sites spreading the mesh of interconnection ever wider — the two cultures betray a common impulse. Celebrity and connectivity are both ways of becoming known. This is what the contemporary self wants. It wants to be recognized, wants to be connected: It wants to be visible. If not to the millions, on Survivor or Oprah, then to the hundreds, on Twitter or Facebook. This is the quality that validates us, this is how we become real to ourselves — by being seen by others. The great contemporary terror is anonymity. If Lionel Trilling was right, if the property that grounded the self, in Romanticism, was sincerity, and in modernism it was authenticity, then in postmodernism it is visibility.
Thursday, January 22, 2009
Nat King Cole and Francis Albert Sinatra.
Howard Hawks is momentarily distracted on the set of Rio Bravo (1958).
These great photos and many more at
If Charlie Parker Was a Gunslinger, There'd Be a Whole Lot of Dead Copycats.
Wednesday, January 21, 2009
They sicken of the calm, who knew the storm.
Oh, life is a glorious cycle of song,
A medley of extemporanea;
And love is a thing that can never go wrong;
And I am Marie of Romania.
She runs the gamut of emotions from A to B.
(speaking of Katharine Hepburn).
There's a hell of a distance between wise-cracking and wit. Wit has truth in it; wise-cracking is simply calisthenics with words.
This is not a novel to be tossed aside lightly. It should be thrown with great force.
(regarding "Atlas Shrugged").
It serves me right for putting all my eggs in one bastard.
(On her abortion).
You can lead a horticulture, but you can't make her think.
(When asked to use the word horticulture during a game of Can-You-Give-Me-A-Sentence).
Razors pain you,
Rivers are damp,
Acids stain you,
And drugs cause cramp.
Guns aren't lawful,
Nooses give,
Gas smells awful.
You might as well live.
I require only three things of a man. He must be handsome, ruthless and stupid.
It was produced according to the fine old Shubert precept that "nothing succeeds like undress."
At a dinner party, the actress Ilka Chase defended Clare Boothe Luce by saying that not only was Clare loyal to her friends, but "she's very kind to her inferiors." Dorothy countered with, "And where does she find them?"
The transatlantic crossing was so rough the only thing that I could keep on my stomach was the first mate.
The Monte Carlo casino refused to admit me until I was properly dressed so I went and found my stockings, and then came back and lost my shirt.
Monday, January 19, 2009
The disadvantages of being a writer who is often written about are numerous. I begin with an enthusiastic call to my 81-year-old mother, hoping to share my enthusiasm from an assignment abroad. "Hey ma..." "I know," she says. "You're on Jupiter, it's all over the Internet. They say you're cavorting with the planet's president! They say he's anti-earth! And Sean, why is your hair so big in the pictures?" I muse, "Lack of gravity?" "That's what Hannity said!!" she tells me. It seems that American movies are pretty popular in faraway places, and one must dance a bit to avoid being more a spun story than the true story one intends to tell. However, there are also grand upsides.
I have been in the public eye to varying degrees for most of my 48 years, and have had many occasions to sit in the front row of popular and political culture. I can speak firsthand to bearing witness to an often untruthful, reckless and demonizing media. Yes, in many cases, the smoke would prove an accurate expectation of fire. But the fact is that our most respected media, call that mainstream media, in print and on television, are, in part, conscious manufacturers of deception. In one case, I have photographic evidence. It was widely reported that I had commissioned my own photographer to self-promote my involvement among many other volunteers in New Orleans in the aftermath of Katrina. This simply did not happen. Though the notion of self-promotion had not occurred to me, I did later regret that I had not gotten some snaps of the devastation I saw. I will probably bring someone along to document the next fuck-up of media or government. Meanwhile, I challenge anyone to hunt up the few pictures that were taken by the random photojournalists who'd stumbled upon me, and find a single one that would've passed the test of my own narcissistic scrutiny. But a benefit greater than the insight offered by this front row seat is finding that having a public persona, inclusive of a perceived open mind to the qualities of countries outside one's own, may grant breathtaking access.
Who'd a thunk? There I was with the biggest hair on the planet. Oh yeah. Big, big hair. It does that in the tropics. It gets big. And I mean American big, baby. And there I was with my big, American hair, finding faith in American democracy in the unlikeliest of places. Sitting in the Salon de Protocol at the Convention Palace in Havana's Miramar district, all I had to do was tell the five-foot-six bespectacled man who sat in the chair across from me in his khaki dress uniform that these words would not be published until after the American election. And with that, granting his first-ever interview to a foreign journalist since the beginning of the 1957 Cuban revolution, President Raul Castro smiled warmly and simply said, "We want Obama." His initial reluctance was due to a concern that an endorsement by a Cuban president might be detrimental to the Obama candidacy. And this is where the faith came in: though Obama would be the 11th American president in the long history of the Castro brothers' reign; despite tumultuous U.S.-Cuban relations dating from what Henry Cabot Lodge called "the large policy," the justification for American violations of the Teller Amendment in the late 1800s; despite multiple assassination attempts by the CIA on his older brother Fidel, the destabilization tactics of Robert F. Kennedy, the Bay of Pigs, and the Platt Amendment with the taking of Guantanamo Bay; and even despite an endless and unjustified embargo (in effect, a blockade) on Cuba by the United States, here we were in 2008, and Raul Castro said flat out that if the American people, who today stand with candidate Barack Obama, continue to stand with President Barack Obama, then "meaningful and productive advances could be achieved in Cuba and the world."
In my anticipation of a brief interview, I pulled from my pocket the dwindling remains of my small note pad. Again, Castro smiled, and slid a fresh, full pad across the small polished table to me. We would spend the next seven hours together.
Part Two here
Sunday, January 18, 2009
WHAT WILL CHANGE EVERYTHING?
DANIEL C. DENNETT
Philosopher; University Professor, Co-Director, Center for Cognitive Studies, Tufts University; Author, Breaking the Spell
THIS VERY EXPLORATION IS CHANGING EVERYTHING
What will change everything? The question itself and many of the answers already given by others here on Edge.org point to a common theme: reflective, scientific investigation of everything is going to change everything. When we look closely at looking closely, when we increase our investment in techniques for increasing our investment in techniques... for increasing our investment in techniques, we create non-linearities — like Doug Hofstadter's strange loops — that amplify uncertainties, allowing phenomena that have heretofore been orderly and relatively predictable to escape our control. We figure out how to game the system, and this initiates an arms race to control or prevent the gaming of the system, which leads to new levels of gamesmanship and so on.
The snowball has started to roll, and there is probably no stopping it. Will the result be a utopia or a dystopia? Which of the novelties are self-limiting, and which will extinguish institutions long thought to be permanent? There is precious little inertia, I think, in cultural phenomena once they are placed in these arms races of cultural evolution. Extinction can happen overnight, in some cases. The almost frictionless markets made possible by the internet are already swiftly revolutionizing commerce.
Will universities and newspapers become obsolete? Will hospitals and churches go the way of corner grocery stores and livery stables? Will reading music soon become as arcane a talent as reading hieroglyphics? Will reading and writing themselves soon be obsolete? What will we use our minds for? Some see a revolution in our concept of intelligence, either because of "neurocosmetics" (Marcel Kinsbourne), quantum computing (W. H. Hoffman), or "just in time storytelling" (Roger Schank). Nick Humphrey reminds us that when we get back to basics — procreating, eating, just staying alive — not that much has changed since Roman times, but I think that these are not really fixed points after all.
Our species' stroll through Design Space is picking up speed. Recreational sex, recreational eating, and recreational perception (hallucinogens, alcohol) have been popular since Roman times, but we are now on the verge of recreational self-transformations that will dwarf the modifications the Romans indulged in. When you no longer need to eat to stay alive, or procreate to have offspring, or locomote to have an adventure-packed life, when the residual instincts for these activities might be simply turned off by genetic tweaking, there may be no constants of human nature left at all. Except, maybe, our incessant curiosity.
YOCHAI BENKLER
Berkman Professor of Entrepreneurial Legal Studies, Harvard; Author, The Wealth of Networks: How Social Production Transforms Markets and Freedom
RECOMBINATIONS OF THE NEAR POSSIBLE
What will change everything within forty to fifty years (optimistic assumptions about my longevity, I know)? One way to start to think about this is to look at the last “change everything” innovation, and work back fifty years from it. I would focus on the Internet's generalization into everyday life as the relevant baseline innovation that changed everything. We can locate its emergence into widespread use in the mid-1990s. So what did we have in the mid-1940s that was a precursor? We had mature telephone networks, networked radio stations, and point-to-point radio communications. We had the earliest massive computers. So to me the challenge is to look at what we have now, some of which may be quite mature, other pieces of which may be only emerging, and to think of how they could combine so as to affect social and cultural processes in ways that will “change everything,” which I take to mean: make a big difference to the day-to-day life of many people. Let me suggest four domains in which combinations and improvements of existing elements, some mature, some futuristic, will make a substantial difference, not all of it good.
We already have hands-free devices. We already have transparent overhead displays in fighter pilot helmets. We already have presence-based and immediate communications. We already upload images and movies, on the fly, from our mobile devices, and share them with friends. We already have early holographic imaging for conference presentations, and high-quality 3D imaging for movies. We already have voice-activated computer control systems, and very early brainwave-activated human-computer interfaces. We already have the capacity to form groups online, segment and reform them according to need, be they in World of Warcraft or Facebook groups. What is left is to combine all these pieces into an integrated, easily wearable system that will, for all practical purposes, allow us to interact as science fiction once imagined telepathy working. We will be able to call upon another person by thinking of them; or, at least, whispering their name to ourselves. We will be able to communicate and see them; we will be able to see through their eyes if we wish to, in real time, in resolution so high that it will seem as though we were in fact standing there, next to them or inside their shoes. However much we think now that collaboration at a distance is easy, what we do today will seem primitive. We won't have “beam me up, Scotty” physically, but we will have a close facsimile of the experience. Coupled with concerns over global warming, these capabilities will make business travel seem like wearing fur. However much we talk about telecommuting today, these new capabilities, together with new concerns over environmental impact, will make virtual workplaces in the information segments of the economy as different from today's telecommuting as today's ubiquitous computing and mobile platforms are from the mini-computer “revolution” of the 1970s.
It is entirely plausible that 110 or 120 will be an average life expectancy, with senescence delayed until 80 or 90. This will change the whole dynamic of life: how many careers a lifetime can support; what the ratio of professional moneymaking to volunteering will be; how early in life one starts a job; the length of training. But this will likely affect, if at all within the relevant period, only the wealthiest societies. Simple innovations that are more likely will have a much wider effect on many more people. A cheap and effective malaria vaccine. Cheap and ubiquitous clean water filters. Cheap and effective treatments and prevention techniques against parasites. All of these will change life in the Global South on scales, and with values, that will swamp, from the perspective of a broad concern with human values, whatever effects lengthening life in the wealthier North will have.
We already have unmanned planes that can shoot live targets. We are seeing land robots, for both military and space applications. We are seeing networked robots performing functions in collaboration. I fear that we will see a massive increase in the deployment and quality of military robotics, and that this will lead to a perception that war is cheaper, in human terms. This, in turn, will lead democracies in general, and the United States in particular, to imagine that there are cheap wars, and to overcome the reticence over war that we learned so dearly in Iraq.
(Casa del ionesco editor's note: see Robots at War: The New Battlefield for P. W. Singer's perspective on the future of military robotics.)
Free market ideology
This is not a technical innovation but a change in the realm of ideas. The resurgence of free market ideology, after its demise in the Great Depression, came to dominance between the 1970s and the late 1990s as a response to communism. As communism collapsed, free market ideology triumphantly declared its dominance. In the U.S. and the UK it expressed itself, first, in the Reagan/Thatcher moment; and then was generalized in the Clinton/Blair turn to define their own moment in terms of integrating market-based solutions as the core institutional innovation of the “left.” It expressed itself in Europe through the competition-focused, free market policies of the technocratic EU Commission; and in global systems through the demands and persistent reform recommendations of the World Bank, the IMF, and the world trade system through the WTO. But within less than two decades, its force as an idea is declining. On the one hand, the Great Deflation of 2008 has shown the utter dependence of human society on the possibility of well-functioning government to assure some baseline stability in human welfare and capacity to plan for the future. On the other hand, a gradual rise in volunteerism and cooperation, online and offline, is leading to a reassessment of what motivates people, and how governments, markets, and social dynamics interoperate. I expect the binary State/Market conception of the way we organize our large systems to give way to a more fluid set of systems, with greater integration of the social and commercial, as well as of the state and the social. So much of life, in so many of our societies, was structured around either market mechanisms or state bureaucracies. The emergence of new systems of social interaction will affect what we do, and where we turn for things we want to do, have, and experience.
MARTI HEARST
Computer Scientist, UC Berkeley, School of Information; Author, Search User Interfaces
THE DECLINE OF TEXT
As an academic I am of course loath to think about a world without reading and writing, but with the rapidly increasing ease of recording and distributing video, and its enormous popularity, I think it is only a matter of time before text and the written word become relegated to specialists (such as lawyers) and hobbyists.
Movies have already replaced books as cultural touchstones in the U.S., and most Americans dislike watching movies with subtitles. I assume that given a choice, the majority of Americans would prefer a video-dominant world to a text-dominant one. (Writing as a technologist, I don't feel I can speak for other cultures.) A recent report by Pew Research included a quote from a media executive who said that emails containing podcasts were opened 20% more often than standard marketing email. And I was intrigued by the use of YouTube questions in the U.S. presidential debates. Most of the citizen-submitted videos that were selected by the moderators consisted simply of people pointing the camera at themselves and speaking their question out loud, with a backdrop consisting of a wall in a room of their home. There were no visual flourishes; the video did not add much beyond what a questioner in a live audience would have conveyed. Video is becoming a mundane way to communicate.
Note that I am not predicting the decline of erudition, in the tradition of Allan Bloom. Nor am I arguing that video will make us stupid, as in Neil Postman's landmark "Amusing Ourselves to Death." The situation is different today. In Postman's time, the dominant form of video communication was television, which allowed only for one-way, broadcast-style interaction. We should expect different consequences when everyone uses video for multi-way communication. What I am espousing is that the forms of communication that will do the cultural "heavy lifting" will be audio and video, rather than text.
How will this come about? As a first step, I think there will be a dramatic reduction in typing; input of textual information will move towards audio dictation. (There is a problem of how to avoid disturbing officemates or exposing seat-mates on public transportation to private information; perhaps some sound-canceling technology will be developed to solve this problem.) This will succeed in the future where it has failed in the past because of future improvements in speech recognition technology and ease-of-use improvements in editing, storage, and retrieval of spoken words.
There already is robust technology for watching and listening to video at a faster speed than recorded, without undue auditory distortion (Microsoft has an excellent in-house system for this). And as noted above, technology for recording, editing, posting, and storing video has become ubiquitous and easy to use. As for the use of textual media to respond to criticisms and to cite other work, we already see "video responses" as a heavily used feature on YouTube. One can imagine how technology and norms will develop to further enrich this kind of interaction.
The missing piece in technology today is an effective way to search for video content. Automated image analysis is still an unsolved problem, but there may well be a breakthrough on the horizon. Most algorithms of this kind are developed by "training", that is, by exposing them to large numbers of examples. The algorithms, if fed enough data, can learn to recognize patterns which can be applied to recognize objects in videos the algorithm hasn't yet seen. This kind of technology is behind many of the innovations we see in web search engines, such as accurate spell checking and improvements in automated language translation. Not yet available are huge collections of labeled image and video data, where words have been linked to objects within the images, but there are efforts afoot to harness the willing crowds of online volunteers to gather such information.
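The "training" described above, exposing an algorithm to labeled examples so it can recognize patterns in data it hasn't yet seen, can be sketched with a toy one-nearest-neighbor classifier. Everything here (the two-dimensional feature vectors, the "cat"/"dog" labels, the distance measure) is invented purely for illustration; real image and video recognition systems learn far richer features from millions of labeled examples.

```python
# Toy supervised learning: classify an unseen sample by finding the
# closest labeled training example (1-nearest-neighbor, pure Python).

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(training_data, sample):
    """Return the label of the training example nearest to `sample`."""
    nearest = min(training_data, key=lambda ex: distance(ex[0], sample))
    return nearest[1]

# Labeled examples: (feature vector, label). In a real system the
# feature vectors would come from automated image analysis; these
# numbers are made up for the sketch.
training_data = [
    ((0.9, 0.1), "cat"),
    ((0.8, 0.2), "cat"),
    ((0.1, 0.9), "dog"),
    ((0.2, 0.8), "dog"),
]

# An unseen sample close to the "cat" examples is labeled accordingly.
print(predict(training_data, (0.85, 0.15)))  # prints: cat
```

The point of the sketch is only the shape of the process: patterns generalize from labeled data to new inputs, which is why large labeled image collections are the missing ingredient for video search.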
What about developing versus developed nations? There is of course an enormous literacy problem in developing nations. Researchers are experimenting with cleverly designed tools such as the Literacy Bridge Talking Book project which uses a low-cost audio device to help teach reading skills. But perhaps just as developing nations "leap-frogged" developed ones by skipping land-line telephones to go straight to cell phones, the same may happen with skipping written literacy and moving directly to screen literacy.
I am not saying text will disappear entirely; one counter-trend is the replacement of orality with text in certain forms of communication. For short messages, texting is efficient and unobtrusive. And there is the question of how official government proclamations will be recorded. Perhaps there will be a requirement for transliteration into written text as a provision of the Americans with Disabilities Act, for the hearing-impaired (although we can hope in the future for increasingly advanced technology to reverse such conditions). But I do think the importance of written words will decline dramatically both in culture and in how the world works. In a few years, will I be submitting my response to the Edge question as a podcast?
DAVID EAGLEMAN
Assistant Professor of Neuroscience, Baylor College of Medicine; Author, Sum
SILICON IMMORTALITY: DOWNLOADING CONSCIOUSNESS INTO COMPUTERS
While medicine will advance in the next half century, we are not on a crash course for achieving immortality by curing all disease. Bodies simply wear down with use. We are on a crash course, however, with technologies that let us store unthinkable amounts of data and run gargantuan simulations. Therefore, well before we understand how brains work, we will find ourselves able to digitally copy the brain's structure and to download the conscious mind into a computer.
If the computational hypothesis of brain function is correct, it suggests that an exact replica of your brain will hold your memories, will act and think and feel the way you do, and will experience your consciousness — irrespective of whether it's built out of biological cells, Tinkertoys, or zeros and ones. The important part about brains, the theory goes, is not the structure; it is the algorithms that ride on top of the structure. So if the scaffolding that supports the algorithms is replicated — even in a different medium — then the resultant mind should be identical. If this proves correct, it is almost certain we will soon have technologies that allow us to copy and download our brains and live forever in silica. We will not have to die anymore. We will instead live in virtual worlds like the Matrix. I assume there will be markets for purchasing different kinds of afterlives, and sharing them with different people — this is the future of social networking. And once you are downloaded, you may even be able to watch the death of your outside, real-world body, in the manner that we would view an interesting movie.
Of course, this hypothesized future embeds many assumptions, the speciousness of any one of which could topple the house of cards. The main problem is that we don't know exactly which variables are critical to capture in our hypothetical brain scan. Presumably the important data will include the detailed connectivity of the hundreds of billions of neurons. But knowing the point-to-point circuit diagram of the brain may not be sufficient to specify its function. The exact three-dimensional arrangement of the neurons and glia is likely to matter as well (for example, because of three-dimensional diffusion of extracellular signals). We may further need to probe and record the strength of each of the trillions of synaptic connections. In a still more challenging scenario, the states of individual proteins (phosphorylation states, exact spatial distribution, articulation with neighboring proteins, and so on) will need to be scanned and stored. It should also be noted that a simulation of the central nervous system by itself may not be sufficient for a good simulation of experience: other aspects of the body may require inclusion, such as the endocrine system, which sends and receives signals from the brain. These considerations potentially lead to billions of trillions of variables that need to be stored and emulated.
The other major technical hurdle is that the simulated brain must be able to modify itself. We need not only the pieces and parts, but also the physics of their ongoing interactions — for example, the activity of transcription factors that travel to the nucleus and cause gene expression, the dynamic changes in location and strength of the synapses, and so on. Unless your simulated experiences change the structure of your simulated brain, you will be unable to form new memories and will have no sense of the passage of time. Under those circumstances, is there any point in immortality?
The good news is that computing power is blossoming sufficiently quickly that we are likely to make it within a half century. And note that a simulation does not need to be run in real time in order for the simulated brain to believe it is operating in real time. There's no doubt that whole brain emulation is an exceptionally challenging problem. As of this moment, we have no neuroscience technologies geared toward ultra-high-resolution scanning of the sort required — and even if we did, it would take several of the world's most powerful computers to represent a few cubic millimeters of brain tissue in real time. It's a large problem. But assuming we haven't missed anything important in our theoretical frameworks, then we have the problem cornered and I expect to see the downloading of consciousness come to fruition in my lifetime.
KEVIN SLAVIN
Digital Technologist; Managing Director, Co-Founder, area/code
THE EBB OF MEMORY
In just a few years, we’ll see the first generation of adults whose every breath has been drawn on the grid. A generation for whom every key moment (e.g., birth) has been documented and distributed globally. Not just the key moments, of course, but also the most banal: eating pasta, missing the train, and having a bad day at the office. Ski trips and puppies.
These trips and puppies are not simply happening; they are becoming data, building up the global database of distributed memories. They are networked digital photos – 3 billion on Flickr, 10 billion on Facebook. They were blog posts, and now they are tweets, too (a billion in 18 months). They are Facebook posts, Dopplr journals, Last.FM updates.
Further, more and more of the traces we produce will be passive or semi-passive. Consider Loopt, which allows us to track ourselves and our friends through GPS. Consider voicemail transcription bots that transcribe the voice messages we leave into searchable text in email boxes, on into eternity. The next song you listen to will likely be stored in a database record somewhere. The next time you take a phonecam photo, it may well have the event’s latitude and longitude baked into the photo’s metadata.
The sharp upswing in all of this record-keeping – both active and passive – is redefining one of the core elements of what it means to be human, namely to remember. We are moving towards a culture that has outsourced this essential quality of existence to machines, to a vast and distributed prosthesis. This infrastructure exists right now, but very soon we’ll be living with the first adult generation whose entire lives are embedded in it.
In 1992, the artist Thomas Bayrle wrote that the great mistakes of the future would be that as everything became digital, we would confuse memory with storage. What’s important about genuine memory and how it differs from digital storage is that human memory is imperfect, fallible, and malleable. It disappears over time in a rehearsal and echo of mortality; our abilities to remember, distort and forget are what make us who we are.
We have built the infrastructure that makes it impossible to forget. As it hardens and seeps into every element of daily life, it will make it impossible to remember. Changing what it means to remember changes what it means to be.
There are a few people who already have perfect episodic memory, total recall: neurological edge cases. They are harbingers of the culture to come. One of them, Jill Price, was profiled in Der Spiegel:
"In addition to good memories, every angry word, every mistake, every disappointment, every shock and every moment of pain goes unforgotten. Time heals no wounds for Price. 'I don't look back at the past with any distance. It's more like experiencing everything over and over again, and those memories trigger exactly the same emotions in me. It's like an endless, chaotic film that can completely overpower me. And there's no stop button.'"
This also describes the life of Steve Mann, passively recording his life through wearable computers for many years. This is an unlikely future scenario, but like any caricature, it is based on human features that will be increasingly recognizable. The processing, recording and broadcasting prefigured in Mann’s work will be embedded in everyday actions like the twittering, phonecam shots and GPS traces we broadcast now. All of them entering into an outboard memory that is accessible (and searchable) everywhere we go.
Today is New Year’s Eve. I read today (on Twitter) that three friends, independent of each other, were looking back at Flickr to recall what they were doing a year ago. I would like to start the New Year being able to remember 2008, but also to forget it.
For the next generation, it will be impossible to forget it, and harder to remember. What will change everything is our ability to remember what everything is. Was. And wasn’t.
Friday, January 16, 2009
Thursday, January 15, 2009
“Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.” Sound familiar? Describing, in The Atlantic Monthly, his own struggles to keep his attention span from contracting like the wild ass’s skin in Balzac’s novel, Nicholas Carr cites a British study of research habits among visitors to two serious scholarly websites which suggests a more general problem: that “users are not reading online in the traditional sense; indeed there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”
Almost seems? I don’t know about Mr. Carr, but I have no doubt that I go online to avoid reading in the traditional sense. The question is, how guilty do I need to feel about this? In his view, presumably, quite guilty, since by reading online as much as I do I am depriving myself of the ability to read offline. He takes this insight to an even more alarming conclusion in the end, writing that “as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.” And if that’s the case for veteran readers, think how much worse it must be for the jeunesse dorée of the information age, if they never developed the habits that accompany “deep reading” in the first place.
It is these poor cultural orphans, for whom “information retrieval” online is the only kind of reading they know, who are the main concern of Mark Bauerlein in his new book, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future. One would think that a whole future in jeopardy would be too serious a matter for the flippancy of the rest of the subtitle: Or, Don’t Trust Anyone Under 30. But Professor Bauerlein, who teaches English at Emory University and is a former director of research and analysis at the National Endowment for the Arts, is not always sure just how much a matter of mirth “the dumbest generation” is, or isn’t. After all, it is not really their fault if, as he says, they have been “betrayed” by the mentors who should have taught them better. Yet he seems to agree with Nicholas Carr that what we are witnessing is not just an educational breakdown but a deformation of the very idea of intelligence.
The book is modernity’s quintessential technology—“a means of transportation through the space of experience, at the speed of a turning page,” as the poet Joseph Brodsky put it. But now that the rustle of the book’s turning page competes with the flicker of the screen’s twitching pixel, we must consider the possibility that the book may not be around much longer. If it isn’t—if we choose to replace the book—what will become of reading and the print culture it fostered? And what does it tell us about ourselves that we may soon retire this most remarkable, five-hundred-year-old technology?
We have already taken the first steps on our journey to a new form of literacy—“digital literacy.” The fact that we must now distinguish among different types of literacy hints at how far we have moved away from traditional notions of reading. The screen mediates everything from our most private communications to our enjoyment of writing, drama, and games. It is the busiest port of entry for popular culture and requires navigation skills different from those that helped us master print literacy.
Enthusiasts and self-appointed experts assure us that this new digital literacy represents an advance for mankind; the book is evolving, progressing, improving, they argue, and every improvement demands an uneasy period of adjustment. Sophisticated forms of collaborative “information foraging” will replace solitary deep reading; the connected screen will replace the disconnected book. Perhaps, eons from now, our love affair with the printed word will be remembered as but a brief episode in our cultural maturation, and the book as a once-beloved technology we’ve outgrown.
Wednesday, January 14, 2009
42. O.J. Simpson
Charges: Jesus H. Christ, man. You literally get away with murder, to the astonishment of anyone capable of tying their own shoes. Then you write a book, coyly framed as “hypothetical,” in which you explain slicing and dicing your ex-wife and some poor shlub by describing her as a pain in the ass. You know the whole country is still gunning for you. And yet, you feel it sensible to try your luck one more time, because some guy in Vegas is selling a football you signed? Sure, O.J.’s sentence was too harsh to believe he wasn’t being punished for previous crimes of which he was acquitted, but did anyone think that wasn’t going to happen? O.J. could get 33 years for pissing on a tree, and he knew it, so at a minimum the whole “gimme my shit back” caper was unbelievably stupid, the product of a life in which consequences are things that happen to other people. At least now he can get to work on his next book, “If I was an idiot who got himself locked up for life after skating on a double homicide.”
Exhibit A: "I'm O.J. Simpson. How am I going to think that I'm going to rob somebody and get away with it? Besides, I thought what happens in Las Vegas stays in Las Vegas."
Sentence: Ghost of Howard Cosell narrates the remainder of OJ’s life: “This man, once a man of greatness, now a man fallen, disgraced, disgusting, reduced to defecating in an unenclosed, seatless toilet, in close proximity to other convicted felons, the indignity apparent on his sad, rapidly aging face. What an incredibly pitiful story is his.”
11. Rush Limbaugh
Charges: The father of modern stupidity, Limbaugh spins reflexively, never struggling with issues, because he knows his conclusion must favor Republicans, and his only task is finding a way to get there. In other words, he may or may not actually believe what he’s saying, but it’s beside the point. His job is not to say what he thinks, but to instruct his listeners on what they should think. If the facts don’t agree, he can always change them, as his “ditto heads” are already armed against the contrary evidence with the all-purpose “liberal bias” attack. “Rush is right,” as the slogan goes, and all those nerdy reporters in the “drive by media” are lying, because they secretly love terrorists. It’s this creepily worshipful, breathtakingly infantile abdication of intellect to a blatantly dishonest hypocrite that makes Limbaugh’s audience so goddamn sad. These pathetic, insecure, failures of men look to Rush as the champion of their impotent rage, helping them to externalize responsibility for their own deficiencies, pinning the blame on those darn liberals and their racial and gender equality.
Exhibit A: You have to marvel at the sheer ignominy of someone who coins the term “Obama recession” two days after the election.
Sentence: Tiny speaker implanted in his inner ear which blares Randi Rhodes 24-7.
10. Bernard Madoff
Charges: Normally, the idea of a bunch of billionaires getting robbed blind for believing in a free lunch would amuse the hell out of us, but Bernie Madoff stole a lot of money from charity endowments, and is responsible for two suicides so far. Here’s a tip, Bernie: If you’re running the biggest scam since the Catholic church, handling billions of dollars, and all it takes to get busted is that some of your marks ask for their money back, you really should take some of that money and set up an escape plan. Still, he gets some credit for making Mort Zuckerman look like a jackass. The real villains here are Christopher Cox and the SEC, who investigated Madoff eight times, the last time specifically on suspicion of running a Ponzi scheme, each time “finding” no wrongdoing, which begs the all-too-familiar question of the last eight years: Satanically corrupt or grossly incompetent? Either way, Madoff was finally brought to justice… by his kids.
Exhibit A: "In today's regulatory environment, it's virtually impossible to violate rules ... but it's impossible for a violation to go undetected, certainly not for a considerable period of time."
Sentence: Sold into slavery.
7. Dick Cheney
Charges: Still alive. The amount of medical resources devoted to keeping this black hole of decency operational could have cured cancer by now, but if they had, Cheney would make sure to keep it a secret. Since Watergate, Cheney’s been fighting to rehab Nixon’s image, and he has succeeded in a way, by showing us all just how much worse a presidency can be.
Exhibit A: “It is easy to take liberty for granted, when you have never had it taken from you.”
Sentence: Eaten alive by baboons.
6. Hank Paulson
Charges: The latest practitioner of the real Bush Doctrine, which is indistinguishable from Naomi Klein’s Shock Doctrine. This time, it was Paulson, not Powell, holding up a vial of toxic mortgage-backed securities, but the congressional reaction’s been the same—a panicked, election-season passage of whatever crazy late-term legislation the White House wants, namely Paulson’s 3-page “Gimme the money and go away” bill, plus $150 billion in anonymous pork, in one last massive federal theft on the way out the White House door, costing more than the entire U.S. space program for its 50 years of existence, even in inflation-adjusted dollars. Paulson’s initial spending plan was the financial equivalent of blowing into a broken balloon, and his typically Bushian demands for unfettered power seemed appropriate for a guy who looks and sounds like the ghost-melted Nazis from Raiders of the Lost Ark.
Exhibit A: “It’s a safe banking system, a sound banking system. Our regulators are on top of it. This is a very manageable situation.”
Sentence: Crushed by falling brokers.
5. Alan Greenspan
Charges: The mortgage meltdown may seem complicated, but it started simple, with Al Greenspan pegging the Fed funds rate at 1%. This made Treasury Bonds a fairly lame investment, and led to investors looking for other seemingly safe securities to buy, which led to a flourishing demand for mortgage-backed securities, which led to banks increasingly lowering their standards for mortgage applications, eventually giving liar loans away to anyone willing to take them, which used to be called usury. This led to a decline in the real value of these mortgage-backed securities due to high probabilities of foreclosure, but somehow they were still AAA-rated by credit agencies displaying either hopeless incompetence or criminal collusion. Even a monkey wouldn’t need a slide rule to see what would come next. But Alan Greenspan, super-genius guru of the glorious realm of the self-regulating free market, was totally flummoxed. Refusing to accept any blame for years as the housing bubble, long predicted by out-of-favor economic realists, bloated and burst, only recently has Greenspan accepted even marginal responsibility, admitting only that he was “partially” wrong, professing a state of “shocked disbelief” that lenders couldn’t regulate themselves, and thinking to himself, “This isn’t how it worked in Atlas Shrugged!”
Exhibit A: “Parasites who persistently avoid either purpose or reason perish as they should.”
Sentence: Recurring role as a senile great uncle on new C-grade sitcom “Krugman’s Krew.”
4. George W. Bush
Charges: It’s hard—believe us, we know—to keep coming up with new things to say about this brutally stupid narcissist, who may have ruined this country irrevocably and certainly has ruined a couple of others, mugging amiably all the way. If anything good comes from Bush’s reign of error, let it be the death of the notion that vitally important, life or death decisions that affect the entire world should be made with one’s “gut.” We used to think that incompetence was just a good cover story for this administration, an excuse that masked their deliberate criminality, but it turns out that Bush and his inner circle are both treasonous, corrupt warmongers and inept fools. One good thing about him, though, is that he has no real interest in politics, and probably won’t give a flying shoe what happens to the world when his term is up. As he once put it, “History, we don’t know. We’ll all be dead.” Here’s to George W. Bush being history.
Exhibit A: "Goodbye from the world's biggest polluter."
Sentence: Detained in formaldehyde-laced FEMA trailer without charges or counsel, sodomized by Lynndie England, declared guilty by military tribunal, set adrift naked on a small ice floe in the Arctic.
3. Sean Hannity
Charges: This relentlessly repugnant McCarthyite tool really outdid himself this year, in an all-out quest to otherize Obama in any way he could. This paranoid pustule is able to find a liberal conspiracy lurking behind any mundane occurrence, even attributing Obama’s selection as Time’s Person of the Year, an event as predictable as sunrise, to a pay-to-play scheme. Hopelessly outmatched shill Alan Colmes is finally leaving his role as Hannity’s doormat; he will not be replaced.
Exhibit A: "I never questioned anyone's patriotism."
Sentence: Wrongfully convicted of murdering Vince Foster, based on evidence falsified by Jerome Corsi.
2. John McCain
Charges: McCain vowed to run a clean, respectful campaign, and then accused Obama of pushing sex ed for kindergartners, calling Palin a pig, hanging with terrorists, being a welfare-loving Marxist, being an arugula-loving elitist and pretty much everything but conspiring with the Borg—but he didn’t really mean it, and he didn’t use Reverend Wright, so we’re all supposed to think he’s swell. McCain lied so blatantly and constantly that even cable news bootlicks were compelled to fact-check him, to which he and his surrogates responded by insisting on the same lies. When pressed on the Nixonian onslaught of falsehood, McCain whined that he wouldn’t have had to be such a mendacious prick if Obama had only refrained from raising so much more money than him. McCain pretended to give a shit about America, and then he picked a vapid ambition-hound to succeed him. His response to the economic crisis might as well have been to punch himself in the face. In every way he could this year, McCain burned up all the credibility he had stored up from decades of shameless worship by the press, utilizing every tactic he ever decried, exuding a heady aroma of bullshit and Alzheimer’s, and displaying an unrequited obsession with Joe the Plumber, and he still wound up a failed Faust even the Devil didn’t want.
Exhibit A: "In the 21st century nations don't invade other nations."
Sentence: Every time anybody says the word “surge,” McCain is shot in the leg.
1. Sarah Palin
Charges: If you want to know why the rest of the world is scared of Americans, consider the fact that after two terms of disastrous rule by a small-minded ignoramus, 46% of us apparently thought the problem was that he wasn’t quite stupid enough. Palin’s unending emissions of baffling, evasive incoherence should have disqualified her for any position that involved a desk, let alone placing her one erratic heartbeat from the presidency. The press strained mightily to feign respect for her, praising a debate performance that involved no debate, calling her a “great speaker” when her only speech was primarily a litany of insults to city-dwellers, echoing bogus sexism charges when a male Palin would have been boiled alive for the Couric interview alone, and lionizing her as she used her baby as a Pro-life stage prop before crowds who cooed when they should have been hurling polonium-tipped javelins. In the end, Palin had the beneficial effect of splitting her party between her admirers and people who can read.
Exhibit A: "But ultimately what the bailout does is help those who are concerned about the healthcare reform that is needed to help shore up our economy."
Sentence: Hand-to-hand combat with Vladimir Putin and a pack of wolves.
from The Guardian
Giovanni Boccaccio's masterpiece, set in far darker times than our own, is a hymn to the physical joys of life.
Giovanni Boccaccio's 14th-century literary masterpiece The Decameron may hold the recipe to defy these troubled times. Boccaccio's collection of 100 stories told over 10 days is set against the backdrop of a crisis that puts today's credit problems in perspective: the Black Death. He begins it with a harrowing piece of reportage on the plague in his city, Florence, describing how the disease spread across Europe in 1347-8, killing rich and poor alike in such terrible numbers that bodies littered the streets, the sick were shunned by their families, and funeral rites were abandoned. He paints a picture of a society on the brink of absolute disappearance - would everyone in Florence die? Everyone in Europe?
The moral is that people can be happy, prosperous and creative even in the worst of times: nothing quenches the life force.
Tuesday, January 13, 2009
If you look at language and ideas in evolutionary terms then every advance is an incremental improvement on what's gone before. Language evolves organically. All we do is create a cocktail, mixing up the same basic ingredients and influences as everyone else. Hopefully, we can conjoin them in an interesting and deceptively novel way, and maybe garnish the concoction with an idiosyncratic little twist of our own. We use a magician's misdirection to create the illusion of originality, but we're little more than bartenders at the business end of the creative process. We synthesize the raw materials but the consumer filters the product through his own perception before passing on his interpretation of the experience to the next person in line.
We're not only custodians of the verbal tradition; we also have a duty to synthesize existing elements in new and surprising ways. We have to harness the true power of words and subvert the stiflingly narrow parameters and prevailing paradigms perpetuated by pop culture.
What I'm saying is that if you want to write, write, but do it for yourself and as an end in itself. Don’t emasculate your prose for commerce and don’t worry about the consequences. Don't be afraid to challenge people’s preconceptions and, most importantly of all, avoid perpetuating the empty catechisms, shallow ‘insights’ and lazy conjunctions of ideas and words that pass for contemporary discourse within our corroded culture. Bad memes are metastasizing exponentially, so take your words seriously. Making people think is tantamount to a subversive act these days.
Ewan McNaught as Frankie Sumatra by Nicola Cairns
Sunday, January 11, 2009
Confined by the limbs of claustrophobia
Cursing quietly the uncertain of stride
Determined to cut through this disturbia
Dirty is every glance to the side
Gauche is the current of the guileless
Grievance unto others but a mystery
Lost within fogs of listlessness
Lulled into a false sense of maturity
Publicly dispossessed of purpose
Positioning an unfathomable idea
Volunteering for a game of versus
Vacuous to the point of insincere
Unstudied in the movements of union
Unaware by selfish degree
Yearning only for a straight line to yon
Yes, neighbor, peripheral vision is deceased
The pollution of external dialogue
Spewing forth as a so-what aside
From a soddened, muttering mad dog
With only himself in which to confide
Perhaps he’s a sooth, a vagrant seer?
More than likely a hopped-up bum
Should I listen or laugh; why must I fear?
There is, after all, nowt wrong with asylum
But let's say I were to engage Mister Gabber
If only through a squint of salvo
Would he smile at my counter-chutzpah
Or cut me down with spitting vitriol?
Yet if it's poetry, I would recognize it
If it is gibberish, I should know
But I could get stuck fast; best cut to the quick
I’m on the clock, in the end; places to go