Wednesday, December 16, 2009
King of New York ~ Schoolly D
From the soundtrack to Abel Ferrara's classic gangster movie "King of New York."
Wednesday, December 09, 2009
Faux Friendship
By William Deresiewicz
From Chronicle.com
We live at a time when friendship has become both all and nothing at all. Already the characteristically modern relationship, it has in recent decades become the universal one: the form of connection in terms of which all others are understood, against which they are all measured, into which they have all dissolved. Romantic partners refer to each other as boyfriend and girlfriend. Spouses boast that they are each other's best friends. Parents urge their young children and beg their teenage ones to think of them as friends. Adult siblings, released from competition for parental resources that in traditional society made them anything but friends (think of Jacob and Esau), now treat one another in exactly those terms. Teachers, clergymen, and even bosses seek to mitigate and legitimate their authority by asking those they oversee to regard them as friends. We're all on a first-name basis, and when we vote for president, we ask ourselves whom we'd rather have a beer with. As the anthropologist Robert Brain has put it, we're friends with everyone now.
Yet what, in our brave new mediated world, is friendship becoming? The Facebook phenomenon, so sudden and forceful a distortion of social space, needs little elaboration. Having been relegated to our screens, are our friendships now anything more than a form of distraction? When they've shrunk to the size of a wall post, do they retain any content? If we have 768 "friends," in what sense do we have any? Facebook isn't the whole of contemporary friendship, but it sure looks a lot like its future. Yet Facebook—and MySpace, and Twitter, and whatever we're stampeding for next—are just the latest stages of a long attenuation. They've accelerated the fragmentation of consciousness, but they didn't initiate it. They have reified the idea of universal friendship, but they didn't invent it. In retrospect, it seems inevitable that once we decided to become friends with everyone, we would forget how to be friends with anyone. We may pride ourselves today on our aptitude for friendship—friends, after all, are the only people we have left—but it's not clear that we still even know what it means.
....
And so we return to Facebook. With the social-networking sites of the new century—Friendster and MySpace were launched in 2003, Facebook in 2004—the friendship circle has expanded to engulf the whole of the social world, and in so doing, destroyed both its own nature and that of the individual friendship itself. Facebook's very premise—and promise—is that it makes our friendship circles visible. There they are, my friends, all in the same place. Except, of course, they're not in the same place, or, rather, they're not my friends. They're simulacra of my friends, little dehydrated packets of images and information, no more my friends than a set of baseball cards is the New York Mets.
More here
Tuesday, November 17, 2009
The Dark Side of the Bright Side
From In These Times
In her new book Bright-Sided: How the Relentless Promotion of Positive Thinking Has Undermined America (Metropolitan/Holt, October 2009), Barbara Ehrenreich traces the origins of contemporary optimism from nineteenth-century healers to twentieth-century pushers of consumerism. She explores how that culture of optimism prevents us from holding to account both corporate heads and elected officials.
Manufactured optimism has become a method to make the poor feel guilty for their poverty, the ill for their lack of health and the victims of corporate layoffs for their inability to find worthwhile jobs. Megachurches preach the “gospel of prosperity,” exhorting poor people to visualize financial success. Corporations have abandoned rational decision-making in favor of charismatic leadership.
This mania for looking on the bright side has given us the present financial collapse; optimistic business leaders—assisted by rosy-eyed policymakers—made very bad decisions.
In These Times recently spoke with her about our penchant for foolish optimism.
Is promoting optimism a mechanism of social control to keep the system in balance?
If you want to have a compliant populace, what could be better than to say that everyone has to think positively and accept that anything that goes wrong in their lives is their own fault because they haven’t had a positive enough attitude? However, I don’t think that there is a central committee that sits there saying, “This is what we want to get people to believe.”
It took hold in the United States because in the ’80s and ’90s it became a business. You could write a book like Who Moved My Cheese?, which is a classic about accepting layoffs with a positive attitude. And then you could count on employers to buy them up and distribute them free to employees.
So this picks up more in the early ’80s and even more so in the ’90s when globalization really took off?
I was looking at the age of layoffs, which begins in the ’80s and accelerates. How do you manage a workforce when there is no job security? When there is no reward for doing a good job? When you might be laid off and it might not have anything to do with performance? As that began to happen, companies began to hire motivational speakers to come in and speak to their people.
Couldn’t this positive thinking be what corporate culture wants everyone to believe, but at the top, people are still totally rational?
That is what I was assuming when I started this research. I thought, “It’s got to be rational at the top. Someone has to keep an eye on the bottom line.” Historically, the science of management was that in a rational enterprise, we have spreadsheets, we have decision-trees and we base decisions on careful analysis.
But then all that was swept aside for a new notion of what management is about. The word they use is “leadership.” The CEO and the top people are not there so much to analyze and plan but to inspire people. They claimed to have this uncanny ability to sense opportunities. It was a shock to find the extent to which corporate culture has been infiltrated not only by positive thinking, but by mysticism. The idea is that now things are moving so fast in this era of globalization that there’s no time to think anymore. So you increasingly find CEOs gathering in sweat lodges or drumming circles or going on “vision quests” to get in touch with their inner Genghis Khan or whatever they were looking for.
The same things are happening in foreign policy. We’ve abandoned a sense of realism. You had this with Bush and also with Obama, although he is more realistic. Is there a connection between optimism and the growth of empire?
In the ’80s, Reagan promoted the idea that America is special and that Americans were God’s chosen people, destined to prosper, much to the envy of everybody else in the world. Similarly, Bush thought of himself as the optimist-in-chief, as the cheerleader—which had been his job once in college. This is very similar to how CEOs are coming to think of themselves: as people whose job is to inspire others to work harder for less pay and no job security.
Would you say that Obama is our cheerleader-in-chief?
I haven’t sorted it out. He talks a lot about hope. And as a citizen I’d rather not hear about “hope,” I’d rather hear about “plans.” Yet he does strike me as a rational person, who thinks through all possibilities and alternatives.
You write about the science of positive thinking having taken root at Ivy League universities. It’s amazing to me that a course in happiness at Harvard would draw almost 900 students.
That was in 2006. And these courses have spread all over the country—courses in positive psychology where you spend time writing letters of gratitude to people in your family, letters of forgiveness (whether or not you send them doesn’t matter), getting in touch with your happy feelings. I don’t think that’s what higher education should be about. People go to universities to learn critical thinking, and positive thinking is antithetical to critical thinking.
Anis Shivani
More here
Wednesday, November 11, 2009
Random Transmissions 14 ~ S.J. Perelman
Love is not the dying moan of a distant violin .. it's the triumphant twang of a bedspring.
The waiters' eyes sparkled and their pencils flew as she proceeded to eviscerate my wallet - pâté, Whitstable oysters, a sole, filet mignon, and a favorite salad of the Nizam of Hyderabad made of shredded five-pound notes.
A dreary industrial town controlled by hoodlums of enormous wealth, the ethical sense of a pack of jackals and taste so degraded that it befouled everything it touched.
Fate was dealing from the bottom of the deck.
On director Ernst Lubitsch:
(he was) "smoking a cucumber and looking cool as a cigar."
A case of the tail dogging the wag.
There is such a thing as too much couth.
I tried to resist his overtures, but he plied me with symphonies, quartettes, chamber music, and cantatas.
I have Bright's disease and he has mine.
The main obligation is to amuse yourself.
The whistle shrilled and in a moment I was chugging out of Grand Central’s dreaming spires followed only by the anguished cries of relatives who would now have to go to work. I had chugged only a few feet when I realized that I had left without the train, so I had to run back and wait for it to start.
Nature, it appears, has been rather more bountiful to Paul’s body and purse than to his intellect; above the ears, speaking bluntly, the boy is strictly tapioca.
I’ve always taken my liquor mixed and my peril neat, and I see no reason to switch now.
“I love people from the East,” she went on. “There’s so much more to them.” Uncertain whether I was supposed to hail from Jubbulpore or Newark, I decided to play it safe and adopted an inscrutable global expression.
The moment Audrey’s tongue touched bourbon, it began wagging in a key just resonant enough to drown out the music.
“In France,” Marcel said with wintry dignity, “accidents occur in the bedroom, not the kitchen."
The showing (of the movie Foolish Wives) roused me to neither vandalism nor affection; in fact, it begot such lassitude that I had to be given artificial respiration and sent home in a wheelbarrow.
As for consulting a dentist regularly, my punctuality practically amounted to a fetish. Every twelve years I would drop whatever I was doing and allow wild Caucasian ponies to drag me to a reputable orthodontist.
"Great-grandfather died under strange circumstances. He opened a vein in his bath."
"I never knew baths had veins," protested Gabrilowitsch."
"I never knew his great-grandfather had a ba—" began Falcovsky derisively.
Before they made S.J. Perelman they broke the mold.
I guess I'm just an old mad scientist at bottom. Give me an underground laboratory, half a dozen atom-smashers, and a beautiful girl in a diaphanous veil waiting to be turned into a chimpanzee, and I care not who writes the nation's laws.
"Oh, son, I wish you hadn’t become a scenario writer!" she sniffled.
"Aw, now, Moms," I comforted her, "it’s no worse than playing the piano in a call house."
And finally, a quote from Groucho Marx on Perelman:
From the moment I picked up your book until I laid it down, I was convulsed with laughter. Someday I intend reading it.
That Old Feeling: Perelmania
from time.com
Am I alone these days in regarding S.J. Perelman as a formative font of 20th century wit?
...summon his name on Google, and you will find only about a third as many references for him as for his brother-in-law, Nathanael West, who published only four short novels in a six-year career, compared to 20-some volumes in Perelman's public half-century. Ransacking the Internet, I could discover no Perelman parties, no memorial readings from the canon, no revivals of the Broadway shows he worked on, no retrospectives of the films he helped write. I think of the Perelman story, "Who Stole My Golden Metaphor?" and wonder: who buried my Perelman beyond price?
But, kids, before you consign one of my favorite comedy writers to Corliss' mausoleum of lost causes, take a moment to roll naked in some Perelman prose. I choose, almost at random, a few verses from "Acres and Pains," his magnum opus — or minimum opus, for it consumes fewer than 40 pages when reprinted. The book is Perelman's account of a city boy who, in 1932, acquired and tried to run a rural property in Bucks County, Pa.:
"A farm is an irregular patch of nettles bound by short-term notes, containing a fool and his wife who didn't know enough to stay in the city."
"I began my career as a country squire with nothing but a high heart, a flask of citronella, and a fork for toasting marshmallows in case supplies ran low. In a scant fifteen years I have acquired a superb library of mortgages, mostly first editions, and the finest case of sacroiliac known to science.... I also learned that to lock horns with Nature, the only equipment you really need is the constitution of Paul Bunyan and the basic training of a commando."
"When I first settled down on a heap of shale in the Delaware Valley, I too had a romantic picture of myself. For about a month I was a spare, sinewy frontiersman in fringed buckskin, with crinkly little lines about the eyes and a slow laconic drawl.... After I almost blew off a toe cleaning an air rifle, though, I decided I was more the honest rural type. I started wearing patched blue jeans [and] mopped my forehead with a red banana (I found out later it should have been a red bandanna).... One day, while stretched out on the porch, I realized I needed only a mint julep to become a real dyed-in-the-wool, Seagram's V.V.O. Southern planter.... I sent to New York for a broad-brimmed hat and a string tie, and at enormous expense trained the local idiot to fan me with a palmetto leaf."
"Today, thanks to unremitting study, I can change a fuse so deftly that it plunges the entire county into darkness.... The power company has offered me as high as fifteen thousand dollars a year to stay out of my own cellar."
I don't see today's kids busting their allowance to buy Perelman books on eBay; in fact, the name retrieves only 12 items, and four of them are other Perelmans. I think I know why the writer's short comic pieces, so influential in the last century, have little resonance in this one. It's because they are written in a dead language: English.
I should refer to it as Perelman English: a cocktail of Victorian and Edwardian sentence structure, Jazz Age slang whose sell-by date had long since expired, and a veritable Mount Meron of Yiddishisms. "Sid commands a vocabulary that is the despair (and joy) of every writing man," proclaimed his New Yorker colleague E.B. White. "He is like a Roxy organ that has three decks, 50 stops and a pride of pedals under the bench. When he wants a word it's there.... His ears are as busy as an ant's feelers. No word ever gets by him." The language was, for Perelman, a gentleman's orgy, and he was Petronius, knowing which wench to peel, which grace to savor.
His stock of references could have filled the Great Library of Alexandria, if that august edifice had housed every copy of Cap'n Billy's Whizbang. Even those introduced to Perelman in his prime had to cram for antique references. The man was an instant anachronist, peppering his stories with allusions to Greek and Roman mythology, vaudeville dialect comics like Gallagher and Shean (Al Shean was the Marx Brothers' uncle), silent movies, and "the exact method of quarrying peat out of a bog at the time of the Irish Corn Laws." Add 50 years to these arcana, toss them at a collegiate today and he'll expect a translation on the recto. (Buck up, young scholar. That's what Google's for.)
In his time, though — and why shouldn't that time be now, again? — Perelman was called America's foremost humorist, a comic genius, mein Yiddishe Aristophanes, a gift from Providence (he was reared in Rhode Island), but Perelman preferred the simple designation "writer." Even when his feuilletons don't stir a Gorgon to chuckle, they educe awe for the Wallenda grace of his prose, his solving of sentences. Attend to the choice of verbs in this relatively simple description: "Struggling into a robe, he reeled across the room, fumbled with the chain latch, and wrenched open the door." Action words, picture words, funny words.
Perelman's free-associative style spun fantasias out of girdle ads, tabloid tattle, sleazy pulp fiction and recipe prose. He was a Charlie Parker on tenor Underwood, running bizarre and beautiful variations on the tritest themes. With a difference: Perelman's prose was improv with agony. He perspired platelets to make it read cucumber-cool.
Richard Corliss
More here
Monday, November 09, 2009
The Bitter Tears of Johnny Cash
From salon.com
In July 1972, musician Johnny Cash sat opposite President Richard Nixon in the White House's Blue Room. As a horde of media huddled a few feet away, the country music superstar had come to discuss prison reform with the self-anointed leader of America's "silent majority." "Johnny, would you be willing to play a few songs for us?" Nixon asked Cash. "I like Merle Haggard's 'Okie From Muskogee' and Guy Drake's 'Welfare Cadillac.'" The architect of the GOP's Southern strategy was asking for two famous expressions of white working-class resentment.
"I don't know those songs," replied Cash, "but I got a few of my own I can play for you." Dressed in his trademark black suit, his jet-black hair a little longer than usual, Cash draped the strap of his Martin guitar over his right shoulder and played three songs, all of them decidedly to the left of "Okie From Muskogee." With the nation still mired in Vietnam, Cash had far more than prison reform on his mind. Nixon listened with a frozen smile to the singer's rendition of the explicitly antiwar "What Is Truth?" and "Man in Black" ("Each week we lose a hundred fine young men") and to a folk protest song about the plight of Native Americans called "The Ballad of Ira Hayes." It was a daring confrontation with a president who was popular with Cash's fans and about to sweep to a crushing reelection victory, but a glimpse of how Cash saw himself -- a foe of hypocrisy, an ally of the downtrodden. An American protest singer, in short, as much as a country music legend.
More here
Antonino D'Ambrosio
Thursday, November 05, 2009
Blood's a Rover ~ James Ellroy
From nybooks.com
Fever Dreams of Your FBI
By Norman Rush
James Ellroy's astonishing creation, the Underworld USA Trilogy, is complete. Its concluding volume, Blood's a Rover, has just been published. The three long thrillers that make up the trilogy (American Tabloid, 1995; The Cold Six Thousand, 2001; Blood's a Rover, 2009) present a brutal counterhistory of America in the 1960s and 1970s—the assassinations, the social convulsions, the power-elite plotting—through the lives of invented second- and third-echelon operatives in the great political crimes of the era. The trilogy is biblical in scale, catholic in its borrowing from conspiracy theories, absorbing to read, often awe-inspiring in the liberties taken with standard fictional presentation, and, in its imperfections and lapses, disconcerting.
Ellroy, who is in his early sixties, is the celebrated, prolific, and popular author of a body of genre crime fiction, crime journalism, and a forensic memoir dealing with his own dark family history. His work has been made into movies and television shows, and widely translated. There are Web sites devoted to Ellroy, and he connects with his fans through Facebook.
The Underworld USA Trilogy is generally regarded as Ellroy's magnum opus. It is unique in its ambitions, and proceeds at a level of art distinctly above that attained in his famously lurid and violent but more conventional books. It is a fiction unlike any other.
More here
From washingtonpost.com
The Heart of Nixonian Darkness
By Bill Sheehan
"Blood's a Rover," like the volumes that precede it, is clearly not a conventional thriller. It is, rather, a rigorously constructed, idiosyncratic novel that uses the materials of crime fiction to examine the forces that have shaped -- and warped -- our recent history: racial tension, ideological warfare, greed, corruption and unbridled fanaticism in all its forms. Ellroy's bleak, brooding worldview, his dense, demanding style and his unflinching descriptions of extreme violence will almost certainly alienate large numbers of readers. But anyone who succumbs to the sheer tidal force of these novels will experience something darker, stranger and more compelling than almost anything else contemporary fiction has to offer.
More here
The Age of the Informavore - A Talk With Frank Schirrmacher
From The Edge
We are apparently now in a situation where modern technology is changing the way people behave, people talk, people react, people think, and people remember. And you encounter this not only in a theoretical way, but when you meet people, when suddenly people start forgetting things, when suddenly people depend on their gadgets, and other stuff, to remember certain things. This is the beginning...
More here
Monday, November 02, 2009
Escape from Facebook
Ewan McNaught does not propose to subject you to yet another instantly obsolescent facebook status update. Not for me the "unreflective instantaneousness" and shallow insights into unremarkable lives of this particular milieu. I'm temporarily relocating to the blogosphere, an almost equally ephemeral and narcissistic construct, yet it beckons rather beguilingly to the marginally more reflective virtual persona. "You too can live in a chat-free world" is its siren song.
5 minutes ago · Comment · Like
Nicholas Lezard: So You're Eating Lunch? Fascinating
from the independent
Where does one start with the momentous news that Stephen Fry was considering leaving Twitter? Apparently someone, although broadly sympathetic to Fry in general – no, better, someone who admired and adored him – complained that some of his tweets, as I gather they are called, were "a bit... boring." Fry took hurt, and announced his intentions before having a re-think.
Now, I have nothing against Stephen Fry, although one questions the wisdom of someone with an easily-bruised ego telling 800,000 people that he's eating a sandwich and expecting every one of them to be thrilled by the news, but I certainly have something against Twitter.
The name tells us straightaway: it's inconsequential, background noise, a waste of time and space. Actually, the name does a disservice to the sounds birds make, which are, for the birds, significant, and for humans, soothing and, if you're Messiaen, inspirational. But Twitter? Inspirational?
No – it's inspiration's opposite. The online phenomenon is about humanity disappearing up its own fundament, or the air leaking out of the whole Enlightenment project. In short, I feel about Twitter the way some people feel about nuclear weapons: it's wrong. It makes blogging look like literature. It's anti-literature, the new opium of the masses.
Its unreflective instantaneousness encourages neurotic behaviour in both the tweeters and the twatted (seriously, the Americans have proposed that "twatted" should be the past participle of "tweet", which is the only funny thing about the whole business); it encourages us in the delusion that our random thoughts, our banal experiences, are significant. It is masturbatory and infantile, and the amazing thing is that people can't get enough of it – possibly because it IS masturbatory and infantile.
Answering the question: "Why do so many people seem to like Twitter?" Twitter itself does not say: "Because people are idiots with a steadily decreasing attention span, and 140 characters is pretty much all anyone has space for in their atrophied brains any more," but instead, "People are eager to connect with other people and Twitter makes that simple."
Twitter asks one question: "What are you doing?" (It also adds, in the next paragraph, that "Twitter's core technology is a device agnostic message routing system with rudimentary social networking features", and I hope that clears everything up for you.)
Oh God, that it should have come to this. Centuries of human thought and experience drowned out in a maelstrom of inconsequential rubbish (and don't tell me about Trafigura – one good deed is not enough, and an ordinary online campaign would have done the trick just as well). It is like some horrible science-fiction prediction come to pass: it is not just that Twitter signals the end of nuanced, reflective, authoritative thought – it's that no one seems to mind.
And I suspect that it's psychologically dangerous. We have evolved over millions of years to learn not to bore other people with constant updates about what we're doing (I'm opening a jar of pickles ... I'm picking my nose ... I'm typing out a message on Twitter ...) and we're throwing it all away. Twitter encourages monstrous egomania, and the very fact that Fry used Twitter to announce that he was leaving Twitter shows his dependence on it. He was never going to give it up. He's addicted to it.
Nicholas Lezard
Sunday, August 23, 2009
A lecture from the FBI
Scotland's Justice Secretary Kenny MacAskill has had to endure a lecture on the subject of "justice" from FBI Director Robert Mueller after the release, on compassionate grounds, of the terminally ill Abdel Baset al-Megrahi.
See here.
It's good to know that the FBI are taking an interest in such things. Let's hope they don't adopt the ostensibly simplistic, yet highly idiosyncratic and relativistic, definition of "justice" that seemed to characterise Obama's idiotic predecessor's Presidency. Let's hope they take an idealistic, inclusive, internationalist approach to justice, one more in tune with the current incumbent of the White House's Brave New World. I'm almost certain they've sent similarly outraged letters of condemnation to the perpetrators of injustice at Guantanamo Bay and Abu Ghraib, the US Army for the massacre at My Lai, the CIA for their rendition flights programme and the US Navy for the Iran Air Flight 655 incident. Keep writing those outraged letters, Bob....
Thursday, May 14, 2009
My Ever Changing Virtual Moods
10.23 hrs:
Stuck in a doppelganger dive in Second Life listening to an ersatz Style Council performing a turgid little number entitled "My Ever Changing Virtual Moods."
Wednesday, May 13, 2009
My Ever Changing Virtual Moods
14.05 hrs:
In a Carlito Brigante state of mind: rehabilitated, reinvigorated, reassimilated and relocated. First Life is the future.
getafirstlife.com
My Ever Changing Virtual Moods
14.00 hrs:
Bored with facebook, tired of blogging, impervious to the attractions of twitter & disinclined to chat. Virtual profile reduction is the future.
Tuesday, April 28, 2009
Pandemonium: it's an Armageddemic!
Is pandemic really a dramatic enough word for the scaremongering media?
Armageddemic would surely be preferable.
Classic Chris Morris-esque line from Sky News tonight:
"New York authorities are warning that literally hundreds of cases of Swine Flu could have gone unreported as the victims are now recovering."
Friday, February 27, 2009
Stompin' In My Air Force One. How will Obama's presidency change hip-hop?
From Slate
By Jonah Weiner
Barack Obama arrived at the Oval Office with a long parade of expectations in tow. One special-interest group with a particularly colorful wish list is the hip-hop community, which has been plotting this moment for years. If Obama makes his policy decisions based on Nas' 1996 single "If I Ruled the World," for instance, he will appoint Coretta Scott King to a mayoralty, fling open the gates of Attica, and grant every citizen an Infiniti Q45. If he follows the Pharcyde's more modestly pitched "If I Were President," he'll buy Michelle some new clothes and treat himself to a new pair of sneakers. If he heeds the urgent lessons of Public Enemy's 1994 video for "So Whatcha Gone Do Now?" Obama will staff the Secret Service exclusively with beret-clad black militants or else risk assassination at the hands of a far-reaching neo-Nazi conspiracy.
Hip-hop fantasies of a black executive have popped up throughout the genre's history, visions of empowerment that speak to a real-life condition of powerlessness. In this sense, they're merely a loftier version of the standard hip-hop fantasies of potency, whether it's sexual domination, VIP access, or street-corner supremacy.
With Obama's win, this dynamic stands to change. For 25-odd years, hip-hop has been black America's main ambassador to the white American mainstream. How will hip-hop see itself now that the most powerful man in the country is a) black and b) a Jay-Z fan? Obama is doubtless the warmest—and smartest—rap critic ever to take the oath of office. When he has praised hip-hop, he has done so with near-impeccable taste. (His admiration for Jay-Z, Lil Wayne, Ludacris, and Kanye West would displease no rap blogger worth his RSS feed.) When he's criticized it, he's spoken with none of the condescension or cluelessness politicians often bring to the endeavor. For him, hip-hop is an art form, not culture-war fodder. "I love the art of hip-hop," he told MTV last year. "I don't always love the message." Though it's too early to say precisely how, there are already clues as to the effect Obama's rise will have on both.
More here
Mindfuck Movies
From themorningnews.org
by Matthew Baldwin
There’s a certain brand of movie that I most enjoy. Some people call them “Puzzle Movies.” Others call them “Brain Burners.” Each has, at some point or another, been referred to as “that flick I watched while I was baked out of my mind.”
But the phrase I find myself employing, when casting around for a succinct term for the entire genre, is “Mindfuck Movies.” It’s an expression I picked up from a college roommate of mine, an enormous Star Trek: The Next Generation fan who adored those episodes when the nature of reality itself was called into question, usually after the holodeck went berserk or Q showed up and hornswoggled everyone into thinking they were intergalactic dung beetles (or whatever…I never really followed the show myself).
Mindfuckers aren’t just Dadaism by another name—there has to be some rationale for the mayhem, even if it’s far-fetched (orbiting hallucination-inducing lasers!) or lame (it was all a dream!).
And they are not those movies where the audience (and the characters) think they know what’s happening, only to discover in the final moments some key twist that turns everything on its head. (Bruce Willis was balding the whole time?!) I love those films as well, but that’s not what we’re discussing. In Mindfuck Movies you know that Something Is Going On. It’s just not clear what.
Here are 16 of my absolute favorites from this rarefied class of motion pictures. And, really, the phrase “Mindfuck Movies” is too crude for such works of art. These films are sophisticated. They make love to your mind.
2001: A Space Odyssey (1968)
Stanley Kubrick’s tour de force is grounded on scientific principles so sound that even now, 40 years after its release, it is still routinely cited as the finest “hard science-fiction” film ever made. (“Hard science-fiction” is defined as “stories in which Han Solo does not saunter around the surface of an asteroid wearing only an oxygen mask and a leather jacket”).
Yes, it’s a meticulously crafted and eminently rational three-course meal of a film. For the first two hours, anyhow. And then, in the final 30 minutes, it serves up a steaming bowl of WTF for dessert.
Why is the ending of 2001 so hard to comprehend? Because it doesn’t make a goddamned lick of sense—unless you read the book, that is. And this is by design. Kubrick and author Arthur C. Clarke intended the film and the novel to serve as companion pieces, to be consumed one right after the other.
Solyaris (1972)
Perhaps worried about the widening “Mindblowing Cinematic Science-Fiction Gap,” the Russkies released Solyaris four years after 2001 and doubled down on the hallucinogenic quotient. Based on a novel by legendary sci-fi maven Stanislaw Lem, the plot revolves around a mysterious water-world and the cosmonaut in its orbit, as they attempt to communicate with one another. Think My Dinner With Andre if Andre were a Class-O Planetoid and the dinner consisted solely of psilocybin mushrooms.
Director Andrei Tarkovsky loved making these kinds of movies—his 1979 film Stalker is equally fantastic, in both the adjectival and superlative sense. And he could also create a meditative, haunting, and beautiful sequence like nobody’s business...
Mulholland Dr. (2001)
Here are the makings of a fun evening. Step 1: Take your parents to see Mulholland Dr. Step 2: Endure the hottest girl-on-girl sex scene in the history of mainstream cinema while sitting a foot and a half from your mom. Step 3: Take your parents out for dinner afterwards, but instead of making chit-chat spend your entire meal staring into the middle-distance while attempting to make sense of the previous two hours.
It’s amazing that this film isn’t a mess. David Lynch originally filmed the story as a pilot of a television series, then re-shot some scenes and cobbled together a feature-length film after the studio executives took a pass. And yet somehow this rejected half-breed wound up as Lynch’s finest work to date.
Primer (2004)
The first 20 minutes of Primer are among the most prosaic ever committed to celluloid: four wanna-be entrepreneurs dicking around in a garage, then sitting around a kitchen table stuffing envelopes. We know they are not working on anything new or exciting because the opening narration tells us as much. We know their previous research has proven fruitless because the characters argue about who’s to blame for their miserable lack of success.
And then they invent…something. The viewer doesn’t know exactly what they’ve invented, primarily because the protagonists themselves don’t know what they’ve invented. In fact, much of the film’s remaining 60 minutes focuses on their efforts to figure out what this thing does and how they can capitalize on it.
If the films in this list were arranged in order of ascending awesomeness rather than chronologically, Primer would still occupy the final slot. Made on a budget of $7,000 (seven! thousand!), Primer is one of the few movies I have ever watched twice in a row—and certainly the only movie I’ve ever watched at 8 a.m. after having watched it twice in a row on the evening prior. It’s like a deep-tissue massage for your brain—afterwards you may hurt like hell, but you’ll also feel strangely invigorated.
More here
by Matthew Baldwin
There’s a certain brand of movie that I most enjoy. Some people call them “Puzzle Movies.” Others call them “Brain Burners.” Each has, at some point or another, been referred to as “that flick I watched while I was baked out of my mind.”
But the phrase I find myself employing, when casting around for a succinct term for the entire genre, is “Mindfuck Movies.” It’s an expression I picked up from a college roommate of mine, an enormous Star Trek: The Next Generation fan who adored those episodes when the nature of reality itself was called into question, usually after the holodeck went berserk or Q showed up and hornswoggled everyone into thinking they were intergalactic dung beetles (or whatever…I never really followed the show myself).
Mindfuckers aren’t just Dadaism by another name—there has to be some rationale for the mayhem, even if it’s far-fetched (orbiting hallucination-inducing lasers!) or lame (it was all a dream!).
And they are not those movies where the audience (and the characters) think they know what’s happening, only to discover in the final moments some key twist that turns everything on its head. (Bruce Willis was balding the whole time?!) I love those films as well, but that’s not what we’re discussing. In Mindfuck Movies you know that Something Is Going On. It’s just not clear what.
Here are 16 of my absolute favorites from this rarefied class of motion pictures. And, really, the phrase “Mindfuck Movies” is too crude for such works of arts. These films are sophisticated. They make love to your mind.
2001: A Space Odyssey (1968)
Stanely Kubrick’s tour de force is grounded on scientific principles so sound that even now, 40 years after its release, it is still routinely cited as the finest “hard science-fiction” film ever made. (“Hard science-fiction” is defined as “stories in which Han Solo does not saunter around the surface of an asteroid wearing only an oxygen mask and a leather jacket”).
Yes, it’s a meticulously crafted and imminently rational three-course meal of a film. For the first two hours, anyhow. And then, in the final 30 minutes, it serves up a steaming bowl of WTF for dessert.
Why is the ending of 2001 so hard to comprehend? Because it doesn’t make a goddamned lick of sense—unless you read the book, that is. And this is by design. Kubrick and author Arthur C. Clarke intended the film and the novel to serve as companion pieces, to be consumed one right after the other.
Solyaris (1972)
Perhaps worried about the widening “Mindblowing Cinematic Science-Fiction Gap,” the Russkies released Solyaris four years after 2001 and doubled down on the hallucinogenic quotient. Based on a novel by legendary sci-fi maven Stanislaw Lem, the plot revolves around a mysterious water-world and the cosmonaut in its orbit, as they attempt to communicate with one another. Think My Dinner With Andre if Andre were a Class-O Planetoid and the dinner consisted solely of psilocybin mushrooms.
Director Andrei Tarkovsky loved making these kinds of movies—his 1979 film Stalker is equally fantastic, in both the adjectival and superlative sense. And he could also create a meditative, haunting, and beautiful sequence like nobody’s business...
Mulholland Dr. (2001)
Here are the makings of a fun evening. Step one: Take your parents to see Mulholland Dr. Step two: Endure the hottest girl-on-girl sex scene in the history of mainstream cinema while sitting a foot and a half from your mom. Step three: Take your parents out for dinner afterwards, but instead of making chit-chat spend your entire meal staring into the middle distance while attempting to make sense of the previous two hours.
It’s amazing that this film isn’t a mess. David Lynch originally filmed the story as a pilot for a television series, then re-shot some scenes and cobbled together a feature-length film after the studio executives took a pass. And yet somehow this rejected half-breed wound up as Lynch’s finest work to date.
Primer (2004)
The first 20 minutes of Primer are among the most prosaic ever committed to celluloid: four wanna-be entrepreneurs dicking around in a garage, then sitting around a kitchen table stuffing envelopes. We know they are not working on anything new or exciting because the opening narration tells us as much. We know their previous research has proven fruitless because the characters argue about who’s to blame for their miserable lack of success.
And then they invent…something. The viewer doesn’t know exactly what they’ve invented, primarily because the protagonists themselves don’t know what they’ve invented. In fact, much of the film’s remaining 60 minutes focuses on their efforts to figure out what this thing does and how they can capitalize on it.
If the films in this list were arranged in order of ascending awesomeness rather than chronologically, Primer would still occupy the final slot. Made on a budget of $7,000 (seven! thousand!), Primer is one of the few movies I have ever watched twice in a row—and certainly the only movie I’ve ever watched at 8 a.m. after having watched it twice in a row on the evening prior. It’s like a deep-tissue massage for your brain—afterwards you may hurt like hell, but you’ll also feel strangely invigorated.
More here
Tuesday, February 10, 2009
Impulsivity - The Fabulous Bud E. Love
From budelove.com
Impulsivity
From time to time, a perfectly sensitive individual will walk up to me and say, "Bud E., it's not working. I'm not getting laid. I'm being sensitive, and it's getting me nowhere. I cry frequently for little or no apparent reason. I admire subtleties in fabrics. I'm unhappy when bugs die. And yet I'm ignored."
I feel bad for these cats. They're trying hard. They're honest men. But they're missing an important truth about chicks: they dig contradiction.
Chicks eat it up when a cat does a 180. Sinatra knows it. Sammy knew it. The Beatles knew it. Trini Lopez was unsure about it. A cat whispers sweet nothings, then yells at the top of his lungs. He's Hurricane Andrew one day - and Andy Griffith the next. One minute you can't get the cat off the couch with a spatula, the next minute he's flipping you first-class tickets to Jamaica - the plane leaves in an hour. Forget the bikini!
What's happened?
He's possessed.
He's in a spell.
And the chick he's with has just turned into putty.
Why?
Because the cat is not boring. He's not the same schmuck day after day. He's impulsive. Unpredictable. Impossible to resist.
Practice being impulsive at your place of employment. Walk in one day and give everyone the day off, even though you're not the boss. Park in someone else's parking space. Drive their car home. Call the phone company and have the business number changed. When your boss inquires why you've done these things, say you have no more idea than she does.
Don't be just one cat when you can be a litter. Be mercurial. Be impulsive. But make sure the chicks see you being impulsive. When the babes at your office get wind of the fact that you're Mr. Impulsive, they'll blow into your life like a scirocco.
Excerpted From: You Oughta Be Me: How To Be A Lounge Singer And Live Like One. St. Martin's Press, 1993.
More here
The 50 Greatest Things That Just Popped Into My Head by Jack Pendarvis
From The Believer
50. Ice cream
49. Blue—Don’t try to get creative with your favorite color. We all know it’s blue.
48. Sneeze—It feels great to sneeze. What a relief.
47. Jesus—Love him or hate him, you’ve got to hand it to the Son of God. Over two thousand years old and still going strong. It’s no wonder His name is also an exclamation. Jesus!
46. Oxygen—What a fantastic element! And not just because it’s easy to breathe. There is so much to recommend oxygen. According to scientists, nothing beats oxygen.
45. Bill—A friend of mine. He’s nice. I’m sorry we haven’t kept in touch as well as we might have, Bill. Say hello to Lisa.
44. Lisa—I secretly think you are “hot.” Don’t tell Bill.
43. Windows—You can see what’s outside yet feel safe and protected. It’s all thanks to windows. Who thought up windows? We may never know. But that man or lady had it “goin’ on”!
42. Lists of fifty things in magazines—Thought provoking! Not a waste of time at all.
41. House slippers
More here
Thursday, February 05, 2009
Best Bar (None) ~ Chris Morris
From warprecords.com
INFECTION
It's catching! But seriously, it is. Think back to the last time you had a flu - just before you went down, you felt better than cocaine. Why? Because your metabolism peaks as it gears up to fight the germs - before the symptoms kick in...
The idea at Infection is that you're dosed with a superbug eyedrop of your choice (we recommend XenoBac3), and within an hour your immune system's in hyperdrive - high-altitude bacterial cruising all night. And as you leave, spazz the microbes with a free 20,000mg vitamin quadratab.
More here
Monday, February 02, 2009
Thursday, January 29, 2009
George W. Bush's Sci-Fi Disaster
From In These Times
In retrospect, George W. Bush’s presidency could be viewed as a science-fiction disaster movie in which an alien force seizes illegitimate control of a nation, saps its wealth, wreaks devastation, but is finally dislodged and forced to depart amid human hope for a rebirth.
There was even a satisfying concluding scene as a new human leader takes power amid cheers of a liberated populace. The alien flees aboard a form of air transportation (in this case, a helicopter), departing to the jeers of thousands and many wishes of good riddance.
After Bush’s departure on Jan. 20, 2009, the real-life masses actually had the look of survivors in a disaster movie, dressed mostly in ragtag clothing—ski caps, parkas, boots and blankets—bent against the cold winds as they trudged through streets largely devoid of traffic.
My 20-year-old son, Jeff, and I made our way home from the Mall to our house in Arlington, Virginia, by hiking across the 14th Street Bridge, part of the normally busy Interstate 395, except that only buses and official vehicles were using it on Inauguration Day.
So, the bridge became an impromptu walkway with clumps of half-frozen pedestrians straggling across it, over the icy Potomac. Jeff and I picked an exit ramp near the Pentagon, clambered over some road dividers, and worked our way to Pentagon City where we’d parked the car. It took much of the afternoon and evening for the cold to work its way out of our bodies.
Everyone I’ve talked to who attended Barack Obama’s Inauguration had similar tales of transportation woes—standing in long lines in freezing temperatures, frustrated by jammed subway stations, walking long distances—but no one was angry. Remarkably, police reported no Inaugural-related arrests.
Despite the grim economy and other havoc left behind by Bush and his associates, Inauguration Day 2009 was filled with a joy that I have rarely seen on the streets of Washington, a city that even at its best is not known for spontaneous bursts of happiness.
But there was more than joy that day; there was a sense of liberation.
An estimated 1.8 million people braved the frigid temperatures and the transportation foul-ups to witness not only Obama’s swearing-in, but Bush’s ushering-out. They not only cheered Obama and other favorites, but many booed those considered responsible for the national plundering, especially Bush and the wheelchair-bound Dick Cheney.
Robert Parry
More here
Wednesday, January 28, 2009
Che: The Ronald McDonald of Revolution
From WorldHum via Arts & Letters Daily.
Visit the Museo de la Revolución in central Havana, and two things about the museum’s photo displays will immediately capture your attention. First, it’s clear that the battle to control Cuba in the late 1950s was ultimately won by the cool guys. Young, bearded and ruggedly handsome, the rebel warriors of Fidel Castro’s 26th of July Movement look like Beat hipsters and rock stars—Fidel tall and imposing in his fatigues; Camilo Cienfuegos grinning under his broad-brimmed cowboy hat; Ernesto “Che” Guevara looking smolderingly photogenic in his black beret. By contrast, the U.S.-backed dictator Fulgencio Batista and his cronies look bloated, balding and unquestionably corrupt in their stubby neckties and damp armpits and oversized paunches. Even without reading the captions, it’s easy to discern the heroes from the villains.
Look closer, however, and you’ll notice that the triumphant photos of Fidel and Che are faded and mildewed, their corners curled by age and humidity. The photo captions are spelled out in a clunky die-cast typeset that hasn’t been used in a generation, and contain glowing present-tense references to the magnanimity of the Soviet Union—a country that hasn’t existed since 1991. Despite the grungy glamour of the young men who toppled a tyrant all those years ago, the anachronism and decay of the museum’s exhibits reveal just how tired and toothless Cuba’s revolutionary myths have become in Havana. In many ways, the building is a museum of a museum—a yellowing relic of how the communist regime chose to portray itself in the 1970s.
Step outside the Museo de la Revolución into the humid Havana air, and the glamorous sheen of the bygone Cuban revolution seems to have been distilled into a single image—Alberto Korda’s famous 1960 photo of a bearded Che Guevara looking steely and determined in his beret. In a city where few buildings outside the restored Habana Vieja district have seen a new coat of paint in half a century, freshly retouched renderings of Che’s mug adorn countless walls and billboards. Moreover, in a country largely devoid of public advertising and religious iconography, Guevara’s ubiquitous image appears to fill the role of both Jesus Christ and Ronald McDonald—a sainted martyr of unwavering purity who also happens to promote a meticulously standardized (if not particularly nutritious) political menu.
Rolf Potts
More here
Tuesday, January 27, 2009
Swingin' Sounds for Hipsters Vol 7
Smokin' swing, rockin' r'n'b, jumpin' jazz & voodoo grooves from Vegas/Voodoo Rooms vinyl villain Frankie Sumatra.
Monday, January 26, 2009
The End of Solitude
From The Chronicle Review
What does the contemporary self want? The camera has created a culture of celebrity; the computer is creating a culture of connectivity. As the two technologies converge — broadband tipping the Web from text to image, social-networking sites spreading the mesh of interconnection ever wider — the two cultures betray a common impulse. Celebrity and connectivity are both ways of becoming known. This is what the contemporary self wants. It wants to be recognized, wants to be connected: It wants to be visible. If not to the millions, on Survivor or Oprah, then to the hundreds, on Twitter or Facebook. This is the quality that validates us, this is how we become real to ourselves — by being seen by others. The great contemporary terror is anonymity. If Lionel Trilling was right, if the property that grounded the self, in Romanticism, was sincerity, and in modernism it was authenticity, then in postmodernism it is visibility.
William Deresiewicz
More here
Thursday, January 22, 2009
If Charlie Parker Was a Gunslinger, There'd Be a Whole Lot of Dead Copycats.
Nat King Cole and Francis Albert Sinatra.
Howard Hawks is momentarily distracted on the set of Rio Bravo (1958).
These great photos and many more at
If Charlie Parker Was a Gunslinger, There'd be a Whole Lot of Dead Copycats.
Wednesday, January 21, 2009
Random Transmissions 13 ~ Dorothy Parker
They sicken of the calm, who knew the storm.
Oh, life is a glorious cycle of song,
A medley of extemporanea;
And love is a thing that can never go wrong;
And I am Marie of Romania.
She runs the gamut of emotions from A to B.
(speaking of Katharine Hepburn).
There's a hell of a distance between wise-cracking and wit. Wit has truth in it; wise-cracking is simply calisthenics with words.
This is not a novel to be tossed aside lightly. It should be thrown with great force.
(regarding "Atlas Shrugged").
It serves me right for putting all my eggs in one bastard.
(On her abortion).
You can lead a horticulture, but you can't make her think.
(When asked to use the word horticulture during a game of Can-You-Give-Me-A-Sentence).
Razors pain you,
Rivers are damp,
Acids stain you,
And drugs cause cramp.
Guns aren't lawful,
Nooses give,
Gas smells awful.
You might as well live.
I require only three things of a man. He must be handsome, ruthless and stupid.
It was produced according to the fine old Shubert precept that "nothing succeeds like undress."
(on "Sinbad").
At a dinner party, the actress Ilka Chase defended Clare Boothe Luce by saying that not only was Clare loyal to her friends, but "she's very kind to her inferiors." Dorothy countered with, "And where does she find them?"
The transatlantic crossing was so rough the only thing that I could keep on my stomach was the first mate.
The Monte Carlo casino refused to admit me until I was properly dressed so I went and found my stockings, and then came back and lost my shirt.
Tuesday, January 20, 2009
Monday, January 19, 2009
Sean Penn - Mountain of Snakes
from The Huffington Post
The disadvantages of being a writer, who is often written about, are numerous. I begin with an enthusiastic call to my 81-year old mother, hoping to share my enthusiasm from an assignment abroad. "Hey ma..." "I know," she says, "You're on Jupiter, it's all over the Internet. They say you're cavorting with the planet's president! They say he's anti-earth! And Sean, why is your hair so big in the pictures?" I muse, "Lack of gravity?" "That's what Hannity said!!" she tells me. It seems that American movies are pretty popular in far away places, and one must dance a bit to avoid being more a spun story, than the true story one intends to tell. However, there are also grand upsides.
I have been in the public eye to varying degrees, for most of my 48 years, and had many occasions to sit in the front row of popular and political culture. I can speak in firsthand, to bearing witness to an often untruthful, reckless and demonizing media. Yes, in many cases, the smoke would prove an accurate expectation of fire. But, the fact is, that our most respected, call that mainstream media, in print and on television are, in part, conscious manufacturers of deception. In one case, I have photographic evidence. It was widely reported that I had commissioned my own photographer to self-promote my involvement among many other volunteers in New Orleans in the aftermath of Katrina. This simply did not happen. Though the notion of self-promotion had not occurred to me, I did later regret that I had not gotten some snaps of the devastation I saw. I will probably bring someone along to document the next fuck-up of media or government. Meanwhile, I challenge anyone to hunt up the few pictures that were taken by the random photojournalists who'd stumbled upon me, and find a single one that would've passed the test of my own narcissistic scrutiny. But a benefit greater than the insight offered by this front row seat, is finding that having a public persona, inclusive of a perceived open mind to the qualities of countries outside one's own, may grant breathtaking access.
Who'd a thunk? There I was with the biggest hair on the planet. Oh yeah. Big, big hair. It does that in the tropics. It gets big. And I mean American big, baby. And there I was with my big, American hair, finding faith in American democracy in the unlikeliest of places. Sitting in the Salon de Protocol at the Convention Palace in Havana's Miramar district, all I had to do was tell the five-foot-six bespectacled man who sat in the chair across from me in his khaki dress militaries, that these words would not be published until after the American election. And with that, granting his first ever interview to a foreign journalist since the beginning of the 1957 Cuban revolution, President Raul Castro smiled warmly and simply said, "We want Obama." His initial reluctance was due to a concern that an endorsement by a Cuban president might be detrimental to the Obama candidacy. And this is where the faith came in: Though Obama would be the 11th American president in the long history of the Castro brothers' reign, and despite tumultuous U.S.-Cuban relations since what Henry Cabot Lodge called "the large policy," as justification for American violations of the Teller amendment in the late 1800s; despite multiple assassination attempts by the CIA on his older brother Fidel, the destabilization tactics of Robert F. Kennedy and the Bay of Pigs, The Platt Amendment with the taking of Guantanamo Bay, and even despite an endless and unjustified embargo (in effect: blockade) on Cuba by the United States, here we were in 2008, and Raul Castro said flat out that if the American people, who today stand with candidate Barack Obama, continue to stand with President Barack Obama, then "meaningful and productive advances could be achieved in Cuba and the world."
In my anticipation of a brief interview, I pulled from my pocket the dwindling remains of my small note pad. Again, Castro smiled, and slid a fresh, full pad across the small polished table to me. We would spend the next seven hours together.
More here
Part Two here
Sunday, January 18, 2009
What Will Change Everything?
The Edge.org question 2009.
WHAT WILL CHANGE EVERYTHING?
DANIEL C. DENNETT
Philosopher; University Professor, Co-Director, Center for Cognitive Studies, Tufts University; Author, Breaking the Spell
THIS VERY EXPLORATION IS CHANGING EVERYTHING
What will change everything? The question itself and many of the answers already given by others here on Edge.org point to a common theme: reflective, scientific investigation of everything is going to change everything. When we look closely at looking closely, when we increase our investment in techniques for increasing our investment in techniques... for increasing our investment in techniques, we create non-linearities — like Doug Hofstadter's strange loops — that amplify uncertainties, allowing phenomena that have heretofore been orderly and relatively predictable to escape our control. We figure out how to game the system, and this initiates an arms race to control or prevent the gaming of the system, which leads to new levels of gamesmanship and so on.
The snowball has started to roll, and there is probably no stopping it. Will the result be a utopia or a dystopia? Which of the novelties are self-limiting, and which will extinguish institutions long thought to be permanent? There is precious little inertia, I think, in cultural phenomena once they are placed in these arms races of cultural evolution. Extinction can happen overnight, in some cases. The almost frictionless markets made possible by the internet are already swiftly revolutionizing commerce.
Will universities and newspapers become obsolete? Will hospitals and churches go the way of corner grocery stores and livery stables? Will reading music soon become as arcane a talent as reading hieroglyphics? Will reading and writing themselves soon be obsolete? What will we use our minds for? Some see a revolution in our concept of intelligence, either because of "neurocosmetics" (Marcel Kinsbourne) or quantum-computing (W. H. Hoffman), or "just in time storytelling" (Roger Schank). Nick Humphrey reminds us that when we get back to basics — procreating, eating, just staying alive — not that much has changed since Roman times, but I think that these are not really fixed points after all.
Our species' stroll through Design Space is picking up speed. Recreational sex, recreational eating, and recreational perception (hallucinogens, alcohol) have been popular since Roman times, but we are now on the verge of recreational self-transformations that will dwarf the modifications the Romans indulged in. When you no longer need to eat to stay alive, or procreate to have offspring, or locomote to have an adventure-packed life, when the residual instincts for these activities might be simply turned off by genetic tweaking, there may be no constants of human nature left at all. Except, maybe, our incessant curiosity.
YOCHAI BENKLER
Berkman Professor of Entrepreneurial Legal Studies, Harvard; Author, The Wealth of Networks: How Social Production Transforms Markets and Freedom
RECOMBINATIONS OF THE NEAR POSSIBLE
What will change everything within forty to fifty years (optimistic assumptions about my longevity, I know)? One way to start to think about this is to look at the last “change everything” innovation, and work back fifty years from it. I would focus on the Internet's generalization into everyday life as the relevant baseline innovation that changed everything. We can locate its emergence to widespread use to the mid-1990s. So what did we have that existed in the mid-1940s that was a precursor? We had mature telephone networks, networked radio stations, and point-to-point radio communications. We had the earliest massive computers. So to me the challenge is to look at what we have now, some of which may be quite mature; other pieces of which may be only emerging; and to think of how they could combine in ways that will affect social and cultural processes in ways that will “change everything,” which I take to mean: will make a big difference to the day to day life of many people. Let me suggest four domains in which combinations and improvements of existing elements, some mature, some futuristic, will make a substantial difference, not all of it good.
Communications
We already have handsfree devices. We already have overhead transparent display in fighter pilot helmets. We already have presence-based and immediate communications. We already upload images and movies, on the fly, from our mobile devices, and share them with friends. We already have early holographic imaging for conference presentations, and high-quality 3D imaging for movies. We already have voice-activated computer control systems, and very very early brainwave activated human-computer interfaces. We already have the capacity to form groups online, segment and reform them according to need, be they in World of Warcraft or Facebook groups. What is left is to combine all these pieces into an integrated, easily wearable system that will, for all practical purposes, allow us to interact as science fiction once imagined telepathy working. We will be able to call upon another person by thinking of them; or, at least, whispering their name to ourselves. We will be able to communicate and see them; we will be able to see through their eyes if we wish to, in real time in high resolution to the point that it will seem as though we were in fact standing there, next to them or inside their shoes. However much we think now that collaboration at a distance is easy; what we do today will seem primitive. We won't have “beam me up, Scotty” physically; but we will have a close facsimile of the experience. Coupled with concerns over global warming, these capabilities will make business travel seem like wearing fur. However much we talk now about telecommuting today; these new capabilities, together with new concerns over environmental impact, will make virtual workplaces in the information segments of the economy as different from today's telecommuting as today's ubiquitous computing and mobile platforms are from the mini-computer “revolution” of the 1970s.
Medicine
It is entirely plausible that 110 or 120 will be an average life expectancy, with senescence delayed until 80 or 90. This will change the whole dynamic of life: how many careers a lifetime can support; what the ratio of professional moneymaking to volunteering; how early in life one starts a job; length of training. But this will likely affect, if at all within the relevant period, only the wealthiest societies. Simple innovations that are more likely will have a much wider effect on many more people. A cheap and effective malaria vaccine. Cheap and ubiquitous clean water filters. Cheap and effective treatments and prevention techniques against parasites. All these will change life in the Global South on scales and with values that they will swamp, from the perspective of a broad concern with human values, whatever effects lengthening life in the wealthier North will have.
Military Robotics
We already have unmanned planes that can shoot live targets. We are seeing land robots, for both military and space applications. We are seeing networked robots performing functions in collaboration. I fear that we will see a massive increase in the deployment and quality of military robotics, and that this will lead to a perception that war is cheaper, in human terms. This, in turn, will lead democracies in general, and the United States in particular, to imagine that there are cheap wars, and to overcome the newly-learned reticence over war that we learned so dearly in Iraq.
(Casa del ionesco editor's note: see Robots at War: The New Battlefield for P.W.Singer's perspective on the future of Military Robotics).
Free market ideology
This is not a technical innovation but a change in the realm of ideas. The resurgence of free market ideology, after its demise in the Great Depression, came to dominance between the 1970s and the late 1990s as a response to communism. As communism collapsed, free market ideology triumphantly declared its dominance. In the U.S. and the UK it expressed itself, first, in the Reagan/Thatcher moment; and then was generalized in the Clinton/Blair turn to define their own moment in terms of integrating market-based solutions as the core institutional innovation of the “left.” It expressed itself in Europe through the competition-focused, free market policies of the technocratic EU Commission; and in global systems through the demands and persistent reform recommendations of the World Bank, the IMF, and the world trade system through the WTO. But within less than two decades, its force as an idea is declining. On the one hand, the Great Deflation of 2008 has shown the utter dependence of human society on the possibility of well-functioning government to assure some baseline stability in human welfare and capacity to plan for the future. On the other hand, a gradual rise in volunteerism and cooperation, online and offline, is leading to a reassessment of what motivates people, and how governments, markets, and social dynamics interoperate. I expect the binary State/Market conception of the way we organize our large systems to give way to a more fluid set of systems, with greater integration of the social and commercial; as well as of the state and the social. So much of life, in so many of our societies, was structured around either market mechanisms or state bureaucracies. The emergence of new systems of social interaction will affect what we do, and where we turn for things we want to do, have, and experience.
MARTI HEARST
Computer Scientist, UC Berkeley, School of Information; Author, Search User Interfaces
THE DECLINE OF TEXT
As an academic I am of course loath to think about a world without reading and writing, but with the rapidly increasing ease of recording and distributing video, and its enormous popularity, I think it is only a matter of time before text and the written word become relegated to specialists (such as lawyers) and hobbyists.
Movies have already replaced books as cultural touchstones in the U.S. And most Americans dislike watching movies with subtitles. I assume that given a choice, the majority of Americans would prefer a video-dominant world to a text-dominant one. (Writing as a technologist, I don't feel I can speak for other cultures.) A recent report by Pew Research included a quote from a media executive who said that emails containing podcasts were opened 20% more often than standard marketing email. And I was intrigued by the use of YouTube questions in the U.S. presidential debates. Most of the citizen-submitted videos that were selected by the moderators consisted simply of people pointing the camera at themselves and speaking their question out loud, with a backdrop consisting of a wall in a room of their home. There were no visual flourishes; the video did not add much beyond what a questioner in a live audience would have conveyed. Video is becoming a mundane way to communicate.
Note that I am not predicting the decline of erudition, in the tradition of Allan Bloom. Nor am I arguing that video will make us stupid, as in Neil Postman's landmark "Amusing Ourselves to Death." The situation is different today. In Postman's time, the dominant form of video communication was television, which allowed only for one-way, broadcast-style interaction. We should expect different consequences when everyone uses video for multi-way communication. What I am espousing is that the forms of communication that will do the cultural "heavy lifting" will be audio and video, rather than text.
How will this come about? As a first step, I think there will be a dramatic reduction in typing; input of textual information will move towards audio dictation. (There is a problem of how to avoid disturbing officemates or exposing seat-mates on public transportation to private information; perhaps some sound-canceling technology will be developed to solve this problem.) This will succeed in the future where it has failed in the past because of future improvements in speech recognition technology and ease-of-use improvements in editing, storage, and retrieval of spoken words.
There already is robust technology for watching and listening to video at a faster speed than recorded, without undue auditory distortion (Microsoft has an excellent in-house system for this). And as noted above, technology for recording, editing, posting, and storing video has become ubiquitous and easy to use. As for the use of textual media to respond to criticisms and to cite other work, we already see "video responses" as a heavily used feature on YouTube. One can imagine how technology and norms will develop to further enrich this kind of interaction.
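(Casa del ionesco editor's note: as a rough illustration of the speed-up-without-distortion idea, here is a minimal Python sketch, assuming the librosa and soundfile libraries and a hypothetical audio file name; it is not the Microsoft system Hearst mentions, just a phase-vocoder time stretch that raises playback speed while leaving pitch intact.)
import librosa
import soundfile as sf

# Load the recording at its native sample rate ("talk.wav" is a hypothetical file).
y, sr = librosa.load("talk.wav", sr=None)

# Stretch time by a factor of 1.5: faster playback, pitch unchanged.
faster = librosa.effects.time_stretch(y, rate=1.5)

# Write the sped-up version back out.
sf.write("talk_1.5x.wav", faster, sr)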
The missing piece in technology today is an effective way to search for video content. Automated image analysis is still an unsolved problem, but there may well be a breakthrough on the horizon. Most algorithms of this kind are developed by "training", that is, by exposing them to large numbers of examples. The algorithms, if fed enough data, can learn to recognize patterns which can be applied to recognize objects in videos the algorithm hasn't yet seen. This kind of technology is behind many of the innovations we see in web search engines, such as accurate spell checking and improvements in automated language translation. Not yet available are huge collections of labeled image and video data, where words have been linked to objects within the images, but there are efforts afoot to harness the willing crowds of online volunteers to gather such information.
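(Casa del ionesco editor's note: for readers who want to see what "training" by example looks like in code, here is a minimal Python sketch, assuming scikit-learn is installed; it learns from a small set of labeled example images of handwritten digits and then recognizes images it has never seen. It is only an illustration of the general idea, not the video-search technology Hearst describes.)
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# A small labeled dataset: 8x8 pixel images of handwritten digits with known labels.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# "Training": expose the algorithm to many labeled examples.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)

# Apply the learned patterns to images the algorithm hasn't yet seen.
print("accuracy on unseen images:", model.score(X_test, y_test))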
What about developing versus developed nations? There is of course an enormous literacy problem in developing nations. Researchers are experimenting with cleverly designed tools such as the Literacy Bridge Talking Book project which uses a low-cost audio device to help teach reading skills. But perhaps just as developing nations "leap-frogged" developed ones by skipping land-line telephones to go straight to cell phones, the same may happen with skipping written literacy and moving directly to screen literacy.
I am not saying text will disappear entirely; one counter-trend is the replacement of orality with text in certain forms of communication. For short messages, texting is efficient and unobtrusive. And there is the question of how official government proclamations will be recorded. Perhaps there will be a requirement for transliteration into written text as a provision of the Americans with Disabilities Act, for the hearing-impaired (although we can hope in the future for increasingly advanced technology to reverse such conditions). But I do think the importance of written words will decline dramatically both in culture and in how the world works. In a few years, will I be submitting my response to the Edge question as a podcast?
DAVID EAGLEMAN
Assistant Professor of Neuroscience, Baylor College of Medicine; Author, Sum
SILICON IMMORTALITY: DOWNLOADING CONSCIOUSNESS INTO COMPUTERS
While medicine will advance in the next half century, we are not on a crash-course for achieving immortality by curing all disease. Bodies simply wear down with use. We are on a crash-course, however, with technologies that let us store unthinkable amounts of data and run gargantuan simulations. Therefore, well before we understand how brains work, we will find ourselves able to digitally copy the brain's structure and able to download the conscious mind into a computer.
If the computational hypothesis of brain function is correct, it suggests that an exact replica of your brain will hold your memories, will act and think and feel the way you do, and will experience your consciousness — irrespective of whether it's built out of biological cells, Tinkertoys, or zeros and ones. The important part about brains, the theory goes, is not the structure but the algorithms that ride on top of the structure. So if the scaffolding that supports the algorithms is replicated — even in a different medium — then the resultant mind should be identical. If this proves correct, it is almost certain we will soon have technologies that allow us to copy and download our brains and live forever in silica. We will not have to die anymore. We will instead live in virtual worlds like the Matrix. I assume there will be markets for purchasing different kinds of afterlives, and sharing them with different people — this is the future of social networking. And once you are downloaded, you may even be able to watch the death of your outside, real-world body, in the manner that we would view an interesting movie.
Of course, this hypothesized future embeds many assumptions, the speciousness of any one of which could spill the house of cards. The main problem is that we don't know exactly which variables are critical to capture in our hypothetical brain scan. Presumably the important data will include the detailed connectivity of the hundreds of billions of neurons. But knowing the point-to-point circuit diagram of the brain may not be sufficient to specify its function. The exact three-dimensional arrangement of the neurons and glia is likely to matter as well (for example, because of three-dimensional diffusion of extracellular signals). We may further need to probe and record the strength of each of the trillions of synaptic connections. In a still more challenging scenario, the states of individual proteins (phosphorylation states, exact spatial distribution, articulation with neighboring proteins, and so on) will need to be scanned and stored. It should also be noted that a simulation of the central nervous system by itself may not be sufficient for a good simulation of experience: other aspects of the body may require inclusion, such as the endocrine system, which sends and receives signals from the brain. These considerations potentially lead to billions of trillions of variables that need to be stored and emulated.
The other major technical hurdle is that the simulated brain must be able to modify itself. We need not only the pieces and parts but also the physics of their ongoing interactions — for example, the activity of transcription factors that travel to the nucleus and cause gene expression, the dynamic changes in location and strength of the synapses, and so on. Unless your simulated experiences change the structure of your simulated brain, you will be unable to form new memories and will have no sense of the passage of time. Under those circumstances, is there any point in immortality?
The good news is that computing power is blossoming sufficiently quickly that we are likely to make it within a half century. And note that a simulation does not need to be run in real time in order for the simulated brain to believe it is operating in real time. There's no doubt that whole brain emulation is an exceptionally challenging problem. As of this moment, we have no neuroscience technologies geared toward ultra-high-resolution scanning of the sort required — and even if we did, it would take several of the world's most powerful computers to represent a few cubic millimeters of brain tissue in real time. It's a large problem. But assuming we haven't missed anything important in our theoretical frameworks, then we have the problem cornered and I expect to see the downloading of consciousness come to fruition in my lifetime.
KEVIN SLAVIN
Digital Technologist; Managing Director, Co-Founder, area/code
THE EBB OF MEMORY
In just a few years, we’ll see the first generation of adults whose every breath has been drawn on the grid. A generation for whom every key moment (e.g., birth) has been documented and distributed globally. Not just the key moments, of course, but also the most banal: eating pasta, missing the train, and having a bad day at the office. Ski trips and puppies.
These trips and puppies are not simply happening, they are becoming data, building up the global database of distributed memories. They are networked digital photos – 3 billion on Flickr, 10 billion on Facebook. They were blog posts, and now they are tweets, too (a billion in 18 months). They are Facebook posts, Dopplr journals, Last.FM updates.
Further, more and more of these traces we produce will be passive or semi-passive. Consider Loopt, which allows us to track ourselves and our friends through GPS. Consider voicemail transcription bots that transcribe the voice messages we leave into searchable text in email boxes on into eternity. The next song you listen to will likely be stored in a database record somewhere. Next time you take a phonecam photo, it may well have the event’s latitude and longitude baked into the photo’s metadata.
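(Casa del ionesco editor's note: a small illustration of what "baked into the photo's metadata" means in practice; the sketch below assumes Python with the Pillow library and a hypothetical file name, and simply pulls the GPS tags out of a JPEG's EXIF block.)
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def gps_tags(path):
    # Read the photo's EXIF block and return its GPS sub-tags, if any were recorded.
    exif = Image.open(path)._getexif() or {}
    for tag_id, value in exif.items():
        if TAGS.get(tag_id) == "GPSInfo":
            return {GPSTAGS.get(k, k): v for k, v in value.items()}
    return {}

print(gps_tags("photo.jpg"))  # "photo.jpg" is a hypothetical example file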
The sharp upswing in all of this record-keeping – both active and passive – is redefining one of the core elements of what it means to be human, namely to remember. We are moving towards a culture that has outsourced this essential quality of existence to machines, to a vast and distributed prosthesis. This infrastructure exists right now, but very soon we’ll be living with the first adult generation whose entire lives are embedded in it.
In 1992, the artist Thomas Bayrle wrote that the great mistakes of the future would be that as everything became digital, we would confuse memory with storage. What’s important about genuine memory and how it differs from digital storage is that human memory is imperfect, fallible, and malleable. It disappears over time in a rehearsal and echo of mortality; our abilities to remember, distort and forget are what make us who we are.
We have built the infrastructure that makes it impossible to forget. As it hardens and seeps into every element of daily life, it will make it impossible to remember. Changing what it means to remember changes what it means to be.
There are a few people who already have perfect episodic memory, total recall: neurological edge cases. They are harbingers of the culture to come. One of them, Jill Price, was profiled in Der Spiegel:
"In addition to good memories, every angry word, every mistake, every disappointment, every shock and every moment of pain goes unforgotten. Time heals no wounds for Price. 'I don't look back at the past with any distance. It's more like experiencing everything over and over again, and those memories trigger exactly the same emotions in me. It's like an endless, chaotic film that can completely overpower me. And there's no stop button.'"
This also describes the life of Steve Mann, passively recording his life through wearable computers for many years. This is an unlikely future scenario, but like any caricature, it is based on human features that will be increasingly recognizable. The processing, recording and broadcasting prefigured in Mann’s work will be embedded in everyday actions like the twittering, phonecam shots and GPS traces we broadcast now. All of them entering into an outboard memory that is accessible (and searchable) everywhere we go.
Today is New Year’s Eve. I read today (on Twitter) that three friends, independent of each other, were looking back at Flickr to recall what they were doing a year ago. I would like to start the New Year being able to remember 2008, but also to forget it.
For the next generation, it will be impossible to forget it, and harder to remember. What will change everything is our ability to remember what everything is. Was. And wasn’t.
More here
WHAT WILL CHANGE EVERYTHING?
DANIEL C. DENNETT
Philosopher; University Professor, Co-Director, Center for Cognitive Studies, Tufts University; Author, Breaking the Spell
THIS VERY EXPLORATION IS CHANGING EVERYTHING
What will change everything? The question itself and many of the answers already given by others here on Edge.org point to a common theme: reflective, scientific investigation of everything is going to change everything. When we look closely at looking closely, when we increase our investment in techniques for increasing our investment in techniques... for increasing our investment in techniques, we create non-linearities, — like Doug Hofstadter's strange loops — that amplify uncertainties, allowing phenomena that have heretofore been orderly and relatively predictable to escape our control. We figure out how to game the system, and this initiates an arms race to control or prevent the gaming of the system, which leads to new levels of gamesmanship and so on.
The snowball has started to roll, and there is probably no stopping it. Will the result be a utopia or a dystopia? Which of the novelties are self-limiting, and which will extinguish institutions long thought to be permanent? There is precious little inertia, I think, in cultural phenomena once they are placed in these arms races of cultural evolution. Extinction can happen overnight, in some cases. The almost frictionless markets made possible by the internet are already swiftly revolutionizing commerce.
Will universities and newspapers become obsolete? Will hospitals and churches go the way of corner grocery stores and livery stables? Will reading music soon become as arcane a talent as reading hieroglyphics? Will reading and writing themselves soon be obsolete? What will we use our minds for? Some see a revolution in our concept of intelligence, either because of "neurocosmetics" (Marcel Kinsbourne) or quantum-computing (W. H. Hoffman), or "just in time storytelling" (Roger Schank). Nick Humphrey reminds us that when we get back to basics — procreating, eating, just staying alive — not that much has changed since Roman times, but I think that these are not really fixed points after all.
Our species' stroll through Design Space is picking up speed. Recreational sex, recreational eating, and recreational perception (hallucinogens, alcohol), have been popular since Roman times, but we are now on the verge of recreational self-transformations that will dwarf the modifications the Romans indulged in. When you no longer need to eat to stay alive, or procreate to have offspring, or locomote to have an adventure — packed life, when the residual instincts for these activities might be simply turned off by genetic tweaking, there may be no constants of human nature left at all. Except, maybe, our incessant curiosity.
YOCHAI BENKLER
Berkman Professor of Entrepreneurial Legal Studies, Harvard; Author, The Wealth of Networks: How Social Production Transforms Markets and Freedom
RECOMBINATIONS OF THE NEAR POSSIBLE
What will change everything within forty to fifty years (optimistic assumptions about my longevity, I know)? One way to start to think about this is to look at the last “change everything” innovation, and work back fifty years from it. I would focus on the Internet's generalization into everyday life as the relevant baseline innovation that changed everything. We can locate its emergence to widespread use to the mid-1990s. So what did we have that existed in the mid-1940s that was a precursor? We had mature telephone networks, networked radio stations, and point-to-point radio communications. We had the earliest massive computers. So to me the challenge is to look at what we have now, some of which may be quite mature; other pieces of which may be only emerging; and to think of how they could combine in ways that will affect social and cultural processes in ways that will “change everything,” which I take to mean: will make a big difference to the day to day life of many people. Let me suggest four domains in which combinations and improvements of existing elements, some mature, some futuristic, will make a substantial difference, not all of it good.
Communications
We already have hands-free devices. We already have transparent heads-up displays in fighter-pilot helmets. We already have presence-based and immediate communications. We already upload images and movies, on the fly, from our mobile devices, and share them with friends. We already have early holographic imaging for conference presentations, and high-quality 3D imaging for movies. We already have voice-activated computer control systems, and very early brainwave-activated human-computer interfaces. We already have the capacity to form groups online, and to segment and re-form them according to need, be they in World of Warcraft or in Facebook groups. What is left is to combine all these pieces into an integrated, easily wearable system that will, for all practical purposes, allow us to interact as science fiction once imagined telepathy working. We will be able to call upon another person by thinking of them, or at least by whispering their name to ourselves. We will be able to communicate with them and see them; we will be able to see through their eyes if we wish, in real time and at such high resolution that it will seem as though we were in fact standing there, next to them or inside their shoes. However easy we think collaboration at a distance is now, what we do today will seem primitive. We won't have “beam me up, Scotty” physically, but we will have a close facsimile of the experience. Coupled with concerns over global warming, these capabilities will make business travel seem like wearing fur. However much we talk about telecommuting today, these new capabilities, together with new concerns over environmental impact, will make virtual workplaces in the information segments of the economy as different from today's telecommuting as today's ubiquitous computing and mobile platforms are from the mini-computer “revolution” of the 1970s.
Medicine
It is entirely plausible that 110 or 120 will be an average life expectancy, with senescence delayed until 80 or 90. This will change the whole dynamic of life: how many careers a lifetime can support; what the ratio of professional moneymaking to volunteering will be; how early in life one starts a job; how long training lasts. But this will likely affect, if at all within the relevant period, only the wealthiest societies. Simpler innovations that are more likely to arrive will have a much wider effect on many more people. A cheap and effective malaria vaccine. Cheap and ubiquitous clean-water filters. Cheap and effective treatments and prevention techniques against parasites. All of these will change life in the Global South on such a scale that, from the perspective of a broad concern with human values, they will swamp whatever effects lengthening life in the wealthier North will have.
Military Robotics
We already have unmanned planes that can shoot live targets. We are seeing land robots, for both military and space applications. We are seeing networked robots performing functions in collaboration. I fear that we will see a massive increase in the deployment and quality of military robotics, and that this will lead to a perception that war is cheaper, in human terms. This, in turn, will lead democracies in general, and the United States in particular, to imagine that there are cheap wars, and to overcome the reticence about war that we learned so dearly in Iraq.
(Casa del ionesco editor's note: see Robots at War: The New Battlefield for P. W. Singer's perspective on the future of Military Robotics).
Free market ideology
This is not a technical innovation but a change in the realm of ideas. Free market ideology, after its demise in the Great Depression, returned to dominance between the 1970s and the late 1990s as a response to communism. As communism collapsed, free market ideology triumphantly declared its dominance. In the U.S. and the UK it expressed itself, first, in the Reagan/Thatcher moment, and was then generalized in the Clinton/Blair turn to define their own moment in terms of integrating market-based solutions as the core institutional innovation of the “left.” It expressed itself in Europe through the competition-focused, free market policies of the technocratic EU Commission, and in global systems through the demands and persistent reform recommendations of the World Bank, the IMF, and the world trade system through the WTO. But within less than two decades, its force as an idea is declining. On the one hand, the Great Deflation of 2008 has shown the utter dependence of human society on the possibility of well-functioning government to assure some baseline stability in human welfare and the capacity to plan for the future. On the other hand, a gradual rise in volunteerism and cooperation, online and offline, is leading to a reassessment of what motivates people, and of how governments, markets, and social dynamics interoperate. I expect the binary State/Market conception of the way we organize our large systems to give way to a more fluid set of systems, with greater integration of the social and the commercial, as well as of the state and the social. So much of life, in so many of our societies, was structured around either market mechanisms or state bureaucracies. The emergence of new systems of social interaction will affect what we do, and where we turn for things we want to do, have, and experience.
MARTI HEARST
Computer Scientist, UC Berkeley, School of Information; Author, Search User Interfaces
THE DECLINE OF TEXT
As an academic I am of course loath to think about a world without reading and writing, but with the rapidly increasing ease of recording and distributing video, and its enormous popularity, I think it is only a matter of time before text and the written word become relegated to specialists (such as lawyers) and hobbyists.
Movies have already replaced books as cultural touchstones in the U.S., and most Americans dislike watching movies with subtitles. I assume that given a choice, the majority of Americans would prefer a video-dominant world to a text-dominant one. (Writing as a technologist, I don't feel I can speak for other cultures.) A recent report by Pew Research included a quote from a media executive who said that emails containing podcasts were opened 20% more often than standard marketing email. And I was intrigued by the use of YouTube questions in the U.S. presidential debates. Most of the citizen-submitted videos that were selected by the moderators consisted simply of people pointing the camera at themselves and speaking their question out loud, with a backdrop consisting of a wall in a room of their home. There were no visual flourishes; the video did not add much beyond what a questioner in a live audience would have conveyed. Video is becoming a mundane way to communicate.
Note that I am not predicting the decline of erudition, in the tradition of Allan Bloom. Nor am I arguing that video will make us stupid, as in Neil Postman's landmark "Amusing Ourselves to Death." The situation is different today. In Postman's time, the dominant form of video communication was television, which allowed only for one-way, broadcast-style interaction. We should expect different consequences when everyone uses video for multi-way communication. What I am espousing is that the forms of communication that will do the cultural "heavy lifting" will be audio and video, rather than text.
How will this come about? As a first step, I think there will be a dramatic reduction in typing; input of textual information will move toward audio dictation. (There is the problem of how to avoid disturbing officemates or exposing seat-mates on public transportation to private information; perhaps sound-canceling technology will be developed to solve this.) This will succeed where it has failed in the past because of improvements in speech recognition technology and ease-of-use improvements in the editing, storage, and retrieval of spoken words.
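To make the dictation step tangible, here is a minimal sketch in Python using the third-party SpeechRecognition package and a microphone; the package, the PyAudio dependency, and the Google recognition backend are assumptions of this illustration, not anything the essay specifies, and production dictation systems are far more elaborate.

# A minimal dictation sketch, assuming the third-party SpeechRecognition
# package (pip install SpeechRecognition) plus PyAudio for microphone access.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # calibrate to background noise
    audio = recognizer.listen(source)            # capture one spoken utterance

try:
    # Send the audio to a recognition backend and print the transcript.
    print(recognizer.recognize_google(audio))
except sr.UnknownValueError:
    print("Could not understand the audio.")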
There already is robust technology for watching and listening to video at a faster speed than recorded, without undue auditory distortion (Microsoft has an excellent in-house system for this). And as noted above, technology for recording, editing, posting, and storing video has become ubiquitous and easy to use. As for the use of textual media to respond to criticisms and to cite other work, we already see "video responses" as a heavily used feature on YouTube. One can imagine how technology and norms will develop to further enrich this kind of interaction.
The missing piece in today's technology is an effective way to search video content. Automated image analysis is still an unsolved problem, but there may well be a breakthrough on the horizon. Most algorithms of this kind are developed by "training," that is, by exposing them to large numbers of labeled examples. Fed enough data, the algorithms can learn patterns that let them recognize objects in videos they have not yet seen. This kind of technology is behind many of the innovations we see in web search engines, such as accurate spell checking and improvements in automated language translation. Huge collections of labeled image and video data, in which words have been linked to objects within the images, are not yet available, but there are efforts afoot to harness willing crowds of online volunteers to gather such information.
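As a concrete, toy-scale illustration of the training-on-labeled-examples idea, the sketch below uses Python and scikit-learn's bundled handwritten-digit images as a stand-in for a labeled image collection; the dataset and the classifier are arbitrary choices for demonstration, not the techniques any particular search engine uses.

# A toy sketch of "training": fit a classifier on labeled images, then have it
# recognize images it has never seen. scikit-learn's digits set stands in for
# a real labeled image/video collection.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

digits = load_digits()                      # 8x8 grayscale images plus labels
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = SVC(gamma=0.001)                      # a simple support-vector classifier
clf.fit(X_train, y_train)                   # learn patterns from labeled examples
predictions = clf.predict(X_test)           # label images held out from training
print(f"accuracy on unseen images: {accuracy_score(y_test, predictions):.2f}")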
What about developing versus developed nations? There is of course an enormous literacy problem in developing nations. Researchers are experimenting with cleverly designed tools such as the Literacy Bridge Talking Book project which uses a low-cost audio device to help teach reading skills. But perhaps just as developing nations "leap-frogged" developed ones by skipping land-line telephones to go straight to cell phones, the same may happen with skipping written literacy and moving directly to screen literacy.
I am not saying text will disappear entirely; one counter-trend is the replacement of orality with text in certain forms of communication. For short messages, texting is efficient and unobtrusive. And there is the question of how official government proclamations will be recorded. Perhaps there will be a requirement for transcription into written text as a provision of the Americans with Disabilities Act, for the hearing-impaired (although we can hope in the future for increasingly advanced technology to reverse such conditions). But I do think the importance of written words will decline dramatically both in culture and in how the world works. In a few years, will I be submitting my response to the Edge question as a podcast?
DAVID EAGLEMAN
Assistant Professor of Neuroscience, Baylor College of Medicine; Author, Sum
SILICON IMMORTALITY: DOWNLOADING CONSCIOUSNESS INTO COMPUTERS
While medicine will advance in the next half century, we are not on a crash-course for achieving immortality by curing all disease. Bodies simply wear down with use. We are on a crash-course, however, with technologies that let us store unthinkable amounts of data and run gargantuan simulations. Therefore, well before we understand how brains work, we will find ourselves able to digitally copy the brain's structure and able to download the conscious mind into a computer.
If the computational hypothesis of brain function is correct, it suggests that an exact replica of your brain will hold your memories, will act and think and feel the way you do, and will experience your consciousness — irrespective of whether it's built out of biological cells, Tinkertoys, or zeros and ones. The important part about brains, the theory goes, is not the structure but the algorithms that ride on top of the structure. So if the scaffolding that supports the algorithms is replicated — even in a different medium — then the resultant mind should be identical. If this proves correct, it is almost certain we will soon have technologies that allow us to copy and download our brains and live forever in silica. We will not have to die anymore. We will instead live in virtual worlds like the Matrix. I assume there will be markets for purchasing different kinds of afterlives, and for sharing them with different people — this is the future of social networking. And once you are downloaded, you may even be able to watch the death of your outside, real-world body, in the manner that we would view an interesting movie.
Of course, this hypothesized future embeds many assumptions, the speciousness of any one of which could spill the house of cards. The main problem is that we don't know exactly which variables are critical to capture in our hypothetical brain scan. Presumably the important data will include the detailed connectivity of the hundreds of billions of neurons. But knowing the point-to-point circuit diagram of the brain may not be sufficient to specify its function. The exact three-dimensional arrangement of the neurons and glia is likely to matter as well (for example, because of three-dimensional diffusion of extracellular signals). We may further need to probe and record the strength of each of the trillions of synaptic connections. In a still more challenging scenario, the states of individual proteins (phosphorylation states, exact spatial distribution, articulation with neighboring proteins, and so on) will need to be scanned and stored. It should also be noted that a simulation of the central nervous system by itself may not be sufficient for a good simulation of experience: other aspects of the body may require inclusion, such as the endocrine system, which sends and receives signals from the brain. These considerations potentially lead to billions of trillions of variables that need to be stored and emulated.
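For a sense of the scale being described, here is a rough back-of-envelope tally in Python; every quantity is an order-of-magnitude assumption introduced purely for illustration, not a measured figure from the essay or from neuroscience.

# Back-of-envelope variable counts for a hypothetical brain scan.
# All numbers are loose order-of-magnitude assumptions.
neurons              = 1e11   # "hundreds of billions of neurons"
synapses_per_neuron  = 1e4    # commonly cited order of magnitude
synapses             = neurons * synapses_per_neuron     # ~1e15 synapses

coarse_vars_per_syn  = 10     # strength, position, a few kinetic parameters
protein_vars_per_syn = 1e6    # if individual protein states must be captured

coarse = synapses * coarse_vars_per_syn     # ~1e16 variables
fine   = synapses * protein_vars_per_syn    # ~1e21, "billions of trillions"
print(f"coarse scan: ~{coarse:.0e} variables; protein-level scan: ~{fine:.0e}")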
The other major technical hurdle is that the simulated brain must be able to modify itself. We need not only the pieces and parts, but also the physics of their ongoing interactions — for example, the activity of transcription factors that travel to the nucleus and cause gene expression, the dynamic changes in the location and strength of the synapses, and so on. Unless your simulated experiences change the structure of your simulated brain, you will be unable to form new memories and will have no sense of the passage of time. Under those circumstances, is there any point in immortality?
The good news is that computing power is blossoming sufficiently quickly that we are likely to make it within a half century. And note that a simulation does not need to be run in real time in order for the simulated brain to believe it is operating in real time. There's no doubt that whole brain emulation is an exceptionally challenging problem. As of this moment, we have no neuroscience technologies geared toward ultra-high-resolution scanning of the sort required — and even if we did, it would take several of the world's most powerful computers to represent a few cubic millimeters of brain tissue in real time. It's a large problem. But assuming we haven't missed anything important in our theoretical frameworks, then we have the problem cornered and I expect to see the downloading of consciousness come to fruition in my lifetime.
KEVIN SLAVIN
Digital Technologist; Managing Director, Co-Founder, area/code
THE EBB OF MEMORY
In just a few years, we’ll see the first generation of adults whose every breath has been drawn on the grid. A generation for whom every key moment (e.g., birth) has been documented and distributed globally. Not just the key moments, of course, but also the most banal: eating pasta, missing the train, and having a bad day at the office. Ski trips and puppies.
These trips and puppies are not simply happening, they are becoming data, building up the global database of distributed memories. They are networked digital photos – 3 billion on Flickr, 10 billion on Facebook. They were blog posts, and now they are tweets, too (a billion in 18 months). They are Facebook posts, Dopplr journals, Last.FM updates.
Further, more and more of the traces we produce will be passive or semi-passive. Consider Loopt, which allows us to track ourselves and our friends through GPS. Consider voicemail transcription bots that turn the voice messages we leave into searchable text in email boxes, on into eternity. The next song you listen to will likely be stored in a database record somewhere. The next time you take a phonecam photo, it may well have the event’s latitude and longitude baked into the photo’s metadata.
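The geotagging point is easy to verify for yourself; the sketch below, assuming Python with the Pillow imaging library and a JPEG that actually carries EXIF data ("photo.jpg" is just a placeholder name), pulls out the GPS block a phone camera writes alongside the pixels.

# A minimal sketch, assuming the Pillow library and a JPEG with EXIF metadata:
# extract the GPS tags that many phone cameras embed in each photo.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

def gps_metadata(path):
    exif = Image.open(path)._getexif() or {}              # raw EXIF tags, keyed by id
    named = {TAGS.get(k, k): v for k, v in exif.items()}  # map ids to readable names
    gps_raw = named.get("GPSInfo", {})                    # the GPS sub-dictionary, if any
    return {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

# Output looks like {'GPSLatitude': (...), 'GPSLongitude': (...), ...} when present.
print(gps_metadata("photo.jpg"))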
The sharp upswing in all of this record-keeping – both active and passive – is redefining one of the core elements of what it means to be human, namely to remember. We are moving towards a culture that has outsourced this essential quality of existence to machines, to a vast and distributed prosthesis. This infrastructure exists right now, but very soon we’ll be living with the first adult generation whose entire lives are embedded in it.
In 1992, the artist Thomas Bayrle wrote that the great mistakes of the future would be that as everything became digital, we would confuse memory with storage. What’s important about genuine memory and how it differs from digital storage is that human memory is imperfect, fallible, and malleable. It disappears over time in a rehearsal and echo of mortality; our abilities to remember, distort and forget are what make us who we are.
We have built the infrastructure that makes it impossible to forget. As it hardens and seeps into every element of daily life, it will make it impossible to remember. Changing what it means to remember changes what it means to be.
There are a few people who already have perfect episodic memory, total recall: neurological edge cases. They are harbingers of the culture to come. One of them, Jill Price, was profiled in Der Spiegel:
"In addition to good memories, every angry word, every mistake, every disappointment, every shock and every moment of pain goes unforgotten. Time heals no wounds for Price. 'I don't look back at the past with any distance. It's more like experiencing everything over and over again, and those memories trigger exactly the same emotions in me. It's like an endless, chaotic film that can completely overpower me. And there's no stop button.'"
This also describes the life of Steve Mann, passively recording his life through wearable computers for many years. This is an unlikely future scenario, but like any caricature, it is based on human features that will be increasingly recognizable. The processing, recording and broadcasting prefigured in Mann’s work will be embedded in everyday actions like the twittering, phonecam shots and GPS traces we broadcast now. All of them entering into an outboard memory that is accessible (and searchable) everywhere we go.
Today is New Year’s Eve. I read today (on Twitter) that three friends, independent of each other, were looking back at Flickr to recall what they were doing a year ago. I would like to start the New Year being able to remember 2008, but also to forget it.
For the next generation, it will be impossible to forget it, and harder to remember. What will change everything is our ability to remember what everything is. Was. And wasn’t.
More here
Friday, January 16, 2009
Swingin' Sounds for Hipsters Vol 6
Mellow Movers and Smooth Summer Grooves from Vegas/Voodoo Rooms Vinyl Villain Frankie Sumatra
This mix is a few months old. Vol 7 to follow shortly.
Thursday, January 15, 2009
Is Stupid Making Us Google?
From The New Atlantis
“Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.” Sound familiar? Describing, in The Atlantic Monthly, his own struggles to keep his attention span from contracting like the wild ass’s skin in Balzac’s novel, Nicholas Carr cites a British study of research habits among visitors to two serious scholarly websites which suggests a more general problem: that “users are not reading online in the traditional sense; indeed there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”
Almost seems? I don’t know about Mr. Carr, but I have no doubt that I go online to avoid reading in the traditional sense. The question is, how guilty do I need to feel about this? In his view, presumably, quite a lot guilty, since by reading online as much as I do I am depriving myself of the ability to read offline. He takes this insight to an even more alarming conclusion in the end, writing that “as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.” And if that’s the case for veteran readers, think how much worse it must be for the jeunesse dorée of the information age, if they never developed the habits that accompany “deep reading” in the first place.
It is these poor cultural orphans, for whom “information retrieval” online is the only kind of reading they know, who are the main concern of Mark Bauerlein in his new book, The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future. One would think that a whole future in jeopardy would be too serious a matter for the flippancy of the rest of the subtitle: Or, Don’t Trust Anyone Under 30. But Professor Bauerlein, who teaches English at Emory University and is a former director of research and analysis at the National Endowment for the Arts, is not always sure just how much a matter of mirth “the dumbest generation” is, or isn’t. After all, it is not really their fault if, as he says, they have been “betrayed” by the mentors who should have taught them better. Yet he seems to agree with Nicholas Carr that what we are witnessing is not just an educational breakdown but a deformation of the very idea of intelligence.
James Bowman
More here