Tuesday, July 31, 2007

The Rise & Fall of the Prefrontal Lobotomy


Lobotomy (from the Greek lobos, meaning lobes of the brain, and tomos, meaning cut) is a psychosurgical procedure in which the connections between the prefrontal cortex and underlying structures are severed, or the frontal cortical tissue is destroyed, the theory being that this leads to the uncoupling of the brain's emotional centres and the seat of intellect (in the subcortical structures and the frontal cortex, respectively).

The lobotomy was first performed on humans in the 1890s. About half a century later, it was being touted by some as a miracle cure for mental illness, and its use became widespread; during its heyday in the 1940s and '50s, the lobotomy was performed on some 40,000 patients in the United States, and on around 10,000 in Western Europe. The procedure became popular because there was no alternative, and because it was seen to alleviate two pressing social problems: overcrowding in psychiatric institutions, and the increasing cost of caring for mentally ill patients.

Although psychosurgery has been performed since the dawn of civilization, the origins of the modern lobotomy are found in animal experiments carried out towards the end of the nineteenth century. The German physiologist Friedrich Goltz (1834-1902) performed ablations of the neocortex in dogs, and observed the changes in behaviour that occurred as a result:

I have mentioned that dogs with a large lesion in the anterior part of the brain generally show a change in character in the sense that they become excited and quite apt to become irate. Dogs with large lesions of the occipital lobe on the other hand become sweet and harmless, even when they were quite nasty before.


These findings inspired the physician Gottlieb Burkhardt (1836- ?), the director of a small asylum in Prefargier, Switzerland, to use ablations of the cortex to try to cure his mentally ill patients. In 1890, Burkhardt removed parts of the frontal cortex from six of his schizophrenic patients. One of these patients later committed suicide, and another died within one week of his surgery. Thus, although Burkhardt believed that his method had been somewhat successful, he faced strong opposition, and stopped experimenting with brain surgery.

It was not until the 1930s that lobotomy was again performed on humans. The modern procedure was pioneered at that time by the Portuguese neuropsychiatrist Antonio Egas Moniz, a professor at the University of Lisbon Medical School. While attending a frontal lobe symposium in London, Moniz learned of the work of Carlyle Jacobsen and John Fulton, both of whom were experimental neurologists at Yale University.

Jacobsen and Fulton reported that frontal and prefrontal cortical damage in chimpanzees led to a massive reduction in aggression, while complete removal of the frontal cortex made it impossible to induce experimental neuroses in the chimps. Here, they describe the post-operational behaviour of a chimp named "Becky", who had previously become extremely distressed after making mistakes during the task she had learnt:

The chimpanzee...went to the experimental cage. The usual procedure of baiting the cup and lowering the opaque screen was followed...If the animal made a mistake, it showed no evidence of emotional disturbance but quietly awaited the loading of the cups for the next trial. It was as if the animal had joined the "happiness cult of the Elder Micheaux," and had placed its burdens on the Lord!


On hearing the presentation by Jacobsen and Fulton, Moniz asked if the surgical procedure would be beneficial for people with otherwise untreatable psychoses. Although the Yale researchers were shocked by the question, Moniz, together with his colleague Almeida Lima, operated on his first patient some three months later.

On November 12th, 1935, Moniz and Lima performed for the first time what they called a prefrontal leucotomy ("white matter cutting"). The operation was carried out on a female manic depressive patient, and lasted about 30 minutes. The patient was first anaesthetized, and her skull was trepanned on both sides (that is, holes were drilled through the bone). Then, absolute alcohol was injected through the holes in the skull, into the white matter beneath the prefrontal area.

In this way, two of the bundles of nerve fibres connecting the frontal cortex and the thalamus were severed. (The thalamus is a subcortical structure that relays sensory information to the neocortex, and the thalamo-cortical projections are called the corona radiata.) Moniz reported that the patient seemed less anxious and paranoid afterwards, and pronounced the operation a success. Subsequently, he and Lima used a knife which, when inserted through the holes in the skull and moved back and forth within the brain substance, would sever the thalamo-cortical connections. They later developed a special wire knife called a leucotome, which had an open steel loop at its end; when closed, the loop severed the nerve tracts within it.

These procedures were "blind" - the exact path of the leucotome could not be determined, so the operations produced mixed results. In some cases, there were improvements in behaviour; in others, there was no noticeable difference; and in yet others, the symptoms being treated became markedly worse. In all, Moniz and Lima operated on approximately 50 patients. The best results were obtained in patients with mood disorders, while the treatment was least effective in schizophrenics.

In 1936, Moniz published his findings in medical journals, and travelled to London, where he presented his work to others in the medical community. In 1939, he was shot four times by one of his patients (not one who had been lobotomized); one of the bullets entered his spine and remained lodged there until his death some years later. A decade later, in 1949, Moniz was awarded the Nobel Prize in Physiology or Medicine for his development of the leucotomy.

The American clinical neurologist Walter Freeman (1895-1972) had been following the work of Moniz closely, and had also attended the symposium on the frontal lobe. It was Freeman who introduced the lobotomy to the United States, and who would later become the biggest advocate of the technique. With neurosurgeon James Watts, Freeman refined the technique developed by Moniz. They changed the name of the procedure to "lobotomy", to emphasize that it was white and grey matter that was being destroyed.



The Freeman-Watts Standard Procedure was used for the first time in September 1936. Also known as "the precision method", this involved inserting a blunt spatula through holes in both sides of the skull; the instrument was moved up and down to sever the thalamo-cortical fibres (above). However, Freeman was unhappy with the new procedure. He considered it to be both time-consuming and messy, and so developed a quicker method, the so-called "ice-pick" lobotomy, which he performed for the first time on January 17th, 1946.

With the patient rendered unconscious by electroshock, an instrument was driven with a hammer through the orbit, above the eyeball. Once inside the brain, the instrument was moved back and forth; this was then repeated on the other side. (The ice-pick lobotomy, so named because the instrument used resembled the tool with which ice is broken, is therefore also known as the transorbital lobotomy. The photograph at the top shows Freeman performing the procedure on an unidentified patient.)

Freeman's new technique could be performed in about 10 minutes. Because it did not require anaesthesia, it could be performed outside of the clinical setting, and lobotomized patients did not need to be hospitalized afterwards. Thus, Freeman often performed lobotomies in his Washington, D.C. office, much to the horror of Watts, who would later dissociate himself from his former colleague and the procedure.

Freeman happily performed ice-pick lobotomies on anyone who was referred to him. During his career, he would perform almost 3,500 operations. Like the leucotomies performed by Moniz and Lima, those performed by Freeman were blind, and also gave mixed results. Some of his patients could return to work, while others were left in something like a vegetative state.

Most famously, Freeman lobotomized President John F. Kennedy's sister Rosemary, who was 23 years of age at the time and was left incapacitated by the operation. And, on December 16th, 1960, Freeman notoriously performed an ice-pick lobotomy on a 12-year-old boy named Howard Dully, at the behest of Dully's stepmother, who had grown tired of his defiant behaviour.


My stepmother hated me. I never understood why, but it was clear she'd do anything to get rid of me...If you saw me you'd never know I'd had a lobotomy.

The only thing you'd notice is that I'm tall and weigh about 350 pounds. But I've always felt different - wondered if something's missing from my soul. I have no memory of the operation, and never had the courage to ask my family about it.

So [recently] I set out on a journey to learn everything I could about my lobotomy...It took me years to get my life together. Through it all I've been haunted by questions: 'Did I do something to deserve this?', 'Can I ever be normal?', and, most of all, 'Why did my dad let this happen?'



Howard Dully during his ice-pick lobotomy, Dec. 16th, 1960.
(George Washington University Gelman Library)


Dully's mother had died when he was 5 years old, and his father subsequently remarried a woman named Lou. Freeman's notes later revealed that Lou Dully feared her stepson, and described him as "defiant and savage-looking". According to the notes:

He doesn't react to either love or punishment. He objects to going to bed but then sleeps well. He does a good deal of daydreaming and when asked about it says 'I don't know.' He turns the room's lights on when there is broad daylight outside.


Freeman recorded the events leading up to Dully's lobotomy:

[Nov. 30, 1960] Mrs. Dully came in for a talk about Howard. Things have gotten much worse and she can barely endure it. I explained to Mrs. Dully that the family should consider the possibility of changing Howard's personality by means of transorbital lobotomy. Mrs. Dully said it was up to her husband, that I would have to talk with him and make it stick.

[Dec. 3, 1960] Mr. and Mrs. Dully have apparently decided to have Howard operated on. I suggested [they] not tell Howard anything about it.


Following the operation, the notebook reads:

I told Howard what I'd done to him...and he took it without a quiver. He sits quietly, grinning most of the time and offering nothing.


Now in his late fifties, Dully works as a bus driver in California. About 40 years after his lobotomy, he discussed the operation with his father for the first time. He discovered that it was his stepmother who had found Dr. Freeman, after being told by other doctors that there was nothing wrong, and that his father had been manipulated by his second wife and Freeman into allowing the operation to be performed.

It was largely because of Freeman that the lobotomy became so popular during the 1940s and '50s. He travelled across the U.S., teaching his technique to groups of psychiatrists who were not qualified to perform surgery. Freeman was very much a showman; he often deliberately tried to shock observers by performing two-handed lobotomies, or by performing the operation in a production line manner. (He once lobotomized 25 women in a single day.) Journalists were often present on his "tours" of hospitals, so that his appearance would end up on the front page of the local newspaper; he was also featured in highly popular publications such as Time and Life. Often, these news stories exaggerated the success of lobotomy in alleviating the symptoms of mental illness.

Consequently, the use of lobotomies became widespread. As well as being used to treat the criminally insane, lobotomies were also used to "cure" political dissidents. It was alleged that the procedure was used routinely on prisoners against their will, and the use of lobotomies was strongly criticised on the grounds that it infringed the civil liberties of the patients.

An excellent account of the effects of lobotomy, and of the ethical implications of the use of the procedure, can be found in Ken Kesey's book One Flew Over the Cuckoo's Nest (made into a film in 1975 by Milos Forman, who received the Academy Award for Best Director; Jack Nicholson won the award for Best Actor in a Leading Role).

The use of lobotomies began to decline in the mid- to late-1950s, for several reasons. Firstly, although there had always been critics of the technique, opposition to its use became very fierce. Secondly, and most importantly, phenothiazine-based neuroleptic (anti-psychotic) drugs, such as chlorpromazine, became widely available. These had much the same effect as psychosurgery gone wrong; thus, the surgical method was quickly superseded by the chemical lobotomy.

Remembering Bergman


Swedish film director Ingmar Bergman drinks a cup of tea while shooting "Smiles of a Summer Night" in this file photo dated 1955.

Ingmar Bergman changed the face of filmmaking -- and may have been the 20th century's greatest artist.

By Andrew O'Hehir

July 31, 2007 | Sometime in the fall of 1980, I went to see Ingmar Bergman's film "Persona." I can literally say that it changed my life. I had seen other so-called art films, and even other Bergman films, but nothing quite like that ambiguous black-and-white masterpiece from 1966, a critical point of contact between regular narrative filmmaking and the parallel tradition of experimental film.

If you haven't seen the film, it begins as an acutely observed, relatively straightforward story about the tense relationship between two women. One, played by Bergman's former wife Liv Ullmann, is a famous actress who has suddenly fallen mute, apparently in the grip of a psychological or spiritual crisis. The other, played by Bibi Andersson, is the chatty, overly confessional nurse assigned to care for the actress while she heads to the seaside for some rest and relaxation. At a certain point in the story, an act of cruelty ruptures the superficial friendship, and literally seems to destroy the film. The film appears to stick in the projector and burn from the heat of the bulb, and all sorts of fragmentary, unexplained images (many of them snippets of silent movies) erupt onto the screen. "Reality" is eventually restored, but the rest of "Persona" has a troubled, dreamlike quality, as if we're now in a world where old-fashioned narrative clarity is no longer available.

I remember sitting up nearly all night in my dorm room digesting what I had seen, and then going back to see it again the following night. A year or so later, one of my friends who had bought a 16mm projector at a flea market checked out a print of "Persona" from the Baltimore public library. We hung a bedsheet on the wall of his apartment and watched the movie perhaps eight times in two weeks, with various constellations of bored or enthralled or bewildered acquaintances. Wherever those people are today, I know what memories were called up for them by reading of Bergman's death on Monday, at age 89, on Faro, the remote Swedish island where he lived and had set several films.

Those bedsheet screenings exemplified the kind of devotion Ingmar Bergman's movies demanded from their adherents, and against which his detractors rebelled. For better and for worse, Bergman was the high priest of a certain vision of cinema, one that essentially vanished long ago. He made only a handful of films after his official retirement with the Oscar-winning "Fanny and Alexander" in 1983, but his death is still a landmark moment. Bergman was the last survivor among the foursome of legendary directors whose work created and defined the art-film market in the years after World War II, the others being Federico Fellini, Akira Kurosawa and François Truffaut.

It's misleading and overly narrow, however, to suggest that Bergman or the other art-house lions belonged entirely to the tradition of high art. His films encompass the carnival as well as the cathedral; they include comedies, romances and family melodramas as well as fables of the dark night of the soul. Only a few of them are as self-consciously confrontational as "Persona," and in the 1960s and '70s you could certainly find film buffs -- followers of Jean-Luc Godard, for instance -- who found Bergman to be conservative and conventional. (Compared to the work of his Russian disciple Andrei Tarkovsky, most of Bergman's pictures feel like crackerjack entertainment.)

It's nonetheless accurate to say that Bergman understood himself first and foremost as an artist who belonged to a European tradition stretching back to the Middle Ages, which he evoked so memorably in his first big international success, "The Seventh Seal" (1957). Most obviously, his work borrowed from the Scandinavian theatrical tradition of Ibsen and Strindberg, from various northern European strains of painting and sculpture, from Freudian psychology and severe Lutheran theology and the tormented philosophy of Nietzsche and Schopenhauer. On the other hand, Bergman was certainly not immune to popular culture; his sense of craft was shaped by the classic Hollywood films of his youth, especially those of George Cukor, a personal favorite. (One can certainly see, in several early Bergman pictures, the influence of Cukor films like "Dinner at Eight," "The Women" or "The Philadelphia Story.")

In an interview published in 1972, the critic John Simon said to Bergman, "It must be a great responsibility, I was thinking, just to be you; because film is probably the most important art today and I think you're the most important filmmaker in the world. To be the most important man in the most important art is a terrible responsibility." Simon is a contentious and disagreeable fellow, and no doubt the remark struck some people as fatuous even then. But it was not inherently ridiculous to suggest, 35 or 40 years ago, that the director of "Persona," "Smiles of a Summer Night," "The Seventh Seal," "Wild Strawberries," "The Virgin Spring" and "Cries and Whispers" might be the most important artist in the world.

Bergman struggled to combine the various intellectual and psychological currents that shaped him against a particular context, that of the postwar West traumatized by Auschwitz and the Bomb, in which belief in God was fading but, as Bergman would often observe, fear of God was not. For an entire generation of the European and American intelligentsia (which included my parents), Bergman's wrestling matches with existential doubt and religious guilt, with fractured family relationships and what seemed a civilization in disrepair, came to stand in for its own. Max von Sydow's medieval knight playing chess with Death in the plague Europe of "The Seventh Seal" seemed to symbolize mankind on the brink of nuclear annihilation, and the aging professor facing his own death in "Wild Strawberries" (played so marvelously by Victor Sjöström) captured the anxiety of a culture that believed itself crippled by an inability to express or fulfill its emotional needs.

Some of those concerns now seem remote and old-fashioned to us, just as the boundary-smashing impact of the "interrupted" film in "Persona" looks like nothing special to a viewer acclimated to 20 years of music videos and increasingly sophisticated digital editing techniques. The conception that there could be a "most important" artist, or even a most important art, seems alien to the fragmented, niche-marketed, endlessly commodified spirit of the 21st century. Pop culture has become a self-propelling engine that endlessly consumes and recycles its own waste products, increasingly unconscious of anything that predates its own predominance.

Although Bergman remains the subject of sporadic repertory revivals and university film courses, his movies have lost most of their once-mystical aura. After an onslaught of recent DVD releases, most of his important pictures are now readily available (exceptions include "Sawdust and Tinsel," "Dreams" and "The Magician"), but they too are just cultural commodities from the past, and must fend for themselves on the virtual or actual shelves alongside Antonioni and Godard films and "Spartacus" and "Attack of the 50 Foot Woman."

That's probably for the best. By focusing on Bergman as a great artist and deep thinker who grappled with God and existentialism and boiled the soul of the post-Holocaust world in his crucible, critics like Simon have done much to drive audiences away from his work, and have distorted Bergman's own conception of his art. Entirely too much emphasis has been placed on the ideas that allegedly lie behind Bergman's movies; those who haven't seen them are often startled to discover that those ideas are delivered as memorably intimate images and as affecting human stories. Bergman never conceived of his "art" as distinct from cinematic and dramatic craftsmanship, and his very best films, like the battle-of-the-sexes comedy "Smiles of a Summer Night" or the magical family chronicle "Fanny and Alexander," are never reducible to theses or pronouncements.

"I am a man making things for use, and highly esteemed as a professional," Bergman told Simon. "I am proud of my knowing how to make those things." In another interview, with Andrew Sarris, Bergman famously compared himself to the thousands of anonymous stone carvers who worked together to build medieval cathedrals. "Whether I am a believer or an unbeliever, Christian or pagan," he said, "I work with all the world to build a cathedral because I am artist and artisan, and because I have learned to draw faces, limbs, and bodies out of stone."

As every obituary of Bergman will note, he grew up in Uppsala, the ecclesiastical and academic capital of Sweden, as the son of a strict Lutheran preacher and a mother he adored but who sometimes treated him coldly. (This family dynamic is presented vividly in "Fanny and Alexander," which, at least in emotional terms, is highly autobiographical.) In some respects, that's all you need to know about his background; the passionate blend of love, hatred and fear with which Bergman viewed women, God, spirituality and death, and family life in general is all present in his childhood.

He began working in the theater as a teenager, and continued directing plays throughout his life, serving as director in residence at the Royal National Theatre of Stockholm long after his semi-retirement from filmmaking in 1983. This sometimes leads to the misconception that Bergman saw film essentially as a "larger theater" (to use the phrase of Joseph L. Mankiewicz), when in fact he saw theater and film as drastically different media. What is startling about Bergman's movies (at least after his first few apprentice efforts), and what will ensure their survival, is not their philosophical concerns but their intense attention to cinematic craft.

Bergman's films are economical and intimate, and legendarily focused on the human face. (The split-screen optical merging of Ullmann's and Andersson's faces, at the climactic moment of "Persona," both epitomizes this tendency and simultaneously undermines or renounces it.) Working with cinematographer Sven Nykvist from about 1960 onward, Bergman constructed an expressive visual vocabulary that was both naturalistic and symbolic, in which the human face, considered in loving or excruciating detail, becomes an architectural element, and houses and buildings become characters with moods and temperaments of their own. (Nykvist gets plenty of credit for the "Bergman feeling," and he should. But he did not shoot "Smiles of a Summer Night," "The Seventh Seal" or Bergman's other great 1950s films.)

One thing Bergman brought from the theater was the idea of a revolving repertory company, an idea borrowed or imitated by many subsequent directors, but never to the same effect. Ullmann and von Sydow appeared in nearly a dozen Bergman films each, and Bibi Andersson, Harriet Andersson, Erland Josephson, Gunnar Björnstrand and various other actors kept making return appearances. After a while, seeing another Bergman film felt like a family reunion with people you fundamentally loved and trusted, whatever pain they might inflict on each other and on you. In what turned out to be Bergman's last film, the very fine 2003 "Saraband," Ullmann and Josephson reprised their roles as the warring married couple of "Scenes From a Marriage," made 30 years earlier.

Bergman's movies became the focus of intense intellectual combat: During his period of worldwide fame, he was accused of being a misogynist and a man-hater, of being an apolitical aesthete and a crypto-Marxist nihilist. But the films that occasioned the most controversy in days of yore, and that seem the most implicated in philosophical or psychological heavy lifting -- say, "The Silence" and "Cries and Whispers" and "Shame" and "The Virgin Spring" (and "Persona" too, much as I still love it) -- strike me more as intellectual curiosities today, not necessarily his best work. It's his more "realistic" -- or at least less transparently allegorical -- works about the wounded human quest for love that form the basis of a monumental legacy.

Everywhere I go and as long as I live, I'll carry with me images from Bergman's movies: the beautiful Eva Dahlbeck, weaving her spidery lover's schemes in "Smiles of a Summer Night"; Harriet Andersson and Lars Passgard, as the brother and sister performing a midsummer play for their father in "Through a Glass Darkly"; Bergman and Ullmann's daughter, glimpsed in the audience for his marvelous adaptation of Mozart's "Magic Flute"; Ingrid Bergman, so unforgettable in "Autumn Sonata" (the only time she ever worked with her namesake, to whom she was not related); young Alexander (Bertil Guve) nestled in the lap of his grandmother (the marvelous Gunn Wallgren) in "Fanny and Alexander," a film that captures the joys, terrors and enchantments of childhood better than any I've ever seen.

Bergman's fame may have faded to a ghostly shade of its former self, which he probably didn't mind, and at the moment he's not exactly fashionable outside film-buff circles. ("Saraband" did relatively poor business in its 2005 American release.) His influence is so widespread among younger filmmakers, both in Europe and in American independent cinema, as to be almost invisible. Anyone who makes emotional dramas or what might be called "serious" comedies about parents and children, men and women, is operating on Bergman's turf. Anyone who photographs a door opening in an empty house, a clock ticking on a mantelpiece, or someone reading a letter addressed to somebody else (bad mistake!) is borrowing his vocabulary, consciously or not.

If Ingmar Bergman was the most important man in the most important art form in 1972, his cultural significance on his death in 2007 seems much less clear. No single artist can stand for all the traditions of film (and film itself plays a more limited and ambiguous role in the media economy than it used to), and Bergman was undeniably a middle-class white European from an affluent, highly homogeneous society. Maybe we can agree that Bergman was the greatest of the 20th-century artists who tried to adapt the traditional craftsmanship of European theater to a new cultural form. Maybe we can agree that he believed in art as a redemptive, spiritual, even magical force, and did much to carry that ancient view of art into the movie theater.

Bergman lived a long life full of movies, plays and tumultuous marriages, and by all accounts left it behind with few regrets. He had the life, and the death, we would want for ourselves and those we love. I'm still grieving today because I know that, finally, there will be no more Bergman films. (His recurrent promises to quit making them had become almost comical.) If you've got a bedsheet and a projector, I'm coming over.

Monday, July 30, 2007

Creepy comic from the 1960s

Film director Ingmar Bergman dies

RIP. 'Scenes from a Marriage' was a masterpiece. First Altman and now Bergman..:(



STOCKHOLM, Sweden (AP) -- Swedish director Ingmar Bergman, an iconoclastic filmmaker widely regarded as one of the great masters of modern cinema, died Monday, the president of his foundation said. He was 89.


Ingmar Bergman became one of the towering figures of serious filmmaking.

"It's an unbelievable loss for Sweden, but even more so internationally," Astrid Soderbergh Widding, president of The Ingmar Bergman Foundation, which administers the directors' archives, told The Associated Press.

Bergman died at his home in Faro, Sweden, Swedish news agency TT said, citing his daughter Eva Bergman. A cause of death was not immediately available.

Through more than 50 films, Bergman's vision encompassed all the extremes of his beloved Sweden: the claustrophobic gloom of unending winter nights, the gentle merriment of glowing summer evenings and the bleak magnificence of the island where he spent his last years.

Bergman, who approached difficult subjects such as plague and madness with inventive technique and carefully honed writing, became one of the towering figures of serious filmmaking.

He was "probably the greatest film artist, all things considered, since the invention of the motion picture camera," Woody Allen said in a 70th birthday tribute in 1988.

Bergman first gained international attention with 1955's "Smiles of a Summer Night," a romantic comedy that inspired the Stephen Sondheim musical "A Little Night Music."

"The Seventh Seal," released in 1957, riveted critics and audiences. An allegorical tale of the medieval Black Plague years, it contains one of cinema's most famous scenes -- a knight playing chess with the shrouded figure of Death.

"I was terribly scared of death," Bergman said of his state of mind when making the film, which was nominated for an Academy Award in the best picture category.

The film distilled the essence of Bergman's work -- high seriousness, flashes of unexpected humor and striking images.

In a 2004 interview with Swedish broadcaster SVT, the reclusive filmmaker acknowledged that he was reluctant to view his work.

"I don't watch my own films very often. I become so jittery and ready to cry ... and miserable. I think it's awful," Bergman said.

Prominent stage director

Though best known internationally for his films, Bergman also was a prominent stage director. He worked at several playhouses in Sweden from the mid-1940s, including the Royal Dramatic Theater in Stockholm, which he headed from 1963 to 1966. He staged many plays by the Swedish author August Strindberg, whom he cited as an inspiration.

The influence of Strindberg's grueling and precise psychological dissections could be seen in the production that brought Bergman an even wider audience: 1973's "Scenes From a Marriage." First produced as a six-part series for television, then released in a theater version, it is an intense detailing of the disintegration of a marriage.

Bergman showed his lighter side in the following year's "The Magic Flute," again first produced for TV. It is a fairly straight production of the Mozart opera, enlivened by touches such as repeatedly showing the face of a young girl watching the opera and comically clumsy props and costumes.

Bergman remained active later in life with stage productions and occasional TV shows. He said he still felt a need to direct, although he had no plans to make another feature film.

In the fall of 2002, Bergman, at age 84, started production on "Saraband," a 120-minute television movie based on the two main characters in "Scenes From a Marriage."

In a rare news conference, the reclusive director said he wrote the story after realizing he was "pregnant with a play."

"At first I felt sick, very sick. It was strange. Like Abraham and Sarah, who suddenly realized she was pregnant," he said, referring to biblical characters. "It was lots of fun, suddenly to feel this urge returning."

Severe upbringing

The son of a Lutheran clergyman and a housewife, Ernst Ingmar Bergman was born in Uppsala on July 14, 1918, and grew up with a brother and sister in a household of severe discipline that he described in painful detail in the autobiography "The Magic Lantern."

The title comes from his childhood, when his brother got a "magic lantern" -- a precursor of the slide-projector -- for Christmas. Ingmar was consumed with jealousy, and he managed to acquire the object of his desire by trading a hundred tin soldiers for it.

The apparatus was a spot of joy in an often-cruel young life. Bergman recounted the horror of being locked in a closet and the humiliation of being made to wear a skirt as punishment for wetting his pants.

He broke with his parents at 19 and remained aloof from them, but later in life sought to understand them. The story of their lives was told in the television film "Sunday's Child," directed by his own son Daniel.

Young Ingmar found his love for drama production early in life. The director said he had coped with the authoritarian environment of his childhood by living in a world of fantasies. When he first saw a movie he was greatly moved.

"Sixty years have passed, nothing has changed, it's still the same fever," he wrote of his passion for film in the 1987 autobiography.

But he said the escape into another world went so far that it took him years to tell reality from fantasy, and Bergman repeatedly described his life as a constant fight against demons, also reflected in his work.

The demons sometimes drove him to great art -- as in "Cries and Whispers," the deathbed drama that climaxes when the dying woman cries "I am dead, but I can't leave you." Sometimes they drove him over the top, as in "Hour of the Wolf," where a nightmare-plagued artist meets real-life demons on a lonely island.

Voluntary exile in Germany

Bergman also waged a fight against real-life tormentors: Sweden's powerful tax authorities.

In 1976, during a rehearsal at the Royal Dramatic Theater, police came to take Bergman away for interrogation about tax evasion. The director, who had left all finances to be handled by a lawyer, was questioned for hours while his home was searched. When released, he was forbidden to leave the country.

The case caused an enormous uproar in the media and Bergman had a mental breakdown that sent him to hospital for over a month. He later was absolved of all accusations and in the end only had to pay some extra taxes.

In his autobiography he admitted to guilt in only one aspect: "I signed papers that I didn't read, even less understood."

The experience made him go into voluntary exile in Germany, to the embarrassment of the Swedish authorities. After nine years, he returned to Stockholm, his longtime base.

It was in the Swedish capital that Bergman broke into the world of drama, starting with a menial job at the Royal Opera House after dropping out of college.

Bergman was hired by the script department of Swedish Film Industry, the country's main production company, as an assistant script writer in 1942.

In 1944, his first original screenplay was filmed by Alf Sjoeberg, the dominant Swedish film director of the time. "Torment" won several awards including the Grand Prize of the 1946 Cannes Film Festival, and soon Bergman was directing an average of two films a year as well as working with stage production.

After the acclaimed "The Seventh Seal," he quickly came up with another success in "Wild Strawberries," in which an elderly professor's car trip to pick up an award is interspersed with dreams.

Other noted films include "Persona," about an actress and her nurse whose identities seem to merge, and "The Autumn Sonata," about a concert pianist and her two daughters, one severely handicapped and the other burdened by her child's drowning.

The date of the funeral has not yet been set; the service will be attended by a close group of friends and family, the TT news agency reported.

Yep, the Jim and Tammy pictured are the Bakker clan. Click on the picture to listen to the song 'God is watching you'. God, if you're listening, please send me some acid to go with this song.

Ragnar's Got Your Nose


This book is ideal for introducing kids to the naughty side of life, instead of all that 'say please' and 'don't kick the cat' stuff parents teach their kids nowadays. I just bought a copy for my nephew, Seif, who's having his tonsils removed next week.

Sunday, July 29, 2007

Collected Works of Friedrich Nietzsche

Thanks to DiClerico for leading me to this. The funny thing is I had just packed 'Beyond Good and Evil' in order to try and read it in England (take 487). Now that I have this, out of the suitcase it goes!

Found this while I was packing


Celebratory gunfire kills 4 as Iraq wins Asia Cup

Only 4 people killed! Hooray!! Fantastic!! No wonder the rest of the world thinks Arabs are lunatics! It's because they are!!


Prime Minister announces each player will receive $10,000. Jubilant Iraqis in Baghdad ignore gunfire, vehicle ban after 1-0 win. Violence after previous Iraqi wins in the Asian Cup has killed at least 50 people.

BAGHDAD, Iraq (CNN) -- Celebratory gunfire erupted across the capital Sunday when Iraq's soccer team won the Asian Cup, in a 1-0 shutout against champions Saudi Arabia.

A soldier takes position Sunday in central Baghdad after a vehicle ban was announced in the Iraqi capital.

Stray bullets killed four people and wounded 17 others in the capital, an official with the Iraqi Interior Ministry told CNN.

Me

Saturday, July 28, 2007

Die, metrosexuals, die!

Penguin Teaboy


Tea too strong? Too weak? Problem solved. This nattily attired tea penguin always brews the perfect cup. Set the timer for your ideal brew time (from 1 minute up to 20) and he lowers the teabag into the water. When the time is up, he lifts it out.

Friday, July 27, 2007

Architecture In Helsinki - Heart It Races

Just discovered this song a few days ago. It's weird and wacky, but very catchy. They're an Australian band with a ridiculous name, and they remind me of the Polyphonic Spree, They Might Be Giants and other bands that have a hipsterish/cultish feel to them. The video is utterly ridiculous but I'm revelling in the deep weirdness of it all.


Historic plaque from a town in Kansas.

Thursday, July 26, 2007

The Wrong War vs. the Right War

The director of the chilling "No End in Sight" explains how the Iraq occupation went horribly wrong. Plus: The American who made the world notice Darfur.

By Andrew O'Hehir

If everybody in this polarized country could be convinced to sit down tonight and watch the documentaries "No End in Sight" and "The Devil Came on Horseback," we might pull our troops out of Iraq next week and send them to Darfur the week after that.

But then, like every other idea relating to the collective dream-state known as American politics, that is no doubt wishful thinking. I watched those two films through my own distorted lens, and you'll see them through yours. What unites them is a passionate commitment to craft that signals, in turn, a belief in something so old-fashioned it seems Platonic: the idea of film as a medium for transcending subjectivity and opinion and grasping for truth.

Neither of these films is predicated on political ideology; I couldn't tell you whether the people who made them were Republicans or Democrats, and it doesn't much matter. Taken together they serve as an indictment of U.S. foreign policy that's more damning than the collected works of Noam Chomsky. In "No End in Sight," Charles Ferguson's magisterial history of the American occupation of Iraq over the past four years, it appears that all the crucial policy decisions affecting Iraq's future, the entire Middle East and by extension the world were made by a tiny, closeted group of ideologues with no expertise in the country, the region, Arab culture, military affairs or much of anything else.

We were too busy fucking up Iraq to save the people of Darfur, apparently. As Annie Sundberg and Ricki Stern's horrifying "The Devil Came on Horseback" makes clear, the State Department under Colin Powell investigated reports that government-sponsored Arab militias were carrying out a campaign of genocide against black Africans in that Sudanese province, decided they were true -- and did absolutely nothing. Being the world's sole superpower comes with responsibilities, and evidently that means spreading outrageous lies about the wars we start, while sweeping under the carpet the ones we refuse to stop. How can any American still wonder why our country is perceived as a force of immorality, chaos and disorder?

From the first frames of Charles Ferguson's "No End in Sight," replaying some of the oddest and twitchiest podium performances of Donald Rumsfeld during those heady days of spring 2003, you may feel the crushing weight of an almost Sophoclean impending doom. That was when that famous statue of Saddam came crashing down, when at least a few Iraqis really did greet American troops with kisses and flowers, when studly George W. Bush flew onto that aircraft carrier, with the world seemingly on its knees before his codpiece, to declare "Mission Accomplished."

Even at that point, says Ferguson, the war was already a gruesome failure. American troops arrived in Baghdad with insufficient numbers, no communications technology, very few translators, and almost no understanding of what they were supposed to do when the "major conflict" stopped. You may have blocked all this from your memory, but it will come flooding back: Looting spread through the city, devastating the national museum of antiquities, the national archives and almost every other public building. By the time American administrators made any serious effort to get the place up and running, Iraq's infrastructure had been destroyed, its army and most of its government bureaucracy were officially unemployed, and all the weapons, machinery and anything else of value were gone.

Ferguson is a political scientist and one-time technology pioneer (he sold his former company, Vermeer Technologies, to Microsoft in 1996, for $133 million) whose approach to the Iraq occupation is resolutely analytical and nonideological. He was not an opponent of the war, at least going into it. That may reduce his credibility in some quarters, but his foreign-policy credentials helped gain him access to a remarkable number of diplomatic, military and intelligence insiders, including several who provided background information but declined to appear on camera.

Ferguson's high-level interviewees include former Deputy Secretary of State Richard Armitage; Col. Lawrence Wilkerson, the former chief of staff to Colin Powell at the State Department; Gen. Jay Garner, the first coalition administrator of occupied Iraq; Col. Paul Hughes, who directed strategic policy for the U.S. occupation during its early stages; Barbara Bodine, who was ambassador in charge of Baghdad under the occupation; and Robert Hutchings, former chairman of the National Intelligence Council. That's not to mention numerous affiliated experts and sources, from Time reporter Chris Allbritton to Atlantic Monthly editor James Fallows, former Defense Intelligence Agency analyst Marc Garlasco, Harvard scholars Linda Bilmes and Samantha Power, and several American officers and soldiers who served on the ground.

These people represent a wide range of opinions and analyses, and many of Ferguson's insiders remain team players and (in many cases) loyal Republicans. All of them seem motivated by a combination of disgust and amazement at how badly things have gone since the fall of Baghdad and by a genuine desire to help make sense of it all. Only one interviewee, a former Defense Department advisor named Walter Slocombe, even attempts to pretend that the occupation hasn't been a disaster, with nothing but bad news ahead. Slocombe belonged to the small group of Pentagon insiders who made almost all the major decisions about Iraq and was the only one willing to appear on camera. (Shockingly, Rumsfeld, Paul Wolfowitz, Douglas Feith and Dick Cheney all resisted Ferguson's overtures.)

You don't have to sympathize with these people as individuals, or with their hard-headed, realpolitik, we're-the-grownups approach to policy, to be profoundly shocked by the story of arrogance, piss-poor planning and all-around incompetence that unfolds in "No End in Sight." It's one thing for those of us who opposed the whole damn thing from the get-go to waggle our fingers and say we told them so. It's quite another to see people who presumably thought the general idea was OK (as Ferguson did), and who were entrusted with various details of the project, speak wistfully about their massive failure, whose ripple effects will go on screwing up the world far into the lives of our children and grandchildren.

Ferguson met me in his large and empty New York apartment, in a West Village luxury high-rise overlooking the Hudson River. (He spends much of his time at another house in Berkeley, Calif., where he lectures at the UC-Berkeley journalism school.) His living room is at least as large as my entire apartment, and it contains a grand piano he does not know how to play. He speaks quietly, thoughtfully and precisely, but almost never laughs or displays emotion. Despite the ruthless rationality of his policy dissection in "No End in Sight," he says the ultimate explanation for the botched occupation of Iraq may lie in that murkiest of realms, individual human psychology.

You're a newcomer to making films. What made you think you wanted to make one, and make this one in particular?


I've been obsessed with movies since I was a little kid. I love movies of all kinds, trash as well as high culture. I wanted to make films for a long time, and I came to a point in my life, about three years ago, when I no longer had an excuse not to do it. I had time and I had financial security. I had finished a book I was working on ["The Broadband Problem: Anatomy of a Market Failure and a Policy Dilemma," published in 2004]. I started thinking about making a movie, and then our president gave us the Iraq war. It just seemed obvious and important.

I thought of it fairly early on, and friends of mine dissuaded me, saying that it was a difficult first film to make and that many people would be making it. After a year of waiting, nobody was making it.

Well, that's true. There have been numerous other films about the Iraq war, but they've been very granular and subjective. More about what happened to individuals on the ground, whether they were American soldiers or Iraqi civilians. Nobody's tried to take this global, policy-oriented perspective.


Exactly.

I assume your foreign-policy expertise literally made this film possible. I mean, if I called up Larry Wilkerson or Dick Armitage and said I wanted to interview them about the Iraq war and their role in planning and executing it, they might tell me to go jump in the lake.


I don't know what they would say to others, but they didn't say that to me. Larry Wilkerson has spoken out a fair bit; he's been quoted in the press. But I believe that we have the only lengthy interview that Richard Armitage has done about the Iraq war, which is a bit of a surprise. But it's true.

Yeah, he's very cagey and very loyal. He never directly criticizes his former boss [Powell] or the president. But at the same time, he does seem to want to express grave reservations about what happened. People will kind of have to see it, but to me he looks like he's radiating disapproval when he talks about the White House and the political decision making that went down.


I think we used four minutes of him in the film, but the interview was an hour and a half long. We're going to put that up on the Web site at some point. There are places where he's very cagey and doesn't quite say what he thinks, and there are other places where he's remarkably candid. When I asked him to assign a grade to the war, the planning and all the foreign-policy making that went into it, he said, well, you have to distinguish between the military campaign itself and the subsequent occupation. He said he would give the military campaign an A and the occupation a C-minus. For somebody who was the deputy secretary of state during the relevant period, that's a striking statement.

Sure. He's about as much of a trusted Republican policy insider as you can find in the world. He worked for Reagan and both Bushes. If I'm not mistaken, he worked for George W. Bush's election campaign.


Absolutely. Yes.

Did you meet other insiders, people at or near his level, who weren't willing to go on the record?


Yes, quite a number of them. Particularly career military officers who are still serving. But also people in the State Department and elsewhere. One person in the intelligence community, quite senior, who was working for a high-level policy person during the planning of the war and the occupation period and then went back to their intelligence job. We had quite a long conversation, just as this person was heading back to Iraq for a yearlong period, and what they had to say was quite disturbing. I also spoke to a high-level military officer who was working for high-level civilians during the occupation.

You have obviously tried to avoid making a directly political film. It's certainly not an antiwar film in any general sense. I understand that, going back to March 2003 or whenever, you were not necessarily opposed to the war.


That is correct. I was very favorably inclined, in a general way, to the idea of using military force to remove Saddam. Partly for reasons of regional stability -- geopolitical, WMD-related reasons -- and partly for humanitarian reasons. Now, reasonable people can disagree about whether it was wise or just or necessary or important to use force to remove Saddam, but there's a perfectly reasonable case that it should have been done at some point. Which is of course quite different from saying that I was in favor of what the Bush administration actually did. The film is, I guess, about the disjuncture between those two things.

[Pause.] Well, it's actually not about the first thing. I consciously made a film that wasn't about the question of whether it was right or wrong to use military force to remove Saddam. I tried to make a film about what actually happened.

This story reminds me of Greek tragedy in a way. A certain number of things have to go wrong in a certain order before we end up with Oedipus killing his father and sleeping with his mother, in fulfillment of a dire prophecy. This is a story about everything going wrong all the time. I guess it's more like Murphy's Law in action on a grand, fatalistic scale.


I certainly agree with that last statement. I think they made so many horrendous mistakes that they kind of overdetermined the result. Any three of those mistakes might have doomed the occupation. The fact that they made 500, you know, or 1,000 -- certainly by early 2004 it was already over, actually.

Right, that's certainly the case you make. I think for many Americans, the episode in Fallujah early in 2004, when those four contractors were killed, dragged through the streets, and hung from the bridge, was a turning point. But you think it was already too late by then.


It probably was. Even after the first half-dozen fundamental errors -- not enough troops, allowing the looting, [coalition administrator L. Paul] Bremer's three early decisions, the early handling of the political decision, the mishandling of the U.N., not guarding the weapons -- even at that point, in July or August of 2003, if they had realized then, "Oh God, we've really blown it," you can conceive of how they could have recouped the situation. But after six months of having a half-million Iraqi military men on the streets with no income, it was too late.

My translator when I was in Baghdad had been an emergency-room doctor. He worked through the war. When the Americans invaded, he was making seven dollars a day. The country was in ruins. There was 40 to 50 percent unemployment, and then you take the entire army and throw it into the streets, give each soldier a $50 severance payment, and let that stew for six months. What do you expect?

Let's talk about Bremer. It's too bad that you couldn't get him on camera. He plays a very important role in the whole fiasco.


We tried hard. Really hard.

You spend a lot of time developing the consequences of the three decisions that Bremer made just as he was arriving there in May 2003, a few weeks after the occupation began. Run through those for us.


He made these decisions essentially simultaneously. One was to institute a formal American occupation and to delay for what turned out to be a long period -- over a year -- sovereignty for the Iraqi nation. The second was his de-Baathification order, which purged the Iraqi government of most of its senior administrators and technocrats, including many who were not affiliated with Saddam. By most accounts, that crippled the economy and administration of the country. The third, and by far the most important, was disbanding the entire Iraqi military and intelligence services.

Right. So that we wound up with however many thousands of men on the street.


The lowest estimate is 450,000. Somewhere between 450,000 and 650,000.

Out of work, financially destitute and psychologically...


Infuriated.

You make the case in the film that a large percentage of the Iraqi military was prepared to come back to work and do what armies are supposed to do after they surrender -- take orders from the new boss in town, and do their jobs.


Yep. The exact fraction of the army that could have been used in that way and how quickly they could have been called can be debated. But there's no question that at least half the army, and possibly the overwhelming majority, could have been recalled and used pretty quickly. In fact, 137,000 soldiers had already signed registration statements, giving a lot of information to the American occupiers and stating their willingness to return to duty.

That shocked me. I mean, you're the expert. But just to take that number, if you've got 137,000 Iraqi troops in somewhat good shape, who speak the language and know the country, and they're prepared to follow orders from American commanders on the ground, and you assign them to police the streets and secure public buildings and restore some semblance of public order -- well, it strikes me that you've got a vastly different situation, almost right away.


Sure, of course. A completely different situation.

Maybe this was not what you intended, but this film seems like a strong defense of the foreign-policy and intelligence communities, and to a significant extent the military leadership. You argue that most of those professionals made correct or at least reasonable predictions and prognostications, and that what happened wasn't their fault. Is that fair?


Hmm. I think it's largely fair. I don't think any of them was perfect. The intelligence community did get roughly correct its assessment that Iraq was a troubled place and that occupying it would be difficult, there would be tensions and so on. But they got the WMD thing wrong. They got it wrong under intense political pressure from the White House, no question. But they still got it wrong. They also didn't know much about Iraq, and knew virtually none of the things you needed to know to run an effective occupation. You would want to know the names, addresses and telephone numbers of the top, say, 2,000 administrators in the country so you could make the place run. Well, they didn't. When the occupiers got to Baghdad, they didn't have telephones and they didn't have interpreters. They had no idea how to get in touch with anyone. No idea.

The military understood that more troops were required, but did they make that case forcefully to the president? No. Something very different would have happened if all four of the joint chiefs had stood up in public or gone to the president. I don't think they could have stopped the war, but could they have gotten another 50,000 troops? Yes, I think so.

All these groups had significant flaws that contributed to the problems, but it's nonetheless correct that on balance what happened in Iraq is a vindication of the general proposition that you should pay attention to the professional opinion of people who spend their lives looking at a certain class of questions. If you totally ignore them, you do so at your peril.

Late in the film you ask Gen. Jay Garner, who was briefly Bremer's predecessor in running occupied Iraq, why all these mistakes were made. He says he doesn't know, that he finds it puzzling. Just to take Bremer's three key decisions, how do you explain them? Anybody who knew anything about Iraq thought they were bad ideas. So where were they coming from?


This really is the core of the whole thing. Those three decisions, and a lot of others, were made by a very small group of people in a very short period of time. They were made by some combination of Bremer, Wolfowitz, Rumsfeld, Feith and Walter Slocombe. Dick Cheney was indirectly involved, but he was not part of the meetings and discussions at which these decisions were made. These decisions were made at a series of meetings in the Pentagon between May 1 and May 9 [of 2003], and it was at one of those meetings on May 9 that Bremer decided to dissolve the army, on Slocombe's recommendation.

This group of people had never been to Iraq. [Actually, Rumsfeld was there in 1983, at the time of his infamous handshake with Saddam Hussein.] None of them spoke Arabic. None of them had serious experience in the Mideast. Only one of them had served in the military at all, and that was Rumsfeld, who was a Navy pilot in the 1950s. They had no postwar reconstruction experience. In a perfect vacuum of information, these guys made these extraordinary, sweeping decisions. Many of the most important decisions were made this way, by less than six people. Arguably less than four. That small group of people, allowed to behave this way by President Bush, basically felt that they knew enough that they did not have to consult with anyone else. When they did talk to other people and were told, "This is a crazy thing," they simply disregarded everything they heard.

One thing that keeps coming up in the film is the lack of Arabic speakers among the Americans who went to Iraq. This just seems like a critical failing and an incredibly dumb mistake to make. I know it's not an easy language for English speakers, but there are Americans who speak it and they can be found.


There are 600,000 Americans who speak Arabic. Not to mention the possibility of hiring people from many other nations. Yes, it's astonishing. Part of the problem, although it's well below the top 10 mistakes on the list, was an overreliance on wealthy, cultivated Iraqi exiles who spoke English.

Ahmad Chalabi, for instance.


For example. Others as well. That gave them a very slanted, limited view of what Iraq was like and Iraqis were like. If you were an Iraqi who hadn't gone to Harvard, didn't have a Ph.D., weren't politically extremely conservative, didn't speak fluent English, and hadn't lived in the United States for 10 years, you were out of the loop. You didn't get to talk to these guys.

You haven't used the word "neoconservative" in describing that small group of men who made the decisions, and I imagine you've got a good reason for that. But clearly those guys are united by a shared ideology and view of the world. Didn't that play a defining role in how they understood the conflict and its aftermath, and every decision they made?


To some extent it clearly did. At the same time, much of what was done was contrary to their own interests, as they themselves would have defined those interests. If you think that it's important to remove Saddam by force and install a democratic regime in Iraq in order to remake the Middle East, then you don't do 10 of the things they did. Like not have anybody who spoke Arabic, and so forth. Many things. So I think that ideology's not a sufficient explanation. I think this has to have something to do with the individual psychologies of the very small number of people who were in control of this.

So if Paul Wolfowitz has a shrink, maybe he can help us figure this out.


Maybe. Certainly with regard to Bremer, and probably also Cheney and Wolfowitz and Rumsfeld, you need to ask psychological questions. You also need to ask, how can it be that three, four, five people can impose their psychological predispositions on an entire nation without other places in the system controlling them, disciplining them, limiting them? Yet somehow that happened.

In the film, you bring events up to pretty much the beginning of 2007, with that horrible number of what you think the war has cost so far.


$1.8 trillion.

OK, $1.8 trillion. I have no way of understanding a sum that large. Beyond spending a lot more money, what has happened in the last six or seven months, if anything, to change the picture?


Well, it now seems increasingly likely that domestic political pressure will force at least a drawdown or partial withdrawal by the United States. It's impossible to say exactly what will happen as a result of that. I speak to many different people about this, and their opinions range widely. The center of gravity of their opinions is that the situation will be bad no matter what we do. If we stay it's bad; if we leave it's bad. If we reduce our presence but don't totally leave, it's also bad.

They differ about how bad. The best scenario is pretty much Northern Ireland, a low-grade civil war that lasts 20 or 30 years.

That would be a lot better than what we've got now, wouldn't it?


Well, it's more or less what we've got now. It's more violent than Northern Ireland ever was, but it's kind of in that zone. It's not Congo, it's not Rwanda, it's not Somalia. Many people think that those models are real possibilities.

Well, that's encouraging. Do you have any feeling about what the right thing to do is?


No, I don't. Honestly. I've asked this question of many well-informed people, who are much better attuned to what's going on in Iraq than I am. Some people think we have to effect a partition of the country, as gracefully as possible. Other people think that's very dangerous and can't be done. Baghdad is too integrated and heterogeneous, there's too much intermarriage, it's important to preserve an Iraqi nation-state. Should we withdraw or stay? Should we be forceful toward the Iranians, or conciliatory? How do we handle the Turks? How do we handle the Kurds? How do we deal with oil revenues? It's very complicated, and people say very different things. I don't think anybody knows. In private, people will say, well, let's try something. If it looks like it's working, we'll go with it. If it doesn't work, we'll have to be prepared to change course rapidly.

George Packer, I believe, recently wrote that in his judgment this is now the worst foreign-policy blunder in American history. Is that overstated?


It could well prove to be true. The Vietnam War killed 3 million people, but its geopolitical ramifications were relatively limited. This war has so far killed a quarter of a million, but it could easily kill a million or more. And its geopolitical ramifications could be enormous and long-lasting. It could trigger enduring civil wars and conflict in the Mideast, a nuclear arms race. It could be very bad.

"No End in Sight" opens July 27 at Film Forum in New York and the E Street Cinema in Washington; Aug. 3 in Los Angeles; Aug. 10 in Chicago, Minneapolis, Philadelphia, Portland, Ore., San Francisco and Seattle; Aug. 17 in Detroit and St. Louis; Aug. 24 in Boston; Aug. 31 in Indianapolis and Austin, Texas; and Sept. 7 in Durango, Colo., with more cities to follow.

"The Devil Comes on Horseback": The first Holocaust of the 21st century, as a ratings flop

This shouldn't be a competitive sport or anything, but I'm pretty sure that Annie Sundberg and Ricki Stern's documentary "The Devil Came on Horseback" has the most horrifying images I have ever seen in a motion picture. There aren't words to describe them, really. There are pictures of people who have been tortured and burned alive, children who have been chained in place and hacked to pieces, corpses reduced to ghostly outlines of ash on the ground, people so badly mutilated you can't identify them as male or female, child or adult. You won't sleep well after you see this movie, and I don't suppose you should.

One could argue that Claude Lanzmann's "Shoah," which includes very few images of atrocity, is a more chilling exploration of genocidal history. But "The Devil Came on Horseback" has galvanized audiences at film festivals around the world precisely because it presents, in its calm, measured fashion and without much ceremony, pictures that nobody really wants to see.

Most of those photographs were taken by Brian Steidle, a former U.S. Marine Corps captain who served for six months as an unarmed military observer in and around the Sudanese province of Darfur, not realizing at the time that he was one of a tiny number of documentary eyewitnesses to the ongoing massacres that have resulted in about 450,000 deaths and perhaps 2.5 million refugees, according to some estimates. In 2003, the long-running civil war between Sudan's Arab-dominated government and the largely black southern rebels sputtered to a close, freeing the government to focus on a few unrelated bands of ragtag rebels in Darfur.

As everyone except Sudan's government now admits, the Arab militias known as "janjaweed" (linguistic experts differ, but Steidle says it means "devil on a horse") who have been killing off or driving out the black population of Darfur are funded, supported and egged on by Sudanese authorities, often with air support from Antonov bombers. The African Union sent a tiny group of observers to Sudan, with the faint hope of quelling the violence, toward the end of 2003. Among them was Steidle, who snapped away with his telephoto lens as he watched janjaweed raiders shoot children, rape women, massacre men and burn entire villages to the ground.

Of course "The Devil Came on Horseback" is about a big issue, a horrifying conflict most of us, including our highest officials, have chosen not to learn too much about. But it's also about a smaller, exemplary issue, the transformation of an ordinary, jocked-out military dude into a crusader. Steidle says he joined the African Union's observer force mostly out of a taste for travel and exotic adventure; he was hoping to retire soon, at age 35, and spend the rest of his life on his sailboat. He came back from the Sudan partway through 2004 and tried to forget about the whole thing.

But after realizing that he was virtually the only American who had seen the Darfur massacres personally, and could prove it, he became something of a national conscience and gadfly, testifying before Congress, speaking at rallies, talking to journalists whenever and wherever he could. Nicholas Kristof of the New York Times published several of his pictures to accompany an Op-Ed, which created a brief wave of media interest in Darfur, and in the question of whether the West could or should do something.

Steidle says he often had the thought in Darfur that if Americans could see what he was seeing, Marines would be there inside a week. Fearsome as they are to Darfur's villagers, the janjaweed are bands of a few dozen men with automatic weapons and Toyota pickup trucks. Two or three battalions of Western troops with helicopters and armored vehicles would suffice to stop them; a few more could disperse or kill them. (And believe me, after you see this movie you won't feel too many scruples about using force against those people.)

But the people of Darfur, predictably, have become more collateral damage in the bottomless fiasco documented in Ferguson's film. Both in the political and financial senses, U.S. policy makers believe they cannot afford to intervene in another overseas conflict, and the tense racial politics that affects all interactions between the West and the developing world, between the United Nations and the barely functioning African Union, has meant that bureaucrats continue to dither in big cities while the killing goes on.

Steidle and his activist sister continue to work for Darfur-related charities and visit the refugee camps across the border in Chad (for obvious reasons, he can't return to the Sudan). He has written a book and testified before the International Criminal Court in The Hague, giving names, dates and places of the massacres he observed. But if the court hands down indictments, who's going to go into that hellhole and arrest the suspects? To paraphrase Gandhi's famous quip, international justice sounds like a good idea, but we haven't seen it yet. Ultimately, if the American people are too numb, too infotained and too narcotized to care, then we don't have anyone to blame for Darfur, or for the next Darfur, whenever and wherever it happens.

"The Devil Came on Horseback" is now playing at the IFC Center in New York. It opens Aug. 17 in Boston and Helena, Mont., Aug. 24 in San Francisco, Sept. 7 in Nashville and Sept. 21 in Seattle. Other screenings include July 28 in Philadelphia, July 29 in Rochester, Minn., July 31 in Sedona, Ariz., Aug. 6 in Wilmington, N.C., Aug. 23 in Huntington, N.Y., Aug. 28 in Norfolk, Va., Sept. 7 in St. Louis and Sept. 20 in Milwaukee. See Web site for complete schedule.

No Actual Spoilers

No Country for Old Men: Trailer

My God, I love the Coens.

Wednesday, July 25, 2007

Planet Hiltron

Planet Hiltron is a site that invites people to photoshop celebrities to look as they would if they dressed and lived like ordinary people. The results are amazing -- not only do they age them well, but the clothes, the gormless expressions, the hair disasters. Below are six of my favorites.






Google/eBay vs. Phone Companies


Internet companies are lobbying the FCC to open up wireless networks to new applications and devices. If they win, we could all have cheaper, better, more wonderful cellphones.
By Farhad Manjoo


In 1921, a small company in New York -- you might call it a tech start-up -- invented a solid-state device to ease the social discomfort occasioned by the advent of the telephone. The earliest phones had poor microphones, and people were forced to bark into them rather than talk; because phone lines were beginning to show up in drugstores, saloons, hotels and offices, all the yelling posed a challenge to privacy (of the callers) and peace (of everyone else). The start-up firm came up with a solution that engineers today would label a kludge -- an inelegant quick fix, but hey, it worked. It was a portable bell-shaped cup that fit over the phone's mouthpiece, a fixed version of the shield you'd make with your hands around your mouth if you were trying to keep your business on the D.L. Hence the device's inspired name: the Hush-A-Phone.

Over the next few decades, the Hush-A-Phone Corp. of New York saw its kludge become a big hit, selling more than 125,000 units to a phone-crazy public. But not everyone was happy about its success. In the late 1940s, AT&T, the monopoly that controlled the nation's phone system, charged Hush-A-Phone and its users with violating a rule: Only devices "furnished by the telephone company" could be used on the telephone network. The phone company threatened to cut off Hush-A-Phone users' phone lines and shut down the stores that sold the device. An epic legal fight ensued, stretching on for eight years and involving the Federal Communications Commission and several levels of the federal courts. When it was over, in 1956, tiny Hush-A-Phone had prevailed -- and so, too, has every telecom start-up since.

The Hush-A-Phone court decision inspired a more far-ranging rule known as Carterfone, a 1968 FCC judgment that undid AT&T's control of the "edge" of the network. The Carterfone rule prohibited the phone company from dictating how people could use the lines coming into their homes and offices. It presaged a new age of innovation in communications technology. Just about every amazing thing we now use on the phone network -- cordless phones, answering machines, TiVos, home security systems, fax machines, dial-up modems, DSL modems (the Internet itself, you could say) -- is a direct consequence of the Hush-A-Phone and Carterfone decisions. But consider this: The rules do not apply to cellular networks.

Though wireless carriers depend on public radio space for their fortunes, they're currently free to proscribe public freedom on the networks they run. AT&T, Verizon, Sprint and T-Mobile decide which phones customers can use on their systems; which programs and features they'll allow on those phones; and in what manner people are allowed to use those devices. The technology of 2007 is flashier than that of 1921, but the networks operate in roughly the same manner as the pre-Hush-A-Phone landline system -- if you use your phone in a way not sanctioned by the phone company, they're free to shut you down.

Last week Google announced that it would bid $4.6 billion for a slice of the public airwaves that the FCC plans to put up for auction next year. The radio space being sold -- known as the 700 MHz band -- could provide faster, more reliable wireless Internet connections throughout the nation, and large telecom firms are setting aside huge sums to snap it up. But Google, which is acting in concert with a host of Internet companies, has lobbied the FCC to make the waves open -- to force any firm that purchases the radio spectrum to follow the Carterfone rule, among several other principles of openness. Google says it will participate in the auction only if the FCC agrees to move its way.

The fight may seem like an obscure regulatory tangle between large corporations, and telecom firms -- which oppose Google's bid -- are already accusing it of seeking corporate welfare. But the decision could prove no less profound than the Hush-A-Phone ruling. If the FCC comes down on the side of openness, proponents of a wireless Carterfone rule say, customers would see a host of new technologies pop up on their phones, and they'd likely see prices come down, too.
For instance, every major cellphone company currently prohibits customers from using non-sanctioned Internet phone services on their cellphones, says Chris Libertelli, the head of government affairs for Skype, which recently called on the FCC to expand the Carterfone decision to wireless networks. (Here's the PDF of Skype's petition.)

Skype makes one such voice-over-Internet-protocol (or VoIP) phone system. If you had Skype on your cell, you could make voice calls to other Skype users -- whether in Illinois or Iraq -- for only the cost of sending Web data over your phone (essentially for the cost of an unlimited data plan on your cell). Skype's software -- like the Hush-A-Phone -- poses no harm to the phone companies' networks; Skype has produced a cellphone version of its application that is widely in use in Europe and Asia, Libertelli says. But American wireless companies have a financial motive to block it: If you're using Skype, you're not using regular cell minutes, after all.
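(A rough back-of-envelope sketch, using an assumed codec bitrate rather than any figure from the article: a narrowband VoIP stream plus packet overhead runs in the neighborhood of 30 kbps, so an hour-long call works out to roughly 30,000 bits/s × 3,600 s ≈ 108 megabits, or about 13 to 14 megabytes -- a tiny fraction of a typical monthly data allowance.)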

Skype's not the only thing you can't get on a cellphone. As the law professor Tim Wu has pointed out, phone networks have blocked handset manufacturers from adding GPS services, Wi-Fi, Web browser and e-mail software, file-transfer applications, and a host of other software and hardware capabilities that could potentially eat into carriers' profits. The Wall Street Journal reported that RIM, the company that makes BlackBerry phones, wanted to add a free maps program to its devices; AT&T prohibited it because it had a $10-a-month mapping service to sell to users. RIM also built a phone capable of seamlessly switching between networks in Europe and networks in America -- handy for international travelers. But Verizon, the Journal reported, disabled that capability for anyone who didn't pay a fee.

I asked Verizon -- which has been the most vocal of all carriers in its opposition to openness principles -- to explain its rationale to me. A spokesman declined, pointing me to the company's regulatory filings with the FCC on the matter. Verizon, like other cell firms, generally argues that government regulation of its business would harm American economic vitality. The wireless market, unlike the old land-line system, isn't a monopoly, Verizon points out. Four companies fiercely compete with each other for customers -- and if customers (that is, "the marketplace") really demanded open networks, companies would surely see it in their interest to provide them.

Libertelli, though, argues that the phone companies have a narrow definition of "free market." They compete with each other, but they don't want to compete against the Skypes, Googles, and other Internet innovators of the world. "There's a whole new Internet model out there, and it's nothing like what the telecom world has seen. In some ways it's a clash of worldviews. We're trying to build a free market for devices and applications," Libertelli says. "Carterfone is the way to bring the innovation of the Internet to the wireless market."

Fortunately, Kevin Martin, the Republican chairman of the FCC, seems to be listening. He has proposed adopting a wireless Carterfone rule for the 700 MHz band, and according to lobbyists who've spoken to him, he seems to be genuinely considering adopting a model to make wireless networks fully open. Gigi Sohn, who heads the public policy group Public Knowledge, says Martin understands the weight of his ruling -- that, like the Hush-A-Phone decision, it could change everything. "He knows that this is his legacy," Sohn says.

Ramblefish 'Pinnacle of Fashion' Series Presents


This handsome Pac-Man plush hat is available in adult and child sizes for $29.99. A perfect match for the Vibram Fivefingers.

W's Grandpa Planned Fascist Coup of USA


A BBC Radio 4 investigation sheds new light on a major subject that has received little historical attention: the conspiracy by a group of influential power brokers, led by Prescott Bush, to overthrow FDR and install a fascist dictatorship in the U.S. modeled on the ideology of Mussolini and Hitler.

The coup was aimed at toppling President Franklin D. Roosevelt with the help of half a million war veterans. The plotters, alleged to include some of the most famous families in America (the owners of Heinz, Birds Eye, Goodtea and Maxwell House, as well as George Bush's grandfather, Prescott), believed that their country should adopt the policies of Hitler and Mussolini to beat the Great Depression.

Mike Thomson investigates why so little is known about this, the biggest-ever peacetime threat to American democracy.

Tuesday, July 24, 2007


Click the picture to read Peanuts as if it had been written by Bukowski.