Tuesday, June 24, 2008

Editorial: Marriage shouldn't be government's concern
An Orange County Register editorial
Perhaps because leaders on both sides of the issue urged supporters to avoid undue disruption, the first few days of legal same-sex marriages in California went reasonably smoothly. An expensive and potentially acrimonious political tussle may be brewing behind the scenes, but both the celebrations and protests so far have been relatively low-key.
The uncertainty looming on the horizon is a measure on the November ballot that would declare that the state recognizes only marriages between a man and a woman. Proposition 22, passed in 2000, which declared that marriage is only between a man and a woman, was a statute. The California Supreme Court decided in May that the law violated state constitutional provisions guaranteeing equal protection of the laws to all. The measure in November is a constitutional amendment, so it would nullify the court's decision.
Our preference would be for the government not to be involved in marriage, the most fundamental of institutions in a civil society. Why two people who want to be married should be required to get a license from the state is something of a mystery. Marriage existed long before the California or U.S. governments came into being and will continue long after they have been consigned to history. Whether a marriage is valid should be up to the people involved and the churches, synagogues, mosques or other religious institutions that choose whether or not to perform them.
As a practical matter, however, the government has so entwined itself into our daily lives that state recognition is important. Filing taxes as a married couple or as individuals makes a difference, as does the ability to own real estate, make end-of-life decisions or adopt children. Considering all this and the importance of equality before the law, the high court's decision was justified.
It is argued that allowing same-sex marriage will infringe on the religious freedom of people who have a religiously based objection to it. It is hard to see the validity of that argument. Church and state are correctly separate in this country, and the fact that the state recognizes a union as a marriage doesn't mean that a religious person or institution has to recognize it or approve of it. It's hard to imagine a minister, rabbi or imam who objects to same-sex marriages being forced to perform one, and we would be the first to object if anybody tried it.
Over time, same-sex couples will find, as has been the case in Massachusetts, where such marriages have been legal for four years (and as heterosexual couples know all too well), that marriage is not always easy. Married people disagree about all kinds of things, from money to recreational preferences, and have to find ways to work out their differences.
The relatively smooth transition to allowing same-sex marriages may be the calm before the storm. Still, it's nice that it has been calm so far.

Sunday, June 15, 2008

Charging by the Byte to Curb Internet Traffic
By BRIAN STELTER
Some people use the Internet simply to check e-mail and look up phone numbers. Others are online all day, downloading big video and music files.
For years, both kinds of Web surfers have paid the same price for access. But now three of the country’s largest Internet service providers are threatening to clamp down on their most active subscribers by placing monthly limits on their online activity.
One of them, Time Warner Cable, began a trial of “Internet metering” in one Texas city early this month, asking customers to select a monthly plan and pay surcharges when they exceed their bandwidth limit. The idea is that people who use the network more heavily should pay more, the way they do for water, electricity, or, in many cases, cellphone minutes.
That same week, Comcast said that it would expand on a strategy it uses to manage Internet traffic: slowing down the connections of the heaviest users, so-called bandwidth hogs, at peak times.
AT&T also said Thursday that limits on heavy use were inevitable and that it was considering pricing based on data volume. “Based on current trends, total bandwidth in the AT&T network will increase by four times over the next three years,” the company said in a statement.
All three companies say that placing caps on broadband use will ensure fair access for all users.
Internet metering is a throwback to the days of dial-up service, but at a time when video and interactive games are becoming popular, the experiments could have huge implications for the future of the Web.
Millions of people are moving online to watch movies and television shows, play multiplayer video games and talk over videoconference with family and friends. And media companies are trying to get people to spend more time online: the Disneys and NBCs of the world keep adding television shows and movies to their Web sites, giving consumers convenient entertainment that soaks up a lot of bandwidth.
Moreover, companies with physical storefronts, like Blockbuster, are moving toward digital delivery of entertainment. And new distributors of online content — think YouTube — are relying on an open data spigot to make their business plans work.
Critics of the bandwidth limits say that metering and capping network use could hold back the inevitable convergence of television, computers and the Internet.
The Internet “is how we deliver our shows,” said Jim Louderback, chief executive of Revision3, a three-year-old media company that runs what it calls a television network on the Web. “If all of a sudden our viewers are worried about some sort of a broadband cap, they may think twice about downloading or watching our shows.”
Even if the caps are far above the average users’ consumption, their mere existence could cause users to reduce their time online. Just ask people who carefully monitor their monthly allotments of cellphone minutes and text messages.
“As soon as you put serious uncertainty as to cost on the table, people’s feeling of freedom to predict cost dries up and so does innovation and trying new applications,” Vint Cerf, the chief Internet evangelist for Google, who is often called the “father of the Internet,” said in an e-mail message.
But the companies imposing the caps say that their actions are only fair. People who use more network capacity should pay more, Time Warner argues. And Comcast says that people who use too much — like those who engage in file-sharing — should be forced to slow down.
Time Warner also frames the issue in financial terms: the broadband infrastructure needs to be improved, it says, and maybe metering could pay for the upgrades. So far its trial is limited to new subscribers in Beaumont, Tex., a city of roughly 110,000.
In that trial, new customers can buy plans with a 5-gigabyte cap, a 20-gigabyte cap or a 40-gigabyte cap. Prices for those plans range from $30 to $50. Above the cap, customers pay $1 a gigabyte. Plans with higher caps come with faster service.
“Average customers are way below the caps,” said Kevin Leddy, executive vice president for advanced technology at Time Warner Cable. “These caps give them years’ worth of growth before they’d ever pay any surcharges.”
Casual Internet users who merely send e-mail messages, check movie times and read the news are not likely to exceed the caps. But people who watch television shows on Hulu.com, rent movies on iTunes or play the multiplayer game Halo on Xbox may start to exceed the limits — and millions of people are already doing those things.
Streaming an hour of video on Hulu, which shows programs like “Saturday Night Live,” “Family Guy” and “The Daily Show With Jon Stewart,” consumes about 200 megabytes, or one-fifth of a gigabyte. A higher-quality hour of the same content bought through Apple’s iTunes store can use about 500 megabytes, or half a gigabyte.
A high-definition episode of “Survivor” on CBS.com can use up to a gigabyte, and a DVD-quality movie through Netflix’s new online service can eat up about five gigabytes. One Netflix download alone, in fact, could bring a user to the limit on the cheapest plan in Time Warner’s trial in Beaumont.
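A rough back-of-the-envelope sketch makes the arithmetic behind these figures concrete. The per-item sizes and the $1-a-gigabyte overage rate come from the article; the individual tier prices below are assumptions chosen within the reported $30-to-$50 range, and the usage profile is purely illustrative, not anything Time Warner has published.

```python
# Back-of-the-envelope estimate of monthly usage and overage charges under
# Time Warner's Beaumont trial, using the per-item figures cited above.
# The per-tier prices ($30/$40/$50) are illustrative assumptions; the article
# gives only the $30-$50 range and the $1-per-gigabyte overage rate.

CAPS_GB = {"basic": 5, "mid": 20, "top": 40}       # monthly caps, in gigabytes
TIER_PRICE = {"basic": 30, "mid": 40, "top": 50}   # assumed prices within the reported range
OVERAGE_PER_GB = 1.00                              # $1 per gigabyte above the cap

# Approximate sizes cited in the article, in gigabytes.
GB_PER_ITEM = {
    "hulu_hour": 0.2,         # one hour of Hulu streaming (~200 MB)
    "itunes_hour": 0.5,       # one higher-quality hour from iTunes (~500 MB)
    "hd_episode": 1.0,        # one HD episode on CBS.com (up to ~1 GB)
    "netflix_dvd_movie": 5.0, # one DVD-quality Netflix download (~5 GB)
}

def monthly_bill(tier: str, items: dict) -> tuple:
    """Return (gigabytes used, total monthly cost) for a usage profile.

    `items` maps an item name from GB_PER_ITEM to how many of that item
    the household consumes in a month.
    """
    used_gb = sum(GB_PER_ITEM[name] * count for name, count in items.items())
    overage_gb = max(0.0, used_gb - CAPS_GB[tier])
    cost = TIER_PRICE[tier] + overage_gb * OVERAGE_PER_GB
    return used_gb, cost

if __name__ == "__main__":
    # A household that streams one Hulu hour a night and rents four Netflix movies a month.
    profile = {"hulu_hour": 30, "netflix_dvd_movie": 4}
    for tier in ("basic", "mid", "top"):
        used, cost = monthly_bill(tier, profile)
        print(f"{tier:>5}: {used:.1f} GB used, ${cost:.2f} total")
    # basic: 26.0 GB used, $51.00 total  (21 GB over the 5 GB cap)
    #   mid: 26.0 GB used, $46.00 total  (6 GB over the 20 GB cap)
    #   top: 26.0 GB used, $50.00 total  (under the 40 GB cap)
```

The point of the sketch is simply that modest video habits blow past the 5-gigabyte tier almost immediately, while the 40-gigabyte tier leaves that same household comfortable headroom.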
Even services like Skype and Vonage that use the Internet to transmit phone calls could help put users over the monthly limits.
Time Warner would not reveal how many gigabytes an average customer uses, saying only that 95 percent of customers use under 40 gigabytes each in a month.
That means that 5 percent of customers use more than 50 percent of the network’s overall capacity, the company said, and many of those people are assumed to be sharing copyrighted video and music files illegally.
The Time Warner plan has the potential to bring Internet use full circle, back to the days when pay-as-you-go pricing held back the Web’s popularity. In the early days of dial-up access, America Online and other providers offered tiered pricing, in part because audio and video were barely viable online. Consumers feared going over their allotted time and bristled at the idea that access to cyberspace was billed by the hour.
In 1996, when AOL started offering unlimited access plans, Internet use took off and the online world started moving to the center of people’s daily lives. Today most Internet packages provide a seemingly unlimited amount of capacity, at least from the consumer’s perspective.
But like water and electricity, even digital resources are finite. Last year Comcast disclosed that it was temporarily turning off the connections of customers who used file-sharing services like BitTorrent, arguing that they were slowing things down for everyone else. The people who got cut off complained and asked how much broadband use was too much; the company did not have a ready answer.
Thus, like Time Warner, Comcast is considering a form of Internet metering that would apply to all online activity.
The goal, says Mitch Bowling, a senior vice president at Comcast, is “ensuring that a small number of users don’t impact the experience for everyone else.”
Last year Comcast was sued when it was disclosed that the company had singled out BitTorrent users.
In February, Comcast departed from that approach and started collaborating with the company that runs BitTorrent. Now it has shifted to what it calls a “platform agnostic” approach to managing its network, meaning that it slows down the connection of any customer who uses too much bandwidth at congested times.
Mr. Bowling said that “typical Internet usage” would not be affected. But on the Internet, “typical” use is constantly being redefined.
“The definitions of low and high usage today are meaningless, because the Internet’s going to grow, and nothing’s going to stop that,” said Eric Klinker, the chief technology officer of BitTorrent.
As the technology company Cisco put it in a recent report, “today’s ‘bandwidth hog’ is tomorrow’s average user.”
One result of these experiments is a tug-of-war between the Internet providers and media companies, which are monitoring the Time Warner experiment with trepidation.
“We hate it,” said a senior executive at a major media company, who requested anonymity because his company, like all broadcasters, must play nice with the same cable operators that are imposing the limits. Now that some television shows are viewed millions of times online, the executive said, any impediment would hurt the advertising model for online video streaming.
Mr. Leddy of Time Warner said that the media companies’ fears were overblown. If the company were to try to stop Web video, “we would not succeed,” he said. “We know how much capacity they’re going to need in the future, and we know what it’s going to cost. And today’s business model doesn’t pay for it very well.”

Wednesday, June 11, 2008

The Democrats' Reagan
By Bob Beckel
Question: Name the presidential candidate described below.
An unpopular incumbent president sits in the Oval Office. His party's brand is badly tarnished. The economy is in shambles, unemployment on the rise. The housing market is in crisis. Gasoline has become a major issue. America is enmeshed in a protracted crisis in the Middle East with no end in sight. We are near war footing with Iran. The reputation of the United States is diminished worldwide. In historically high numbers, voters believe the country is on the wrong track.
The opposition party has nominated a charismatic candidate for president whose oratorical skills are compared to JFK's, and perhaps exceed them. He had been introduced to the majority of Americans by way of a spellbinding keynote speech at a previous national party convention.
He has a fervent core of supporters and has emerged as the leader of his party through an insurgency that challenged and ultimately defeated his party's establishment. He runs against Washington and the special interests that control the Capitol. His message is change and hope.
If ever the public demanded change in Washington, it is in this presidential year. It could not be a better political environment for the party out of power. Yet with all the stars aligned perfectly for a party change in the White House, national polls show the opposition candidate barely ties, and often trails, his opponent.
There is little doubt about voters' desire for change, but there is plenty of doubt about this candidate who pledges to deliver it. Who is the candidate?
Answer: A) Barack Obama B) Ronald Reagan C) Both
The correct answer is C.
Barack Obama's current political circumstance is eerily similar to that of Ronald Reagan in his 1980 campaign for president. Both Obama and Reagan, from the beginning of their insurgent campaigns, were viewed as transformative political figures. Both enjoyed passionate grassroots support.
Both men had defeated centrist establishment candidates for their party's nomination. Reagan defeated George H.W. Bush, who was viewed by the growing conservative base of the Republican Party as too moderate. Obama beat Hillary Clinton, whose husband had been elected twice by moving away from his party's traditional progressive roots and running as a centrist, a path Clinton herself followed (at least at the beginning of her campaign).
In 1980 most conventional political observers failed to recognize the growing grassroots power of the rock-solid conservative activists who propelled Reagan to his party's nomination. In the 2008 presidential campaign, supporters of Hillary Clinton failed to recognize the growing assertiveness of the Democrats' progressive base, especially over the Iraq war, which she initially supported and Obama opposed.
The failures of the Bush Administration convinced many progressives that the conservative cycle, deep into its third decade, had run its course. These activists believed the country was ready to tack back toward more progressive and transparent government. Barack Obama recognized and embraced this growing progressive movement.
Obama's message that it was time to "turn the page" on politics as usual (a not very subtle reference to both the Bush and Clinton years) resonated with progressives. That message, coupled with his call for post-partisan, anti-polarization politics, so attractive to independent voters, provided Obama with a core of progressive activists along with a solid base of black voters and young voters energized by his youth and oratorical gifts.
But insurgency campaigns by definition run counter to the established order. Even in years when voters clamor for change, insurgent candidates must ensure that neither they nor the change they offer is perceived as too far from the mainstream. It is this potential fear that opponents of insurgent candidates seek to exploit.
For most of the general election in 1980, Democrats succeeded in raising doubts about Reagan's brand of conservatism. They charged that he was too far right and questioned his past conservative associations with the John Birch Society, which, like Reagan, had been strong supporters of Barry Goldwater in 1964. Democrats argued that Reagan's brand of virulent anti-Communism, coupled with his lack of foreign-policy experience, was a dangerous mixture for the man whose finger would be on the proverbial "button."
For most of the 1980 general election the attacks on Reagan raised enough doubt about him to neutralize the public's strong desire for change. I was managing Carter's campaign in Texas that fall, and even in that conservative bastion, Carter led Reagan in the polls until mid-October. Our strategy was simple: on a risk scale of 1 to 10 (1 being no risk, 10 being far too risky), we had managed to keep Reagan in the 7 to 8 range. Then came the only Carter/Reagan debate, and the floodgates opened.
On stage with the President of the United States, Reagan did not come across as a threatening mad bomber. He was collegial, surefooted, and calm. His performance shattered expectations that he was a risk, which allowed Reagan, at the end of the debate, to pivot to the state of the economy with his devastating question, "Are you better off now than you were four years ago?" Reagan was elected in a landslide and proceeded to transform politics in America well beyond his two terms.
The Republicans are employing the same "risk" strategy against Barack Obama in 2008. McCain and company have portrayed Obama's willingness to meet with avowed enemies of the United States, such as Iran, as a sign of naiveté and weakness. Republican operatives and their radio talk show allies have sought to tie Obama to the anti-American rants of his former pastor, Jeremiah Wright, and his neighbor William Ayers, a former '60s radical.
Republicans have even dragged Obama's wife Michelle into the fight. They cite her Princeton senior thesis, selected campaign comments, and Obama's failure to wear an American flag lapel pin as evidence of passive patriotism.
Democrats in 1980 charged that Reagan would rip apart the social safety net for the poor, while Republicans in 2008 accuse Obama of inciting class warfare and suggest as president he will undertake a classic liberal redistribution of wealth by increasing taxes on wealthy Americans and profitable US corporations.
It is incumbent on Obama to defuse the "risk" issue. In some ways his will be an easier job than Reagan's. Reagan ran against an incumbent president, always a difficult race, while Obama faces a 71-year-old Senate veteran. (McCain turns 72 on Aug. 29.) Reagan faced a president preoccupied with 52 American hostages in Iran, while Obama's opponent supports an unpopular war in Iraq that has already cost over 4,000 American lives.
The "risk" factor for insurgents can best be addressed in direct candidate to candidate debates. Insurgents tend to have low expectations in these matchups, and hence a greater upside potential. Ronald Reagan had only one debate opportunity to counter his "risk" problem. Obama is likely to have a minimum of three encounters with John McCain and potentially several other town hall joint appearances.
John McCain will not be irrelevant in these face-offs, but only Barack Obama can confront the question of risk. It is an enviable position for Barack Obama. Only he can win the race for the White House, and only he can lose it.
If Obama has proved one thing in his short political career, it is that he is far more likely to win than to lose.
Bob Beckel managed Walter Mondale’s 1984 presidential campaign. He is a senior political analyst for the Fox News Channel and a columnist for USA Today. Beckel is the co-author with Cal Thomas of the book "Common Ground."
Copyright 2008, Real Clear Politics

Wednesday, June 04, 2008

How Gene McCarthy's response to Bobby Kennedy's murder crippled the Democrats.
By David Greenberg (Slate)
Forty years ago, Robert F. Kennedy was murdered on the very night he defeated his fellow anti-war insurgent Eugene McCarthy in the California Democratic presidential primary. This week the news media are full of remembrances of RFK, rehearsing how his assassination, echoing his brother's five years earlier, dashed a generation's hopes for a new era of liberalism. But in a political season that resembles 1968, another aspect of the assassination is also worth considering, especially with the Democratic Party now seeking to unify its ranks. For in 1968, the persistence of intra-party divisions—which helped usher in the presidency of Richard M. Nixon—stemmed not just from the tragedy of Kennedy's murder but also from McCarthy's own subsequent failure of leadership. McCarthy's refusal to extend a hand to disoriented Kennedy supporters after June 6 left the party sundered, directionless, and ripe for defeat.
Eugene McCarthy never liked the Kennedys. At least since 1960, when he had placed Adlai Stevenson's name in nomination at the Democratic convention that chose JFK for president, the high-minded Minnesota senator had resented the hardball style and political success of the whole family. Understandably, he begrudged RFK's entry into the 1968 race. After all, back in November 1967, McCarthy had courageously challenged Lyndon B. Johnson, a sitting president, for the Democratic nomination, arguing that it was time to bring home the half-million Americans fighting in Vietnam. McCarthy's close second-place finish in the March 12 New Hampshire primary exposed Johnson's profound vulnerabilities. Only then did Kennedy—after some perfunctory soundings about a joint anti-war effort with McCarthy—throw his hat in the ring, quickly earning him treatment as a more plausible pretender to the nomination. McCarthy, who later claimed RFK had promised him he wouldn't run, was livid.
Two weeks later, LBJ forswore a second term. Anti-war Democrats rushed to align with one insurgent or the other. McCarthy won the intellectuals, the professionals, and the young, who, distancing themselves from their long-haired contemporaries, vowed to get "Clean for Gene." Kennedy attracted blue-collar, Hispanic, and black support. He complained that McCarthy got the "A" students, and he got the "B" students.
The primary battles were brutal, producing at least as much bad feeling as this year's. Against a backdrop of violent campus protests and the assassination of Martin Luther King Jr., McCarthy and Kennedy squared off in Indiana, Nebraska, Oregon, and California. (Not until 1972 did primaries become the dominant method of delegate selection.) Playing to his upscale base, McCarthy blasted Kennedy for having wiretapped King while attorney general. RFK, for his part, catered to the concerns of his new base—stressing, for example, his former credentials as "the chief law enforcement officer of the United States" in front of audiences worried about rising crime and urban riots. He also assailed McCarthy's previous opposition to a minimum-wage law and his allegedly weak civil rights record—enduring charges of being "ruthless" and dishonest in distorting his rival's record.
Even as McCarthy styled himself the clean politician, however, he dished it out, too. He mocked Kennedy and his supporters. A major gaffe occurred in Oregon, when McCarthy sniffed that Kennedy supporters were "less intelligent" than his own and belittled Indiana (which had by then gone for Kennedy) for lacking a poet of the stature of Robert Lowell—a friend of McCarthy's who often traveled with him. McCarthy also took swipes at Kennedy for chasing after black and white working-class votes.

More negativity infused a debate before the California primary. McCarthy made two ill-considered statements: that he would accept a coalition government that included Communists in Saigon and that only the relocation of inner-city blacks would solve the urban problem. Kennedy pounced, portraying the former idea as soft on communism and the latter diagnosis as a scheme to bus tens of thousands of ghetto residents into white, conservative Orange County. Angered at these characterizations, McCarthy resolved not to support Kennedy if he became the nominee.
By the time of Kennedy's murder, there was no love lost between the two men. Still, McCarthy's reaction to the assassination was singularly hardhearted. One aide recalled him sneering about his fallen rival, "Demagoguing to the last." Another heard him say that Kennedy "brought it on himself"—implying, by perverse logic, that because Kennedy had promised military support to the state of Israel, he had somehow provoked Sirhan Sirhan, the Arab-American gunman who killed him. (In fact, Sirhan had long planned to commit the murder on the first anniversary of the Six-Day War.)
Kennedy's death, of course, did not leave McCarthy alone in the race. All along, many party regulars had preferred Vice President Hubert Humphrey, who announced his candidacy in April but sat out the primaries, instead building his delegate base in states without primaries—which back then constituted a majority. Indeed, with Kennedy's assassination, many observers thought that front-runner status had devolved not to McCarthy but to Humphrey. Yet while McCarthy formally suspended his campaign in recognition of Kennedy's death, and although he proceeded to engage in various acts of willful self-sabotage, he nonetheless won a big victory in the June 18 New York primary and swept around the country in search of uncommitted delegates. Yet, stubbornly, he refused to make any gestures of reconciliation toward Kennedy's inner circle or his millions of supporters.
A few key Kennedy aides soon prevailed on Senator George McGovern to join the race as a kind of placeholder at the upcoming Chicago convention—a possible nominee but also a candidate for Kennedy's delegates to rally behind until a deal could be struck. The move, of course, also made clear to McCarthy that they hadn't forgiven his various digs at RFK during the primary season. Meanwhile, others started an informal "Draft Ted" movement to get the youngest Kennedy brother, then 36, to pick up the standard. Both ploys reflected a recognition that Humphrey, for all his delegates, still wasn't the inevitable nominee and that McCarthy's cache of several hundred delegates, when coupled with Kennedy's, might still produce an anti-war nominee.
For a moment it looked possible. In Chicago, Richard Goodwin—the former JFK aide who'd gone to work for McCarthy, switched to RFK, then returned to the McCarthy camp after the assassination—sent word to friends in the Kennedy camp that McCarthy wanted to talk. Privately, the senator told Kennedy in-law Steve Smith that he would be willing to step aside in favor of Ted. But even in concession, McCarthy couldn't be gracious. He told Smith that he would take such a step for Ted, but he wouldn't have done it for Bobby. The gratuitous jab killed any prospect of a deal. In his conversations with Humphrey, meanwhile, McCarthy insisted that he not choose Ted Kennedy as his running mate.
McCarthy made almost no efforts on his own behalf at the convention. In a debate with Humphrey and McGovern before the California delegation, he refused to state his position on the war, saying, "The people know my position." He didn't even speak during the convention's debate over what the platform would say about Vietnam. But when Humphrey got the nod, McCarthy suggested that, as the winner of the most primary votes, he had been robbed of the nomination. He didn't endorse Humphrey until Oct. 29, and even then he took swipes at the vice president for his stands on the war and the draft. Humphrey lost to Nixon by 0.7 percent of the popular vote, although Nixon took 301 electoral votes to Humphrey's 191.
Whether Robert Kennedy could have beaten Humphrey for the nomination is impossible to say. Certainly, it would have been hard. But following Kennedy's death, Gene McCarthy's willful aloofness and inability to bring unity to a party cleaved during a hard-fought primary season amounted to a second tragedy for the Democrats.

Monday, June 02, 2008

Yves Saint Laurent, Giant of Couture, Dies at 71
By ANNE-MARIE SCHIRO NY Times
Yves Saint Laurent, who exploded on the fashion scene in 1958 as the boy-wonder successor to Christian Dior and endured as one of the best-known and most influential couturiers of the second half of the 20th century, died on Sunday at his apartment in Paris. He was 71.
His death was confirmed by Dominique Deroche, a spokeswoman for the Pierre Bergé-Yves Saint Laurent Foundation.
During a career that ran from 1957 to 2002 he was largely responsible for changing the way modern women dress, putting them into pants both day and night, into peacoats and safari jackets, into “le smoking” (as the French call a man’s tuxedo jacket), and into leopard prints, trench coats and, for a time in the 1970s, peasant-inspired clothing in rich fabrics.
Mr. Saint Laurent often sought inspiration on the streets, bringing the Parisian beatnik style to couture runways and adapting the sailors’ peacoats he found in Army-Navy stores in New York into jackets that found their way into fashionable women’s wardrobes around the world. His glamorous evening clothes were often adorned with appliqués and beadwork inspired by artists like Picasso, Miró and Matisse. Above all, he was a master colorist, able to mix green, blue, rose and yellow in one outfit to achieve an effect that was artistic and never garish.
Among the women of style who wore his clothes were Catherine Deneuve, Paloma Picasso, Nan Kempner, Lauren Bacall, Marella Agnelli and Marie-Hélène de Rothschild.
Mr. Saint Laurent achieved instant fame in 1958 at the age of 21 when he showed his Trapeze collection, his first for Christian Dior following the master’s death. But unlike many overnight sensations, Mr. Saint Laurent managed to remain at the top of his profession as fashion changed from an emphasis on formal, custom-made haute couture to casual sportswear.
For many years after he opened his own couture house, in 1962, his collections were eagerly anticipated by fashion enthusiasts, who considered his the final word on that season’s style. His influence was at its height during the 1960s and ’70s, when it was still normal for couturiers to change silhouettes and hemlines drastically every six months.
Among his greatest successes were his Mondrian collection in 1965, based on the Dutch artist’s gridlike paintings, and the “rich peasant” collection of 1976, which stirred so much interest that the Paris show was restaged in New York for his American admirers. “The clothes incorporated all my dreams,” he said after the show, “all my heroines in the novels, the operas, the paintings. It was my heart — everything I love that I gave to this collection.”
Originally a maverick and a generator of controversy — in 1968, his suggestion that women wear pants as an everyday uniform was considered revolutionary — Mr. Saint Laurent developed into a more conservative designer, a believer in evolution rather than revolution. He often said that all a woman needed to be fashionable was a pair of pants, a sweater and a raincoat.
“My small job as a couturier,” he once said, “is to make clothes that reflect our times. I’m convinced women want to wear pants.”
A Rare Retrospective
By 1983, when he was 47, his work was recognized by fashion scholars as so fundamentally important to women’s dress that a retrospective of his designs was held at the Costume Institute of the Metropolitan Museum of Art, the first time the museum had honored a living designer. Diana Vreeland, the legendary magazine editor and the doyenne of the Costume Institute, who masterminded the exhibition, called him “a living genius” and “the Pied Piper of fashion.”
“Whatever he does,” she said, “women of all ages, from all over the world, follow.” That exhibition was followed by retrospectives in Paris, Beijing, Moscow, St. Petersburg, Tokyo and Sydney, Australia.
But the New York exhibition could be considered the peak of Mr. Saint Laurent’s career, for after that he settled into a classical mode of reinterpreting his earlier successes. The boy wonder had turned into the elder statesman. He said in an interview in 1983: “A woman’s wardrobe shouldn’t change every six months. You should be able to use the pieces you already own and add to them. Because they are like timeless classics.”
Yet because so many of his early designs seeped into the public domain of fashion (and into many other designers’ collections), he managed to retain his stellar position in the world of fashion through his retirement in 2002.
Yves Henri Donat Mathieu-Saint-Laurent came a long way from Oran, Algeria, where he was born on Aug. 1, 1936, to Charles and Lucienne Andrée Mathieu-Saint-Laurent. His father was a lawyer and insurance broker, his mother a woman of great personal style. He grew up in a villa by the Mediterranean with his two younger sisters, Michelle and Brigitte.
His mother and sisters, all of Paris, survive him.
The young Yves was said to be a quiet and retiring child (and as an adult was also often described as quiet and retiring), who avoided all sports except swimming and developed a love for fashion and the theater at an early age. After seeing a play by Molière when he was 11, he recreated the play in miniature, pasting the costumes together. As a teenager, he designed clothes for his mother, who had them whipped up by a local seamstress. (His mother became his greatest fan, sitting in the front row at all his shows and wearing no one else’s designs.)
Although his parents wanted him to study law, Mr. Saint Laurent — lanky and brown-haired, his blue eyes framed by glasses — went to Paris when he was 17 to try his luck in theatrical and fashion design. He briefly studied design at the Chambre Syndicale de la Couture, leaving because he said he was bored. Shortly thereafter, he won first prize in an International Wool Secretariat design competition for his sketch of a cocktail dress. This led to an interview with Christian Dior, who noted an uncanny resemblance between Mr. Saint Laurent’s cocktail dress and one he himself was working on. Recognizing the young designer’s talent, Dior hired him on the spot as his assistant.
Dior’s Protégé
For three years, Mr. Saint Laurent worked closely with Dior, who called him “my dauphin” and “my right arm.” After Dior died suddenly in 1957, shocking the fashion world, the House of Dior named Mr. Saint Laurent its head designer. At 21, he found himself at the head of a $20-million-a-year fashion empire, succeeding a legend, the man who had radically changed the way women dressed in 1947 with the wasp-waisted New Look.
Mr. Saint Laurent’s first collection in his new position, shown on Jan. 30, 1958, was based on the trapeze, a youthful silhouette that started with narrow shoulders and a raised waistline, then flared out gently to a wide hemline. The collection was received with great enthusiasm, and Mr. Saint Laurent’s name was well on its way to becoming a household word across Europe and America.
He was credited by many with rejuvenating French fashion and securing his country’s pre-eminent position in the world of haute couture. Newsboys shouted his triumph across the streets of Paris while he waved to the crowds below the balcony of the House of Dior on the Avenue Montaigne. The dauphin was crowned king.
His last collection for Dior, in July 1960, was based on a “chic beatnik” look of knitted turtlenecks and black leather jackets. It was less warmly received, though eventually the style became the uniform of the avant-garde.
In September of that year, Mr. Saint Laurent was called up for 27 months of compulsory military service during the war France was then fighting in Algeria. He had previously been given deferments because 2,000 jobs depended on his talent.
About three weeks after his induction, he was hospitalized for a nervous collapse. In October 1960, the House of Dior gave his job to Marc Bohan, his former assistant. In November, Mr. Saint Laurent was discharged from the army and entered a private clinic near Paris. In later years, he suffered from depression and a dependency on alcohol and drugs, a dependency he attributed to the drugs he was given in a military psychiatric hospital. But he almost always recovered in time to take the ritual walk down the runway, however unsteadily, at the finale of his shows.
In January 1961, Mr. Bohan’s collection for Dior was a huge success. Mr. Saint Laurent sued Dior for severance pay and damages after the house refused to reinstate him after his army discharge. He was awarded 680,000 francs by the court, then about $140,000.
In September 1961, Mr. Saint Laurent announced plans to open his own haute couture house in partnership with his lover, Pierre Bergé. Mr. Bergé remained his lifelong business partner and was responsible for the company’s financial success, although they split up as a couple in the early 1980s. The fledgling house was backed by J. Mack Robinson, an Atlanta businessman, who later said his confidence was based on the excitement Mr. Saint Laurent had created when he replaced Dior.
His Own Collection
The first Yves Saint Laurent collection was shown on Jan. 19, 1962. It was the beginning of a success story that led eventually to a ready-to-wear line sold in the designer’s own Rive Gauche boutiques around the world; to hundreds of licenses for scarves, jewelry, furs, shoes, men’s wear, cosmetics and perfumes, and even cigarettes; to set and costume designs for the ballet, theater and movies (most notably, for Catherine Deneuve in “Belle de Jour” in 1967); to a listing on the Paris Bourse, and to a host of awards, including the French Legion of Honor in 1985.
The House of Saint Laurent had various owners over the years, including Lanvin-Charles of the Ritz and Squibb-Beech Nut. In 1993, in a $636 million transaction, it became part of the state-owned French pharmaceuticals conglomerate Elf Sanofi, but 43 percent of the fashion group remained in the hands of Mr. Bergé and Mr. Saint Laurent. In 2000, Gucci Group bought the ready-to-wear and fragrance divisions of the company, while Mr. Bergé and Mr. Saint Laurent retained the haute couture business until the designer’s retirement. Under Gucci, to Mr. Saint Laurent’s vocal displeasure, the YSL ready-to-wear line was designed by the American fashion star Tom Ford.
“The poor guy does what he can,” Mr. Saint Laurent said of his successor.
Mr. Ford, who simultaneously designed the Gucci and Yves Saint Laurent collections with an overtly racy and sexualized aesthetic during those years, left the company in 2003; the Yves Saint Laurent collections have since been designed by one of Mr. Ford’s former assistants, Stefano Pilati.
In January 2002, Mr. Saint Laurent announced his retirement in Paris at a press conference at his couture house at 5 Avenue Marceau, where many fashion editors and teary-eyed friends of the house considered the possibility that Mr. Saint Laurent had felt pressured to resign. He and Mr. Bergé denied that, and a week later announced plans to turn the house into a museum, which has since displayed exhibitions of Mr. Saint Laurent’s tuxedo jackets and the clothes he designed for Ms. Kempner.
‘Opium’ Wars
The designer, of course, managed several times to create controversy during his career with, of all things, his fragrances. In 1971, he appeared nude in an advertisement for his men’s cologne YSL. Then, in 1977, he named one of his women’s perfumes Opium, which led to charges that he was glamorizing drug use and trivializing the 19th-century Opium Wars in China. Its slogan was “Opium, for those who are addicted to Yves Saint Laurent.” In 1992, his plans to call another perfume Champagne prompted a lawsuit by French wine makers (the Saint Laurent company lost).
In another legal battle, Mr. Saint Laurent won a 1994 suit in the French courts against
Ralph Lauren, whom he accused of copying the design for his tuxedo dress (a style Mr. Saint Laurent reinterpreted many times over the years).
In 1992, a celebration at the Bastille Opera in Paris of the 30th anniversary of the House of Saint Laurent was attended by 2,750 admirers who applauded as 100 models took to the stage in clothes from the three decades. Writing about the event in The New York Times, Bernadine Morris said, “What was wondrous about these clothes, besides their breathtaking beauty, was that nothing looked dated.”
As befitted his success, Mr. Saint Laurent lived elegantly. All his homes — including famous ones in Deauville, France, and Marrakech, Morocco — which he shared with a succession of French bulldogs, always named Moujik, were lavishly decorated and filled with antiques and artwork by his favorite artists, who included Picasso, Cocteau, Braque and Christian Bérard. He often said that Bérard was one of the greatest influences on his designs, particularly in the use of color.
“Every man needs aesthetic phantoms in order to exist,” Mr. Saint Laurent said at the announcement of his retirement. “I have known fear and the terrors of solitude. I have known those fair-weather friends we call tranquilizers and drugs. I have known the prison of depression and the confinement of hospital. But one day, I was able to come through all of that, dazzled yet sober.”

Tuesday, May 13, 2008

A Drastic Remedy
The case for intervention in Burma.

By Anne Applebaum (SLATE)
They are "cruel, power hungry and dangerously irrational," in the words of one British journalist. They are "violent and irrational" according to a journalist in neighboring Thailand. Our own State Department leadership has condemned their "xenophobic, ever more irrational policies."
On the evidence of the last few days alone, those are all perfectly accurate descriptions. But in one very narrow sense, the cruel, power-hungry, violent, and xenophobic generals who run Burma are not irrational at all: Given their own most urgent goal—to maintain power at all costs—their reluctance to accept international aid in the wake of a devastating cyclone makes perfect sense. It's straightforward, as the Washington Post's Fred Hiatt put it Monday: "The junta cares about its own survival, not the survival of its people." Thus, the death toll is thought to have reached 100,000, a further 1.5 million Burmese are now at risk of epidemics and starvation, parts of the country are still underwater, hundreds of thousands of people are camped in the open without food or clean water—and, yes, if foreigners come and distribute aid, the legitimacy of the regime might be threatened.
Especially foreigners in large numbers, using high-tech vehicles that don't exist in Burma, distributing cartons of rice marked "Made in the USA" or even "UNDP." All natural disasters—from the Armenian earthquake, which helped bring down the Soviet Union, to Hurricane Katrina, which damaged the Bush administration—have profound political implications, as do the aid efforts that follow them, and the Burmese generals clearly know it.
Hence the "logic" of the regime's behavior in the days since the cyclone: the impounding of airplanes full of food; the refusal to grant visas to relief workers and landing rights to foreign aircraft; the initial refusal to allow the U.S. military (or indeed any foreign military) to supply the ships, planes, and helicopters needed for the mass distribution of food and supplies that Burma needs. Nor is this simple anti-Western paranoia: The foreign minister of Thailand has been kept out, too. Even Burmese citizens have been prevented from bringing food to the flood-damaged regions, on the grounds that "all assistance must be channeled through the military."
The result: Aid organizations that have staff on the ground are talking about the hundreds of thousands of homeless Burmese who may soon begin dying of cholera, diarrhea, and other diseases. This isn't logic by our standards, but it is logic by theirs. Which is why we have to assume that the regime's fear of foreign relief workers could even increase as the crisis grows, threatening them further.
If we fail to persuade the junta to relent soon—despite what I hope are assurances that Oxfam, Médecins Sans Frontières, and the American military will bring only food, not regime change, much as we all might like to see it—then we have to start considering alternatives. According to some accounts, the U.S. military is already looking at a range of options, including helicopter food deliveries from offshore ships, or convoys from across the Thai border. The U.S. government should be looking at wider diplomatic options, too. The U.N. Security Council has already refused to take greater responsibility for Burma—China won't allow the sovereignty of its protectorate to be threatened, even at the price of hundreds of thousands of lives—but there is no need to act alone. In fact, it would be a grave error to do so, since anything resembling a foreign "invasion" might provoke military resistance.
Unfortunately, the phrase "coalition of the willing" is tainted forever—once again proving that the damage done by the Iraq war goes far beyond the Iraqi borders—but a coalition of the willing is exactly what we need. The French—whose foreign minister, Bernard Kouchner, was himself a co-founder of Médecins Sans Frontières—are already talking about finding alternative ways of delivering aid. Others in Europe and Asia might join in, along with some aid organizations. The Chinese should be embarrassed into contributing, asked again and again to help. This is their satrapy, after all, not ours.
Think of it as the true test of the Western humanitarian impulse: The international effort that went into coordinating the tsunami relief effort in late 2004 has to be repeated, but in much harsher, trickier, uglier political circumstances. Yes, we should help the Burmese, even against the will of their irrational leaders. Yes, we should think hard about the right way to do it. And, yes, there isn't much time to ruminate about any of this.
Anne Applebaum is a Washington Post and Slate columnist. Her most recent book is Gulag: A History.
Article URL: http://www.slate.com/id/2191196/
Copyright 2008 Washingtonpost.Newsweek Interactive Co. LLC.

Friday, April 25, 2008

Masturbation 'cuts cancer risk'
Men could reduce their risk of developing prostate cancer through regular masturbation, researchers suggest.
They say cancer-causing chemicals could build up in the prostate if men do not ejaculate regularly.
And they say sexual intercourse may not have the same protective effect because of the possibility of contracting a sexually transmitted infection, which could increase men's cancer risk.
Australian researchers questioned over 1,000 men who had developed prostate cancer and 1,250 who had not about their sexual habits.
They found those who had ejaculated the most between the ages of 20 and 50 were the least likely to develop the cancer.
The protective effect was greatest while the men were in their 20s.
Men who ejaculated more than five times a week were a third less likely to develop prostate cancer later in life.
Fluid
Previous research has suggested that a high number of sexual partners or a high level of sexual activity increased a man's risk of developing prostate cancer by up to 40%.
But the Australian researchers who carried out this study suggest the early work missed the protective effect of ejaculation because it focussed on sexual intercourse, with its associated risk of STIs.
Graham Giles, of the Cancer Council Victoria in Melbourne, who led the research team, told New Scientist: "Had we been able to remove ejaculations associated with sexual intercourse, there should have been an even stronger protective effect of ejaculations."
The researchers suggest that ejaculating may prevent carcinogens accumulating in the prostate gland.
The prostate secretes a fluid into semen during ejaculation that activates sperm and prevents them from sticking together.
The fluid has high concentrations of substances including potassium, zinc, fructose and citric acid, which are drawn from the bloodstream.
But animal studies have shown carcinogens such as 3-methylcholanthrene, found in cigarette smoke, are also concentrated in the prostate.
'Flushing out'
Dr Giles said fewer ejaculations may mean the carcinogens build up.
"It's a prostatic stagnation hypothesis. The more you flush the ducts out, the less there is to hang around and damage the cells that line them."
A similar connection has been found between breast cancer and breastfeeding, where lactating appeared to "flush out" carcinogens, reducing a woman's risk of the disease, New Scientist reports.
Another theory put forward by the researchers is that ejaculation may induce prostate glands to mature fully, making them less susceptible to carcinogens.
Dr Chris Hiley, head of policy and research at the UK's Prostate Cancer Charity, told BBC News Online: "This is a plausible theory."
She added: "In the same way the human papillomavirus has been linked to cervical cancer, there is a suggestion that bits of prostate cancer may be related to a sexually transmitted infection earlier in life."
Anthony Smith, deputy director of the Australian Research Centre in Sex, Health and Society at La Trobe University in Melbourne, said the research could affect the kind of lifestyle advice doctors give to patients.
"Masturbation is part of people's sexual repertoire.
"If these findings hold up, then it's perfectly reasonable that men should be encouraged to masturbate," he said.
Story from BBC NEWS:

Thursday, April 24, 2008

Penis theft panic hits city
Reuters
KINSHASA - Police in Congo have arrested 13 suspected sorcerers accused of using black magic to steal or shrink men's penises after a wave of panic and attempted lynchings triggered by the alleged witchcraft.
Reports of so-called penis snatching are not uncommon in West Africa, where belief in traditional religions and witchcraft remains widespread, and where ritual killings to obtain blood or body parts still occur.
Rumours of penis theft began circulating last week in Kinshasa, Democratic Republic of Congo's sprawling capital of some 8 million inhabitants. They quickly dominated radio call-in shows, with listeners advised to beware of fellow passengers in communal taxis wearing gold rings.
Purported victims, 14 of whom were also detained by police, claimed that sorcerers simply touched them to make their genitals shrink or disappear, in what some residents said was an attempt to extort cash with the promise of a cure.
"You just have to be accused of that, and people come after you. We've had a number of attempted lynchings. ... You see them covered in marks after being beaten," Kinshasa's police chief, Jean-Dieudonne Oleko, told Reuters.
Police arrested the accused sorcerers and their victims in an effort to avoid the sort of bloodshed seen in Ghana a decade ago, when 12 suspected penis snatchers were beaten to death by angry mobs. The 27 men have since been released.
"I'm tempted to say it's one huge joke," Oleko said.
"But when you try to tell the victims that their penises are still there, they tell you that it's become tiny or that they've become impotent. To that I tell them, 'How do you know if you haven't gone home and tried it'," he said.
Some Kinshasa residents accuse a separatist sect from nearby Bas-Congo province of being behind the witchcraft in revenge for a recent government crackdown on its members.
"It's real. Just yesterday here, there was a man who was a victim. We saw. What was left was tiny," said 29-year-old Alain Kalala, who sells phone credits near a Kinshasa police station.
© Reuters 2008

Sunday, April 20, 2008

Midcentury-modern buildings in Dallas attract preservationists
By DAVID FLICK / The Dallas Morning News dflick@dallasnews.com
Susan Risinger fell in love with her midcentury-modern house five years ago, well before it was discovered by SpongeBob SquarePants.
Photo: Susan Risinger, with son William, 6, was drawn to the midcentury-modern architecture of the Midway Hills neighborhood in northwest Dallas. (Cheryl Diaz Meyer/DMN)
When Ms. Risinger and her family moved to Dallas from New York, she knew she wanted something very much like the 1950s-era house in the Midway Hills neighborhood of northwest Dallas.
"I don't really care for the more classical look," she said. "I like the clean lines, the architecture, the windows of midcentury houses."
Not only did Ms. Risinger find the house she coveted, she now lives in an outstanding example of an architectural design that is the new front line in the battle for historic preservation.
In the past few years, the midcentury-modern style has begun attracting preservationists' efforts nationwide. For a variety of reasons – including the city's affinity for teardowns – one of the movement's epicenters is Dallas.
While Preservation Dallas, the city's leading organization dedicated to the conservation of historic structures, is linked in the public mind with protecting Victorian homes and ornate skyscrapers, its most recent battles have centered on buildings from the 1950s and '60s.
In just the past few weeks, for example, the group:
•Obtained national historic status for the 3525 Turtle Creek condominiums, built in 1956.
•Persuaded the City Plan Commission to deny a rezoning request that would have doomed a 1959 insurance office building near Oak Lawn.
•Honored developers who converted the old Fidelity Union Life Towers, built in 1952 and 1959, into condos.
The organization's officers, meanwhile, have declared saving the downtown Statler Hilton Hotel building, constructed in 1958, its highest priority.
The sudden interest may come as a surprise to baby boomers who grew up in the 1950s and '60s. For them, a structure from that era may seem less like an architectural treasure and more like the building where they went to the dentist.
"It's always part of the job of preservation organizations to sell the public on buildings from a time period that is coming of age," said Katherine Seale, executive director of Preservation Dallas.
"When people first began to work to save Victorian homes, a lot of people thought they were pretty ugly."
The interest is, in one sense, inevitable, given the simple passage of time. Some postwar architecture is now half a century old – an age that typically transforms a building from "outdated" to "historic."
"There's no rhyme or reason for 50 years to be the benchmark, but the feeling is that after five decades, we begin to have a clearer view of an era," Ms. Seale said.
"We're coming out of the fog and we can look at it with more distance."
The National Trust for Historic Preservation is widely credited with nurturing the reawakening, although its earliest pronouncements a few years ago were almost defensive. The group noted then that some of the buildings were not even 50 years old and thus were not a part of history, but of the "recent past."
Dallas author Virginia McAlester is updating her book, A Field Guide to American Houses, to include midcentury architecture.
"You're starting to see people on the cutting edge of the arts, the tastemakers, who are into preserving midcentury architecture," she said.
"It just hasn't caught up to the rest of us yet. When you talk about preserving a typical ranch house, people say, 'Oh, well, I grew up in a house like that.' "
But another generation sees it differently.
"I think the clean modern lines are still very popular, and it's got a retro look that's attractive to younger people," Ms. Seale said. "It's got a bit of an edge, a bit of fun to it."
Ms. Risinger's house on Pinocchio Drive, for example, had exactly the feel sought by the makers of a commercial for a SpongeBob SquarePants board game.
"They filmed it here because they said they liked the 1950s look," she said. "Whenever we see it on television, we can really see it's our place. My kids love it."
Dallas is a natural center for midcentury architecture, which thrived during the 1950s and '60s. Subdivisions and commercial buildings were built by the thousands throughout the country during that period, but particularly in booming Sun Belt cities like Dallas.
And, in many cases, they were better than the work of previous generations. The earliest structures in Western cities like Dallas were usually designed by local architects, or by builders with no formal architectural training at all.
But the city's postwar wealth changed that.
"After the war, they were able to hire architects that were nationally and even internationally known," Ms. Seale said.
The very abundance of midcentury-modern buildings creates its own challenges for preservationists. To many people, efforts to protect ranch houses in North Texas sound like an attempt to declare ants an endangered species.
Furthermore, the purposely unadorned look makes midcentury houses and buildings more difficult to love.
"Because they're not eclectic like classical or Tudor or French, it's hard to contemplate midcentury as a distinctive style, but it is," said Willis Winters, a Dallas parks department assistant director who has written several books on local architecture.
"A lot of people see them as throwaways and the first targets for teardowns."
Over the past few years, Mr. Winters cataloged 400 houses for a new book that he co-wrote, Great American Suburbs: Houses of the Park Cities, Texas. Although it is still months before the book's publication date, he said, 20 percent of the houses have since been torn down.
The destruction of midcentury houses and buildings follows a pattern familiar to preservationists – a common and unappreciated architectural style is threatened by new construction. But the threat creates an opportunity.
For one thing, tearing down a building triggers a new appreciation for what is lost. Absence, as they say, makes the heart grow fonder. And it can spur political action.
"Nothing starts up a neighborhood movement like a teardown," Ms. Seale said.
The attrition also allows preservationists to be pickier about what they seek to save.
"Something doesn't qualify as historic just because it's old," Ms. Seale said. "There are a lot of criteria we look at."
In the case of midcentury houses, the appearance of the entire neighborhood is taken into account, as well as the quality of the planning and construction.
Two postwar neighborhoods are considered standouts by local preservationists – Midway Hills in northwest Dallas and Wynnewood North in Oak Cliff, Ms. Seale said.
Both have well-constructed, interesting examples of midcentury-modern styles, and both have been relatively untroubled by teardowns. Still, there are conservation efforts afoot in both places.
The so-called Disney Streets area, where Ms. Risinger lives, is a particular favorite of local preservationists. It was the site of the Dallas Parade of Homes in 1954 and 1955, in which builders showcased the latest in residential styles and technology.
The local home show was among the largest and most popular in the country, attracting up to 100,000 visitors.
In Midway Hills, Jacqueline Ziff is living in a showcased house her late husband bought in 1962, when it was still considered cutting edge. She was surprised when a real estate agent told her recently that it was coming back into style.
"She cautioned me not to do a lot with it," she said. "It has a pink-tile bathroom that I was considering updating. But I guess I won't now."

Thursday, April 17, 2008

Mark A. LeVine
War Crimes are Only the Beginning (From HNN)
[Mr. LeVine is professor of modern Middle Eastern history, culture, and Islamic studies at the University of California, Irvine, and author of the forthcoming book, Heavy Metal Islam.]
As reported by Jason Linkins in the Huffington Post over the weekend (http://www.huffingtonpost.com), the release of the full text of the so-called "Torture Memo," written by UC Berkeley law professor John Yoo when he was an assistant attorney general, has fired up even conservative commentators such as Andrew Sullivan. On the Chris Matthews show Sunday, Sullivan declared it a sure bet that at some point Donald Rumsfeld, David Addington and John Yoo would be indicted for war crimes.
For anyone who's traveled unembedded through Iraq since the US invasion and occupation began five years ago, Sullivan's warning brings a sad smile of recognition, and a hope that his words will prove prophetic. Even if all three men were indicted, it would only be the tip of the iceberg in addressing the issue of war crimes in Iraq. In fact, the continued controversy over the torture memo and its justification of waterboarding and other illegal interrogation techniques, which have been performed on at most a dozen or so detainees, obscures the far more systematic war crimes that have constituted the everyday reality of the occupation.

Indeed, from the first day of the invasion, war crimes have been the currency of US military activities across Iraq. When I was in the country one year into the invasion, I counted dozens of violations just in my travels and discussions with Iraqi doctors, activists and government personnel, which together made the occupation one giant war crime. All were a direct violation of the obligation of the United States and other members of the coalition, under UN Security Council Resolution 1483 of May 22, 2003, to "promote the welfare of the Iraqi people through the effective administration of the territory, including in particular working towards the restoration of conditions of security and stability and the creation of conditions in which the Iraqi people can freely determine their own political future." More broadly, the resolution also called upon the coalition to "comply fully with their obligations under international law." That is, US troops were obligated to assure humane treatment for the civilian population (Article 27 of the 4th Geneva Convention), to permit life in Iraq to continue without being affected by their presence, and to ensure the public order, safety and welfare of the population, from providing for basic food and clothing needs to health care, provisions that are, according to Articles 68 and 69 of Protocol 1 of the Geneva Conventions (which the U.S. accepts as customary international law even though it has not signed the Protocol), "essential to the survival of the civilian population." As an April 2004 report by Amnesty International on the human rights situation in Iraq made clear
(http://www.amnestyusa.org): "Under international humanitarian law, as occupying powers it was their duty to maintain and restore public order, and provide food, medical care and relief assistance. They failed in this duty, with the result that millions of Iraqis faced grave threats to their health and safety."

With each death due to a decrepit health care system that could have been fixed with modest inputs of money, supplies and effort, each purposeful shooting of an ambulance (http://news.bbc.co.uk), and each prevention or delay of medical care (http://www.guardian.co.uk), as happened during the fighting in Fallujah and on numerous other occasions, the U.S. crosses the line between "merely" violating international humanitarian law (specifically articles 17 through 19 of the 4th Geneva Convention) and the commission of actual war crimes, that is, grave breaches of the 4th Geneva Convention, which article 147 describes as including "willful killing, torture or inhuman treatment, including... willfully causing great suffering or serious injury to body or health, unlawful deportation or transfer or unlawful confinement of a protected person... or willfully depriving a protected person of the rights of fair and regular trial... taking of hostages and extensive destruction and appropriation of property, not justified by military necessity and carried out unlawfully and wantonly."

In sum, whether it's been the killing of tens of thousands of civilians (if not more) by US forces or the torture of a relative few in prisons such as Abu Ghraib, the issue has never been one of soldiers exceeding their authority. It's an issue of the Commander in Chief of the United States armed forces, along with his top commanders and officials, being responsible for a military system that, once unleashed, cannot but commit systematic violations of humanitarian law. Indeed, only weeks after the occupation began, a group of Belgian doctors who'd spent the previous year in Baghdad explained that whatever crimes might be committed by Iraqis, as the internationally recognized belligerent occupiers "the current humanitarian catastrophe is entirely and solely the responsibility of the US and British authorities" (http://www.globalresearch.ca). Even that early in the occupation they documented violations of at least a dozen articles of the 4th Geneva Convention by Coalition forces (including articles 10, 12, 15, 21, 35, 36, 41, 45, 47, 48, 51 and 55).

What's most surprising, given the clear evidence of such systematic war crimes and the line of responsibility leading directly to the President, is that the peace movement has been almost completely silent on the issue of bringing the perpetrators of these crimes to justice in a court of law. Indeed, aside from Code Pink, which has always been at the vanguard of protesting the Iraq war (in good measure because, unlike most other mainstream peace groups, its leaders have actually visited Iraq since the occupation began), no member of the anti-war coalition has dedicated even a modicum of time or energy to pushing an agenda of indictment, rather than the politically unimaginable impeachment, of the President and his senior aides for the crimes committed in Iraq and Afghanistan. This has been a strategic disaster.
Unless Americans are forced to confront just how systematic the abuses committed by our troops have been (for more evidence, see the powerful documentary "The Ground Truth" by Patricia Foulkrod, at thegroundtruth.org), they will continue to imagine that the main problem in Iraq is one of incompetence or bad management. The reality is that Iraq has gone more or less exactly as the Bush Administration hoped it would: the United States, after illegally invading a UN member state, an act which was itself a crime against humanity because it violated the paramount law of the United Nations against "breaches of the peace and acts of aggression," has, in good imperialist fashion, managed to turn the occupied population against each other and generate enough chaos and violence to ensure that Iraq's leadership cannot ask it to leave.

Need proof of how well this strategy has worked? Recall President Bush's blithe 2005 statement that the US was ready to leave "if the Iraqi government asks us to" (http://www.nytimes.com). Of course, the point is that they can't ask us to leave. And we're not leaving any time soon, no matter who the next American President is, as the recent leak of the "secret plan" to ensure a long-term US presence in Iraq, reported in the Guardian yesterday (http://www.guardian.co.uk), makes clear. Of course, this plan was never a secret; the miles-long construction convoys making their way across Iraq even in the first year, building the new bases, told anyone who wanted to listen what the long-term US plan for Iraq was.

Viewed from this perspective, the hullabaloo over the official release of the infamous "torture memo," whose main points have long been publicly known, will do nothing to change the fundamental political dynamics surrounding the unending US occupation of Iraq. It might even help perpetuate them by shifting focus away from the even more damning evidence of systematic and large-scale war crimes and crimes against humanity at the core of the US occupation, crimes in which all Americans, having reelected President Bush after ample evidence of them was available to consider, are complicit.

That shift in focus is certainly responsible for the fact that, despite the ongoing disaster, recent polls indicate a majority of Americans consider John McCain, one of the biggest boosters of, and most ill-informed commentators on, the occupation, better equipped to handle Iraq and the larger war on terror than either of his Democratic challengers (http://www.alternet.org). That might seem astonishing; but if the issue is framed as one of better management or prosecution of the occupation, rather than the immorality and illegality of the war and occupation itself, there is a logic to Americans assuming that a war hero can do a better job than opponents who have never been in battle.

All hope is not lost, however. If Barack Obama can make a speech about Iraq with the same level of honesty and power as his recent speech about race, there is a chance that he can change the perception of Democrats as being unable to manage the country through a quagmire most Americans seem instinctively to assume is not going to end any time soon.
Perhaps he might even get hundreds of thousands of people regularly into the streets to bring the troops home, in the absence of which the occupation might well continue, as McCain seems to hope, for "100 years." In the meantime, would it be too much to ask UC Berkeley to fire John Yoo for encouraging the commission of war crimes, and for gross ethical and intellectual incompetence (http://hnn.us)?

Tuesday, April 15, 2008

Instant Digital Prints (and Polaroid Nostalgia)
By ANNE EISENBERG (NY TIMES)
MILLIONS of families once snapped Polaroid photographs and enjoyed passing around the newly minted prints on the spot, instead of waiting a week for them to be developed.
Now, Polaroid wants to conjure up those golden analog days of vast sales and instant gratification — this time with images captured by digital cameras and camera phones.
This fall, the company expects to market a hand-size printer that produces color snapshots in about 30 seconds.
Beam a photograph from a cellphone to the printer and, with a gentle purr, out comes the full-color print — completely formed and dry to the touch.
The printer, which connects wirelessly by Bluetooth to phones and by cable to cameras, will cost about $150. The images are 2 inches by 3 inches, the size of a credit card. The new printers are so lightweight that a Polaroid executive demonstrating them recently had three tucked unnoticeably into various pockets of his trim jacket, whipping them out as if he were Harpo Marx.
The printer opens like a compact with a neat, satisfying click. Inside, no cartridges or toner take up space. Instead, there is a computer chip, a 2-inch-long thermal printhead and a novel kind of paper embedded with microscopic layers of dye crystals that can create a multitude of colors when heated.
When the image file is beamed from the camera to the printer, a program translates pixel information into heat information. Then, as the paper passes under the printhead, the heat activates the colors within the paper and forms crisp images.
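The article leaves that translation step abstract. Purely as an illustration, and not as Zink's or Polaroid's actual algorithm, the short Python sketch below shows one way a program might map pixel colors to per-layer heat-pulse counts; the cyan/magenta/yellow layer model, the pulse cap, and every name in the code are assumptions made for this example only.

# A minimal sketch, not Zink's or Polaroid's actual code, of translating pixel
# information into heat information, as the article describes. The CMY dye-layer
# model and the pulse cap are illustrative assumptions.
from typing import List, Tuple

MAX_PULSES_PER_LAYER = 128  # hypothetical cap on heat pulses per dye layer

def rgb_to_cmy(rgb: Tuple[int, int, int]) -> Tuple[float, float, float]:
    """Convert an 8-bit RGB pixel to cyan/magenta/yellow intensities in [0, 1]."""
    r, g, b = (channel / 255.0 for channel in rgb)
    return (1.0 - r, 1.0 - g, 1.0 - b)

def pixel_to_pulses(rgb: Tuple[int, int, int]) -> Tuple[int, ...]:
    """Map one pixel to a heat-pulse count for each assumed dye layer."""
    return tuple(round(level * MAX_PULSES_PER_LAYER) for level in rgb_to_cmy(rgb))

def row_to_pulse_plan(row: List[Tuple[int, int, int]]) -> List[Tuple[int, ...]]:
    """Build the heat plan for one row of pixels as the paper passes the printhead."""
    return [pixel_to_pulses(px) for px in row]

if __name__ == "__main__":
    sample_row = [(255, 255, 255), (255, 0, 0), (0, 0, 0)]  # white, red, black pixels
    print(row_to_pulse_plan(sample_row))
    # white -> (0, 0, 0); red -> (0, 128, 128); black -> (128, 128, 128)

In the real printer such a plan would drive the 2-inch thermal printhead line by line; the sketch is only meant to make the pixel-to-heat idea concrete.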
The unusual paper is the creation of former employees of Polaroid who originated the process there. They spun off as a separate company, Zink Imaging, in 2005 after Polaroid’s bankruptcy and eventual sale to the Petters Group Worldwide in Minnetonka, Minn. The Alps Electric Company in Tokyo will make the printers.
The potential market for instant printing of photos captured by phones and digital cameras is vast and largely untapped, said Steve Hoffenberg, an analyst at Lyra Research, a market research firm in Newtonville, Mass. “There’s an explosion in picture taking,” he said, “primarily because of the sheer number of camera phones out there on a worldwide basis.” Lyra projects shipments of about 880 million camera phones in 2008.
But it may be hard for the new printers to find a niche. About 478 billion photographs will be taken worldwide in 2008, Mr. Hoffenberg said, most of them by camera phones, but only a tiny fraction of those clicks will end up as prints.
“People can just post picture files on a Web page, or e-mail them to other people,” he said. “These days people have many options.”
The printers might catch on for social occasions like family gatherings, he said, or among teenagers who enjoy exchanging photos, or among professional groups like real estate agents who want to hand an instant image to a prospective home buyer.
The snapshots will cost less than traditional Polaroid prints, which typically have run at least $1, and often more, during the last decade, said Jim Alviani, director for business development for Polaroid. The Zink paper for the printer will sell in 10-packs for $3.99, and in 30-packs for $9.99, so the cost will be about 33 to 40 cents a sheet.
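The per-sheet figures follow directly from the quoted pack prices; this small Python check simply does the division, with no assumptions beyond the two prices in the article.

# Per-sheet cost implied by the quoted Zink paper pack prices.
pack_prices = {10: 3.99, 30: 9.99}  # sheets per pack -> price in dollars
for sheets, price in pack_prices.items():
    print(f"{sheets}-pack: ${price / sheets:.2f} per sheet")
# 10-pack: $0.40 per sheet
# 30-pack: $0.33 per sheet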
The rechargeable lithium ion battery that runs the printer will last for about 15 shots.
The prints, which are borderless, have a semigloss finish and an adhesive backing that can be peeled off if users want to stick them on a locker or a notebook cover, for instance.
The paper that makes the small printer possible will be used not only with Polaroid, but also with other brands in the future, said Steve Herchen, the chief technology officer of Zink, in Bedford, Mass.
The Tomy Company in Tokyo, for example, will embed a Zink-friendly printer directly within a camera that it plans to distribute, he said. The Foxconn Technology Group of Taiwan will make this integrated camera-printer.
Zink paper looks like ordinary white photographic paper, but its composition is different.
“We begin with a plastic web,” Mr. Herchen said, “and then put down our image-forming materials in multiple thin layers of dye crystals.”
Each 2-by-3-inch print has about 100 billion of these crystals. During printing, about 200 million heat pulses are delivered to the paper to form the colors.
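For a sense of scale, dividing the quoted figures shows how dense the process is; the calculation below uses only the numbers in the article (the 2-by-3-inch print area and the approximate crystal and pulse counts).

# Rough scale implied by the quoted figures for one 2-by-3-inch print.
crystals_per_print = 100e9  # "about 100 billion" dye crystals
pulses_per_print = 200e6    # "about 200 million" heat pulses
print_area_sq_in = 2 * 3    # square inches

print(f"crystals per square inch: {crystals_per_print / print_area_sq_in:.2e}")  # ~1.67e+10
print(f"crystals per heat pulse:  {crystals_per_print / pulses_per_print:.0f}")  # ~500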
However ingenious the process, Mr. Hoffenberg of Lyra said, people might still not be tempted to convert camera clicks into prints.
“Potential markets can exist because they aren’t tapped, but also because they aren’t actually a market,” he said. “It’s not always evident up front which is the case.”

Monday, April 14, 2008


The Moral Life of Cubicles
The Utopian Origins of Dilbert’s Workspace
David Franz
Few arenas can match the business office for its combination of humdrummery and world-shaping influence. Sociologist C. Wright Mills wrote of office workers, "Whatever history they have had is a history without events." The history of office technology seems especially uninspiring: the invention of double-entry bookkeeping, calculators, and spreadsheets is unlikely material for a captivating History Channel feature, to be sure. Yet the importance of the business office and its techniques is undeniable. Max Weber saw the office's methods of organization, its rationality, and its disciplines as hallmarks of modern capitalism, making possible dramatic gains in efficiency and forever altering the economic and cultural landscape. Perhaps even more significant in our time, when millions of American workers spend most of their waking day in an office, is the sense that the organizational technologies of office life provide a kind of moral education, that offices shape character, that they create a certain kind of person. And perhaps no aspect of today's office is more symbolic of office life and office lives than the cubicle.
Mills, in his 1951 attack on corporate bureaucracy, White Collar, imagined each office as “a segment of the enormous file.” Honeycombed floors of skyscrapers organized the “billion slips of paper that gear modern society into its daily shape.” Mills’s book was soon joined by The Organization Man and The Man in the Gray Flannel Suit in the decade’s series of attempts to assess the damage office life inflicted upon the worker. The composite picture that emerged was of a character driven by petty desires: for a slightly bigger office at work, a slightly bigger yard at home, and modest respectability everywhere. The man of the office was a middling figure without passion or creativity. These images of the office and its inhabitants were joined in the 1960s and 1970s with the counterculture’s critique of the stifling bureaucracies of the state, the corporation, the university. Standing on the steps of Berkeley’s Sproul Hall, Free Speech Movement leader Mario Savio echoed Mills’s condemnation of the great bureaucratic filing machine, now symbolized by IBM punch cards, and suggested to his fellow protesters that they put their “bodies on the gears and wheels” to stop it.
For many, this soullessness of office life is now most aptly represented by the cubicle—that open, wall-less subdivision of office space. Beginning in the late 1960s, the cubicle spread quickly across the white-collar landscape. A market research firm estimated that by 1974 cubicles accounted for 20 percent of new office-furniture expenditures. In 1980, another study showed that half of new office furniture was placed in cubicled offices. According to Steelcase, one of the largest cubicle manufacturers, nearly 70 percent of office work now happens in cubicles.
The rise of the cubicle is surely due in part to its economics. Partitions are simply a very efficient way of organizing office space. Construction for cubicle offices is standard and cheap, made and assembled in large quantities and with minimal skilled labor. The building shell, lighting, and air conditioning can be set up with little consideration of interior walls, allowing contractors to build economical big white boxes to be filled in later with “office furniture systems.” Perhaps most importantly, cubicles maximize floor space, granting workers only the necessary square footage—a number that is shrinking all the time. According to brokerage surveys cited in National Real Estate Investor, the average office space per worker in the United States dropped from 250 square feet in 2000 to 190 square feet in 2005. Some observers expect this number to drop another 20 percent by 2010. This shrinkage not only saves space, but time as well—time wasted walking to restrooms, the coffee pot, and the marketing department, for example. Supervision is made more efficient too: with no walls to hide behind, slackers have to work or at least imitate work in a convincing way.
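The article does not say what a further 20 percent drop would leave; under that projection the figure would be about 152 square feet, as this one-line check shows (nothing here beyond the cited numbers).

# Projected office space per worker if the cited 20 percent drop materializes.
space_2005 = 190  # square feet per worker in 2005
projected_2010 = space_2005 * (1 - 0.20)
print(projected_2010)  # 152.0 square feet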
The cubicle is the very essence of efficiency—the kind of office only a spreadsheet could love, one is tempted to say. But not quite: alongside the economic arguments that brought the cubicle into ascendancy, there were also moral arguments. Offices in the 1970s and 1980s seemed to their critics burdensome remnants of an older age, symbolic shackles of bureaucracy—a system as inhuman as it was ineffective. Cubicles, by contrast, seemed to lack the fixity and the bureaucratic constraints of the old office. Moreover, cubicles eliminated the hierarchical distinctions between managers and workers; every cubicle had an open door, and everyone was equally a worker. Empowering and humane, cubicles seemed to create a workplace with a soul.
The cubicle has its roots in the cybernetic school of thought that arose in the middle of the last century. The meaning of “cybernetics” has largely been swept up in the exuberant imagery of movies and commercials with their glowing rivers of ones and zeros flowing through the air. However, cybernetics has an older and deeper history, predating both the personal computer and the cubicle. Fred Turner’s recent book, From Counterculture to Cyberculture, shows how the cybernetic idea of seeing the world in terms of information flows grew out of government-sponsored World War II military research and into the information technology industry of Silicon Valley. In the 1960s and 1970s, cybernetic ideas brought groups of military-funded computer researchers together with Deadheads, radical environmentalists, and art communards in the San Francisco Bay area. This collection of long-haired eccentrics began to think of everything from bee behavior to dance parties to computer programming as information processes. In doing so, they liberated the images of information and the computer from the clutches of the military-industrial complex, joining them instead to a new cybernetic-counterculture vision of egalitarianism, communal networks, and democratic “people power.”
Architecture textbooks and journals in the 1960s and 1970s began to talk about a new “cybernetic” idea of the office. Starting with the assumption that offices were fundamentally places for the exchange of information, advocates of the cybernetic office aimed to eliminate walls that stop the “free flow of ideas,” replacing them with cubicle workstations. If the pictures in cubicle advertisements of the time are any indication, cubicles helped ideas flow quite freely indeed. Employees in these ads lack computers, to say nothing of e-mail and the Internet, yet they always seem caught in moments of frenzied, often low-tech, information exchange: pointing to each other across the room, handing papers over and around the burnt orange (“aesthetically pleasing and humanly satisfying”) partitions, all while talking on the phone and jotting down notes.
As California computer companies grew into large businesses, then, cubicles were their natural office form. It was through these companies that cubicles first entered the public imagination. In the late 1970s and early 1980s, business sections of newspapers and magazines described the radical work arrangements of Silicon Valley with curiosity and often breathless enthusiasm. Intel served as the chief example of the creative and egalitarian cubicle workplace. The company had no time cards, no dress codes, no assigned parking spots, no special cafeterias for executives, and above all, no offices, just a sea of half-wall partitions. The long, low buildings of Intel were fields of shared labor, like the communal farms that had so recently dotted the hills around Intel’s campus. CEO Andrew Grove, hip and casual in an open-necked wide-collar shirt and gold chains, was an unpretentious man of the people. He moved among the workers of Intel “empowering” them to do their jobs, and sat at a cubicle at one side of the vast work floor ready to help. Most incredible of all (and unlike the communal farms) this social experiment was economically viable. In a time when the great industrial giants were falling to Japanese competition, Intel was making money hand over fist. For some observers of American business, the Intel office model seemed like a savior. In The Atlantic, James Fallows asked the question on the minds of so many who dared to hope for the future of American industry, “Could the tire companies, the machine tool makers, the color TV industry, learn to work this way?”
This taste for fluid, egalitarian organization was elevated to a general philosophy by a new group of popular management writers. In the early 1980s, precisely at the moment of the cubicle’s introduction to the mainstream of American culture, management consultants, business professors, and CEOs all found a public hungry for management wisdom. Publishers and bookstores quickly seized upon this new market and suddenly management books, previously relegated to obscurity as business school texts, joined diet manuals and self-help books as best-sellers. The Art of Japanese Management and Theory Z (also about the art of Japanese management) were bestsellers in 1981, followed closely by In Search of Excellence, which argued that some Americans were still pretty good managers. These books and those that followed instructed Americans in the subtleties of international business, quality control, and other practical matters. More than this, however, they declared the beginning of a new era in which bureaucratic hierarchy would be obsolete and equality, creativity, and collaboration would rule the day. Separate offices, like formal business attire and human resources departments, were suffused with the musty smell of the old bureaucratic order—what one book called “the barnacle” of the status quo. The new office, with its minimal architectural and bureaucratic structure, would allow for new ideas to move more quickly and naturally through the company. Work would not be guided by policies and procedures, but the “shared values” of a “corporate culture.” One popular book even suggested a future of “boss-less companies” ruled only by a cultural canopy of shared understanding and inspiration. Tom Peters was the most prominent voice of this group, calling throughout the 1980s and 1990s for a “management revolution” and advocating such “anti-bureaucratic” management techniques as “management by walking around,” systematically “defying rules and regulations” and eliminating the barriers between departments. Peters suggested breaking down the figurative and literal walls between departments to encourage “disruptive innovation.” This kind of management thinking drew its lessons from the California technology boom and placed expectations of workplace equality in the idiom of the counterculture and political radicalism. Peters even wrote a book called Liberation Management.
But the moral philosophy of cubicle life was not limited to the sushi-and-Zen crowd of Northern California. Max De Pree, one of the most important figures of both the cubicle revolution and its theories of management, hails from a place far from California in almost every possible way. The little community of Zeeland, Michigan is home to the Herman Miller office furniture company, about 5,000 people, and more than a dozen Dutch Reformed churches. De Pree spent most of his career as an executive at Herman Miller, the company his father founded. Under the leadership of Max and his brother Hugh, Herman Miller sold the first office cubicle, the Action Office, in 1968. De Pree remained active in the cubicle revolution, overseeing various elaborations and improvements on the original design, including snap-in colored panels and new openings for aquariums and ant farms, until he retired from his position as CEO in 1987.
While most of the company’s employees worked in factories rather than offices, De Pree wanted to make Herman Miller an example of the kind of fun, egalitarian workplace that cubicle systems were supposed to encourage. The walls and ceilings of Herman Miller factories were decorated with colorful, life-size papier-mâché sculptures of workers. Employees were encouraged to find ways to use their “gifts” at work. One supervisor wrote poems for the factory newsletter, which were later printed on signs around the factory. De Pree dreamed of a time when the joys of work and the company spirit would make supervision itself unnecessary: “When they go home at night, they don’t actually need a supervisor to tell them how to be a good parent. And being a good parent is a lot tougher than making chairs.” If the California version of equality and freedom at work took its inspiration from communal farms and the remnants of hippie spirituality, De Pree’s version was straight Midwestern Protestantism. A member of the Reformed Church in America, De Pree told a reporter in 1986, “Each of us is made in the image of God. And if that’s true, then you cannot make the assumption that some of us are uncommon, and some of us are common.... We are all uncommon.” This “uncommonness” had two important implications for De Pree. First, it implied a fundamental equality. Second, it meant that individuals are different and must be handled with sensitivity and discernment. Both of these themes would be important in De Pree’s writings.
De Pree achieved a great deal more fame in his second career of leadership writer, speaker, and consultant than he did as an executive. His books, Leadership is an Art, Leadership Jazz, and Leading Without Power, have all sold well and made his name synonymous with “servant leadership” in business and leadership circles. While his writings do not suggest eliminating the category of leadership entirely, he does ask leaders to take a rather self-effacing view of their role. “Leadership is a posture of debt; it is a forfeiture of rights.” What leaders owe their followers is the opportunity “to fulfill their potential.” Organizational life, for De Pree, is a profoundly personal, even spiritual enterprise of self-improvement. He takes “finding voice” and connecting “voice and touch” to be central managerial tasks. De Pree’s description of a good leader is a humble, gentle soul who is “vulnerable,” “discerning,” and “aware of the human spirit.”
Those with moral aspirations for the cubicle—from countercultural Californians like Tom Peters to Midwestern Protestants like Max De Pree—sought to defend some idea of “humanity” against the inhumanity of bureaucracy. Yet, to say that bureaucracy is inhuman has not always been an objection to it. As defined by Max Weber a century ago, bureaucracy makes its great contribution to the world precisely by ignoring the human spirit. Operating according to fixed rules, policies, and positions, bureaucracy in its purest form functions, as Weber wrote, “without regard for persons.” As bureaucracy “develops more perfectly, the more the bureaucracy is ‘dehumanized,’ the more completely it succeeds in eliminating from official business love, hatred, and all purely personal, irrational, and emotional elements which escape calculation.” The central impulse of bureaucracy is to fashion a world in conformity to the impersonal abstraction and precise relationships of an organizational chart.
Peters and the Californians saw bureaucracy imposing arbitrary restrictions on the natural flow of creativity and information. Separate offices encouraged self-importance and unproductive groveling before the lordly egos of bosses. They created insular silos of knowledge and turf battles between them. Paperwork gummed up tasks that would better be handled with a little common sense and informal conversation. Good ideas were stalled in the system of procedures. In short, bureaucracy hindered human agency. For De Pree, the inadequacies of bureaucracy lay in its effort to cordon off into the private sphere both the emotion and the practice of unique “gifts.” Genuine leadership, for De Pree, is inescapably emotional and personal. Leadership involves nothing less than helping people become who they were meant to be. Such a task could not rest on impersonal procedures and systems alone.
While these humanistic sentiments remain common in writing about management and leadership, the cubicle has been detached from them entirely. In Dilbert, The Office, Office Space, and many other popular satires of contemporary work, cubicles are a symbol of all that is uninspiring about office life, and on this point, cubicles seem utterly without defenders. Fortune recently ran an article called “Cubicles: The Great Mistake,” complete with a public apology from one of the first cubicle designers. Twenty years after his Atlantic article extolling the virtues of the cubicled office, James Fallows wrote another on how he changed his mind. The promises of a cubicle utopia now seem curious, to say the least. In fact, the companies that make cubicles increasingly offer up apologies of their own. Steelcase, in its “State of the Cubicle” report, addresses the “Dilbert-type issues” that surround them, turning to head of design James Ludwig for a response. “Our goal in design would be to unfold the cubicle in ways that might make it unrecognizable.” The cubicle, once a cutting edge statement of corporate identity, has become an embarrassment, even for its makers.
What explains this change in meaning? Cubicle utopianism was probably a victim of its own success. The idea that cubicles formed a more exciting, humane workplace became less plausible to those who had the experience of working in one. As partitions and the space allotted to each worker shrank, few things seemed to matter to office dwellers more than privacy. From the very beginning, workers reacted to cubicles by blocking up the openings of the "open office." Newspaper reports of early cubicle offices tell of employees raiding supply closets for cardboard and extra panels to extend partitions. Some workers went so far as to push large filing cabinets into the space created by their cubicle's missing fourth wall. While the beige rat-maze aesthetics of partition living attract all the jokes, the basic geometric facts of cubicles—their doorlessness and 360-degree visibility—are probably more central to the experience of cubicle work. Private conversations, whether in person or by phone, take on the character of an intrigue, a fact exploited endlessly in office sitcoms where ordinarily private matters of romance, betrayal, and personal failure are made public in the open office to the dismay of those involved. In an odd twist, privacy often requires venturing out into some more public space, one that is either anonymous (like a sidewalk) or relatively soundproof (like a central conference room).
The utopian visions of the cubicle have been crushed by reality. However, while the cubicled office no longer seems brave or new, an aspect of its original moral impulse remains. Indeed, the experiential facts of cubicle life are not so much in contradiction with the ambition to humanize the office as the revelation of the dark side of this effort. The ideals of office equality, fluidity, and collaboration in all their forms—including servant leadership, worker empowerment, and flattened organizations—required a kind of control more diffuse and amorphous, but also more personal than the old hierarchical bureaucracy. As Tom Peters and the other management theorists of “corporate culture” saw (albeit in a more positive light), the real managerial possibility contained in the cubicle was not lower costs or even the ability of managers to watch workers more closely. It was rather the creation of a culture in which workers would feel obliged to manage themselves. With everyone visible to everyone else, managerial obligation could spread itself throughout the entire office, becoming more personal and intense at the same time. Cubicles are not alone in this trend. The advent of 360-degree evaluations (filled out by those above, below, and beside an employee in the organizational hierarchy), the creation of company mission statements followed inevitably (and sometimes preceded) by facilitator-led meetings designed to get “buy-in,” and corporate campuses (which, by containing everything from grocery stores to fitness clubs eliminate reasons to leave), all tend to blur distinctions between personal and professional.
The ideal of the cultural workplace and its embodiment in cubicles also moves against another longstanding distinction of office work—the distinction between managers and workers. The ideal of a boss-less company has not been realized on anything like the large scale the management writers dreamed of, if it has in fact been realized anywhere. However, the impulse to equality and management through culture has led to something like the opposite of the boss-less company with bosses everywhere. As the managerial role is increasingly shorn of “authoritarian” tendencies and managers adopt the stance of a servant and facilitator, the scope of demands upon ordinary workers has risen. Observation, evaluation, encouraging the proper attitude and habits in other employees—these are all managerial tasks that are supposed to be shared. Such is the nature of being a team member. Cubicles may not be inspiring, but they have clearly contributed to new obligations.
These obligations go beyond the management of work to the management of self. The teamwork and collaboration of the open office elevate the importance of relational dexterity and a sunny (but not too sunny) disposition at work. Books promising work success through “emotional intelligence” and pharmaceutical advertisements portraying the difficulties faced by office workers with anxiety and attention disorders are both responding to the emotional demands of a work environment that puts a premium on self-presentation.
It would, in a way, be comforting if the rise of cubicles were simply the result of a bad decision to grant spreadsheets and their budgeteer masters imperial dominion over office space, but that’s just not how it happened. The cubicle revolution, in fact, was above all ideological. The clichés hurled at cubicles were woven into their sound-dampening fabric board from the beginning. Any discerning criticism of office life will have to take this moral history into account. Indeed, it is precisely the axioms of what makes for a good company and a good person buried within the cubicle that most need to be uncovered and held to critical attention.
David Franz is a Ph.D. candidate in sociology and a dissertation fellow in the Institute for Advanced Studies in Culture at the University of Virginia. Portions of this essay were previously published in the Institute’s magazine, Culture.
David Franz, "The Moral Life of Cubicles," The New Atlantis, Number 19, Winter 2008, pp. 132-139.