Friday, April 25, 2008

Masturbation 'cuts cancer risk'
Men could reduce their risk of developing prostate cancer through regular masturbation, researchers suggest.
They say cancer-causing chemicals could build up in the prostate if men do not ejaculate regularly.
And they say sexual intercourse may not have the same protective effect because of the possibility of contracting a sexually transmitted infection, which could increase men's cancer risk.
Australian researchers questioned over 1,000 men who had developed prostate cancer, and 1,250 who had not, about their sexual habits.
They found those who had ejaculated the most between the ages of 20 and 50 were the least likely to develop the cancer.
The protective effect was greatest while the men were in their 20s.
Men who ejaculated more than five times a week were a third less likely to develop prostate cancer later in life.
Fluid
Previous research has suggested that a high number of sexual partners or a high level of sexual activity increased a man's risk of developing prostate cancer by up to 40%.
But the Australian researchers who carried out this study suggest the early work missed the protective effect of ejaculation because it focussed on sexual intercourse, with its associated risk of STIs.
Graham Giles, of the Cancer Council Victoria in Melbourne, who led the research team, told New Scientist: "Had we been able to remove ejaculations associated with sexual intercourse, there should have been an even stronger protective effect of ejaculations."
The researchers suggest that ejaculating may prevent carcinogens accumulating in the prostate gland.
The prostate contributes a fluid to semen during ejaculation that activates sperm and prevents them from sticking together.
The fluid has high concentrations of substances including potassium, zinc, fructose and citric acid, which are drawn from the bloodstream.
But animal studies have shown carcinogens such as 3-methylcholanthrene, found in cigarette smoke, are also concentrated in the prostate.
'Flushing out'
Dr Giles said fewer ejaculations may mean the carcinogens build up.
"It's a prostatic stagnation hypothesis. The more you flush the ducts out, the less there is to hang around and damage the cells that line them."
A similar connection has been found between breast cancer and breastfeeding, where lactation appeared to "flush out" carcinogens, reducing a woman's risk of the disease, New Scientist reports.
Another theory put forward by the researchers is that ejaculation may induce prostate glands to mature fully, making them less susceptible to carcinogens.
Dr Chris Hiley, head of policy and research at the UK's Prostate Cancer Charity, told BBC News Online: "This is a plausible theory."
She added: "In the same way the human papillomavirus has been linked to cervical cancer, there is a suggestion that bits of prostate cancer may be related to a sexually transmitted infection earlier in life."
Anthony Smith, deputy director of the Australian Research Centre in Sex, Health and Society at La Trobe University in Melbourne, said the research could affect the kind of lifestyle advice doctors give to patients.
"Masturbation is part of people's sexual repertoire.
"If these findings hold up, then it's perfectly reasonable that men should be encouraged to masturbate," he said.
Story from BBC News

Thursday, April 24, 2008

Penis theft panic hits city
Reuters
KINSHASA - Police in Congo have arrested 13 suspected sorcerers accused of using black magic to steal or shrink men's penises after a wave of panic and attempted lynchings triggered by the alleged witchcraft.
Reports of so-called penis snatching are not uncommon in West Africa, where belief in traditional religions and witchcraft remains widespread, and where ritual killings to obtain blood or body parts still occur.
Rumours of penis theft began circulating last week in Kinshasa, Democratic Republic of Congo's sprawling capital of some 8 million inhabitants. They quickly dominated radio call-in shows, with listeners advised to beware of fellow passengers in communal taxis wearing gold rings.
Purported victims, 14 of whom were also detained by police, claimed that sorcerers simply touched them to make their genitals shrink or disappear, in what some residents said was an attempt to extort cash with the promise of a cure.
"You just have to be accused of that, and people come after you. We've had a number of attempted lynchings. ... You see them covered in marks after being beaten," Kinshasa's police chief, Jean-Dieudonne Oleko, told Reuters.
Police arrested the accused sorcerers and their victims in an effort to avoid the sort of bloodshed seen in Ghana a decade ago, when 12 suspected penis snatchers were beaten to death by angry mobs. The 27 men have since been released.
"I'm tempted to say it's one huge joke," Oleko said.
"But when you try to tell the victims that their penises are still there, they tell you that it's become tiny or that they've become impotent. To that I tell them, 'How do you know if you haven't gone home and tried it'," he said.
Some Kinshasa residents accuse a separatist sect from nearby Bas-Congo province of being behind the witchcraft in revenge for a recent government crackdown on its members.
"It's real. Just yesterday here, there was a man who was a victim. We saw. What was left was tiny," said 29-year-old Alain Kalala, who sells phone credits near a Kinshasa police station.
© Reuters 2008

Sunday, April 20, 2008

Midcentury-modern buildings in Dallas attract preservationists
By DAVID FLICK / The Dallas Morning News dflick@dallasnews.com
Susan Risinger fell in love with her midcentury-modern house five years ago, well before it was discovered by SpongeBob SquarePants.
Photo: Susan Risinger, with son William, 6, was drawn to the midcentury-modern architecture of the Midway Hills neighborhood in northwest Dallas. (Cheryl Diaz Meyer/DMN)
When Ms. Risinger and her family moved to Dallas from New York, she knew she wanted something very much like the 1950s-era house in the Midway Hills neighborhood of northwest Dallas.
"I don't really care for the more classical look," she said. "I like the clean lines, the architecture, the windows of midcentury houses."
Not only did Ms. Risinger find the house she coveted, she now lives in an outstanding example of an architectural design that is the new front line in the battle for historic preservation.
In the past few years, the midcentury-modern style has begun attracting preservationists' efforts nationwide. For a variety of reasons – including the city's affinity for teardowns – one of the movement's epicenters is Dallas.
While Preservation Dallas, the city's leading organization dedicated to the conservation of historic structures, is linked in the public mind with protecting Victorian homes and ornate skyscrapers, its most recent battles have centered on buildings from the 1950s and '60s.
In just the past few weeks, for example, the group:
• Obtained national historic status for the 3525 Turtle Creek condominiums, built in 1956.
• Persuaded the City Plan Commission to deny a rezoning request that would have doomed a 1959 insurance office building near Oak Lawn.
• Honored developers who converted the old Fidelity Union Life Towers, built in 1952 and 1959, into condos.
The organization's officers, meanwhile, have declared saving the downtown Statler Hilton Hotel building, constructed in 1958, its highest priority.
The sudden interest may come as a surprise to baby boomers who grew up in the 1950s and '60s. For them, a structure from that era may seem less like an architectural treasure and more like the building where they went to the dentist.
"It's always part of the job of preservation organizations to sell the public on buildings from a time period that is coming of age," said Katherine Seale, executive director of Preservation Dallas.
"When people first began to work to save Victorian homes, a lot of people thought they were pretty ugly."
The interest is, in one sense, inevitable, given the simple passage of time. Some postwar architecture is now half a century old – an age that typically transforms a building from "outdated" to "historic."
"There's no rhyme or reason for 50 years to be the benchmark, but the feeling is that after five decades, we begin to have a clearer view of an era," Ms. Seale said.
"We're coming out of the fog and we can look at it with more distance."
The National Trust for Historic Preservation is widely credited with nurturing the reawakening, although its earliest pronouncements a few years ago were almost defensive. The group noted then that some of the buildings were not even 50 years old and thus were not a part of history, but of the "recent past."
Dallas author Virginia McAlester is updating her book, A Field Guide to American Houses, to include midcentury architecture.
"You're starting to see people on the cutting edge of the arts, the tastemakers, who are into preserving midcentury architecture," she said.
"It just hasn't caught up to the rest of us yet. When you talk about preserving a typical ranch house, people say, 'Oh, well, I grew up in a house like that.' "
But another generation sees it differently.
"I think the clean modern lines are still very popular, and it's got a retro look that's attractive to younger people," Ms. Seale said. "It's got a bit of an edge, a bit of fun to it."
Ms. Risinger's house on Pinocchio Drive, for example, had exactly the feel sought by the makers of a commercial for a SpongeBob SquarePants board game.
"They filmed it here because they said they liked the 1950s look," she said. "Whenever we see it on television, we can really see it's our place. My kids love it."
Dallas is a natural center for midcentury architecture, which thrived during the 1950s and '60s. Subdivisions and commercial buildings were built by the thousands throughout the country during that period, but particularly in booming Sun Belt cities like Dallas.
And, in many cases, they were better than the work of previous generations. The earliest structures in Western cities like Dallas were usually designed by local architects, or by builders with no formal architectural training at all.
But the city's postwar wealth changed that.
"After the war, they were able to hire architects that were nationally and even internationally known," Ms. Seale said.
The very abundance of midcentury-modern buildings creates its own challenges for preservationists. To many people, efforts to protect ranch houses in North Texas sound like an attempt to declare ants an endangered species.
Furthermore, the purposely unadorned look makes midcentury houses and buildings more difficult to love.
"Because they're not eclectic like classical or Tudor or French, it's hard to contemplate midcentury as a distinctive style, but it is," said Willis Winters, a Dallas parks department assistant director who has written several books on local architecture.
"A lot of people see them as throwaways and the first targets for teardowns."
Over the past few years, Mr. Winters cataloged 400 houses for a new book that he co-wrote, Great American Suburbs: Houses of the Park Cities, Texas. Although it is still months before the book's publication date, he said, 20 percent of the houses have since been torn down.
The destruction of midcentury houses and buildings follows a pattern familiar to preservationists – a common and unappreciated architectural style is threatened by new construction. But the threat creates an opportunity.
For one thing, tearing down a building triggers a new appreciation for what is lost. Absence, as they say, makes the heart grow fonder. And it can spur political action.
"Nothing starts up a neighborhood movement like a teardown," Ms. Seale said.
The attrition also allows preservationists to be pickier about what they seek to save.
"Something doesn't qualify as historic just because it's old," Ms. Seale said. "There are a lot of criteria we look at."
In the case of midcentury houses, the appearance of the entire neighborhood is taken into account, as well as the quality of the planning and construction.
Two postwar neighborhoods are considered standouts by local preservationists – Midway Hills in northwest Dallas and Wynnewood North in Oak Cliff, Ms. Seale said.
Both have well-constructed, interesting examples of midcentury-modern styles, and both have been relatively untroubled by teardowns. Still, there are conservation efforts afoot in both places.
The so-called Disney Streets area, where Ms. Risinger lives, is a particular favorite of local preservationists. It was the site of the Dallas Parade of Homes in 1954 and 1955, in which builders showcased the latest in residential styles and technology.
The local home show was among the largest and most popular in the country, attracting up to 100,000 visitors.
In Midway Hills, Jacqueline Ziff is living in a showcased house her late husband bought in 1962, when it was still considered cutting edge. She was surprised when a real estate agent told her recently that it was coming back into style.
"She cautioned me not to do a lot with it," she said. "It has a pink-tile bathroom that I was considering updating. But I guess I won't now."

Thursday, April 17, 2008

Mark A. LeVine
War Crimes are Only the Beginning (From HNN)
[Mr. LeVine is professor of modern Middle Eastern history, culture, and Islamic studies at the University of California, Irvine, and author of the forthcoming book, Heavy Metal Islam.]
As reported by Jason Linkins in the Huffington Post over the weekend (http://www.huffingtonpost.com), the release of the full text of the so-called “Torture Memo,” written by John Yoo (a UC Berkeley law professor who was at the time an Assistant Attorney General), has fired up even conservative commentators such as Andrew Sullivan. On the Chris Matthews show Sunday, Sullivan declared that it was a sure bet that at some point Donald Rumsfeld, David Addington and John Yoo would be indicted for war crimes.
For anyone who has traveled unembedded through Iraq since the US invasion and occupation began five years ago, Sullivan's warning brings a sad smile of recognition, and a hope that his words will prove prophetic. Even if all three men were indicted, it would be only the tip of the iceberg in addressing the issue of war crimes in Iraq. In fact, the continued controversy over the torture memo and its justification of waterboarding and other illegal interrogation techniques, which have been performed on at most a dozen or so detainees, obscures the far more systematic war crimes that have constituted the everyday reality of the occupation.

Indeed, from the first day of the invasion, war crimes have been the currency of US military activities across Iraq. When I was in the country one year into the invasion, I counted dozens of violations just in my travels and discussions with Iraqi doctors, activists and government personnel, which together made the occupation one giant war crime. All were a direct violation of the obligation of the United States and other members of the coalition, under UN Security Council Resolution 1483 of May 22, 2003, to “promote the welfare of the Iraqi people through the effective administration of the territory, including in particular working towards the restoration of conditions of security and stability and the creation of conditions in which the Iraqi people can freely determine their own political future.” More broadly, the resolution also called upon the coalition to “comply fully with their obligations under international law.” That is, US troops were obligated to assure humane treatment for the civilian population (Article 27 of the 4th Geneva Convention) and, more broadly, to permit life in Iraq to continue without being affected by their presence, and to ensure the public order, safety and welfare of the population, from providing for basic food and clothing needs to health care, services that are, according to Articles 68 and 69 of Protocol 1 of the Geneva Conventions (which is accepted as customary international law by the U.S. even though it hasn’t signed the Protocol), “essential to the survival of the civilian population.”

As an April 2004 report by Amnesty International on the human rights situation in Iraq (http://www.amnestyusa.org) made clear, "Under international humanitarian law, as occupying powers it was their duty to maintain and restore public order, and provide food, medical care and relief assistance. They failed in this duty, with the result that millions of Iraqis faced grave threats to their health and safety." With each death due to the decrepit health care system that could have been fixed with modest inputs of money, supplies and effort, each purposeful shooting of an ambulance (http://news.bbc.co.uk), and each prevention or delay of medical care (http://www.guardian.co.uk), as happened during the fighting in Fallujah and on numerous other occasions, the U.S. crosses the line between “merely” violating international humanitarian law (specifically Articles 17 through 19 of the 4th Geneva Convention) and the commission of actual war crimes, defined as grave breaches of the 4th Geneva Convention, described in Article 147 as including the "willful killing, torture or inhuman treatment, including... willfully causing great suffering or serious injury to body or health, unlawful deportation or transfer or unlawful confinement of a protected person... or willfully depriving a protected person of the rights of fair and regular trial ...taking of hostages and extensive destruction and appropriation of property, not justified by military necessity and carried out unlawfully and wantonly."

In sum, whether it is the killing of tens of thousands of civilians (if not more) by US forces or the torture of a relative few in prisons such as Abu Ghraib, the issue has never been one of soldiers exceeding their authority. It is an issue of the Commander-in-Chief of the United States armed forces, along with his top commanders and officials, being responsible for a military system that, once unleashed, cannot but commit systematic violations of humanitarian law. Indeed, only weeks after the occupation began, a group of Belgian doctors who had spent the previous year in Baghdad explained that whatever crimes might be committed by Iraqis, as the internationally recognized belligerent occupiers “the current humanitarian catastrophe is entirely and solely the responsibility of the US and British authorities” (http://www.globalresearch.ca). Even that early into the occupation, they documented violations of at least a dozen articles of the 4th Geneva Convention by Coalition forces (including Articles 10, 12, 15, 21, 35, 36, 41, 45, 47, 48, 51 and 55).

What is most surprising, given the clear evidence of such systematic war crimes and the direct line of responsibility up to the President, is that the peace movement has been almost completely silent on the issue of bringing the perpetrators of these crimes to justice in a court of law. Indeed, aside from Code Pink, which has always been at the vanguard of protesting the Iraq war (in good measure because, unlike most other mainstream peace groups, its leaders have actually visited Iraq since the occupation), no member of the anti-war coalition has dedicated even a modicum of time or energy to pushing an agenda of indictment—rather than the politically unimaginable impeachment—of the President and his senior aides for the crimes committed in Iraq or Afghanistan. This has been a strategic disaster.
Unless Americans are forced to confront just how systematic the abuses committed by our troops have been (for more evidence of this, see the powerful documentary “The Ground Truth” by Patricia Foulkrod, thegroundtruth.org), they will continue to imagine that the main problem in Iraq is one of incompetence or bad management. The reality is that Iraq has gone more or less exactly as the Bush Administration hoped it would: the United States, after illegally invading a UN-member state--an act which was itself a crime against humanity, as it violated the paramount law of the United Nations against “breaches of the peace and acts of aggression”--has, in good imperialist fashion, managed to turn the occupied population against each other and to generate enough chaos and violence to ensure that Iraq's leadership cannot ask it to leave.

Need proof of how well this strategy has worked? Recall President Bush's blithe 2005 statement that the US was ready to leave “if the Iraqi government asks us to” (http://www.nytimes.com). Of course, the point is that they can't ask us to leave. And we're not leaving any time soon, no matter who the next American President is, as the recent leak of the "secret plan" to ensure a long-term US presence in Iraq, reported in the Guardian yesterday (http://www.guardian.co.uk), makes clear. Of course, this plan was never a secret; the miles-long construction convoys making their way across Iraq even in the first year, building the new bases, told anyone who wanted to listen what the long-term US plan was for Iraq.

Viewed from this perspective, the hullabaloo over the official release of the infamous “torture memo,” whose main points have long been publicly known, will do nothing to change the fundamental political dynamics surrounding the unending US occupation of Iraq. It might even help perpetuate them by shifting focus away from the even more damning evidence of systematic and large-scale war crimes and crimes against humanity at the core of the US occupation, for which all Americans, having reelected President Bush after ample evidence of these crimes was available for them to consider, are complicit.

That shifted focus is certainly responsible for the fact that, despite the ongoing disaster, recent polls indicate that a majority of Americans consider John McCain, one of the biggest boosters of—and most ill-informed commentators on—the occupation, better equipped to handle Iraq and the larger war on terror than either of his Democratic challengers (http://www.alternet.org). That might seem astonishing, but if the issue is framed as one of better management or prosecution of the occupation rather than the immorality and illegality of the war and occupation itself, there is a logic to Americans assuming that a war hero can do a better job than opponents who have never been in battle.

All hope is not lost, however. If Barack Obama can make a speech about Iraq with the same level of honesty and power as his recent speech about race, there is a chance that he can change the perception of Democrats as unable to manage the country through a quagmire most Americans seem instinctively to assume is not going to end any time soon.
Perhaps he might even get the hundreds of thousands of people regularly into the streets to bring the troops home, in the absence of which the occupation might well continue, as McCain seems to hope, for “100 years.” In the meantime, would it be too much to ask UC Berkeley to fire John Yoo for encouraging the commission of war crimes and for gross ethical and intellectual incompetence (http://hnn.us)?
Posted on Wednesday, April 9, 2008 at 2:08 PM

Tuesday, April 15, 2008

Instant Digital Prints (and Polaroid Nostalgia)
By ANNE EISENBERG (NY TIMES)
MILLIONS of families once snapped Polaroid photographs and enjoyed passing around the newly minted prints on the spot, instead of waiting a week for them to be developed.
Now, Polaroid wants to conjure up those golden analog days of vast sales and instant gratification — this time with images captured by digital cameras and camera phones.
This fall, the company expects to market a hand-size printer that produces color snapshots in about 30 seconds.
Beam a photograph from a cellphone to the printer and, with a gentle purr, out comes the full-color print — completely formed and dry to the touch.
The printer, which connects wirelessly by Bluetooth to phones and by cable to cameras, will cost about $150. The images are 2 inches by 3 inches, the size of a credit card. The new printers are so lightweight that a Polaroid executive demonstrating them recently had three tucked unnoticeably into various pockets of his trim jacket, whipping them out as if he were Harpo Marx.
The printer opens like a compact with a neat, satisfying click. Inside, no cartridges or toner take up space. Instead, there is a computer chip, a 2-inch-long thermal printhead and a novel kind of paper embedded with microscopic layers of dye crystals that can create a multitude of colors when heated.
When the image file is beamed from the camera to the printer, a program translates pixel information into heat information. Then, as the paper passes under the printhead, the heat activates the colors within the paper and forms crisp images.
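The article does not spell out that translation, and Zink's actual algorithm is proprietary, but the idea can be sketched. The following Python fragment is a minimal, purely illustrative model: the linear mapping, the pulse ceiling, and the function name are assumptions, not Zink's published design.

```python
# Illustrative sketch only: Zink's real pixel-to-heat mapping is proprietary,
# and the actual printhead also modulates pulse length and temperature per
# dye layer. Here we simply assume a linear map from 8-bit color channels
# to per-layer heat-pulse counts.

MAX_PULSES = 255  # assumed ceiling on heat pulses per dye layer per pixel

def pixel_to_heat(rgb):
    """Translate one RGB pixel into heat-pulse counts for three dye layers.

    Printed dyes are subtractive (cyan/magenta/yellow), so each layer's
    activation is driven by the complement of the matching RGB channel.
    """
    r, g, b = rgb
    return {
        "cyan":    round((255 - r) / 255 * MAX_PULSES),  # cyan absorbs red
        "magenta": round((255 - g) / 255 * MAX_PULSES),  # magenta absorbs green
        "yellow":  round((255 - b) / 255 * MAX_PULSES),  # yellow absorbs blue
    }

# A pure-red pixel needs magenta and yellow dye, but no cyan:
print(pixel_to_heat((255, 0, 0)))  # {'cyan': 0, 'magenta': 255, 'yellow': 255}
```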
The unusual paper is the creation of former employees of Polaroid who originated the process there. They spun off as a separate company, Zink Imaging, in 2005 after Polaroid’s bankruptcy and eventual sale to the Petters Group Worldwide in Minnetonka, Minn. The Alps Electric Company in Tokyo will make the printers.
The potential market for instant printing of photos captured by phones and digital cameras is vast and largely untapped, said Steve Hoffenberg, an analyst at Lyra Research, a market research firm in Newtonville, Mass. “There’s an explosion in picture taking,” he said, “primarily because of the sheer number of camera phones out there on a worldwide basis.” Lyra projects shipments of about 880 million camera phones in 2008.
But it may be hard for the new printers to find a niche. About 478 billion photographs will be taken worldwide in 2008, Mr. Hoffenberg said, most of them by camera phones, but only a tiny fraction of those clicks will end up as prints.
“People can just post picture files on a Web page, or e-mail them to other people,” he said. “These days people have many options.”
The printers might catch on for social occasions like family gatherings, he said, or among teenagers who enjoy exchanging photos, or among professional groups like real estate agents who want to hand an instant image to a prospective home buyer.
The snapshots will cost less than traditional Polaroid prints, which typically have run at least $1, and often more, during the last decade, said Jim Alviani, director for business development for Polaroid. The Zink paper for the printer will sell in 10-packs for $3.99, and in 30-packs for $9.99, so the cost will be about 33 to 40 cents a sheet.
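The per-sheet figure is easy to verify from the pack prices quoted above; a two-line check in Python:

```python
# Per-sheet cost of Zink paper at the two pack sizes quoted in the article.
for price, sheets in [(3.99, 10), (9.99, 30)]:
    print(f"${price} for {sheets} sheets = {100 * price / sheets:.0f} cents/sheet")
# $3.99 for 10 sheets = 40 cents/sheet
# $9.99 for 30 sheets = 33 cents/sheet
```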
The rechargeable lithium ion battery that runs the printer will last for about 15 shots.
The prints, which are borderless, have a semigloss finish and an adhesive backing that can be peeled off if users want to stick them on a locker or a notebook cover, for instance.
The paper that makes the small printer possible will be used not only with Polaroid, but also with other brands in the future, said Steve Herchen, the chief technology officer of Zink, in Bedford, Mass.
The Tomy Company in Tokyo, for example, will embed a Zink-friendly printer directly within a camera that it plans to distribute, he said. The Foxconn Technology Group of Taiwan will make this integrated camera-printer.
Zink paper looks like ordinary white photographic paper, but its composition is different.
“We begin with a plastic web,” Mr. Herchen said, “and then put down our image-forming materials in multiple thin layers of dye crystals.”
Each 2-by-3-inch print has about 100 billion of these crystals. During printing, about 200 million heat pulses are delivered to the paper to form the colors.
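Those two figures permit a rough back-of-envelope estimate of what each pixel involves. The 300-dpi resolution below is an assumption (the article does not state one), so the per-pixel numbers are illustrative only:

```python
# Back-of-envelope from the article's figures; 300 dpi is assumed, not quoted.
crystals = 100e9   # "about 100 billion" dye crystals per 2x3-inch print
pulses = 200e6     # "about 200 million" heat pulses per print
dpi = 300          # assumed print resolution
pixels = (2 * dpi) * (3 * dpi)  # 2-by-3-inch print -> 540,000 pixels

print(f"heat pulses per pixel:  {pulses / pixels:,.0f}")    # ~370
print(f"dye crystals per pixel: {crystals / pixels:,.0f}")  # ~185,185
```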
However ingenious the process, Mr. Hoffenberg of Lyra said, people might still not be tempted to convert camera clicks into prints.
“Potential markets can exist because they aren’t tapped, but also because they aren’t actually a market,” he said. “It’s not always evident up front which is the case.”
E-mail: novelties@nytimes.com.

Monday, April 14, 2008

The Moral Life of Cubicles
The Utopian Origins of Dilbert’s Workspace
David Franz
Few arenas can match the business office for its combination of humdrummery and world-shaping influence. Sociologist C. Wright Mills wrote of office workers, “Whatever history they have had is a history without events.” The history of office technology seems especially uninspiring: the inventions of double-entry bookkeeping, calculators, and spreadsheets are unlikely material for a captivating History Channel feature, to be sure. Yet the importance of the business office and its techniques is undeniable. Max Weber saw the office’s methods of organization, its rationality, and its disciplines as hallmarks of modern capitalism, making possible dramatic gains in efficiency and forever altering the economic and cultural landscape. Perhaps even more significant in our time, when millions of American workers spend most of their waking day in an office, is the sense that the organizational technologies of office life provide a kind of moral education, that offices shape character, that they create a certain kind of person. And perhaps no aspect of today’s office is more symbolic of office life and office lives than the cubicle.
Mills, in his 1951 attack on corporate bureaucracy, White Collar, imagined each office as “a segment of the enormous file.” Honeycombed floors of skyscrapers organized the “billion slips of paper that gear modern society into its daily shape.” Mills’s book was soon joined by The Organization Man and The Man in the Gray Flannel Suit in the decade’s series of attempts to assess the damage office life inflicted upon the worker. The composite picture that emerged was of a character driven by petty desires: for a slightly bigger office at work, a slightly bigger yard at home, and modest respectability everywhere. The man of the office was a middling figure without passion or creativity. These images of the office and its inhabitants were joined in the 1960s and 1970s with the counterculture’s critique of the stifling bureaucracies of the state, the corporation, the university. Standing on the steps of Berkeley’s Sproul Hall, Free Speech Movement leader Mario Savio echoed Mills’s condemnation of the great bureaucratic filing machine, now symbolized by IBM punch cards, and suggested to his fellow protesters that they put their “bodies on the gears and wheels” to stop it.
For many, this soullessness of office life is now most aptly represented by the cubicle—that open, wall-less subdivision of office space. Beginning in the late 1960s, the cubicle spread quickly across the white-collar landscape. A market research firm estimated that by 1974 cubicles accounted for 20 percent of new office-furniture expenditures. In 1980, another study showed that half of new office furniture was placed in cubicled offices. According to Steelcase, one of the largest cubicle manufacturers, nearly 70 percent of office work now happens in cubicles.
The rise of the cubicle is surely due in part to its economics. Partitions are simply a very efficient way of organizing office space. Construction for cubicle offices is standard and cheap, made and assembled in large quantities and with minimal skilled labor. The building shell, lighting, and air conditioning can be set up with little consideration of interior walls, allowing contractors to build economical big white boxes to be filled in later with “office furniture systems.” Perhaps most importantly, cubicles maximize floor space, granting workers only the necessary square footage—a number that is shrinking all the time. According to brokerage surveys cited in National Real Estate Investor, the average office space per worker in the United States dropped from 250 square feet in 2000 to 190 square feet in 2005. Some observers expect this number to drop another 20 percent by 2010. This shrinkage not only saves space, but time as well—time wasted walking to restrooms, the coffee pot, and the marketing department, for example. Supervision is made more efficient too: with no walls to hide behind, slackers have to work or at least imitate work in a convincing way.
The cubicle is the very essence of efficiency—the kind of office only a spreadsheet could love, one is tempted to say. But not quite: alongside the economic arguments that brought the cubicle into ascendancy, there were also moral arguments. Offices in the 1970s and 1980s seemed to their critics burdensome remnants of an older age, symbolic shackles of bureaucracy—a system as inhuman as it was ineffective. Cubicles, by contrast, seemed to lack the fixity and the bureaucratic constraints of the old office. Moreover, cubicles eliminated the hierarchical distinctions between managers and workers; every cubicle had an open door, and everyone was equally a worker. Empowering and humane, cubicles seemed to create a workplace with a soul.
The cubicle has its roots in the cybernetic school of thought that arose in the middle of the last century. The meaning of “cybernetics” has largely been swept up in the exuberant imagery of movies and commercials with their glowing rivers of ones and zeros flowing through the air. However, cybernetics has an older and deeper history, predating both the personal computer and the cubicle. Fred Turner’s recent book, From Counterculture to Cyberculture, shows how the cybernetic idea of seeing the world in terms of information flows grew out of government-sponsored World War II military research and into the information technology industry of Silicon Valley. In the 1960s and 1970s, cybernetic ideas brought groups of military-funded computer researchers together with Deadheads, radical environmentalists, and art communards in the San Francisco Bay area. This collection of long-haired eccentrics began to think of everything from bee behavior to dance parties to computer programming as information processes. In doing so, they liberated the images of information and the computer from the clutches of the military-industrial complex, joining them instead to a new cybernetic-counterculture vision of egalitarianism, communal networks, and democratic “people power.”
Architecture textbooks and journals in the 1960s and 1970s began to talk about a new “cybernetic” idea of the office. Starting with the assumption that offices were fundamentally places for the exchange of information, advocates of the cybernetic office aimed to eliminate walls that stop the “free flow of ideas,” replacing them with cubicle workstations. If the pictures in cubicle advertisements of the time are any indication, cubicles helped ideas flow quite freely indeed. Employees in these ads lack computers, to say nothing of e-mail and the Internet, yet they always seem caught in moments of frenzied, often low-tech, information exchange: pointing to each other across the room, handing papers over and around the burnt orange (“aesthetically pleasing and humanly satisfying”) partitions, all while talking on the phone and jotting down notes.
As California computer companies grew into large businesses, then, cubicles were their natural office form. It was through these companies that cubicles first entered the public imagination. In the late 1970s and early 1980s, business sections of newspapers and magazines described the radical work arrangements of Silicon Valley with curiosity and often breathless enthusiasm. Intel served as the chief example of the creative and egalitarian cubicle workplace. The company had no time cards, no dress codes, no assigned parking spots, no special cafeterias for executives, and above all, no offices, just a sea of half-wall partitions. The long, low buildings of Intel were fields of shared labor, like the communal farms that had so recently dotted the hills around Intel’s campus. CEO Andrew Grove, hip and casual in an open-necked wide-collar shirt and gold chains, was an unpretentious man of the people. He moved among the workers of Intel “empowering” them to do their jobs, and sat at a cubicle at one side of the vast work floor ready to help. Most incredible of all (and unlike the communal farms) this social experiment was economically viable. In a time when the great industrial giants were falling to Japanese competition, Intel was making money hand over fist. For some observers of American business, the Intel office model seemed like a savior. In The Atlantic, James Fallows asked the question on the minds of so many who dared to hope for the future of American industry, “Could the tire companies, the machine tool makers, the color TV industry, learn to work this way?”
This taste for fluid, egalitarian organization was elevated to a general philosophy by a new group of popular management writers. In the early 1980s, precisely at the moment of the cubicle’s introduction to the mainstream of American culture, management consultants, business professors, and CEOs all found a public hungry for management wisdom. Publishers and bookstores quickly seized upon this new market and suddenly management books, previously relegated to obscurity as business school texts, joined diet manuals and self-help books as best-sellers. The Art of Japanese Management and Theory Z (also about the art of Japanese management) were bestsellers in 1981, followed closely by In Search of Excellence, which argued that some Americans were still pretty good managers. These books and those that followed instructed Americans in the subtleties of international business, quality control, and other practical matters. More than this, however, they declared the beginning of a new era in which bureaucratic hierarchy would be obsolete and equality, creativity, and collaboration would rule the day. Separate offices, like formal business attire and human resources departments, were suffused with the musty smell of the old bureaucratic order—what one book called “the barnacle” of the status quo. The new office, with its minimal architectural and bureaucratic structure, would allow for new ideas to move more quickly and naturally through the company. Work would not be guided by policies and procedures, but the “shared values” of a “corporate culture.” One popular book even suggested a future of “boss-less companies” ruled only by a cultural canopy of shared understanding and inspiration. Tom Peters was the most prominent voice of this group, calling throughout the 1980s and 1990s for a “management revolution” and advocating such “anti-bureaucratic” management techniques as “management by walking around,” systematically “defying rules and regulations” and eliminating the barriers between departments. Peters suggested breaking down the figurative and literal walls between departments to encourage “disruptive innovation.” This kind of management thinking drew its lessons from the California technology boom and placed expectations of workplace equality in the idiom of the counterculture and political radicalism. Peters even wrote a book called Liberation Management.
But the moral philosophy of cubicle life was not limited to the sushi-and-Zen crowd of Northern California. Max De Pree, one of the most important figures of both the cubicle revolution and its theories of management, hails from a place far from California in almost every possible way. The little community of Zeeland, Michigan is home to the Herman Miller office furniture company, about 5,000 people, and more than a dozen Dutch Reformed churches. De Pree spent most of his career as an executive at Herman Miller, the company his father founded. Under the leadership of Max and his brother Hugh, Herman Miller sold the first office cubicle, the Action Office, in 1968. De Pree remained active in the cubicle revolution, overseeing various elaborations and improvements on the original design, including snap-in colored panels and new openings for aquariums and ant farms, until he retired from his position as CEO in 1987.
While most of the company’s employees worked in factories rather than offices, De Pree wanted to make Herman Miller an example of the kind of fun, egalitarian workplace that cubicle systems were supposed to encourage. The walls and ceilings of Herman Miller factories were decorated with colorful, life-size papier-mâché sculptures of workers. Employees were encouraged to find ways to use their “gifts” at work. One supervisor wrote poems for the factory newsletter, which were later printed on signs around the factory. De Pree dreamed of a time when the joys of work and the company spirit would make supervision itself unnecessary: “When they go home at night, they don’t actually need a supervisor to tell them how to be a good parent. And being a good parent is a lot tougher than making chairs.” If the California version of equality and freedom at work took its inspiration from communal farms and the remnants of hippie spirituality, De Pree’s version was straight Midwestern Protestantism. A member of the Reformed Church in America, De Pree told a reporter in 1986, “Each of us is made in the image of God. And if that’s true, then you cannot make the assumption that some of us are uncommon, and some of us are common.... We are all uncommon.” This “uncommonness” had two important implications for De Pree. First, it implied a fundamental equality. Second, it meant that individuals are different and must be handled with sensitivity and discernment. Both of these themes would be important in De Pree’s writings.
De Pree achieved a great deal more fame in his second career of leadership writer, speaker, and consultant than he did as an executive. His books, Leadership is an Art, Leadership Jazz, and Leading Without Power, have all sold well and made his name synonymous with “servant leadership” in business and leadership circles. While his writings do not suggest eliminating the category of leadership entirely, he does ask leaders to take a rather self-effacing view of their role. “Leadership is a posture of debt; it is a forfeiture of rights.” What leaders owe their followers is the opportunity “to fulfill their potential.” Organizational life, for De Pree, is a profoundly personal, even spiritual enterprise of self-improvement. He takes “finding voice” and connecting “voice and touch” to be central managerial tasks. De Pree’s description of a good leader is a humble, gentle soul who is “vulnerable,” “discerning,” and “aware of the human spirit.”
Those with moral aspirations for the cubicle—from countercultural Californians like Tom Peters to Midwestern Protestants like Max De Pree—sought to defend some idea of “humanity” against the inhumanity of bureaucracy. Yet, to say that bureaucracy is inhuman has not always been an objection to it. As defined by Max Weber a century ago, bureaucracy makes its great contribution to the world precisely by ignoring the human spirit. Operating according to fixed rules, policies, and positions, bureaucracy in its purest form functions, as Weber wrote, “without regard for persons.” As bureaucracy “develops more perfectly, the more the bureaucracy is ‘dehumanized,’ the more completely it succeeds in eliminating from official business love, hatred, and all purely personal, irrational, and emotional elements which escape calculation.” The central impulse of bureaucracy is to fashion a world in conformity to the impersonal abstraction and precise relationships of an organizational chart.
Peters and the Californians saw bureaucracy imposing arbitrary restrictions on the natural flow of creativity and information. Separate offices encouraged self-importance and unproductive groveling before the lordly egos of bosses. They created insular silos of knowledge and turf battles between them. Paperwork gummed up tasks that would better be handled with a little common sense and informal conversation. Good ideas were stalled in the system of procedures. In short, bureaucracy hindered human agency. For De Pree, the inadequacies of bureaucracy lay in its effort to cordon off into the private sphere both the emotion and the practice of unique “gifts.” Genuine leadership, for De Pree, is inescapably emotional and personal. Leadership involves nothing less than helping people become who they were meant to be. Such a task could not rest on impersonal procedures and systems alone.
While these humanistic sentiments remain common in writing about management and leadership, the cubicle has been detached from them entirely. In Dilbert, The Office, Office Space, and many other popular satires of contemporary work, cubicles are a symbol of all that is uninspiring about office life, and on this point, cubicles seem utterly without defenders. Fortune recently ran an article called “Cubicles: The Great Mistake,” complete with a public apology from one of the first cubicle designers. Twenty years after his Atlantic article extolling the virtues of the cubicled office, James Fallows wrote another on how he changed his mind. The promises of a cubicle utopia now seem curious, to say the least. In fact, the companies that make cubicles increasingly offer up apologies of their own. Steelcase, in its “State of the Cubicle” report, addresses the “Dilbert-type issues” that surround them, turning to head of design James Ludwig for a response. “Our goal in design would be to unfold the cubicle in ways that might make it unrecognizable.” The cubicle, once a cutting edge statement of corporate identity, has become an embarrassment, even for its makers.
What explains this change in meaning? Cubicle utopianism was probably a victim of its own success. The idea that cubicles formed a more exciting, humane workplace became less plausible to those who had the experience of working in one. As partitions and the space allotted to each worker shrank, few things seemed to matter to office dwellers more than privacy. From the very beginning, workers reacted to cubicles by blocking up the openings of the “open office.” Newspaper reports of early cubicle offices tell of employees raiding supply closets for cardboard and extra panels to extend partitions. Some workers went so far as to push large filing cabinets into the space created by their cubicle’s missing fourth wall. While the beige rat-maze aesthetics of partition living attract all the jokes, the basic geometric facts of cubicles—their doorlessness and 360-degree visibility—are probably more central to the experience of cubicle work. Private conversations, whether in person or by phone, take on the character of an intrigue, a fact exploited endlessly in office sitcoms where ordinarily private matters of romance, betrayal, and personal failure are made public in the open office to the dismay of those involved. In an odd twist, privacy often requires venturing out into some more public space, one that is either anonymous (like a sidewalk) or relatively soundproof (like a central conference room).
The utopian visions of the cubicle have been crushed by reality. However, while the cubicled office no longer seems brave or new, an aspect of its original moral impulse remains. Indeed, the experiential facts of cubicle life are not so much in contradiction with the ambition to humanize the office as the revelation of the dark side of this effort. The ideals of office equality, fluidity, and collaboration in all their forms—including servant leadership, worker empowerment, and flattened organizations—required a kind of control more diffuse and amorphous, but also more personal than the old hierarchical bureaucracy. As Tom Peters and the other management theorists of “corporate culture” saw (albeit in a more positive light), the real managerial possibility contained in the cubicle was not lower costs or even the ability of managers to watch workers more closely. It was rather the creation of a culture in which workers would feel obliged to manage themselves. With everyone visible to everyone else, managerial obligation could spread itself throughout the entire office, becoming more personal and intense at the same time. Cubicles are not alone in this trend. The advent of 360-degree evaluations (filled out by those above, below, and beside an employee in the organizational hierarchy), the creation of company mission statements followed inevitably (and sometimes preceded) by facilitator-led meetings designed to get “buy-in,” and corporate campuses (which, by containing everything from grocery stores to fitness clubs eliminate reasons to leave), all tend to blur distinctions between personal and professional.
The ideal of the cultural workplace and its embodiment in cubicles also moves against another longstanding distinction of office work—the distinction between managers and workers. The ideal of a boss-less company has not been realized on anything like the large scale the management writers dreamed of, if it has in fact been realized anywhere. However, the impulse to equality and management through culture has led to something like the opposite of the boss-less company with bosses everywhere. As the managerial role is increasingly shorn of “authoritarian” tendencies and managers adopt the stance of a servant and facilitator, the scope of demands upon ordinary workers has risen. Observation, evaluation, encouraging the proper attitude and habits in other employees—these are all managerial tasks that are supposed to be shared. Such is the nature of being a team member. Cubicles may not be inspiring, but they have clearly contributed to new obligations.
These obligations go beyond the management of work to the management of self. The teamwork and collaboration of the open office elevate the importance of relational dexterity and a sunny (but not too sunny) disposition at work. Books promising work success through “emotional intelligence” and pharmaceutical advertisements portraying the difficulties faced by office workers with anxiety and attention disorders are both responding to the emotional demands of a work environment that puts a premium on self-presentation.
It would, in a way, be comforting if the rise of cubicles were simply the result of a bad decision to grant spreadsheets and their budgeteer masters imperial dominion over office space, but that’s just not how it happened. The cubicle revolution, in fact, was above all ideological. The clichés hurled at cubicles were woven into their sound-dampening fabric board from the beginning. Any discerning criticism of office life will have to take this moral history into account. Indeed, it is precisely the axioms of what makes for a good company and a good person buried within the cubicle that most need to be uncovered and held to critical attention.
David Franz is a Ph.D. candidate in sociology and a dissertation fellow in the Institute for Advanced Studies in Culture at the University of Virginia. Portions of this essay were previously published in the Institute’s magazine, Culture.
David Franz, "The Moral Life of Cubicles," The New Atlantis, Number 19, Winter 2008, pp. 132-139.

Monday, April 07, 2008

Clinton strategist Mark Penn left mark
By: Ben Smith (Politico)
Mark Penn's exit from his role as Hillary Rodham Clinton's chief strategist likely will portend no dramatic shift in message for the campaign in coming weeks but will bring satisfaction to scores of Clinton loyalists who have wanted the controversial image-meister sacked for months. Penn's decision to step down Sunday came after a contretemps about his work for a foreign government triggered the latest in a series of public distractions from her presidential campaign.

Clinton's campaign manager Maggie Williams appeared to tie Penn's departure to his meeting with the Colombian government, a client of the public relations firm he leads, about their campaign for a free trade agreement Clinton opposes. But for Clinton's aides and advisors, the meeting was just the latest in a list of complaints about Penn that began with his strategic decision last year to focus on a message of strength and electability rather than inspiration, and his insistence on highlighting policy over passion. His $13 million in campaign billing and his insistence on maintaining his salary as worldwide CEO of the firm Burson Marsteller were also sources of tension within the campaign.

"After the events of the last few days, Mark Penn has asked to give up his role as Chief Strategist of the Clinton Campaign; Mark, and Penn, Schoen and Berland Associates, Inc. will continue to provide polling and advice to the campaign," said Williams in a statement to reporters early Sunday evening. Williams said Clinton's communications director, Howard Wolfson, and a pollster who recently joined the campaign, Geoff Garin, "will coordinate the campaign's strategic message team going forward."

Few in Clinton's circle foresee a dramatic shift in direction, largely because it is now too late for a dramatic shift in message – voters in the largest remaining state, Pennsylvania, cast their ballots in two weeks. "This was really a function of Colombia, not a referendum on his strategic vision," insisted one senior campaign aide.

If there's any immediate consequence, though, it will be to morale inside a shell-shocked campaign where Penn's compensation, his attention to his business, and his gruff demeanor made him a divisive figure. "It was very demoralizing for the staff that's working 24/7 to see him doing book tours and engaging in private sector activities," said a prominent Clinton supporter. "It was an important statement for the campaign to make to its own team."
Penn was the source of a long series of campaign flare-ups. One subsidiary of Burson Marsteller represented the controversial private security firm Blackwater. Another angered unions Clinton was courting by working for companies trying to defeat union organizing campaigns. His firm also collected more than $11 million from Clinton's campaign through the end of February, and was owed $2.5 million more in the most recent filing, though part of that sum is polling expenses.

In recent weeks, Penn created additional controversy by suggesting New Mexico Governor Bill Richardson's endorsement – which Clinton had assiduously sought – was meaningless, and by his matter-of-fact assertion to reporters that Obama simply can't beat the presumptive Republican nominee, Arizona Senator John McCain.

Penn, one of the most influential political advisors of his generation, nevertheless accomplished a great deal as the campaign's top strategist. Clinton began the campaign with obvious vulnerabilities: a high percentage of Americans disliked her, she had only the ambiguous experience of a White House spouse, and she showed little natural charisma. With his trademark barrage of poll numbers, Penn – who had worked with Bill Clinton since the former president's 1996 re-election campaign – managed to persuade journalists and Democratic voters that the New York senator could actually win a general election.
And his focus on her strength – along with her own tough debate performances – erased any lingering doubts about whether a woman would be tough enough to serve as commander-in-chief. That accomplishment revealed Penn at his best: a cocksure advisor to some of the world's most important figures in politics and commerce, with clients ranging from Microsoft's Bill Gates to U.K. Prime Minister Tony Blair.

But even as Clinton proved an able candidate, Illinois Senator Barack Obama's soaring rhetoric connected with voters and generated tremendous grassroots enthusiasm, providing a stark contrast between the two.

Since Clinton's 2000 campaign for Senate, Penn had argued for a focus on policy over emotion. "Being human is overrated," Penn reportedly joked at a debate prep session last winter, and the remark was, to his critics, his central failing. Indeed, Penn spent part of 2007 marketing his new book, "Microtrends," which stressed the power of tiny demographic groups rather than broad messages. The book begins by invoking an old Volkswagen slogan, "Think Small." "It was a revolutionary idea – a call for the shrinking of perspective, ambition, and scale," Penn wrote.

But as Clinton thought small and targeted narrow groups – adults who care for their parents, for instance – with specific messages, Obama's broader themes often carried the day. And when it came to organizing on the ground in Iowa and elsewhere, it was Obama who proved more able to focus on the small things.

Penn predicted victory for Clinton in Iowa, and was ridiculed for it to the extent that Bill Clinton rose to his defense in response to a hostile question a few days later in New Hampshire. "You can take a shot at Mark Penn if you want. It wasn't his best day. He was hurt, he felt badly that we didn't do better in Iowa," he said. Penn's internal rivals, including advisor Harold Ickes, never let him forget his Iowa error. "I'm not going to make any predictions. I'll leave that up to Mark," Ickes said on a March conference call with reporters. And the core of Penn's strategy appeared to go out the window after Clinton seemed to turn around the polls in New Hampshire by almost crying in a conversation with a woman at a diner.

Until this weekend, however, Penn remained a central figure in the Clinton campaign. He led strategy calls, headed Clinton's team in post-debate "spin rooms," and was a rumpled, occasionally aggrieved figure on television until a new campaign manager, Williams, forbade him to make any further appearances. The end of Penn's tenure as chief strategist offered immediate satisfaction to his many internal detractors, but the campaign signaled no change from its latest strategic direction, which Penn helped shape: an argument that only Clinton has a good chance of beating the Republican nominee.
© 2007 Capitol News Company, LLC