See the New Mad Sociologist at

Sociological Imagination

Call Girl Barbie!

NOTE: The picture below is not the original. Somehow, many of my images on this site were corrupted (one of the reasons I switched to the new site and format). The original was a Barbie in a very low-cut evening gown. Fortunately, I didn’t have to look far to reproduce a suitable image. I simply clicked on a major retailer’s website and selected four images. You’ll see that though the image is not quite the same, the message and the insinuations about sexuality remain.

This is a picture from the cell phone of one of my wife’s friends.  Believe it or not, this was a Barbie (or Barbie-like doll; I don’t want to step on any name brands) that was being sold in the toy section of the store.  Notice the low-cut top and the obviously surgically enhanced…um…cleavage.



If a picture says a thousand words, just how much is this doll saying to our children, especially our little girls?

What does it take to eradicate an idea?

Ideas, once created, do not die easily. About twenty-five hundred years ago the ancient Greeks proved that the Earth was a sphere. Eratosthenes even calculated the circumference of this sphere. Since then the Earth has been circumnavigated countless times and photographed from orbit and from the moon. It is clear beyond any doubt that the Earth is, in fact, a sphere. Yet the idea that the Earth is flat still remains and is defended by a very real Flat Earth Society. Mythology going back to ancient Egypt is still alive and well in the Summum Church. I still find myself bewildered by those who refuse modern medical treatment, preferring ancient herbal remedies for often extreme illnesses. Regardless of our social and cultural evolution, even the most outrageous ideas are still alive, if only in a small knot of the population.

It makes sense. Once an idea is developed and disseminated throughout a population through language and symbolism, it develops a robust vitality. Add on other cultural elements, institute the idea into active populations, and you are looking at something very hard to kill. Once an idea is reified, or normalized and naturalized, within a population, it becomes the lens through which reality itself is viewed. So long as adherents remain who accept the reified elements of an idea and socialize them into the next generation, the idea itself will survive, if only in small holdout populations or as foundational elements of other knowledge.

The advent of new and even better ideas is not enough to eradicate the old. The eradication of an idea cannot be accomplished through debate and illuminating discourse. It’s not good enough to merely disprove an idea, even if that idea has no valid foundation at all. If the idea has been reified it will survive even the most acute and exacting proof to the contrary. People become invested in their ideas. To suggest that they change may actually require people to abandon their own sense of reality. This is especially true when ideas are the foundation of institutions with significant power investments, such as religions or political parties. To eradicate the idea means upending the institutional framework itself.

Ideas are, in a very real sense, well-nigh impossible to eradicate. This may be exasperating to those of us who actively strive to institute social change and see ourselves as working toward a more just and free society. Antiquated ideas, like those that define a “woman’s place,” are frustratingly tenacious. On the other hand, if beliefs and their associated norms and values were as malleable as some of us may want, society could become too unstable, unpredictable and normless.

Despite this, attempts have been made to eradicate ideas considered or constructed as dangerous. The Roman Empire tried on a number of occasions to destroy the dangerous concept of Christianity through persecution and execution. Christian churches were largely successful in wiping out indigenous, pagan beliefs—largely, but not completely. Of course, there are the infamous modern attempts to suppress ideas in Nazi Germany, manifest in Kristallnacht, book burnings and the final solution, and in the Soviet Union, represented in the gulags. Regardless, blatant policies of eradicating ideas always involve oppression and even violence. Eliminating ideas through knowledge and illumination is rare. After all, there are still those who insist that the Earth is flat. (The concept of bodily humours comes to mind as an idea that died through atrophy, though I’m certain if I look hard enough I’ll find it’s not so dead.) So when paranoids like Glenn Beck suggest that an idea, especially a well-established idea like progressivism, should be eradicated, this pronouncement should be immediately followed with an explanation of what eradication entails.

To eradicate an idea the rights of individuals to speak and to freely access information and personal networks that perpetuate these ideas must also be eradicated. Adherents to the offending ideas must be completely silenced, because mere discrediting will not suffice. Outright brutality must be brought to bear on those who would perpetuate an idea, and even then the idea will be perpetuated if only in secret.
Eradicating ideas requires some combination of the following:

  1. Attacking the ideas directly as being illegitimate or, most likely, a threat.
  2. Delegitimizing and dehumanizing those who hold these ideas.
  3. Censorship of the offending ideas.
  4. The denial of rights for and/or violence against those who hold the offending ideas.

Attacking ideas directly is not the same as debating and disproving. In fact, it has nothing to do with the relative validity of the ideas being attacked. To attack an idea, the claim must be made that the belief system constitutes a threat or carries some malicious intent. An idea may be defined as un-American, or unpatriotic—or seditious. The first Sedition Act in the United States was passed under the Adams administration just seven years after the ratification of the Bill of Rights. In the 20th century, the Wilson administration signed into law the Espionage Act, aimed at curtailing dangerous ideas against the war effort. Charles Schenck was imprisoned under this act for speaking out against the draft. The famous jurist Oliver Wendell Holmes defined Schenck’s ideas as a “clear and present danger” akin to shouting “fire” in a crowded theater. Three-time presidential candidate Eugene V. Debs was put in jail for speaking out against US entry into World War I, among other dangerous ideas.

After World War I, US Attorney General A. Mitchell Palmer and his assistant, J. Edgar Hoover, concocted a Red Scare that led to the arrests of thousands of innocent people who happened to be adherents of these dangerous ideas. According to Palmer, “The whole purpose of communism appears to be the mass formation of the criminals of the world to overthrow the decencies of private life, to usurp property, to disrupt the present order of life regardless of health, sex or religious rights.” Hundreds of radical thinkers, like Emma Goldman, were deported, while others, including the Nobel Peace Prize-winning Jane Addams, were placed under surveillance.

Later, the US House of Representatives instituted the House Un-American Activities Committee, a government body dedicated to stamping out the dangerous ideas of communism. It succeeded in bullying and largely silencing some of America’s greatest talents, including Arthur Miller, Dalton Trumbo, Richard Wright and Pete Seeger. HUAC became the model later used by Senator Joseph McCarthy to achieve his ends by suggesting that the United States was being invaded from within. “The reason why we find ourselves in a position of impotency is not because the enemy has sent men to invade our shores, but rather because of the traitorous actions of those who have had all the benefits that the wealthiest nation on earth has had to offer – the finest homes, the finest college educations, and the finest jobs in Government we can give.”

So when men like Glenn Beck suggest that ideas they don’t like are a “cancer” that must be “cut out” and “eradicated” they are echoing a long line of fear mongering and of stifling the environment of thought in the United States.

From delegitimizing ideas it is not a far leap to delegitimizing the people who hold them. Really, the only way to stop an idea is to silence people. That was the function of the Sedition Act of 1798 and the Espionage Act of 1917. The Palmer Raids and HUAC, McCarthy and the blacklists were efforts to intimidate those with so-called dangerous ideas. For those who could not be bullied, there was deportation, exile and imprisonment—even death. In the case of Sacco and Vanzetti, the fear associated with their radical ideas was instrumental in their death sentences.

One heart-wrenching example of the destructive potential of trying to destroy ideas comes from US policies toward native peoples. The expansion westward created for Americans an “Indian problem.” Native people were defined as savages, and their culture as inferior to, even in the way of, Euro-American culture. Indigenous Americans were routinely invaded, killed, removed from their homes and exiled onto reservations. Native people often resisted, and resisted violently. In many cases they resisted brutally. To suggest, however, that perhaps native people had a right to defend their homes and their cultures was not a legitimate idea. The American response to such Indian resistance was reflected in the observation of General Philip Sheridan that “the only good Indian is a dead Indian.” Despite the US government repeatedly betraying promises and treaties with Indian tribes, the term “Indian giver” remains a euphemism for those who go back on their word.

Once native people were defeated and exiled onto dependent reservations, the most invasive attacks on their ideas began. The Dawes Act mandated that native people abandon their communal lifestyles and take up property ownership like good Americans. In the meantime, the Bureau of Indian Affairs (BIA) mandated that native people give up their languages, rituals and religions. Native American children were taken from their parents and sent to boarding schools where their cultural identities were suppressed and replaced with appropriate, white American ideals. The first of these boarding schools was founded by an army officer named Richard Pratt. Pratt summarized his goals by saying, “all the Indian there is in the race should be dead. Kill the Indian in him, and save the man.” Native children were given Euro-American names, forbidden to speak their native languages, denied the practice of native rituals and forced to convert to Christianity. This was a concerted effort to destroy the ideas and identities of many cultures.

In 1889 a Paiute shaman named Wovoka had a vision. In this vision it was revealed that the land would be restored to the Indian people, the soil made fertile again and the great herds reborn. The white man would disappear if native people everywhere performed a great Ghost Dance. The Ghost Dance movement spread rapidly through Lakota country. This was a non-violent movement aimed at restoring native culture; weapons were not allowed during the Ghost Dance. However, the Ghost Dance was also defined by white Americans as a dangerous idea that had to be “cut out.” To a certain extent this is understandable, as the premise of the Ghost Dance was the “eradication” of the white race. This eradication, however, was to come through divine intervention, not human revolution. Leaders associated with the Ghost Dance were targeted for arrest. Among them was the great Sioux chief Sitting Bull, who was killed in an attempt to take him into custody. The fear and ignorance of a minority of white settlers alarmed that the Sioux were “dancing” led directly to the Massacre at Wounded Knee.

The process of eradicating ideas is not just a rhetorical construct. It has very real and terrible human consequences. I could go on with the stories. The bottom line is that in all of the history I’ve studied throughout the years, I’ve not found one instance in which an idea was “eradicated” without the denial of basic rights and even violence. Some ideas die from atrophy as they become irrelevant in the process of cultural and historical change. This is not the result of an open attempt to eradicate these ideas, but rather a social evolutionary process.

In all of the history I’ve studied there has only ever been one “cancer” on any society. That cancer is fear. But the cure for this particular form of social cancer is not “eradication” or some brutal process of “cutting out” the offending idea. The cure is reason. Fear is stoked by ignorance. Only those who are ignorant of history and philosophy can be motivated by fear. It is incumbent upon us to meet ignorance with knowledge and understanding.

“Because they breed…” Social Darwinism in Contemporary Conservatism

A couple of weeks ago I was teaching my students about the fundamental theories of sociology. Of course, no such lesson is complete without mentioning British biologist and social philosopher Herbert Spencer.

Spencer was a founding father of the structural functionalist perspective in sociology. It was his belief that everything that happens in society happens for a reason, that it serves some organic and evolutionary purpose. This was a valuable contribution to the social sciences, one later elaborated upon by Emile Durkheim.

Unfortunately, Spencer’s philosophy did not stop there. Indeed, before Darwin’s Origin of Species, Spencer was expounding theories of evolution and applying those ideas to the study of society. It was his belief that society was guided by a natural, evolutionary law, which would ultimately lead to a perfected equilibrium. To manifest this evolutionary outcome, Spencer advocated a radical concept of “freedom.” To Spencer, any government intervention in the functioning of society was a form of oppression and an obstacle to our progress toward some vaunted equilibrium. In fact, Spencer suggested it was only a matter of time before the state would be replaced by, in essence, a market-based institution.

Spencer, who coined the term “survival of the fittest,” was absolute in this concept. ANY intervention on the part of the government stood in the way of social progress. Whereas he understood the rationale of human empathy and charity, he advocated against such activity. Humanitarian intervention for the poor, the sick, the mentally handicapped or other marginalized groups only perpetuated these groups into the next generation, prohibiting the “natural” progress toward our ultimate human destiny…a free market, one might say “libertarian,” society. He admonished nineteenth-century liberal reformers for being so short-sighted. His politics was to let the weak die! It was, according to Spencer, for the greater good.

Usually, my students are stunned at this point by this coarse and vile philosophy. It’s my belief that they should be. Sociologists are usually in their field because they want to contribute to society, and this contribution does not include letting large segments of the population die from hunger and disease, even if it does fulfill some evolutionary purpose. Spencer’s theory is known as Social Darwinism (though Darwin himself was not a proponent, and Spencer’s evolutionism drew more from Lamarck).

I tell my students that there remain subtle whispers of Spencer in the modern discourse. Grover Norquist’s goal to shrink government to the size where it can be drowned in a bathtub would have met with Spencer’s approval. Ronald Reagan’s admonition that government is the problem, not the solution, is the underlying argument against all government programs from Medicare to welfare (but not the military, for some reason). Milton Friedman’s extremist free market economics has, at its base, a Spencerian motif.

Then I heard this statement from South Carolina’s Lt. Governor Andre Bauer with regard to government assistance: “My grandmother was not a highly educated woman but she taught me as a small child to quit feeding stray animals. You know why? Because they breed. You’re facilitating the problem. If you give an animal or a person ample food supply they will reproduce…” I was reminded that sometimes the riffs of Spencerism are not so subtle.


It’s interesting, if not somewhat appalling, that contemporary neo-conservatism draws from the inspiration of Herbert Spencer. Now I can’t say that neo-conservatives have read the works of Spencer, but Spencerism has long been a common element of conservative discourse. It has always been the justification for ignoring those in our society who are in need, for the exploitation of labor, for the exaltation of the business elite. Whenever you hear the rhetoric behind privatization of those things held in the public trust, such as public schools and even defense contracting, know that you are hearing the whispers of Herbert Spencer.

Spencer and the conservative tradition, even today’s neo-conservatives, do have some legitimate criticisms and concerns about the dangers of state power. However, it never ceases to amaze me that their solutions are consistently at the expense of the marginalized, disempowered and dispossessed. It’s hard to justify philosophically why government intervention in people’s lives is oppressive, but corporate intervention in people’s lives is an extension of “liberty.” A government bureaucracy is just this side of Stalinism, but corporate bureaucracy is a free market ideal.

Of course, Spencer’s philosophy hinges on the idea that the actions of free individuals are the driving mechanisms of this fantastical social evolution, but the actions of institutions, composed of individuals, are interference in the natural order. Could it be that government and even corporations might, through checks against their propensity to consolidate power, actually be contributing members of our society?

The Rights of the Corporation

Let’s face it. American democracy has been on life support for some time now. On January 21st, however, the five conservative members (or should we, from now on, refer to them as corporate conservatives?) of the US Supreme Court took it upon themselves to pull the plug. American democracy died that day with only a muted whimper, as Americans were more concerned about the fate of the Tonight Show and the corresponding careers of Jay Leno and Conan O’Brien.

The decision was Citizens United v. Federal Election Commission. Justice Kennedy presented the “opinion of the court,” or rather the opinion he shared with Justices Alito, Scalia, Thomas and Chief Justice Roberts, declaring that corporations have the same rights as individuals, and that pouring unlimited amounts of money into political campaigns is a manifestation of the corporation’s right to free speech. So there you go. Corporations are people. They are just people who cannot vote, hold office, eat, drink, plan a family, experience joy or pain or, necessarily, die. But they are people nonetheless.

In fact, they are very special people. People in America can only give $2,300 to political campaigns. Corporations, on the other hand, through their polymorphous arrangements and organizations, can give unlimited funds to campaigns and related Political Action Committees, 527s and ad hoc media outlets to support or destroy a political candidate.

Of course, corporations are not people, they are institutional constructs. In my sociology classes I teach my students what I call the Law of Institutions. The Law of Institutions states that the primary function of an institution is the perpetuation of that institution. Sociologically, this is not necessarily a bad thing, as institutional stability is prerequisite for societal stability. However, when the needs of people conflict with the Law of Institutions, the needs of the people must necessarily take a back seat.

Corporations, as institutional constructs, need only to perpetuate themselves. Corporations do not require clean air or medical care or humpback whales. Many people, however, feel that these things are important and make their political decisions based on the positions candidates make on these issues. Corporations, however, do operate quite well in environments conducive to exploitation and deregulation and the preservation of destructive commodity chains, risky investment options and monopoly.

Those candidates who toe the corporate line can expect the kind of political protection from corporations that would make the Mafia jealous. Those candidates, however, who might want to…oh, I don’t know…suggest that corporations shouldn’t be allowed to dump dioxin in our drinking water, or destroy habitats, or exploit their workers, could face a cessation of funds and an opponent with virtually unlimited resources to discredit him or her.

By removing what little restriction corporations had in political campaigning, the Supreme Court has transformed the United States of America into what I’ve long called a corporatocracy. According to Representative Alan Grayson, the only politician I’ve seen who is ahead of this issue, “You won’t have any more senators from Kansas or Oregon, you’ll have senators from Cheekies and Exxon. Maybe we’ll have to wear corporate logos like Nascar drivers.”

Grayson, in anticipation of this ruling, has put together a legislative package called the Save Our Democracy Reform Package. This legislation would impose a 500% tax on corporate expenditures on political campaigns, require corporations to disclose the funds they use in political campaigns, bar politically active corporations from participating in national stock exchanges, and prevent corporations that receive government contracts from political spending.

Grayson’s package makes sense and may have a great deal of support with the general public (once the whole Leno/O’Brien thing blows over, of course). It will certainly not have the support of the GOP, as Republican National Chairman Michael Steele is on record supporting the Supreme Court decision. Individual Republicans, however, such as Senators John McCain and Olympia Snowe, have some history with campaign finance reform and have expressed concern about Citizens United v. FEC.

As citizens, real flesh and blood people, we are democracy and we should be the first responders desperately trying to resuscitate American democracy before the body is cold. Call your representatives today and demand that they support the Save Our Democracy Reform Package. Find out where they stand on Citizens United and hold them accountable to democratic, not corporatocratic principles.

An Aside on this issue:

Where are the conservatives crying “ACTIVIST JUDGES! ACTIVIST JUDGES!”?

Lieberman, de Tocqueville, Tyranny and Health Care!

Lieberman vs. de Tocqueville

In his history-defining and seminal work, Democracy in America, Alexis de Tocqueville explored the ins and outs of early nineteenth century American democracy. Of de Tocqueville’s concerns about American governance, the most famous was a critique that has become known as the Tyranny of the Majority.  According to de Tocqueville, the majority in America is “omnipotent.”  This omnipotence ultimately translates into oppression of the rights of the minority, blindness to ancillary issues not within the aegis of the majority and legislative instability. “Hence the majority in the United States enjoys immense actual power together with a power of opinion that is almost as great.”

De Tocqueville’s criticisms of American democracy are valid.  Democracy in America should be required reading for all Americans.  Of special interest to one who studies the sociology of knowledge is the nineteenth-century philosopher’s unwitting nod to postmodernism when he suggests that the tyranny of the majority can be even more oppressive than any monarchy by virtue of its ability to define the very ideas of the citizens.

“The most absolute sovereigns in Europe today are powerless to prevent certain thoughts hostile to their authority from silently circulating through their states and even within their courts. The same cannot be said of America: As long as the majority remains in doubt, people talk, but as soon as it makes up its mind once and for all, everyone falls silent…I know of no country where there is in general less independence of mind and true freedom of discussion than in America.”

In going back to the well-thumbed pages of my volume of Democracy in America, I am still humbled by de Tocqueville’s analysis.  It’s no wonder that sociologists claim him as one of our own.  However, I’m left to wonder just how de Tocqueville would analyze our current debate on health care reform.

No doubt de Tocqueville would be confused.  Throughout the health care debate, poll after poll has demonstrated that majority opinion supports the establishment of a public option to control costs by providing an alternative to private insurance.   This, despite the negative PR blitz of this last summer and the ferocity of a Tea Party movement venomously opposed to health care reform or the prospect of even the slightest government interference in the free market.  If ever a majority had made up its mind, it is with regard to the public option.

Yet the fate of the public option is in peril. How could that be? How could de Tocqueville’s famous analysis be so far off in this (and many other matters, but that’s a different blog)?

Of course, we can’t be too hard on a nineteenth century social commentator.  How could he ever have predicted the rise of what I now call Pathological Liebermanism, or the Tyranny of the Lieberman?  This is a phenomenon in which the processes of American democracy invest disproportionate power in the hands of one elected official (in this case “Droopy” Joe Lieberman, but also Ben Nelson).  The majority of Americans support the public option.  The majority of legislators support the public option.  But majorities are not good enough.  Individuals like Lieberman and Nelson can reduce the “omnipotence of the majority” to flaccid humility.

We must remember that our founders and their immediate heirs had very little regard for the will of the majority.  They established norms through which they could thwart “mob rule” in the chambers of Congress.  Among those rules was the filibuster.   And the filibuster has become the weapon of choice for minority political parties.  Now, to be honest, I’ve supported the filibuster when it was being used to protect the Arctic National Wildlife Refuge.  On the other hand, the filibuster was also used to delay crucial civil rights legislation.  It seems that we have a love/hate relationship with this particular Senate rule.  In matters of health reform including a public option, an issue which I support, it is maddening to think that one senator, a Lieberman or a Nelson, can stall the will of the majority of Americans.

It is equally maddening that such senators can, in essence, put their filibuster-busting potential on the auction block.  In Nelson’s case it was an agreement that the federal government would pick up the tab for health care reform for the citizens of Nebraska…and only Nebraska.  Why should the people of Nebraska benefit at the expense of the rest of Americans, who would have to pay more to make up for the absence of that state? Because Ben Nelson won’t shut up? In Lieberman’s case it was the elimination of popular reforms, the public option and the expansion of Medicare.  Why should Lieberman, a man who himself enjoys the benefits of a single payer, government-run health program, be able to deny the same for the rest of us all by himself?

Indeed, de Tocqueville would be obliged to add a chapter or at least a long addendum to his master work to address the Lieberman Syndrome (another cool name for what we are witnessing).

De Tocqueville was also not privy to the idea of modern lobbying and money politics.  In almost every case the politician with the largest campaign coffers wins.  De Tocqueville’s assertion that our legislators change rapidly, leading to instability in our houses of government, has turned out to be false.  Indeed, the majority of seats in Congress are considered “safe” seats in which the sitting representative will almost certainly be re-elected. This fact, however, rather than stabilizing our legislature as one might predict using de Tocqueville’s reasoning, has led to an entrenchment of ideas and ultimately to an institutional polarity that one might suggest is even more destabilizing in effect.

This polarity is linked to campaign contributions.  Think about it.  A two party system is much easier and cheaper to fund than one in which multiple parties and ideas are competing for recognition.  As it stands, most corporations hedge their bets by donating large sums of money to both parties.  Imagine if there were three or four or even five parties demanding such control!

The polarity between liberal and conservative is also fed by campaign contributors. In the health care debate, the Center for Responsive Politics has done interesting research on campaign contributions and positions on health care reform.  They created a ratio between contributions from labor organizations and contributions from health care corporations and compared this ratio to the voting records of our senators.  Those senators with higher ratios, thus higher comparable contributions from labor, were more likely to vote yes on the Senate bill.  Of course, they were also more likely to be Democrats, indicating a traditional tendency for labor unions to contribute to this party.  Those who voted against the bill were more likely to have received larger contributions from the health industry.  It is important to understand, however, that both health and labor organizations are sure to hedge their bets by contributing large sums to both parties.   It is also paramount to note that the money does not necessarily represent a “majority” view. (1)

When it comes to money and health care those who are least satisfied with the status quo are almost certainly the least likely to contribute large sums to politicians.  Labor organizations such as unions may have larger coffers to represent the interests of working people, but this is only a segment of the population so affected. Obviously the tyranny of the majority is not driving this debate, nor is the majority in any way “omnipotent” in the de Tocquevillian understanding of the term.

The health care debate might help us define American politics in a post de Tocquevillian way.  It is the Lieberman Doctrine (Yes, coining terms is my new hobby!) that seems to be the new “omnipotent tyranny” influencing contemporary American democracy.

Now this post is not a condemnation of the filibuster, nor is it a confirmation of the legitimacy of majority rule.  In this matter I happen to have the comfort of speaking with the majority.  That is not always the case.  When I do represent the minority opinion I  want to have processes in place, like the filibuster, to protect my interests. I also recognize the reality that the majority is not always right. But certainly these ideas must be revisited.

It is the contention of the Journal of a Mad Sociologist that any great disparity in power, regardless of the holder[s] of such power, is a danger to democracy and humanity.  The concentration of power in the hands of the majority has the potential to be just as oppressive as the concentration of power in the hands of Joe Lieberman. Neither should have the power to override what this outlet has defined as a human right, the right to health care.


(1) In most cases this moneyed politics tends to limit the differences between parties.  In health care, however, polarization is the result as a conflict between funding sources emerges to define the debate.

What Nobel Peace Prize?

Obama Peace Prize

A Plausible, Fictional Scenario

Scene: conference room of the Nobel Prize Committee:

Committee Person 1: “Hey, I have an idea! Why don’t we give the Nobel Peace Prize to President Barack Obama? This might motivate him to abandon the militant/imperialist policies of his predecessors and pursue more peaceful and reasonable policies for solving international problems like terrorism.”

Committee Person 2: “That’s a great idea.  I second that motion!”

Committee Chairman: “All in favor?”

Committee: “Aye!”

So How’s This Working Out For You?

                About a month after taking office, Barack Obama sent 17,000 soldiers into Afghanistan. A few months later he learned that he was the recipient of the Nobel Peace Prize. A teacher might call this positive reinforcement of negative behavior. Such a teacher would predict that the consequence of this reinforcement would be further escalation of warlike behavior. Just two weeks before giving his Nobel acceptance speech, he proved the hypothetical teacher correct by requesting that 30,000 more troops be sent into an untenable battlefield. In the meantime, the Obama Administration continues to wage and escalate a secret war in Pakistan manned by flying killer robots.

                The new Nobel laureate has done nothing to reverse the breaches of humanity committed by the Bush Administration. Despite his lofty rhetoric, we remain a nation under a surveillance infrastructure that ignores our rights. GITMO may be closing (someday), but you can forward any mail to Bagram, which the Obama Administration insists is outside the jurisdiction of the US Supreme Court—for exactly the same reasons given by Bush.

                Granted, the sheen of the Nobel Prize has long been tarnished by the likes of Henry Kissinger,  and this isn’t the first time that a US president has received the prize despite questionable peace credentials.  Barack Obama is just another layer of corrosion and rot, just another disappointment in the chronicle of peace.  Obama offered us change we could believe in.  Well, there are those who believe in Bigfoot and UFOs without actually seeing them.  It appears that actual change in America will be equally elusive. A more peaceful foreign policy will be no exception.


The Escalation of War


                None of this should come as a surprise. The relationship between the technologies of power and the empowered is one of mutual reinforcement. Once measures such as war, surveillance and extrajudicial activities are set into place they are very difficult to dislodge. The established infrastructure and bureaucracy that support these measures become entrenched in the system, institutionalized and self-perpetuating. The benefits these institutions confer on the powerful then create a symbiosis that is, in fact, parasitic to enlightened, democratic societies.

                And what ties these variables together?—war. War justifies the use of such technologies and the expansion of the corresponding institutions. If a nation can just remain in a perpetual state of war, then abuses of power can be defined as “national defense.” And “national defense” is the key. Enlightened and ethical people have long since turned their backs on glorious conquest. As Obama conceded in his acceptance speech, the only just war is a defensive war. Hence the Department of War becomes the Department of Defense, and massive military investment in offensive weapons is called “defense spending.”

                Yet institutions of power are still very much motivated by the glory of conquest.  Entire industries have developed to fulfill imperialist ends.  These industries must be fed.  So with the rise of the military industrial complex the United States has been in a perpetual state of warfare—every single operation defined as one of defense.  From containing communism to the Domino Effect in Indonesia, Central and South America, and Cuba to non-existent weapons of mass destruction in Iraq, the United States has always defined its violence toward weaker nations as defensive.


9/11 and the War on Terrorism


                When the Twin Towers fell on 9/11, it became clear that there really was a necessity to defend ourselves. Despite our massive (the most massive in the history of mankind) “defensive” posture, the United States turned out to be just as vulnerable as everyone else on the planet. Terrorists with enough guile were a threat to our security. Something had to be done.

                And something was done–the wrong thing. Despite the danger posed by al Qaida, a non-state organization, the United States decided to invade a sovereign nation that, in and of itself, had nothing to do with 9/11. Conveniently, however, that nation did have a great deal of natural gas that we wanted, and a government that was easily vilified. It was a perfect enemy for an imperialist military structure.

                The justification for invading Afghanistan was that they refused to turn over Osama bin Laden.  Of course, like all other excuses for going to war, this was not true.  Indeed, Afghanistan did what nations always do when they have something another nation wants—they negotiated.  Most negotiations begin with saying no.  Of course, negotiation was not in the interests of the Bush Administration. Power must have its wars.  This is especially true for power of questionable legitimacy, like the first term Bush Administration.

                Osama bin Laden was never the real target, at least not the only target. If he were, the US would have developed a different response. Any real strategy for dealing with non-state terrorist organizations would preclude traditional warfare as inadequate. Terrorism must be countered through international policing. Traditional warfare is designed to pit one state against another. It is not designed to counter non-state entities like al Qaida. Using traditional warfare in such an instance is akin to bombing Chicago to fight the Mafia. Bin Laden and 9/11 were nothing more than the pretext for expanding military/imperial interests in a resource-rich nation. Afghanistan does not satisfy the requirements for a just war.


Traditional Warfare in the War on Terror


                That we invaded Afghanistan because it was the center of al Qaida’s terrorist network was demonstrably false.  International investigations revealed that the plan was designed, implemented and carried out from Hamburg.  Yet there was never a discussion about invading Germany to defend our nation.

                That Afghanistan had to be invaded to rob al Qaida of an important base of operations cannot be supported.  Al Qaida is an extra-national organization.  It’s unlikely that they need a base of operations.  However, if they do need training grounds and such there’s nothing to stop them from packing their terrorist bags and setting up in another country. Indeed, this appears to be exactly what they’ve done. According to reports there are virtually no al Qaida left in Afghanistan.

                At best, the United States tried to resolve a postmodern problem, the advent of extra-state terrorism, by applying a medieval solution, military invasion. All of the progress against terrorist organizations has come through international policing and smaller counterterrorism strikes. The invasion of two sovereign nations–and military entanglement in a third—has done nothing but bog down America’s military and economy, alienate us from the rest of the world, and give extremists rhetorical fodder with which to convince others that the US is engaged in a holy war against Islam.

                Of course, policing and counterterrorism do not perpetuate war making institutions.  Though there’s profit to be made in such actions the obscene wealth accrued by war industries is just not there.  Also, ongoing, small scale, largely off the map operations do not serve to perpetuate the interests of power quite like a good, old fashioned war.


Obama and Peace


                Now that these military adventures have been discredited and our soldiers are entangled in regional, cultural conflicts such as those between the Shi’a and Sunni in Iraq and the Tajik and Pashtun in Afghanistan, the solution being offered by the Nobel laureate is…more war. Obama’s Nobel acceptance speech read less like an affirmation of peace than a talking points memo for war.

                President Obama has explained that the US must escalate combat operations in Afghanistan to ensure stability and support for a newly founded “democratic” government.  This absurdity is betrayed by the fact that the Karzai government is blatantly corrupt and considered by Afghans to be a puppet government of the US. Recent elections cannot, by any measure, be considered “democratic.” That our soldiers, often referred to as our “treasure” by politicians and pundits, should be expended in defending a government entrenched in the drug trade should be a national outrage.  That our military can function in Afghanistan only by bribing the Taliban for safe passage to the very battlefields where it will be fighting the Taliban is only the most obvious example of the absurdity that is this particular military adventure. 

                Yes, Obama has conceded that there will be a timetable for withdrawal. American commitment in Afghanistan is not open-ended. Well, that’s great, if they really mean it. Not twenty-four hours after Obama’s Afghanistan policy speech, members of the Administration were back-pedaling on the whole eighteen-month timetable. Well, it might be eighteen months before we start withdrawing troops, or eighteen months is an estimate, not a firm policy statement, or there may be contingencies in which troops will remain longer. If Obama’s GITMO policy is any precedent, we can expect this timetable to be extended by at least 100%. We’ll see in eighteen months, but I wouldn’t hold my breath.

                But what do we expect from this latest winner of the Nobel Peace Prize? Here’s a man who is making efforts toward limiting nuclear weapons, yet when it comes to immoral and internationally condemned weapons that are actually being used, like landmines and cluster bombs, Obama has been silent. All enlightened people throughout history have condemned war. Some such people have even won the Nobel Peace Prize. Activists throughout the world have the audacity to hope for peace and humanity, but should not expect such from a sitting American president, regardless of the Nobel Peace Prize draped over his headboard.

                Perhaps next year the award will be better vouched.




For References go to the Reference Sources page on the Journal of a Mad Sociologist

For a PDF version of this Blog Click Here


UPDATE 12/30/2009



[Embedded MSNBC news video]

A Sociologist’s Take on a Middle School Dance

Middle School dance

I had the opportunity to attend a middle school dance last week.  I didn’t volunteer.  Apparently I missed a meeting!

I didn’t dance.  I’m not much of a dancer.

But I am pretty good at watching crowds and identifying patterns. I didn’t expect to see anything different at this particular dance than at any other dance I’ve been to, regardless of the age cohort. There were the typical groupings: cliques of no more than five sticking together. Boys and girls largely segregated their groupings, as is to be expected in that age group. There were the wallflowers and the aimless kids wandering around wondering just what they were doing at this event. The dance floor population was largely girls. Some of the dancing was an obvious expression of sexuality that adults were largely uncomfortable with.

There were a couple of phenomena that struck me, however. First was the nostalgic kick I derived from an apparent renaissance of break-dancing. Of course, it’s not called break-dancing any more, but it was there. Kids would gather in circles and watch select dancers do their tricks. They weren’t bad. Some of the dancers would even show each other their moves, then put on a show. It so happens that break-dancing hit the scene around the time I was of middle school age (I attended a junior high). The breakers then wore baggy, rayon jogging suits, carried huge boom boxes and often pieces of cardboard as a portable dance floor. The breakers today didn’t seem so prepared. Boom boxes have been replaced by iPods and rayon jogging suits have gone the way of the dodo, but the kids still had a great time. They believe they invented something, and that’s fine. I got a kick out of watching them.

The break dancing circles were open.  Anyone regardless of talent was able to enter.  Of course, less talented sorts didn’t remain as they were conspicuously outclassed by the more talented.  The circles were also open to girls as well as boys.  Many of those who entered the circles had their followings as well as their signature moves.  Like break dancing of old, the dancing represented street contests, one dancer challenging another to match and best his or her moves.  On a couple of occasions boys in the circle danced for specific girls on the edge of the circle.  As per the script of such interactions the girls feigned indifference, refusing to be impressed by even the most impressive dance feats.  Once the boy’s back was turned and the interaction over the girls demonstrated their glee at being so spotlighted. They had given a perfect performance.

The next thing that I noticed was a contrast to the dances I attended when I was young. In my time the dance floor was often crowded with people doing their own thing to the music. There was little synchronicity. I was surprised to see just how many group-oriented dances there were for the kids last week. Of course, this observation is from only one dance, so I don’t know if this is the trend. The dance steps, however, were popular and understood by many of the students. This leads me to believe that this is standard fare for modern youth dances. Yes, there were opportunities for dancers to express themselves individually, but many songs offered defined, synchronized dancing similar to line dancing. One song explicitly described the dance moves that were to be done, from the cha cha, to stepping to the left, to stomping with your right foot.

Kids during this dance spent a great deal of time in this group endeavor.  Could this be indicative of a general conforming trend among young people? Or perhaps, since this was a school sponsored event, these particular songs were selected for the purpose of increasing groupness, assimilation and conformity.  Probably not, but it caught my attention.

Curing Homosexuality? Why?

[Embedded MSNBC video of the interview]

Above is a classic interview conducted by Rachel Maddow with Richard Cohen. Cohen believes that homosexuality can be “cured” if you follow his methods. Rachel does a fantastic job of destroying his logic. My only criticism is that I wish Rachel had asked Cohen why homosexuality should be cured. Why should what amounts to a variation of desire, normal desire, be subject to “treatment” and thus defined as an illness? That gay people should be subject to the kinds of stigma that convince them that their love and desires require treatment is an abuse of the very concept of “therapy.”

To be an illness, a phenomenon must inhibit the ability of an individual, or of those who share their lives with that individual, to live a long and satisfying life. Many homosexuals, including Maddow, demonstrate that whom one desires and how one expresses that desire in no way impacts one’s health as defined above. Nor does it impact the health of others who share their lives with homosexuals. The truth is that there’s nothing pathological about being gay. Social stigma is the only motivation for defining homosexuality as pathological. And it must be remembered that the pathologization of sexuality is…profitable. It’s especially profitable for folks like Cohen, who have a line of products for sale by which to “cure” what is, in essence, everybody else’s problem.

By playing against a socially constructed stigma, playing against a socialized sense of inadequacy by convincing people that they are flawed, in need of help, people like Cohen do nothing more than contribute to hatred and prejudice. This is true regardless of their so called prescriptions of love and understanding. By virtue of the fact that they define a variation of sexual/emotional desire as something that needs treatment they are defining those who desire and feel in a certain way as being sick, pathological–inferior.

Social Closure and the Conservative Mindset

Conservative Enough

Perhaps we should feel sorry for the conservatives in the ninth year of the new millennium. Thirty years of conservative ideology has produced great wealth for the highest tiers of society, but stagnation at best for the rest of us. Then, in 2008, with the possibility of continuing conservative dominance in American politics at hand, the whole free market house of cards collapsed around us; perhaps this, more than any campaign strategy, is what brought Barack Obama to high office. With conservatives and conservatism so effectively against the ropes, it’s understandable that the response is somewhat punch-drunk.

Indeed, it’s predictable that conservatives might want to regroup, consolidate their members and redefine themselves as a reference group. Exactly what does it mean to be a conservative? And does that definition include being a Republican? After all, conservatism has been the defining ideology of the Republican Party for over half a century. The Republican Party, however, does not hold any special aegis over conservatism itself. Indeed, the opposite could very well be considered true.

Conservative Republicanism has collapsed.  There’s nothing controversial about this statement. But conservatism itself is alive and well…and boy are they pissed! Right now they are looking for an institutional carrier to present their worldview. Conservative leaders, such as they are, have stated unequivocally that the Republicans may not remain their chosen vehicle toward this end.

As it stands, the conservative mentality is analogous to what sociologists call social closure.  This is when a social group strictly defines the parameters of inclusion and membership.  These parameters preclude a possibility of members bridging group identities by participating in other social groups.  The idea behind closure is that there are certain, significant benefits to being a member to a particular group, in this case a substantial conservative voting bloc, and that these benefits should only be vouched to true adherents of group norms.

Politics, on the other hand, often requires significant interaction, negotiation, compromise and overlap (bridging) between social groups.  Closure based on ideology is frustrated by political processes.  Conservatives and liberals, being ideologically driven rather than politically driven, are often disappointed by those whom they believe to be their ideological representatives in politics.  Often,  dis-empowered groups will accept a certain amount of bridging in order to attain some voice in the corridors of power. However, what happens when the dis-empowered group was only recently in power?  Could the current political paradigm be a case study?

Having been discredited, conservatives are retrenching. The problem, according to the conservative weltanschauung, was not conservatism, but rather that past leaders (except Reagan, of course) weren’t conservative enough. By compromising with liberals and centrists the conservative cause was lost. Ergo, the solution is not to rethink the values of conservatism, as values are very rarely subject to critique, but rather to purge conservatism of its compromised elements and resurrect the “true” conservative.

This notion became apparent to me while I was discussing (read arguing) politics with my conservative brother who claimed that John McCain was not a real conservative, but rather a “progressive.” Granted, the word “progressive” has gone through some etymological convolutions in the last hundred years, but to make such a suggestion was indicative of the retrenching mindset of conservatives. In my brother’s mind, and that of many others, John McCain, the Straight Talk Express, is not a “real conservative.”

During the 2009 off term election the Republican congressional candidate in New York’s 23rd district, Dede Scozzafava, was derailed by conservative stalwarts for not being conservative enough on issues like gay rights and abortion.  In one blog, conservative commentator Michelle Malkin even referred to Scozzafava as a “radical leftist.” Instead, conservatives rallied behind Conservative Party candidate Doug Hoffman, effectively handing the district to a Democrat for the first time since the Civil War.  This could be looked at as a demonstration of the benefits of belonging to the conservative group, and of the dangers of choosing a direction contrary to those of conservatism.  Some view this move as a nihilistic persecution of independent thinking. Liberal commentators like Rachel Maddow have coined scozzafava as a new verb to describe this strategy.

Republican National Committee chairman Michael Steele may well be on the spot if leaders in the RNC bring the so-called “purity test” to the table at its winter meeting in Honolulu (talk about trouble in paradise). The purity test, recommended by some RNC leaders, enumerates ten positions of real conservatives. Those conservatives who agree with fewer than 80% of the positions would be denied RNC funding. The test, ironically titled “Reagan’s Unity Principle for Support of Candidates,” may well have the opposite effect of unity. Indeed, according to liberal columnist John Nichols, even Ronald Reagan would fail the purity test that bears his name.
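The mechanics of the proposed rule are simple enough to sketch. Only the 80% threshold over ten positions comes from the proposal as reported; the candidate answers below are invented placeholders, not the RNC's actual positions.

```python
# Hypothetical sketch of the proposed "purity test" rule: a candidate must
# agree with at least 80% of ten enumerated positions (i.e., 8 of 10) to
# qualify for RNC funding. The positions themselves are placeholders.
THRESHOLD = 0.8
NUM_POSITIONS = 10

def qualifies_for_funding(agreements):
    """agreements: list of booleans, one per enumerated position."""
    assert len(agreements) == NUM_POSITIONS
    share = sum(agreements) / NUM_POSITIONS
    return share >= THRESHOLD

# Agreeing with 8 of 10 positions passes; agreeing with only 7 fails.
print(qualifies_for_funding([True] * 8 + [False] * 2))  # True
print(qualifies_for_funding([True] * 7 + [False] * 3))  # False
```

Reducing ideological fidelity to a pass/fail score like this is precisely what the next paragraph describes as McDonaldization.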

Purity is a typical theme when it comes to establishing social closure. Indeed, a purity test McDonaldizes the process of establishing purity. With ten simple rules one can determine who the real conservatives are (and scozzafava the rest). This process is endorsed by conservative notables such as Glenn Beck and Dick Armey. By bringing such a technology into the Republican Party the aim is clear: to reinforce the Republican Party as the vehicle of conservative ideology.

Historically, however, the Republican party has been home to diverse ideologies.  The list of Republican Heroes on the GOP website includes such varied personalities as Abraham Lincoln, Frederick Douglass, Dwight Eisenhower and Ronald Reagan (interestingly, it does not include Theodore Roosevelt).  Should a party that makes room for such luminaries, some of whom have solid liberal or progressive credentials, become a purist expression of conservative ideology? According to claims made by GOP leadership, it’s what Ronald Reagan, the new patron saint of conservatism, would have wanted. (Reagan was unavailable for comment)

Indeed, it’s understandable that ideological conservatives should aim to control one of the two major political institutions in the country, the Republican Party. Indeed, it could be argued that this process has been going on for quite some time–at least as far back as the Goldwater campaign. At the height of conservative power the Republican Party had no room for politicians like Lincoln Chafee, who tended to have a higher liberal voting percentage than conservative Republicans were comfortable with (calculating liberal/conservative voting percentages is another example of McDonaldizing). The term RINO, Republican In Name Only, has been a term of derision for many years.

And certainly, conservatives are not alone in trying to manipulate the direction of a political party. One might say the same about liberals with regard to the Democratic Party. Neither party, having achieved political power, has a demonstrated history of ideological purity. The argument could be made that, as institutions, they do not owe allegiance to any particular ideological movement. Their main function is to remain empowered institutions. However, I would argue that conservatives have been a much more functional core of the Republican Party than liberals have been for the Democrats. The ideological spectrum between, say, Ben Nelson and Bernie Sanders among the Democrats is much wider than one would find between any two Republicans. The current health care debate demonstrates more solidarity among Republicans than Democrats, and I would hypothesize that this can be largely explained on ideological terms.

The current pursuit of purity among conservatives in and out of the Republican Party seems…different, more persecuting. The advent of purity tests and a punditocracy like Glenn Beck, Rush Limbaugh and Michelle Malkin gives this movement the aura of a postmodern Inquisition and purge. It remains to be seen if Republicans will choose to continue this course. Having been soundly routed, Republicans may still see the conservative route as paying dividends.

It’s unlikely that this route will pay dividends. Ideological purity narrows the appeal of a political party. In a democratic society it is important for the institutions of power to be as inclusive of as many popular groupings as possible, or at least to bridge into those groups. Exorcising centrists, or those with multiple leanings, in the interests of ideological purity may be self-destructive. Of course, this is the postmodern era, in which the media talking heads mentioned above can sway, at least temporarily, the collective consciousness of large segments of the population.

In the meantime, it may be that conservatives will abandon the Republican Party for more ideologically pure third parties such as the Conservative Party, Libertarian Party, or even a nascent Tea Party.  Ironically, this is what I have been advocating with regard to both Republicans and Democrats for many years now.

Some important videos on body image:

Our kids need to know that even the most beautiful of celebrities don’t look like their pictures in the magazines or on the screen. The reason they believe they can never be as beautiful as their most admired celebrities is that no one, even the celebrities themselves, can look that good in real life. For some time now we’ve been involved in an ongoing and healthy debate about the nature of beauty and the consequences of creating fantastic expectations of real people.

What I find interesting is that all of the videos below emphasize the unrealistic expectations placed on women to live up to the standards of female models and actresses. Are women the only ones so influenced? Could things like steroid abuse and eating disorders among boys also be tied to unrealistic images of men and masculine bodies? Where’s the advocacy for boys?


Health Care is a Right!

 Health Care is a Right

(Click the logo above for a PDF version of this blog)


                There’s a fundamental argument in the health care debate that has yet to be truly elaborated. The central question is: does our society recognize access to health care as a right or as a privilege? That this question has not been explicitly defined is intriguing, as it could very well push the discourse in a more humanistic direction. That might be the very reason for not framing the health care debate in such a way. After all, rights are very popular, but frowned upon by the power elite as an undue burden in their quest for a wider profit margin.

                As it stands, it looks as though the meritocratic bias of our society predefines access to health care as a privilege.  One of my students, echoing the capitalist paradigm, defined health care as a commodity. Like all commodities, health care is subject to supply and demand curves and scarcity.  Indeed, this is a strong argument from the perspective of those providing health care and health insurance while trying to secure a profit for themselves and their shareholders. After all, sick people are expensive.

                That health care is a commodity is a great descriptor of our current situation.  In a health care market the goal is to maximize profits.  This is done by cutting the costs of health care providers while at the same time steering health care consumers to the most expensive products.  So the people who really need health insurance are more likely to be dropped, denied or subject to exorbitant costs. Those who can afford access to health care are often subject to the most expensive procedures, such as surgery over therapy.  This leaves millions of Americans who can only access health services in the emergency room, publicly funded clinics or the extraordinary efforts of charitable organizations.  In these instances the costs are passed on to consumers.

                If health care is framed as a basic human right, however, then profit motive must take a back seat to universal access, as making the claim that some people have “more” rights than others is culturally awkward. It is also subject to a different social process, as rights are socially and historically contingent. The founding fathers had no need to elaborate a right to health care.  For them, access to health care was almost universal, and quality of care was comparably equitable regardless of class. It’s likely that the health care that George Washington received was not much better than that received by anyone else.

                But that was before the advent of modern medicine. The medical field has since evolved from an exercise of natural philosophy to a specialized profession of scientific and technological innovation. The miracle of modern medicine has improved the quality of life, but has also contributed to increased costs of care. The resulting health care industry can be described as a typology along a horizontal axis of cost and a vertical axis of quality. In such a system some are bound to be left out while others will be privileged.

                For a long time this typology was not so pronounced as to inspire the scrutiny of society as a whole.  The “miracle” of modern medicine allowed more people to survive infancy and live into old age. We have since taken it for granted that our children will survive and that we will grow old. However, the cost of this historical revolution is the shift in medical needs from the treatment of infectious disease and injury to the much more expensive chronic diseases that correlate to old age and the limits of medical science.

                Large corporations developed to fill the market demand for medical access.  These corporations were not motivated by the desire to do well by their fellow man, but rather to turn a profit.  In an exercise of free market principles at their best, health insurance companies thrived by guaranteeing access to care…for a price. Meanwhile, public clinics and government programs such as Medicare and Medicaid picked up most of the slack among unmarketable old and poor people.  This system lasted long enough for insurance companies to become entrenched stakeholders in the medical system, earning them a place at the discussion table for a crisis in which they themselves are complicit.

                Now we face a culmination of these dynamics. As usual, the realities do not break down along clear lines of right and wrong. Most people would agree to the simple statement that everyone (except illegal immigrants, but that’s another essay) deserves access to health care. The question becomes more complicated when we add factors such as how much access and to which technologies—and who pays for it, the consumer or the taxpayer? The fact that there exists an established institutional framework, with its requisite capitalist paradigms, wielding significant wealth and power is yet another obstacle to meaningful reform and a discursive restraint on radical debate.


Universal Health Care


Privilege vs. Right


                The claim that health care is a commodity is a paradigm of privilege that denies the concept of an individual right to health care.  After all, individual rights are not for sale. This constraint predetermines the direction of the debate and ultimately the outcome of reform.

                A privilege is, by definition, something that is bestowed upon individuals through some social process.  The sociology of privilege is based either on merit or inheritance and is a defining feature of status. Achieved status, or status based on presumed merit, is the default discursive formation in American society.  Being of high status is assumed to be based on some individual quality, work ethic, drive or competence. Therefore, any privileges corresponding to status are deserved. Those who do not have such privileges may be offered some charitable entitlement, or they may simply be written off as undeserving.

                Yet status is also ascribed, or inherited based on factors such as the socio-economic position of one’s parents, one’s race or gender, one’s biological contingencies or limitations.  Being born without a congenital health defect privileges one to easier access to health care just as being born to wealthy parents does.

                Rights, on the other hand, are much trickier matters.  By virtue of America’s cultural canon, such as the Declaration of Independence, the Federalist Papers and the Constitution, human rights are “inherent” just by virtue of being American (or as some might suggest, human).  As such, human rights cannot be denied an individual based on status without some established and limited system of due process.

                In the United States the claim to rights is easy to postulate when those rights can be identified in personal terms.  The right of the individual to speak or to worship, or even to gather with other individuals is easily defensible. Such rights do not require an investment on the part of others for exercise. The right to free speech does not have attached to it a corresponding right to be listened to, in other words.

                Other rights, however, require a commitment on the part of others if the individual is to exercise them.  The right to due process including a jury of one’s peers, for instance, requires others to take on the responsibility of serving jury duty. The right to privacy implies a responsibility to keep out of other people’s affairs. A more abstract right that most of us would recognize is the right to an education.  But in order to exercise that right a sacrifice (one might argue a responsibility) must be made on the part of the community in the form of taxes, land acquisition and the child’s time spent away from parents and out of the marketplace. Such rights, however, often come with a certain amount of contention as we debate just how much the community is to sacrifice as compared to what the community can expect to get in return.

                Thomas Jefferson did us no favors in defining these inalienable rights so eloquently as life, liberty and the pursuit of happiness.  All three such rights require some expense on the part of the community if they are to be manifest in the individual.  For instance, it could be argued that one must have access to meaningful work at a living wage to enjoy life, real freedom and the opportunity for happiness.  However, what is the responsibility of the community, of the society, of government, in assuring such access?

                Health care fits nicely into this paradigm.  Inequalities in access to health care certainly correlate to inequalities in life expectations, the liberty that comes with good health and the unfettered ability to pursue one’s own ends.  Those with enough resources enjoy privileged access to these rights.  However, rights should not be subject to one’s socio-economic status.  Rights are inherent in the individual, not in their social position.


How do you know it’s a right?


Toward a Right of Health Care


                Health care was not an issue for our founding fathers. Before the advent of germ theory there was not the same disparity in access to health care as there was in the other matters the founders addressed, such as speech, assembly and privacy. Consequently, health care is not neatly defined in our 220-year-old Constitution.  The founders recognized, however, that as times change so does our concept of rights.  That’s why our Bill of Rights included the 9th Amendment, which ensured that the rights listed in the Constitution are neither the beginning nor the end of the story.

                Since the advent of modern medicine, contemporary statesmen have come to recognize the importance of health care to the satisfaction of individual rights.  Many governments and charters formally recognize access to health care as a human right. Not the least of these charters is the Universal Declaration of Human Rights, Article 25.  As a signatory to the UN Charter, the United States is bound by our own Constitution to recognize the right to health care.

Yet this does not stop others, like Theodore Dalrymple, from suggesting that there is no such right.  In his Wall Street Journal editorial, Dalrymple, a pseudonym for British physician Dr. Anthony Daniels, states:


Whenever I deny the existence of a right to health care to a Briton who asserts it, he replies, “So you think it is all right for people to be left to die in the street?” When I then ask my interlocutor whether he can think of any reason why people should not be left to die in the street, other than that they have a right to health care, he is generally reduced to silence. He cannot think of one.


                I have one! How about the Hippocratic Oath: “I will treat without exception all who seek my ministrations.” Or how about this version of the Oath: “I will remember that I remain a member of society, with special obligations to all my fellow human beings…” Shouldn’t we expect a physician to accept an oath taken in their profession as binding? Perhaps I’m asking too much from Dr. Daniels.

                How about the story of the Good Samaritan? Well, I’m sure Dr. Daniels doesn’t wear a WWJD bracelet.

                How about the fact that it is universally, morally repugnant to allow the sick to suffer and die without giving whatever succor and aid is at our disposal? According to the American Journal of Public Health, an estimated 45,000 deaths a year are associated with lack of health insurance.  The study found that those who do not have insurance are 40% more likely to die from their illnesses than those who are insured. This study offers the moral equivalent to Dr. Daniels’ people dying in the street.

                Jefferson declared that human rights are “self-evident,” “endowed by our creator,” that they are “inalienable.” This was nice rhetoric, but Jefferson was wrong.  Rights are a social process of the people demanding that those in power treat them with dignity, and asserting such through whatever means available.  As the chronicle of human history demonstrates, the struggle for human rights is ongoing and expanding. It is time to assert a basic and fundamental right to health care.  

Testing Ourselves into Lower Proficiency

Our school system needs to be reformed. I believe that, and I have been an advocate of reform since the beginning of my career.  Unfortunately, some time ago, the claim to reform was co-opted by those pushing “higher standards” and “standardized testing” as the means to that end. And it hasn’t worked. And it can’t work. Testing is not reform; it’s just a means of measuring something that still needs to be reformed.

The latest example of the futility of testing ourselves into a better education was reported by the New York Times.  It turns out that, in an effort to meet the high standards of proficiency mandated by No Child Left Behind, fifteen states have hit upon a plan–lowering their standards of proficiency. If I’m not mistaken, this is the opposite of reform.

At what point are we going to abandon this vain (though lucrative for publishing companies) attempt to test ourselves into a better education?  Granted, we should not abandon assessment, or establishing standards, but we should not delude ourselves into thinking that assessment and standards will lead to improvements in education.

We are sold this bill of goods by those who insist that American education is a failed system and that the only answer is to set high standards and hold students, teachers and schools accountable to those standards through testing. Well, that’s great.  We’ve been doing that for eight years and what do we have to show for it? Students who are getting really good at taking tests; teachers who are really good at teaching test-taking skills; and schools that are really good at administering tests.  Can we say this is educational reform? Of course not.

The fact is that the American education system is not a “failure” per se.  Many students receive an exceptional education; most receive at least an adequate one. Of course, there are those who are woefully under-served. There are many inequities entrenched in education as an institution: racial/ethnic, socio-economic, gendered and regional inequities, as well as inequities with regard to individuals with different learning modalities.

Our schools can address these inequities. There are much better ways to educate our children than trying to coerce them into learning by burdening them with high-stakes tests.  The problem is methodological, so the solution must be methodological. We cannot keep doing the same failed practices and expect that “assessing” the outcomes of these practices will lead to better results. We have to change how we structure our schools, what we do in the classroom.  All of our social constructs regarding teaching and education must come under scrutiny…scientific scrutiny.

Alternative Fuels! It’s not just for envirofascist-neopagans anymore!

For the last couple of years I’ve come to realize that the debate about global warming has become so polarized that it might be time for environmentalists to change tack.  As a paradigm-shifting discourse, global warming has gone as far as it can.  Those who accept the mountains of data verifying the reality of global warming may be convinced of the righteousness of their cause; however, those who deny the validity of global warming are equally entrenched in their own ideas. I use the term Ostrich Theory to describe such individuals.  They are so invested in their beliefs that they will ignore any evidence to the contrary and embrace any supporting evidence regardless of its inanity. Another term for this is “confirmation bias.”

That’s not to say that environmentalists and scientists should stop educating us on global warming. Indeed, there may be some whose minds are not made up.  There’s the upcoming generations, those who will be most affected by global warming, who need to know what they are walking into.  Also, it’s possible, albeit unlikely, that some global warming deniers can be convinced of the validity of global warming claims (of course the same could be said about those who accept global warming!)

Regardless, the debate has stagnated.  Instead of a dialectic in which the merits of one side prevail over the other, we are faced with a polarized issue with clear and equally entrenched lines of division and disagreement.  Environmentalists, those who are driving the debate on global warming, can only lose legitimacy in such a contest.

For this reason I suggest changing our paradigms for encouraging the development of alternative fuels.  Yes, burning fossil fuels is the cause of a significant amount of global warming, but its harms go well beyond that. There are significant health costs associated with the burning of fossil fuels.  The New York Times reports that as many as 20,000 people die every year due to health complications resulting from the combustion and pollution of fossil fuels.(1)  The Times is referencing an extensive report by the National Academy of Sciences that puts the unaccounted health costs of burning fossil fuels at $120 billion!

If anything, this report understates the hidden costs of fossil fuels, as it only includes those attributable to energy production.  The profound damages caused by extraction and waste or accidents (such as the TVA coal ash spill in Tennessee pictured below), as well as the political costs of being dependent upon other nations for our energy needs, were not included in this assessment. The study also excluded trains, ships and planes from its cost analysis. If all costs were included, it’s likely that the figures would be much more dramatic.

Clean Coal

Global warming is an abstract concept laced with a certain amount of futility. On the one hand, the convolutions of as dynamic a system as the global climate are so far beyond the understanding of most people (including the scientists who study it) that it’s hard to make a concrete claim.  The complex mathematics that describes global warming can be reshaped into convincing arguments that global warming is not an issue at all, perfect for the “Ostriches” noted above.

Even if convinced, one must deal with the ominous reality that this is a global issue requiring people all over the world to work together toward a common solution, something that has never happened in all of history.  How can we as activists effect change in the face of the awesome obstacles of dynamic climatic phenomena, as well as the daunting task of getting politicians to put their immediate national interests aside to come up with a global consensus? Then there are the economic contingencies that are so complex as to defy description.

But the information in the above study is clear and disturbing in its conservatism.  People are dying and the rest of us are losing billions of dollars in the fossil fuel game.

The costs of fossil fuels are not just defined by the digital scorekeeper on our gas pumps. There are hidden costs that need to be brought to light in a study even more comprehensive than the one mentioned above. When these costs are factored into the final billing for fossil fuels, suddenly the benefits of alternative energy become much more marketable. This is true even if global warming is not included in the discussion. Wind energy and solar energy, so far billed as being too expensive to replace fossil fuels at this point, are comparatively more affordable in light of the NAS study.


1. The number of dead does not include those who die as a result of global warming.  The World Health Organization estimates that over 150,000 people worldwide die every year as a result of global warming.

What Do President Obama and Mahatma Gandhi NOT Have In Common?

Gandhi Obama peace

Answer: Gandhi never won the Nobel Peace Prize.

Yes, it’s true. One of the greatest examples of peaceful resistance in the history of man never received the Nobel Peace Prize.  What’s funny is that I’d always assumed he had.  I was shocked to discover this omission when I was researching the Nobel Prize after hearing that President Obama had won.

So then my sociological imagination started kicking in.  Exactly what is the Nobel Peace Prize? If people like Gandhi are denied the medal while war criminals like Henry Kissinger become laureates there must be something going on that I just never took the time to understand.

Upon the announcement of Obama’s win the debate ensued.  Does President Obama deserve the Nobel Peace Prize (NPP)? I guess there are a couple of ways to answer this question.  One possible response was my most immediate thought: shouldn’t we wait until the whole War in Afghanistan thing is figured out? Exactly what has Obama done toward the fulfillment of world peace? The second response is more cynical: If Henry Kissinger deserves the NPP, anyone does!

The NPP is an interesting ritual on which to apply the sociological imagination. Yes, there are those who have been honored who have dedicated their lives to the cause of peace, even if only in their corners of the world. Desmond Tutu, Aung San Suu Kyi and Elie Wiesel come to mind in this category.  Others, however, have demonstrated a significantly less consistent record on matters of peace.

Let’s take President Obama’s predecessors, US presidents who won the NPP. We can start with President Teddy Roosevelt. Roosevelt was nominated because of his role in negotiating the Treaty of Portsmouth between Russia and Japan. The Treaty of Portsmouth did a great deal to enhance the reputation of the United States in international relations.  This, of course, was Roosevelt’s goal, even if it wasn’t so much about peace.  Roosevelt’s commitment to peace was, at best, questionable.  In 1903 he engineered a revolution in Panama in order to secure a canal site. Before the Spanish-American War, then Assistant Secretary of the Navy Roosevelt was among the most ardent saber rattlers, stating “I should welcome almost any war, for I think this country needs one.” He considered the “great day” of his life to be his charge up San Juan Hill during that conflict. In world affairs, Roosevelt advocated “big stick” tactics which closely resemble the realpolitik of his age.  In a dramatic demonstration of American power he sent the US Navy, the Great White Fleet, on a worldwide show of arms.


Woodrow Wilson would win the NPP for drafting an international idea of peace and humanity in his Fourteen Points. Indeed, Wilson would work tirelessly toward American inclusion in a League of Nations.  Such exertions may have shortened his life. But Wilson didn’t shy away from the use of force in foreign policy, as he seized the port of Veracruz in Mexico for no other reason than that the Mexicans had insulted the US by refusing to offer a 21-gun salute after apologizing for arresting some American sailors.  Nor did Wilson hesitate to use force in Haiti or the Dominican Republic. In 1916 he ran under the banner, “he kept us out of war.” Within months of taking the Oath of Office for his second term, the United States was embroiled in one of the bloodiest wars of all time. To convince reluctant Americans to participate in the war, Wilson’s government created the Creel Commission, a massive propaganda effort to drum up support among people who preferred to remain uninvolved in European conflicts.  Dissent against the war became illegal when Wilson signed the Espionage Act of 1917.


President Jimmy Carter won the NPP in 2002, the only US president to be awarded after his term of office.  He won in recognition of his incredible negotiation of the Camp David Accords between Israel and Egypt, and of the extraordinary efforts of the Carter Center in world affairs.  Indeed, Carter’s peace resume is the most impressive of the presidential laureates, but Carter’s presidency was not without its questionable and decidedly unpeaceful actions.  Carter supported militant and brutal regimes like the military junta in El Salvador, the Marcos regime in the Philippines, Somoza in Nicaragua and the Shah of Iran.


So the Nobel Peace Prize is not necessarily a recognition of a life’s dedication to the pursuit of peace.  What, then, is the Nobel Peace Prize, if not such a recognition? The NPP appears to be more an instrument for promoting the ideals of the Nobel Committee. In each case above, the holders of the prize did, in fact, make significant contributions to the ideal of world peace.  In recognizing specific and targeted acts while at the same time turning a blind eye to the less palatable actions of the same men, the Nobel Committee may be trying to direct the attention of social reformers and peace advocates to support the real actions of statesmen who uphold the goals of the social movement.

Obama recognized this when he said, “The Nobel Peace Prize has not just been used to honor specific achievement, it has also been used to give momentum to a set of causes.” Often these causes are embattled and victories are hard won.  There are many heavily scarred peace activists, such as the imprisoned laureate Aung San Suu Kyi. By pointing out that statesmen are hearing the call, that they are working, albeit slowly and awkwardly, toward peace, the Nobel Committee may be trying to offer a glimmer of hope to the embattled. This may especially be true when the Committee can find a reason to recognize one so powerful as the American President. Here we have someone in a position of power who is not deaf to our pleas. But in doing so, may the Nobel Committee be offering a sense of false hope? Wilson received his award shortly before his death.  Carter received his late in life.  At the end of one’s political career the prize may be a safe way to communicate support for certain policies.  But Roosevelt, like Obama, was awarded with many years to go in his presidency.  And in those years one could certainly not define the Rough Rider as an ardent advocate for peace. So any encouragement the NPP might have offered to pursue the path of peace was lost on Teddy. Those who may have turned to Roosevelt as an exemplar of moral leadership must have been woefully disappointed.

Perhaps Nicolas Sarkozy’s claim that “the award marks America’s return to the hearts of the people of the world,” is closer to the intent of the Nobel Committee in bestowing the award on Obama.  The Bush Doctrine was a disturbing and detested policy among the rest of the world. That a nation with so much raw power should turn its objectives inward, to promote self interest at the expense of diplomacy, even to encourage pre-emptive warfare reminiscent of the calamitous 19th and early 20th centuries, could only have been an international nightmare. Obama’s early overtures to return to the community of nations and a route of diplomacy over militarism is a relief to the citizens of the world.  From this perspective the NPP may represent not only an encouragement for further participation with the world community, but also a repudiation of Bush Doctrine unilateralism.

In short, we might conclude that the Nobel Peace Prize is awarded strategically.  On the one hand, the NPP is a sincere recognition of earnest and tenacious individuals and groups working toward world peace.  On the other hand, the prize may be a recognition that the tireless efforts of such people and such groups are not in vain. After all, if the peace movement can influence such hard-hearted individuals as Henry Kissinger, and such big-stick practitioners as Teddy Roosevelt, then there is value in peace work.  The life of a peace activist is often fraught with great defeats punctuated by small victories. Recognizing world leaders when they even reluctantly hear the call of peace may be a tool for shoring up support for the peace movement.

So it may not be about how deserving Obama is of the Nobel Peace Prize.  Perhaps the prize was offered in recognition of Obama’s nascent work in nuclear disarmament and repudiation of dangerous American unilateralism. Maybe it’s an attempt to motivate Obama to adopt a more peaceful posture in future endeavors. Perhaps it was a demonstration to the peace movement that their work can and will bear fruit.

Regardless, there’s still the matter of Afghanistan, Pakistan, Bagram, rendition, the perpetuation of Bush’s domestic intelligence policies.  Obama is a long way from the ideals advocated by the Nobel Committee or anyone involved in the peace movement.

The Politics of Political Surveys

Just when I was wondering what to blog about (I do, in fact, have another blog in the making), I received an invitation yesterday from Representative Connie Mack to attend his Health Care Forum in Fort Myers. Included with the invitation is a Health Care Reform Survey.  Well, I’m a sociologist.  Surveys are the staples of sociology, so I’m interested.  Then I read the survey and, come to find out, it’s nothing that would pass for a valid research instrument in any academic department that I know of.  Indeed, I teach introductory-level courses, and if any of my students presented such a survey to me I would fail them.

Mack’s survey disregards the rules of actual survey taking.  It’s not hard to spot that, in fact, the Mack survey is not a survey at all, but rather a propaganda instrument.  Let’s take a look at the four survey questions for a fuller understanding of what I’m talking about.

Mack Survey

Question 1:

Should Representative Mack keep fighting to protect our nation’s health care system from total government control? Yes, No, Unsure.

This question is great. First, take a look at the opening language.  It presents Connie Mack as fighting to “protect.” Indeed, there’s a bias toward protection.  We usually want our politicians to protect us from whatever it is we need protection from. In this case, the protection is from total government control of the nation’s health care system.  Oooh! Scary stuff! Except, of course, that there is no such battle going on.  Not one of the bills going through Congress right now, nor any of the dominant voices on health care reform, is advocating for “total government control” of health care.  Sure, if that were the case I might just be against that policy.  But it’s not true. So exactly what is this question asking? It is, rather, a tool to spread misinformation about the current health care debate. Just what we need.

Question 2:

Do you agree that a free enterprise system is better than too much government? Yes, No, Unsure.

This is a great question for analysis. To my knowledge there is no way to use the phrase “too much” in a survey and expect valid results. “Too much” is always a bad thing.  Of course a free enterprise system is better than too much government.  It’s also better than not enough government.  It’s also better than too much mayo in a tuna fish sandwich. What’s your point? This question is designed to lead the respondent to the “correct” answer rather than a “valid” answer.

Question 3:

Do you think a government run health care system would be better or worse than what we have now? Yes, No, Unsure.

This question almost looks like a valid question if, that is, there were a real probability of a government-run health care system (which there isn’t). The problem is that the question asks for a “better/worse” judgment, but does not offer either “better” or “worse” as a choice.  How exactly does one respond “yes” or “no” to this question? What are you saying yes or no to?

Question 4:

Are you in favor of paying higher taxes for universal health care? Yes, No, Unsure

Now this is a great question. One that has been asked in other survey venues.  Indeed, according to a CBS/New York Times poll, a majority of Americans would be willing to pay higher taxes if it meant everyone was covered (the definition of universal health care).  That number increases if you ask Americans whether rich people should be taxed more to provide universal health care. Of course, the CBS/New York Times poll may have a liberal bias, and of course people are going to be fine with “other people” paying high taxes, especially if they perceive that the other people can afford it and are subject to negative perceptions.  That’s an issue for another blog.  For this question it’s important to look at the rest of the Mack pamphlet.  Before we get to the survey, Mack states, “I oppose the attempt to nationalize our health care industry. Some call it universal health care; others call it nationalized health care or socialized medicine. Regardless of what you call it, it’s a bad idea.” Then, in the survey, he asks you if you think it is a good idea. Talk about teaching to the test!

Mack Survey 2

Connie Mack is not one of my favorite politicians. There’s no secret there.  But I would have been willing to pay some attention to this survey if it were a legitimate instrument for measuring people’s attitudes regarding health care. It’s not. It’s a means of spreading propaganda and falsehoods. My guess is that Mack’s website will publish the results of this survey as definitive of what his constituents want him to do.  It may even reinforce his resolve to keep “fighting to protect our nation’s health care system,” even though this health care system he presumes to protect is a national embarrassment.

A Larger Fishbowl! (An Agitate Commentary)


                If you were to design a surveillance system for keeping our nation safe from terrorists how would you do it? You might start by analyzing the data that you already have regarding terrorists and terrorist organizations.  If you have good relationships with allied nations you might also be able to tap into their databases to reinforce your own. From there, you might want to establish legal surveillance operations on known terrorist organizations and, as further networks are exposed to your analysis, expand your surveillance to include additional branches.

                If you follow this method you can maximize your use of intelligence resources by focusing on known threats. You are also minimizing the probability that your system will trammel the rights and expectations of privacy among innocent citizens. You won’t eliminate this threat, of course, as social networks are often very complex and intertwined.  It’s certain that some innocent bystanders will be caught under your surveillance microscope, but their privacy can be restored through new legal means that take current technology and historical contingencies into account and are designed to protect the innocent.

                This is a sensible system, one that recognizes the importance of surveillance in maintaining national security while at the same time minimizing illegal and immoral intrusions on the rights of innocent people.  Nothing radical here.

                Now, let’s say you wanted to create a surveillance system designed to maximize and extend elite power throughout society.  How would you design that system? Well, such a system would need access to as much information about as many people as is technologically possible.  Every possible intrusion into the lives of individuals, regardless of their affiliations, would have to be maximized.  Computer technologies that could filter and sort countless bytes of information would have to be developed.  Such a system would have to remain secret, with no accountability to the general public. 

                In 1975 the French social theorist Michel Foucault used a prison designed by the utilitarian philosopher Jeremy Bentham to elaborate a new model of power dynamics.  The prison was called a panopticon, and Foucault’s theory became panopticism.  The panopticon was an idea for a humane prison designed so that a few guards could keep an eye on all prisoners at all times. The idea was that if prisoners knew that there was a certainty that they could be seen at all times then they would adjust their behavior accordingly despite the fact that the guards were not necessarily always looking at them. Prisoners would govern their own behaviors without physical coercion from the guards. It’s the power of the gaze.

                The Foucaultian idea of panopticism works much the same way for society as a whole.  If people know that they can be watched at all times then they will act as if they are being watched at all times. They will be less inclined to participate in acts of deviance or crime. In essence, they will govern their own behaviors in accordance with the dictates of the state without the state having to resort to militaristic technologies of coercion.  And, just as with Bentham’s prison, in which the guards cannot be seen by the prisoners, a Foucaultian panopticon must be known to the population, yet operate under strict secrecy.  In other words, one must know that one can be watched at any given time, yet can never see who is watching and when. “In order to be exercised, this power had to be given the instrument of permanent, exhaustive, omnipresent surveillance capable of making all visible as long as it could itself remain invisible.” (Foucault 214)

                Of course, Foucault was talking about rationalized (bureaucratic) institutional power. He saw the regimens of schools, hospitals, clinics, prisons and the military as a non-coercive, though all-encompassing, technology of power. This was the 1970s, and technology had not developed to the point it has today.  At the beginning of the twenty-first century, the information age, technology is such that a cyberpanopticism is not only possible, but imminent.  The ability of the elite to record and analyze the everyday routines of our lives is at hand. New technologies of power, the likes of which Foucault could only have imagined, are no longer relegated to paranoid science fiction novels.  The future is now.

                According to author James Bamford the government and its corporate allies have built and are expanding a surveillance infrastructure that can subject every citizen to the power of the gaze.  In the meantime, laws are being written to ensure that the exercise of this power remains beyond public scrutiny and outside of any conventions of checks and balances.

                This vast infrastructure is being constructed on the premise of fighting terrorism.  If that were the case, we could expect it to be designed in much the same manner as the first program described in this essay.  It is not.  The intelligence infrastructure captained by the NSA is not merely targeting known terrorist groups and individuals in an ever expanding examination of affiliated networks.  Indeed, it is being designed to intercept every communication, every commercial transaction, every movement of individual citizens regardless of affiliation. At the same time, legislation is being created to keep those involved in intelligence gathering secret and under the wing of government protection.


A New Member of the Power Elite


                In 1956, sociologist C. Wright Mills published the results of his extensive research into the power elite.  Mills recognized that the power elite was a collection of institutions working together to perpetuate its own class interests. At the top of this pyramid were the corporate executives who, despite the supposition that they are expected to compete, are actually better served by cooperating with regard to their class interests.  Next is the executive branch of government, the President and his cabinet, and the high ranking members of the executive bureaucracy.  Then there is the top brass of the military, headquartered in the Pentagon and represented by the Joint Chiefs of Staff.

                According to Mills, these three institutions share common interests despite what may appear to be very different functions.  Indeed, since the advent of the military industrial complex, later named by none other than President Dwight D. Eisenhower, corporate, political and military power have been conjoined in ever tightening bonds.  “As each of these domains has coincided with the others, as decisions tend to become total in their consequences, the leading men in each of the three domains of power—the warlords, the corporation chieftains, the political directorate—tend to come together, to form the power elite of America.” (Mills 9)

                Indeed, they do more than come together. In fact, they overlap in significant ways.  The corporations fund political campaigns. Often they hedge their bets by funding both parties.  In exchange for important funds, the politician agrees to give access to corporate lobbyists, pander to the corporation’s legislative wet dreams and appoint corporate representatives to high level cabinet positions.  Corporations have also created institutional allies in the military, offering to add industrial might to the military machine.  Corporations win major contracts to supply the needs of the military.  In exchange for military support, politicians perpetuate the corporate projects regardless of their use. The military thus becomes a major player in the economy.  To further the convergence of interests at this highest level, corporations provide comfortable jobs and exorbitant salaries to retired high-ranking officials and military officers.  The military continues to grow while the executive continues to feed valuable contracts to corporations that, in turn, provide political and social security to both.  It’s a cozy relationship.

                Now, Mills wrote The Power Elite in the mid-fifties.  I’d like to think that if he were conducting this research today he would add a fourth element to this trifecta.  In the 1950s the intelligence community was a nascent institution getting its feet wet in the international arena. It was accurate to describe the intelligence community as a fraternal order of Ivy League schoolmates playing a dangerous game of international espionage to greater or lesser effect. Despite the retrospectively obvious distinctions of class in this arena, it’s forgivable that Mills did not include the intelligence community in his description of the power elite.

                Fifty years later, the intelligence community has graduated from fraternity to fully fledged member in good standing of the power elite.  Wars are no longer defined by the movement of armies, battle lines, logistics and tactics.  The postmodern army runs on information processing, satellite surveillance, smart weapons and computer hubs often thousands of miles away from the battlefield.  Intelligence is also a central aspect of civil law enforcement as the nation is carpeted with surveillance cameras and listening devices. Communication signals can be pulled from the air or culled from convergences of fiber optic cables in select cities in the United States. This communication infrastructure is owned and operated by corporations, already members of the power elite.

                As it stands, communications corporations allow intelligence officials access to their information.  The executive then distributes this information to civil law enforcement and the military.  In exchange for this assistance in spying, the executive guarantees the corporations secrecy and immunity from prosecution for providing information on innocent citizens whose rights have been ignored. High ranking intelligence officials are often pulled from the corporate world, and again, cushy chairs on the boards of directors of intelligence firms await high ranking government and military officials.  In return, the executive finds more and more reasons to expand the intelligence community, often by conjuring a secretive and ubiquitous enemy that can only be defeated by surrendering our privacy.  Such an enemy is also a boon to the military industrial complex.

                Modern surveillance technology and refined intelligence-gathering sciences have brokered the intelligence community a seat at the elite table.  In a Foucaultian leap, the intelligence community also offers the power elite the prospect of a true social panopticon. By keeping an eye on our every move, our every purchase, our every communication, the power elite can motivate our actions.

                Oh, it’s not so much that citizens who know they are being watched are going to govern their behaviors in the Foucaultian sense.  I think Foucault took some theoretical leaps here. Rather, the power elite can collect vast amounts of data on us as a population.  They can then use this data to learn what is motivating us, our concerns, our fears.  Then, they can shape their paradigms, political speeches, advertisements, justifications for war, in such a way that they know we will respond to their liking.  They can legislate our fears into reality.  They can sell us solutions to our perceived problems, increasing our dependence on the corporate machine.   Total Information Awareness equates to total knowledge control.

                Our intelligence infrastructure is not designed to fight a war on terror.  It is designed to control the motivations of society. Surveillance is not a technology for keeping us safe. Rather it is a method for perpetuating elite interests that are contradictory to the interests of the commons.  As the intelligence community is further integrated into the highest echelons of society, the power elite becomes more monolithic. 

                That does not mean that the prospects of resistance are lost.  Indeed, it requires that those of us who dissent from the concentration of wealth and power demonstrate more courage to speak the truth.  We must demand more from the institutions that are supposed to serve our interests.  The very first thing we must defeat is the fear that the power elite perpetuates by committing us to war and false patriotism.  If we are not to be the pawns of the powerful we must not participate in their games.  

This Won’t Play Well…Using Heroin to treat Heroin Addiction!

In our culture we tend to look at addiction as a moral failing or a character fault, despite the volumes of evidence of profound biological variables.  We know that addiction is a physiological response to intoxication.  Yes, there are social and psychological factors that are important, but by the time we get to the point of full blown addiction, it’s the physiological dependency that plays the leading role. We also know that some people are more prone to addiction than others, and some people become more physically dependent upon certain drugs than others.

Despite this knowledge, we still hold the archaic social constructs of moral failing with regard to addiction.  And the cure for this moral failing is purging (purgatory).  To redeem oneself from moral failing one is expected to pass through a trial of purgatory, to cleanse oneself.  Hence we possess a certain admiration for erstwhile addicts recounting their stalwart determination against the torment of withdrawal, only to emerge from this suffering clean and cured. Twelve-step programs are designed to lead people through a process of purgatory that is defined as a daily struggle. Purgatory becomes an expected and accepted paradigm for stories of overcoming addiction.

And the approved course of action for dealing with addiction is cold turkey.  Even using progressive treatments such as the nicotine patch or gum carries the perceived value of quitting cold turkey when, in fact, one is merely diminishing one’s addiction in stages. We must also remember that, according to prevailing paradigms, once we are addicted, we are always addicts subject to a fall from grace.

So now we are faced with this story from the New York Times about using controlled doses of heroin to treat heroin addiction.  A study published in the New England Journal of Medicine suggests that the use of diacetylmorphine, the active ingredient in heroin, produces better results than methadone for heroin addicts who have already failed other treatments.  Heroin addicts were more likely to continue treatment using diacetylmorphine than methadone.  Of course, methadone has the benefit of mimicking the chemical interactions of heroin without producing the high. Hence, it’s a great “cold turkey” solution that reduces the purgatorial horrors of heroin withdrawal.

Using heroin, on the other hand, offers no such approved means of resocialization in our culture.  There’s no purgatory.  There’s no cold turkey.  There may even still be a “high” involved.  And getting high is just not approved (getting drunk is okay, but getting high will not be tolerated in polite company).

The Journal of a Mad Sociologist is dedicated to revealing what I call social schizophrenia.  It’s my contention that social schizophrenia results when contemporary knowledge is influenced by outdated social constructs. Our contemporary knowledge of addiction is still shaped by often medieval concepts of personal character and failing, purging and self-control. Progress in knowledge, ergo progress toward freedom and justice, occurs when we liberate ourselves from outdated social constructs, formulate relevant constructs, and build institutions to put those constructs into action.

Political and Rhetorical Uses of Hitler!

All right, I’ve had enough!

In the past couple of weeks I could not help but notice just how often the specter of Nazism and Hitler have been used as rhetorical conventions. It appears that Nazism is alive and well…and living in the minds of policy wonks and desperate lobbying firms intent on perpetuating their own interests at all costs.  It is not, however, alive in any real way in public policy or American politics.

This post has been moved to the new Mad Sociologist site. Click Here to finish reading.

Africa, Women’s Rights and Hillary Clinton

If there is a miracle that could happen in Africa today that would go furthest in bringing this beleaguered continent into the twenty-first century it would be a spontaneous recognition of the inherent and inalienable rights of women. Of course, this is not going to happen.  If the rights of women are to be recognized in Africa then the process must be the same that it has been in every nation and culture on earth that does so recognize…it must be fought for.

Cultures that do recognize an equal status for women with men, even where such status is unevenly applied, experience great benefits.  Birth rates decline, allowing for healthier women and children.  Women become a dominant force in the marketplace, strengthening the economy.  Politics is enriched by the perspectives of increasing numbers of women in office. The arts and letters blossom with newfound energy and creativity.

This is no small matter for Africa where, in many cultures, the second class status of women is not only reinforced through culture and tradition, but also by societies in which the devaluation of most men can only be countenanced by deeper devaluations of women. In places like the Congo, as analyzed on this blog, the social position of women is an effective weapon against rival groups.  Though there is a nascent women’s movement in Africa, and global networks directing their energies to that end, the rights of women are a long way from being realized.

When Secretary of State Hillary Clinton visited Africa she offered the kind of example of encouragement and possibility for every woman she met.  Here was a woman who was the primary diplomatic voice for the most powerful nation on earth.  Regardless of how one feels about Clinton’s politics, imagine the impact she must have made on women who must struggle every day just to survive and to ensure the survival of their children.  She may have been the very epitome of hope…

…and fear…

…for men who are used to women confined to very acutely defined social spaces.  When the male student insulted the Secretary of State by asking for her husband’s opinion, or the opinion of a male star athlete who was present, it was likely an attempt to remind Hillary and her audience of the proper woman’s place in society.  It may very well be that the young man was so institutionalized by a socially constructed reality of women that he did not even realize that his question was insulting. Such is the nature of reified social knowledge. He was quickly educated by an indignant Clinton.

What I thought was interesting was the response in the room.  There was a smattering of laughter and even some applause (by whom it was impossible to tell), but the most prominent response was the collective gasp.  There was a tension and uncertainty in that room at that moment that was palpable.  And it happened after Hillary’s response.  Had the student asked such a question of a male politician, it would have elicited nothing more than a joke, but to ask such a question of a woman was most likely considered reasonable.  What was not reasonable was the bombshell that landed when Clinton responded out of anger and, in essence, put the man in his place.  This must have been socially awkward. Here was a man being confronted with anger and sarcasm by a woman.  What’s more, there was nothing he could do about it.  This was an example of reinforced deference to authority, which was in this case a woman. And the assembly did not know exactly how to respond to such an alien presentation of roles. Awesome!

There is debate as to whether or not Secretary Clinton was justified in such a response.  Hillary was in Africa as a representative of the United States and as an envoy between our culture and the many cultures of Africa.  She was also there as an emissary for the rights of women, not just in the United States (where there’s still work to be done) but all over the world.  Had she backed down from the question, or even worse, answered it, she would have demonstrated that even the most powerful diplomat from the most powerful nation on earth can be put in her deferential place if she happens to be a woman.  Instead she attacked the notion of second class status head on and demonstrated that she, as a political figure as well as a woman, did not have to submit to insult.

New York Post Picture of Hillary

Of course, there can be different interpretations and speculation as to the kind of example she provided. Regardless, it would have been helpful had she received support from her own nation. Instead, Hillary’s response was often represented in the media as, yes, female hysterics.  Complaints abounded about Clinton’s “undiplomatic” response, and some commentators on the Lede post suggested that she was just trying to act like a man, one suggesting that when a woman tries to act like a man it undercuts her message.  Wow! There’s also the suggestion that Hillary is frustrated (frustration is another emotion often applied to women who respond with anger) by living in the shadow of her husband.  Apparently Africa is not the only place that expects women to know their place.


I have one sub-note here, and it’s purely speculative (I’ve not looked at any data). During the Clinton Administration it was often rumored that Hillary was really the secret president and Bill just a happy figurehead.  Interesting how the paradigm shifts, but doesn’t really change.

Also, notice the misplaced modifier in the New York Post line above. It should read …she, not Bill, runs state.

Lessons learned from the Beer Summit

 Beer Summit

The so called Beer Summit was a learning experience for us. At least it should have been. We shall see.

Of course we learned that the issue of race is alive and well in the United States, regardless of our having elected a black president. Though the arrest itself might not have been motivated by race (might not), race imbued what would otherwise have been the story of a well-educated professional man arrested at his own home for making a police officer mad.  That it was a white cop and a black Harvard professor became the defining element of this issue. A black police sergeant, Leon Lashley, who was at the scene reported that he has been referred to as “the black cop,” as well as an “Uncle Tom,” for his support of Crowley.  Another police officer, Justin Barrett, sent an e-mail referring to Dr. Gates as a “banana-eating jungle monkey,” though he swears this is not about race. Okay.  Perhaps it’s about condoning police abuse, as Barrett admitted that he would have “sprayed him [Gates] in the face with OC deserving of his belligerent non-compliance.”

Yes, race is still an issue. It’s moments like this that remind us that we still have a ways to go on race in the United States.  And that’s not necessarily a bad thing.  We need reminders, because otherwise it’s easy to put race on a back burner and pretend–despite the wealth of research and experience to the contrary–that racism is not a variable in our culture.  In this case, we are reminded at the expense of a well known academic who will, after all, be just fine after this, rather than be reminded by the heartrending catastrophe that befell the poor people of New Orleans and the banks of the Mississippi during Katrina.

We also learned that our President, a man who most will admit makes few oratorical mistakes, can put his foot in his mouth every once in a while.  Though he never apologized, stating rather that he could have “calibrated [his] words better,” and holding to his belief that the police over-reacted, Obama demonstrated that he was willing to recognize his own shortfalls and do his part to fix his mistakes. As a counselor and a teacher there have been many times in my career that I responded in a way that could only make a bad situation worse.  As the person in charge of my therapy group or my classroom, just as President Obama is the leader of our nation, it was incumbent upon me to admit my mistakes and change course before the problems got worse.  President Obama was willing to admit that his choice of words helped “ratchet up” the issue rather than cool it down and/or address it. As a good leader he changed course and found a way to make a molehill out of this quickly rising mountain.  (Contrast this with our last President.)

The most important lesson of the beer summit, however, is what it says about the intersection between social policy and real life. The United States has many policies, paradigms, curricula and procedural rules for addressing historical inequities regarding race. From Affirmative Action to modified social studies curricula to racial diversity training provided by police forces around the country, we are imbued with strategies for working toward a “post-racial” society.  These programs all have their strengths and weaknesses and a certain amount of turbulence in their application.  But the bottom line is that any effort to resolve racial, or racially imbued, issues boils down to real life individuals coming face to face with each other.  No amount of social engineering can work without real life interaction on an equal footing among people who are willing to share their goals, their dreams and their disappointments…maybe over a beer, maybe through some other means.

Granted, a sit-down with the President and Vice President is not really one of those meetings between real people, even if the President rolls up his sleeves and loosens his tie.  But the imagery is symbolic, and symbols carry meaning. To get past our socialized prejudices, people with diverse racial, ethnic or cultural identities must interact on a regular basis on an equal footing and with equal input for setting goals and determining direction.  When that happens we see tolerance and understanding develop. It was toward this end that the Beer Summit was a learning opportunity.

Perhaps Gates and Crowley can continue their discourse, and effect a public discourse about race that not only has meaning on the macro sociological level, but also on a personal level among real individuals.

Gates/Crowley and the Sociology of Deference

It’s hard to know just exactly what role race played in the well publicized arrest of Harvard scholar Henry Louis Gates Jr.  Race is one of those variables that so imbues life in the United States that rare is the sociological research that does not have to control for race.  We are socialized to see race even when we are being educated about racial tolerance.  So to pinpoint just what role race played in this one issue, where race as a variable cannot be controlled, is well nigh impossible. And it may even be unnecessary.  This issue can be understood without the racial variable, and may even be of greater social value if we look at the role of power and deference rather than white and black.

By assuming that race is not a variable we can move on to a different, more concrete explanation and lesson.  Let’s assume that Sgt. Crowley was not looking at race.  He still arrested a man at his own home who, it could be argued, had good reason to over-react to what was happening.  Let’s leave race out of the equation when looking at Prof. Gates’ reaction to being confronted by a police officer in his own home.  On one hand, it is not pleasant, and may even be offensive, to be accosted and accused of wrongdoing when all you were doing was trying to enter your own home.  On the other hand, if Prof. Gates had simply controlled his anger, it’s likely that Sgt. Crowley would have gone on his way and this would be nothing more than a one inch story on page A17 of the Boston Globe.

But what was it, exactly, that Sgt. Crowley and Prof. Gates were reacting to?  Of course, race may be a subtext, but it may not necessarily be the defining variable.  A better explanation may involve the role of deference and status between the two men: Gates, an Ivy League professor, and Crowley, a police sergeant.  Both of these men hold positions of status and authority.  As such, both men expect (and, some may argue, deserve) a certain amount of deference to be applied to any interaction concerning them.  Being accosted by the police in his own home is definitely a breach in deference from the point of view of Prof. Gates.  True, it may be that such status for a black professor may require greater nuance than for a white professor, but we are trying to control the race variable here.  For Sgt. Crowley, Gates’ non-deferential behavior toward an officer of the law (a sergeant, so an officer with authority) could also be seen as an affront.

In essence, what we see here could be defined as a good old fashioned conflict cycle based on the refusal to recognize and apply requisite deference rituals.  As a man with a deep counseling background I can tell you that conflict cycles almost always end badly.  And this situation ended badly.

So what can be learned from this perspective?  I would suggest that the onus of this lesson should fall on the police.  Yes, Harvard professors could stand a lesson or two, but in this matter it was Sgt. Crowley who was on another person’s property–albeit doing his job.  As the man with the badge and the authority of the state behind him, just as is the case with counselors engaged in conflict cycles with clients, he had the responsibility to defuse the conflict with strategies that did not involve force. I’m sure police receive training with regard to conflict cycles, but they may not fully understand the role that ritual deference plays in their lives. An understanding of deference may make it easier for the police to deal with situations where breaches in deference are hard to elaborate.

This blog is not meant to deny the role of race in this issue, but rather to offer a different, more practical, lesson that may be obscured by the race variable.

–Another aside to this story may have to do with neighborhood/community relations.  If the woman who reported the “break-in” (who was not the same woman who called 911) had been better incorporated into her community, knew her neighbors, it is unlikely she would have been alarmed by two men forcing their way into a home.  According to the New York Times, this woman was a fairly new resident in the neighborhood. But there was a time when neighbors would introduce themselves almost immediately to newcomers.  Rituals for accepting new members into a neighborhood have broken down over the years.  This is another matter that is not being addressed because race is the default debate.

On Baseball and Teaching

                I love baseball.  I really do. I grew up with the game as a kid, whether I was playing in my neighborhood street with a Wiffle bat and a ball of tape, or on a sandlot, or playing for the world’s worst little league team (we were 0-9. We would have been 0-10 but for one game when only 7 of our players even bothered to show up).  I used to be a Yankees fan deep in Red Sox territory, mostly because of my contrary nature, I suppose.  Baseball was an important part of my youth as I tracked my favorite professional players, Dave Winfield, Don Mattingly, Nolan Ryan; their records were memorized and fiery debates defined my friendships.  I still have stacks of baseball cards collecting dust somewhere in my pile of collected stuff.

                Lately, however, I have turned my back on professional sports, including baseball.  I don’t follow the players, I don’t watch the games, I don’t buy the merchandise.  My disillusionment with baseball began with the strike of 1994. My favorite team, the Yankees, was on target to go to the World Series, which I had not seen since the early eighties.  I was excited to see my team do so well and reveled in being a part of that experience.  Embracing a team is an emotional experience in which its victory is your own victory.  That was until the players decided to go on strike.

                I was flabbergasted. Why would they go on strike? It turned out that they wanted more money.  Now I come from a proud union tradition.  My father was a union organizer and president. I was walking picket lines when I was six years old.  I’m a supporter of unions and I support the right of workers who want to make more money…but baseball players? Really? 

                At the time I was working in a wilderness program.  I was a supervisor working with troubled young men in the northern skirts of the Everglades.  I was on the job twenty-four hours a day.  If I was lucky I would get two days off a week.  My pay was adequate as I had few living expenses, but it was certainly not high.  My colleagues and I worked very hard with very little material remuneration to improve the lives of the kids in our care.  So when the baseball players went on strike it made me take a good, hard look at what they were striking for. It was eye opening.

                In essence, these grown men were working very hard at hitting and throwing a ball.  That was it.  Their total contribution to society was in the satisfaction derived by others that a certain group of people were especially adept at manipulating a small, leather-bound ball of string. These groups of people, these teams, became reference groups for millions throughout the country if not throughout the world. I thought, this is a pretty tenuous thing on which to derive a sense of personal satisfaction. 

                In 1996 the median salary for a player on the New York Yankees was $1.1 million.  The lowest paid players, second baseman Andy Fox and outfielder Matt Luke, made $109,000 a year.  The highest paid, Ruben Sierra (pictured below), made $6.2 million a year…and he was a designated hitter, so he only played offense, which translates to about three or four times at bat, then he was done!


Ruben Sierra

                In contrast, the average salary for teachers in 1996 was just over $37,000.  In 2007, the median salary for a player on the New York Yankees was over $5 million, roughly a 400% increase.  The average salary for teachers in 2007 was around $51,000, an increase of less than 40%.  However, when you adjust the numbers for inflation, $37,000 in 1996 was the equivalent of about $49,000 in 2007. So in real terms, teacher salaries have increased only about 4% since 1996.  Using the same calculus, the median salary for a player on the New York Yankees still shows a real increase of roughly 232%!
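For readers who want to check the arithmetic, here is a minimal sketch of the comparison using the round figures quoted above. Two assumptions are mine, not the essay’s: the 2007 Yankee median is taken as a flat $5 million (the text says only “over $5 million”), and the 1996-to-2007 inflation factor is inferred from the essay’s own $37,000-to-$49,000 conversion.

```python
# A sketch of the salary comparison, computed from the 1996 base.

def pct_increase(old, new):
    """Percentage increase from old to new, measured against the old value."""
    return (new - old) / old * 100

# Inflation factor inferred from the essay's claim that $37,000 in 1996
# equaled about $49,000 in 2007 (assumption, not an official CPI figure).
deflator = 49_000 / 37_000

yankees_1996, yankees_2007 = 1_100_000, 5_000_000  # median Yankee salary
teacher_1996, teacher_2007 = 37_000, 51_000        # average teacher salary

# Nominal increases
print(round(pct_increase(yankees_1996, yankees_2007)))  # 355 (about 400% if the 2007 median was closer to $5.5M)
print(round(pct_increase(teacher_1996, teacher_2007)))  # 38

# Real (inflation-adjusted) increases
print(round(pct_increase(yankees_1996 * deflator, yankees_2007)))  # 243, near the 232% cited
print(round(pct_increase(teacher_1996 * deflator, teacher_2007)))  # 4
```

The exact percentages shift with the precise 2007 median, but by any of these calculations the players’ real gain outstrips the teachers’ by a factor of roughly sixty.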

                I can argue that income is a good indicator of how much a society values its members and the work that those members do.[1]  After all, the salary of a player on the New York Yankees could be argued to reflect the value that individuals are willing to pay to attend ballgames, watch games on TV and/or purchase merchandise. Teacher salaries represent what taxpayers, or in the case of private schools tuition payers, are willing to pay for the skills taught in a classroom. If this is the case, then it is very obvious that those who are charged with teaching our own children and infusing them with cultural understanding as well as capital are of significantly lower value than those who hit balls with sticks!  This is a problem.

                Does this model stand up to scrutiny in the real world? I argue it does.  Say, for instance, that a community had a choice between investing in teachers and investing in baseball; using my model above, one should predict that the community would choose baseball.  Fortunately for this study, albeit regrettably for the community, this experiment was played out in my home of Lee County, Florida.

In 2008 a budget shortfall of $29 million created a crisis for the Lee County school system.  The school board scrambled to find ways to resolve this crisis (which is, as of this writing, not yet resolved).  After making significant cuts, the school board was still faced with a $14.6 million hole.  Almost 250 teachers were laid off at the end of the 2008/2009 school year.  Some suggestions for finding the money included cutting teacher benefits by as much as 73%. Nowhere among the local news articles and editorials that I’ve examined on this subject did anyone, member of the public or elected official, suggest ways of raising more money for teachers and schools. Instead, all plans rested on the need to cut services.[2]

At the same time, the county received the shocking news that the Boston Red Sox were considering leaving Fort Myers and moving to Sarasota.  Shocker! This could not be allowed to happen! Lee County commissioners stumbled all over each other to find ways to keep the Red Sox in Fort Myers.  To do so they were willing to shell out an estimated $80 million to build the team a brand new stadium. Contrast this to the $14.6 million, or even the $29 million, shortfall in the school budget that required cuts in educational services.

But wait, there’s more! The Red Sox had been in Fort Myers since 1993.  They were induced to open spring training in the city when commissioners decided to build the City of Palms Park on three city blocks taken over by eminent domain.  The new park cost around $23 million, on which the county has paid only the interest.  Consequently, fifteen years later, Lee County still owes over $26 million on a park that may well be empty after 2011.  The park was built with the understanding that it would raise the standard of living in a troubled and impoverished part of the city.  This has not happened. Now it is hoped that another team can be induced to move its spring training to the City of Palms Park.  Really? Without renovations that will cost how many millions of dollars? Good luck with that.

Now the argument could be made that the Red Sox bring needed revenue into the county.  That has not been established scientifically.  One study estimated that hosting a professional team for spring training can bring a community as much as $25 million a year.  How this figure is derived, however, is questionable.  Academics note that local tax and sales revenues do not substantiate this conclusion.  Indeed, there’s no real evidence that the Red Sox bring any revenue to the county.  A local study revealed that only 1.3% of tourists claim to come to Lee County to see spring training games.

Regardless, local officials demonstrated that when it comes to a crisis in education, the solution involves cutting funds, benefits, programs and jobs, including instructional jobs. A crisis in baseball, however, involves finding new sources of revenue to keep a team in the community, a team whose overall value does not go beyond the personal satisfaction of knowing that people who are very good at hitting balls with sticks are doing so nearby.  Again, we have a problem.

[1] Granted, I will admit that using income as such a model may be problematic. For instance, if I were to conduct a survey of who you would rather have with you if stranded on a deserted island, I would hypothesize that a teacher would score higher than a baseball player (though maybe that depends on the teacher and the baseball player).  However, income is a real issue for real people in the real world; therefore I favor this model over the hypothetical surveys that could be done.

[2] Granted, there was some very constructive discussion about examining waste in the school system.

A Classic Sociological Question: Are Humans, by nature, Warlike?

Every semester I face the same challenges from students who are steeped in the cultural assumption that much of human behavior is biologically and even genetically determined. Oftentimes in my discussions about social problems or the foibles of society I hear the comment, “it’s just human nature.” That sparks a response and a challenge from me.  What exactly is human nature? Where does it come from? How is it manifest?

This is the classic nature/nurture (n/n) argument. How much of human behavior is biological (n/) and how much is cultural (/n)? This is sticky ground, especially for a sociologist.  Sociologists, by virtue of our training, see the social in everything. Of course we do. That’s how we are trained.  We also tend to be very skeptical of claims of a biological, genetic or evolutionary explanation of human behavior, especially when such explanations tend to reinforce what we have identified as negative stereotypes, such as male dominance or claims about female aptitude for math.

That being said, there does seem to be a growing body of science suggesting that sociology does not explain everything about human behavior, even mass behavior; that there are some n/ influences involved.  Influence, however, is not causation.  When scientists try to measure the influence of one factor compared to another with regard to any phenomenon, what we are really measuring is an explanation for variance. So when the newspapers report a new finding on the genetic influence on violence, for instance, what they are really reporting is that geneticists have explained maybe 10% of the variance in violence.
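That "share of variance explained" framing can be made concrete with a toy calculation. The variance components below are entirely hypothetical, chosen only to mirror the 10% figure:

```python
# Hypothetical variance components for a "violence" outcome (illustrative only).
var_genetic = 1.0    # variance attributable to the measured genetic factor
var_social = 7.0     # variance attributable to social/cultural factors
var_residual = 2.0   # unexplained variance (measurement error, everything else)

var_total = var_genetic + var_social + var_residual
genetic_share = var_genetic / var_total

print(f"Genetic factor explains {genetic_share:.0%} of the variance")
```

The point is that "a genetic influence on violence" is a claim about one slice of the total variation, not a claim that genes cause violence.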

Another question that arises is how much nature versus how much nurture is involved in human phenomena. What are the percentages? 50/50? 40/60? 60/40? Usually the class wants to compromise on 50/50, not because there’s any scientific validity to that figure, but because 50/50 is a good conflict resolver.  So is it human “nature” to want to resolve conflict with a 50/50 split, or is this a cultural expression of game theory? Hard to say.

The reality is not that simple, because I would offer that many human phenomena influenced by n/n actually involve an overlap rather than clear divisions on two sides of an argument. It may very well be that in order for certain phenomena to be expressed there must be both natural and cultural factors at play. Where one begins and the other ends is not so much a boundary as a blending or shading effect.

The example that I use is height.  Height in the United States is largely genetically determined.  However, genetics does not explain the full nature of height.  Nutrition and health also contribute to the phenomenon of height, and these are often cultural and institutional /n variables.

So I ask the seminal question: Are human beings naturally warlike? Are the efforts of the peace movement in vain because we, as human beings, are naturally predisposed to kill each other? And what are the obvious consequences in a world of total war and weapons of mass destruction?

It’s a good question with very telling sociological implications.  A must for every sociology class. In class I often get an immediate response. Yes! Human beings are, by nature, warlike.  That explains the vast, bloody panorama of human history.  One war after another for five thousand years.  Was history really like that? Or is that simply an artifact of studying what is interesting to us, thus jumping from one war to another, creating the illusion that the history of man is the history of warfare?

As my students learn, however, sociology is not about the immediate and obvious response to a question.  Sociology is in the details.  For sociologists, war is a social phenomenon involving power, institutional arrangements, cultural elements and historical components, and therefore not a biological imperative. War is hierarchical and involves divisions of labor.  It’s founded on social constructs of justice and threat, even empathy for those who are being victimized.

During the discussion I often ask why it is, if people are naturally warlike, that it takes a concerted and often expensive effort on the part of a government to convince its people to go to war.  We talk about the Gulf of Tonkin Resolution and the Mushroom Cloud paradigm offered by the Bush Administration.  If human beings are naturally warlike, why do governments have to lie to get us involved? War, from this perspective, is a purely sociological phenomenon.  Case closed?

Well, not really. War may be a social construct, but the larger issue of violence needs to be resolved. Are human beings naturally violent? And, of course, are certain human beings naturally more violent than others? This may be a little trickier. Again, the assumptions start to fly, sometimes with predictable racial overtones that it is imperative to explode.

The most likely answer is that yes, under certain circumstances, violence may be a natural response in human beings. When facing an unavoidable threat, it might be natural to strike out and fight.  Often, however, it is more accurate to suggest that it is human nature to avoid or run away from a threat rather than face it head on…if one is alone.  But what about when we are in a group and perceive an advantage in numbers?  Is it natural, then, to turn to violence when avoidance would have been the individual option? How much of this is nature, and how much is the sociological response to group dynamics? After all, it may very well be human nature to form groups, while the kind of group and the group dynamics may be sociological.

I also remember reading research many years ago that attributed aggressive behavior to poor nutrition.  I remember thinking that this makes sense from an evolutionary standpoint.  If you have poor nutrition, your body may read that as a lack of resources. It is, therefore, in one’s biological best interest to become more aggressive in order to attain greater resources for survival. However, access to resources is one of the most basic sociological motivators and determinants. Our access to resources is often defined not by the natural consequences of drought, but rather by social inequity.  Again, where does biology end and sociology take over? It’s difficult to say.

But here we have two examples of how aggression may be, at least to some extent, a natural phenomenon: threat to an established social group, and restricted access to life resources. Both of these phenomena have naturalistic, biological, genetic, evolutionary correlates as well as sociological influences and consequences.

We can then take a look at how power groups, such as governments, can use these very powerful phenomena to encourage acts of mass violence such as war or terrorism.  By invoking fear, a natural response, governments, sociological entities, can motivate populations to perform acts that they would not be so inclined, either by nature or nurture, to do on their own.

In class we often discuss the process by which people learn to kill other people in war.  The argument could be made that people, by nature, find killing other people distasteful. That’s why great effort is put in to define the enemy as being somewhat less than human.  A great deal of analysis has been put into this.  The process of dehumanization is a necessary correlate to war.  This becomes difficult to maintain as we spend time in an occupied nation and start to see those people who were the enemy acting like human beings. I started reflecting on this after talking to vets whose job it was to conduct surveillance on the enemy.  One vet stated, “I got to see them when they didn’t know I was watching.  They would be laughing and crying and tripping over things, sharing pictures of girlfriends, just like my buddies.” Under such circumstances it becomes necessary to remind soldiers that the enemy is a constant threat, hidden in the general population ready to strike, thus demonizing everyone.

That warfare is contrary to human nature is betrayed by the heavy psychological toll paid by many (some would suggest “all”) veterans.  According to research conducted by the Defense Department, the best way to help veterans deal with the stresses of combat is to minimize the amount of time they are actually in combat and maximize the time they spend living a normal life. So which circumstance is the “natural” condition of humanity?

Nature/nurture debates are fascinating, great fun in the classroom and sure to get a response. Sociologists, however, need to allow more room for the nature aspects of social phenomena; otherwise the lessons we can teach are muted by unrealistic explanation. By examining how biology and sociology overlap, we can reach more students and make more minds aware of the very real social uses of this dialectic.  If science, be it biology or sociology, has taught us anything, it is that there is no clear determinism in the lives of real people.

Secrecy is about power, not protection

Have you ever been part of a group in which secrets were being shared? Where were you in relation to those secrets? Were you the person sharing the secrets? The person learning them? Or were you out of the secrets loop? Perhaps you were the subject of the secrets.  Regardless, where you stand in relation to the secrets being shared is a defining characteristic of your status within a group or sub-group, because secrets are a mechanism of power.

The same is true at the national or societal level, though admittedly the dynamics become much more complex. In any given society there are those who learn and share information, those who receive the information, those who are the subjects of that information and those who are not.  What’s more, this flow of information is typically directed upward along the social hierarchy, with those at the top, whom we can call the power elite, receiving the most information and those at the bottom receiving the least.

Those at the top of the ladder have access to the most sophisticated technologies and infrastructure for gathering, disseminating and analyzing information, while those at the bottom are mostly dependent upon the elite to share their information.  And this is where secrecy comes in.  The more the power elite can control the dissemination of information to the lower tiers of society, the more power they can incorporate into their social groups. Indeed, one measure of power may very well be one’s ability to control the flow of information.  Power might even be defined as the product of how much information one can glean from other groups and how much information one can keep secret from them.

The Bush Administration knew this very well. They ran a tight ship of secrecy, making sure that very little leaked from the inner sanctums of their power groups. Everything was subject to strict control, to the point where Vice President Cheney actually invented his own Top Secret categorization.  To avoid sharing information the Bush team claimed executive privilege. When that didn’t work, Cheney actually defined himself as part of the legislature, not the executive, therefore not subject to judicial demands for executive information while at the same time claiming executive privilege. The Bush Tango around releasing information was almost a thing of beauty.

At the same time, no other administration placed so much emphasis on gathering information on as many people as humanly possible while excluding as many people as possible from the flow of information.  It was under the Bush Administration that Total Information Awareness was developed, shot down by Congress, then forgotten, then renamed and passed on to the NSA.  Those laws designed to protect the privacy of American citizens?  No problem.  Ignore them, undermine them, undercut them and make sure no one knows what you are doing.  Those who do know, such as the gang of eight legislators on the House and Senate intelligence committees, are bound by the strictest secrecy, enforced by law or political maneuvering, to guarantee that no one learns the extent of the information being gathered. Otherwise, order the CIA to conduct programs without even telling Congress. If Congress asks questions, lie.

Now we have a new administration inheriting the power infrastructure from what may, by the algorithm defined above, be the most powerful administration in the history of this country.  Will Obama take a step back and scale down these huge discrepancies of power?  He has promised that his would be a transparent administration.  In some ways maybe it is. On the other hand, the infrastructure, the rationalization and the experts are all there for his use, and Obama does not seem interested in dismantling such a colossal power structure.  If Lord Acton was correct about the corrupting effects of power, then we must admit that Obama has inherited the most corrupting infrastructure in history.  Power institutions do not have a history of disempowering themselves.