NOTE: The picture below is not the original. Somehow, many of my images on this site were corrupted (one of the reasons I switched to the new site and format). The original was a Barbie in a very low-cut evening gown. Fortunately, I didn’t have to look far to reproduce a suitable image. I simply clicked on a major retailer’s website and selected four images. You’ll see that though the image is not quite the same, the message and the insinuations about sexuality remain.
This is a picture from the cell phone of one of my wife’s friends. Believe it or not, this was a Barbie (or Barbie-like doll, I don’t want to step on any name brands) that was being sold in the toy section of the store. Notice the low-cut top and the obvious, surgically enhanced…um…cleavage.
If a picture says a thousand words, just how much is this doll saying to our children, especially our little girls?
Ideas, once created, do not die easily. About twenty-five hundred years ago the ancient Greeks proved that the Earth was a sphere. Eratosthenes even calculated the circumference of this sphere. Since then the Earth has been circumnavigated countless times and been photographed from orbit and from the moon. It is clear beyond any doubt that the Earth is, in fact, a sphere. Yet the idea that the Earth is flat still remains and is defended by a very real Flat Earth Society. Mythology going back to ancient Egypt is still alive and well in the Summum Church. I still find myself bewildered by those who refuse modern medical treatment, preferring ancient herbal remedies against often extreme illnesses. Regardless of our social and cultural evolution, even the most outrageous ideas are still alive, if only in a small knot of the population.
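Eratosthenes’ calculation is worth pausing on, because it shows how little a good idea needs to survive: two observations and some arithmetic. A minimal sketch in Python (the function name and the exact figures are my own illustrative choices; the numbers are the ones tradition attributes to Eratosthenes):

```python
def estimate_circumference(shadow_angle_deg, distance):
    """Eratosthenes' reasoning: if the sun casts a shadow at angle A in one
    city while standing directly overhead in another, the two cities are
    A/360 of the way around the Earth. Scale the distance accordingly."""
    return distance * (360.0 / shadow_angle_deg)

# Figures traditionally attributed to Eratosthenes: a shadow angle of about
# 7.2 degrees at Alexandria, and roughly 5,000 stadia between Alexandria
# and Syene.
circumference = estimate_circumference(7.2, 5000)
print(circumference)  # roughly 250,000 stadia, the figure history credits to him
```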
It makes sense. Once an idea is developed and disseminated throughout a population through language and symbolism, it acquires a robust vitality. Layer on other cultural elements and institute the idea in active populations and you are looking at something very durable. Once an idea is reified, or normalized and naturalized, within a population, it becomes the lens through which reality itself is viewed. So long as adherents remain who accept the reified elements of an idea and socialize them into the next generation, the idea itself will survive, if only in small holdout populations or as foundational elements of other knowledge.
The advent of new and even better ideas is not enough to eradicate the old. The eradication of an idea cannot be accomplished through debate and illuminating discourse. It’s not good enough to merely disprove an idea, even if that idea has no valid foundation at all. If the idea has been reified it will survive even the most acute and exacting proof to the contrary. People become invested in their ideas. To suggest that they change may actually require people to abandon their own sense of reality. This is especially true when ideas are the foundation of institutions with significant power investments, such as religions or political parties. To eradicate the idea means upending the institutional framework itself.
Ideas are, in a very real sense, well-nigh impossible to eradicate. This may be exasperating to those of us who actively strive to institute social change and see ourselves as working toward a more just and free society. Antiquated ideas, like those that define a “woman’s place,” are frustratingly tenacious. On the other hand, if beliefs and their associated norms and values were as malleable as some of us may want, society could be too unstable, unpredictable and normless.
Despite this, attempts have been made to eradicate ideas considered or constructed as dangerous. The Roman Empire tried on a number of occasions to destroy the dangerous concept of Christianity through persecution and execution. Christian churches were largely successful in wiping out indigenous, pagan beliefs, largely but not completely. Of course, there are the infamous modern attempts to suppress ideas in Nazi Germany, manifest in Kristallnacht, book burnings and the Final Solution, and in the Soviet Union, represented in the gulags. Regardless, blatant policies of eradicating ideas always involve oppression and even violence. Eliminating ideas through knowledge and illumination is rare. After all, there are still those who insist that the earth is flat. (The concept of bodily humours comes to mind as an idea that died through atrophy, though I’m certain if I look hard enough I’ll find it’s not so dead.) So when paranoids like Glenn Beck suggest that an idea, especially a well-established idea like progressivism, should be eradicated, this pronouncement should be immediately followed with an explanation of what eradication entails.
To eradicate an idea the rights of individuals to speak and to freely access information and personal networks that perpetuate these ideas must also be eradicated. Adherents to the offending ideas must be completely silenced, because mere discrediting will not suffice. Outright brutality must be brought to bear on those who would perpetuate an idea, and even then the idea will be perpetuated if only in secret.
Eradicating ideas requires some combination of the following:
- Attacking the ideas directly as being illegitimate or, most likely, a threat.
- Delegitimizing and dehumanizing those who hold these ideas.
- Censorship of the offending ideas.
- The denial of rights for and/or violence against those who hold the offending ideas.
Attacking ideas directly is not the same as debating and disproving. In fact, it has nothing to do with the relative validity of the ideas being attacked. To attack an idea, the claim must be made that the belief system constitutes a threat or carries some malicious intent. An idea may be defined as un-American, unpatriotic or seditious. The first Sedition Act in the United States was passed under the Adams administration just seven years after the ratification of the Bill of Rights. In the 20th century the Wilson administration signed into law the Espionage Act, aimed at curtailing dangerous ideas against the war effort. Charles Schenck was imprisoned under this act for speaking out against the draft. The famous jurist Oliver Wendell Holmes defined Schenck’s ideas as a “clear and present danger” akin to shouting “fire” in a crowded theater. Five-time presidential candidate Eugene V. Debs was put in jail for speaking out against US entry into World War I, among other dangerous ideas.
After World War I, US Attorney General A. Mitchell Palmer and his assistant, J. Edgar Hoover, concocted a Red Scare that led to the arrests of thousands of innocent people who happened to be adherents of these dangerous ideas. According to Palmer, “The whole purpose of communism appears to be the mass formation of the criminals of the world to overthrow the decencies of private life, to usurp property, to disrupt the present order of life regardless of health, sex or religious rights.” Hundreds of radical thinkers, like Emma Goldman, were deported, while others, including the Nobel Peace Prize-winning Jane Addams, were placed under surveillance.
Later, the US House of Representatives instituted the House Un-American Activities Committee, a government body dedicated to stamping out the dangerous ideas of communism. It succeeded in bullying and largely silencing some of America’s greatest talents, including Arthur Miller, Dalton Trumbo, Richard Wright and Pete Seeger. HUAC became the model later used by Senator Joseph McCarthy to achieve his ends by suggesting that the United States was being invaded from within. “The reason why we find ourselves in a position of impotency is not because the enemy has sent men to invade our shores, but rather because of the traitorous actions of those who have had all the benefits that the wealthiest nation on earth has had to offer – the finest homes, the finest college educations, and the finest jobs in Government we can give.”
So when men like Glenn Beck suggest that ideas they don’t like are a “cancer” that must be “cut out” and “eradicated” they are echoing a long line of fear mongering and of stifling the environment of thought in the United States.
From delegitimizing ideas it is not a far leap to delegitimizing the people who hold them. Really, the only way to stop an idea is to silence people. That was the function of the Sedition Act of 1798 and the Espionage Act of 1917. The Palmer Raids and HUAC, McCarthy and the blacklists were efforts to intimidate those with so-called dangerous ideas. For those who could not be bullied, there was deportation, exile and imprisonment, even death. In the case of Sacco and Vanzetti, the fear associated with their radical ideas was instrumental in their death sentence.
One heart-wrenching example of the destructive potential of trying to destroy ideas comes from US policies toward native peoples. The expansion westward created for Americans an “Indian problem.” Native people were defined as savages, and their culture as inferior to, even in the way of, Euro-American culture. Indigenous Americans were routinely invaded, killed, removed from their homes and exiled onto reservations. Native people often resisted, and resisted violently, in many cases brutally. To suggest, however, that perhaps native people had a right to defend their homes and their cultures was not a legitimate idea. The American response to such Indian resistance was reflected in the observation of General Philip Sheridan that “the only good Indian is a dead Indian.” Despite the US government repeatedly betraying promises and treaties with Indian tribes, the term “Indian giver” remains a pejorative for those who go back on their word.
Once native people were defeated and exiled onto dependent reservations, the most invasive attacks on Native American ideas began. The Dawes Act mandated that native people abandon their communal lifestyles and take up property ownership like good Americans. In the meantime the Bureau of Indian Affairs (BIA) mandated that native people give up their languages, rituals and religions. Native American children were taken from their parents and sent to boarding schools where their cultural identities were suppressed and replaced with appropriate, white American ideals. The first of these boarding schools was founded by an army officer named Richard Pratt. Pratt summarized his goals by saying, “all the Indian there is in the race should be dead. Kill the Indian in him, and save the man.” Native children were given Euro-American names, were not allowed to speak their native languages, were denied the practice of native rituals and were forced to convert to Christianity. This was a concerted effort to destroy the ideas and identities of many cultures.
In 1889 a Paiute shaman named Wovoka had a vision. In this vision it was revealed that the land would be restored to the Indian people, the fertile soil renewed and the great herds reborn. The white man would disappear if native people everywhere performed a great Ghost Dance. The Ghost Dance movement spread rapidly through Lakota country. This was a non-violent movement aimed at restoring native culture; weapons were not allowed during the Ghost Dance. However, the Ghost Dance was also defined by white Americans as a dangerous idea that had to be “cut out.” To a certain extent this is understandable, as the premise of the Ghost Dance was the “eradication” of the white race. This eradication, however, was to come through divine intervention, not human revolution. Leaders associated with the Ghost Dance were targeted for arrest, among them the great Sioux chief Sitting Bull, who was killed in an attempt to take him into custody. The fear and ignorance of a minority of white settlers alarmed that the Sioux were “dancing” led directly to the massacre at Wounded Knee.
The process of eradicating ideas is not just a rhetorical construct. It has very real and terrible human consequences. I could go on with the stories. The bottom line is that in all of the history I’ve studied throughout the years, I’ve not found one instance in which an idea was “eradicated” without the denial of basic rights and even violence. Some ideas die from atrophy as they become irrelevant in the process of cultural and historical change. This is not the result of an open attempt to eradicate these ideas, but rather a social evolutionary process.
In all of the history I’ve studied there has only ever been one “cancer” on any society. That cancer is fear. But the cure for this particular form of social cancer is not “eradication” or some brutal process of “cutting out” the offending idea. The cure is reason. Fear is stoked by ignorance. Only those who are ignorant of history and philosophy can be motivated by fear. It is incumbent upon us to meet ignorance with knowledge and understanding.
A couple of weeks ago I was teaching my students about the fundamental theories of sociology. Of course, no such lesson is complete without mentioning British biologist and social philosopher Herbert Spencer.
Spencer was a founding father of the structural functionalist perspective in sociology. It was his belief that everything that happens in society happens for a reason, that it serves some organic and evolutionary purpose. This was a valuable contribution to the social sciences, one later elaborated upon by Emile Durkheim.
Unfortunately, Spencer’s philosophy did not stop there. Indeed, even before Darwin’s Origin of Species, Spencer was expounding theories of evolution and applying those ideas to the study of society. It was his belief that society was guided by a natural, evolutionary law, which would ultimately lead to a perfected equilibrium. To manifest this evolutionary outcome, Spencer advocated a radical concept of “freedom.” To Spencer, any government intervention in the function of society was a form of oppression and an obstacle to our progress toward some vaunted equilibrium. In fact, Spencer suggested that it was only a matter of time before the state would be replaced by, in essence, a market-based institution.
Spencer, who coined the term “survival of the fittest,” was absolute in this concept. ANY intervention on the part of the government stood in the way of social progress. Whereas he understood the rationale of human empathy and charity, he advocated against such activity. Humanitarian intervention for the poor, sick, mentally handicapped or other marginalized groups only perpetuated these groups into the next generation, prohibiting the “natural” progress toward our ultimate human destiny: a free market, one would say “libertarian,” society. He admonished 19th-century liberal reformists for being so short-sighted. His politics was to let the weak die! It was, according to Spencer, for the greater good.
Usually, the students are stunned at this point by this coarse and vile philosophy. It’s my belief that they should be. Sociologists are usually in their field because they want to contribute to society, and this contribution does not include letting large segments of the population die from hunger and disease, even if it does fulfill some evolutionary purpose. Spencer’s theory is known as Social Darwinism (though Darwin himself was not a proponent, and Spencer’s evolution drew more from Lamarck).
I tell my students that there remain subtle whispers of Spencer in the modern discourse. Grover Norquist’s goal to shrink government to the size where it can be drowned in a bathtub would have met with Spencer’s approval. Ronald Reagan’s admonition that government is the problem, not the solution is the underlying argument against all government programs from Medicare to Welfare (but not the military for some reason). Milton Friedman’s extremist free market economics has, at its base, a Spencerian motif.
Then I heard this statement from South Carolina’s Lt. Governor Andre Bauer with regard to government assistance: “My grandmother was not a highly educated woman, but she taught me as a small child to quit feeding stray animals. You know why? Because they breed. You’re facilitating the problem. If you give an animal or a person ample food supply they will reproduce…” I was reminded that sometimes the riffs of Spencerism are not so subtle.
It’s interesting, if not somewhat appalling, that contemporary neo-conservatism draws from the inspiration of Herbert Spencer. Now, I can’t say that neo-conservatives have read the works of Spencer, but Spencerism has long been a common element of conservative discourse. It has always been the justification for ignoring those in our society who are in need, for the exploitation of labor, for the exaltation of the business elite. Whenever you hear the rhetoric behind privatization of those things held in the public trust, such as public schools and even defense contracting, know that you are hearing the whispers of Herbert Spencer.
Spencer and the conservative tradition, even today’s neo-conservatives, do have some legitimate criticisms and concerns about the dangers of state power. However, it never ceases to amaze me that their solutions are consistently at the expense of the marginalized, disempowered and dispossessed. It’s hard to justify philosophically why government intervention in people’s lives is oppressive, but corporate intervention in people’s lives is an extension of “liberty.” A government bureaucracy is just this side of Stalinism, but corporate bureaucracy is a free market ideal.
Of course, Spencer’s philosophy hinges on the idea that the actions of free individuals are the driving mechanisms of this fantastical social evolution, but the actions of institutions, composed of individuals, are interference in the natural order. Could it be that government and even corporations might, through checks against their propensity to consolidate power, actually be contributing members of our society?
Let’s face it. American democracy has been on life support for some time now. On January 21st, however, the five conservative members (or should we from now on refer to them as corporate conservatives?) of the US Supreme Court took it upon themselves to pull the plug. American democracy died that day with only a muted whimper, as Americans were more concerned about the fate of the Tonight Show and the corresponding careers of Jay Leno and Conan O’Brien.
The decision was Citizens United v. Federal Election Commission. Justice Kennedy presented the “opinion of the court,” or rather the opinion he shared with Justices Alito, Scalia, Thomas and Chief Justice Roberts, declaring that corporations have the same rights as individuals, and that pouring unlimited amounts of money into political campaigns is a manifestation of the corporation’s right to free speech. So there you go. Corporations are people. They are just people who cannot vote, hold office, eat, drink, plan a family, experience joy or pain or, necessarily, die. But they are people nonetheless.
In fact, they are very special people. People in America can give only $2,300 to political campaigns. Corporations, on the other hand, through their polymorphous arrangements and organizations, can give unlimited funds to campaigns and related Political Action Committees, 527s and ad hoc media outlets to support or destroy a political candidate.
Of course, corporations are not people, they are institutional constructs. In my sociology classes I teach my students what I call the Law of Institutions. The Law of Institutions states that the primary function of an institution is the perpetuation of that institution. Sociologically, this is not necessarily a bad thing, as institutional stability is prerequisite for societal stability. However, when the needs of people conflict with the Law of Institutions, the needs of the people must necessarily take a back seat.
Corporations, as institutional constructs, need only to perpetuate themselves. Corporations do not require clean air or medical care or humpback whales. Many people, however, feel that these things are important and make their political decisions based on the positions candidates make on these issues. Corporations, however, do operate quite well in environments conducive to exploitation and deregulation and the preservation of destructive commodity chains, risky investment options and monopoly.
Those candidates who toe the corporate line can expect the kind of political protection from corporations that would make the Mafia jealous. Those candidates, however, who might want to, oh, I don’t know, suggest that corporations shouldn’t be allowed to dump dioxin in our drinking water, destroy habitats or exploit their workers, could face a cessation of funds and an opponent with virtually unlimited resources to discredit him or her.
By removing what little restriction corporations had in political campaigning, the Supreme Court has transformed the United States of America into what I’ve long called a corporatocracy. According to Representative Alan Grayson, the only politician I’ve seen who is ahead of this issue, “You won’t have any more senators from Kansas or Oregon, you’ll have senators from Cheekies and Exxon. Maybe we’ll have to wear corporate logos like Nascar drivers.”
Grayson, in anticipation of this ruling, has put together a legislative package called the Save Our Democracy Reform Package. This legislation would impose a 500% tax on corporate expenditures in political campaigns, require corporations to disclose the funds they use in political campaigns, bar politically active corporations from participating in national stock exchanges, and prevent corporations that receive government contracts from political spending.
Grayson’s package makes sense and may have a great deal of support with the general public (once the whole Leno/O’Brien thing blows over, of course). It will certainly not have the support of the GOP, as Republican National Chairman Michael Steele is on record supporting the Supreme Court decision. Individual Republicans, however, such as Senators John McCain and Olympia Snowe, have some history with campaign finance reform and have expressed concern about Citizens United v. FEC.
As citizens, real flesh and blood people, we are democracy and we should be the first responders desperately trying to resuscitate American democracy before the body is cold. Call your representatives today and demand that they support the Save Our Democracy Reform Package. Find out where they stand on Citizens United and hold them accountable to democratic, not corporatocratic principles.
An Aside on this issue:
Where are the conservatives crying “ACTIVIST JUDGES! ACTIVIST JUDGES!”?
In his seminal, history-defining work, Democracy in America, Alexis de Tocqueville explored the ins and outs of early nineteenth-century American democracy. Of de Tocqueville’s concerns about American governance, the most famous was a critique that has become known as the Tyranny of the Majority. According to de Tocqueville, the majority in America is “omnipotent.” This omnipotence ultimately translates into oppression of the rights of the minority, blindness to ancillary issues not within the aegis of the majority, and legislative instability. “Hence the majority in the United States enjoys immense actual power together with a power of opinion that is almost as great.”
De Tocqueville’s criticisms of American democracy are valid. Democracy in America should be required reading for all Americans. Especially interesting to one who studies the sociology of knowledge is the nineteenth-century philosopher’s unwitting nod to postmodernism when he suggests that the tyranny of the majority can be even more oppressive than any monarchy by virtue of its ability to define the very ideas of the citizens.
“The most absolute sovereigns in Europe today are powerless to prevent certain thoughts hostile to their authority from silently circulating through their states and even within their courts. The same cannot be said of America: As long as the majority remains in doubt, people talk, but as soon as it makes up its mind once and for all, everyone falls silent…I know of no country where there is in general less independence of mind and true freedom of discussion than in America.”
In going back to the well-thumbed pages of my volume of Democracy in America, I am still humbled by de Tocqueville’s analysis. It’s no wonder that sociologists claim him as one of our own. However, I’m left to wonder just how de Tocqueville would analyze our current debate on health care reform.
No doubt de Tocqueville would be confused. Throughout the health care debate, poll after poll demonstrates that majority opinion supports the foundation of a public option to control costs by providing an alternative to private insurance. This, despite the negative PR blitz of this last summer and the ferocity of a Tea Party movement venomously against health care reform or the prospect of even the slightest government interference in the free market. If ever a majority had made up its mind, it is with regard to the public option.
Yet the fate of the public option is in peril. How could that be? How could de Tocqueville’s famous analysis be so far off in this (and many other matters, but that’s a different blog post)?
Of course, we can’t be too hard on a nineteenth-century social commentator. How could he ever have predicted the rise of what I now call Pathological Liebermanism, or The Tyranny of the Lieberman? This is a phenomenon in which the processes of American democracy invest disproportionate power in the hands of one elected official (in this case “Droopy” Joe Lieberman, but also Ben Nelson). The majority of Americans support the public option. The majority of legislators support the public option. But majorities are not good enough. Individuals like Lieberman and Nelson can bring the “omnipotence of the majority” to flaccid humility.
We must remember that our founders and their immediate heirs had very little regard for the will of the majority. They established norms through which they could thwart “mob rule” in the chambers of Congress. Among those rules was the filibuster. And the filibuster has become the weapon of choice for minority political parties. Now, to be honest, I’ve supported the filibuster when it was being used to protect the Arctic National Wildlife Refuge. On the other hand, the filibuster was also used to delay crucial civil rights legislation. It seems that we have a love/hate relationship with this particular Senate rule. In matters of health reform including a public option, an issue which I support, it is maddening to think that one senator, a Lieberman or a Nelson, can stall the will of the majority of Americans.
It is equally maddening that such senators can, in essence, put their filibuster busting potential on the sale block. In Nelson’s case it was an agreement that the federal government would pick up the tab of health care reform for the citizens of Nebraska…and only Nebraska. Why should the people of Nebraska benefit at the expense of the rest of Americans who would have to pay more to make up for the absence of that state? Because Ben Nelson won’t shut up? In Lieberman’s case it was the elimination of popular reforms, the public option and the expansion of Medicare. Why should Lieberman, a man who himself enjoys the benefits of a single payer, government run health program, be able to deny the same for the rest of us all by himself?
Indeed, de Tocqueville would be obliged to add a chapter or at least a long addendum to his master work to address the Lieberman Syndrome (another cool name for what we are witnessing).
De Tocqueville was also not privy to the idea of modern lobbying and money politics. In almost every case the politician with the largest campaign coffers wins. De Tocqueville’s assertion that our legislators change rapidly, leading to instability in our houses of government, has turned out to be false. Indeed, the majority of seats in Congress are considered “safe” seats in which the sitting representative will almost certainly be re-elected. This fact, however, rather than stabilizing our legislature as one might predict using de Tocqueville’s reasoning, has led to an entrenchment of ideas and ultimately to an institutional polarity that one might suggest is even more destabilizing in effect.
This polarity is linked to campaign contributions. Think about it. A two party system is much easier and cheaper to fund than one in which multiple parties and ideas are competing for recognition. As it stands, most corporations hedge their bets by donating large sums of money to both parties. Imagine if there were three or four or even five parties demanding such control!
The polarity between liberal and conservative is also fed by campaign contributors. In the health care debate, the Center for Responsive Politics has done interesting research on campaign contributions and positions on health care reform. They computed a ratio between contributions from labor organizations and contributions from health care corporations and compared this ratio to the voting records of our senators. Those senators with higher ratios, and thus comparatively higher contributions from labor, were more likely to vote yes on the Senate bill. Of course, they were also more likely to be Democrats, indicating a traditional tendency for labor unions to contribute to that party. Those who voted against the bill were more likely to have received larger contributions from the health industry. It is important to understand, however, that both health and labor organizations are sure to hedge their bets by contributing large sums to both parties. It is also important to note that the money does not necessarily represent a “majority” view. (1)
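The Center’s ratio analysis, as I understand it, can be sketched in a few lines of Python. This is a minimal illustration only; the senator names and dollar figures below are invented for the example, not the Center’s actual data or method:

```python
def contribution_ratio(labor_dollars, health_dollars):
    """Ratio of labor-organization money to health-industry money.
    Values above 1 mean comparatively more labor funding."""
    return labor_dollars / health_dollars

# Invented, illustrative records -- not real contribution data.
senators = [
    {"name": "Senator A", "labor": 300_000, "health": 100_000, "vote": "yes"},
    {"name": "Senator B", "labor": 50_000, "health": 400_000, "vote": "no"},
]

for s in senators:
    s["ratio"] = contribution_ratio(s["labor"], s["health"])
    print(s["name"], round(s["ratio"], 3), s["vote"])
```

The point of the exercise is only that a single number per senator lets you line funding up against voting records, which is what makes the Center’s comparison possible in the first place.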
When it comes to money and health care those who are least satisfied with the status quo are almost certainly the least likely to contribute large sums to politicians. Labor organizations such as unions may have larger coffers to represent the interests of working people, but this is only a segment of the population so affected. Obviously the tyranny of the majority is not driving this debate, nor is the majority in any way “omnipotent” in the de Tocquevillian understanding of the term.
The health care debate might help us define American politics in a post de Tocquevillian way. It is the Lieberman Doctrine (Yes, coining terms is my new hobby!) that seems to be the new “omnipotent tyranny” influencing contemporary American democracy.
Now this post is not a condemnation of the filibuster, nor is it a confirmation of the legitimacy of majority rule. In this matter I happen to have the comfort of speaking with the majority. That is not always the case. When I do represent the minority opinion I want to have processes in place, like the filibuster, to protect my interests. I also recognize the reality that the majority is not always right. But certainly these ideas must be revisited.
It is the contention of the Journal of a Mad Sociologist that any great disparity in power, regardless of the holder[s] of such power, is a danger to democracy and humanity. The concentration of power in the hands of the majority has the potential to be just as oppressive as the concentration of power in the hands of Joe Lieberman. Neither should have the power to override what this outlet has defined as a human right, the right to health care.
(1) In most cases this moneyed politics tends to limit the differences between parties. In health care, however, polarization is the result as a conflict between funding sources emerges to define the debate.
I had the opportunity to attend a middle school dance last week. I didn’t volunteer. Apparently I missed a meeting!
I didn’t dance. I’m not much of a dancer.
But I am pretty good at watching crowds and identifying patterns. I didn’t expect to see anything different at this particular dance than at any other dance I’ve been to, regardless of the age cohort. There were the typical groupings: cliques of no more than five sticking together, with boys and girls largely segregated, as is to be expected in that age group. There were the wallflowers and the aimless kids wandering around wondering just what they were doing at this event. The dance floor population was largely girls. Some of the dancing was an obvious expression of sexuality that adults were largely uncomfortable with.
There were a couple of phenomena that struck me, however. First was the nostalgic kick I derived from an apparent renaissance of break-dancing. Of course, it's not called break-dancing any more, but it was there. Kids would gather in circles and watch select dancers do their tricks. They weren't bad. Some of the dancers would even show each other their moves, then put on a show. It so happens that break-dancing hit the scene around the time I was middle-school age (I attended a junior high). The breakers wore baggy, rayon jogging suits, carried huge boom boxes and often pieces of cardboard as a portable dance floor. The breakers today didn't seem so prepared. Boom boxes have been replaced by iPods and rayon jogging suits have gone the way of the dodo, but the kids still had a great time. They believe they invented something, and that's fine. I got a kick out of watching them.
The break dancing circles were open. Anyone, regardless of talent, was able to enter. Of course, less talented sorts didn't remain, as they were conspicuously outclassed by the more talented. The circles were also open to girls as well as boys. Many of those who entered the circles had their followings as well as their signature moves. Like break dancing of old, the dancing represented street contests, one dancer challenging another to match and best his or her moves. On a couple of occasions boys in the circle danced for specific girls on the edge of the circle. As per the script of such interactions, the girls feigned indifference, refusing to be impressed by even the most impressive dance feats. Once the boy's back was turned and the interaction over, the girls demonstrated their glee at being so spotlighted. They had given a perfect performance.
The next thing that I noticed was a contrast to the dances I attended when I was young. In my time the dance floor was often crowded with people doing their own thing to the music. There was little synchronicity. I was surprised to see just how many group oriented dances there were for the kids last week. Of course, this observation is from only one dance, so I don't know if this is the trend. The dance steps, however, were popular and understood by many of the students. This leads me to believe that this is standard fare for modern youth dances. Yes, there were opportunities for dancers to express themselves individually, but many songs offered defined, synchronized dancing similar to line dancing. One song explicitly described the dance moves that were to be done, from the cha cha, to stepping to the left, to stopping with your right foot.
Kids during this dance spent a great deal of time in this group endeavor. Could this be indicative of a general conforming trend among young people? Or perhaps, since this was a school sponsored event, these particular songs were selected for the purpose of increasing groupness, assimilation and conformity. Probably not, but it caught my attention.
Above is a classic interview conducted by Rachel Maddow with Richard Cohen. Cohen believes that homosexuality can be “cured” if you follow his methods. Rachel does a fantastic job of destroying his logic. My only criticism is that I wish Rachel had asked Cohen why homosexuality should be cured. Why should what amounts to a variation of desire, normal desire, be subject to “treatment” and thus defined as an illness? That gay people should be subject to the kinds of stigma that convince them that their love and desires require treatment is an abuse of the very concept of “therapy.”
To be an illness, a phenomenon must inhibit the ability of an individual, or of those who share that individual's life, to live a long and satisfying life. Many homosexuals, including Maddow, demonstrate that whom one desires and how one expresses that desire in no way impacts one's health as defined above. Nor does it impact the health of others who share their lives with homosexuals. The truth is that there's nothing pathological about being gay. Social stigma is the only motivation for defining homosexuality as pathological. And it must be remembered that the pathologization of sexuality is…profitable. It's especially profitable for folks like Cohen who have a line of products for sale by which to “cure” what is, in essence, everybody else's problem.
By playing against a socially constructed stigma, playing against a socialized sense of inadequacy by convincing people that they are flawed, in need of help, people like Cohen do nothing more than contribute to hatred and prejudice. This is true regardless of their so called prescriptions of love and understanding. By virtue of the fact that they define a variation of sexual/emotional desire as something that needs treatment they are defining those who desire and feel in a certain way as being sick, pathological–inferior.
Perhaps we should feel sorry for the conservatives in the ninth year of the new millennium. Thirty years of conservative ideology has produced great wealth for the highest tiers of society, but stagnation at best for the rest of us. Then, in 2008, with the possibility of continuing dominance of the conservative presence in American politics at hand, the whole free market house of cards collapsed around us; perhaps this, more than any campaign strategy, is what brought Barack Obama to high office. With conservatives and conservatism so effectively against the ropes it's understandable that the response is somewhat punch drunk.
Indeed, it's predictable that conservatives might want to regroup, consolidate their members and redefine themselves as a reference group. Exactly what does it mean to be a conservative? And does that definition include being a Republican? After all, conservatism has been the defining ideology of the Republican Party for over half a century. The Republican Party, however, does not hold any special aegis over conservatism itself. Indeed, the opposite could very well be considered true.
Conservative Republicanism has collapsed. There's nothing controversial about this statement. But conservatism itself is alive and well…and boy, are its adherents pissed! Right now they are looking for an institutional carrier to present their worldview. Conservative leaders, such as they are, have stated unequivocally that the Republicans may not remain their chosen vehicle toward this end.
As it stands, the conservative mentality is analogous to what sociologists call social closure. This is when a social group strictly defines the parameters of inclusion and membership. These parameters preclude the possibility of members bridging group identities by participating in other social groups. The idea behind closure is that there are certain, significant benefits to being a member of a particular group, in this case a substantial conservative voting bloc, and that these benefits should only be vouchsafed to true adherents of group norms.
Politics, on the other hand, often requires significant interaction, negotiation, compromise and overlap (bridging) between social groups. Closure based on ideology is frustrated by political processes. Conservatives and liberals, being ideologically driven rather than politically driven, are often disappointed by those whom they believe to be their ideological representatives in politics. Often, dis-empowered groups will accept a certain amount of bridging in order to attain some voice in the corridors of power. However, what happens when the dis-empowered group was only recently in power? Could the current political paradigm be a case study?
Having been discredited, conservatives are retrenching. The problem, according to the conservative weltanschauung, was not conservatism, but rather that past leaders (except Reagan, of course) weren't conservative enough. By compromising with liberals and centrists the conservative cause was lost. Ergo, the solution is not to rethink the values of conservatism, as values are very rarely subject to critique, but rather to purge conservatism of its compromised elements and resurrect the “true” conservative.
This notion became apparent to me while I was discussing (read arguing) politics with my conservative brother who claimed that John McCain was not a real conservative, but rather a “progressive.” Granted, the word “progressive” has gone through some etymological convolutions in the last hundred years, but to make such a suggestion was indicative of the retrenching mindset of conservatives. In my brother’s mind, and that of many others, John McCain, the Straight Talk Express, is not a “real conservative.”
During the 2009 off term election the Republican congressional candidate in New York's 23rd district, Dede Scozzafava, was derailed by conservative stalwarts for not being conservative enough on issues like gay rights and abortion. In one blog post, conservative commentator Michelle Malkin even referred to Scozzafava as a “radical leftist.” Instead, conservatives rallied behind Conservative Party candidate Doug Hoffman, effectively handing the district to a Democrat for the first time since the Civil War. This could be looked at as a demonstration of the benefits of belonging to the conservative group, and of the dangers of choosing a direction contrary to those of conservatism. Some view this move as a nihilistic persecution of independent thinking. Liberal commentators like Rachel Maddow have coined “scozzafava” as a new verb to describe this strategy.
Republican National Committee chairman Michael Steele may well be on the spot if leaders in the RNC bring the so-called “purity test” to the table at its winter meeting in Honolulu (talk about trouble in paradise). The purity test, recommended by some RNC leaders, enumerates ten positions of real conservatives. Those conservatives who agree with fewer than 80% of the positions would be denied RNC funding. The test, ironically titled “Reagan's Unity Principle for Support of Candidates,” may well have the opposite effect of unity. Indeed, according to liberal columnist John Nichols, even Ronald Reagan would fail the purity test that bears his name.
Purity is a typical theme when it comes to establishing social closure. Indeed, a purity test McDonaldizes the process of establishing purity. With ten simple rules one can determine who the real conservatives are (and scozzafava the rest). This process is endorsed by conservative notables such as Glenn Beck and Dick Armey. By bringing such a technology into the Republican Party the aim is clear: to reinforce the Republican Party as the vehicle of conservative ideology.
Historically, however, the Republican party has been home to diverse ideologies. The list of Republican Heroes on the GOP website includes such varied personalities as Abraham Lincoln, Frederick Douglass, Dwight Eisenhower and Ronald Reagan (interestingly, it does not include Theodore Roosevelt). Should a party that makes room for such luminaries, some of whom have solid liberal or progressive credentials, become a purist expression of conservative ideology? According to claims made by GOP leadership, it’s what Ronald Reagan, the new patron saint of conservatism, would have wanted. (Reagan was unavailable for comment)
Indeed, it's understandable that ideological conservatives should aim to control one of the two major political institutions in the country, the Republican Party. Indeed, it could be argued that this process has been going on for quite some time, dating at least back to the Goldwater campaign. At the height of conservative power the Republican Party had no room for politicians like Lincoln Chafee, who tended to have a higher liberal voting percentage than conservative Republicans were comfortable with (calculating liberal/conservative voting percentages is another example of McDonaldizing). The term RINO, Republican in Name Only, has been a term of derision for many years.
And certainly, conservatives are not alone in trying to manipulate the direction of a political party. One might say the same about liberals with regard to the Democratic Party. Neither party, having achieved political power, has a demonstrated history of ideological purity. The argument could be made that, as institutions, they do not owe allegiance to any particular ideological movement. Their main function is to remain empowered institutions. However, I would argue that conservatives have been a much more functional core of the Republican Party than liberals have been for the Democrats. The ideological spectrum between, say, Ben Nelson and Bernie Sanders of the Democrats is much wider than one would find between any two Republicans. The current health care debate demonstrates more solidarity among Republicans than Democrats, and I would hypothesize that this can be largely explained on ideological terms.
The current pursuit of purity among conservatives in and out of the Republican Party seems…different, more persecuting. The advent of purity tests and a punditocracy of Glenn Beck, Rush Limbaugh and Michelle Malkin gives this movement the aura of a postmodern Inquisition and purge. It remains to be seen if Republicans will choose to continue this course. Having been soundly routed, Republicans may see the conservative route as paying dividends.
It’s unlikely that these dividends will pay. Ideological purity narrows the appeal of a political party. In a democratic society it is important for the institutions of power to be as inclusive of as many popular groupings, or to bridge into these groups, as is possible. Exorcising centrists or those with multiple leanings in the interests of ideological purity may be self destructive. Of course, this is the postmodern era where media talking heads mentioned above can sway, at least temporarily, the collective consciousness of large segments of the population.
In the meantime, it may be that conservatives will abandon the Republican Party for more ideologically pure third parties such as the Conservative Party, Libertarian Party, or even a nascent Tea Party. Ironically, this is what I have been advocating with regard to both Republicans and Democrats for many years now.
There's a fundamental argument in the health care debate that has yet to be truly elaborated. The central question is: does our society recognize access to health care as a right or as a privilege? That this question has not been explicitly defined is intriguing, as it could very well push the discourse in a more humanistic direction. That might be the very reason for not framing the health care debate in such a way. After all, rights are very popular, but frowned upon by the power elite as being an undue burden in their quest for a wider profit margin.
As it stands, it looks as though the meritocratic bias of our society predefines access to health care as a privilege. One of my students, echoing the capitalist paradigm, defined health care as a commodity. Like all commodities, health care is subject to supply and demand curves and scarcity. Indeed, this is a strong argument from the perspective of those providing health care and health insurance while trying to secure a profit for themselves and their shareholders. After all, sick people are expensive.
That health care is a commodity is a great descriptor of our current situation. In a health care market the goal is to maximize profits. This is done by cutting the costs of health care providers while at the same time steering health care consumers to the most expensive products. So the people who really need health insurance are more likely to be dropped, denied or subject to exorbitant costs. Those who can afford access to health care are often subject to the most expensive procedures, such as surgery over therapy. This leaves millions of Americans who can only access health services in the emergency room, publicly funded clinics or the extraordinary efforts of charitable organizations. In these instances the costs are passed on to consumers.
If health care is framed as a basic human right, however, then the profit motive must take a back seat to universal access, as making the claim that some people have more rights than others is culturally awkward. It is also subject to a different social process, as rights are socially and historically contingent. The founding fathers had no need to elaborate a right to health care. For them, access to health care was almost universal, and quality of care was comparably equitable regardless of class. It's likely that the health care George Washington received was not much better than that received by anyone else.
But that was before the advent of modern medicine. The medical field has since evolved from an exercise of natural philosophy to a specialized profession of scientific and technological innovation. The miracle of modern medicine has improved the quality of life, but it has also contributed to increased costs of care. The resulting health care industry can be described as a typology along a horizontal axis of cost and a vertical axis of quality. In such a system some are bound to be left out while others will be privileged.
For a long time this typology was not so pronounced as to inspire the scrutiny of society as a whole. The miracle of modern medicine allowed more people to survive infancy and live into old age. We have since taken it for granted that our children will survive and that we will grow old. However, the cost of this historical revolution is the shift in medical needs from the treatment of infectious disease and injury to the much more expensive chronic diseases that correlate to old age and the limits of medical science.
Large corporations developed to fill the market demand for medical access. These corporations were not motivated by the desire to do well by their fellow man, but rather to turn a profit. In an exercise of free market principles at their best, health insurance companies thrived by guaranteeing access to care for a price. Meanwhile, public clinics and government programs such as Medicare and Medicaid picked up most of the slack among unmarketable old and poor people. This system lasted long enough for insurance companies to become entrenched stakeholders in the medical system, earning them a place at the discussion table for a crisis in which they themselves are complicit.
Now we face a culmination of these dynamics. As usual, the realities do not break down along clear lines of right and wrong. Most people would agree to the simple statement that everyone (except illegal immigrants, but that's another essay) deserves access to health care. The question becomes more complicated when we add factors such as how much access, to which technologies, and who pays for it, the consumer or the taxpayer? The fact that there exists an established institutional framework, with its requisite capitalist paradigms, that wields significant wealth and power is yet another obstacle to meaningful reform and a discursive restraint on radical debate.
Privilege vs. Right
The claim that health care is a commodity is a paradigm of privilege that denies the concept of an individual right to health care. After all, individual rights are not for sale. This constraint predetermines the direction of the debate and ultimately the outcome of reform.
A privilege is, by definition, something that is bestowed upon individuals through some social process. The sociology of privilege is based either on merit or inheritance and is a defining feature of status. Achieved status, or status based on presumed merit, is the default discursive formation in American society. Being of high status is assumed to be based on some individual quality, work ethic, drive or competence. Therefore, any privileges corresponding to status are deserved. Those who do not have such privileges may be offered some charitable entitlement, or they may simply be written off as undeserving.
Yet status is also ascribed, or inherited based on factors such as the socio-economic position of one's parents, one's race or gender, one's biological contingencies or limitations. Being born without a congenital health defect privileges one to easier access to health care just as being born to wealthy parents does.
Rights, on the other hand, are much trickier matters. By virtue of America's cultural canon, such as the Declaration of Independence, the Federalist Papers and the Constitution, human rights are inherent, belonging to individuals simply because they are American (or, as some might suggest, human). As such, human rights cannot be denied an individual based on status without some established and limited system of due process.
In the United States the claim to rights is easy to postulate when those rights can be identified in personal terms. The right of the individual to speak or to worship, or even to gather with other individuals is easily defensible. Such rights do not require an investment on the part of others for exercise. The right to free speech does not have attached to it a corresponding right to be listened to, in other words.
Other rights, however, require a commitment on the part of others if the individual is to exercise them. The right to due process, including a jury of one's peers, for instance, requires others to take on the responsibility of serving jury duty. The right to privacy implies a responsibility to keep out of other people's affairs. A more abstract right that most of us would recognize is the right to an education. But in order to exercise that right a sacrifice (one might argue a responsibility) must be made on the part of the community in the form of taxes, land acquisition and the child's time spent away from parents and out of the marketplace. Such rights, however, often come with a certain amount of contention as we debate just how much the community is to sacrifice as compared to what the community can expect to get in return.
Thomas Jefferson did us no favors in defining these inalienable rights so eloquently as “life, liberty and the pursuit of happiness.” All three such rights require some expense on the part of the community if they are to be manifest in the individual. For instance, it could be argued that one must have access to meaningful work at a living wage to enjoy life, real freedom and the opportunity for happiness. However, what is the responsibility of the community, of the society, of government, in assuring such access?
Health care fits nicely into this paradigm. Inequalities in access to health care certainly correlate to inequalities in life expectations, the liberty that comes with good health and the unfettered ability to pursue one's own ends. Those with enough resources have a privileged advantage to these rights. However, rights should not be subject to one's socio-economic status. Rights are inherent in the individual, not in their social position.
Toward a Right of Health Care
Health care was not an issue for our founding fathers. Before the advent of germ theory there was not the same disparity in access to this right as there was in other matters addressed by the founders, such as speech, assembly and privacy. Consequently, health care is not neatly defined in our 220-year-old Constitution. The founders recognized, however, that as times change so does our concept of rights. That's why our Bill of Rights included the 9th Amendment, which ensured that the rights listed in the Constitution are not the beginning or the end of the story.
Since the advent of modern medicine, contemporary statesmen have come to recognize the importance of health care to the satisfaction of individual rights. Many governments and charters formally recognize access to health care as a human right. Not the least of these charters is the Universal Declaration of Human Rights, Article 25. As a signatory to the UN Charter the United States is bound by our own Constitution to recognize the right to health care.
Yet this does not stop others, like Theodore Dalrymple, from suggesting that there is no such right. In his Wall Street Journal editorial Dalrymple, a pseudonym for British physician Dr. Anthony Daniels, states:
Whenever I deny the existence of a right to health care to a Briton who asserts it, he replies, “So you think it is all right for people to be left to die in the street?” When I then ask my interlocutor whether he can think of any reason why people should not be left to die in the street, other than that they have a right to health care, he is generally reduced to silence. He cannot think of one.
I have one! How about the Hippocratic Oath: “I will treat without exception all who seek my ministrations.” Or how about this version of the Oath: “I will remember that I remain a member of society, with special obligations to all my fellow human beings…” Shouldn't we expect a physician to accept an oath taken in their profession as binding? Perhaps I'm asking too much from Dr. Daniels.
How about the story of the Good Samaritan? Well, I'm sure Dr. Daniels doesn't wear a WWJD bracelet.
How about the fact that it is universally, morally repugnant to allow the sick to suffer and die without offering whatever succor and aid we have at our disposal? According to the American Journal of Public Health, an estimated 45,000 deaths a year are associated with lack of health insurance. The study found that those who do not have insurance are 40% more likely to die from their illnesses than those who are insured. This study offers the moral equivalent of Dr. Daniels's people dying in the street.
Jefferson declared that human rights are self-evident, endowed by our creator, that they are inalienable. This was nice rhetoric, but Jefferson was wrong. Rights are a social process of the people demanding that those in power treat them with dignity, and asserting such through whatever means available. As the chronicle of human history demonstrates, the struggle for human rights is ongoing and expanding. It is time to assert a basic and fundamental right to health care.
Our school system needs to be reformed. I believe that and have been an advocate of reform since the beginning of my career. Unfortunately, some time ago, the claim to reform was co-opted by those pushing “higher standards” and “standardized testing” as the means to that end. And it hasn’t worked. And it can’t work. Testing is not reform, it’s just a means of measuring something that still needs to be reformed.
The latest example of the futility of testing ourselves into a better education was reported by the New York Times. It turns out that, in an effort to meet the high standards of proficiency mandated by No Child Left Behind fifteen states have hit upon a plan–lowering their standards of proficiency. If I’m not mistaken, this is the opposite of reform.
At what point are we going to abandon this vain (though lucrative for publishing companies) attempt to test ourselves into better education? Granted, we should not abandon assessment, or establishing standards, but we should not delude ourselves into thinking that assessment and standards will lead to improvements in education.
We are sold this bill of goods by those who insist that American education is a failed system and that the only answer is to set high standards and hold students, teachers and schools accountable to those standards through testing. Well that's great. We've been doing that for eight years and what do we have to show for it? Students who are getting really good at taking tests; teachers who are really good at teaching test-taking skills; and schools that are really good at administering tests. Can we say this is educational reform? Of course not.
The fact is that the American education system is not a “failure” per se. Many students receive an exceptional education; most receive at least an adequate education. Of course, there are those who are woefully under-served. There are many inequities entrenched in education as an institution: racial/ethnic, socio-economic, gendered and regional inequities, as well as inequities with regard to individuals with different learning modalities.
Our schools can address these inequities. There are much better ways to educate our children rather than trying to coerce them into learning by burdening them with high stakes tests. The problem is methodological, so the solution must be methodological. We cannot keep doing the same failed practices and expect that “assessing” the outcomes of these practices will lead to better results. We have to change how we structure our schools, what we do in the classroom. All of our social constructs regarding teaching and education must come under scrutiny…scientific scrutiny.
For the last couple of years I've come to realize that the debate about global warming has become so polar that it might be time for environmentalists to change tack. As a paradigm-shifting discourse, global warming has gone as far as it can. Those who accept the mountains of data verifying the reality of global warming may be convinced of the righteousness of their cause; however, those who deny the validity of global warming are also entrenched in their own ideas. I use the term Ostrich Theory to describe such individuals. They are so invested in their beliefs that they will ignore any evidence to the contrary and embrace any supporting evidence regardless of its inanity. Another term for this is “confirmation bias.”
That's not to say that environmentalists and scientists should stop educating us on global warming. Indeed, there may be some whose minds are not made up. There are the upcoming generations, those who will be most affected by global warming, who need to know what they are walking into. Also, it's possible, albeit unlikely, that some global warming deniers can be convinced of the validity of global warming claims (of course, the same could be said about those who accept global warming!).
Regardless, the debate has stagnated. Instead of a didactic discussion in which the merits of one side prevail over the other, we are faced with a polar issue with clear and equally entrenched lines of division and disagreement. Environmentalists, those who are driving the debate on global warming, can only lose legitimacy in such a contest.
For this reason I suggest changing our paradigms for encouraging the development of alternative fuels. Yes, burning fossil fuels is the cause of a significant amount of global warming, but its harms extend well beyond that. There are significant health costs associated with the burning of fossil fuels. The New York Times reports that as many as 20,000 people die every year due to health complications resulting from the combustion of fossil fuels and the resulting pollution.(1) The Times is referencing an extensive report by the National Academy of Sciences that puts the unaccounted health costs of burning fossil fuels at $120 billion!
If anything, this report understates the hidden costs of fossil fuel, as it only includes those attributable to energy production. The profound damages caused by extraction and waste or accidents (such as the TVA coal ash spill in Tennessee pictured below), as well as the political costs of being dependent upon other nations for our energy needs, were not included in this assessment. The study also excluded trains, ships and planes from its cost analysis. If all costs were included it's likely that the figures would be much more dramatic.
Global warming is an abstract concept laced with a certain amount of futility. On the one hand, the convolutions of as dynamic a system as the global climate are so far beyond the understanding of most people (including the scientists who study it) that it's hard to make a concrete claim. The complex mathematics that describes global warming can be reshaped into convincing arguments that global warming is not an issue at all, perfect for the “Ostriches” noted above.
Even if convinced, one must deal with the ominous reality that this is a global issue requiring people all over the world to work together toward a common solution, something that has never happened in all of history. How can we as activists effect change in the face of the awesome obstacles of dynamic climatic phenomena as well as the daunting task of getting politicians to put their immediate national interests aside to come up with a global consensus? Then there are the economic contingencies that are so complex as to defy description.
But the information in the above study is clear and disturbing in its conservatism. People are dying and the rest of us are losing billions of dollars in the fossil fuel game.
The costs of fossil fuels are not just defined by the digital scorekeeper on our gas pumps. There are hidden costs that need to be brought to light in a study even more comprehensive than the one mentioned above. When these costs are factored into the final billing for fossil fuels, suddenly the benefits of alternative energy become much more marketable. This is true even if global warming is not included in the discussion. Wind energy and solar energy, so far billed as too expensive to replace fossil fuels, are comparatively more affordable in light of the NAS study.
1. The number of dead does not include those who die as a result of global warming. The World Health Organization estimates that over 150,000 people worldwide die every year as a result of global warming.
In our culture we tend to look at addiction as a moral failing or a character fault, despite the volumes of evidence pointing to profound biological variables. We know that addiction is a physiological response to intoxication. Yes, there are social and psychological factors that are important, but by the time we get to the point of full-blown addiction, it’s the physiological dependency that plays the leading role. We also know that some people are more prone to addiction than others, and some people become more physically dependent upon certain drugs than others do.
Despite this knowledge, we still hold to the archaic social construct of moral failing with regard to addiction. And the cure for this moral failing is purging (purgatory). To redeem oneself from moral failing, one is expected to pass through a trial of purgatory, to cleanse oneself. Hence we possess a certain admiration for erstwhile addicts recounting their stalwart determination against the torment of withdrawal, only to emerge from this suffering clean and cured. Twelve-step programs are designed to lead people through a process of purgatory that is defined as a daily struggle. Purgatory becomes an expected and accepted paradigm for stories of overcoming addiction.
And the approved course of action for dealing with addiction is cold turkey. Even progressive treatments such as the nicotine patch or gum carry the perceived value of quitting cold turkey when, in fact, one is merely diminishing one’s addiction in stages. We must also remember that, according to prevailing paradigms, once we are addicted, we are always addicts subject to a fall from grace.
So now we are faced with this story from the New York Times about using controlled doses of heroin to treat heroin addiction. A study published in the New England Journal of Medicine suggests that the use of diacetylmorphine, the active ingredient in heroin, produces better results than methadone for heroin addicts who have already failed other treatments. Heroin addicts were more likely to continue treatment using the diacetylmorphine than the methadone. Of course, methadone has the benefit of mimicking the chemical interactions of heroin without producing the high. Hence, it’s a great “cold turkey” solution that reduces the purgatorial horrors of heroin withdrawal.
Using heroin, on the other hand, offers no such approved means of resocialization in our culture. There’s no purgatory. There’s no cold turkey. There may even still be a “high” involved. And getting high is just not approved (getting drunk is okay, but getting high will not be tolerated in polite company).
The Journal of a Mad Sociologist is dedicated to revealing what I call social schizophrenia. It’s my contention that social schizophrenia results when contemporary knowledge is influenced by outdated social constructs. Our contemporary knowledge of addiction is still governed by often medieval concepts of personal character and failing, purging and self-control. Progress in knowledge, ergo progress toward freedom and justice, occurs when we liberate ourselves from outdated social constructs and formulate relevant constructs, along with institutions to put those constructs into action.
Have you ever been a part of a group in which secrets were being shared? Where were you in relation to those secrets? Were you the person sharing the secrets? The person learning the secrets? Or were you out of the secrets loop? Perhaps you were the subject of the secrets. Regardless, where you stand in relation to the secrets being shared is a defining characteristic of your status within a group or sub-group, because secrets are a mechanism of power.
The same is true at the national or societal level, though admittedly the dynamics become much more complex. In any given society there are those who learn and share information, those who receive the information, those who are the subjects of that information and those who are not. What’s more, this flow of information is typically directed upward along the social hierarchy, with those at the top, whom we can call the power elite, receiving the most information and those at the bottom receiving the least.
Those at the top of the ladder have access to the most sophisticated technologies and infrastructure for gathering, disseminating and analyzing information, while those at the bottom are mostly dependent upon the elite to share their information. And this is where secrecy comes in. The more the power elite can control the dissemination of information to the lower tiers of the society, the more power they can incorporate into their social groups. Indeed, one measure of power may very well be one’s ability to control the flow of information. Power may very well be defined as the product of how much information one can glean from other groups and how much information one can keep secure from the knowledge of those groups.
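The “product” definition above can be sketched as a toy calculation. This is only an illustration of the idea, not a real metric: the function name, the 0-to-1 scales, and the example numbers are all hypothetical.

```python
def power_index(info_gleaned: float, info_kept_secret: float) -> float:
    """Toy index of a group's power: the product of the fraction of other
    groups' information it can gather (info_gleaned) and the fraction of
    its own information it can keep secret (info_kept_secret).
    Both inputs are assumed to be fractions between 0 and 1."""
    if not (0.0 <= info_gleaned <= 1.0 and 0.0 <= info_kept_secret <= 1.0):
        raise ValueError("both inputs must be between 0 and 1")
    return info_gleaned * info_kept_secret

# A group that gathers most available information (0.9) and leaks little
# (keeps 0.8 secret) scores far higher than one that gathers little (0.2)
# and keeps little secret (0.3).
print(round(power_index(0.9, 0.8), 2))  # 0.72
print(round(power_index(0.2, 0.3), 2))  # 0.06
```

The multiplicative form captures one feature of the argument: a group that gathers everything but keeps nothing secret (or vice versa) scores low, since power here depends on both surveillance and secrecy together.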
The Bush Administration knew this very well. They ran a tight ship of secrecy, making sure that very little leaked from the inner sanctums of their power groups. Everything was subject to strict control, to the point where Vice President Cheney actually invented his own Top Secret classification. To avoid sharing information, the Bush team claimed executive privilege. When that didn’t work, Cheney actually defined himself as part of the legislature rather than the executive, and therefore not subject to judicial demands for executive information, while at the same time claiming executive privilege. The Bush tango around releasing information was almost a thing of beauty.
At the same time, no other administration placed so much emphasis on gathering information on as many people as humanly possible while excluding as many as possible from the flow of information. It was under the Bush Administration that Total Information Awareness was developed, shot down by Congress, then forgotten, then renamed and passed on to the NSA. Those laws designed to protect the privacy of American citizens? No problem. Ignore them, undermine them, undercut them, and make sure no one knows what we are doing. Those who do know, such as the “gang of eight” legislators on the House and Senate intelligence committees, are bound by the strictest secrecy, enforced by law or political maneuvering, to guarantee that no one learns the extent of the information being gathered. Otherwise, order the CIA to conduct programs without even telling Congress. If Congress asks questions, lie.
Now we have a new administration inheriting the power infrastructure from what may, by the algorithm defined above, be defined as the most powerful in the history of this country. Will Obama take a step back and scale down these huge discrepancies of power? He has promised that his would be a transparent administration. In some ways, maybe it is. On the other hand, the infrastructure, the rationalization, the experts are all there for his use, and Obama does not seem interested in dismantling such a colossal power structure. If Lord Acton was correct about the corrupting effects of power, then we must admit that Obama has inherited the most corrupting infrastructure in history. Power institutions do not have a history of disempowering themselves.