Saturday, December 15, 2007

WHAT DO WE DO WHEN VALUES COLLIDE IN DECIDING CONTROVERSIAL PUBLIC POLICY ISSUES?

While most public policy issues are easily viewed from a liberal or conservative perspective, any rational person must acknowledge that opposing moral views are involved in some of the most controversial issues America confronts today. Such moral conflicts arise in thinking about the use of torture in interrogating terrorist suspects, the provision of the best health care for all people, and the determination of appropriate limits on legal immigration. I think that we should be most critical of people, including politicians, who do not seriously consider and attempt to counter the moral convictions of those who come down on the other side of each of these issues.

One: Torture. When legal techniques fail to elicit information regarding terrorist plans, should the United States employ criminal interrogation techniques, including waterboarding (simulated drowning), if these methods may produce information that will enable us to disrupt terrorist plans and prevent terrorist actions?

John Kiriakou spent 14 years at the CIA, including a tour in Pakistan from 1998 to 2004. He was involved in the capture of Abu Zubaydah, an Al Qaeda militant, and was one of the first to interrogate him. The initial questioning bore no fruit, so waterboarding and other exceedingly harsh methods were applied. Kiriakou states that new or severe interrogation techniques were never applied without the approval of superior officers. Because Kiriakou was on his way to another assignment after the initial interrogations of Abu Zubaydah, his recounting of events after that point is based on classified cables and private communications with his colleagues. (Kiriakou says that he did not know that the interrogations were taped, and he disagrees with the decision to destroy the tapes; the faces of the CIA interrogators could have been blurred to protect their identities.) Although Kiriakou’s recent public account omits details regarding the method of waterboarding, he has confirmed that the simulated drowning of Abu Zubaydah lasted 35 seconds. Before waterboarding, this Al Qaeda operative was “defiant and uncooperative”; afterward he was compliant and provided valuable information. Kiriakou’s conclusions are that waterboarding is torture, that it worked in the case of Abu Zubaydah, and that it “probably saved lives.” He also believes it is no longer necessary.

Waterboarding is torture. It is contrary to the Geneva Conventions and American law. Its use by the United States justifies the use of criminal methods by other nations against American nationals. It may lead prisoners to provide false information in order to appease their interrogators. And yet, as Kiriakou asserts was the case with Abu Zubaydah, it may lead a prisoner to reveal information that results in the saving of lives. Is it then justified?

Two: Health care. Optimal health care coverage would provide the highest practical level of medical care for all Americans. This could be achieved only by barring the independent medical practice that benefits people prepared to pay the additional cost of extraordinary treatment. Should the United States underwrite a health care system that provides extensive universal medical insurance, even if it sets limits on those who are prepared to pay for more?

Any system of public health insurance must have finite limits. However generous the nation might be in preventing and treating disease, and however advanced medical science becomes, some treatments will be so expensive, risky or rare that their coverage cannot be justified. Medical science seeks to alleviate and cure both common and rare medical conditions, but there comes a point at which choices must be made. American society cannot afford to employ limited medical resources to treat every adverse condition of every person.

There are always experimental and costly treatments that are available only to those prepared to pay the price. To the extent that society offers that option, it restricts the employment of medical resources for less costly treatments for a larger population. This option exists in America’s present system of health care and in every socialized system such as Great Britain’s in which doctors and patients can choose between public and private coverage. It would not exist in a genuinely socialized scheme of universal single-payer coverage. Should the United States subscribe to the utilitarian principle of “the greatest good for the greatest number,” knowing that carried to its logical extent this principle would deny the right of those who could afford it to buy even greater medical support for themselves?

Three: Immigration. For millions of impoverished people in Central and South America, the United States holds out hope for a better life. The millions of illegal immigrants already here pose challenges to the American economy, the ability of educational and medical institutions to serve them, and the nation’s social coherence. Should we attempt to compel those already here to leave? How can we keep millions more from gaining entry?

From its founding, the United States has represented to the rest of the world a haven for “your tired, your poor, your huddled masses yearning to breathe free.” Geographically, its vast borders provide access for those with the cash or courage to seek entry. We are all (so-called Native Americans included) immigrants or their offspring. Today’s immigrants perform the crop-picking, dishwashing, and taxi-driving jobs that those of us long settled here do not want. Nearly all illegal immigrants obey the laws in every other way and seek to adapt to American values.

At the same time, illegal immigrants unwittingly change established communities, increase the strain on local resources (including an increased cost of English-language education), and are prepared to work longer for less pay, challenging the living standards of unskilled American citizens. Nations must have the right to protect themselves against invasions, even invasions by unarmed hordes. What can we do? At what cost can unwanted aliens be expelled? How long and how high would the wall have to be that would succeed in keeping them out? Or should the United States welcome and not resist the entry of anyone who wishes to come here, short of permitting the entry of criminals, carriers of contagious diseases and those who would immediately become wards of the state? Beyond such considerations, may we draw a line? Where do we draw a line? And how do we draw the line?

Thursday, December 13, 2007

DID ROMNEY DEFEND HIS RELIGIOUS BELIEFS OR DENY THE RIGHTS OF NON-BELIEVERS?

Mitt Romney’s speech on December 5, 2007 in defense of his Mormon religion mentioned the word “Mormon” only once and never mentioned the name of Romney’s church, the Church of Jesus Christ of Latter-day Saints. His avoidance of his religion was signaled in the title he gave to the speech, “Faith in America.” It turned out to be something else: an address on Romney’s faith in Faiths, but not all Faiths, and certainly no faith at all in those who have no Faith.

The exclusion of those who did not profess a belief in religion was made clear at the outset, when Romney said “Freedom requires religion just as religion requires freedom.” The first part of that sentence is prejudicial, because it leaves non-believers unfree. The second part of that sentence is preposterous because it ignores all intolerant faiths, including the state religions that Romney criticizes later in his speech as “theocratic tyranny.”

Romney stated “I believe in my Mormon faith and I endeavor to live by it,” an expression of belief that surely requires at least a simple statement of how it differs from other faiths, yet his only profession of religious belief is that “Jesus Christ is the Son of God and the Savior of mankind.” That belief encompasses Christians as well as Mormons, including the Evangelical Christians who will participate in large numbers in the Iowa caucuses, but excludes Jews, Muslims, Buddhists, Hindus, and members of other faiths that do not acknowledge the divinity of Jesus, as well as atheists, agnostics and other non-believers. It must also trouble millions of Christians who believe in a genuine separation of church and state.

Anyone who suggests parallels with John F. Kennedy’s 1960 speech on his religion and the presidency has not read or does not remember what Kennedy said. Kennedy said “I believe in an America where the separation of church and state is absolute,…where no church or church school is granted any public funds or political preference;… where no religious body seeks to impose its will directly or indirectly upon the general populace or the public acts of its officials.” He opposed diplomatic relations with the Vatican, aid to parochial schools, and other government support of organized religion. His was a categorical denial of the role of religion in government.

By contrast, Romney finds a necessary linking of church and state. Citing abolition, civil rights and the right to life itself, Romney makes a sweeping claim that “no movement of conscience can succeed in America that cannot speak to the convictions of religious people.” Unquestionably, the abolition and civil rights movements have required the support of religious as well as non-religious people. It is also true that the strongest opponents of the abolition and civil rights movements have included religious as well as non-religious people.

To make the point that the United States is founded in religion, Romney says, “We are a nation ‘Under God.’…We should acknowledge the Creator as did the founders – in ceremony as in word.” He cites God on our currency and in the pledge of allegiance, ignoring the absence of God, the Creator or Jesus from the American Constitution, as well as the explicit constitutional prohibition of any religious test for any office or public trust. Romney is also wrong about the history of “under God” in American public life. To cite one example, the original pledge was written in 1892 by a socialist, Francis Bellamy, and it did not contain the words “under God” until Congress added them in 1954.

As for the moral implications of Romney’s political views, in a single Republican presidential debate, he opposed ever raising taxes, refused to define torture or waterboarding, and said of persons who have been held for as long as six years at Guantanamo: “I want to make sure that these people are kept at Guantanamo and not given legal representation in this country.” Millions of Americans, both religious and non-religious, will characterize these positions as immoral.

In contrast to Kennedy’s endorsement of the separationist views of Thomas Jefferson, James Madison and other outspoken founders of the United States, Romney believes that government and religion are inextricably bound together in ways that receive no support in the Constitution or the debates preceding its adoption. Given America’s heterogeneous political climate, it is probably unavoidable that all of the leading candidates for the presidency in 2008 arouse the hostility of many citizens who do not share what they perceive to be the candidate’s core values. Among them, Mitt Romney has the dubious and dangerous distinction of being the most divisive.

Monday, October 15, 2007

HOW WAS AL GORE DENIED THE VICTORY HE HAD WON IN THE 2000 PRESIDENTIAL ELECTION?

As soon as I heard that Al Gore had won a Nobel Peace Prize, I could not help recalling the prize he won but was denied: the American presidency. We should not forget how much America lost in that election. Every fact in this brief account can be verified in detail. We cannot put the past behind us, because we are living with its awful consequences.

Vice President Al Gore had a plurality of more than half a million votes nationwide over Texas Governor George W. Bush, but victory depended on which candidate won Florida’s electoral votes. In the early Florida vote count after Election Day in 2000, the official difference between them was less than three-hundredths of one percent (0.0299 percent) of the total, and the Florida Election Code required an automatic recount in every one of the state’s 67 counties if the margin of victory was less than one-half of one percent. That recount never took place. Two weeks after the election, the Florida Supreme Court unanimously ruled that hand counts must be included in the vote totals. Every possible obstacle was placed in the path of conducting such a count. Texas, the state of which Bush was governor, required that all punch-card ballots be counted where a chad (the piece punched out by the voter) is not completely detached, with the overriding concern being the “clearly ascertainable intent of the voter.” But that was not the intent of those who acted on behalf of candidate Bush.
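
To see how decisively the numbers triggered that requirement, here is a minimal sketch in Python; the vote totals below are approximate figures widely reported for the first machine count, used here only for illustration, since the text itself gives only the percentages:

```python
# Rough illustration of Florida's automatic-recount rule described above.
# The vote totals are approximate first-machine-count figures, used for
# illustration only; the text itself gives only the percentages.

BUSH_LEAD = 1_784            # approximate initial statewide margin, in votes
TOTAL_VOTES = 5_963_110      # approximate total presidential votes cast

margin_pct = BUSH_LEAD / TOTAL_VOTES * 100
RECOUNT_TRIGGER_PCT = 0.5    # Florida Election Code: recount if margin < 0.5%

print(f"Margin of victory: {margin_pct:.4f}% of the total")
print(f"Automatic recount required: {margin_pct < RECOUNT_TRIGGER_PCT}")
# A margin of roughly 0.0299% -- about one-seventeenth of the threshold.
# The statewide recount was mandatory, yet it never took place.
```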

The strategy of the Bush camp was to prevent a full recount before December 12, the date after which Florida could avoid a challenge by another slate of electors, should the complete vote reveal that Gore had in fact won. The tactics employed were to stall, delay, object, and terminate the vote counts that had been undertaken, in order to prevent a full recount before that date. When Florida Secretary of State Katherine Harris declared Bush the victor without a recount, the Florida Supreme Court extended the vote-count deadline. The Republican-controlled Florida legislature was called into special session to appoint electors pledged to Bush.

On December 8, the Florida Supreme Court ordered an immediate manual recount of all ballots on which no vote had been machine-recorded. On the next day, shortly after the Miami-Dade County manual count resumed, the United States Supreme Court halted the hand count. It heard arguments on December 11, and on December 12 announced it was now too late to conclude the count. As much as the 5-4 Supreme Court majority wanted to shoot down Gore’s candidacy, they wanted someone else to fire the shot. Whether motivated by propriety or cowardice or political calculation, they wrote an opinion returning the case to the Florida Supreme Court, but the Florida court was not allowed to formulate a rule for continuing and concluding the hand count, or to do anything except declare that Gore’s candidacy was dead. The following day, Gore conceded defeat. This shameless stealing of the election after the votes were cast was exceeded only by the tactics employed to manipulate the voting itself.

Voters were denied the opportunity to cast their votes. Evidence demonstrates that the names of black voters were removed from registration rolls, voting sites were moved without notification, ballot boxes went uncollected, polling places were understaffed, language assistance was denied, and voting machines were unreliable. Secretary of State Harris, without competitive bidding, hired a private company to compile lists of convicted felons who should be barred from the polls. Many on the lists were innocent people whose names merely sounded like those of convicted felons (no effort was made to check the names against Social Security numbers), many had been convicted only of lesser crimes (a misdemeanor conviction does not involve losing the right to vote), and thousands had been convicted and had served their time in one of the thirty-seven states that restore citizenship rights. Jeb Bush and Harris both testified under oath that verification of the felon lists was neither their responsibility nor that of the company that compiled the lists. A University of Minnesota study estimated that if 80 percent of those unfairly barred from voting had voted, the result would have been more than 35,000 additional votes for Gore and 4,000 for Bush.

Voters were not helped when they should have been. Although Florida law allows disabled voters to have another person assist them, ballots from elderly or disabled voters were disallowed when their signatures did not match the originals on file.

Illegal voters and others were helped who should not have been. The Seminole County elections supervisor allowed the Republicans two weeks to make corrections and resubmissions on thousands of absentee ballots. In Martin County, Republicans were allowed to add identification numbers to application forms for absentee ballots and to remove applications from the elections supervisor’s office. In addition, The Miami Herald found that ineligible voters were allowed to sign in at polls where they were not registered.

Voters were unconstitutionally misled. As required by law, the local newspaper in Duval County printed a sample ballot. It stated that voters should vote on every page, but the names of the presidential candidates were printed on two pages. Voters who followed that instruction cast voided ballots. Thirty percent of the undercounted ballots in Duval County were cast in heavily black precincts that voted ten-to-one for Gore. As a consequence of this two-page presidential ballot and the improper instructions that accompanied it, Duval County discarded more than three times as many ballots in 2000 (26,909) as it had in 1996 (7,762).

In Palm Beach County, the so-called butterfly ballot listed candidates’ names on facing left-hand and right-hand pages. The major party candidates were listed on the left-hand page, with Bush first and Gore second. Patrick Buchanan was listed first among the minor party candidates on the right-hand page. Voters had to punch a hole in a middle column that combined the lists, alternating names of candidates from the left and right. This meant that Bush, first on the left, was first in the center; Buchanan was second; and Gore, second on the left, was third in the center column where voters made their choice. A voter could reasonably assume that the candidate listed second on the page would correspond to the second hole in the center column. In that way thousands of Palm Beach County voters who intended to vote for Gore mistakenly voted for Buchanan.

This ballot format violated Florida law that clearly specifies that votes must be cast to the right of the candidate’s name. It also violated common sense in requiring that a vote for the second candidate on the ballot (Gore) meant punching the third hole, and a vote for a candidate much lower on the ballot (Buchanan) meant punching the second hole. Because many voters recognized their error, they punched two holes, invalidating their ballots.

Buchanan received one-fifth (19.6 percent) of his statewide Florida vote in Palm Beach County, compared with one-twentieth (5.4 percent) in 1996, when the butterfly ballot was not used. It takes a leap of faithlessness to believe that Buchanan received his greatest support in a precinct consisting of a Jewish old-age home housing Holocaust survivors, who until that election had despised Buchanan. To his credit, Buchanan publicly stated his belief that these voters had not intended to vote for him. Analyses indicate that he received between 2,000 and 3,000 votes that had been intended for Gore, vastly more than Bush’s official victory margin of 537 votes.

The votes of many voters were not recorded through no fault of their own. One-third of all Florida voters lived in punch-card counties, with a high proportion of low-income, minority, and first-time voters, where ballots were nearly three times as likely to be rejected as ballots in counties using optical-scan systems. (An inventor of the punch-card machine testified to its unreliability.) Dimpled ballots, that is, punch-card ballots in which an indentation has been made but there is no visible hole, were counted in Texas, Illinois, Massachusetts and other states, but not in Florida. It is not surprising that Palm Beach County, with its butterfly ballot, and Duval County, with its improper instructions to voters, both with all the problems associated with punch-card voting, accounted for thirty-one percent of the uncounted ballots, even though they cast only twelve percent of the statewide vote.

There were voters whose intentions were clear but whose votes were never counted. Thousands of voters indicated their preference for a candidate in the normal fashion and then wrote in the name of the same candidate in the place reserved for write-in votes. (This could not occur when voting on optical scanners.) Whether due to oversight or ignorance, these voters had not followed instructions, but their intentions were unmistakable. Whatever form the voting procedure takes, it is designed to be nothing but an expression of the voter’s intention, not an intelligence test. Voters who emphasize and underline their choice by both punching a ballot and writing in the same candidate’s name have made the most unmistakable demonstration of their intention. Had these overvotes been counted, Gore would have received many times more votes than were necessary to overcome Bush’s official lead.

In retrospect, it is evident that the media exit polls on Election Day that called Florida for Gore were correct, because they reported the intentions of the voters who went to the polls and who believed that they had cast valid ballots. The National Opinion Research Center of the University of Chicago was hired by a consortium consisting of The New York Times, Washington Post, Wall Street Journal, and other newspapers to undertake a first count of approximately 120,000 overvotes (where voters appeared to have made two choices, even if both were for the same candidate) and 60,000 undervotes (where the machine count had not revealed the voter’s choice).

The consortium calculated eight ways of counting disputed punch-card votes – correctly marked paper ballots, full punches, poorly marked paper ballots, chads detached at three corners, chads detached at two corners, chads detached at only one corner, dimpled chads that admit sunlight, or dimples alone – and concluded that Gore would have won the Florida vote by every one of the eight methods. Of course, no examination after the fact could take into account the thousands of voters who had mistakenly voted for Buchanan instead of Gore, the thousands who compounded their mistake by voting for both Buchanan and Gore, and the would-be voters who had been turned away from the polls or were discouraged from voting because of intimidation or the improper appearance of their names on lists of felons barred from voting.

Judicial involvement and determination of the result represented an unconstitutional intervention in the political process, because it violated the Electoral Count Act. That Act, adopted to prevent a repetition of the misguided judicial involvement in the election of 1876, states that Congress is the primary body to resolve any lingering contention after the states have tried to settle electoral disputes. It is an authority that was never given to the courts.

Who were the Justices who so egregiously made it impossible to ascertain the true will of Florida voters, acted contrary to their long-proclaimed deference to states’ rights, and awarded the election to the candidate who had lost the vote? During the 1986 hearings on William Rehnquist’s nomination as Chief Justice, five witnesses testified that Rehnquist had harassed black and Latino voters at Arizona polling places in 1962, demanding to know if they were “qualified to vote.” Justice Antonin Scalia’s son was a member of the law firm of Theodore Olson, who argued the Bush case before the Supreme Court. Justice Clarence Thomas’s wife was employed by the ultra-conservative Heritage Foundation to vet prospective office-holders in a future Bush administration. When Justice Sandra Day O’Connor heard Dan Rather call Florida for Gore on CBS, she exclaimed, “This is terrible.” Her husband explained to other guests at the election-night party they were attending that his wife was upset because they wanted to retire to Arizona, and a Gore win meant that they would have to wait at least another four years. These facts demonstrate not illegal conduct but an indifference to the appearance of impropriety that should have led the Justices to remove themselves from consideration of the case.

Gore would have won Florida and the election had there been an immediate recount in every one of Florida’s 67 counties, or even just a revote in Palm Beach County. An unprecedented political juggernaut was rolled out to ensure that this would not happen. It employed the political power and resources of Governor Jeb Bush, the presidential candidate’s brother, who recruited his own political operatives and volunteers to move into disputed counties; James Baker, the Bush family’s consigliere, who masterminded the teams of lawyers and political operatives who engaged in “spontaneous demonstrations,” including a window-pounding protest that succeeded in halting the Miami-Dade County recount (many participants were rewarded with appointments in the Bush administration); Secretary of State Katherine Harris, co-chair of George W. Bush’s Florida campaign, who was prepared to change the rules and interpret the results as proved necessary in order to ensure Bush’s election; the Florida House of Representatives, which voted on party lines to appoint electors pledged to George W. Bush, irrespective of how the recounts turned out; the Florida Senate, which was prepared to cast a similar vote if the U.S. Supreme Court had not resolved the election in Bush’s favor; and, finally, the five predictable members of that Court.

The most fundamental elements of an American presidential election include the right of all qualified citizens to cast a vote, an accurate count of all discernible votes, and the awarding of a state’s electoral votes to the candidate whose slate of electors received the most votes. None of those conditions was met in Florida in 2000. The great pleasure that most Americans feel at Gore’s winning the Nobel Peace Prize must be tempered by our awareness that a corrupt coalition denied him the presidency after he had won the election. There are banana republics that conduct more honest elections than the United States did in 2000. It was the first time that Al Gore taught us an inconvenient truth.

If you wish to subscribe to Thinking Out Loud, e-mail thinkingoutloud@stanleyfeingold.com, and write "subscribe." The price of your subscription is occasionally thinking out loud by responding. Comments, criticisms and corrections are welcome. This essay is an exception to my rule to keep the essays under 1500 words. To unsubscribe, write "cancel."
Stanley Feingold

Saturday, October 6, 2007

MUST AMERICA EMPLOY WAR PROFITEERS AND MERCENARIES IN IRAQ?

The United States is engaged in an undeclared war (already among the longest in American history) against an unnamed enemy (“terrorism” identifies neither whom we are fighting nor what victory would mean), by means that do not affect most Americans but will produce great profits for private contractors and great indebtedness for future generations.

This indictment demands demonstration. The war in Iraq is an undeclared war: the last congressional declaration of war marked American entry into World War II, although the Constitution requires such a declaration. It is among the longest of wars: it has now lasted four-and-a-half years, and there is no end in sight. The enemy is unnamed: Al Qaeda was not even in Iraq when America invaded that country in March 2003, and the enemy now consists mostly of unnamed nationalists, insurgents and terrorists, whose ranks are subject to augmentation or diminution at any time, and with none of whom America can sign a treaty ending the war.

It once would have seemed unimaginable that a war of this magnitude, of such great cost to the United States, and involving such great loss of life to combatants and innocent civilians, could be waged without having an impact on the lives of most Americans. Yet it is clear that, apart from the high cost of gasoline (insofar as it is related to the war), the overwhelming majority of Americans are personally untouched, neither knowing anyone in the armed forces nor being asked to make any sacrifice for the cost and conduct of the war.

The real cost now exceeds one trillion dollars. No one can guess how much greater it will be by the time the United States leaves Iraq. To whom is that money going? The answer is that never in American history have private corporations profited so greatly or corruptly from the performance of tasks that until now were considered the responsibilities of the armed forces. Most non-combat roles, and the use of armed security forces, have been outsourced to the 630 private companies that work for the United States in Iraq, which employ approximately as many persons as there are in the American military.

Of necessity, some of these are Iraqi firms, employing non-American personnel, because they have local knowledge and linguistic skills that the American armed forces do not possess. The vast majority of the contracts, dollars, and personnel, however, belong to American civilian contractors, who are subject to virtually no oversight or accountability for how and how much they spend or how much they profit.

The most famous, or infamous, of the American contractors (though far from the largest) is Blackwater U.S.A., which provides security forces for the U.S. State Department and is not subject to supervision or control by the U.S. Department of Defense. Blackwater, a company that contributes heavily to the Republican Party, was hired (as were other contractors) without competitive bidding. The notoriety of Blackwater derives from documented instances (described by American and Iraqi eyewitnesses) in which Blackwater employees have opened fire on and killed unarmed Iraqis without provocation. As of this writing, Blackwater security guards and other personnel have been involved in 195 shooting incidents since 2005. In at least two cases, Blackwater, with the approval of its employer, the State Department, made cash payments to family members of its victims who complained, and it has sought to cover up other cases.

Last year on Christmas Eve, a Blackwater employee, while drunk, killed a bodyguard for one of Iraq’s two vice presidents. Blackwater, with the help of the State Department, spirited the assailant out of Iraq within 36 hours. More than nine months have passed, and no charges have yet been brought against him. Many other charges against these guards have been made by Iraqi officials and American military officers. Until now, the guards have been immune from prosecution in Iraqi courts and protected by agencies of the American government from effective prosecution in the United States.

Given the cost in human lives, it might be callous to consider the economic cost, but for the fact that the war has been a source of great wealth for those to whom the United States has outsourced much of the cost. A single example, typical of arrangements with other companies, will indicate how wasteful it has been for the United States and how very profitable it has been for the companies and individuals who have received these contracts.

Blackwater pays an individual security guard $600 a day (which comes to $180,000 a year), four or five times the income the guard received as a member of the American military. (He also benefits from armored cars that are safer than Army vehicles.) To the guard’s pay, Blackwater adds a 36% markup (for a total of $815 a day) plus its overhead and costs in Iraq, including insurance, room and board, travel, weapons, ammunition, vehicles, office space, and equipment. This bill goes to Regency Hotel, a Kuwaiti company, which tacks on the cost of the vehicles and weapons it buys, plus a profit for itself, and sends an invoice to ESS, a German food services company that cooks meals for the troops. Regency has billed ESS $1,500 per man per day while telling Blackwater it was charging $1,200, keeping a substantial hidden profit for itself. ESS adds its own costs and profit and sends its bill to Halliburton, which also adds overhead and profit and presents its bill to the Pentagon. The United States has no contract with ESS, which will not provide any information to the government or to the relevant congressional committees.

Halliburton’s contract is an open-ended “cost-plus” contract to supply the U.S. armed forces with food, laundry, and other necessities. Cost-plus means the United States pays Halliburton all of its expenses (everything it spends and everything it pays to its subcontractors) plus a 2% profit on top. The more it spends, the greater the profit it makes. Henry Bunting, a former Halliburton purchasing officer, has stated, “There is no incentive for KBR (Kellogg, Brown & Root, a Halliburton subsidiary) or their subs to try to reduce costs. No matter what it costs, KBR gets one hundred percent back, plus overhead, plus their profit.” Up to this point, the Army has committed $7.2 billion to a single contract with Halliburton. The Defense Contract Audit Agency recently stated that Halliburton could not document 42% of a $4 billion invoice in March 2007. Among other charges, it stated that Halliburton billed the government for up to three times as many meals as it served.
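
The arithmetic of this layered billing is worth making explicit. Below is a minimal sketch in Python; the $600, $815, $1,500 and 2% figures come from the account above, while the markup assumed for ESS is hypothetical, included only to show how a cost-plus chain compounds:

```python
# A minimal sketch of the layered, cost-plus billing described above.
# The guard's pay, Blackwater's rate, Regency's bill to ESS, and
# Halliburton's 2% profit come from the text; ESS's own markup is not
# given there, so the 10% below is a hypothetical placeholder.

GUARD_PAY = 600            # paid to the individual guard, per day
BLACKWATER_RATE = 815      # Blackwater's rate after its ~36% markup
REGENCY_RATE = 1_500       # what Regency reportedly billed ESS, per man per day
ESS_MARKUP = 0.10          # hypothetical: ESS's added costs and profit
HALLIBURTON_PROFIT = 0.02  # Halliburton's cost-plus profit rate

def cost_plus(costs: float, rate: float) -> float:
    """Cost-plus billing: all costs are passed through, plus a percentage."""
    return costs * (1 + rate)

ess_bill = cost_plus(REGENCY_RATE, ESS_MARKUP)           # ESS -> Halliburton
pentagon_bill = cost_plus(ess_bill, HALLIBURTON_PROFIT)  # Halliburton -> Pentagon

print(f"Guard receives:     ${GUARD_PAY:,.2f} per day")
print(f"Pentagon is billed: ${pentagon_bill:,.2f} per day")
# Because each layer is reimbursed for whatever it spends plus a
# percentage, every added dollar of cost increases the next layer's
# profit; nothing in the chain rewards spending less.
```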

Halliburton has failed to respond to repeated requests for detailed information regarding its costs and profits. Employees and former employees are discouraged from becoming whistleblowers: Blackwater, like other American contractors in Iraq, makes individual contractors sign confidentiality agreements that compel them to pay Blackwater $250,000 in damages if they violate their contracts by publicly discussing the details of their agreements or work.

What, then, is the real cost of a security guard? A sergeant (the former rank of many private security guards) would receive around $38,000 a year in base pay and housing and subsistence allowances, plus additional costs for health and retirement benefits. When a private security guard is killed, even though he may be an American citizen, the U.S. government is not responsible for his burial, death benefits, or payments to his survivors. The government saves money there, but it is doubtful that, in the long run, outsourcing constitutes a saving for the United States.

The advantage of outsourcing personnel is entirely political. The United States can pretend that it is conducting the war with fewer soldiers, without needing to call up more regular troops, National Guard members and reservists. Of course, General Shinseki and other military leaders were correct, before and after we invaded Iraq, when they insisted that the U.S. needed at least twice as many uniformed soldiers as we had sent to Iraq.

We have euphemisms to describe them, but there can be no mistake: the individuals who take these high-risk, high-paying jobs are mercenaries, and their employers, who are not held accountable for their greed, crimes and cover-ups, are war profiteers. Any veneer of idealism or unselfish motive has been stripped away. We should answer a single question regarding how America conducts war: Are we willing to continue to outsource both the supplying of necessary resources and the actual waging of war by armed persons not wearing military uniforms, or should we create a military force fully capable of defending itself? The name of the alternative to what we are now doing strikes panic in the hearts of those who want to continue to prosecute this war, those who want to start a new war against Iran, those who want America to be prepared for a future war, and most voters contemplating the next election. That name is: conscription.

Saturday, September 29, 2007

WHAT ARE AMERICAN SOLDIERS DOING IN IRAQ?

Seven American soldiers serving in Iraq wrote the following op-ed piece that appeared in The New York Times on August 19, 2007. I recently reread it and urge everyone to do so. Only one salient fact has changed since they wrote it.

The War As We Saw It by Buddhika Jayamaha, Wesley D. Smith, Jeremy Roebuck, Omar Mora, Edward Sandmeier, Yance T. Gray and Jeremy A. Murphy

Viewed from Iraq at the tail end of a 15-month deployment, the political debate in Washington is indeed surreal. Counterinsurgency is, by definition, a competition between insurgents and counterinsurgents for the control and support of a population. To believe that Americans, with an occupying force that long ago outlived its reluctant welcome, can win over a recalcitrant local population and win this counterinsurgency is far-fetched. As responsible infantrymen and noncommissioned officers with the 82nd Airborne Division soon heading back home, we are skeptical of recent press coverage portraying the conflict as increasingly manageable and feel it has neglected the mounting civil, political and social unrest we see every day. (Obviously, these are our personal views and should not be seen as official within our chain of command.)

The claim that we are increasingly in control of the battlefields in Iraq is an assessment arrived at through a flawed, American-centered framework. Yes, we are militarily superior, but our successes are offset by failures elsewhere. What soldiers call the "battle space" remains the same, with changes only at the margins. It is crowded with actors who do not fit neatly into boxes: Sunni extremists, Al Qaeda terrorists, Shiite militiamen, criminals and armed tribes. This situation is made more complex by the questionable loyalties and Janus-faced role of the Iraqi police and Iraqi Army, which have been trained and armed at United States taxpayers' expense.

A few nights ago, for example, we witnessed the death of one American soldier and the critical wounding of two others when a lethal armor-piercing explosive was detonated between an Iraqi Army checkpoint and a police one. Local Iraqis readily testified to American investigators that Iraqi police and Army officers escorted the triggermen and helped plant the bomb. These civilians highlighted their own predicament: had they informed the Americans of the bomb before the incident, the Iraqi Army, the police or the local Shiite militia would have killed their families.

As many grunts will tell you, this is a near-routine event. Reports that a majority of Iraqi Army commanders are now reliable partners can be considered only misleading rhetoric. The truth is that battalion commanders, even if well meaning, have little to no influence over the thousands of obstinate men under them, in an incoherent chain of command, who are really loyal only to their militias.

Similarly, Sunnis, who have been underrepresented in the new Iraqi armed forces, now find themselves forming militias, sometimes with our tacit support. Sunnis recognize that the best guarantee they may have against Shiite militias and the Shiite-dominated government is to form their own armed bands. We arm them to aid in our fight against Al Qaeda.

However, while creating proxies is essential in winning a counterinsurgency, it requires that the proxies are loyal to the center that we claim to support. Armed Sunni tribes have indeed become effective surrogates, but the enduring question is where their loyalties would lie in our absence. The Iraqi government finds itself working at cross purposes with us on this issue because it is justifiably fearful that Sunni militias will turn on it should the Americans leave.

In short, we operate in a bewildering context of determined enemies and questionable allies, one where the balance of forces on the ground remains entirely unclear. (In the course of writing this article, this fact became all too clear: one of us, Staff Sergeant Murphy, an Army Ranger and reconnaissance team leader, was shot in the head during a "time-sensitive target acquisition mission" on Aug. 12; he is expected to survive and is being flown to a military hospital in the United States.) While we have the will and the resources to fight in this context, we are effectively hamstrung because realities on the ground require measures we will always refuse - namely, the widespread use of lethal and brutal force.

Given the situation, it is important not to assess security from an American-centered perspective. The ability of, say, American observers to safely walk down the streets of formerly violent towns is not a resounding indicator of security. What matters is the experience of the local citizenry and the future of our counterinsurgency. When we take this view, we see that a vast majority of Iraqis feel increasingly insecure and view us as an occupation force that has failed to produce normalcy after four years and is increasingly unlikely to do so as we continue to arm each warring side.

Coupling our military strategy to an insistence that the Iraqis meet political benchmarks for reconciliation is also unhelpful. The morass in the government has fueled impatience and confusion while providing no semblance of security to average Iraqis. Leaders are far from arriving at a lasting political settlement. This should not be surprising, since a lasting political solution will not be possible while the military situation remains in constant flux.

The Iraqi government is run by the main coalition partners of the Shiite-dominated United Iraqi Alliance, with Kurds as minority members. The Shiite clerical establishment formed the alliance to make sure its people did not succumb to the same mistake as in 1920: rebelling against the occupying Western force (then the British) and losing what they believed was their inherent right to rule Iraq as the majority. The qualified and reluctant welcome we received from the Shiites since the invasion has to be seen in that historical context. They saw in us something useful for the moment.

Now that moment is passing, as the Shiites have achieved what they believe is rightfully theirs. Their next task is to figure out how best to consolidate the gains, because reconciliation without consolidation risks losing it all. Washington's insistence that the Iraqis correct the three gravest mistakes we made - de-Baathification, the dismantling of the Iraqi Army and the creation of a loose federalist system of government - places us at cross purposes with the government we have committed to support.

Political reconciliation in Iraq will occur, but not at our insistence or in ways that meet our benchmarks. It will happen on Iraqi terms when the reality on the battlefield is congruent with that in the political sphere. There will be no magnanimous solutions that please every party the way we expect, and there will be winners and losers. The choice we have left is to decide which side we will take. Trying to please every party in the conflict - as we do now - will only ensure we are hated by all in the long run.

At the same time, the most important front in the counterinsurgency, improving basic social and economic conditions, is the one on which we have failed most miserably. Two million Iraqis are in refugee camps in bordering countries. Close to two million more are internally displaced and now fill many urban slums. Cities lack regular electricity, telephone services and sanitation. "Lucky" Iraqis live in gated communities barricaded with concrete blast walls that provide them with a sense of communal claustrophobia rather than any sense of security we would consider normal.

In a lawless environment where men with guns rule the streets, engaging in the banalities of life has become a death-defying act. Four years into our occupation, we have failed on every promise, while we have substituted Baath Party tyranny with a tyranny of Islamist, militia and criminal violence. When the primary preoccupation of average Iraqis is when and how they are likely to be killed, we can hardly feel smug as we hand out care packages. As an Iraqi man told us a few days ago with deep resignation, "We need security, not free food."

In the end, we need to recognize that our presence may have released Iraqis from the grip of a tyrant, but that it has also robbed them of their self-respect. They will soon realize that the best way to regain dignity is to call us what we are - an army of occupation - and force our withdrawal.

Until that happens, it would be prudent for us to increasingly let Iraqis take center stage in all matters, to come up with a nuanced policy in which we assist them from the margins but let them resolve their differences as they see fit. This suggestion is not meant to be defeatist, but rather to highlight our pursuit of incompatible policies to absurd ends without recognizing the incongruities.

We need not talk about our morale. As committed soldiers, we will see this mission through.

They will not all “see this mission through.” Even before the article was published, Staff Sergeant Jeremy A. Murphy was shot in the head on August 12 and suffered severe brain trauma. He is expected to survive. On September 10, Sergeant Omar Mora, Staff Sergeant Yance T. Gray and five other Americans were killed when the five-ton truck in which they were riding overturned. What are American soldiers doing in Iraq? Dying.

Saturday, September 22, 2007

WILL HEALTH CARE BE THE PRIMARY DOMESTIC ISSUE IN THE 2008 ELECTION?


Americans are aware of the paradox that, although the United States has the most sophisticated and innovative medical establishment in the world, more than one-quarter of all Americans derive very little benefit from it. At least forty-five million people in the United States have no health insurance and another forty million cannot count on enough coverage to provide for appropriate treatment in the event of a major disease or long-term care. President George W. Bush, with his curious form of compassionate conservatism, has pointed out that the emergency wards of hospitals are open to all. He has not pointed out that many emergency wards have closed because hospitals cannot afford them, that the cost of emergency treatment is far greater than timely and regular medical care, and that emergency treatment is often too late to be of value.

The present congressional consideration of health care for children has reminded the president that he is not only a compassionate conservative but also a fiscal conservative, despite the fact that, since taking office, he has turned a great surplus into the greatest indebtedness in American history, several times setting records for the biggest single-year dollar increase in the debt. Undeterred by former Federal Reserve Board Chairman Alan Greenspan’s harsh criticism that he has betrayed conservatism, Bush now plans to stem the tide of borrowing, which has averaged more than $500 billion a year, by threatening to veto a children’s health insurance bill that would provide coverage for an additional four million children (beyond the 6.6 million already covered) at an additional cost of only $35 billion over five years. If Congress passes the bill and Bush vetoes it, health care for children will provide the centerpiece of what the Democrats will seek to make the defining domestic issue in the 2008 election.

On September 20, five leading Democrats – Joseph Biden, Hillary Clinton, Christopher Dodd, John Edwards, and Bill Richardson (Barack Obama was invited, but declined) – participated, with Judy Woodruff as moderator, in a wide-ranging ninety-minute discussion of health care. It was less confrontational than other public discussions in which the Democratic candidates have participated, but it was more informative because it was thoughtful and reasonably detailed. While there were a few not-so-subtle thrusts directed by one candidate at another, it was clear that all five believe in national health coverage, although they differ as to how it can be achieved.

Apparently, some doctors share that conviction. In 2004, in a physician-sponsored random survey of Massachusetts doctors, 63.5 percent of the 904 responding doctors believed a single-payer plan would provide the best care for most people, 25.8 percent chose a fee-for-service system, and 10.7 percent selected managed care. Although many respondents doubted that most of their physician colleagues would support a single-payer system, most agreed that government has a responsibility to ensure the provision of medical care, that it would be worth giving up some income to reduce paperwork, that insurance firms should not play a major role in health care delivery, and that they would prefer to work under a salary system. This survey may not reflect national preferences among doctors, but each of these changes could play an important role in a national health insurance system.

Earlier this year, John Edwards was the first Democrat to propose a detailed health care plan. One idea appears to have caught on with other Democratic aspirants. Edwards and his leading rivals agree that the American people should have the same health plan as members of Congress, which includes unlimited doctor visits, no deductibles and no co-payments, at a cost of $35 a month. Edwards has now gone further, promising legislation that would end the health insurance of the president and Congress in six months if they fail to adopt a comparable program for all Americans. Edwards would also minimize, if not eliminate, the role (and the profits) of insurance companies.

Barack Obama has also offered a detailed plan which would insure all children and require employers to share the costs of insuring workers, but would not mandate insurance for everyone. Hillary Clinton’s new plan would require it. Clinton would offer the equivalent of the congressional plan as one option, but she opposes the creation of a federal agency to achieve this objective. None of her rivals has taken Clinton to task for her commission’s 1994 recommendation to President Bill Clinton of a complex proposal that no one seemed to understand, except for its clear rejection of the single payer plan that had been favored in 1992-93 by many Democratic leaders in Congress.

The various plans sometimes differ or are unclear as to the extent to which individuals could buy coverage, employers would contribute, or the national government would underwrite the costs. When the Democratic candidates speculate as to the cost (often estimated at $100 billion or more), they tend to agree that a large part of this could be paid by repealing the income tax cuts on incomes above $200,000 enacted in the Bush presidency. Edwards would go further and increase the tax on investment income to the rate on earned income. Several candidates propose efficiencies in the health and tax systems that would serve to cut the cost of health care.

All of this stands in sharp contrast with the positions of the leading Republicans. Rudolph Giuliani predicts that the various Democratic plans would increase taxes and decrease the amount and quality of patient care (for example, increasing the waiting time to obtain an appointment). Giuliani advocates a $7,500 tax deduction per taxpayer to defray insurance costs and tax credits for poor workers to supplement Medicaid and employer contributions. John McCain would go further than some of his Republican rivals, favoring prescription drug coverage for the elderly and expanded children’s coverage. Mitt Romney would offer incentives for the states to expand affordable coverage, condemning the Democratic proposals as “European-style socialized medicine.” The other Republican hopefuls (Fred Thompson, Sam Brownback, Mike Huckabee, Ron Paul, and Duncan Hunter) all voice variations of “market-based solutions” and “market-driven expansion” of affordable coverage to express their opposition to any government-run program, let alone guaranteed universal coverage.

Numerous statistics demonstrate that far less prosperous countries have far better health records. The United States ranks seventeenth in the percentage of one-year-old children who are fully immunized against polio; China and Brazil both rank ahead of the U.S. in this regard. Many countries, including Jordan and Egypt, have lower rates of low birth-weight babies than the U.S. Shockingly, but not surprisingly, these figures reflect the fact that, among industrialized nations, the United States has the highest percentage of children living below the poverty line.

The World Health Organization (WHO) of the United Nations has often pointed out that the United States ranks behind other industrial nations in what the WHO calls “healthy life expectancy.” All of these and other health statistics show a wide disparity along racial and ethnic lines, with blacks and Hispanics overrepresented among the least privileged populations. Dr. Ashish Jha of the Harvard School of Public Health has testified that surgeons are far less likely to offer bypass surgery to black men than to white men who have had a heart attack similar to that suffered by President Bill Clinton.

“Socialized medicine” is a phrase that critics will employ and advocates will shun. Rational examination discloses that, by other names, the socialization of public benefits has a place -- an honored place -- in American public policy. Free public education is the oldest example. More than seventy years ago, critics railed against adoption of the Social Security Act, which provides an assured old-age income to almost all older Americans. Medicare is Social Security’s logical extension, in that it extends health care beyond the recipient’s ability to pay. Scare tactics, such as condemnation of reform as “socialism” without a rational analysis of what particular reforms do or fail to do, will not dispose of the challenge, not if the American people believe, as evidence increasingly indicates they do, that the government has a role to play in the protection of health and the provision of essential health services.

The candidates are obliged to answer these questions. Where does universal health coverage exist and how well does it work? Does the quality of health care and its cost decline or improve under a government-mandated health care program? Is it compatible with private medical insurance? Does it require the regulation of pharmaceutical companies and drug prices? What is most remarkable is that this debate in the United States will take place, if it takes place, decades after it was resolved in other industrial countries. And that also raises the question as to why the richest country in the world, with its unparalleled medical research and resources, has come so late to this issue.

If you wish to subscribe to Thinking Out Loud, e-mail thinkingoutloud@stanleyfeingold.com, and write “subscribe.” The price of your subscription is your occasionally thinking out loud by responding. Please send your comments, criticisms and corrections to the same e-mail address. I intend to post a weekly essay of under 1500 words. To unsubscribe, write “cancel.” For earlier essays, go to www.stanleyfeingold.com. Stanley Feingold

Wednesday, September 19, 2007

POSTSCRIPT TO SEPTEMBER 15:
DOES THE PRESIDENT OR THE GENERAL DECIDE WHETHER AND HOW TO WAGE WAR?

MoveOn's sharp criticism of General Petraeus (calling him General Betray Us) is a stupid tactic and a completely false assessment of moral and military responsibility. This clumsy personal attack permits defenders of the Iraq war to deflect criticism of its conception and conduct and to focus instead on a misguided attempt at character assassination. General Petraeus is neither the savior President Bush has represented him to be nor the subversive officer MoveOn has portrayed. He is simply the general President Bush has settled on to be the spokesman for his failed policies. The war is Bush's war. The false rationale for invading Iraq was accepted by President Bush. The grievous errors committed in the conduct of the war were the responsibility of the Commander-in-Chief, President Bush, who had the power to reverse them. He did not. The loss of American prestige and good will throughout the world is the consequence of policies pursued by President Bush. The absence of an exit strategy results from President Bush's inability to assess alternatives or face reality.

Similarly, any charge that lays responsibility for America's greatest foreign policy failure upon Vice President Cheney or former Defense Secretary Rumsfeld or the proselytizers for American hegemony ignores the fundamental fact that none of the grievous errors could have taken place without the action of President Bush. It is even more inexcusable to blame the general who was eager or willing to carry out the president's orders. What President Truman reminded himself of in the Oval Office has not changed. The buck stops there.

Saturday, September 15, 2007

DOES THE PRESIDENT OR THE GENERAL DECIDE HOW OR WHETHER TO PROSECUTE WAR?

It is difficult to imagine a precedent for the buildup to and presentation of what President Bush called “the Petraeus report” (there has been no formal report, simply the general’s testimony) before House and Senate committees on the status of America’s war in Iraq. General Petraeus has commanded American forces in Iraq during the half-year of the so-called “surge,” in which 30,000 additional troops were added to the 130,000 already engaged there. President Bush’s prime-time address after the general’s two days of testimony invoked the general so often that one might have concluded he was referring to the delivery of a new sacred text.

In fact, there was nothing new or unanticipated in the general’s testimony. That could not come as a surprise. A commanding general on active service does not rebut or qualify his president’s optimistic prognosis. If he did, he would be removed, as President Lincoln removed General George McClellan as commander of the Army of the Potomac because of his failure to engage the Confederate Army and win the Civil War.

Similarly, when a general takes actions that contradict the president’s policy, he will be removed. President Harry S Truman relieved General Douglas MacArthur of his command for insubordination when MacArthur issued an unauthorized statement threatening to expand the Korean War into China if it resisted, while the president was preparing to engage North Korea and China in peace negotiations. MacArthur’s independence led to the loss of many American lives. (These events are retold in David Halberstam’s posthumous The Coldest Winter: America and the Korean War, excerpted in “MacArthur’s Grand Illusion,” in the October 2007 issue of Vanity Fair.) General Omar Bradley expressed the prevalent military as well as political sentiment when he said that General MacArthur’s action “would have involved us in the wrong war in the wrong place at the wrong time against the wrong enemy.”

Disagreement is not the only inappropriate behavior for a general. It is also undesirable for a general to allow himself to become (albeit at the president’s instigation) a spokesperson for what is, after all, the president’s partisan politics. Though no longer a general, then-Secretary of State Colin Powell remained for the American public the general above politics and the Bush Administration’s most credible spokesperson when he was prevailed upon to address the United Nations to justify what would subsequently be America’s unilateral invasion of Iraq. Virtually none of the asserted “facts” was true in Powell’s presentation of the doubtful and untruthful “evidence” supporting Iraq’s close alliance with Al Qaeda, possession of weapons of mass destruction, and intention to employ those weapons against the United States. Nevertheless, the authority of General Powell’s endorsement seemed credible to a public that would have been skeptical if these claims had come from another spokesperson.

That was the position General Petraeus put himself in when, just weeks before the 2004 presidential election, The Washington Post published an op-ed piece by him. Ever the optimist, General Petraeus saw “tangible progress” in Iraqi security forces, enabling “Iraqis to shoulder more of the load for their own security.” Petraeus detailed military victories and the increased capacity of the police forces. Regrettably, General Petraeus has not enjoyed the military success that the president and he have both implied he has had. Sectarian warfare has escalated in areas under his command. His efforts at reaching political agreements have failed, as have the efforts of others. The loss of billions of dollars of Iraqi weapons has led to a major criminal investigation of Army mismanagement. Despite the general’s praise, recent reports recommend that the police be disbanded because of their dismal failure to improve security.

Three years later, in his long-anticipated evaluation of the “surge,” the general once again said that some progress had been made on the ground, adding that there were fewer fatalities in some areas in recent months [a count that excluded Sunnis killing Sunnis, Shias killing Shias, and assassinations in which the victim was shot in the front of the head rather than the rear], and that some tribal groups that once used their weapons to kill Americans had entered into agreements with the Americans to use their new American-supplied weapons to kill insurgents. In passing, the president and the general acknowledged that no progress had been made toward creating a unified government in the devastated country. Most revealing of the limits of military judgment was the answer General Petraeus couldn’t give when asked whether America is safer. He confessed that he had not entertained that question.

It is appropriate that a general should echo the military judgment of the president, if he agrees with it. What is deliberately deceitful is the president’s pretense that he will be guided by the conclusions of his generals, as Bush stated one month before the general came home: “Troop levels will be decided by our commanders on the ground, not by political figures in Washington, D.C.” Yet he did not wait for General Petraeus’s testimony; he flew, in a secret visit, to a desert air base 120 miles from Baghdad to declare that the surge is working. Of course, he already knew the general’s conclusion, because when Bush disagrees with a general, the general is removed or retired.

What did generals think of the invasion? General Eric Shinseki, then Army Chief of Staff, asked by a Senate committee to estimate the number of ground troops necessary to support the invasion of Iraq, replied “several hundred thousand.” Defense Secretary Rumsfeld and Deputy Defense Secretary Wolfowitz immediately declared that estimate “wildly off the mark.” Shinseki soon retired. General John Abizaid, commander of United States Central Command, has since said that Shinseki’s estimate was correct. General Bernard Trainor has described a willfully self-deluding planning process. General William Odom, former director of the National Security Agency, has said that the American invasion of Iraq might be the worst strategic mistake in American history.

What did generals think of America’s conduct in Iraq? General Antonio Taguba, charged with reporting on the documented horrors and humiliations suffered by prisoners at Abu Ghraib (which provided the terrorists with their most persuasive recruitment tool), concluded that the crimes deserved severe punishment. Instead, the Department of Defense punished only the lowest-ranking soldiers, and General Taguba was exiled to a Pentagon desk job and early retirement. CENTCOM General Anthony Zinni, later Bush’s special envoy to the Middle East, has stated: “In the lead up to the Iraq war and its later conduct, I saw at a minimum, true dereliction, negligence and irresponsibility; at worst, lying, incompetence and corruption.” Our mistakes, Zinni argues, include denying priority to the war on Al Qaeda in Afghanistan, disbanding the Iraqi army, and de-Baathifying the police. The result of our ill-advised unilateral aggressive intervention, General Zinni concluded, is that “we are now being viewed as the modern crusaders, as the modern colonial power in this part of the world.”

Did the generals think that the “surge” was desirable? When General Abizaid was pressed this past November by Senator John McCain on the need for an increased U.S. military presence, he replied: “Senator McCain, I met with every divisional commander, General [George] Casey, the corps commander, General [Martin] Dempsey [head of the Multi-National Security Transition Command in Iraq], we all talked together. And I said, in your professional opinion, if we were to bring in more American troops now, does it add considerably to our ability to achieve success in Iraq? And they all said no. And the reason is because we want the Iraqis to do more. It is easy for the Iraqis to rely upon us to do this work. I believe that more American forces prevent the Iraqis from doing more, from taking more responsibility for their own future.”

What did generals think of the civilian strategists of the war? General Paul Eaton, who helped revive the Iraqi army, described Rumsfeld as “incompetent strategically, operationally and tactically.” General John Batiste, commander of an infantry division in Iraq, turned down a promotion and a tour in Iraq as the second-ranking military officer, choosing to retire rather than continue to work for Rumsfeld. In 2006, according to a Military Times poll, almost 60 percent of the members of the United States Armed Forces did not believe that the civilians in the Pentagon had their “best interests at heart.”

Each month of the surge so far has cost $10 billion and the lives of one hundred American troops. Senator McCain warns that withdrawal would increase “the potential for genocide, wider war, spiraling oil prices and the perception of strategic American defeat.” Those grim consequences may occur. But it is the responsibility of President Bush, not of his generals, to spell out clearly when and under what circumstances the risks of these dire consequences of American withdrawal would be reduced. Absent such a clear analysis and projection of America’s future prospects in Iraq, the president’s unstated and cynical answer is that the war is his legacy to a future Administration.

If you wish to subscribe to Thinking Out Loud, e-mail thinkingoutloud@stanleyfeingold.com, and write “subscribe.” The price of your subscription is your occasionally thinking out loud by responding. Comments, criticisms and corrections are welcome. I intend to post a weekly essay of under 1500 words. To unsubscribe, write “cancel.” Stanley Feingold

Friday, September 7, 2007

DOES RONALD REAGAN HAVE A CONSERVATIVE HEIR?

Ronald Reagan’s election in 1980 marked the beginning of an extraordinary change in American politics, made evident by sharper ideological differences between the leadership of the major parties than had been seen since before the presidency of Franklin D. Roosevelt. The one-sidedness of the 1932 Democratic triumph ushered in an era of one-party dominance in which the Democrats controlled the House of Representatives for fifty-eight and the Senate for fifty-two of the next sixty-two years, and won five consecutive presidential victories before losing to an unbeatable war hero.

This was achieved initially as a result of the Democratic Party’s remarkable ability to hold together the most disparate interests. Southern white supremacists were there because they had been there since Lincoln “freed the slaves.” Twentieth-century working-class immigrants were there because there was no congenial home for them in a Republican Party whose leaders represented capitalist power and a laissez-faire philosophy epitomized in Calvin Coolidge’s observation that the business of America is business. Black Americans were there because they were ignored by the party that had ended slavery and they responded to the New Deal’s egalitarianism.

Republicans recognized that they had to make a broader appeal and, in choosing presidential nominees, they reached beyond conservative ideologues to nominate former Democrat Wendell Willkie and then New York Governor Thomas E. Dewey (twice) before winning the presidency with General Dwight D. Eisenhower, whose domestic political leanings were largely unknown. Eisenhower was the candidate of the liberal internationalist wing of the party, barely and bitterly winning the nomination in 1952 against Ohio Senator Robert A. Taft, an authentic and beloved conservative widely known as Mr. Republican.

This shift toward more moderate Republican presidential candidates was reflected in efforts to broaden the party base, in order to make it more cross-sectional and multi-factional. Nevertheless, the Democratic Party remained more liberal, despite the power of conservative southerners who chaired most of the major congressional committees, and the Republican Party remained more conservative, despite the presence of eastern internationalists and middle-western LaFollette populists. The Democrats did a better job of keeping their coalition than the Republicans did of creating theirs, as was evident in the civil rights controversies, when both the leading advocates and leading opponents of racial equality were in the Democratic Party.

After Vice President Richard Nixon’s close defeat in 1960, conservatives captured enough Republican Party state organizations to nominate Arizona Senator Barry Goldwater, who, true to his promise to offer “a choice, not an echo,” championed reducing the size of government, repealing the graduated income tax, ending federal aid to education, and making Social Security voluntary. Goldwater had written The Conscience of a Conservative, and he became the embodiment of that conscience.

In retrospect, Goldwater’s overwhelming defeat (equaled only by Democrat George McGovern’s loss eight years later) can be seen as the birth pangs of a new conservative alliance. Goldwater won only his home state of Arizona and the five southern states with the greatest proportion of black citizens, the states in which the issue of race was most important. The Solid South of a century after the Civil War was solid no more. The change begun by the Supreme Court’s 1954 decision in Brown v. Board of Education outlawing segregated public education was accelerated by passage of the Civil Rights Act in 1964 and the Voting Rights Act in 1965. The political landscape was decisively altered by school busing, forced school integration, affirmative action in higher education and hiring, white flight, race riots, and the perception of increased street crime. The enduring political consequence has been that no Democratic presidential candidate since Lyndon Johnson in 1964 has received a majority of white votes.

Despite the existence of laws inspired by religious beliefs (Sunday closing laws, prohibitions on mailing immoral material, the criminalization of birth control information and devices, and the insertion of “under God” into the Pledge of Allegiance), religious political influence abated after the 1925 Scopes “monkey trial,” in which a young science teacher was convicted for teaching evolutionary theory. It was a Pyrrhic victory for religious orthodoxy because the public reaction was a political defeat for the public teaching of religious doctrine.
Religious moral conviction reemerged as a political force in 1965 when the United States Supreme Court upheld the right of married couples to obtain contraceptives. In 1973 in Roe v. Wade, the Supreme Court went further in recognizing a woman’s right to an abortion, inspiring a powerful grass-roots movement that has ever since aspired to reverse this decision by the selection of Supreme Court Justices who are likely to vote to overturn Roe or, short of that, limit the permissible period or methods of abortions.

The new mobilization of religious and racial conservatism became allied with the economic conservatism of business protectionism, laissez-faire government, and fiscal restraint to reshape the Republican Party. The coalition needed a candidate who could articulate this new conservatism, and it found him in Ronald Reagan. Reagan was identified with opposition to abortion, obscenity and pornography; respect for the flag and support for school prayer; and certain and severe punishment for violent crimes. Reagan’s presidency accompanied – and perhaps inspired – the revival of religious fundamentalism and the equation of American patriotism with Republican conservatism.

It is only peripherally relevant to this alliance symbolized by Reagan that his behavior was not nearly as conservative as his rhetoric. He frequently invoked God, but was not a churchgoer. He had been a populist before he became a conservative and, as president, he supported raising Social Security taxes rather than cutting benefits. In 1964, he characterized Medicare as socialized medicine, but Medicare spending increased by more than ten percent in each year of his presidency. He promised to decrease the size of government; it increased. He promised to cut the budget; it grew larger. He promised to decrease entitlements; in office, he supported vast increases. He promised to abolish two Cabinet departments; they were retained and another was created. He promised to outlaw abortion; nothing happened. Nevertheless, the symbolic reality was that Reagan made most Americans feel good about themselves and their country, and conservatives believed that they had an ally in the White House.

The conservative coalition could not feel a similar comradeship with Reagan’s vice president and successor, George H. W. Bush, who was defeated after one term, in part because of the bitterness of economic conservatives at his betrayal of his pledge of “no new taxes.” But they had captured the party machinery and, aided by Democratic President Bill Clinton’s political ineptitude and neglect of the party organization, succeeded in winning control of Congress in 1994, holding it for the last six years of the Clinton presidency and the first six of George W. Bush, the truest conservative to occupy the White House in the lifetime of anyone now living.

Even in victory, insecurity was apparent in the now-powerful conservative coalition. Where earlier conventions featured major addresses by staunch conservatives, in 2004 the most prominent speakers included John McCain, Arnold Schwarzenegger and Rudolph Giuliani. It can be dismissed as window-dressing, but it was clearly designed to entice more customers into the store. The reason was that the existence of a new conservative majority had not yet been established.

Al Gore outpolled Bush in 2000, and the party vote that year for the House was almost a dead heat, with little more than one vote in every thousand separating the parties. Four years later, the Republican margin of victory for House candidates was less than three votes in every thousand cast. Democrats received more votes than Republicans in the one hundred Senate races held from 2000 through 2004, and again in those held from 2002 through 2006. Republican control of Congress was due more to gerrymandering in the House and to dominance in the rural, less populous states than to greater popular support.

Is the confidence of the conservative coalition in the rightness (take it either way) of their cause diminishing? Their three leading aspirants for the 2008 presidential nomination are a former Governor of ultra-liberal Massachusetts, a former Mayor of New York City (neither of which any Republican can hope to win), and a distinguished Senator whose complex public record ranges from excoriating President Bush to embracing his most controversial policies. Fred Thompson has finally announced his candidacy as the savior of the conservative cause. Better known as a television and movie actor than as an eight-year Senator, Thompson has conservative credentials that are modest compared with the records of many past and present Governors and members of Congress.

In order to win the nomination, each of these four leading candidates must now vow that he is the most authentic heir to Reagan Republicanism without identifying himself too closely with the current president, and then, to win the election, he must radically moderate his positions to win the support of a much broader electorate. Reagan could preach in 1984 that it was “morning in America.” Is it possible that it is now much later in the day – perhaps too late for his conservatism?


If you wish to subscribe to Thinking Out Loud, e-mail thinkingoutloud@stanleyfeingold.com, and write “subscribe.” The price of your subscription is your occasionally thinking out loud by responding. Comments, criticisms and corrections are welcome. I intend to post a weekly essay of under 1500 words. To unsubscribe, write “cancel.” Stanley Feingold

Saturday, September 1, 2007

IS A VICE PRESIDENT NECESSARY?

The thought must occur to President Bush’s harshest critics that, as unpopular as he is, even they fear the possibility of his departure before the end of his second term. There cannot be many Americans who might consider removing Bush who would be pleased by the prospect of replacing him with Vice President Cheney.

It isn’t a unique situation. President Richard Nixon was deeply embroiled in the Watergate scandal shortly after his reelection in 1972, but those who most condemned his shameful behavior feared that, if he were removed from office, Vice President Spiro Agnew would become the president. (The authors of the Constitution had not clearly indicated that, in such circumstances, the vice president would succeed to the title of president, but John Tyler, the first vice president whose president died in office, just one month after inauguration, had himself sworn in, and every succeeding vice president has done the same.) Fortunately for Nixon’s critics, Agnew resigned nine months into his second term in an agreement that allowed him to escape trial on charges of bribery, extortion, and tax evasion committed during his tenure as governor of Maryland. This cleared the way for the congressional inquiry into Nixon’s unlawful conduct that led to his resignation less than a year later. Imagine if Nixon had left office before Agnew, and this ignorant, bigoted and corrupt man, chosen as Nixon’s running-mate because he had delivered his state’s support to Nixon at a crucial point in the 1968 nomination campaign, had become President of the United States.

In the same fashion, imagine if President George H.W. Bush had departed from the presidency and been replaced by Vice President Dan Quayle, a choice of a running-mate that shocked even Bush’s supporters. Quayle was an amiable, ill-prepared and under-equipped Senator who is best-remembered two decades later for his misspelling of “potato” (he told a student to add an “e”) and a number of verbal gaffes, perhaps most famously his reference to the United Negro College Fund slogan, “A mind is a terrible thing to waste,” as “What a waste it is to lose one’s mind or not having a mind is being wasteful. How true that is.”

Today a very unpopular President Bush has an even more unpopular Vice President Dick Cheney. When Congress was examining President Nixon’s role in the Watergate break-in, a business associate reported that Cheney said (and Cheney has never denied saying it) that Watergate was “a political ploy by the president’s enemies.” His support for unchecked executive power was later on the public record when, as a member of Congress, he opposed congressional investigation of possible abuses of power in the Iran-Contra scandal and commended Colonel Oliver North as “the most effective and impressive witness certainly this committee has heard.”

As Vice President, Cheney has repeatedly stated that Saddam Hussein was involved in 9/11 and that terrorist Abu Musab al Zarqawi established an Al Qaeda operation in Iraq, and has made other claims that have been totally refuted; he persuaded President Bush to sign an order denying foreign terrorism suspects access to any military or civilian court (without informing either Secretary of State Powell or National Security Adviser Rice); he advocated “robust interrogation” of suspects, a code phrase for torture; he refused to tell Congress whom he had met to develop energy policy; he has refused to respond to a subpoena from a congressional committee, and offered the far-fetched claim (abandoned after widespread ridicule) that he was not an “entity within the executive branch.”

Of course, if he were not vice president, Cheney could make all of these unfounded (literally anti-republican and anti-democratic) claims, and President Bush could, as he has, adopt them as his own. However, because he is vice president, if President Bush were removed from office, Cheney would become president. It is beyond argument that neither Agnew nor Quayle nor Cheney would have received serious consideration as a presidential candidate. On the evidence of their political backgrounds, Agnew and Quayle would have been major embarrassments as President of the United States, and Cheney would be an unmitigated disaster. His arrogance, obdurateness, passion for secrecy, and disrespect for the clear mandates of the Constitution would inspire unending constitutional crises.

Once upon a time, the vice presidency was a position that inspired ridicule. Mr. Dooley, Finley Peter Dunne’s famous fictional politician, observed: “Th’ prisidincy is th’ highest office in th’ gift iv th’ people. Th’ vice-presidency is th’ next highest an’ th’ lowest. It isn’t a crime exactly. Ye can’t be sent to jail f’r it, but it’s a kind iv a disgrace. It’s like writin’ anonymous letters.” In a similar humorous and derogatory spirit, the office was lampooned in the Pulitzer Prize-winning musical Of Thee I Sing, in which Vice President Alexander P. Throttlebottom discovers that his sole constitutional power is to preside over the U.S. Senate, where he cannot introduce legislation or speak, but can cast tie-breaking votes, which don’t occur even once in the average vice president’s career.

The vice presidency is no longer a laughing matter. All four nineteenth-century vice presidents who succeeded to the presidency upon the death of the president had been at odds with the presidents under whom they served, and all failed to be nominated in their own right before the next election. By contrast, four of the five twentieth-century vice presidents who succeeded to the presidency were subsequently elected in their own right. The fifth, Gerald Ford, who succeeded on the resignation of President Nixon, failed to be elected, in large part because of the blanket pardon he had given Nixon. Altogether, the five twentieth-century vice presidents who succeeded to the office served (including four elected terms, to which they would not have been elected had they not first been elevated to the presidency) for a little more than twenty-two years and ten months, very nearly a quarter of a century.

In addition, other elections have been critically influenced by an earlier president’s choice of a running-mate. Vice President Nixon lost in 1960 but won in 1968 and 1972. Former Vice President Mondale lost in 1984, Vice President Bush won in 1988, and Vice President Gore was denied his victory in 2000. The presidential election of 2008 will be only the third since 1900 in which neither an incumbent president nor a present or past vice president is a major party candidate.

The argument that an incumbent vice president is better prepared to assume the presidency is often untrue. John Tyler, the first vice president to succeed on the death of an elected president, provides an instructive lesson. One month after his inauguration in 1841, President William Henry Harrison died and Tyler became president, opposed to most of Harrison’s policies and reviled for the next four years by the party that had elected him. Theodore Roosevelt, almost certainly the most highly regarded of the presidents who succeeded on the death of a president, became vice president because the death of President William McKinley’s first vice president gave Republican New York State boss Thomas Platt the opportunity to get rid of Roosevelt as the state’s governor by having him “kicked upstairs” to the vice presidency, where he would never be heard from again. Of course, those who got rid of Roosevelt did not anticipate McKinley’s assassination six months into his second term, when Roosevelt became president and profoundly reshaped the politics of his party and the nation.

Dick Cheney to the contrary notwithstanding, vice presidents have rarely been confidants of the presidents they served. When Harry Truman was sworn in as president immediately after the death of President Franklin D. Roosevelt, Secretary of War Henry Stimson took him into a corner to tell him about the atomic bomb. Before that, no one had thought it important to tell the vice president.

There has to be a better way. Suppose a sudden vacancy occurred in the presidency. That day the members of Congress could either quickly convene or be polled. On the first or second ballot, a new president could be chosen. If a majority of the 535 members of Congress were of the same party as the departed president, they would choose a leader of that party, more often than not one who would have declined selection as vice president in our present system. If a majority of the members of Congress were not of the president’s party, they would opt for a change, very possibly choosing their party’s defeated presidential candidate.

Presidential candidates often make this choice of a running-mate at the very last moment in a national convention, sometimes as a quid pro quo for convention support or as a concession to their opponents in the party. We don’t choose a president in order for him to choose his successor, but that is what so often occurs. There has to be a better way, and that way would involve the elimination of the office of vice president.

Saturday, August 25, 2007

DOES THE PRIMARY SYSTEM CHOOSE THE BEST CANDIDATES?

Who is the best (that is, the strongest) candidate for either major party to nominate? That’s easy. It’s the candidate who will appeal most successfully to the party’s unswerving supporters, to those voters who are often – but not necessarily – inclined to support that party, to independent voters who boast of voting for the person not the party, and to new voters. That candidate is not necessarily the single most popular potential candidate of the party, but one who is, if not a first choice, an acceptable choice for the largest number of prospective voters. To put it briefly – but watch out for the double negative – the best candidate is the party’s least unacceptable candidate.

In 1972, the first year in which the modern primary-caucus system of presidential nomination was decisive, Senator George McGovern won the Democratic nomination because he had won more delegates than any other candidate. In fact, former Vice President Hubert Humphrey, who narrowly lost the 1968 election to Richard Nixon, received slightly more primary votes than McGovern (68,000 more out of a total primary vote of 16 million), but had the support of fewer elected delegates. Each received marginally more than one-quarter of the primary votes.

The rational question the Democratic Party should have asked was which potential candidate would be most likely to maximize the party’s support in the election. It is not a reflection on McGovern’s integrity, intelligence or experience to observe that, in light of his being perceived as a very liberal, lesser-known and uncharismatic Senator, he was not that candidate. As McGovern himself observed after his defeat, the worst any Democratic presidential candidate has ever suffered, “I wanted to run for president in the worst possible way – and I did.” Based on Humphrey’s strong run four years earlier under adverse circumstances (the chaos at the 1968 Democratic convention was evidence of a bitterly divided party), he would likely have been a much more popular candidate.

It might have been worse. Until the attempted assassination of Alabama Governor George Wallace on May 15, which left him paralyzed and ended his candidacy, he had decisively won the southern states of Florida, Tennessee and North Carolina, finishing second to McGovern in Wisconsin and second to Humphrey in Pennsylvania, Indiana and West Virginia. The day after the shooting, Wallace won the Michigan and Maryland primaries. At that point, Wallace was well ahead of his rivals. Despite his incapacity, he continued to poll at least twenty percent of the primary vote in three of the four remaining primaries. It is not difficult to imagine, had Wallace not been shot, that he would have held a significant plurality of both votes cast and delegates elected when the Democrats convened their convention. He would then have been the single most popular candidate, but it is unarguable that he, who had been elected Governor on the slogan “Segregation Now, Segregation Tomorrow, and Segregation Forever,” could not be nominated, unless the Democrats were willing to commit political suicide.

In 1976, Gerald Ford, who had succeeded to the presidency upon the resignation of Richard Nixon, won the Republican nomination with 53 percent of the primary vote, compared with California Governor Ronald Reagan’s 46 percent. Incumbency was decisive in Ford’s winning the nomination, but his unconditional pardon of Nixon was probably decisive in his losing the election. Jimmy Carter, originally a little-known candidate, won 39 percent of the Democratic primary vote in a field without strong opponents and failed to win a primary majority outside of the south and near-south until the last primary day in June. Under the circumstances, Reagan, a less unacceptable candidate than Ford, would have been a likely winner if he had been nominated.

In 1980, liberal Massachusetts Senator Ted Kennedy sought to take the nomination away from President Carter. It soon became apparent that the tragedy at Chappaquiddick was as fatal to his chances as Carter’s reputation as a weak president was fatal to his. It is very likely that Reagan would have defeated any Democrat, but it is almost certain that a number of leading Democrats would have fared better than Carter.

It isn’t only losing candidates who demonstrate the failure to choose the best candidate. In 1991, thanks to the ease with which the United States won the Gulf War, President George H.W. Bush’s popularity reached a record high. He looked unbeatable in 1992. One by one, the leading Democrats declined to compete for their party’s nomination. These included Governors Bruce Babbitt of Arizona and Mario Cuomo of New York, Senators Al Gore, Sam Nunn and Paul Simon, Representative Richard Gephardt, and the Reverend Jesse Jackson. All had national reputations and figured in speculation regarding their party’s nomination. When they bowed out, the field contained five candidates who may fairly be characterized as the B team: a little-known radical populist Senator (Tom Harkin); an anti-charismatic moderate Senator without a power base (Bob Kerrey); a former Senator who had been ill, looked ill and was still ill, although he lied about his medical condition, and who prescribed unpopular glum remedies for what ailed the United States (Paul Tsongas); a former California Governor widely caricatured as Governor Moonbeam (Jerry Brown); and the long-time and long-running Governor of a poor and small state who, alone among the candidates, had spent the time and money to organize a campaign for the long haul (Bill Clinton).

When Clinton was battered by charges of draft-dodging and marital infidelity on the eve of the first primary in New Hampshire, and later performed poorly in the early non-Southern primaries, none of his rivals had either the financial resources or popular support to capture the lead, and it was too late for a stronger candidate to enter the race. It is futile to speculate as to which of the party leaders who had earlier declined to run would have been the strongest candidate, but several would almost certainly have been stronger.

Now, more than a year before the presidential election of 2008, public opinion polls reveal that the front-runners for their party’s nomination are Democrat Hillary Clinton and Republican Rudolph Giuliani. Yet, among the leading candidates of both parties, it is likely that Clinton and Giuliani will confront the most opposition and skepticism among party regulars and others inclined to vote for that party.

Clinton’s unfavorable rating in the electorate is approximately equal to her favorable rating. This is due to her critical role in sidetracking universal single-payer health insurance in 1993-94, principled antipathy to having a husband and wife both serve as President (as much as to having a father and son do so, though that was not a choice the Democrats made), the negative reaction of many likely Democratic voters to Clinton’s personality, and the largely unexpressed reluctance of many voters to elect a woman. A very large number of voters have a similar unexpressed reluctance to support an African-American candidate. The principal difference is that a large proportion of voters who are reluctant to support a woman are inclined to support a Democrat, while a much smaller proportion of voters who are reluctant to support a black candidate are likely to support a Democrat. If Clinton is nominated, loyal Democrats are likely to suppress their doubts and vote for her, but critical independents are less certain to do so.

Republican partisans will exploit the potential weaknesses of their rivals, including Romney’s Mormonism and McCain’s departures from party orthodoxy on campaign reform and immigration. However, Giuliani’s vulnerabilities are likely to prove more critical: his liberal positions on abortion and gay marriage, his three marriages (he informed his second wife of his intent to divorce her at a press conference), his alienation from his children, and increasing criticism of his public conduct after 9/11, the very event that made him a major national political figure. At least until he becomes an announced candidate, Fred Thompson may be the least unacceptable Republican. Of course, if Clinton and Giuliani are both nominated, they won’t both lose, any more than both Nixon and McGovern could lose in 1972. But many voters will confront an unhappy choice.

The flaw revealed in this, the tenth election in which the presidential candidates will have been chosen by the primary-caucus system, is that the unrepresentative voters who participate in the process choose the one candidate they most favor (and who may win the nomination with only a small plurality of the primary vote), and not the candidate who has the widest support within the party, let alone in the general electorate. It isn’t the only flaw of the primary system, but it is significant enough to undermine any pretense that the process reflects the public’s will.

Friday, August 17, 2007

ARE SUSPECTED TERRORISTS ENTITLED TO "DUE PROCESS OF LAW"?

On August 16, Jose Padilla was found guilty, along with two co-defendants, of conspiracy to “murder, kidnap and maim” people in a foreign country. All three could be sentenced to prison for life. The case of Jose Padilla was brought to public attention by a number of events beginning more than five years ago.

Padilla is a native-born American citizen who was arrested in May 2002, taken a month later to the Navy brig in South Carolina, kept without human contact, lights, a clock or a mirror, and interrogated without an attorney for another twenty-one months before he was permitted to speak to counsel; he was then held in the brig for another twenty-two months before being transferred to a civilian prison in Miami, where he made his first court appearance on January 12, 2006. The extraordinary length of time between his arrest and court appearance is a gross violation of the fundamental right of habeas corpus (literally, “to have the body”), that is, the right to be brought before a judge or court so as to prevent the state from keeping an individual in unlawful restraint.

When Padilla was apprehended at O’Hare International Airport in Chicago at the end of a flight that began in Pakistan, he was carrying a small amount of money, a cell phone and e-mail addresses of Al-Qaeda operatives. President Bush had him designated an “enemy combatant,” and Attorney General John Ashcroft disclosed that he was suspected of planning to detonate a radioactive “dirty bomb” in an American city. More than a year and a half after he was detained, the Second Circuit Court of Appeals in New York ordered his release from military custody and permitted the government, if it chose, to try him in a civilian court. That ruling was suspended when the Bush Administration appealed to the U.S. Supreme Court.

A half-year later (more than two years after his arrest), the Justice Department released details about alleged admissions Padilla had made during interrogations about his involvement with top Al-Qaeda leaders, including the “dirty bomb” plan and another plot to fill apartments in high-rise buildings with natural gas and detonate them using timers. Nearly another eighteen months later, Padilla was added to an existing indictment in Miami claiming that he was part of a North American terror support cell that conspired to “murder, kidnap and maim” people overseas. No mention was made of the “dirty bomb” plot or any other earlier allegations. Fourth U.S. Circuit Court of Appeals Judge J. Michael Luttig criticized the Administration for using one set of facts to justify holding Padilla without charges and another set to persuade a Florida grand jury to indict him. The Supreme Court later overruled the Fourth Circuit and allowed the military to transfer Padilla to face the new criminal charges.

After a three-month trial and one day of jury deliberations, Padilla, along with his co-defendants, was found guilty of the charges brought against them. During the trial, Padilla’s lawyers unsuccessfully sought to have him declared incompetent to stand trial because of the consequences of the torture he had suffered in the military brig. All evidence concerning his military confinement was barred from the trial, as was any reference to the “dirty bomb” accusations. The government said that it had obtained that information by questioning other terrorism suspects abroad, and federal rules of evidence prohibit or limit the use of information obtained during such interrogations.

Padilla’s co-defendants were two men of Middle Eastern descent, one of whom he had met before. The three were charged with belonging to a terrorism support cell that provided money, recruits and supplies to Islamic extremists. The government had recorded voluminous messages in which, it charged, his co-defendants used code words to assist in supporting violent jihad. Padilla did not participate in any of these messages. The government also played wiretapped calls in which the two co-defendants discussed a television interview with Osama bin Laden. There was no evidence that Padilla had seen or discussed the interview. Trying Padilla along with the other two men inextricably linked him with them, but the only evidence linking Padilla to Al Qaeda was his name and six fingerprints on an application to attend an Al Qaeda training camp in Afghanistan in 2000.

Nothing in this summary account of the incarceration, interrogation or trial of Jose Padilla is offered in his defense. On the record, Padilla was a dangerous man. He had been a member of a street gang, was implicated in a murder when he was 13 and confined as a juvenile offender, and was later arrested in Florida in a road-rage shooting incident, for which he spent a year in a Florida jail. It is plausible, if not conclusively proven, that his closeness to Al Qaeda signified a willingness to engage in acts of murder, kidnapping and maiming. It is possible, although no evidence to this effect has been introduced into any court of law, that Padilla participated in a plot to set off a “dirty bomb.” It is possible that he was capable of the most horrendous terrorist acts against innocent people.

If all this were true, the question would remain: Has justice been done? Can a suspected criminal receive justice if he is without human contact or light or basic information in a military prison? Can a suspected criminal receive justice if he has no access to legal counsel for two years? Can a suspected criminal receive justice if his allegations of abusive treatment are barred from his trial because the results of illegal interrogations conducted in prison may not be introduced into evidence? Can a suspected criminal receive justice if he is incarcerated for five years on charges regarding which no evidence has been introduced and that are totally discarded when he is brought to trial?

The Fifth Amendment to the U.S. Constitution states that no one (citizen or non-citizen) shall “be deprived of life, liberty, or property, without due process of law.” The Sixth Amendment states: “In all criminal prosecutions, the accused shall enjoy the right to a speedy and public trial…and be informed of the nature and cause of the accusation;…and to have the assistance of counsel for his defense.” There are no exceptions to these rights.

The rule of law does not apply less to the worst of men than it does to the best. Whatever the extent of Padilla’s guilt, justice has not been done.