Tuesday, February 26, 2008

DOES EXPERIENCE MATTER IN ELECTING A PRESIDENT?

If Americans really want a chief executive whose experience and leadership have been clearly demonstrated, we should abandon the constitutional separation of powers and adopt parliamentary government. Parliamentary party leaders who become prime ministers generally have exercised party leadership for a number of years before acquiring the power to govern.

The American system is very unlikely to choose leaders with comparable experience. In fact, national experience has proven to be more of a liability than an asset. Governors have been elected far more frequently than congressional leaders. The last twelve presidential elections are instructive: In three, a present or former governor defeated an incumbent president. In four, the losing candidate was a Senator; in three more, the loser was a former Senator then serving as Vice President; and in still another, a former Senator who had previously served as Vice President. In only one of the past twelve elections has the losing candidate not served in national office.

In the most remote of these elections, 1960, Senator John Kennedy was elected over Vice President and former Senator Richard Nixon, and it was only the second time in American history that a sitting Senator was elected President. It is an understatement to say that neither was a party leader. Kennedy, seriously but secretly ailing, had no real Senate record. The first Senator ever directly elected as president was Warren Harding in 1920, a nonentity utterly without power or influence in Congress. Ironically, when a desperate Republican Party nominated Senate majority leader Robert Dole in 1996, he resigned from the Senate in order to run, and went on to suffer a staggering defeat.

When the Republicans had the opportunity in 1952 to choose a national leader so identified with party ideals that Senator Robert Taft was affectionately dubbed “Mr. Republican,” they chose instead to nominate General Dwight Eisenhower, whose party affiliation was unknown to the public until shortly before his selection. The Democrats passed over their incumbent vice president, Alben Barkley, Senate leader Richard Russell, and well-known Senator Estes Kefauver to choose little-known Governor Adlai Stevenson. When the Republicans did choose nationally known Senator Barry Goldwater in 1964, he suffered the worst defeat of any presidential candidate in the party’s history.

The most successful candidates in the recent past have been unemployed politicians (Nixon, Carter, Reagan), a vice president best known at the time for being “out of the loop” on the Iran-Contra crisis (Bush I), the governor of a very poor state (Clinton) and the governor of a state which has the constitutionally least powerful governor (Bush II). There were national leaders in all of those elections who would have come to the presidency with impressive experience. They were not chosen.

Clearly, neither much-experienced maverick Senator McCain, less-experienced Senator Clinton, nor little-experienced Senator Obama is a Senate leader. A half-dozen or more members of the Senate in both parties have earned leadership roles, but none will be their party’s nominee. (Connecticut Senator Chris Dodd, who sought to be the Democratic candidate, is an example.)

What does experience tell us about a future president? The truth is that experience doesn’t tell us much about how the next president will exercise legislative leadership, share or usurp power, perceive and promote America’s place in the world, and respond to unanticipated events. In 1933, Franklin Delano Roosevelt took office without any clear idea of how he would cope with the deepening economic depression. What he did was begin a “hundred days” of frantic activity to provide short-term relief for the jobless and minimally regulate business. (In fact, the long-term effects of the National Industrial Recovery Act and the Agricultural Adjustment Act were to concentrate industry and agriculture, not to make them more competitive.) For years (until the beginning of World War II in Europe), FDR’s policies did not reverse the Depression, but they did revive hope. They came to represent the beginning of the very modest American not-quite-welfare state. Most recently, Texas Governor George W. Bush’s bipartisan cooperation with Democratic state legislators was an experience later contradicted by his presidency.

In 2008, John McCain’s candidacy of staying the course in Iraq is, whether he wishes it or not, a reaffirmation of the Bush presidency: Respice. Hillary Clinton’s candidacy of thirty-five years of experience is, whether she wishes it or not, a revival of the New Democratic liberalism of her husband’s presidency: Adspice. Barack Obama’s candidacy of articulating rarely expressed hope is – and he wishes it to be – a platform of promise: Prospice. As of this date, this may be an election, like FDR’s in 1932 and Kennedy’s in 1960, in which hope wins. It succeeded politically and psychologically in Roosevelt’s presidency and was aborted in Kennedy’s (although it survived in the JFK legend). As has often been true in the past, promises may not be fulfilled, but it now appears likely that most voters will find the alternatives much more unpromising.

Sunday, February 24, 2008

HAS THE PRIMARY PROCESS PROVED ITS MERIT IN 2008?

Over the last nine presidential elections, the primary-caucus method of nominating candidates has often failed to choose each party’s strongest candidate. The major parties knew the system was flawed, which has led to different methods of nomination and allocation of delegates between the two parties and among the states. Frustrated by the ability of factions unallied with the party organization to win the nominations of George McGovern in 1972 and Jimmy Carter in 1976, the Democrats in 1984 created superdelegates, who are members of Congress and other party leaders not bound to any candidate but more likely to support the establishment choice against insurgents. What would happen if the superdelegates influenced the choice of a nominee other than the plurality winner in the primaries? Until now, the superdelegates have been superfluous delegates, because they have not influenced the party’s nomination.

The rationale for the primary system is that it increases the public’s role in the process. Nevertheless, close to one-half of all voting-age citizens have failed to vote in recent presidential elections. For most Americans, the presidential primary is one more election (and a difficult one to understand) in a nation which has more elections and fewer voters in relation to population than other industrial democracies.

The percentage of adult citizens voting in recent national elections has been well over 80 percent in West Germany, New Zealand, and France, over 75 percent in Canada, Great Britain, and Japan, and just barely 50 percent in the U.S. Other representative governments do not conduct elections to choose candidates. Critics argue that, to increase turnout, we should deal with the causes of non-voting, not increase the number of elections in which a large proportion of the electorate will not participate. In the nine presidential elections since reform, the highest turnout was 55.3 percent of potential voters in 1992, the lowest 49 percent in 1996. By contrast, turnout was never below 60 percent in the five previous elections.

For the party in power, the primary process has until now been little more than a reaffirmation of the power of the governing party’s hierarchy. In the nine elections since the primary reforms, every presidential candidate of the party in power has been either the incumbent or his vice president. For the party out of power, the nomination was often won by a candidate with a minority of primary votes. McGovern received barely 25 percent in 1972 (the first election in which the primaries determined the choice of the candidates), Carter 39 percent in 1976, Mondale 38 percent in 1984, and Dukakis 43 percent in 1988. Even those figures exaggerate their primary support, because they include larger proportions after other contenders left the race. Even incumbents sometimes barely eked out a majority. Ford received 53 percent in the 1976 primaries and Carter 51 percent in 1980, almost certain signs that they faced defeat in the election.

Primary voters tend to vote for their first choice, not necessarily their party’s best choice. That would be not the single most popular candidate but the one least unacceptable to the party faithful, independents, and dissident members of the other party. At its best, the old-fashioned convention of party leaders and office-holders achieved that, because they wanted nothing so much as a winner. By contrast, the primary system reflected the judgment of a small proportion of the electorate from a few atypical states, required vast amounts of money to finance a campaign, made more difficult the candidacy of experienced members of Congress, diminished the importance of the party organization, greatly reduced the relevance of the national convention, and has been repeatedly reformed by party commissions in election years without making it more likely to result in the selection of the strongest candidate within each party.

These are powerful arguments against the method of presidential nomination employed in the past nine elections, except for the fact that they seem contradicted by this year’s nomination process. Neither party had an obvious successor; the field seemed – or over time came to seem – wide open and narrowed only as voters expressed their preferences; both parties will nominate an incumbent Senator (leading to only the third election in which a sitting Senator is elected as president); and the candidates have aroused strong support among voters who have not played a prominent role in the past.

It is almost certain that, in the absence of the primary system, the Republican Party would have chosen someone other than Senator John McCain, whose opposition to George W. Bush in 2000, support for campaign reform, and lack of support within the party’s congressional leadership would have doomed his candidacy. It is probable that several Governors, ex-Governors, and members of Congress would have competed for both parties’ nominations. The groundswell of support for Obama would never have taken place, and he most certainly would not have been chosen as the Democratic candidate.

The primary process has had unprecedented success in inspiring participation and choosing highly regarded candidates. Barring some unexpected event, the turnout this November is likely to far exceed that of recent elections, and this is entirely attributable to the campaign that has been waged until now. The flaws remain. The process takes place too long before the election. The amount of money it takes to compete is obscene and violates a basic premise of the democratic process. The ordering of the primaries and caucuses is arbitrary, creating unforeseen and irrational advantages for one or another candidate. And yet…this time very capable candidates have emerged.

* * *

There remains at this writing a small possibility that the primary process can fail. If, despite a national Obama plurality in primary and caucus votes and elected delegates, Clinton decisively wins the Texas and Ohio primaries, and then wins the Democratic nomination by securing the support of a sufficient number of superdelegates, perhaps aided by a strategy of seating the now-barred Florida and Michigan delegates, the Democrats will bitterly divide, possibly leading to their defeat in the presidential election, and almost certainly leading to radical reform of the nomination process.

Saturday, December 15, 2007

WHAT DO WE DO WHEN VALUES COLLIDE IN DECIDING CONTROVERSIAL PUBLIC POLICY ISSUES?

While most public policy issues are easily viewed from a liberal or conservative perspective, any rational person must acknowledge that there are opposing moral views involved in some of the most controversial issues America confronts today. Such moral conflicts arise in thinking about the use of torture in interrogating terrorist suspects, providing the best health care for all people, and determining the appropriate limits of legal immigration. I think that we should be most critical of people, including politicians, who do not seriously consider and attempt to counter the moral convictions of those who come down on the other side of each of these issues.

One: Torture. When legal techniques fail to elicit information regarding terrorist plans, should the United States employ criminal interrogation techniques, including waterboarding (simulated drowning), when these methods may produce information that will enable us to disrupt terrorist plans and prevent terrorist actions?

John Kiriakou spent 14 years at the CIA, including a tour in Pakistan from 1998 to 2004. He was involved in the capture of Abu Zubaydah, an Al Qaeda militant, and was one of the first to interrogate him. The initial questioning bore no fruit, so waterboarding and other exceedingly harsh methods were applied. Kiriakou states that new or severe interrogation techniques were never applied without the approval of superior officers. Because Kiriakou was on his way to another assignment after the initial interrogations of Abu Zubaydah, his recounting of events after that point is based on classified cables and private communications with his colleagues. (Kiriakou says that he did not know that the interrogations were taped, and disagrees with the decision to destroy the tapes. The faces of CIA interrogators could have been blurred to protect their identities.) Although Kiriakou’s recent public account omits details regarding the method of waterboarding, he has confirmed that the simulated drowning of Abu Zubaydah lasted 35 seconds. Before waterboarding, this Al Qaeda operative was “defiant and uncooperative”; afterward he was compliant and provided valuable information. Kiriakou’s conclusions are that waterboarding is torture; it worked in the case of Abu Zubaydah, and it “probably saved lives.” He also believes it is no longer necessary.

Waterboarding is torture. It is contrary to the Geneva Conventions and American law. Its use by the United States justifies the use of criminal methods by other nations against American nationals. It may lead prisoners to provide false information in order to appease their interrogators. And yet, as Kiriakou asserts was the case with Abu Zubaydah, it may lead the prisoner to reveal information that will result in the saving of lives. Is it then justified?

Two: Health care. Optimal health care coverage would provide the highest practical level of medical care for all Americans. This could be achieved only by barring independent medical practice that benefits people prepared to pay the additional cost of extraordinary treatment. Should the United States underwrite a health care system that provides extensive universal medical insurance, setting limits even for those who are prepared to pay for more?

Any system of public health insurance must have finite limits. As generous as the nation might be in preventing and treating disease and ill health and however advanced medical science has become, some treatments will be so expensive, risky or rare that their coverage cannot be justified. Medical science seeks to alleviate and cure both common and rare medical conditions, but there comes a point at which choices must be made. American society cannot afford to employ limited medical resources in order to treat every adverse condition of every person.

There are always experimental and costly treatments that are available only to those prepared to pay the price. To the extent that society offers that option, it restricts the employment of medical resources for less costly treatments for a larger population. This option exists in America’s present system of health care and in every socialized system such as Great Britain’s in which doctors and patients can choose between public and private coverage. It would not exist in a genuinely socialized scheme of universal single-payer coverage. Should the United States subscribe to the utilitarian principle of “the greatest good for the greatest number,” knowing that carried to its logical extent this principle would deny the right of those who could afford it to buy even greater medical support for themselves?

Three: Immigration. For millions of impoverished people in Central and South America, the United States holds out hope for a better life. The millions of illegal immigrants already here pose challenges to the American economy, the ability of educational and medical institutions to serve them, and the nation’s social coherence. Should we attempt to compel those already here to leave? How can we keep millions more from gaining entry?

From its founding, the United States has represented to the rest of the world a haven for “your tired, your poor, your huddled masses yearning to breathe free.” Geographically, its vast borders provide access for those with the cash or courage to seek entry. We are all (so-called Native Americans included) immigrants or their offspring. Today’s immigrants perform the crop-picking, dishwashing, and taxi-driving jobs that those of us long settled here do not want. Nearly all illegal immigrants obey the laws in every other way and seek to adapt to American values.

At the same time, illegal immigrants unwittingly change established communities, increase the strain on local resources (including an increased cost of English-language education), and are prepared to work longer for less pay, challenging the living standards of unskilled American citizens. Nations must have the right to protect themselves against invasions, even invasions by unarmed hordes. What can we do? At what cost can unwanted aliens be expelled? How long and how high would the wall have to be that would succeed in keeping them out? Or should the United States welcome and not resist the entry of anyone who wishes to come here, short of permitting the entry of criminals, carriers of contagious diseases and those who would immediately become wards of the state? Beyond such considerations, may we draw a line? Where do we draw a line? And how do we draw the line?

Thursday, December 13, 2007

DID ROMNEY DEFEND HIS RELIGIOUS BELIEFS OR DENY THE RIGHTS OF NON-BELIEVERS?

Mitt Romney’s speech on December 5, 2007 in defense of his Mormon religion mentioned the word “Mormon” once and never mentioned the name of Romney’s church, The Church of Jesus Christ of Latter-day Saints. His avoidance of his religion was signaled in the title he gave to the speech, “Faith in America.” It turned out to be something else: an address on Romney’s faith in Faiths, but not all Faiths, and certainly no faith at all in those who have no Faith.

The exclusion of those who did not profess a belief in religion was made clear at the outset, when Romney said “Freedom requires religion just as religion requires freedom.” The first part of that sentence is prejudicial, because it leaves non-believers unfree. The second part of that sentence is preposterous because it ignores all intolerant faiths, including the state religions that Romney criticizes later in his speech as “theocratic tyranny.”

Romney stated, “I believe in my Mormon faith and I endeavor to live by it,” an expression of belief that surely requires at least a simple statement of how it differs from other faiths. Yet his only profession of religious belief is that “Jesus Christ is the Son of God and the Savior of mankind.” That belief encompasses Christians as well as Mormons, including the Evangelical Christians who will participate in large numbers in the Iowa caucuses, but excludes Jews, Muslims, Buddhists, Hindus, and members of other faiths that do not acknowledge the divinity of Jesus, as well as atheists, agnostics, and other non-believers. It must also trouble millions of Christians who believe in a genuine separation of church and state.

Anyone who suggests parallels with John F. Kennedy’s 1960 speech on his religion and the presidency has not read or does not remember what Kennedy said. Kennedy said “I believe in an America where the separation of church and state is absolute,…where no church or church school is granted any public funds or political preference;… where no religious body seeks to impose its will directly or indirectly upon the general populace or the public acts of its officials.” He opposed diplomatic relations with the Vatican, aid to parochial schools, and other government support of organized religion. His was a categorical denial of the role of religion in government.

By contrast, Romney finds a necessary linking of church and state. Citing abolition, civil rights and the right to life itself, Romney makes a sweeping claim that “no movement of conscience can succeed in America that cannot speak to the convictions of religious people.” Unquestionably, the abolition and civil rights movements have required the support of religious as well as non-religious people. It is also true that the strongest opponents of the abolition and civil rights movements have included religious as well as non-religious people.

To make the point that the United States is founded in religion, Romney says “We are a nation ‘Under God.’…We should acknowledge the Creator as did the founders – in ceremony as in word.” He cites God on our currency and in the pledge of allegiance, ignoring the absence of God, the Creator or Jesus in the American Constitution, as well as the explicit constitutional prohibition of any religious test for any office or public trust. Romney is also wrong about the place of “under God” in the United States. To cite one example, the original pledge was written in 1892 by a socialist, Francis Bellamy, and it did not contain the words “under God” until Congress put them in it in 1954.

As for the moral implications of Romney’s political views, in a single Republican presidential debate, he opposed ever raising taxes, refused to define torture or waterboarding, and said of persons who have been held for as long as six years at Guantanamo: “I want to make sure that these people are kept at Guantanamo and not given legal representation in this country.” Millions of Americans, both religious and non-religious, will characterize these positions as immoral.

In contrast to Kennedy’s endorsement of the separationist views of Thomas Jefferson, James Madison and other outspoken founders of the United States, Romney believes that government and religion are inextricably bound together in ways that receive no support in the Constitution or the debates preceding its adoption. Given America’s heterogeneous political climate, it is probably unavoidable that all of the leading candidates for the presidency in 2008 arouse the hostility of many citizens who do not share what they perceive to be the candidate’s core values. Among them, Mitt Romney has the dubious and dangerous distinction of being the most divisive.

Monday, October 15, 2007

HOW WAS AL GORE DENIED THE VICTORY HE HAD WON IN THE 2000 PRESIDENTIAL ELECTION?

As soon as I heard that Al Gore had won a Nobel Peace Prize, I could not help recalling the prize he had won but which was denied him: the American presidency. We should not forget how much America lost in that election. Every fact in this brief account can be verified in detail. We can’t put the past behind us because we are living with its awful consequences.

Vice President Al Gore had a more than half-million vote plurality over Texas Governor George W. Bush, but victory depended on which candidate won Florida’s electoral votes. In the early Florida vote count after Election Day in 2000, the official difference between them was less than three-hundredths of one percent (0.0299 percent) of the total, and the Florida Election Code required an automatic recount in every one of the state’s 67 counties if the margin of victory was less than one-half of one percent. It never took place. Two weeks after the election, the Florida Supreme Court unanimously ruled that hand counts must be included in the vote totals. Every possible obstacle was placed in the path of conducting such a count. Texas, the state of which Bush was governor, required that all punch-card ballots be counted where a chad (the piece punched out by the voter) is not completely detached, with the overriding concern being the “clearly ascertainable intent of the voter.” But that was not the intent of those who acted on behalf of candidate Bush.

The strategy of the Bush camp was to prevent a full recount before December 12, after which date Florida could avoid a challenge by another slate of electors, should the complete vote reveal that Gore had in fact won. The tactics employed were to stall, delay, object, and terminate the vote counts that had been undertaken in order to prevent a full recount before that date. When Florida Secretary of State Katherine Harris declared Bush the victor without a recount, the Florida Supreme Court extended the vote count deadline. The Republican-controlled Florida legislature was called into special session to appoint electors pledged to Bush.

On December 8, the Florida Supreme Court ordered an immediate manual recount of all ballots on which no vote had been machine-recorded. On the next day, shortly after the Miami-Dade County manual count resumed, the U.S. Supreme Court halted the hand count. It heard arguments the following day (December 10), waited a weekend, and on December 12 announced it was now too late to conclude the count. As much as the 5-4 Supreme Court majority wanted to shoot down Gore’s candidacy, they wanted someone else to fire the shot. Whether motivated by propriety or cowardice or political calculation, they wrote an opinion returning the case to the Florida Supreme Court, but the Florida court was not allowed to formulate a rule for continuing and concluding the hand count, or to do anything except declare that Gore’s candidacy was dead. The following day, Gore conceded defeat. This shameless stealing of the election after the votes were cast was exceeded only by the tactics employed to manipulate the voting itself.

Voters were denied the opportunity to cast their votes. Evidence demonstrates that the names of black voters were removed from registration rolls, voting sites were moved without notification, ballot boxes were uncollected, polling places were understaffed, language assistance was denied, and voting machines were unreliable. Secretary of State Harris, without competitive bidding, hired a private company to compile lists of convicted felons who should be barred from the polls. Many on the lists were innocent people who had names that sounded like those of convicted felons (no effort was made to check the names against Social Security numbers), many had been convicted only of lesser crimes (a misdemeanor conviction does not involve losing the right to vote), and thousands had been convicted and served time in the thirty-seven states that restore citizenship rights. Florida Governor Jeb Bush and Harris both testified under oath that the verification of the felon lists was neither their responsibility nor that of the company that compiled the lists. A University of Minnesota study estimated that if 80 percent of those unfairly barred from voting had voted, this would have resulted in more than 35,000 more votes for Gore and 4,000 for Bush.

Voters were not helped when they should have been. Although Florida law allows disabled voters to have another person assist them, ballots were disallowed from elderly or disabled voters whose signatures did not match the old originals.

Illegal voters and others were helped who should not have been. The Seminole County elections supervisor allowed the Republicans two weeks to make corrections and resubmissions on thousands of absentee ballots. In Martin County, Republicans were allowed to add identification numbers to application forms for absentee ballots and to remove applications from the elections supervisor’s office. In addition, The Miami Herald found that ineligible voters were allowed to sign in at polls where they were not registered.

Voters were unconstitutionally misled. As required by law, the local newspaper in Duval County printed a sample ballot. It stated that voters should vote on every page, but the names of the presidential candidates were printed on two pages. Voters who followed that instruction cast voided ballots. Thirty percent of the undercounted ballots in Duval County were cast in heavily black precincts that voted ten-to-one for Gore. As a consequence of this two-page presidential ballot and the improper instructions that accompanied it, Duval County discarded more than three times as many ballots in 2000 (26,909) as it had in 1996 (7762).

In Palm Beach County, the so-called butterfly ballot listed candidates’ names on facing left-hand and right-hand pages. The major party candidates were listed on the left-hand page, with Bush first and Gore second. Patrick Buchanan was listed first among the minor party candidates on the right-hand page. Voters had to punch a hole in a middle column that combined the lists, alternating names of candidates from the left and right. This meant that Bush, first on the left was first in the center, while Buchanan was listed next, and Gore, who was second on the left in the list of candidates, was third in the center column where voters made their choice. A voter could reasonably assume that the candidate listed second would appear second in the order in which they voted. In that way thousands of Palm Beach County voters who intended to vote for Gore mistakenly voted for Buchanan.

This ballot format violated Florida law that clearly specifies that votes must be cast to the right of the candidate’s name. It also violated common sense in requiring that a vote for the second candidate on the ballot (Gore) meant punching the third hole, and a vote for a candidate much lower on the ballot (Buchanan) meant punching the second hole. Because many voters recognized their error, they punched two holes, invalidating their ballots.

Buchanan received one-fifth (19.6 percent) of the Palm Beach vote, compared with one-twentieth (5.4 percent) in 1996, when the butterfly ballot was not used. It takes a leap of faithlessness to believe that Buchanan received his greatest support in a precinct consisting of a Jewish old age home with Holocaust survivors, who until that election had despised Buchanan. To his credit, Buchanan publicly stated his belief that these voters had not intended to vote for him. Analyses indicate that he received between 2000 and 3000 votes that had been intended for Gore, vastly more than Bush’s official victory margin of 537 votes.

The votes of many voters were not recorded through no fault of their own. One-third of all Florida voters lived in punch-card counties with a high proportion of low-income, minority, and first-time voters, in which it was nearly three times as likely that their ballots would be rejected as would the ballots of voters in counties using optical systems. (An inventor of the punch-card machine testified as to its unreliability.) Dimpled ballots, that is, punch-card ballots in which an indentation has been made but there is no visible hole, were counted in Texas, Illinois, Massachusetts, and other states, but not in Florida. It is not surprising that Palm Beach County with its butterfly ballot and Duval County with its improper instructions to voters, both with all the problems associated with punch-card voting, had thirty-one percent of the uncounted ballots, even though they cast only twelve percent of the statewide vote.

There were voters whose intentions were clear but whose votes were never counted. Thousands of voters indicated their preference of a candidate in the normal fashion and then wrote in the name of the same candidate in the place reserved for write-in votes. (This could not occur when voting on optical scanners.) Whether due to oversight or ignorance, these voters had not followed instructions, but their intentions were unmistakable. Whatever form the voting procedure takes, it is not designed to be an intelligence test or anything but an expression of the voter’s intention. Voters who emphasize and underline their choice of candidate by both punching a ballot and writing in the same candidate’s name have made the most unmistakable demonstration of their intention. Had the overvotes been counted, Gore would have received many times more votes than were necessary to overcome Bush’s official lead.

In retrospect, it is evident that the media exit polls on Election Day that called Florida for Gore were correct, because they reported the intentions of the voters who went to the polls believing that they had cast valid ballots. The National Opinion Research Center of the University of Chicago was hired by a consortium consisting of The New York Times, The Washington Post, The Wall Street Journal, and other newspapers to undertake a full count of approximately 120,000 overvotes (where voters appeared to have made two choices, even if both were for the same candidate) and 60,000 undervotes (where the machine count had not revealed the voter’s choice).

The consortium calculated eight ways of counting disputed punch-card votes – correctly marked paper ballots, full punches, poorly marked paper ballots, chads detached at three corners, chads detached at two corners, chads detached at only one corner, dimples through which light could be seen, or any dimple at all – and concluded that Gore would have won the Florida vote by every one of the eight methods. Of course, no examination after the fact could take into account the thousands of voters who had mistakenly voted for Buchanan instead of Gore, the thousands who compounded their mistake by voting for both Buchanan and Gore, or the would-be voters who were turned away from the polls or discouraged from voting by intimidation or the improper appearance of their names on lists of felons barred from voting.

Judicial involvement in determining the result was an unconstitutional intervention in the political process, because it violated the Electoral Count Act. That Act, adopted to prevent a repetition of the misguided judicial involvement in the disputed election of 1876, makes Congress the primary body to resolve any lingering contention after the states have tried to settle electoral disputes. That authority was never given to the courts.

Who were the Justices who so egregiously made it impossible to ascertain the true will of Florida voters, acted contrary to their long-proclaimed deference to states’ rights, and awarded the election to the candidate who had lost the vote? During the 1986 hearings on William Rehnquist’s nomination as Chief Justice, five witnesses testified that Rehnquist had harassed black and Latino voters at Arizona polling places in 1962, demanding to know whether they were “qualified to vote.” Justice Antonin Scalia’s son was a member of the law firm of Theodore Olson, who argued the Bush case before the Supreme Court. Justice Clarence Thomas’s wife was employed by the ultra-conservative Heritage Foundation to vet prospective office-holders in a future Bush administration. When Justice Sandra Day O’Connor heard Dan Rather call Florida for Gore on CBS, she exclaimed, “This is terrible.” Her husband explained to other guests at the election-night party they were attending that his wife was upset because they wanted to retire to Arizona, and a Gore win meant they would have to wait at least another four years. These facts demonstrate not illegal conduct but indifference to the appearance of impropriety, an appearance that should have led the Justices to recuse themselves from the case.

Gore would have won Florida and the election had there been an immediate recount in every one of Florida’s 67 counties, or even just a revote in Palm Beach County. An unprecedented political juggernaut was rolled out to ensure that this would not happen. It employed the political power and resources of Governor Jeb Bush, the presidential candidate’s brother, who recruited his own political operatives and volunteers to move into disputed counties; James Baker, the Bush family’s consigliere, who masterminded the teams of lawyers and operatives staging “spontaneous demonstrations,” including the window-pounding protest that succeeded in halting the Miami-Dade County recount (many participants were rewarded with appointments in the Bush administration); Secretary of State Katherine Harris, co-chair of George W. Bush’s Florida campaign, who was prepared to change the rules and interpret the results as proved necessary to ensure Bush’s election; the Florida House of Representatives, which voted on party lines to appoint electors pledged to George W. Bush irrespective of how the recounts turned out; the Florida Senate, which was prepared to cast a similar vote had the U.S. Supreme Court not resolved the election in Bush’s favor; and, finally, the five predictable members of that Court.

The most fundamental elements of an American presidential election include the right of all qualified citizens to cast a vote, an accurate count of all discernible votes, and the awarding of a state’s electoral votes to the candidate whose slate of electors received the most votes. None of those conditions was met in Florida in 2000. The great pleasure that most Americans feel at Gore’s winning the Nobel Peace Prize must be tempered by our awareness that a corrupt coalition denied him the presidency after he had won the election. There are banana republics that conduct more honest elections than the United States did in 2000. That was the first time Al Gore taught us an inconvenient truth.

If you wish to subscribe to Thinking Out Loud, e-mail thinkingoutloud@stanleyfeingold.com and write "subscribe." The price of your subscription is occasionally thinking out loud by responding. Comments, criticisms and corrections are welcome. This comment is an exception to my rule to keep the essays under 1500 words. To unsubscribe, write "cancel."
Stanley Feingold

Saturday, October 6, 2007

MUST AMERICA EMPLOY WAR PROFITEERS AND MERCENARIES IN IRAQ?

The United States is engaged in an undeclared war (the longest in American history) against an unnamed enemy (terrorism neither identifies whom we are fighting nor the meaning of victory), by means that do not affect most Americans, but will produce great profits for private contractors and great indebtedness for future generations.

This indictment demands demonstration. The war in Iraq is an undeclared war: the last congressional declaration of war brought America into World War II, and every war since, including this one, has been waged without the declaration the Constitution requires. It is the longest war: it has now lasted four-and-a-half years, and there is no end in sight. The enemy is unnamed: Al Qaeda was not even in Iraq when America invaded that country in March 2003, and the enemy now consists mostly of unnamed nationalists, insurgents and terrorists, whose ranks may grow or shrink at any time, and with none of whom America can sign a treaty ending the war.

It once would have seemed unimaginable that a war of this magnitude, of such great cost to the United States and such great loss of life among combatants and innocent civilians, could be waged without touching the lives of most Americans. Yet it is clear that, apart from the high cost of gasoline (insofar as it is related to the war), the overwhelming majority of Americans are personally untouched, neither knowing anyone in the armed forces nor being asked to make any sacrifice for the cost and conduct of the war.

The real cost now exceeds one trillion dollars. No one can guess how much greater it will be by the time the United States leaves Iraq. To whom is that money going? The answer is that never in American history have private corporations profited so greatly or so corruptly from performing tasks that until now were the responsibilities of the armed forces. Most non-combat roles, along with the use of armed security forces, have been outsourced to the 630 private companies working for the United States in Iraq, which employ approximately as many persons as the American military has there.

Of necessity, some of these are Iraqi firms employing non-American personnel, because they have local knowledge and linguistic skills that the American armed forces do not possess. However, the vast majority of the contracts, dollars spent, and personnel employed belong to American civilian contractors who are subject to virtually no oversight or accountability for how and how much they spend or how much they profit.

The most famous, or infamous, of the American contractors (though far from the largest) is Blackwater U.S.A., which provides security forces for the U.S. State Department and is not subject to supervision or control by the U.S. Department of Defense. Blackwater, a company that contributes heavily to the Republican Party, was hired (as were other contractors) without competitive bidding. Blackwater’s notoriety derives from documented instances (described by American and Iraqi eyewitnesses) in which its employees opened fire on and killed unarmed Iraqis without provocation. As of this writing, Blackwater security guards and other personnel have been involved in 195 shooting incidents since 2005. In at least two cases, Blackwater, with the approval of its employer, the State Department, made cash payments to family members of victims who complained, and it has sought to cover up other cases.

Last year on Christmas Eve, a drunken Blackwater employee killed a bodyguard of one of Iraq’s two vice presidents. Blackwater, with the help of the State Department, spirited the assailant out of Iraq within 36 hours. More than nine months have passed, and no charges have yet been brought against him. Many other charges against these guards have been made by Iraqis and by American military officers. Until now, the guards have been immune from prosecution in Iraqi courts and protected by agencies of the American government from effective prosecution in the United States.

Given the cost in human lives, it might be callous to consider the economic cost, but for the fact that the war has been a source of great wealth for those to whom the United States has outsourced much of the cost. A single example, typical of arrangements with other companies, will indicate how wasteful it has been for the United States and how very profitable it has been for the companies and individuals who have received these contracts.

Blackwater pays an individual security guard $600 a day (which comes to $180,000 a year), four or five times the income the guard earned as a member of the American military. (He also benefits from armored cars that are safer than Army vehicles.) To the guard’s pay Blackwater adds a 36% markup (for a total of $815 a day), plus overhead and its costs in Iraq, including insurance, room and board, travel, weapons, ammunition, vehicles, office space, and equipment. This bill goes to Regency Hotel, a Kuwaiti company, which tacks on the cost of the vehicles and weapons it buys, plus a profit for itself, and sends an invoice to ESS, a food-services company that cooks meals for the troops. Regency has billed ESS $1,500 per man per day while telling Blackwater it was charging $1,200, giving itself a substantial secret profit. ESS adds its own costs and profit and sends its bill to Halliburton, which also adds overhead and profit and presents the total to the Pentagon. The United States has no contract with ESS, which will not provide any information to the government or to the relevant congressional committees.
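The compounding effect of this chain of middlemen can be made concrete with a little arithmetic. In the sketch below, the $600-a-day guard rate, the 36% Blackwater markup, and Halliburton's 2% cost-plus fee come from the figures in this essay; the markups applied by Regency and ESS have never been disclosed, so the percentages used for them are purely hypothetical placeholders, included only to show how each layer inflates the bill that finally reaches the Pentagon.

```python
# Illustrative sketch of the layered billing chain described above.
# Only the $600/day pay, the 36% Blackwater markup, and the 2% Halliburton
# cost-plus fee come from the essay; the Regency and ESS markups are
# hypothetical, since those figures were never made public.

guard_pay = 600.00                     # daily pay to the guard (from the essay)

blackwater_bill = guard_pay * 1.36     # 36% markup; essay cites about $815/day

# Hypothetical pass-through markups for the intermediate companies:
regency_bill = blackwater_bill * 1.20      # Kuwaiti middleman adds costs + profit
ess_bill = regency_bill * 1.15             # food-services firm adds costs + profit
halliburton_bill = ess_bill * 1.02         # "cost-plus" 2% profit (from the essay)

print(f"Guard receives:     ${guard_pay:,.2f}/day")
print(f"Blackwater bills:   ${blackwater_bill:,.2f}/day")
print(f"Pentagon is billed: ${halliburton_bill:,.2f}/day")
```

Even with modest assumed markups at the undisclosed layers, the taxpayer's daily cost per guard roughly doubles the guard's own pay, and under cost-plus accounting every dollar added at a lower layer increases the profit of every layer above it.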

Halliburton’s contract is an open-ended “cost-plus” contract to supply the U.S. armed forces with food, laundry, and other necessities. Cost-plus means the United States pays Halliburton all of its expenses (everything it spends and everything it pays its subcontractors) plus a 2% profit on top. The more it spends, the greater its profit. Henry Bunting, a former Halliburton purchasing officer, has stated, “There is no incentive for KBR (Kellogg, Brown & Root, a Halliburton subsidiary) or their subs to try to reduce costs. No matter what it costs, KBR gets one hundred percent back, plus overhead, plus their profit.” To date, the Army has committed $7.2 billion on a single contract with Halliburton. The Defense Contract Audit Agency recently stated that Halliburton could not document 42% of a $4 billion invoice in March 2007. Among other charges, it stated that Halliburton billed the government for up to three times as many meals as it served.

Halliburton has failed to respond to repeated requests for detailed information regarding its costs and profits. Employees and former employees are discouraged from becoming whistleblowers. Like other American contractors in Iraq, Blackwater makes individual contractors sign confidentiality agreements that compel them to pay Blackwater $250,000 in damages if they violate their contracts by publicly discussing the details of their agreements or their work.

What, then, is the real cost of a security guard? A sergeant (the former rank of many private security guards) would receive around $38,000 a year in base pay and housing and subsistence allowances, a figure that does not reflect additional costs for health and retirement benefits. When a private security guard is killed, even though he may be an American citizen, the U.S. government is not responsible for his burial, death benefits, or payments to his survivors. The government saves on benefits, but it is doubtful that outsourcing constitutes a saving for the United States, even in the long run.

The advantage of outsourcing personnel is entirely political. The United States can pretend that it is conducting the war with fewer soldiers, without needing to call up more regular troops, National Guard, and reserves. Of course, General Shinseki and other military leaders were correct when they insisted, before and after we invaded Iraq, that the U.S. needed at least twice as many uniformed soldiers as we had sent.

We have euphemisms to describe them, but there can be no mistake: the individuals who take these high-risk, high-paying jobs are mercenaries, and their employers, who are not held accountable for their greed, crimes and cover-ups, are war profiteers. Any veneer of idealism or unselfish motive has been stripped away. We should answer a single question regarding how America conducts war: Are we willing to continue to outsource both the supplying of necessary resources and the actual waging of war to armed persons not wearing military uniforms, or should we create a military force fully capable of defending itself? The name of the alternative to what we are now doing strikes panic in the hearts of those who want to continue to prosecute this war, those who want to start a new war against Iran, those who want America to be prepared for a future war, and most voters contemplating the next election. That name is: conscription.

Saturday, September 29, 2007

WHAT ARE AMERICAN SOLDIERS DOING IN IRAQ?

Seven American soldiers serving in Iraq wrote the following op-ed piece that appeared in The New York Times on August 19, 2007. I recently reread it and urge everyone to do so. Only one salient fact has changed since they wrote it.

The War As We Saw It by Buddhika Jayamaha, Wesley D. Smith, Jeremy Roebuck, Omar Mora, Edward Sandmeier, Yance T. Gray and Jeremy A. Murphy

Viewed from Iraq at the tail end of a 15-month deployment, the political debate in Washington is indeed surreal. Counterinsurgency is, by definition, a competition between insurgents and counterinsurgents for the control and support of a population. To believe that Americans, with an occupying force that long ago outlived its reluctant welcome, can win over a recalcitrant local population and win this counterinsurgency is far-fetched. As responsible infantrymen and noncommissioned officers with the 82nd Airborne Division soon heading back home, we are skeptical of recent press coverage portraying the conflict as increasingly manageable and feel it has neglected the mounting civil, political and social unrest we see every day. (Obviously, these are our personal views and should not be seen as official within our chain of command.)

The claim that we are increasingly in control of the battlefields in Iraq is an assessment arrived at through a flawed, American-centered framework. Yes, we are militarily superior, but our successes are offset by failures elsewhere. What soldiers call the "battle space" remains the same, with changes only at the margins. It is crowded with actors who do not fit neatly into boxes: Sunni extremists, Al Qaeda terrorists, Shiite militiamen, criminals and armed tribes. This situation is made more complex by the questionable loyalties and Janus-faced role of the Iraqi police and Iraqi Army, which have been trained and armed at United States taxpayers' expense.

A few nights ago, for example, we witnessed the death of one American soldier and the critical wounding of two others when a lethal armor-piercing explosive was detonated between an Iraqi Army checkpoint and a police one. Local Iraqis readily testified to American investigators that Iraqi police and Army officers escorted the triggermen and helped plant the bomb. These civilians highlighted their own predicament: had they informed the Americans of the bomb before the incident, the Iraqi Army, the police or the local Shiite militia would have killed their families.

As many grunts will tell you, this is a near-routine event. Reports that a majority of Iraqi Army commanders are now reliable partners can be considered only misleading rhetoric. The truth is that battalion commanders, even if well meaning, have little to no influence over the thousands of obstinate men under them, in an incoherent chain of command, who are really loyal only to their militias.

Similarly, Sunnis, who have been underrepresented in the new Iraqi armed forces, now find themselves forming militias, sometimes with our tacit support. Sunnis recognize that the best guarantee they may have against Shiite militias and the Shiite-dominated government is to form their own armed bands. We arm them to aid in our fight against Al Qaeda.

However, while creating proxies is essential in winning a counterinsurgency, it requires that the proxies are loyal to the center that we claim to support. Armed Sunni tribes have indeed become effective surrogates, but the enduring question is where their loyalties would lie in our absence. The Iraqi government finds itself working at cross purposes with us on this issue because it is justifiably fearful that Sunni militias will turn on it should the Americans leave.

In short, we operate in a bewildering context of determined enemies and questionable allies, one where the balance of forces on the ground remains entirely unclear. (In the course of writing this article, this fact became all too clear: one of us, Staff Sergeant Murphy, an Army Ranger and reconnaissance team leader, was shot in the head during a "time-sensitive target acquisition mission" on Aug. 12; he is expected to survive and is being flown to a military hospital in the United States.) While we have the will and the resources to fight in this context, we are effectively hamstrung because realities on the ground require measures we will always refuse - namely, the widespread use of lethal and brutal force.

Given the situation, it is important not to assess security from an American-centered perspective. The ability of, say, American observers to safely walk down the streets of formerly violent towns is not a resounding indicator of security. What matters is the experience of the local citizenry and the future of our counterinsurgency. When we take this view, we see that a vast majority of Iraqis feel increasingly insecure and view us as an occupation force that has failed to produce normalcy after four years and is increasingly unlikely to do so as we continue to arm each warring side.

Coupling our military strategy to an insistence that the Iraqis meet political benchmarks for reconciliation is also unhelpful. The morass in the government has fueled impatience and confusion while providing no semblance of security to average Iraqis. Leaders are far from arriving at a lasting political settlement. This should not be surprising, since a lasting political solution will not be possible while the military situation remains in constant flux.

The Iraqi government is run by the main coalition partners of the Shiite-dominated United Iraqi Alliance, with Kurds as minority members. The Shiite clerical establishment formed the alliance to make sure its people did not succumb to the same mistake as in 1920: rebelling against the occupying Western force (then the British) and losing what they believed was their inherent right to rule Iraq as the majority. The qualified and reluctant welcome we received from the Shiites since the invasion has to be seen in that historical context. They saw in us something useful for the moment.

Now that moment is passing, as the Shiites have achieved what they believe is rightfully theirs. Their next task is to figure out how best to consolidate the gains, because reconciliation without consolidation risks losing it all. Washington's insistence that the Iraqis correct the three gravest mistakes we made - de-Baathification, the dismantling of the Iraqi Army and the creation of a loose federalist system of government - places us at cross purposes with the government we have committed to support.

Political reconciliation in Iraq will occur, but not at our insistence or in ways that meet our benchmarks. It will happen on Iraqi terms when the reality on the battlefield is congruent with that in the political sphere. There will be no magnanimous solutions that please every party the way we expect, and there will be winners and losers. The choice we have left is to decide which side we will take. Trying to please every party in the conflict - as we do now - will only ensure we are hated by all in the long run.

At the same time, the most important front in the counterinsurgency, improving basic social and economic conditions, is the one on which we have failed most miserably. Two million Iraqis are in refugee camps in bordering countries. Close to two million more are internally displaced and now fill many urban slums. Cities lack regular electricity, telephone services and sanitation. "Lucky" Iraqis live in gated communities barricaded with concrete blast walls that provide them with a sense of communal claustrophobia rather than any sense of security we would consider normal.

In a lawless environment where men with guns rule the streets, engaging in the banalities of life has become a death-defying act. Four years into our occupation, we have failed on every promise, while we have substituted Baath Party tyranny with a tyranny of Islamist, militia and criminal violence. When the primary preoccupation of average Iraqis is when and how they are likely to be killed, we can hardly feel smug as we hand out care packages. As an Iraqi man told us a few days ago with deep resignation, "We need security, not free food."
In the end, we need to recognize that our presence may have released Iraqis from the grip of a tyrant, but that it has also robbed them of their self-respect. They will soon realize that the best way to regain dignity is to call us what we are - an army of occupation - and force our withdrawal.

Until that happens, it would be prudent for us to increasingly let Iraqis take center stage in all matters, to come up with a nuanced policy in which we assist them from the margins but let them resolve their differences as they see fit. This suggestion is not meant to be defeatist, but rather to highlight our pursuit of incompatible policies to absurd ends without recognizing the incongruities.

We need not talk about our morale. As committed soldiers, we will see this mission through.

They will not all “see this mission through.” Even before the article was published, Staff Sergeant Jeremy A. Murphy was shot in the head on August 12 and suffered severe brain trauma. He is expected to survive. On September 10, Sergeant Omar Mora, Staff Sergeant Yance T. Gray, and five other Americans were killed when the five-ton truck in which they were riding overturned. What are American soldiers doing in Iraq? Dying.