It is difficult to imagine a precedent for the buildup to and presentation of what President Bush called “the Petraeus report” (there has been no formal report, simply the general’s testimony) before House and Senate committees on the status of America’s war in Iraq. General Petraeus has commanded American forces in Iraq for the half-year of the so-called “surge,” in which 30,000 additional troops have been added to the 130,000 already engaged there. President Bush’s prime-time address after the general’s two days of testimony invoked the general so often that one might have concluded he was referring to the delivery of a new sacred text.
In fact, there was nothing new or unanticipated in the general’s testimony. That could not come as a surprise. A commanding general on active service does not rebut or qualify his president’s optimistic prognosis. If he did, he would be removed, as President Lincoln removed General George McClellan as commander of the Army of the Potomac, because of his failure to engage the Confederate Army and win the Civil War.
Similarly, when a general takes actions that contradict the president’s policy, he will be removed. President Harry S Truman relieved General Douglas MacArthur of his command for insubordination when he issued an unauthorized statement threatening to expand the Korean war into China if it resisted, while the president was preparing to engage North Korea and China in peace negotiations. MacArthur’s independence led to the loss of many American lives. (These events are retold in David Halberstam’s posthumous The Coldest Winter: America and the Korean War, excerpted in “MacArthur’s Grand Illusion,” in the October 2007 issue of Vanity Fair.) General Omar Bradley expressed the prevalent military as well as political sentiment when he said that General MacArthur’s action “would have involved us in the wrong war in the wrong place at the wrong time against the wrong enemy.”
Disagreement is not the only inappropriate behavior for a general. It is also undesirable for a general to allow himself to become (albeit at the president’s instigation) a spokesperson for what is, after all, the president’s partisan politics. Though no longer a general, then-Secretary of State Colin Powell remained, for the American public, the general above politics and the Bush Administration’s most credible spokesperson when he was prevailed upon to address the United Nations to justify what would subsequently be America’s unilateral invasion of Iraq. Virtually none of the asserted “facts” was true in Powell’s presentation of the doubtful and untruthful “evidence” supporting Iraq’s close alliance with Al Qaeda, possession of weapons of mass destruction, and intention to employ those weapons against the United States. Nevertheless, the authority of General Powell’s endorsement seemed credible to a public that would have been skeptical had these claims come from another spokesperson.
That was the position General Petraeus put himself in when, just weeks before the 2004 presidential election, The Washington Post published an op-ed piece by him. Ever the optimist, General Petraeus saw “tangible progress” in Iraqi security forces, enabling “Iraqis to shoulder more of the load for their own security.” Petraeus detailed military victories and the increased capacity of the police forces. Regrettably, General Petraeus has not enjoyed the military success that the president and he have both implied. Sectarian warfare has escalated in areas under his command. His efforts at reaching political agreements have failed, as have the efforts of others. The loss of billions of dollars of Iraqi weapons has led to a major criminal investigation of Army mismanagement. Despite the general’s praise, recent reports recommend that the police should be disbanded because of their dismal failure to improve security.
Three years later, in his long-anticipated evaluation of the “surge,” the general once again said that some progress had been made on the ground, adding that there were fewer fatalities in some areas in recent months (a count that excluded Sunnis killing Sunnis, Shias killing Shias, and victims shot in the front of the head rather than the back), and that some tribal groups that once used their weapons to kill Americans had entered into agreements with the Americans to use their new American-supplied weapons to kill insurgents. In passing, the president and the general acknowledged that no progress had been made toward creating a unified government in the devastated country. Most revealing of the limits of military judgment was the answer General Petraeus could not give when asked whether America is safer. He confessed that he had not entertained that question.
It is appropriate that a general should echo the military judgment of the president, if he agrees with it. What is deliberately deceitful is the president’s pretense that he will be guided by the conclusions of his generals, as Bush stated one month before the general came home: “Troop levels will be decided by our commanders on the ground, not by political figures in Washington, D.C.” Yet Bush could not wait for General Petraeus’s testimony; he flew to a secret desert air base 120 miles from Baghdad to declare that the surge is working. Of course, he already knew the general’s conclusion, because when Bush disagrees with a general, the general is removed or retired.
What did generals think of the invasion? General Eric Shinseki, then Army Chief of Staff, asked by a Senate committee to estimate the number of ground troops necessary to support the invasion of Iraq, replied “several hundred thousand.” Defense Secretary Rumsfeld and Deputy Defense Secretary Wolfowitz immediately declared that estimate “wildly off the mark.” Shinseki soon retired. General John Abizaid, commander of United States Central Command, has since said that Shinseki’s estimate was correct. General Bernard Trainor has described a willfully self-deluding planning process. General William Odom, former director of the National Security Agency, has said that the American invasion of Iraq might be the worst strategic mistake in American history.
What did generals think of America’s conduct in Iraq? General Antonio Taguba, charged with reporting on the documented horrors and humiliations suffered by prisoners at Abu Ghraib (which provided the terrorists with their most persuasive recruitment tool), concluded that the crimes deserved severe punishment. Instead, the Department of Defense punished only the lowest-ranking soldiers, and General Taguba was exiled to a Pentagon desk job and early retirement. CENTCOM General Anthony Zinni, later Bush’s special envoy to the Middle East, has stated: “In the lead up to the Iraq war and its later conduct, I saw at a minimum, true dereliction, negligence and irresponsibility; at worst, lying, incompetence and corruption.” Our mistakes, Zinni argues, include denying priority to the war on Al Qaeda in Afghanistan, disbanding the Iraqi army, and de-Baathifying the police. The result of our ill-advised unilateral aggressive intervention, General Zinni concluded, is that “we are now being viewed as the modern crusaders, as the modern colonial power in this part of the world.”
Did the generals think that the “surge” was desirable? When General Abizaid was pressed this past November by Senator John McCain on the need for an increased U.S. military presence, he replied: “Senator McCain, I met with every divisional commander, General [George] Casey, the corps commander, General [Martin] Dempsey [head of the Multi-National Security Transition Command in Iraq], we all talked together. And I said, in your professional opinion, if we were to bring in more American troops now, does it add considerably to our ability to achieve success in Iraq? And they all said no. And the reason is because we want the Iraqis to do more. It is easy for the Iraqis to rely upon us to do this work. I believe that more American forces prevent the Iraqis from doing more, from taking more responsibility for their own future.”
What did generals think of the civilian strategists of the war? General Paul Eaton, who helped revive the Iraqi army, described Rumsfeld as “incompetent strategically, operationally and tactically.” General John Batiste, commander of an infantry division in Iraq, turned down a promotion and a tour in Iraq as the second-ranking military officer, and chose to retire rather than continue to work for Rumsfeld. In 2006, according to a Military Times poll, almost 60 percent of the members of the United States Armed Forces did not believe that the civilians in the Pentagon had their “best interests at heart.”
Each month of the surge so far has cost $10 billion and the lives of one hundred American troops. Senator McCain warns that withdrawal would increase “the potential for genocide, wider war, spiraling oil prices and the perception of strategic American defeat.” Those grim consequences may occur. But it is the responsibility of President Bush, not of his generals, to spell out clearly when and under what circumstances the risk of these dire consequences of American withdrawal would be reduced. Absent President Bush’s clear analysis and projection of America’s future prospects in Iraq, his unstated cynical answer is that this war is his legacy to a future Administration.
If you wish to subscribe to Thinking Out Loud, e-mail thinkingoutloud@stanleyfeingold.com, and write “subscribe.” The price of your subscription is your occasionally thinking out loud by responding. Comments, criticisms and corrections are welcome. I intend to post a weekly essay of under 1500 words. To unsubscribe, write “cancel.” Stanley Feingold
Saturday, September 15, 2007
Friday, September 7, 2007
DOES RONALD REAGAN HAVE A CONSERVATIVE HEIR?
Ronald Reagan’s election in 1980 marked the beginning of an extraordinary change in American politics, made evident by sharper ideological differences between the leadership of the major parties than had been seen since before the presidency of Franklin D. Roosevelt. The one-sidedness of the 1932 Democratic triumph ushered in an era of one-party dominance that resulted in Democratic control of the House of Representatives for fifty-eight and of the Senate for fifty-two of the next sixty-two years, and five consecutive presidential victories before losing to an unbeatable war hero.
This was achieved initially as a result of the Democratic Party’s remarkable ability to hold together the most disparate interests. Southern white supremacists were there because they had been there since Lincoln “freed the slaves.” Twentieth-century working-class immigrants were there because there was no congenial home for them in a Republican Party whose leaders represented capitalist power and a laissez-faire philosophy epitomized in Calvin Coolidge’s observation that the business of America is business. Black Americans were there because they were ignored by the party that had ended slavery and they responded to the New Deal’s egalitarianism.
Republicans recognized that they had to make a broader appeal and, in choosing presidential nominees, they reached beyond conservative ideologues to nominate New York Governor Thomas E. Dewey (twice) and former Democrat Wendell Willkie before winning the presidency with General Dwight D. Eisenhower, whose domestic political leanings were largely unknown. Eisenhower was the candidate of the liberal internationalist wing of the party, barely and bitterly winning the nomination in 1952 against Ohio Senator Robert A. Taft, an authentic and beloved conservative widely known as Mr. Republican.
This shift toward more moderate Republican presidential candidates was reflected in efforts to broaden the party base, in order to make it more cross-sectional and multi-factional. Nevertheless, the Democratic Party remained more liberal, despite the power of conservative southerners who chaired most of the major congressional committees, and the Republican Party remained more conservative, despite the presence of eastern internationalists and middle-western LaFollette populists. The Democrats did a better job of keeping their coalition than the Republicans did of creating theirs, as was evident in the civil rights controversies, when both the leading advocates and leading opponents of racial equality were in the Democratic Party.
After Vice President Richard Nixon’s close defeat in 1960, conservatives captured enough Republican Party state organizations to nominate Arizona Senator Barry Goldwater, who, true to his promise to offer “a choice, not an echo,” championed reducing the size of government, repealing the graduated income tax, ending federal aid to education, and making Social Security voluntary. Goldwater had written The Conscience of a Conservative, and he became the embodiment of that conscience.
In retrospect, Goldwater’s overwhelming defeat (equaled only by Democrat George McGovern’s loss eight years later) can be seen as the birth pangs of a new conservative alliance. Goldwater won only his home state of Arizona and the five southern states with the greatest proportion of black citizens, the states in which the issue of race was most important. The Solid South of a century after the Civil War was solid no more. The change begun by the Supreme Court’s 1954 decision in Brown v. Board of Education outlawing segregated public education was accelerated by passage of the Civil Rights Act in 1964 and the Voting Rights Act in 1965. The political landscape was decisively altered by school busing, forced school integration, affirmative action in higher education and hiring, white flight, race riots, and the perception of increased street crime. The enduring political consequence has been that no Democratic presidential candidate since Lyndon Johnson in 1964 has received a majority of white votes.
Despite the existence of laws inspired by religious beliefs (Sunday closing laws, prohibitions on mailing immoral material, the criminalization of birth control information and devices, and the insertion of “under God” into the Pledge of Allegiance), religious political influence abated after the 1925 Scopes “monkey trial,” in which a young science teacher was convicted for teaching evolutionary theory. It was a Pyrrhic victory for religious orthodoxy because the public reaction was a political defeat for the public teaching of religious doctrine.
Religious moral conviction reemerged as a political force in 1965, when the United States Supreme Court upheld the right of married couples to obtain contraceptives. In 1973, in Roe v. Wade, the Supreme Court went further in recognizing a woman’s right to an abortion, inspiring a powerful grass-roots movement that has ever since aspired to reverse this decision through the selection of Supreme Court Justices who are likely to vote to overturn Roe or, short of that, to limit the permissible period or methods of abortion.
The new mobilization of religious and racial conservatism became allied with the economic conservatism of business protectionism, laissez-faire government, and fiscal restraint to reshape the Republican Party. It needed a candidate who would articulate this new conservative coalition, and it found him in Ronald Reagan. Reagan was identified with opposition to abortion, obscenity and pornography; respect for the flag; support for school prayer; and certain and severe punishment for violent crimes. Reagan’s presidency accompanied – and perhaps inspired – the revival of religious fundamentalism and the equation of American patriotism with Republican conservatism.
It is only peripherally relevant to this alliance symbolized by Reagan that his behavior was not nearly as conservative as his rhetoric. He frequently invoked God, but was not a churchgoer. He had been a populist before he became a conservative and, as president, he supported raising Social Security taxes rather than cutting benefits. In 1964, he characterized Medicare as socialized medicine, but Medicare spending increased by more than ten percent in each year of his presidency. He promised to decrease the size of government; it increased. He promised to cut the budget; it grew larger. He promised to decrease entitlements; in office, he supported vast increases. He promised to abolish two Cabinet departments; they were retained and another was created. He promised to outlaw abortion; nothing happened. Nevertheless, the symbolic reality was that Reagan made most Americans feel good about themselves and their country, and conservatives believed that they had an ally in the White House.
The conservative coalition could not feel a similar comradeship with Reagan’s vice president and successor, George H. W. Bush, who was defeated after one term, in part because of the bitterness of economic conservatives at his betrayal of his pledge of “no new taxes.” But they had captured the party machinery and, aided by Democratic President Bill Clinton’s political ineptitude and neglect of the party organization, succeeded in winning control of Congress in 1994, holding it for the last six years of the Clinton presidency and the first six of George W. Bush, the truest conservative to occupy the White House in the lifetime of anyone now living.
Even in victory, insecurity was apparent in the now-powerful conservative coalition. Where earlier conventions featured major addresses by staunch conservatives, in 2004 the most prominent speakers included John McCain, Arnold Schwarzenegger and Rudolph Giuliani. It can be dismissed as window-dressing, but it was clearly designed to entice more customers into the store. The reason was that the existence of a new conservative majority had not yet been established.
Al Gore outpolled Bush in 2000, and the party vote that year for the House was almost a dead heat, with little more than one vote in every thousand separating the parties. Four years later, the Republican margin of victory for House candidates was less than three votes in every thousand cast. Democrats received more votes than Republicans in the one hundred Senate races conducted over three consecutive cycles, whether counted from 2000 to 2004 or from 2002 to 2006. Republican control of Congress was due more to gerrymandering in the House and to dominance in the rural, less populous states than to greater popular support.
Is the confidence of the conservative coalition in the rightness (take it either way) of their cause diminishing? Their three leading aspirants for the 2008 presidential nomination are a former Governor of ultra-liberal Massachusetts, a former Mayor of New York City (neither of which any Republican can hope to win), and a distinguished Senator whose complex public record ranges from excoriating President Bush to embracing his most controversial policies. Fred Thompson has finally announced his candidacy as the savior of the conservative cause. Better known as a television and movie actor than as an eight-year Senator, Thompson has conservative credentials that are modest compared with the records of many past and present Governors and members of Congress.
In order to win the nomination, each of these four leading candidates must now vow that he is the most authentic heir to Reagan Republicanism without identifying himself too closely with the current president, and then, to win the election, he must radically moderate his positions to win the support of a much broader electorate. Reagan could preach that it was “morning in America.” Is it possible that it is now much later in the day – perhaps too late for his conservatism?
Saturday, September 1, 2007
IS A VICE PRESIDENT NECESSARY?
The thought must occur to President Bush’s harshest critics that, as unpopular as he is, even they should fear the possibility of his departure before the end of his second term. There cannot be many Americans who, though they might consider removing Bush, would be pleased by the prospect of replacing him with Vice President Cheney.
It isn’t a unique situation. President Richard Nixon was deeply embroiled in the Watergate scandal shortly after his reelection in 1972, but those who most condemned his shameful behavior feared that, if he were removed from office, Vice President Spiro Agnew would become the president. (The authors of the Constitution had not clearly indicated that, in such circumstances, the vice president would succeed to the title of president, but John Tyler, the first vice president whose president died one month after taking office, had himself sworn in, and every succeeding vice president has done the same.) Fortunately for Nixon’s critics, Agnew resigned nine months into his second term in an agreement that allowed him to escape trial on charges of having committed bribery, extortion, and tax evasion during his tenure as governor of Maryland. This cleared the way for the congressional inquiry into Nixon’s unlawful conduct that led to his resignation less than a year later. Imagine that Nixon had left office before Agnew, and this ignorant, bigoted and corrupt man, chosen as Nixon’s running-mate because he had delivered his state’s support to Nixon at a crucial point in the 1968 nomination campaign, had become President of the United States.
In the same fashion, imagine if President George H.W. Bush had departed from the presidency and been replaced by Vice President Dan Quayle, a choice of a running-mate that shocked even Bush’s supporters. Quayle was an amiable, ill-prepared and under-equipped Senator who is best-remembered two decades later for his misspelling of “potato” (he told a student to add an “e”) and a number of verbal gaffes, perhaps most famously his reference to the United Negro College Fund slogan, “A mind is a terrible thing to waste,” as “What a waste it is to lose one’s mind or not having a mind is being wasteful. How true that is.”
Today a very unpopular President Bush has an even more unpopular Vice President Dick Cheney. When Congress was examining President Nixon’s role in the Watergate break-in, a business associate reported that Cheney said (and Cheney has never denied saying it), that Watergate was “a political ploy by the president’s enemies.” His support for unchecked executive power was later on the public record when, as a member of Congress, he opposed congressional investigation of possible abuses of power in the Iran-Contra scandal and commended Colonel Oliver North as “the most effective and impressive witness certainly this committee has heard.”
As Vice President, Cheney has repeatedly stated that Saddam Hussein was involved in 9/11 and that terrorist Abu Musab al-Zarqawi established an Al Qaeda operation in Iraq, and has made other claims that have been totally refuted; he persuaded President Bush to sign an order denying foreign terrorism suspects access to any military or civilian court (without informing either Secretary of State Powell or National Security Adviser Rice); he advocated “robust interrogation” of suspects, a code phrase for torture; he refused to tell Congress whom he had met to develop energy policy; and he has refused to respond to a subpoena from a congressional committee, offering the far-fetched claim (abandoned after widespread ridicule) that he was not an “entity within the executive branch.”
Of course, if he were not vice president, Cheney could make all of these unfounded (literally anti-republican and anti-democratic) claims, and President Bush could, as he has, adopt them as his own. However, because he is vice president, if President Bush were removed from office, Cheney would become president. It is beyond argument that neither Agnew nor Quayle nor Cheney would have received serious consideration as a presidential candidate. On the evidence of their political backgrounds, Agnew and Quayle would have been major embarrassments as President of the United States, and Cheney would be an unmitigated disaster. His arrogance, obdurateness, passion for secrecy, and disrespect for the clear mandates of the Constitution would inspire unending constitutional crises.
Once upon a time, the vice presidency was a position that inspired ridicule. Mr. Dooley, Finley Peter Dunne’s famous fictional politician, observed: “Th’ prisidincy is th’ highest office in th’ gift iv th’ people. Th’ vice-presidency is th’ next highest an’ th’ lowest. It isn’t a crime exactly. Ye can’t be sent to jail f’r it, but it’s a kind iv a disgrace. It’s like writin’ anonymous letters.” In a similar humorous and derogatory spirit, the office was lampooned in the Pulitzer Prize-winning musical Of Thee I Sing, in which Vice President Alexander P. Throttlebottom discovers that his sole constitutional power is to preside over the U.S. Senate, where he cannot introduce legislation or speak, but can cast tie-breaking votes, which may not occur even once in the average vice president’s career.
The vice presidency is no longer a laughing matter. All four nineteenth-century vice presidents who succeeded to the presidency upon the death of the president had been at odds with the presidents under whom they served, and all failed to be nominated in their own right before the next election. By contrast, four of the five vice presidents who succeeded to the presidency in the twentieth century were subsequently elected in their own right. The fifth, Gerald Ford, who succeeded on the resignation of President Nixon, failed to be elected, in large part because of the blanket pardon he had given to Nixon. Altogether, the five twentieth-century vice presidents who succeeded to the office served (including four elected terms, to which they would not have been elected had they not first been elevated to the presidency) for a little more than twenty-two years and ten months, very nearly a quarter of a century.
In addition, other elections have been critically influenced by an earlier president’s choice of a running-mate. Vice President Nixon lost in 1960 but won twice, in 1968 and 1972. Former Vice President Mondale lost in 1984, Vice President Bush won in 1988, and Vice President Gore was denied his victory in 2000. The presidential election of 2008 will be only the third election since 1900 in which neither an incumbent president nor a present or past vice president is a major party candidate.
The argument that an incumbent vice president is better prepared to assume the presidency is often untrue.
John Tyler, the first vice president to succeed on the death of an elected president, provides an instructive lesson. One month after his inauguration in 1841, President William Henry Harrison died and Tyler became president, opposed to most of Harrison’s policies and reviled for the next four years by the party that had elected him. Theodore Roosevelt, almost certainly the most highly regarded president who succeeded on the death of a president, became vice president because the death of President William McKinley’s first vice president gave Republican New York State boss Thomas Platt the opportunity to get rid of Roosevelt as the state’s governor by having him “kicked upstairs” to the vice presidency, where he would never be heard from again. Of course, those who got rid of Roosevelt did not anticipate McKinley’s assassination six months into his second term, when Roosevelt became the president and profoundly reshaped the politics of his party and the nation.
Dick Cheney to the contrary notwithstanding, vice presidents have rarely been the confidants of the presidents they served. When Harry Truman was sworn in as president immediately after the death of President Franklin D. Roosevelt, Secretary of War Henry Stimson took him into a corner to tell him about the atomic bomb. Before that, no one had thought it important to tell the vice president.
There has to be a better way. Suppose a sudden vacancy occurred in the presidency. That day the members of Congress could either quickly convene or be polled. On the first or second ballot, a new president could be chosen. If a majority of the 535 members of Congress were of the same party as the departed president, they would choose a leader of that party, more often than not one who would have declined selection as vice president in our present system. If a majority of the members of Congress were not of the president’s party, they would opt for a change, very possibly choosing their party’s defeated presidential candidate.
Presidential candidates often make this choice of a running-mate at the very last moment in a national convention, sometimes as a quid pro quo for convention support or as a concession to their opponents in the party. We don’t choose a president in order for him to choose his successor, but that is what so often occurs. There has to be a better way, and that way would involve the elimination of the office of vice president.
Saturday, August 25, 2007
DOES THE PRIMARY SYSTEM CHOOSE THE BEST CANDIDATES?
Who is the best (that is, the strongest) candidate for either major party to nominate? That’s easy. It’s the candidate who will appeal most successfully to the party’s unswerving supporters, to those voters who are often – but not necessarily – inclined to support that party, to independent voters who boast of voting for the person not the party, and to new voters. That candidate is not necessarily the single most popular potential candidate of the party, but one who is, if not a first choice, an acceptable choice for the largest number of prospective voters. To put it briefly – but watch out for the double negative – the best candidate is the party’s least unacceptable candidate.
In 1972, the first year in which the modern primary-caucus system of presidential nomination was decisive, Senator George McGovern won the Democratic nomination because he had won more delegates than any other candidate. In fact, former Vice President Hubert Humphrey, who narrowly lost the 1968 election to Richard Nixon, received slightly more primary votes than McGovern (68,000 more out of a total primary vote of 16 million), but had the support of fewer elected delegates. Each received marginally more than one-quarter of the primary votes.
The rational question the Democratic Party should have asked was: which potential candidate would be most likely to maximize the party’s support in the election? It is not a reflection on McGovern’s integrity, intelligence or experience to observe that, given his being perceived as a very liberal, lesser-known and uncharismatic Senator, he was not that candidate. As McGovern himself observed after his defeat, the worst any Democratic presidential candidate has ever suffered, “I wanted to run for president in the worst possible way – and I did.” Based on Humphrey’s strong run four years earlier under adverse circumstances (the chaos at the 1968 Democratic convention was evidence of a bitterly divided party), he was likely to have been a much more popular candidate.
It might have been worse. Until the attempted assassination of Alabama Governor George Wallace on May 15, which left him paralyzed and removed him from the race, he had decisively won the southern states of Florida, Tennessee and North Carolina, finishing second to McGovern in Wisconsin and second to Humphrey in Pennsylvania, Indiana and West Virginia. The day after the shooting, Wallace won the Michigan and Maryland primaries. At that point, Wallace was well ahead of his rivals. Despite his incapacity, he continued to poll at least twenty percent of the primary vote in three of the four remaining primaries. It is not difficult to imagine, had Wallace not been shot, that he would have held a significant plurality of both votes cast and delegates elected when the Democrats convened their convention. He would then have been the single most popular candidate, but it is unarguable that he, who had been elected Governor on the slogan “Segregation Now, Segregation Tomorrow, and Segregation Forever,” could not be nominated unless the Democrats were willing to commit political suicide.
In 1976, Gerald Ford, who had succeeded to the presidency upon the resignation of Richard Nixon, won the Republican nomination with 53 percent of the primary vote, compared with California Governor Ronald Reagan’s 46 percent. Incumbency was decisive in Ford’s winning the nomination, but his unconditional pardon of Nixon was probably decisive in his losing the election. Jimmy Carter, originally a little-known candidate, won 39 percent of the Democratic primary vote in a field without strong opponents and failed to win a primary majority outside of the south and near-south until the last primary day in June. Under the circumstances, Reagan, a less unacceptable candidate than Ford, would have been a likely winner if he had been nominated.
In 1980, liberal Massachusetts Senator Ted Kennedy sought to take the nomination away from President Carter. It soon became apparent that the tragedy at Chappaquiddick was as fatal to his chances as Carter’s reputation as a weak president was fatal to his. It is very likely that Reagan would have defeated any Democrat, but it is almost certain that a number of leading Democrats would have fared better than Carter.
It isn’t only losing candidates who demonstrate the failure to choose the best candidate. In 1991, thanks to the ease with which the United States won the Gulf war, President George H.W. Bush’s popularity reached a record high. He looked unbeatable in 1992. One by one, the leading Democrats declined to compete for their party’s nomination. These included Governors Bruce Babbitt of Arizona and Mario Cuomo of New York, Senators Al Gore, Sam Nunn and Paul Simon, Representative Richard Gephardt, and the Reverend Jesse Jackson. All had national reputations and figured in speculation regarding their party’s nomination. When they bowed out, the field contained five candidates who may be fairly characterized as the B team: a little-known radical populist Senator (Tom Harkin); an anti-charismatic moderate Senator without a power base (Bob Kerrey); a former Senator who had been ill, looked ill and was still ill, although he lied about his medical condition, and who prescribed unpopular glum remedies for what ailed the United States (Paul Tsongas); a former California Governor widely caricatured as Governor Moonbeam (Jerry Brown); and the long-time and long-running Governor of a poor and small state who, alone among the candidates, had spent the time and money to organize a campaign for the long haul (Bill Clinton).
When Clinton was battered by charges of draft-dodging and marital infidelity on the eve of the first primary in New Hampshire, and later performed poorly in the early non-Southern primaries, none of his rivals had either the financial resources or popular support to capture the lead, and it was too late for a stronger candidate to enter the race. It is futile to speculate as to which of the party leaders who had earlier declined to run would have been the strongest candidate, but several would almost certainly have been stronger.
Now, more than a year before the presidential election of 2008, public opinion polls reveal that the front-runners for their party’s nomination are Democrat Hillary Clinton and Republican Rudolph Giuliani. Yet, among the leading candidates of both parties, it is likely that Clinton and Giuliani will confront the most opposition and skepticism among party regulars and others inclined to vote for that party.
Clinton’s unfavorable rating in the electorate is approximately equal to her favorable rating. This is due to her critical role in sidetracking universal single-payer health insurance in 1993-94, principled antipathy to having a husband and wife both serve as President (as much as to a father and son serving, but that was not the Democrats’ choice), the negative reaction of many likely Democratic voters to Clinton’s personality, and the largely unexpressed reluctance of many voters to elect a woman. A very large number of voters have a similar unexpressed reluctance to support an African-American candidate. The principal difference is that a large proportion of voters who are reluctant to support a woman are inclined to support a Democrat, while a much smaller proportion of voters who are reluctant to support a black candidate are likely to support a Democrat. If Clinton is nominated, loyal Democrats are likely to suppress their doubts and vote for her, but critical independents are less certain to do so.
Republican partisans will exploit the potential weaknesses of their rivals, including Romney’s Mormonism and McCain’s departures from party orthodoxy on campaign reform and immigration. However, Giuliani’s vulnerabilities are likely to prove more critical: his liberal positions on abortion and gay marriage, his three marriages and his informing his second wife in a press conference of his intent to divorce her, his alienation from his children, and increasing criticism of his public conduct after 9/11, the very event that made him a major national political figure. At least until he becomes an announced candidate, Fred Thompson may be the least unacceptable Republican. Of course, if Clinton and Giuliani are both nominated, they won’t both lose, any more than both Nixon and McGovern could lose in 1972. But many voters will confront an unhappy choice.
The flaw revealed in this, the tenth election in which the presidential candidates will have been chosen by the primary-caucus system, is that the unrepresentative voters who participate in the process choose the one candidate they most favor (and who may win the nomination with only a small plurality of the primary vote), and not the candidate who has the widest support within the party, let alone in the general electorate. It isn’t the only flaw of the primary system, but it is significant enough to undermine any pretense that the process reflects the public’s will.
In 1972, the first year in which the modern primary-caucus system of presidential nomination was decisive, Senator George McGovern won the Democratic nomination because he had won more delegates than any other candidate. In fact, former Vice President Hubert Humphrey, who narrowly lost the 1968 election to Richard Nixon, received slightly more primary votes than McGovern (68,000 more out of a total primary vote of 16 million), but had the support of fewer elected delegates. Each received marginally more than one-quarter of the primary votes.
The rational question for the Democratic Party to have asked was this: which potential candidate would be most likely to maximize the party’s support in the election? It is no reflection on McGovern’s integrity, intelligence or experience to observe that, in light of his being perceived as a very liberal, lesser-known and uncharismatic Senator, he was not that candidate. As McGovern himself observed after his defeat, the worst any Democratic presidential candidate has ever suffered, “I wanted to run for president in the worst possible way – and I did.” Given Humphrey’s strong run four years earlier under adverse circumstances (the chaos at the 1968 Democratic convention was evidence of a bitterly divided party), he was likely to have been a much more popular candidate.
It might have been worse. Until the attempted assassination of Alabama Governor George Wallace on May 15, which crippled him and removed him from the race, he had decisively won the southern primaries of Florida, Tennessee and North Carolina, finishing second to McGovern in Wisconsin and second to Humphrey in Pennsylvania, Indiana and West Virginia. The day after the shooting, Wallace won the Michigan and Maryland primaries. At that point, Wallace was well ahead of his rivals. Despite his incapacity, he continued to poll at least twenty percent of the primary vote in three of the four remaining primaries. It is not difficult to imagine that, had Wallace not been shot, he would have held a significant plurality of both votes cast and delegates elected when the Democrats convened their convention. He would then have been the single most popular candidate, but it is unarguable that he, who had been elected Governor on the slogan “Segregation Now, Segregation Tomorrow, and Segregation Forever,” could not be nominated unless the Democrats were willing to commit political suicide.
In 1976, Gerald Ford, who had succeeded to the presidency upon the resignation of Richard Nixon, won the Republican nomination with 53 percent of the primary vote, compared with California Governor Ronald Reagan’s 46 percent. Incumbency was decisive in Ford’s winning the nomination, but his unconditional pardon of Nixon was probably decisive in his losing the election. Jimmy Carter, originally a little-known candidate, won 39 percent of the Democratic primary vote in a field without strong opponents and failed to win a primary majority outside of the south and near-south until the last primary day in June. Under the circumstances, Reagan, a less unacceptable candidate than Ford, would have been a likely winner if he had been nominated.
In 1980, liberal Massachusetts Senator Ted Kennedy sought to take the nomination away from President Carter. It soon became apparent that the tragedy at Chappaquiddick was as fatal to his chances as Carter’s reputation as a weak president was fatal to his. It is very likely that Reagan would have defeated any Democrat, but it is almost certain that a number of leading Democrats would have fared better than Carter.
It isn’t only losing candidates who demonstrate the failure to choose the best candidate. In 1991, thanks to the ease with which the United States won the Gulf war, President George H.W. Bush’s popularity reached a record high. He looked unbeatable in 1992. One by one, the leading Democrats declined to compete for their party’s nomination. These included former Governor Bruce Babbitt of Arizona, Governor Mario Cuomo of New York, Senators Al Gore, Sam Nunn and Paul Simon, Representative Richard Gephardt, and the Reverend Jesse Jackson. All had national reputations and figured in speculation regarding their party’s nomination. When they bowed out, the field contained five candidates who may fairly be characterized as the B team: a little-known radical populist Senator (Tom Harkin); an anti-charismatic moderate Senator without a power base (Bob Kerrey); a former Senator who had been ill, looked ill and was still ill, although he lied about his medical condition, and who prescribed unpopular, glum remedies for what ailed the United States (Paul Tsongas); a former California Governor widely caricatured as Governor Moonbeam (Jerry Brown); and the long-time and long-running Governor of a small, poor state who, alone among the candidates, had spent the time and money to organize a campaign for the long haul (Bill Clinton).
When Clinton was battered by charges of draft-dodging and marital infidelity on the eve of the first primary in New Hampshire, and later performed poorly in the early non-Southern primaries, none of his rivals had either the financial resources or popular support to capture the lead, and it was too late for a stronger candidate to enter the race. It is futile to speculate as to which of the party leaders who had earlier declined to run would have been the strongest candidate, but several would almost certainly have been stronger.
Now, more than a year before the presidential election of 2008, public opinion polls reveal that the front-runners for their parties’ nominations are Democrat Hillary Clinton and Republican Rudolph Giuliani. Yet, among the leading candidates of both parties, Clinton and Giuliani are likely to confront the most opposition and skepticism among party regulars and others inclined to vote for their party.
Clinton’s unfavorable rating in the electorate is approximately equal to her favorable rating. This is due to her critical role in sidetracking universal single-payer health insurance in 1993-94, principled antipathy to a husband and wife both serving as President (a sentiment as reasonable as opposition to a father and son both serving, though that choice was not the Democrats’ to make), the negative reaction of many likely Democratic voters to Clinton’s personality, and the largely unexpressed reluctance of many voters to elect a woman. A very large number of voters have a similar unexpressed reluctance to support an African-American candidate. The principal difference is that a large proportion of voters who are reluctant to support a woman are nevertheless inclined to support a Democrat, while a much smaller proportion of voters who are reluctant to support a black candidate are likely to support a Democrat. If Clinton is nominated, loyal Democrats are likely to suppress their doubts and vote for her, but critical independents are less certain to do so.
Republican partisans will exploit the potential weaknesses of their rivals, including Romney’s Mormonism and McCain’s departures from party orthodoxy on campaign reform and immigration. However, Giuliani’s vulnerabilities are likely to prove more damaging: his liberal positions on abortion and gay marriage, his three marriages (he informed his second wife of his intent to divorce her at a press conference), his alienation from his children, and increasing criticism of his public conduct after 9/11, the very event which made him a major national political figure. At least until he becomes an announced candidate, Fred Thompson may be the least unacceptable Republican. Of course, if Clinton and Giuliani are both nominated, they won’t both lose, any more than both Nixon and McGovern could lose in 1972. But many voters will confront an unhappy choice.
The flaw revealed in this, the tenth election in which the presidential candidates will have been chosen by the primary-caucus system, is that the unrepresentative voters who participate in the process choose the one candidate they most favor (and who may win the nomination with only a small plurality of the primary vote), and not the candidate who has the widest support within the party, let alone in the general electorate. It isn’t the only flaw of the primary system, but it is significant enough to undermine any pretense that the process reflects the public’s will.
Friday, August 17, 2007
ARE SUSPECTED TERRORISTS ENTITLED TO "DUE PROCESS OF LAW"?
On August 16, Jose Padilla was found guilty, along with two co-defendants, of conspiracy to “murder, kidnap and maim” people in a foreign country. All three could be sentenced to prison for life. The case of Jose Padilla was brought to public attention by a number of events beginning more than five years ago.
Padilla, a native-born American citizen, was arrested in May 2002, taken a month later to the Navy brig in South Carolina, kept without human contact, light, a clock or a mirror, and interrogated without an attorney for twenty-one months before he was permitted to speak to counsel; he was then held in the brig for another twenty-two months before being transferred to a civilian prison in Miami, where he made his first court appearance on January 12, 2006. The extraordinary length of time between his arrest and his court appearance is a gross violation of the fundamental right of habeas corpus (literally, “that you have the body”), the right to be brought before a judge or court so that the state may not keep an individual in unlawful restraint.
When Padilla was apprehended at O’Hare International Airport in Chicago at the end of a flight that began in Pakistan, he was carrying a small amount of money, a cell phone and e-mail addresses for Al Qaeda operatives. President Bush had him designated an “enemy combatant,” and Attorney General John Ashcroft disclosed that he was suspected of planning to detonate a radioactive “dirty bomb” in an American city. More than a year and a half after he was detained, the Second Circuit Court of Appeals in New York ordered his release from military custody and permitted the government, if it chose, to try him in a civilian court. That ruling was suspended when the Bush Administration appealed to the U.S. Supreme Court.
A half-year later (more than two years after his arrest), the Justice Department released details of admissions Padilla had allegedly made during interrogations concerning his involvement with top Al Qaeda leaders, including the “dirty bomb” plan and another plot to fill apartments in high-rise buildings with natural gas and detonate them with timers. Nearly eighteen months later, Padilla was added to an existing indictment in Miami charging that he was part of a North American terror support cell that conspired to “murder, kidnap and maim” people overseas. No mention was made of the “dirty bomb” plot or any other earlier allegations. Fourth U.S. Circuit Court of Appeals Judge J. Michael Luttig criticized the Administration for using one set of facts to justify holding Padilla without charges and another set to persuade a Florida grand jury to indict him. The Supreme Court later overruled the Fourth Circuit and allowed the military to transfer Padilla to face the new criminal charges.
After a three-month trial and one day of jury deliberations, Padilla and his co-defendants were found guilty of the charges brought against them. During the trial, Padilla’s lawyers unsuccessfully sought to have him declared incompetent to stand trial because of the consequences of the torture he had suffered in the military brig. All evidence concerning his military confinement was barred from the trial, as was any reference to the “dirty bomb” accusations. The government said that it had received that information by questioning other terrorism suspects abroad, and federal rules of evidence prohibit or limit the use of information obtained during such interrogations.
Padilla’s co-defendants were two men of Middle Eastern descent, one of whom he had met before. The three were charged with belonging to a terrorism support cell that provided money, recruits and supplies to Islamic extremists. The government had recorded voluminous messages in which, it charged, his co-defendants used code words to support violent jihad. Padilla did not participate in any of these messages. The government also played wiretapped calls in which the two co-defendants discussed a television interview with Osama bin Laden. There was no evidence that Padilla had seen or discussed the interview. Trying Padilla along with the other two men inextricably linked him to them, but the only evidence linking Padilla to Al Qaeda was his name and six fingerprints on an application to attend an Al Qaeda training camp in Afghanistan in 2000.
Nothing in this summary account of the incarceration, interrogation or trial of Jose Padilla is offered in his defense. On the record, Padilla was a dangerous man. He had been a member of a street gang, was implicated in a murder when he was 13 and confined as a juvenile offender, and was later arrested in Florida in a road-rage shooting incident and spent a year in a Florida jail. It is plausible, if not conclusively proven, that his closeness to Al Qaeda signified a willingness to engage in acts of murder, kidnapping and maiming. It is possible, although no evidence to this effect has been introduced into any court of law, that Padilla participated in a plot to set off a “dirty bomb.” It is possible that he was capable of the most horrendous terrorist acts against innocent people.
If all this were true, the question would remain: Has justice been done? Can a suspected criminal receive justice if he is without human contact or light or basic information in a military prison? Can a suspected criminal receive justice if he has no access to legal counsel for two years? Can a suspected criminal receive justice if his allegations of abusive treatment are barred from his trial because the results of illegal interrogations conducted in prison may not be introduced into evidence? Can a suspected criminal receive justice if he is incarcerated for five years on charges regarding which no evidence has been introduced and that are totally discarded when he is brought to trial?
The Fifth Amendment to the U.S. Constitution states that no one (citizen or non-citizen) shall “be deprived of life, liberty, or property, without due process of law.” The Sixth Amendment states: “In all criminal prosecutions, the accused shall enjoy the right to a speedy and public trial…and to be informed of the nature and cause of the accusation;…and to have the assistance of counsel for his defense.” There are no exceptions to these rights.
The rule of law does not apply less to the worst of men than it does to the best. Whatever the extent of Padilla’s guilt, justice has not been done.
Saturday, August 11, 2007
CAN CONGRESS VALIDATE THE PRESIDENT'S USE OF WARRANTLESS WIRETAPPING?
On August 4, before adjourning for a month-long recess, Congress quickly ratified, 60-28 in the Senate and 227-183 in the House, President Bush’s once-secret program (exposed by The New York Times in December 2005) allowing the National Security Agency to eavesdrop, without obtaining a court warrant, on telephone and e-mail conversations between people in the United States and “a person reasonably believed to be located outside the United States.” The Bush Administration has assured us that we need have no fears regarding the ambiguity of that standard. Attorney General Alberto Gonzales, who can now issue surveillance orders without judicial approval, but who cannot remember meetings he has attended or the subjects discussed, is required by the new statute to report “incidents of noncompliance.”
The requirement that one of the parties is “reasonably believed” to be outside the U.S. implies the assumption that at least one person whose communication is being intercepted is not a U.S. citizen. Of course, this may not be true. It doesn’t matter. The Constitution contains no claim that the government may violate the rights of non-citizens or, for that matter, deny non-citizens due process of law or incarcerate them indefinitely without charges, trial, legal representation, and the presentation of evidence. American citizenship is not a prerequisite for the enjoyment of basic rights in the United States.
Congress’s rubber-stamping of the President’s long-standing independent policy, bearing the Orwellian title of the Protect America Act of 2007, may have been in response to the possibility that several federal courts might conclude that the NSA’s surveillance program violated the Foreign Intelligence Surveillance Act. (As a political compromise adopted to appease some of the doubting Democrats, the new law expires in six months, a curiously brief period in which to protect America.)
For all practical purposes, the NSA can now intercept your communications at any time and without court approval. Based on American history and human nature, nothing can be more certain than that zealous spying will invade private, protected speech and conduct. The Senate Judiciary Committee has repeatedly asked the Bush Administration to provide legal justification for its warrantless wiretapping program, most recently subpoenaing the National Security Agency, the Justice Department, the White House, and the Office of the Vice President for information, setting August 20 as the return date for the subpoenas. No federal officials have yet responded.
To allow the government to invade our privacy without a demonstration in court of a vital security need is to denigrate the right of privacy. That this right is not mentioned in the Constitution is a bogus argument against it: neither are the right to marry, the right to vote, political parties, judicial review, and other rights and institutions that we consider fundamental in American government.
In 1890, the Harvard Law Review published an essay, “The Right to Privacy,” by Samuel D. Warren and Louis D. Brandeis, that became the classic statement of privacy as a fundamental right. Warren and Brandeis could not, of course, have imagined the importance of the then recently invented telephone, much less of computer communications a century later. Nevertheless, their claim for a right of privacy remains valid for present-day communications. These very brief extracts demonstrate the tone and relevance of the article:
“The common law secures to each individual the right of determining, ordinarily, to what extent his thoughts, sentiments, and emotions shall be communicated to others….The protection afforded to thoughts, sentiments, and emotions, expressed through the medium of writing or of the arts, so far as it consists in preventing publication, is merely an instance of the enforcement of the more general right of the individual to be let alone….The principle which protects personal writings and all other personal productions, not against theft and physical appropriation, but against publication in any form, is in reality not the principle of private property, but that of an inviolate personality….
“We must…conclude that the rights, so protected, whatever their exact nature, are not rights arising from contract or from special trust, but are rights as against the world; and…the principle which has been applied to protect these rights is in reality not the principle of private property, unless that word be used in an extended and unusual sense. The principle which protects personal writings and any other productions of the intellect or of the emotions, is the right to privacy, and the law has no new principle to formulate when it extends this protection.”
Thirty-eight years later, now-Justice Louis Brandeis applied the doctrine to the interception of telephone messages: “The evil incident to invasion of the privacy of the telephone is far greater than that involved in tampering with the mails. Whenever a telephone line is tapped, the privacy of the persons at both ends of the line is invaded, and all conversations between them on any subject, and although proper, confidential, and privileged, may be overheard. Moreover, the tapping of one man’s telephone line involves the tapping of the telephone of every other person whom he may call, or who may call him. As a means of espionage, writs of assistance and general warrants are but puny instruments of tyranny and oppression when compared with wire tapping.”
The right of privacy was finally acknowledged in 1965, when the Supreme Court struck down a Connecticut law prohibiting the possession, sale and distribution of contraceptives to married couples. Justice Douglas’s opinion for the Court awkwardly placed the right in the “penumbras” and “emanations” of a number of Bill of Rights guarantees. Justice Goldberg relied in part on the Ninth Amendment’s reference to “other rights retained by the people.” Justice Harlan argued that the liberty clause of the Fourteenth Amendment barred state conduct inconsistent with a government based “on the concept of ordered liberty.” It is clear that the Fourth Amendment’s prohibition of “unreasonable searches and seizures” was prompted by the opposition to warrantless searches conducted in the colonies by the British Empire.
Two years later, the Supreme Court concluded that the Fourth Amendment’s prohibition of “unreasonable searches and seizures” applied to electronic surveillance as well as physical searches. In 1972, the Court held that the Fourth Amendment required a court order for domestic surveillance. The right of privacy has become so well established that it is difficult to imagine any present public figure, except for never-Justice Robert Bork, denying that the American people possess that right.
Were a Justice Brandeis sitting on the Supreme Court today, one can imagine how he would characterize the improper interception of computer and telephone communications. According to a report issued by the Administrative Office of the United States Courts, state and federal courts authorized 1,773 interceptions of wire, oral, and electronic communications in 2005. Only one application was denied by the courts. It was revealed in 2005 that the National Security Agency had purchased over 1.9 trillion call-detail records of phone calls made after September 11, 2001. The almost-certain result of the Protect America Act will be an accelerating invasion of the right of privacy.
The right to privacy, as formulated by Warren and Brandeis and later incorporated into constitutional principle, does not deny the right of the government to obtain a valid court warrant in support of a vital national interest. It rejects the unprincipled and indiscriminate use of warrantless wiretapping and surreptitious recording of private conversations in which this Administration has long engaged. Those who continue to believe in the right of privacy must condemn the cowardice of Congress in legislating its support for the continued unguarded interception of private communications without a court order. What was unconstitutional when undertaken by President Bush remains unconstitutional when condoned by Congress.
Sunday, August 5, 2007
WHAT WENT WRONG IN IRAQ?
Why did we invade Iraq? President Bush, Vice President Cheney, Secretary of State Powell, and Secretary of Defense Rumsfeld repeatedly stated, and professed to provide proof, that Iraq had nuclear, chemical and biological weapons of mass destruction. This was false. Such WMDs as Iraq had possessed had been destroyed years earlier. Iraq’s alleged effort to buy fuel for nuclear weapons was based on a discredited forgery. United Nations inspectors found no evidence of any efforts to establish new weapons programs.
President Bush and others purported to provide “proof” that Iraq was closely allied to Al Qaeda in the 9/11 attack upon the United States. This too was false. Fifteen of the nineteen hijackers came from Saudi Arabia; none from Iraq. No evidence links the hijackers with Iraq. Alleged meetings between Iraqi and Qaeda representatives never took place, nor was there any cooperation between them. On the contrary, Saddam Hussein, a secular dictator, and Al Qaeda, a religiously fanatical movement, were antagonistic to one another.
President Bush stated, and purported to provide proof, that Iraq was planning aggression against its neighbors and the United States. This also was false. Iraq’s dictatorial government continued to behave brutally toward its domestic critics and minorities, but it engaged in no aggressive action abroad after its defeat and retreat from Kuwait in the 1991 war. There is literally no evidence that Iraq was either planning or in a position to engage in aggression abroad.
Let us assume that every one of the cited errors was a mistake in judgment and not a deliberate deception. A democratic nation does not have the right to engage in a war that might result in the loss of tens of thousands of lives (as this war has) based on a single basic mistake, let alone a series of mistakes that utterly disregarded warnings and corrections. Given the abundant evidence rebutting these claims, which President Bush and Vice President Cheney continue to repeat to the present day, most American and world opinion has concluded that the Bush Administration was motivated by goals that disregarded reason and truth.
How did we believe that we would win? President Bush and members of his Administration predicted that we would be greeted by the Iraqis as liberators and that flowers would be strewn in the path of American tanks. They were wrong. Military victory was greeted by widespread looting (which American forces did virtually nothing to stop), and improvised explosive devices (IEDs) were and continue to be placed in the path of American tanks, resulting in a majority of the more than 3,600 American military deaths suffered thus far.
President Bush and his Administration believed that peace and security could be achieved with an undersized and underequipped volunteer army. They were wrong. Inadequate armor on vehicles resulted in the needless loss of many lives. Members of the regular armed forces and the National Guard who never imagined going to war have been sent back for repeated and extended tours of duty. When Army Chief of Staff Eric Shinseki told Congress before the war began that not enough troops were being sent to Iraq, he was publicly derided by Defense Secretary Rumsfeld. When it became evident that Shinseki was right, Rumsfeld alibied that you go to war with the army you have. He was reckless. If you choose to start a war, you shouldn’t do so until you have the army you need.
President Bush and his Administration believed that de-Baathification (that is, the removal of members of Saddam Hussein’s ruling Baath party from the Iraqi army and police forces) would produce loyalty to a successor regime. They were wrong. De-Baathification led former Baathists to ally themselves with other enemies of the occupation and left the new government without effective internal security.
Rumsfeld also famously said “stuff happens,” which is the verbal equivalent of responding to unhappy, unexpected consequences with a shrug of the shoulders. It is the solemn responsibility of the nation’s leaders to anticipate and prepare for the widest range of stuff that may happen before the nation engages in a war of its choosing.
How do we plan to get out? President Bush sought to create a government that genuinely unifies the mutually suspicious and often hostile Shia, Sunni and Kurdish populations. It has not happened. He sought to create a successful central government through free elections, first under American-sponsored puppets with no credible base of support in Iraq, and finally under the leadership of Nouri al-Maliki. Understandably, the feeble Iraqi government is responsive to internal interests that are not sympathetic to American intervention. There has been little progress toward the creation of a multi-sectarian state.
The failure of the United States to understand internal conflicts, provide security and protect borders resulted in the easy entry of members of Al Qaeda (conspicuously absent prior to the American invasion) and other forces bent upon increasing chaos, encouraging anti-American sentiment, and fomenting civil war and lawlessness throughout Iraq.
What is the reality? America is hated in Iraq as anti-Muslim because of its occupation and its ignorance and disregard of Muslim customs and beliefs. It is impossible to calculate the emotional and political consequences of the callous brutality of Abu Ghraib; the rendition of Muslims (whom the United States refuses to acknowledge to be prisoners of war protected by the Geneva Conventions) to unknown prisons in undisclosed countries, where their harsh treatment will not risk examination by American courts; the accidental or criminal acts of American servicemen who kill innocent civilians; and the ignorance and insensitivity of much American behavior, as in a photograph of American servicemen sitting in a Muslim mosque wearing their combat boots.
Investigative journalists estimate that the followers of Al Qaeda constitute between five and ten percent of the insurgent forces, although they represent a larger proportion of those who engage in the most successful violent opposition to the American occupation. No one can know the consequences of total American withdrawal. Partial or piecemeal withdrawal is unacceptable, because the vulnerability of American forces would be increased and the symbolic negative significance of the American presence would not be reduced. No credence should be given to the predictions of those who have until now been wrong about everything connected with the causes and consequences of the American occupation. Some responsible scholars are persuaded that things will get worse – not better – as long as resentment of America’s presence continues to be felt.
Why have we failed? When Osama bin Laden was believed to be cornered in Afghanistan, we diverted our military forces from there to the war we were beginning in Iraq, making his escape into Pakistan easier. We betrayed our most cherished ideals in the humiliating treatment of prisoners at Abu Ghraib and the brutal actions of servicemen in widely publicized incidents of promiscuous slaughter of Iraqi soldiers and civilians. When General Antonio Taguba submitted his detailed report recounting the complicity of government officials in imposing abusive interrogation policies at Abu Ghraib, he was mocked and shunned by the Pentagon and forced to retire early. We violated the basic precepts of habeas corpus and international law in denying legal counsel to prisoners at Guantanamo and elsewhere, and in subjecting them to humiliating treatment and torture. We have turned our backs on Iraqis who risked their lives working with the American occupation forces but have been denied immigration into the United States. In all these respects, we have alienated millions of people throughout the world who once held the United States in the highest regard for its love of liberty and justice.
What can we do? We can get out. Whenever the United States finally decides to leave Iraq, defenders of our failed policies will declare: If only we had stayed…. (This is what they said when we left Vietnam.) But if we stay, and thousands more Americans and tens of thousands more Iraqis die, those defenders would still proclaim that they could succeed if only…. The truth is that the sooner we leave, the sooner we reduce the killing and maiming of Americans, as well as the distrust of America which now extends to most people in most nations. The sooner we leave, the sooner we can participate in international efforts to promote peace and security in the Middle East. The sooner we leave, the sooner we can divert hundreds of billions of dollars from the conduct of an unending war that enriches private corporations and corrupt foreign politicians to the fight against poverty and disease in the United States and throughout the world. The sooner we leave, the sooner the United States can regain its place of pride among the free and liberty-loving nations of the world.