Friday, February 29, 2008

From Kosovo to Gaza and Beyond III

It seems to me that a “baseline of expectations” has to be developed that can serve as a general guide for the international community when it is confronted with demands to allow the formation and to fully accept the sovereignty of a new country.

I suggest that a pre-condition for even considering the creation of a new country in the 21st century is the absence of armed conflict between the ruling class or its agents (e.g., the Sudanese government and the Janjaweed) and those among the ruled who believe they suffer discriminatory treatment – whether from unofficial or official policy – on the basis of some common distinguishing characteristic.

This pre-condition would seem to introduce a bias against the “method” by which so many countries achieved self-rule in the 20th century: either outright through armed rebellion, or by drawing regional or even global powers into mediating settlements that provided for separation but not for resolution of the underlying grievances that first impelled the resort to arms – grievances that subsequently resurfaced in support for rebellions in adjacent countries. But the ever-increasing lethality of military “solutions,” combined with the ever-increasing time and resources required to reconstitute an area devastated by war, suggests that – as a general rule – the price of nation-state disaggregation via warfare is one the international system as presently constituted can no longer afford.

Does not a prohibition against armed conflict as a path to nationhood condemn an oppressed group to unending subjugation? Not if the international community employs tools it has possessed but did not use in the last century or tools that have come into force in the current century.

In the former category is the Universal Declaration of Human Rights, adopted by the UN General Assembly on December 10, 1948, but largely ignored during the 45-year Cold War competition between the superpower military alliances.

In the latter category are two advances that give force to the Universal Declaration.

One is the creation of the International Criminal Court (under the Rome Statute) as a permanent international venue with jurisdiction over war crimes and crimes against humanity committed by individuals, including heads of state. The second is the evolution of restraints on the previously untrammeled sovereignty of a government over its internal affairs – an evolution from a right of the international community to intervene to protect a segment of a country’s population from depredations of its government to a responsibility to intervene in such cases. It is this latter rubric – broadly defined to go beyond the familiar role of UN or regional “peacekeeping” missions that merely react to violence – that provides justification for international involvement in the first instance.

What is missing from this structure is a regularized methodology and a permanently constituted international quasi-judicial organization empowered to hear petitions for devolution and to issue binding arbitration where necessary.

But a greater deficiency exists, one traceable to the hodge-podge evolution of a system less concerned with the loss of individual rights and the usurpation of civil liberties than with the population control required to keep rulers in power. This deficiency is both the weakness and the strength of the process of building a strong consensus for globalization. Beyond a point, devolution becomes economically, diplomatically, and socially unsustainable as each subgrouping – and then each individual – concentrates on its own situation. In contrast, a more broadly based constituency anchored in an accelerating process of globalization offers the opportunity for greater returns on invested human energy and resources moving across all artificial barriers.

Whereas the chief problems of the past grew from the failure of governors to govern, the task today is very nearly the reverse: devoting more effort to integrating distinctive groups not only within existing national boundaries but also into stronger regional organizations that can search for solutions to the growing list of trans-national challenges stemming from the past narrow interests of individual countries.

In short, like it or not, we are in an age of escalating globalization in which even the most powerful countries can no longer be allowed to place their unilateral interests before those of the rest of the world. This may have been the pathway to power in the past, but attempted or perpetuated today it will be the road to ruin – for everyone.

Wednesday, February 27, 2008

Kosovo to Gaza and Beyond II

Picking up from Monday, what “entitles” a group of people to declare themselves independent and demand – and get – recognition as a distinct ethnic entity and a “homeland” of their own?

Earlier this month, 2 million ethnic Albanians living in the province of Kosovo – some 90 percent of the entire Kosovo population – declared their independence from Serbia, which is majority ethnic Slav. The declaration is unlikely to be reversed, even though other ethnic-Slav-majority countries are unlikely to extend diplomatic recognition to the new country.

Or consider Armenia, one of the oldest civilizations in the world. Worldwide, there are approximately 8 million ethnic Armenians, but most of these live outside the small modern territory in the Caucasus called Armenia. Only an estimated 20 percent of the total Armenian population still resides in that small part of their ancient homeland.

If a distinctive cultural, linguistic, religious, artistic, and racial group of 2 million in one place or 8 million world-wide can have their own homeland, a fair question is why cannot 15 million? That’s the general consensus about the number of ethnic Kurds living in Iran, Iraq, Turkey, and Syria.

And there is the seemingly intractable confrontation in Palestine.

More later.

Monday, February 25, 2008

Kosovo to Gaza and Beyond

After Kosovo, Can Gaza be Far Behind?

Last week, in a widely anticipated move that had the approval, if not the blessing, of the United States, Canada, and every country in Europe except Russia and Serbia, the governing authorities in Pristina, Kosovo, declared their independence from Serbia.

This is not the first time in modern history that Kosovo has made a bid for freedom. In 1981, Yugoslavian armed forces put down the stirrings of rebellion, only to see the complete collapse of the Yugoslav state in the 1990s. As part of this disintegration, the majority ethnic Albanians in Kosovo declared their independence from Slav-dominated Serbia, which by then was ruled by Slobodan Milosevic, whom many regard as responsible for the collapse of the Yugoslav state.

Relations between Belgrade and the Kosovo capital of Pristina went steadily downhill until 1998, when attacks by the Kosovo Liberation Army drove Milosevic to direct an increasingly punitive military and police response. By then, the world – particularly the United States, Canada, and Western Europe, none of which intervened while all sides in the Bosnian civil war committed war crimes against each other – concurred that “atrocities” were being committed against ethnic Albanian Kosovar civilians and members of the Kosovo Liberation Army.

After eleven weeks of aerial attacks that destroyed significant parts of the Serb infrastructure and government facilities, Kosovo became a “ward” of the UN. After nine years of fruitless negotiation, the Kosovo parliament – at least those present in Pristina – followed Prime Minister Hashim Thaci and voted February 17th to declare the province’s independence.

The reaction in Belgrade was sharp. Serbia’s Prime Minister Vojislav Kostunica vowed that his country would follow every course open to it short of armed intervention to compel the break-away province to reverse course.

This declaration comes one month after another group – or more accurately, sub-group – rebelled against punitive conditions imposed on it by an occupation force. Last month, the 12 kilometer “border” wall erected by Israel along the Egyptian-Gaza Strip border was breached by Hamas militants. The floodgates were opened, relieving in the short-term the pent up emotions of those living in the Gaza Strip and complicating Egypt’s stance and standing in the region.

The “peace talks” between Palestinian Authority President Abbas and Israeli Prime Minister Olmert further complicate this situation and differentiate it from Kosovo.

More later.

Friday, February 22, 2008

Back From Africa

In 1864, Dr. David Livingstone, who had captured the imagination of the British with stories of his explorations in central Africa, returned to the continent to seek the source of the River Nile. Months passed into years with no communication from the explorer. Finally, after seven years of silence, public pressure to find what had happened to Livingstone became overwhelming. An expedition of 2,000 men, financed by the New York Herald and led by the British-born, U.S. Civil War (Confederate) veteran turned journalist, Henry Stanley, set off into the interior of the “Dark Continent” to solve the mystery. Eight months later, in a tiny village on the shore of Lake Tanganyika, Stanley greeted the man he had been sent to find with the question, “Dr. Livingstone, I presume?”

George Bush first visited Africa July 7-12, 2003. He had “lost” seven months from the original timetable for the visit – January 2003 – because of the March 2003 Iraq invasion. Africa’s security problems in 2003 were largely intra-national, albeit with repercussions that spilled across national boundaries. Why did he go to the five countries selected in 2003? Good question, considering what was going on.

By the time Bush began his trip, the worst of the fighting in eastern Congo – where Bush did not go – had ended, largely because France sent approximately 1,000 soldiers into eastern Congo to suppress the Congolese rebels and foreign guerrillas that used this area as their base camps. The French deployment breathed life into the cease fire arrangement brokered by South Africa.

French troops were also on the ground in Cote d’Ivoire along with British soldiers in Sierra Leone while U.S. warships stood off the coast of Liberia as that country’s autocratic president, Charles Taylor, came under pressure to step down and go into exile.

Senegal was in the limelight as the first sub-Saharan country to accept counter-insurgency training by U.S. personnel under the Clinton Administration’s Africa Crisis Response Initiative. This was the one country on the trip itinerary with which the U.S. had a military connection.

Nigeria, always tumultuous because of the great economic disparity between those who benefit from its oil and those shut out, was also sharply divided between the Islamic north and the Christian/animist south. The U.S. interest here was (and is) oil.

Of the five countries visited, Botswana was the harbinger of one of the more frightening “security dangers” looming on the horizon – a country collapsing not from the weight of or the result of arms but from disease – in Botswana’s case, HIV/AIDS.

On his just completed trip, Bush visited Benin, Ghana, Liberia (under a democratically elected president), Rwanda, and Tanzania. The latter two countries abut nations being torn by civil strife. Tanzania’s neighbor Kenya – a steadfast “ally” of the U.S. against al-Qaeda in Africa – has been wracked by violence stemming from a rigged election in late December. Rwanda remains susceptible to incursions from dissidents still using east Congo as a base area – just as happened five years ago.

What is striking about this trip is that none of the countries are military “power houses” even in African terms. On the other hand, the trip was billed as one that would highlight administration successes. Military “successes” on the continent would be hard if not impossible to find, given what is happening (still) in Sudan, Ethiopia-Eritrea, Somalia, Cote d’Ivoire, Sierra Leone, and Nigeria as well as the strife going on in those nations mentioned in the paragraph above.

In fact, some pundits see the trip as more of a morale boost for the president than a trip that will strengthen the U.S. hand on the continent. One thing is for sure: no one will ask, as Henry Stanley did in 1871, “President Bush, I presume?”

Wednesday, February 20, 2008

Overheard on the Circuit

Studies of the political preferences and leanings of the military, active duty and reserve/National Guard, have noted a steady and growing alignment with conservative-supported principles and Republican-backed programs for implementing those principles.

The phenomenon is hardly new, for most military establishments are instinctively in favor of the status quo. But until the coming of the all-volunteer military, the point of imbalance was closer to the middle of the scale because there was less self-selection under conscription.

Moreover, according to a September 2004 AP/MSNBC report, eligible military voters are more inclined to exercise the franchise (70 percent in 2000) than the general public (51 percent in 2000). And one of the main reasons that the figure for the military was not higher was the failure of the military postal system to identify where soldiers were stationed and deliver the voting packets to them in time to mail them “back home.”
This rightward tilt, some are beginning to suggest, may reverse itself sharply this year, largely because of Iraq and Afghanistan. Soldiers are tired, worn down by repeated deployments to war zones. Equipment is wearing out, and the improved blast-resistant models have been slow to reach the front lines. The military family structure is under severe siege. Horror stories about mismanaged health care for the physically wounded and mentally traumatized continue to surface regularly.

And the swing, if it does come, will be sharpest among the active-duty officer corps from the generals and admirals on down who feel the Republicans lied to them in 2004 about Iraq, Afghanistan, and the whole global war on terror. At the same time, the officers still are leery of the Democrats who are seen as anti-military liberals.

The question is whether or not they can get beyond the bumper-sticker slogans of the past, for if not, they may find themselves in the proverbial political wilderness.

Monday, February 18, 2008

Reason in the American Experiment

The first book is Freethinkers: A History of American Secularism.

It was a gift from about seven months ago, and so far I have managed to read about 50 pages. Written by Susan Jacoby, Freethinkers traces the role of reason in the founding of the American experiment. Indeed, the book makes the case that the United States was the first country to base the legitimacy of its government on the power of human reason rather than on the power of a deity or the “mandate of heaven.”

It is easy to overlook this point for a number of reasons. For example, last year the nation – and the state of Virginia in particular – celebrated the 400th anniversary of the founding of the first permanent English settlement in North America at Jamestown. But most history courses then switch to and remain focused on New England: the 1620 voyage of the Pilgrim Fathers aboard the Mayflower and the landing at Plymouth Rock; the 1630 voyage of the Arbella to Massachusetts Bay; and the sermon penned sometime during this latter voyage in which John Winthrop likened the Puritan experiment to a “city upon a hill,” the New Jerusalem that would draw to itself all those whom God had destined to be saved.

Moreover, the quintessential U.S. national holiday is not July Fourth, Independence Day. Every country has a date on which a watershed event or declaration is endowed with this distinction – the very latest to do so occurring just yesterday when Kosovo declared itself independent of Serbia. The supremely “American” holiday is Thanksgiving, whose origin and tradition are intimately bound to Puritan life, work, and belief that they were God’s newly chosen people.

The list of influential Puritan divines is long: John Cotton, Thomas Hooker, John Eliot, Increase Mather, and Cotton Mather. They were influential in part because of the long hours they spent reading, writing, and – ironically – reasoning about God, man, and salvation. And because they were generally the most learned among the colonists, their views and opinions, expressed in pamphlets, sermons, and the various print media of the age, commanded the attention of officials while pushing aside less erudite writings in the northern colonies.

This dominance ended in the first decades of the 18th century. By the time of the Great Awakening of the 1740s, not even the preaching and writings of Jonathan Edwards and other revivalists were powerful enough to stem the rationalist tide that celebrated the ascent of reason in the economic, political, and eventually the social advances of the era.

This past Friday, Bill Moyers’ guest on his weekly PBS program, Bill Moyers Journal, was the same Susan Jacoby, whose latest book, The Age of American Unreason, was released last week. In it she describes what can best be characterized as the reversal – or the attempted reversal – of the application of reason to the challenges of the modern world and to modern U.S. political and social institutions. The ferment of the 18th century involved the integration of ideas across a growing number of disciplines by men and women willing and able to challenge “received wisdom” by thinking critically from premises and assumptions through analysis to conclusions.

The ferment of the 21st century seems to be in the opposite direction: moving away from examining the major issues of the day and seeking resolution thereof – which can demand concentrated thought and sometimes radical changes in context – and settling for the “sound bite” or “bumper sticker” kind of knowledge that more and more passes for “informed opinion.”

The Puritan divines were, above all else, teachers who had a powerful message. But this New Jerusalem bestowed on those who came to its shores the opportunity to rise above both the content and the processes of the Old World, steeped as they were in belief, in favor of fact (content) and investigation (process).

This is what the country lacks today – men and women in public life and in the public eye who are teachers, who can demand of the people what is well within their capability to achieve if only they are challenged to look beyond themselves.

If you missed the program, it is available on the PBS website.

Friday, February 15, 2008

On Money and Real Estate

At one time, asking a real estate agent to name the three most important considerations affecting their profession would invariably draw the stock answer: “Location. Location. Location.”

With the sub-prime mortgage financing debacle and the collapse of the U.S. housing market still rippling through economies and contributing to sharp declines in stock market valuations worldwide, that glib answer may no longer be so readily on the tip of the tongue. Indeed, Congress heard again this week from Federal Reserve Chairman Ben Bernanke that 2008 will see sluggish growth.

The administration’s answer – one supported by Congress and the Federal Reserve Board – to preclude these accumulating pressures from coalescing into recession is to give every tax payer a $600 rebate (couples filing joint returns are eligible for up to $1,200) with additional sums given to parents for minor-age children.

Not to be outdone, the Army has just unveiled a new recruiting incentive that also is – for now – dependent on “location, location, location.” Five test markets in New York, Alabama, Washington State, Illinois, and Georgia will experiment over the next six months with awards of up to $40,000 that can be applied toward purchasing a house or starting a business when a soldier successfully completes his or her service obligation.

Whatever one thinks of this particular program, the tradition established by the post-World War II GI Bill of Rights of helping military personnel transition successfully into civilian occupations has served both as an important bridge for those eligible for the program and as a source of energy for the economy – i.e., every dollar put into the program has returned seven dollars to the economy.

Now most recruits don’t know – and probably don’t much care – about the multiplier effect of GI benefits on the national economy. What they see is a guaranteed government windfall in their post-service future (which is also “guaranteed” by the shared youthful sense of indestructibility).

What is increasingly and appallingly apparent, however, is that this White House doesn’t seem to care either. It is quite ready to throw money or other special benefits at potential recruits to entice them to join up, but when the recruits come back from as long as 15 months of active anti-insurgency combat with this youthful sense of invincibility – and frequently their bodies and psyches as well – shattered, the institutional services and critical support structures simply have been underfunded if not unfunded.

(One cannot help but draw a parallel between the administration’s priorities and actions when the issue is the moral obligation to defend and care for those unable to fend for themselves – e.g., unborn children and those severely wounded while serving their country.)

This failure is immediately evident from a cursory look at the President’s Fiscal Year 2009 Budget request sent to Congress February 4th, particularly the State Department (DoS) and Foreign Affairs budget, the Department of Defense (DoD) budget, and the Department of Veterans Affairs (VA) budget. Taken together, these three constitute almost the entire “people” life-cycle for engaging with foreign nations on their turf: preventing war, removing the causes of war, fighting wars when absolutely necessary for the nation’s survival, and caring for all those who served and their families when war is over.

Administration budget documents show a $4.93 billion increase in State Department 2009 discretionary outlays above the projected 2008 total outlays of $34.40 billion. But this increase, as welcome as it is in such areas as fighting HIV/AIDS and malaria or in funding international development banks and UN peacekeeping, pales alongside DoD’s one-year increase of $68.11 billion to $651.16 billion. For the third leg of the soldier support stool, the VA, the White House proposed a $5.25 billion increase from the $86.65 billion allocated for 2008. Of this increase, $4.88 billion will go to medical care.

Rough addition gives $130 billion that the administration is willing to spend in 2009 on non-military means and methods in dealing with other countries and international organizations, for taking care of those wounded in war (always unknowable when starting a war), and for fulfilling the promises made to new recruits when they signed up for military service. But what it will spend on preparing for, recovering from, and fighting wars in 2009 is five times the spending proposed for DoS and VA.
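The “rough addition” and the five-to-one ratio can be checked directly. Here is a quick sketch in Python using only the figures quoted above (the variable names are mine, for illustration):

```python
# All values in billions of dollars, as quoted in the FY2009 budget discussion above.
dos_2008_outlays = 34.40   # State Department projected 2008 total outlays
dos_increase = 4.93        # proposed 2009 increase for State
dod_2009 = 651.16          # DoD 2009 total (a $68.11 billion one-year increase)
va_2008 = 86.65            # VA 2008 allocation
va_increase = 5.25         # proposed 2009 increase for the VA

dos_2009 = dos_2008_outlays + dos_increase   # State's 2009 level
va_2009 = va_2008 + va_increase              # VA's 2009 level

non_war = dos_2009 + va_2009     # the "rough $130 billion" for non-military means
ratio = dod_2009 / non_war       # DoD spending relative to DoS + VA

print(f"DoS + VA: ${non_war:.2f}B; DoD: ${dod_2009:.2f}B; ratio about {ratio:.1f} to 1")
```

The sum comes to roughly $131 billion, and dividing DoD’s $651.16 billion by it yields just under 5 – consistent with the five-to-one characterization.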

Unlike another threat – global warming, which will continue to increase even should every country dramatically cut greenhouse gas emissions tomorrow – this five-to-one war spending ratio can be cut this year. Pull all troops out from all bases in Iraq and Afghanistan, rebalance the forces that are still needed for military defense, and relocate these within the U.S. according to the real estate agent’s credo: “location, location, location.”

Wednesday, February 13, 2008

International Humanitarian Law and the Death Penalty

“We do not accept the paradox that legal responsibility should
be the least where power is the greatest.”
Associate Justice Robert H. Jackson
May 3, 1945

What is it about international law and the Bush administration that causes the latter to always shoot itself in the foot?

An Associated Press story in the February 13th Philadelphia Inquirer described an unclassified State Department memo to all U.S. embassies instructing Foreign Service personnel on the “politically correct” (from the administration’s viewpoint) responses to “sample questions” about the forthcoming trials of 9/11 “terrorists” under the Bush administration’s military commissions.

State is particularly worried about the possible (and in some locales, the inevitable) official protests by governments and the reaction of “the street” when it becomes widely known that the U.S. prosecutors are asking for the death penalty for six of the men named as defendants in proceedings under the “military commissions” established by Bush.

The memo is an attempt to draw a parallel between 9/11 and the Nazi Holocaust and, by extension, to equate the Guantanamo proceedings with the Nuremberg Trials of top Nazi military and civilian officials: “International Humanitarian Law contemplates the use of the death penalty for serious violations of the laws of war.” And as if to reinforce the message, the cable adds that “The most serious war criminals sentenced at Nuremberg were executed for their actions.”

In 1945, “the laws of war” pertaining to the conduct of war were still not well developed – certainly not as developed as the criteria under which nations felt justified in going to war. The first Geneva Convention, signed in 1864, in recognizing the neutral status of medical personnel on the field of battle, was the first modern restriction on the virtually unregulated power of the warrior to do whatever was necessary to win. “Common sense,” never abundant on the battlefield, might, when allowed to come into play in post-Medieval Europe, induce some restraint, but there was never any guarantee how much or for how long. Not until the Hague Conventions of 1907 (Article 22) did a real proscription come into force: “the right of belligerents to adopt means of injuring the enemy is not unlimited, and this rule does not lose its binding force in a case of necessity.”

In fact, Justice Jackson concedes this point in his interim report to President Harry Truman when he proposes that “our test of what legally is a crime gives recognition to those things which fundamentally outraged the conscience of the American people and brought them finally to the conviction that their own liberty and civilization could not persist in the same world with the Nazi power.”

But it is not enough, as Justice Jackson realized, just for American sensibilities to be outraged. In 1945, the world did not have the current formal framework of multinational agreements that regulate the conduct of war. Without specific international agreements, the best “authority” under which the Nazis could be tried would be a consensus – “customary” international practice – derived from examining the statutes of numerous countries, that particular actions “offended the conscience of our peoples [and thus] were criminal by standards generally accepted in all civilized countries.”

In the end, 12 high-ranking Nazis were sentenced to death at Nuremberg, and 10 were executed. In Japan, where General of the Army Douglas MacArthur ruled, seven men were executed. For Jackson, what counted most was not to allow war to remain respectable, which he saw could best be achieved by criminalizing “making unjustifiable war.” MacArthur focused more on the actions of individuals – exemplified in affirming the death sentence for General Yamashita: “The soldier…is charged with the protection of the weak and unarmed….When he violates this sacred trust, he not only profanes his entire cult but threatens the fabric of international society.”

Americans were outraged by 9/11. Perhaps they should be outraged by what has transpired in the subsequent six and a half years to their own civil liberties?

Monday, February 11, 2008

If George Bush Gave the Equivalent of Lincoln's Gettysburg Address

Two score and 17 months ago, I brought forth in the Persian Gulf a new conflict misconceived in secret and dedicated to the proposition that all oil belongs to Halliburton.

Now we are caught in a bloody great insurgency, testing whether that conglomerate or any conglomerate so greedy can get away with stiffing the world.

They are met on the great commodity futures trading floor.

They consistently honor the brokers and traders who make multi-millions in bonuses for driving the price of oil over $100 a barrel, allowing them to take early retirement while the bottom 20% of the people struggle to make ends meet.

It is altogether fitting and proper that they should do this.
For in a larger sense, they dare not rest, they dare not slacken, they dare not forget that competitors abound.

These free marketers, struggling day after day, strive for $200 a barrel – which they claim would cut greenhouse gas emissions more efficiently than the government ever dreamed could be done.

Thankfully, the world will little note and quickly forget what is said here, for it will be too busy trying to find the money to pay the bill it has been handed here.

It is for each consumer, employed or laid off, to cash in your savings so that oil lobbyists can rig legislation that allows drilling wells in new, previously protected and environmentally sensitive regions.

It is also in the cards for your children and their children to go to war so this nation can meet its needs in the post-peak oil era – the work being so nobly advanced today in Iraq.

The public must re-dedicate itself to this task remaining before it – that from these honored traders and lobbyists it is to expect nothing even though the poor and the near poor, without a dime to their names, pay to the last drop of blood in the oil wars to prove their commitment to the cause of unbridled capitalism.

But hey – those at the top are highly resolved that the dead and wounded from the oil wars shall not have died in vain, that this nation, under martial law, shall have a rebirth of the unitary presidency, and that government in cahoots with conglomerates, benefiting only conglomerates, and then becoming a conglomerate, finally kills all life on this earth.

Friday, February 08, 2008

The Psychology of Killing "Close In"

"There will one day spring from the brain of science a machine or force
so fearful in its potentialities, so absolutely terrifying, that even man,
the fighter, who will dare torture and death in order to inflict torture
and death, will be appalled, and so abandon war forever."
-Thomas A. Edison

War is a progressive concept.

Not sociologically, but in the sense that what began as an “art” has evolved – through direct and indirect absorption of advances in peripheral disciplines (e.g., chemical and nuclear energy, health and medicine) – into a separate “discipline” that is studied in its own right. Nonetheless, the elements of science involved – ballistics, ordnance engineering, propellants, mechanical engineering, electronics, and nanotechnology – focus more on the generally incremental development of weapons and support systems than on analyzing the implications of more effective weaponry for fighting formations and tactics.

(There are many who contend that success or failure in battle arguably is as much the result of one commander’s superior or inferior imagination and ability to integrate the essential elements of mission, enemy, terrain, troops available, and training into a battle plan as of the weapons either side brings to the field.)

Modern “conventional” war – as well as the possibility of nuclear war – complicates armed conflict because the fighting systems cannot simply be plucked off a shelf at a moment’s notice. Those who engage in or favor a “war footing” thus are forever seeking new materials, new combinations of known materials, or new variations in fabricating instruments that can kill and destroy efficiently.

Contrast the huge amount of resources devoted to modern weapons development with the historically resource-starved and thus limited (or even totally ignored) study of the psyche’s rational and emotional “switches” that inhibit or propel extreme behavior in groups who are allowed, or who have seized, an opportunity to rampage through towns and villages in a manner comparable to the “hordes” of recorded history.

While obviously incomplete and invariably written from the perspective of the winner, oral traditions and the earliest chronicles detail numerous instances when the “hordes” of “barbarians” on far-ranging conquests engaged in the frenzied slaughter of entire populations – acts that today would be considered war crimes and crimes against humanity.

One of the characteristics of weapons development has been the increasing size of the gap between opposing forces that could be bridged by the new weapons. Many erudite observers have concluded that this separation between the attacker and the attacked has so de-personalized war that it is now easier for leaders to go to war and for those doing the fighting to kill without remorse. From 15,000 feet in the sky and five or ten miles’ distance, a pilot has only targets to strike. Precision-guided, fire-and-forget missiles used against an armored force translate psychologically into a number of tanks destroyed, not the number of people killed in the destroyed tanks.

Moreover, when the attacker employs weapons such as cluster munitions which can be detonated days or weeks or months later by unwary civilians, those killed are completely unknown to the attackers.

Perhaps high technology does depersonalize warfighting. But it is equally apparent that the human race in the 21st century has not evolved psychologically beyond our pre-historic ancestors in discerning – let alone understanding – the conditions and the “triggers” that turn otherwise rational groups into blood-frenzied mobs eager to fight to the death in “close-quarters” combat.

In the former Yugoslavia in the 1990s, in Rwanda in 1994, in Kenya in 2008, eyewitness accounts describe a shocking, absolutely chilling blood-lust that surfaced when mobs rampaged through towns and parts of towns inhabited by “them.” It seemed to take hold even when the targets had been in the community for years, often having raised families with no apparent animosity from neighbors.

Which leaves us with two dangerous psychological states:

- the coldness of a rational, calculated, uninvolved, unemotional, and therefore inhuman response to killing other humans; or

- the emotionally driven, irrational, highly unstable frenzy that, requiring discharge, attacks whatever is different (and therefore “dangerous”).

In itself, it is scary enough to impel us all to work harder for peace.

Wednesday, February 06, 2008

A Few Words on Kenya

What with the run-up to Super Tuesday – which just happened to coincide with the conclusion of Mardi Gras and Carnival on February 5th – and then the event itself, much of the rest of the important national and international news slipped off the front pages of papers and even the television and cable channels.

Of course, most of the rest of the world didn’t bother to alter their routine because of Super Tuesday, so in one sense the day was, globally speaking, a wash.

Today, however, my focus reset itself on the situation in the African country of Kenya as a recent traveler there was at my office to relate what she saw, heard herself, or was told by others.

Once hailed as an example – if not an island – of political stability and economic success, the country exploded last December when the incumbent president, Mwai Kibaki, was pronounced, under obviously fraudulent conditions, the winner of the ballot to choose Kenya’s next president and, five minutes later, was sworn in for a second term. Within 30 minutes, violence had broken out as the followers of the challenger – and presumptive winner – Raila Odinga, took to the streets. The GSU – an elite Praetorian Guard police unit – had been dispatched to the edge of Nairobi’s huge shantytown, and within minutes of the announcement of Kibaki’s “win,” they began shooting and kept shooting for half an hour, killing some 100 people.

The reaction was explosive as the huge number of unemployed and under-employed youth surged out of the shantytown armed with knives, machetes, bows and arrows, even stones. What the world saw in Rwanda fourteen years ago was repeated in much of Kenya – and for the same reasons. Some 80% of the population is between 18 and 30; only in Brazil is the disparity in earnings between the richest and the rest of the population greater than in Kenya. And then there are the ancient vendettas, supposedly long forgotten or settled, but never forgiven. It was almost as if the old men and women acted as the conduits of information about past wrongs committed by individuals of another tribe. Sometimes the “injury” committed was nothing more than that a man or woman had married outside the tribe and now risked death if they remained in their homes.

In short, our visitor said that the root problem for Kenyans is not a disputed election. Yes, that was the proximate trigger that ignited the violence. But the real problems go much deeper and are more ancient.

The question may be whether they are so entrenched, reaching back to pre-colonial times, that to exorcize them would entail unraveling the country and the sense of what it means to be “Kenyan.”

Monday, February 04, 2008

Who Runs for President?

“Who makes a better President?”
Or: History-Making Presidential Politics

Not too long ago – that is to say, “Once upon a time….” [for this is a fairy tale of sorts] – politicians in the U.S. fell to arguing about which occupation best prepared an individual to serve as president of the country. In the early days of the Republic, being a member of Congress was a “second job,” and most definitely not the work that put bread on the table. Many in the early congresses assembled were farmers, including George Washington and Thomas Jefferson – gentlemen farmers, to be sure, but still farmers. Others were in business (“trade”), and still others were, even in the early days, lawyers.

As a matter of fact, the question has surfaced again and hovers as part of the atmospherics of the current run for the Oval Office. On the Republican side, two former governors, one sitting senator, and one sitting member of the House continue to vie for the G.O.P. nomination. One governor is the first member of the Church of Jesus Christ of Latter-day Saints (the Mormons) to make a serious run for chief executive. For their part, the Democrats have broken a number of barriers.

- Governor Bill Richardson (NM) became the first Hispanic to mount a competitive campaign for the presidency, although he is no longer campaigning;
- Senator Hillary Rodham Clinton (NY) is the first woman to mount a credible campaign for the position of chief executive;
- Senator Barack Obama (IL) is the first African-American to mount a credible campaign for that same position.
But history may also be made in another way if, as expected, Senator John McCain (AZ) continues to gain on the other three candidates seeking the Republican nomination, even quite possibly winning enough delegates on February 5 to go “over the top.” Regardless of the timing, once McCain secures the Republican nomination, the country will have, for the first time in its political history, sitting senators as the nominees for president of the two (or three) major parties.

Among the 42 men who have either won the presidency or succeeded to that office on the death or resignation of the incumbent (Grover Cleveland’s two non-consecutive terms count him once), nearly one-third – 13 – served as vice-president under their predecessor in the Oval Office. In the early days of the Republic, for example, John Adams, a Federalist, served as vice-president for George Washington’s two terms before being chosen president in 1796. Adams’ vice-president was Thomas Jefferson, a Democratic-Republican. (Initially, whoever received the most votes in the Electoral College was president and the man with the second-highest number was vice-president regardless of party affiliation. Even so, the 1800 election was thrown into the House of Representatives because the Electoral College deadlocked when Jefferson and his opponent, Aaron Burr, each received 73 votes.)

Those who rose from the vice-presidency to chief executive and how they attained the office break out as follows:

Electoral College ballot: Adams, Jefferson, Van Buren, George H. W. Bush

Predecessor’s death from natural causes: Tyler (on the death of William Henry Harrison), Fillmore (on the death of Zachary Taylor), Coolidge (on the death of Warren Harding), Truman (on the death of Franklin Roosevelt).

Predecessor assassinated: Andrew Johnson (after Lincoln), Chester Arthur (after James Garfield), Theodore Roosevelt (after William McKinley), Lyndon Johnson (after Kennedy).

Predecessor resigned: Ford (after Nixon).

Nixon, Dwight Eisenhower’s vice-president, was chosen president eight years after the end of Eisenhower’s second term.

Seven governors have gone from the state mansion to the White House while an equal number failed in the transition – some more than once. Six incumbents were refused their party’s endorsement to run for a second term while eight seeking reelection were defeated. Lyndon Johnson famously declined to run in 1968.

Only once has a sitting governor met a sitting senator – Senator Harding versus Governor Cox.

Six members of a president’s cabinet have gone directly into the Oval Office: Madison, Monroe, John Quincy Adams, Grant, Taft, and Hoover. Madison, Monroe, and Adams were at State, although Monroe also served concurrently for two years at the War Department. Grant was Secretary of War ad interim in the last months of Andrew Johnson’s presidency; Taft held the same cabinet post under Theodore Roosevelt. Hoover was Secretary of Commerce under Harding and Coolidge.

Interestingly, no race has been run between two governors.

Friday, February 01, 2008

Grim Statistics Again

Listening to President Bush’s State of the Union address to the American people and their representatives duly assembled on January 28, I could not fail to note that nowhere in his opening paragraphs did he utter the formulaic phrase that most presidents get in early: “The State of the Union is strong.”

Just as well, as more of the radio and television audience would probably have switched off their sets even earlier than they did.

Of course, the evidence was all around that the economy was in trouble. It was so bad that he had to be seen as taking action to at least stop, if not reverse, the January trends. And he was not shy about highlighting his participation in the process as he pointed out to his listeners that he had been working with the leadership of both major political parties in the House to devise an emergency “economic stimulus” package. Worth an estimated $150 billion and featuring tax rebates and reductions, the package was aimed not at calendar 2008 taxes but at 2007 taxes – a signal of how critical the White House felt the situation had become.

Separately, on January 22nd – ironically the original date for the State of the Union – the Federal Reserve Board, in what can be described only as an extraordinary and emergency conference between its regularly scheduled meetings, lowered both the federal funds (the “overnight”) rate and the discount rate by 75 basis points, to 3½ and 4 percent, respectively – the largest drop in the overnight rate in 23 years. Then, a week later, at its scheduled January 30th meeting, it lowered the overnight rate an additional 50 basis points, to 3 percent.
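The basis-point arithmetic behind those cuts is straightforward and can be sketched in a few lines of Python. This is only an illustration; the starting levels of 4.25 and 4.75 percent are the pre-cut rates implied by the figures above, not numbers taken from the Fed's own statements:

```python
# A basis point is one one-hundredth of a percentage point (0.01%),
# so a 75 basis-point cut lowers a rate by 0.75 percentage points.

def cut(rate_pct: float, basis_points: int) -> float:
    """Return the rate (in percent) after a cut of the given basis points."""
    return round(rate_pct - basis_points / 100.0, 2)

fed_funds = 4.25   # overnight rate before the January 22 emergency cut (implied)
discount = 4.75    # discount rate before the January 22 cut (implied)

fed_funds = cut(fed_funds, 75)   # January 22: 4.25 -> 3.50
discount = cut(discount, 75)     # January 22: 4.75 -> 4.00
fed_funds = cut(fed_funds, 50)   # January 30: 3.50 -> 3.00

print(fed_funds, discount)       # 3.0 4.0
```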

Underlying these dismal events was the sub-prime lending crisis and the housing market crash. In the weeks before the speech, some of the biggest names among U.S. financial institutions were negotiating (or already had negotiated) with European and Far East competitors, seeking, in some cases, billions of dollars to get through the still unfolding sub-prime lending horror. Continuing revelations led some observers to compare this debacle with the 1980s savings and loan crisis.

If that were not enough bad economic news, four days after the speech the Labor Department announced that for the first time in 52 months, the U.S. economy failed to deliver a net increase in jobs – ironically, one of the “accomplishments” touted by Bush in the State of the Union.

The war-related statistics for January are as grim as ever. In Iraq, 3,450 coalition troops have died from hostile action since March 19, 2003, and another 800 from non-hostile causes. Total U.S. fatalities in Iraq as of January 31 number 3,943. This month saw the first uptick in fatalities in four months – 39 – all U.S. The last time all fatalities in a month were U.S. was December 2005. Total wounded number 28,800. And in an unusual month, the number of wounded reported to date in January was five fewer than the number killed.

The Iraqi government announced that civilian deaths in January continued to decline. But today’s toll gets February off to a bloody start.

In Afghanistan, seven U.S. and seven coalition troops died in January, bringing the total since October 8, 2001, to 482 U.S. and 281 coalition fatalities. Total U.S. wounded is 1,472.

And this year, February has 29 days on which people can be killed.