Friday, September 12, 2008

War in September

When compiling a timeline of important events connected to World War II, September stands out as a month of many beginnings and endings.

Western Europeans are apt to recall with little effort that September 1, 1939, was the date Hitler sent tanks and aircraft streaming across the German-Polish border. Two days later France and Britain declared war on Germany, and two weeks after those declarations the Soviet Union invaded Poland from the east. The following September (September 16, to be exact) brought an important war milestone for the United States: the start of U.S. conscription under the Selective Service Act.

But there were to be more endings than beginnings in the Septembers that followed. On September 3, 1943, British troops breached “Fortress Europe” with their invasion of the (southern) Italian mainland. Five days later, before the first Americans went ashore at Salerno, Italy capitulated to the allies. September 8, 1944, marked the day the first U.S. soldiers penetrated the Siegfried Line into Germany proper. Almost a year later, on September 2, 1945, the war formally ended as Japanese envoys signed the terms of surrender aboard the battleship USS Missouri in Tokyo Bay.

When it comes to war, the generals and admirals can count as a “good war” one that they win. World War II was “good” because, despite the deaths of more than 61 million people (military and civilian), the allies prevented the subjugation of countries holding key resources that the allied powers themselves needed then and would want in the future: petroleum in the Dutch East Indies, Romania, the Caucasus, and possibly Iran; natural rubber in Indochina.

Well, this week we learned of the intelligence community’s “Global Trends 2025” report, which predicts that one cause of war in the next 17 years, if not the most common, will be shortages of key resources and the refusal of nations to work out “sharing” agreements. These shortages will be in the basics: clean water, clean energy, food, perhaps clothing or shelter, according to the chief writer of the report, Thomas Fingar.

The other bit of news from Fingar is that U.S. dominance in economic, diplomatic, and even cultural fields will decrease significantly, leaving the United States dominant only in the one area where dominance will mean relatively little: military affairs. But what struck me about the media report of Fingar’s forecasts was the insistence on describing the relative position the United States will still hold in all these other fields as “dominance.”

It was exactly this mindset among the triumphant Washington politicians that led to the post-Cold War “death spiral” of higher military spending for ever-older equipment operated by troops with ever-fewer hours of training. Successive administrations became locked into asking Congress to fund, without proper oversight and review, whatever the defense industries were selling to the Pentagon (all the “bells and whistles” they could develop) in the name of not merely “superiority” but “dominance.” And it wasn’t enough to dominate every other country; the U.S. had to dominate all other countries combined.

Other than being September again, what connects the beginnings and endings of a war that, for the U.S., started 67 years ago and ended 3¾ years later?

The simplest expression I can give is that World War II, not infrequently referred to as the 20th century’s “good war” from the perspective of the allied powers, was at bottom the first modern “resource” war. In February 1945, months before the fighting ended, President Franklin Roosevelt and King Ibn Saud, whose family ruled Saudi Arabia, struck an “oil for security” deal. That deal began to wither in the early 1970s as the Arab countries played the “oil card” against the U.S. over Washington’s unremitting support of Israel.

The indisputable victory by the U.S.-led coalition that ousted the Iraqis from Kuwait in the first Gulf War (1990-1991), coming so soon after the eight-year Iran-Iraq War (1980-1988), should have reaffirmed the Saudi-U.S. relationship. But over the next decade, a series of policy miscues (particularly affecting Palestinian-Israeli relations) and cultural insensitivities (e.g., sending gender-mixed military units to defend the kingdom of the Two Holy Mosques) created and sustained a reservoir of ill will that found expression in the events of September 11, 2001.

Ironically, September 11 at first generated widespread sympathy for the United States and for the other countries whose citizens died that day. But true to form, another Washington administration quickly squandered this collective goodwill by launching a war against a poor and resource-deprived country on the grounds that its government harbored the terror group responsible for 9/11.

Seven years after the U.S. attacked Afghanistan, George Bush is sending more U.S. troops to fight a resurgent Taliban movement. He – and the U.S. field commanders in Iraq – is betting that the coming integration of the Iraqi Sunni “Sons of Iraq” movement into the Shi’a-dominated army and police will not leave the Sunnis suddenly victimized by the more numerous Shi’a, who will constitute 80% of the security forces. Should that happen, all bets are off on Iraq remaining “stable” enough to shift U.S. troops to Afghanistan and thereby allow George Bush to end his second term on a “positive” note: “victory” in Iraq.

In the same vein, it is noteworthy that Bush, in a September 9, 2008, speech at the National Defense University, mentioned “victory” only once – this from a president who promised the American public in 2001, and every year since, that victory in the war on terror was assured if only the nation persisted. This war has lasted nearly twice as long as World War II, and its monetary costs run to $12 billion per month. We can no longer afford such a costly “victory.”
