What Historians are Saying: In Response to Max Boot's Op Ed on Historians

Why Aren't Americans More Enraged About Russian Interference? World War 2 History Helps Explain

 

No one should be surprised if a substantial minority of American voters remains unconvinced that Russian agents interfered in the 2016 presidential election, or if an even larger percentage of the American public downplays the urgency of the Russian threat to the nation’s electoral system.  Complacency in the face of foreign dangers is nothing new in the United States; during World War Two, it was plainly visible among Americans less than two months after the Japanese attack upon Pearl Harbor.

 

In the days immediately following December 7, 1941, outraged patriots flocked to recruiting stations, purchased more than $1 billion worth of war bonds, and destroyed all the Japanese-manufactured merchandise they could find — including Christmas tree ornaments, silver bells on street lights, and china plates which rampaging customers smashed on department store floors.  Political leaders of both parties pledged their loyalty to the Roosevelt administration. But the initial outburst of enthusiasm and unity soon faded.

 

By the middle of January — following the fall of Guam, Wake Island, and Manila to Japanese forces — numerous commentators noticed that Americans seemed to be going about their business “apparently unmindful,” as the Chicago Tribune complained, “of the soldiers who are dying in the Philippines,” notably on the Bataan peninsula.  After traveling across the country, newscaster Edward Murrow agreed that most Americans regarded the war with no sense of urgency, viewing the conflict instead as a spectacle, and themselves as mere spectators. Upon completing his own cross-country journey, columnist Walter Lippmann lamented “the unawareness, the overconfidence, and the complacency” of Americans.

 

“I find all around me a smugness and satisfaction which to my mind are entirely unjustified,” concluded William Batt, director of the War Production Board’s materials division.  “It is no secret, that, by and large, the American people have not yet settled down to the grim task of fighting for their freedom and their lives,” noted the Washington Post with a palpable sense of frustration. On Capitol Hill, veteran Congressman Hatton Sumners of Texas complained that “I do not see yet that vital, stirring consciousness of responsibility, consciousness of danger . . . which we have got to have.”

 

After touring defense production plants on the Pacific coast in early 1942, Congressman Lyndon Johnson noted that Americans’ “lack of mental concern is evident on every facial expression. Their complacency, indifference, and bewilderment are an open invitation to [enemy] direction.”  In Louisiana, Governor Sam Houston Jones grew so alarmed at his constituents’ lethargy that he mounted a three-week, statewide speaking tour in a sound truck with the words “Awake America” on the side.

 

Declining war bond sales provided hard evidence that a significant segment of the public remained disconnected from the war effort.  In January, Americans purchased $1.075 billion worth of bonds, although only one in seven American wage earners bought any bonds at all. February sales tumbled to $711 million, and March fell to $565 million; some people reportedly hesitated to buy bonds because they believed rumors that Roosevelt was putting the cash into his personal savings account.

 

Instead of buying war bonds, American consumers bought and hoarded goods which they expected to soon become scarce:  sugar (sometimes 100 pounds at a time), nylon stockings, radios, wool clothing (after federal officials announced that the Army would require most of the nation’s wool supply over the next twelve months), bicycles, soap, socks, and women’s foundation garments, which were disappearing rapidly due to a shortage of rubber.  From coast to coast, retailers sold more merchandise to consumers in the first several months of 1942 than at any time in the nation’s history.  Treasury Secretary Henry Morgenthau repeatedly warned the public that excessive spending and the subsequent rise in prices could cripple the war effort, and merchants in New York, Philadelphia and Cincinnati took out newspaper ads begging their customers to stop buying so much.  “You cannot buy victory and luxury in the same market at the same time,” warned the New York Times, but consumers kept on stockpiling merchandise.

 

Other Americans, fearful that the Roosevelt administration planned to confiscate their savings to pay for the war, withdrew their savings from banks and buried their cash in a safe (rentals of safe-deposit boxes surged) or stuffed it in a sock, instead of lending it to the federal government or investing it for productive use.  “Hoarded dollars are idle dollars, slacker dollars,” one New York bank reminded its customers.  “Hoarding in times of war amounts to sabotage against the Government.” Still the drain continued; by mid-April, Americans were hoarding an estimated $500 million to $1 billion.

 

By far the most prevalent explanation for the public’s “dangerously complacent” attitude was the insistence of military censors and administration spokesmen on painting the military situation in a far too optimistic light.  As the War Department and the Navy controlled the release of military information, official communiqués downplayed or concealed American defeats — “Never before in history,” complained one critic, “have so few kept so much from so many” — and released stories that transformed relatively minor triumphs into brilliant victories.  “They are misleading,” charged the Washington Post, “in the sense that they are utterly out of proportion.”

 

The government’s practice of distorting news reports fueled the conviction of many Americans that “one American can lick ten Japs or five Germans and that is all there is to it.” The public’s overconfidence was boosted further by the Roosevelt administration’s decision to keep arms-production figures secret, while promoting reassuring  — but misleading — news stories about the progress of the nation’s output of war goods, typically accompanied by multicolored charts and graphs of upwardly spiraling production and the same adjectives used over and over: “huge,” “enormous,” “immense,” “tremendous,” or “magnificent.”  “We here in the United States,” observed journalist Alistair Cooke, “studied our own production story and assumed the victory.”

 

Not even the loss of Bataan and Corregidor in April and May could shock Americans out of their complacency.  Factory workers who were earning more than a subsistence income for the first time in their lives spent their newfound wealth on good times in restaurants, theaters, night clubs, and strip joints.  Sales of jewelry and champagne soared.  And throughout the spring and summer, Americans bet more money on sports than ever before.  One after another, horse-racing tracks across the nation broke their own records for wagers; on Kentucky Derby Day, bettors at Churchill Downs wagered nearly $2 million on the first leg of the Triple Crown, while nearby booths selling defense bonds reportedly took in less than $200.

 

Meanwhile, German U-boat commanders were waging an increasingly successful campaign — “Operation Paukenschlag,” or “Drumbeat” — against American and Allied merchant shipping along the East Coast.  The task of sighting and sinking slow-moving vessels, particularly oil tankers, was made easier by carelessly illuminated shops, homes, and thoroughfares along the coast.  Sometimes visible for ten miles or more at sea, the lights from shore silhouetted the tankers and created a “neon shooting gallery” for the submarines.  Despite the horrific toll — by the end of April, nearly two hundred ships had been sunk, and more than four thousand sailors and passengers killed — a disturbing number of Americans ignored repeated requests by civilian and military authorities to dim or extinguish their lights.  Dimout inspections in early June consistently revealed widespread and “flagrant violations” of Army regulations; New York City remained a “murderous mound of light”; and a sailor on a merchant ship passing a New Jersey resort town at night noted that “the lights were like Coney Island.  It was lit up like daylight all along the beach.”

 

In late summer and autumn, a new problem emerged as a rising wave of absenteeism at war plants and shipyards undercut arms production efforts.  “The extent to which absenteeism impedes our war effort is beyond belief,” grumbled a high-ranking official of the War Manpower Commission, as workers increasingly took time off to nurse hangovers (“Monday morning sickness”), to look for better-paying jobs, or simply to engage in shopping sprees (“Pay-day richness”).  In November, a Senate investigating committee headed by Harry Truman reported that excessive worker absences were reducing production by as much as ten percent in many war plants; in some shipyards, the absentee rate reached 18 percent, while Ford’s massive Willow Run bomber plant in Michigan suffered absences of nearly 25 percent.  Federal officials attempted to combat the trend with posters and slogans designed to make “work skippers” feel like slackers, but to little effect.

 

One year after Pearl Harbor, many Americans still refused to give their wholehearted commitment to the war effort; calls for sacrifice often had been ignored, and restrictions evaded.  “There’s one thing America hasn’t yet got around to,” claimed the Carrier Corporation in an early December appeal to the public.  “We’re still waiting for that old-fashioned American ‘drive’ that hits the line head-on and sweeps everything before it.”

 

The war would have to wait a little while longer.

 

The Polar Bear Expedition: When America Intervened in Russian Affairs

The Great War had been over for more than two months when, on the frigid morning of January 19, 1919, Lt. Harry Mead peered into the slim, gray light of dawn and saw hundreds of ghostlike figures, armed and dressed in white, skimming the frozen ice of Russia’s Vaga River on skis. They were coming towards the forty-five men of his first platoon of Company A, 339th United States Infantry Regiment.

 

Soon enough, the Americans were overwhelmed, and as men fell left and right, the survivors of the initial assault began a fighting retreat through deep snows and temperatures of minus-fifty; more than half of the platoon would be wiped out at the hands of 1,700 of their Bolshevik attackers.

 

The long-anticipated attack by the Reds marked the beginning of the end of a particular episode in U.S. history that few today are aware of:  the intervention into the Russian Civil War by America and its British, French, Polish, and even Italian allies. 

 

As allegations and investigations swirl around the questions of Russian intervention in our 2016 elections, and whether the Republican campaign of now-President Trump actively colluded with agents of Vladimir Putin, it’s worth a look back one hundred years to our own intervention—invasion, if you will—in Russia, if only to add some context to the current air of distrust and enmity that exists between the U.S. and its former Cold War opponent.

 

While most Americans might point to the post-World War 2 Cold War between east and west as the starting point of antagonism between the two superpowers, the story line must in fact be pushed back much farther, to the summer of 1918. Then-President Woodrow Wilson was agonizing over whether to accede to Allied insistence that the U.S. supply troops for a planned foray into Russia.

 

The aims of the intervention, as put forward mainly by the British, were twofold:  (1) the reestablishment of the Eastern Front, which the Bolsheviks had quit in March, 1918, leaving Germany free to transfer some eighty divisions to the Western Front; and (2), more grandly, the incitement of a counter-revolution that would throw the Reds out of power and alleviate the nascent “Red threat” that caused some governments in the Western democracies to worry. 

 

Wilson wanted no part of an intervention, but under tremendous pressure he finally agreed in mid-July 1918 to send a single U.S. regiment to northern Russia. His orders were that the Americans were only to guard the tons of materiel that the Allies had sent to Russia during the war, and stay out of the post-revolution turmoil that saw Reds fighting so-called “Whites”—Tsarist loyalists—across Russia.

 

That regiment was the 339th, part of the 85th Division. Mostly draftees from Michigan and Wisconsin, the men had arrived in England in August and were anticipating a trip across the Channel to France when they were plucked for assignment in Russia. 

 

The regiment sailed for Archangel, Russia in late August of 1918, and arrived at the bustling northern port on September 5. But almost as soon as they arrived, the men were hustled off their transports by their British overseers and sent by train south in the direction of Vologda, and southeast on the wide Dvina River towards Kotlas, hundreds of miles away.

 

Left behind were some seventy men stricken with or already dead from the flu that had haunted the ships that had carried them to Archangel. Many more—more than 150—Americans would die in battle or from wounds in the coming months. 

 

Living or dead, all were mere pawns in an epic drama that would play out through the coming fall, winter and spring—and well past the November 11, 1918 Armistice that ended World War 1. 

 

Company by company, the men took up posts in isolated positions across a front of four hundred miles, the farthest-flung being Company A, which after seeing duty on the lower Dvina River was sent down the Vaga River, a tributary, to a God-forsaken village called Nijni Gora, 250 miles from Archangel.

 

Polar Bear Memorial at White Chapel Cemetery in Troy, Michigan

On November 11, ironically, Company B was attacked by hundreds of Bolsheviks—the men called them “Bolos”—at Toulgas on the Dvina. Firing from hastily made blockhouses, and aided by a single battery of Canadian artillery, the Americans finally held the Bolos at bay after several days of fighting. But the company would remain there in that isolated spot until March, under constant attack and harassment by the Bolsheviks.

 

On the railroad, the third battalion struggled through the fall and winter to push south towards Vologda, but never got to within three hundred miles of the city.  Meanwhile men died, and men suffered in the brutal cold—so cold that the men slept with their water-cooled machine guns in an effort to keep them operable.   

 

Meanwhile, the Reds’ minister of war, Leon Trotsky, was building an army of 600,000 men, which he vowed to use to push the slim invading Allied force into the White Sea. That process began at the farthest Allied base at Nijni Gora, where Company A in the third week of January was routed, and each man subsequently fought for his survival during a two-week retreat through the harsh elements.

 

By then the main reason for the intervention—the recreation of the Eastern Front—was a non-issue. So, too, was Woodrow Wilson’s stated intent of guarding war materiel from the predations of the Germans. 

 

And still the men—American and Allied—suffered and fought bravely and wondered why they had never been given a single good reason for their being there. As the winter progressed, they wondered even more when they might be withdrawn. 

 

At the same time, a clamor was rising back home in the U.S., and in February Wilson made the decision to pull out—an impossibility, however, until the frozen White Sea broke up. Finally, the Americans began withdrawing across the fronts and shipped from Archangel in early June, followed by the British in September.

 

The intervention would be quickly forgotten in the U.S., if not by the survivors, by the public in general. Few lessons would be learned from the expedition, and similar mostly unsuccessful invasions—in Vietnam, in Iraq, in Afghanistan—would be undertaken with no look back at our experience in Russia. 

 

But the average Russian has been taught about the intervention, and the average Russian remembers. Though Presidents Richard Nixon and Ronald Reagan would both proclaim separately that Americans and Russians never faced off in battle, the Russians still remember a time when foreign, Western nations interfered in their affairs. 

Roundup Top 10!  

Why Are We Still Segregating Black History in February?

by Christina Proenza-Coles

Even before the U.S. was a nation, African-Americans played crucial roles in nearly every stage of history in the new world. ‘Honoring’ that history in one month is a travesty.

 

Americans’ ignorance of history is a national scandal

by Max Boot

You simply can’t understand the present if you don’t understand the past.

 

 

Max Boot’s Screed Against Historians

by John Fea

Boot is the latest public intellectual to chide academic historians for failing to speak to public audiences.

 

 

A quick response to Max Boot’s critique of historians

by Glenn David Brasher

Aren’t retention and anti-intellectualism the real problems?

 

 

Do American Women Still Need an Equal Rights Amendment?

by Susan Chira

We’re already living in Phyllis Schlafly’s nightmare.

 

 

Winthrop's "City" Was Exceptional, Not Exceptionalist

by Jim Sleeper

There are compelling anthropological reasons why almost every society in history has invented “special covenant” and “origin” myths, or “constitutive fictions.”

 

 

The black men of the Civil War were America’s original ‘dreamers’

by Colbert I. King

Like dreamers of today, those black soldiers and sailors also had families and attended churches; some were enrolled in schools.

 

 

What we get wrong about the roots of slavery in America

by Eric Herschthal

How we remember the past shapes the fight for racial justice today

 

 

Protesting on Bended Knee: Race, Dissent and Patriotism in 21st Century America

by Eric Burin

This digital book is available for free download!

 

 

The "Historovox" and the Bad Synergy Between Historians and Journalism

by Corey Robin

When academic knowledge is on tap for the media, the result is not a fusion of the best of academia and the best of journalism but the worst of both worlds.

 

 

Why History is Important Today

by Luis Martínez-Fernández

"The most effective way to destroy people is to deny and obliterate their own understanding of their history.”

 

 

The Catholic Church is bursting with secrets. Investigating one will unravel them all.

by Garry Wills

Secrecy in one clerical area intersects with secrecy in others.

 

 

How George Washington would fix partisan politics in America today

by Eli Merritt

The United States' first President George Washington would prescribe rule of law and emotional intelligence to help us heal.

 

We Need to Acknowledge the Power of the Israel Lobby

 

Last week, Rep. Ilhan Omar (D-MN) flippantly tweeted, “It’s all about the Benjamins,” alluding to the financial role of pro-Israel groups in securing unstinting US support for Israel. Widely condemned for invoking an anti-Semitic trope pertaining to Jewish money, Omar apologized for the remark but not for the essence of her larger point, which is unquestionably true, namely that money in politics plays a role in the lopsided pro-Israeli policy that the United States has pursued for decades. The absurdity is that everyone in Congress already knows this.

 

While reams of type and hype have spilled forth concerning the intrusions by the big, bad Russian bear (yes, he’s back after a post-Cold War hibernation) on American politics, we hear very little about Israel’s influence, which has profoundly shaped United States Middle East diplomacy since World War II. As I document in a forthcoming book, the Israel lobby goes much deeper historically than most people realize and has long exercised an outsized influence on Congress and presidential elections.

 

 

On February 15 the British Guardian did something American newspapers, magazines—and historians—rarely do: it published an analysis of pro-Israeli financing in American politics. “The data examined by the Guardian suggests that the pro-Israel lobby is highly active and spends heavily to influence US policy.” (“Pro-Israel Donors spent over $22 million in lobbying and contributions in 2018,” The Guardian online.) There’s more to the story than the Guardian grasps, namely individual campaign contributions that are made with the explicit or implicit understanding of unquestioned political support for Israel.

 

Fear of baseless charges of anti-Semitism must not prevent us from making relevant and scarcely disputable arguments about money and political influence. Let’s examine some basic facts: Israel, a tiny country of 8.5 million people, is the largest recipient of US foreign assistance since World War II (more than $100 billion), according to the Congressional Research Service. In 2016 President Barack Obama—despite being treated contemptuously by Israeli Prime Minister Benjamin Netanyahu—signed a 10-year, $38 billion military assistance pact with Israel.

 

Well, you might say, Israel needs the money to defend itself from the hostile forces that surround and threaten to destroy it. This cliché misrepresents realities that have prevailed in the Middle East since the creation of Israel, namely that Israel more often than not has been the aggressor in the region, has won every war, and is more than capable of defending itself without receiving another dollar in US military assistance. Since 1967 the country has illegally occupied Palestinian territory and illegally sponsored settlements, which have been funded in part with American money, and has repeatedly engaged in indiscriminate warfare, notably in Lebanon and the Gaza Strip. Beyond dispute Israel has suffered from terror attacks, but it is far from innocent and has killed many times more people than it has lost in the conflict. And now it is illegally absorbing, with Trump’s blessing, a historic holy city that under international law was meant to be shared by people of all faiths.

 

Even more absurd than over-hyping Russian influence on US elections while ignoring that of Israel is the widespread condemnation of Iran for supposedly pursuing a nuclear weapon, while ignoring the history of Israel’s utter contempt for nuclear non-proliferation in defiance of the United States dating back to the Eisenhower administration. Israel has the bomb, scores of them, acquired secretly and mendaciously—that is, the arsenal was developed even as the American special ally was told Israel would not introduce such weapons to the Middle East. Turns out what Israel meant is that it would not hold a press conference and say, “Hey, we’ve got the bomb!” Meanwhile Netanyahu and the Israel lobby spare no effort to condemn Iran, which, unlike Israel, proved willing to negotiate a non-proliferation agreement, entered into with Obama in 2015. Netanyahu—joined by Trump, Mike Pompeo, John Bolton, Elliott Abrams, and other assorted fanatics of the far right who are now in power—is lusting for an excuse to go to war with Iran.

 

In sum, beyond a doubt, Israel and its American supporters have assembled the most powerful lobby pursuing the interests of a foreign country in all American history. Certainly, there are cultural and historical affinities on the part of American Christians (think Mike Pence) as well as Jews that help explain broad-based and historical US support for Israel. But only a fool—or an apologist—would argue that pro-Israeli money and influence do not play a significant role in American politics.

 

I hope this gets into print while it is still legal to criticize a foreign country (other than Russia, China, Venezuela--and France and Canada when they don’t do what we tell them). In an effort to head off a growing boycott, divestment and sanctions movement, Israel and the lobby now have set their sights on curbing freedom of speech and expression in this country. The matter is before Congress.

 

The time is past due to speak truth to power about Israeli policies and the American Israel lobby. 

“National Security Crisis” or “Power Grab”?

 

On February 14, 2019, Congress allotted Donald Trump some 1.4 billion dollars for 55 miles of wall between Mexico and the United States. Trump was flabbergasted—a poor Valentine’s gift indeed!—as he had demanded 5.7 billion dollars for the wall. Consequently, on the next day, Trump declared a state of national emergency in order to redirect money earmarked for defense spending toward his wall. Trump said defiantly and with rodomontade: “We’re going to confront the national security crisis on our southern border and we’re going to do it one way or the other. It’s an invasion. We have an invasion of drugs and criminals coming into our country.” Trump’s declaration, signed later that day, will enable him to use an additional 3.6 billion dollars, earmarked for military projects, for his beloved wall, so long as there are no impediments. He has plans to allocate another 2.5 billion dollars for the wall, for a total of some 7.5 billion dollars.

Why such precipitancy? What is the national crisis?

 

An emergency is defined as an unexpected and urgent state of affairs that requires immediate action—here, a crisis of national significance. The invasion of drugs and criminals of which Trump spoke, it seems, had taken on the status of an immediate threat to national security. Yet Trump added in his Rose Garden speech: “I didn’t need to do this, but I’d rather do it faster. I want to get it done faster, that’s all.” Those words undermined the notion that his actions were precipitated by a national emergency. They were not. The border problem has been longstanding. There was, however, a large and looming crisis: Trump did not get what he wanted.

 

As anticipated, Democrats, headed by Speaker of the House Nancy Pelosi, argued that Trump’s maneuver was not on account of a national emergency. “This is plainly a power grab by a disappointed president, who has gone outside the bounds of the law to try to get what he failed to achieve in the constitutional legislative process.” They vowed to do what they could, with the help of sympathetic Republicans, to block the power-hungry president.

 

The prickly issue is one of Congressional authority, and thus, the constitutionality of Trump's gambit. Congress failed to allot Trump the money he wanted, so he decided to ignore Congress and to do whatever he could to get the needed money for his wall. Yet the problem is this: Does the president have the constitutional authority to sidestep Congress? If there is a real state of emergency, then the answer is yes, but as we have seen, the real crisis is that a man who is used to getting his way, usually through bullying, did not get his way. It is a matter of presidential pouting, and that is a dangerous precedent.

 

Still, one could argue that the border wall was a campaign pledge of Trump's and that he has steadfastly stuck to keeping that promise, and that seems laudable, does it not? Promises ought not to be lightly made, and once made, they ought to be kept.

 

Yet there is a larger, more fundamental nodus. According to a Gallup poll conducted in January 2019, which randomly polled 1,022 American adults across the United States (at a 95% confidence level, with a margin of error of about ±3 percentage points), 60 percent of Americans oppose construction of the wall (39 percent strongly opposing), while 40 percent favor construction of the wall (26 percent strongly in favor). Even allowing for the margin of error, there is no question that the majority of Americans disfavor the wall. The question remains: Ought Trump to build a wall if the American citizenry does not want a wall?
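
That conclusion is easy to check with the standard margin-of-error formula for a sample proportion at 95% confidence, sketched here under the conservative assumption p = 0.5 (Gallup’s exact methodology may differ):

\[ \text{MOE} \approx 1.96 \sqrt{\frac{0.5 \times 0.5}{1022}} \approx 0.031, \quad \text{or roughly } \pm 3 \text{ percentage points} \]

\[ 0.60 - 0.031 \approx 0.57 > 0.50 \]

Even at the low end of the interval, opposition to the wall remains a clear majority.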

 

After the Revolutionary War, American politicians and visionaries, in keeping with Jefferson’s sentiments in his Declaration of Independence, sought to do politically something that had never been done before: build a nation beholden to the will of the majority of the people over, pace Athens, a large expanse of land. For Jefferson, vox populi (the voice of the people) was the axial principle of a representative democracy, and he was fond of using the metaphor of a machine to describe governmental efficiency in keeping with vox populi in two distinct senses.

 

A fine illustration of the first occurs in a letter to Comte de Tracy (26 Jan. 1811), in which Jefferson writes fondly of Washington’s cabinet. “Had that cabinet been a directory, like positive and negative quantities in algebra, the opposing wills would have balanced each other and produced a state of absolute inaction. But the President heard with calmness the opinions and reasons of each, decided the course to be pursued, and kept the government steadily in it, unaffected by the agitation. The public knew well the dissensions of the cabinet, but never had an uneasy thought on their account, because they knew also they had provided a regulating power which would keep the machine in steady movement” (see also, TJ to Joel Barlow, 11 Apr. 1811, and TJ to Jedediah Morse, 6 Mar. 1822). Machine-like efficiency meant, thus, that the various, often disparate, parts of government would strive to work together with a common aim. That common aim was to give expression politically to vox populi, which Jefferson, in his First Inaugural Address, called a “sacred principle”—“The will of the majority is in all cases to prevail,” though he added that that will “to be rightful must be reasonable.”

 

Given government united by the aim of actualizing vox populi, Jefferson was also fond of describing the function of an elected official as, in some sense, machine-like—as it were, perfunctory. In Summary View of the Rights of British America (1774), Jefferson enjoined King George III: “This his Majesty will think we have reason to expect when he reflects that he is no more than the chief officer of the people, appointed by the laws, and circumscribed with definite powers, to assist in working the great machine of government, erected for their use, and consequently subject to their superintendance.” The sentiment is that a governor is a superintendent of the great machine of government—a steward, not a lord. To Benjamin Rush (13 June 1805), Jefferson said, “I am but a machine erected by the constitution for the performance of certain acts according to laws of action laid down for me.” The notion here is that, once elected, a president’s will is no longer his own, but that of the people.

 

With today’s amaranthine political bickering, the metaphor of government as a machine, whose parts work toward the efficient functionality of the machine, is laughable. What is more ludicrous is the notion of elected representatives functioning as machines in working for the will of their constituency, and the president being beholden to the will of the general American citizenry. The aim of the Revolutionary War, if we follow Jefferson in his Summary View and Declaration, was resistance to tyranny. He was adamant in his Declaration that no revolution ought to be begun for “light & transient causes,” but only on account of “a long train of abuses & usurpations pursuing invariably the same object.” The greatest tyranny is government indifferent to the will of the people.

 

That is where we are with Trump’s wall and the national security crisis, birthed by Congress’ failure to acquiesce to the will of The One. We are in a state of national emergency because of presidential pouting. In all of the partisan bickering, vox populi is too infrequently mentioned. 

 

Jefferson also wrote in his Declaration that “mankind are more disposed to suffer while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed.” We are today flooded with governmental evils, and we suffer them, not because they are sufferable, but because we have become apathetic to injustice and the ideals for which our forebears fought. As Jefferson prophesied in Query VII of his Notes on the State of Virginia, when the Revolutionary War is long forgotten and Americans fix themselves on “the sole faculty of making money,” they will become mindless of their rights. Our shackles “will remain on us long, will be made heavier and heavier, till our rights shall revive or expire in a convulsion.” If we follow the general trend of indifference, expiration of rights through convulsion, if it has not already occurred, seems the most likely path.

Proof that Bombs Can Stop Genocide

 

A new study has documented the Syrian government’s role in more than 300 chemical weapons attacks against its own citizens. Significantly, the new report found no evidence of any such gassing attacks in the ten months since last year’s U.S. missile strike on Syrian chemical warfare sites. 

It’s time for some soul-searching by those who denounced that U.S. military action—including one of the current candidates for the Democratic presidential nomination and officials of the U.S. Holocaust Memorial Museum. It’s time to acknowledge that American bombs can stop genocide.

The landmark Syria report, released February 17 by the Berlin-based Global Public Policy Institute, found there were at least 336 chemical weapons attacks in Syria between 2012 and 2018, and 98% of them were perpetrated by the Assad regime.

(Link to the full text of the report:  https://www.gppi.net/2019/02/17/the-logic-of-chemical-weapons-use-in-syria)

According to the report’s detailed timeline of the Syrian chemical atrocities, there have been no such attacks since April 7, 2018. That date is significant—it was one week before the United States carried out major missile strikes on multiple Syrian chemical weapons facilities. 

The strikes were praised by a wide range of foreign leaders and by prominent voices across the American political spectrum. But there were some notable, and disturbing, exceptions.

Rep. Tulsi Gabbard (D-Hawaii) said that since “Syria has not declared war against the U.S.,” missile strikes on the chemical sites were “illegal” and unconstitutional. Gabbard earlier said she was “skeptical” as to whether the Assad regime really was gassing civilians. She is now a candidate for the Democratic presidential nomination.

Rebecca Erbelding, a staff historian at the U.S. Holocaust Memorial Museum, tweeted in response to the missile strikes: “There are viable ways that the US can aid those being persecuted under an evil regime. Bombing isn't one of them.”

A few months earlier, the Museum had stirred controversy by issuing a report arguing it would have been “very difficult” for the US “to take effective action to prevent atrocities in Syria.” That sounded like a justification of President Barack Obama’s embarrassing failure to act on his famous “red line” ultimatum. After an outcry, the Museum backed down and deleted the report’s most objectionable language.

Amnesty International responded to the U.S. strike on the chemical weapons targets with a press release characterizing Assad’s atrocities as “alleged violations of the Syrian government.” Meanwhile, Code Pink and other antiwar groups staged “Hands off Syria” rallies around the country.

Seventy-five years ago this spring, the Roosevelt administration learned the full details of the mass gassing of Jews in the Auschwitz death camp. Thanks to two escapees from the camp, US officials even received detailed maps pinpointing the location of the gas chambers and crematoria.

Jewish groups in the United States and elsewhere pleaded with US officials to order air strikes on the mass-murder machinery, or on the railway lines and bridges over which hundreds of thousands of Jews were being deported to Auschwitz.

Since US planes were already bombing German oil fields within five miles of the Auschwitz gas chambers as well as railway lines and bridges throughout that region of Europe, it would not have diverted from the war effort to drop a few bombs on the transportation lines to Auschwitz or the mass-murder machinery. But the Roosevelt administration refused.

Those who have not learned from the moral failures of the Holocaust era should at least pay attention to more recent evidence of how U.S. military force can be used to interrupt genocide or other atrocities.

Recall that President Bill Clinton used air strikes to put an end to atrocities in the Balkans. President Obama used military force to preempt the plan by Libyan dictator Muammar Qadaffi to carry out what the president called “a massacre that would have reverberated across the region and stained the conscience of the world.” Obama also took military action to end the ISIS siege of thousands of Yazidi civilians in Iraq.

The new report on Syria urges the US to further damage Assad’s chemical weapons potential by “directly targeting the military formations that would be responsible for any future attacks.” It argues that “the Syrian helicopter fleet, which has played a critical role in the delivery of conventional and chemical barrel bombs, should be a primary target.”

If President Roosevelt had heeded the pleas that were made in 1944 to use force against the machinery of genocide, many lives could have been saved. Let’s hope the current president will learn from FDR’s mistake.

How the Shortest Presidency Spurred a Brief Constitutional Crisis

William Henry Harrison took the Oath of Office on a cold and stormy day. Standing in the freezing weather without a coat or hat, the 68-year-old military hero delivered the longest inaugural address in American history. At more than 8,000 words, it took nearly two hours to read (even after Daniel Webster had edited it for length!).

A few days later, Harrison caught a bad cold which quickly turned into pneumonia. Doctors tried to cure the president with opium, castor oil, and other remedies, but the treatments only made Harrison worse, and he died on April 4, 1841. The first American president to die in office, Harrison served only 31 days.

Lasting only a month, Harrison’s presidency was too short to leave a stamp on the office, but one thing is certain: his death caused a brief constitutional crisis involving presidential succession.

The question was whether Vice-President John Tyler would be “acting” as President or actually become President upon Harrison's death.

Article II of the Constitution could be read either way. The relevant text states: 

"In Case of the Removal of the President from Office, or of his Death, Resignation, or Inability to discharge the Powers and Duties of the said Office, the Same shall devolve on the Vice President..."

Did "the Same" mean the Office of the Presidency itself or merely the powers and duties of the office?

After consulting with Chief Justice Roger Taney (who responded with extreme caution, saying he wished to avoid raising "the suspicion of desiring to intrude into the affairs which belong to another branch of government"), Harrison’s advisors decided that if Tyler simply took the Oath of Office, he would become president. Despite his own strong reservations, Tyler obliged and was sworn in as the 10th president of the United States on April 6, 1841.

When Congress convened in May, it passed a resolution confirming Tyler as president. Once established, this precedent of presidential succession remained in effect until 1967, when the Twenty-Fifth Amendment was ratified, providing that in case of the removal of the President from office or of his death or resignation, the Vice President shall become President.

Harrison’s death, by the way, resulted in three presidents serving in one year (Martin Van Buren, Harrison, and Tyler). This has happened on only one other occasion in American history. In 1881, Rutherford B. Hayes was succeeded by James Garfield, who died from an assassin's bullet later that year, and Chester Arthur became president.

What Dinesh D’Souza's "Death of a Nation" Gets Wrong About Martin Van Buren

In the recently published Death of a Nation: Plantation Politics and the Making of the Democratic Party, Dinesh D’Souza takes Democratic party founder Martin Van Buren to task for creating a racist political party in the 1820s. According to D’Souza, “based on his observations of the rural plantation–and the similarities he noted between slaves in the South and newly arriving impoverished immigrants in the North–Van Buren adapted the Democrats’ plantation model to urban conditions.” In doing so, Van Buren “helped create new ethnic plantations based in the cities, populated by immigrants who were dependent and exploited by the Democratic Party in the North in somewhat the same manner as the slaves were by the Democrats in the South” (18).

      

Like many nineteenth-century northern politicians, Van Buren was contradictory when it came to race. On the one hand, he held views at times that appeared anti-slavery. In 1822, Van Buren, then in the U.S. Senate, supported a legislative amendment that would have made it difficult to bring enslaved people into the new territory of Florida. Three years later, he called the slave trade “detestable.” When nominated as the presidential candidate for the antislavery Free-Soil party in 1848, Van Buren argued against “the evils of slavery.” 

 

On the other hand, Van Buren was no saint when it came to race. Early in his political career, he owned an enslaved man by the name of Tom. During his presidential campaign of 1836, Van Buren came out against congressional interference with slavery in the nation’s capital. As civil war threatened to divide the nation toward the end of James Buchanan’s presidency, Van Buren shared the perspective of many proslavery northern Democrats and wished the “slavery agitation,” but not the institution itself, would simply disappear. It was a position that he had held since the Missouri Crisis of 1819.

D’Souza largely ignores Van Buren’s documented views on, and actions regarding, race. Instead, he pushes forward the argument that Van Buren created the Democratic party to perpetuate a racist welfare state. His main piece of evidence for Van Buren’s intentionality in this regard centers on the New Yorker’s 1827 letter to Richmond, Virginia, newspaper editor Thomas Ritchie. In this famous letter, Van Buren outlined his plan to construct what became the Democratic party and the two-party system that persists today. He argued that the original partisanship of the founding era had disintegrated into factionalism, allowing sectional jealousy and antagonism to emerge as a divisive force that threatened the Union. Van Buren’s solution was to create a national party comprising “the planters of the South and the plain Republicans of the north.” “Party attachment in former times furnished a complete antidote for sectional prejudices by producing counteracting feelings,” he argued to Ritchie.

       

Instead of seeing a fairly straightforward description of political party organization, D’Souza sees Van Buren working to strike a nefarious bargain with proslavery southerners to protect a reprehensible system of exploitative labor. He crafts a story that has Van Buren traveling secretly through the South in the 1820s, skulking in the background as he sought to strike bargains with prominent southerners such as Ritchie and Vice President John C. Calhoun of South Carolina. In D’Souza’s words, Van Buren “was an unscrupulous man in the process of creating an unscrupulous party” (84), and he charges “progressive admirers,” such as historians Robert V. Remini and Ted Widmer, with covering up or excusing Van Buren’s motives, all to mislead an unsuspecting public (83).

         

D’Souza’s most damning claim is that Van Buren used the Democrats to create an “urban plantation” model, exemplified in the urban political machine (75). In his words, “Van Buren had an idea, one that could only have come from a complex, conniving brain such as his. . . . Why not re-create the Democratic model of the rural plantation in the Northern cities? Why not make the new immigrants just as dependent on the Democratic Party in the North as the slaves were dependent on the Democratic planters of the South?” (86) 

 

At this point, D’Souza reveals his intellectual bankruptcy. What is the evidence for his claim? “We have to tread carefully here,” he cautions, “because nowhere did Van Buren write his idea down.” “If he did,” D’Souza continues, “it must have been in the manuscripts that he–ever the circumspect strategist–subsequently destroyed” (86). So, how does D’Souza know that Van Buren had this idea for creating an urban plantation modeled on the slave plantations of the South? “The idea can be seen in its implementation,” he concludes, which took place via the urban political machine that Van Buren created. This machine “demanded the complete allegiance of organizers and constituents from the state right down to the local level. . . . The machine told them how to vote and required them to campaign for its entire slate during election periods” (87).

 

Here, D’Souza attributes to Van Buren a number of thoughts that are unsupported by evidence. According to him, Van Buren saw “the distinguishing feature” of immigrant groups as “a clannish solidarity that was based on their origins” (87). The New Yorker also “saw that the slaves, in a parallel if not similar situation, had created precisely this sort of communal solidarity on the plantation” (87). D’Souza then asserts, “from the immigrant yearning for survival and security that he well understood, and from their collective ethnic identity that he carefully observed, Van Buren realized the possibility for creating the same type of enduring dependency he had witnessed on the slave plantation but this time in Northern cities” (87). He concludes, “These clans, Van Buren figured, could then be ruled by Democratic Party bosses who would demand ethnic loyalty in exchange for political patronage” (88).

The historiography of the Jacksonian era, and of U.S. history, bears out D’Souza’s argument that political parties have appealed to immigrants’ ethnic identities in order to bind them to a partisan identity. But what D’Souza imputes to Van Buren in Death of a Nation is something different. He says that Van Buren observed enslaved people on southern plantations and, based on that experience, used his upstate New York political machine–the Albany Regency–to create a parallel experience among immigrants in the northern cities. From that, D’Souza argues, flows the Democratic party’s treatment of non-white voters today. That is a huge claim to hang on zero evidence.

 

The irony of D’Souza’s portrait of a malevolent Van Buren is that the purported architect of the urban plantation model failed to use his genius to his advantage. Yes, Van Buren won the presidency in 1836, but his victory showed significant weaknesses within the Democratic coalition. He narrowly won Pennsylvania and Virginia, two states that, given D’Souza’s argument, should have been solidly behind Van Buren. Four years later, Van Buren decisively lost his reelection bid to Whig candidate William Henry Harrison in an electoral catastrophe for the Democrats. Seeking another shot at the White House in 1844, Van Buren was unable even to secure the nomination of his party, losing to James K. Polk at the Democratic national convention. For such an alleged mastermind of the urban political machine, Van Buren was an abject failure at making it work in his favor.

         

Death of a Nation contains many problems, and D’Souza’s flawed presentation of Van Buren is emblematic of the book’s main issue. Since he believes that modern-day Democrats are exploiting race to their political advantage, then it only appears logical, to D’Souza at least, that this alleged racism stems from the party’s origins. In embracing this claim, he wrongly contends that the Democratic and Republican parties of the twenty-first century possess the same ideologies, principles, and motivations as those of their nineteenth-century predecessors. To make this ahistorical argument work, he has to ignore the shifts that have occurred as the two major political parties have developed and transformed. D’Souza’s inability, or unwillingness, to ground his case in actual evidence calls into question his motivation and suggests that he wants to use history as partisan propaganda, not as a way to illuminate the past.

Whose Country is This? Trump, Coolidge, and Immigration

 

The current divisive debate over national immigration policy has two sets of confrontational positions. On one side, advocates of immigration favor a liberal policy of admitting sizable numbers of immigrants, no discrimination based on ethnicity, religion, or national origin, and protection of undocumented immigrants. On the other side, President Trump is the leading spokesperson and advocate for building a wall on our southern border with Mexico, banning certain immigrants from entering the country, and deporting those living here illegally, many of whom, he insists, are criminals. 

 

The debate in some ways echoes discussions in the nation a century ago.

 

In 1921, the vice president published an article entitled "Whose Country Is This?" in the popular magazine Good Housekeeping.  "We are confronted by the clamor of multitudes who desire the opportunity offered by American life," the author noted. But America has no place for "the vicious, the weak of body, the shiftless or the improvident....Our country must cease to be regarded as a dumping ground." People accorded the privilege of immigrating to the U.S. should become productive, patriotic citizens. "It would not be unjust to ask of every alien:  What will you contribute to the common good, once you were admitted through the gates of liberty?" 

 

"There are racial considerations too grave to be brushed aside for any sentimental reasons," the author continued.  "Biological laws tell us that certain divergent people will not mix or blend.  The Nordics propagate themselves successfully.  With other races, the outcome shows deterioration on both sides."

 

What was needed was "the right kind of immigration."

 

That sounds a bit like some government leaders who are demanding immigration restriction today. Actually, it was Calvin Coolidge (R, Vice President, 1921-1923, President 1923-1929).

 

He became President on August 2, 1923, upon the death of President Warren G. Harding, and was elected in his own right the next year. Coolidge was bland and taciturn. He tried to avoid controversy. But Coolidge had strong views on immigration, some with parallels to today.

 

In his first address to Congress on December 6, 1923, he struck a theme of limited, selective immigration: “New arrivals should be limited to our capacity to absorb them into the ranks of good citizenship. America must be kept American. For this purpose, it is necessary to continue a policy of restricted immigration.” 

 

In 1924, he signed the Johnson-Reed Immigration Act which severely limited immigration, imposed a quota system based on the 1890 census which in effect favored northern Europeans over others, continued a long-standing ban on Chinese immigration, and imposed a new one on Japanese immigration.

 

His views on immigration were complicated.

 

Speaking to a delegation of labor leaders on September 1, 1924, he asserted that "Restricted immigration has been adopted by this administration chiefly for the purpose of maintaining American standards. It undoubtedly has a very great economic effect. We want the people who live in America, no matter what their origin, to be able to continue in the enjoyment of their present unprecedented advantages. This opportunity would certainly be destroyed by the tremendous influx of foreign peoples if immigration were not restricted. Unemployment would become a menace, and there would follow an almost certain reduction of wages with all the attendant distress and despair which are now suffered in so many parts of Europe. Our first duty is to our own people." 

 

The Republican party platform that Coolidge campaigned on that year put the economic case this way: "The unprecedented living conditions in Europe following the world war created a condition by which we were threatened with mass immigration that would have seriously disturbed our economic life. The law recently enacted [the Johnson-Reed Act] is designed to protect the inhabitants of our country, not only the American citizen, but also the alien already with us who is seeking to secure an economic foothold for himself and family from the competition that would come from unrestricted immigration." 

 

Putting the jobs argument more directly, immigration restriction "saves the American job for the American workman," as Coolidge said in a speech in December of that year.

 

On the other hand, he opposed some immigration restrictions and celebrated America as a melting pot. 

 

For instance, he lobbied Congress not to include the Japanese provision in the immigration act, and instead to continue a longstanding, informal agreement by which Japan voluntarily limited the number of its citizens emigrating to America. Congress included it anyway. In his formal signing statement on May 26, 1924, an angry Coolidge called the provision "unnecessary and deplorable" and asserted that Americans had a "sentiment of admiration and cordial friendship for the Japanese people" despite the new law.

 

He told the American Legion convention in 1925 that “Whether one traces his Americanism back three centuries to the Mayflower, or three years [ago in] the steerage, is not half so important as whether his Americanism of today is real and genuine. No matter by what various crafts we came here, we are all now in the same boat.”

 

In a 1926 speech, he said "when once our feet have touched this soil, when once we have made this land our home, wherever our place of birth, whatever our race, we are all blended in one common country. All artificial distinctions of lineage and rank are cast aside. We all rejoice in the title of Americans.”

 

In Calvin Coolidge's public utterances and his actions on immigration, several themes emerge. Some have reverberations for today.

 

* Coolidge emphasized that America has prospered and excelled in the past. Times were good then. But things seem to be slipping. Principles and values seemed in danger and future prospects appeared dimmer. Coolidge thought Americans had to be on guard. That sentiment sounds similar to Trump's slogan of "Make America Great Again."

 

* Coolidge encouraged assimilation. He believed that most past immigrants adopted American values and assimilated with the population already living here. Race, religion, and a consensus about the importance of family, hard work, and patriotism were important parts of that process. But, he went on, people now clamoring for admission were of different races and religions, and were determined to hold onto their own cultures and values. These new immigrants tended to stay together rather than assimilate and blend in and, to Coolidge, that made them a threat to the nation. Coolidge's views in this area seem similar in some ways to Trump's and other immigration restrictionists.

 

* Economics was a critical issue in Coolidge's thinking. The economy was expanding but there were only so many jobs to go around, he implied. Letting in too many immigrants would take jobs from citizens already here. America’s capacity to absorb newcomers was therefore limited. That sounds a lot like immigration restrictionists' arguments that immigrants (particularly undocumented immigrants) compete with American citizens for jobs, especially low-paying positions.

 

* Coolidge felt that Americans need not be concerned with conditions in other countries or the fate or prospects of people who wanted to come in as immigrants but were not allowed to do so. That was not something for which Americans had responsibility. It was up to those countries, and to the individuals living there, to fend for themselves. That, too, parallels the view expressed by immigration restrictionists today that unemployment, poverty, and violence elsewhere in the world, e.g., Central and South America, do not justify people from those nations seeking sanctuary here in the United States. 

 

* We have to keep to "America First!" -- a vague and undefined but popular slogan among Coolidge and other conservatives in those days, and one occasionally used by President Trump. It has overtones of American exceptionalism, nationalism, and patriotism but also undertones of nativism and racism.

 

Whose country is this? It was a central question a century ago, and still is today. President Coolidge and President Trump might have similar answers to the question.

 


 

What To Know About the History of Impeachment

Calls for impeachment of the incumbent president echo throughout the blogosphere. I cannot open my Facebook account without seeing at least a dozen impeachment references, some as simple as "Impeach Trump," some far more labored and pedantic about why impeachment is or is not possible. Thus far, I have not joined the online twitter. Here I shall.

 

Impeachment is the province of the lower house of the assembly, in the federal government, the House of Representatives. Trial on impeachment articles (separate alleged misconducts) lies in the upper house, the US Senate. This division of responsibility follows the English law, with impeachment in the House of Commons and trial in the House of Lords. In England, anyone could be impeached and tried for any offense. Punishment could be anything the Lords imposed. 

 

The American precedents were different from their inception, as impeachment was a tool to impose colonial accountability on otherwise immune royal officials. Thus impeachment was limited to misconduct in office, by officials, with punishment limited to removal and disqualification for future office. These were the rules adopted by the new states when they added impeachment and trial to their state constitutions. The state cases involved misfeasance in office (for example neglecting duties), malfeasance (for example taking bribes) and other crimes under state law. Relying on the states’ precedents, these were the rules adopted by the framers of the federal Constitution. The English precedent was like a distant cousin, once removed. 

 

The constitutional provisions for impeachment and trial on impeachment are almost maddeningly terse, a characteristic of much of the rest of the document. For one thing, they are dispersed between two articles of the Constitution. Article I, section 2: “The House of Representatives shall choose their speaker and other officers; and shall have the sole power of impeachment.” Article I, section 3: “The Senate shall have the sole Power to try all Impeachments. When sitting for that Purpose, they shall be on Oath or Affirmation. When the President of the United States is tried, the Chief Justice shall preside: And no Person shall be convicted without the Concurrence of two thirds of the Members present. Judgment in Cases of Impeachment shall not extend further than to removal from Office, and disqualification to hold and enjoy any Office of honor, Trust or Profit under the United States: but the Party convicted shall nevertheless be liable and subject to Indictment, Trial, Judgment and Punishment, according to Law.” Article II, section 4: “The President, Vice President and all civil Officers of the United States, shall be removed from Office on Impeachment for, and Conviction of, Treason, Bribery, or other high Crimes and Misdemeanors.” In addition, under Article II, section 2, the President "shall have power to grant reprieves and pardons for offenses against the United States, except in cases of impeachment."

 

Curiously, although conviction on any of the articles of impeachment in the Senate requires a 2/3 vote, the House only needs a majority to pass an article of impeachment. Because the upper houses of the states were deliberative bodies, and the lower houses representative delegations, it was a lot easier to bring an impeachment than to gain a conviction. This is what the US government learned when the new Congress met. One could remove a district judge who failed to attend sessions because of a drinking problem (no crime, but certainly misfeasance), but not a justice of the Supreme Court who demonstrated extreme political partisanship in his charges to the federal grand jury and his treatment of counsel in court. A judge who used his contempt power to stifle a critic would be acquitted, but not one who retained his office and supported secession. Judges could bully parties before them and be acquitted, but not solicit or extort money from them. Tax evasion and perjury were grounds for impeachment and conviction, as was any criminal offense, though most often officials accused of this sort of financial misconduct resigned, and thus escaped impeachment. The airily promoted notion that the House could impeach a sitting official for anything the House decided was an impeachment offense was not sustainable—at least as a matter of law. 

 

Two presidents have been impeached, although impeachment was threatened against a good many others, and one of these resigned before the House could act. Although corruption was alleged, clear partisanship was involved in all but one of these cases—that is, members of the president’s party defended him, while members of the opposing party sought to remove him. In these cases, the corrupt acts were laid not at the president’s door, but at that of his appointees and personal friends. The one case of bipartisan inquiry, against Richard Nixon, ended with his resignation before the House could vote on impeachment articles framed and adopted by the Judiciary Committee.

 

Two impeachments of presidents went to trial, the first of Andrew Johnson in 1868 and the second of William Jefferson Clinton in 1999. Johnson’s opposition to Republican Reconstruction plans had earned him the enmity of the majority of that party in both Houses. Passage of the Tenure of Office Act in 1867 set a trap for Johnson, into which he arrogantly marched. By removing Edwin Stanton from the Cabinet, in violation of the act, he provided his enemies with arguable grounds to seek his removal. The eleven articles of impeachment all derived from the removal of Stanton, but conviction failed in the Senate by one vote.

 

In the Clinton case, the two articles voted up by a Republican majority (including five Democrats) alleged that Clinton had perjured himself and obstructed justice in his deposition in the Paula Jones case—not that he lied about his relations with Jones, but that he lied about his sexual liaison with Monica Lewinsky. The Senate declined to convict, no article gaining a majority vote.

 

What then does “treason, bribery, and other high crimes and misdemeanors” mean? Does its meaning change when the officer is a president rather than a federal judge or other inferior official? Could the high crimes and misdemeanors derive from actions before his or her term of office? Must the actions pertain to official duties? A little common sense here will go a long way. No one wants to deny the American people the services of the chief executive they elected, or to hamper that individual while he or she attempts to perform official duties. At the same time, a much higher principle is at stake: that no one in our republic is above the law. As evidence of this, note that the presidential pardoning power, which extends broadly to all offenses including treason and bribery, does not extend to impeachments.

 

If a president committed a felony in office, there is no doubt that he or she would be liable to impeachment. He or she could not pardon himself or herself. If the president simply refused to perform the duties of the office, that would be misfeasance, and impeachable. If the president violated an act of Congress or a provision of the Constitution, for example the Emoluments Clause, that would be grounds for impeachment. If the president perverted or obstructed the course of justice in a federal grand jury proceeding or a prosecution, that would be grounds for impeachment. 

 

But would these also be sufficient for a vote to remove the president from office? Even if a member of the Senate believed that the president had committed any one of these offenses, would that compel a vote to remove the president?

 

These are matters of constitutional jurisprudence. In law, I think the answers to the questions in the paragraph above are clear. In politics, they are much murkier. Members of the upper house have in most cases shown great loyalty to their party and to their party’s president. In other words, for veteran politicians, party means more than other considerations. This is unfortunate, for in a republic of laws, whose Constitution was framed in a time when political parties were regarded as dangerous factions, loyalty to party above fidelity to law is dangerous to all of us.     

Sacrificing Greatness For Personal Ambition

 

Over the past year, I had the opportunity to speak to many Turkish scholars and former government officials, the majority of whom left your country because they feared for their lives and their loved ones. The one question that kept surfacing is why a leader like you—who achieved the pinnacle of power by undertaking the most significant social, political, and judicial reforms, pursuing economic development, and coming close to establishing a model of Islamic democracy—reversed gears and abandoned your most impressive achievements. From every angle I examined your behavior, I could not escape the conclusion that your reforms were nothing more than a vehicle by which you could solidify power to allow you to promote your political Islamic agenda.

As far back as December 1997, you publicly equated Islam to a military crusade, citing a poem that states in part: “The mosques are our barracks, the domes our helmets, the minarets our bayonets, and the believers our soldiers.” Your unruly ambition to become the leader of the Sunni Muslim world is the driving force behind your ceaseless efforts to implant a Turkish-oriented religious dogma in many countries in Europe and the Middle East. You have invested hundreds of millions of dollars, building mosques and religious educational institutions and appointing Turkish imams to promote your brand of Islam. MP Alparslan Kavaklıoğlu, a member of your AKP and head of the parliament’s Security and Intelligence Commission, echoed your skewed sentiment, stating last year that “Europe will be Muslim. We will be effective there, Allah willing. I am sure of that.” In May 2018, you said that “Jerusalem is not just a city. [It] is a symbol, a test, a qiblah. If we can’t protect our first qiblah, we can’t be confident about the future of [our] last qiblah.” On a separate occasion, you told the crowd that the Muslim reconquest of Jerusalem would be “Soon, God willing.” The EU countries have caught on to your alarming scheme; Chancellor Sebastian Kurz of Austria in June 2018 ordered the closing of seven mosques and scrutinized the right of dozens of Turkish imams to remain in the country, citing suspected violations of an Austrian law that bans “political Islam” or foreign financing of Muslim institutions.

You have shattered the dreams and aspirations of your fellow countrymen by denying them their inherent human rights, and dismantled every pillar of genuine democracy. You have sought and been granted constitutional amendments providing you near-absolute power, which you grossly abuse to realize your pipedream of becoming the new Atatürk of Turkey’s modern era. You have subordinated the judiciary and the rule of law to your whims, ensuring that courts pass judgments consistent with your own position on any alleged offenses, all in the name of social piety, national security, and solidarity.

You orchestrated a military coup that was planned to fail, calling it “a gift from God” that permits you to chase your enemies with vengeance. You accuse your archenemy Fetullah Gülen of being behind it, but have produced no credible evidence to support your charges. You used the ‘coup’ to enact emergency laws, seizing ever more power. Tens of thousands of suspected Gülen followers were incarcerated, including teachers, judges, law enforcement officers, human rights defenders, doctors, lawyers, political activists, and students, leaving behind their despondent and despairing families.
You have imprisoned over 17,000 women with their children, wreaking havoc on a multitude of innocent civilians. You have hunted Turkish nationals whom you accuse of being affiliated with the Gülen movement and pressured many countries to expel such nationals. Fearing public accountability, you have targeted the press and labeled it the enemy of the people. You have closed scores of media outlets, and imprisoned and tortured more than 200 journalists. In today’s Turkey, freedom of the press is a thing of the past, as just about all existing licensed media outlets must dance to your tune and strongly support your political agenda to stay in business. In the same vein, you are prohibiting the right of assembly while stifling the academic community and think tanks, who are the forces of social, political, and economic advancement.

Your Kurdishphobia seems to blind you. You have been fighting the PKK and accusing them of being terrorists, when in fact they represent the cause of their fellow Turkish Kurds. At least 40,000 have been killed on both sides over the past 50 years, and there is no end in sight as you vow to kill every last one standing, which is completely delusional. The PKK are freedom fighters fighting on behalf of their Kurdish community, who simply want to maintain their cultural heritage, keep their language alive, enjoy their folk music and dance, and have their basic human rights respected. Instead, you abruptly ended the peace negotiations in July 2017 and continue to persecute them indiscriminately, inflicting unbearable pain and suffering on your own fellow citizens yet demanding absolute loyalty.

You have objected, and continue to object sternly, to the establishment of Kurdish autonomous rule in Syria, fearing that your own Kurdish community would follow suit. Under the pretext of fighting ISIS, you invaded Syria in order to wage merciless war against the Syrian Kurdish militia—the YPG—whom you falsely accused of being a terrorist organization allied with the PKK. At the top of your agenda in Syria, however, is the eventual establishment of a permanent presence there. You betrayed your alliance with the US-led coalition to fight ISIS by providing ISIS with logistical support, allowing volunteers to cross the border to join their ranks in Syria and Iraq, and trading oil for weapons, all while turning a blind eye to their reign of terror and unspeakable atrocities.

You pretend to be a Western ally, but you cozy up to Russia’s Putin—the West’s staunchest enemy. You remain determined to purchase Russia’s S-400 air defense system, in defiance of the US and NATO. As a NATO member, you have violated every clause of the charter by committing horrifying human rights abuses at home and destroying every tenet of democracy. You are creating alliances with Islamic states and organizations, and drawing into your orbit countries with predominantly Muslim populations, including the Balkan countries, by investing in their infrastructure and providing them with military equipment and training. Your policy of “zero problems with neighbors” has failed; instead, Turkey has problems with just about every neighboring state, including Syria, Iraq, Greece, Cyprus, and Armenia. Many world leaders know you are a conniving, power-thirsty dictator, but deal with you out of necessity only because of Turkey’s geostrategic importance as a bridge between East and West and a hub of oil and gas. You have no shame about dispatching your thugs to foreign countries to do your bidding.
Among many other egregious incidents, 15 members of your security detail physically attacked protestors outside the Turkish Embassy in Washington, DC in 2017. In 2016, your security team fought with Kurdish protestors in Ecuador. In 2015, your bodyguards scuffled repeatedly with Belgian police, and in 2011 they fought UN security personnel.

You are antisemitic to the core. Among many of your antisemitic statements, at a recent meeting of the Turkey Youth Foundation in Istanbul, you stated that “the Jews in Israel kick men, but also women and children, when they’re on the ground.” You also told the audience, “Don’t be like the Jews.”

More than 100 years have passed since the genocide of over one million Armenians by Ottoman soldiers in the wake of World War I. Even though history books have fully documented, and the international community recognizes, this horrifying event, you still haven’t mustered the courage to admit it. In fact, you vehemently criticize any country and condemn any individual that attributes the genocide to the Ottomans. Although it was the Ottomans and not the present Republic of Turkey that committed these atrocities, you do not want to malign the Ottomans, as you have made no secret of your ambition to revive the power and influence of the Ottoman Empire under your leadership.

The sad thing, Mr. Erdogan, is that we are living now in the 21st century; the days of conquest and undue influence over the fate of other countries are over. You had a historic opportunity to become a respected, benevolent leader, loved by your countrymen and admired by world leaders. But you have squandered it all because of your desire to become the new Sultan of a would-be empire that exists only in your wretched imagination. You, like any other mortal, will be gone. Perhaps it is a good time for you to reflect and ask yourself: what am I leaving behind? In your wake, Mr. Erdogan, you leave a shattered Turkish people, yearning to be free, free to think and believe, free to assemble, free to criticize, free to use their ingenuity and resourcefulness to create a free society. But you have sacrificed the welfare of the Turkish people for blind personal ambition, for which you will be remembered.

Do We Make Writing on Jefferson Harder than It Really Is?

 

Merrill Peterson, the preeminent Jeffersonian scholar, writes in his watershed work, The Jefferson Image in the American Mind, “Jefferson was a baffling series of contradictions.” Joseph Ellis, in American Sphinx, states magisterially that Jefferson’s “multiple personalities” are much like “the artful disguises of a confidence man.” Peter Onuf writes in The Mind of Thomas Jefferson, “The search for a single definitive, ‘real’ Jefferson is a fool’s errand, setting us off on a hopeless search for the kind of ‘knowledge’ that even (or especially) elude sophisticated moderns in their encounters with each other—and themselves.” Thus, there appears at the beginning of nearly every Jeffersonian biography, including those by the best biographers, some statement of or caveat concerning the difficulty, if not impossibility, of getting to know Jefferson. It has become part of the ritual. It is also part of the lure.

Is Jefferson really a “baffling series of contradictions” and an “American sphinx,” and is Jeffersonian scholarship really a “fool’s errand”? Might not such caveats merely be rationalizations for the possibility of scholarly mistakes?

What is frequently missed in reading Peterson’s work is that he is not committed, as Onuf is, to the impossibility of continued progress in coming to know Jefferson. Yet there are, for Peterson, nodi (knotty difficulties).

 

One difficulty is that Jefferson—a man of great intellectual breadth and depth, and a man of uncommon ideals—wrote voluminously and appealed to everyone at some cognitive or visceral level. Because he appeals to everyone at some level, says Peterson, scholars give numerous depictions of him. “In his letters, account books, and other memoranda, Jefferson left ample records of his personal tastes and habits; yet, as with his public record, it was possible to draw from these almost any picture the writer wished.”

 

Furthermore, Jefferson often dissimulated. “More ardent in his imagination than his affections, he did not always speak exactly as he felt towards either friends or enemies. As a consequence, he has left hanging over a part of his public life a vapor of duplicity, or, to say the least, of indirection, the presence of which is generally felt more than it is seen.”

 

Moreover, Jefferson was fundamentally a curious immixture of everyday citizen and philosopher. “It was precisely because Jefferson combined, or seemed to combine, the traits of the man-of-the-people and the man-of-vision that he was capable of being mythicized as the Father of Democracy.”

 

Yet Peterson is clear that those difficulties can be overcome. The perplexity is in the scholars, not in Jefferson. Peterson writes: “The historians could not fairly plead the lack of information on Jefferson. If still fragmentary, it was constantly on the increase. The difficulty was less one of the scholars’ knowledge than of the uses they made of it. The image of Jefferson shattered when they came through the doors of partisan, and perhaps hereditary, prejudice to the interpretation of the facts.” He adds, “If Madison was right [in asserting an early and uniform devotion to liberty and the equal rights of man], as I think he was, the apparent ironies, paradoxes, and contradictions in Jefferson’s life and thought, so much dwelled upon by latter-day scholars, mattered little in the light of this fundamental harmony and clarity of purpose.”

 

Jefferson indeed was a man of fundamental harmony and clarity of purpose. Because of those enduring qualities, Jeffersonian scholarship might be a dead lift—an inordinately difficult task—but it is not the cul de sac that scholars habitually claim it is.

 

There are reasons why scholars—and here I refer to first-tier scholars—make mistakes when approaching Jefferson.

 

First, there is a refusal to take Jefferson at his word. As Peterson states, Jefferson does not always speak frankly. He often dissimulates. The reasons are politeness and guardedness. Jefferson is in the habit of speaking to correspondents in language with which they are familiar and on topics in which they are especially interested. Moreover, dissembling often occurs because of caution. Jefferson was wont, for instance, not to share his religious views with correspondents or the general public—his own family did not know his religious views—for fear of public censure. That fear was genuine. His close friend Dr. Thomas Cooper, for instance, was kept from a professorship at the University of Virginia on account of his liberal religious views made public. Had Jefferson’s religious views been commonly known, his political career also would doubtless have been hampered. Such things noted, scholars who are committed to a Protean Jefferson tend to read into or “deconstruct” Jefferson’s writings when there is no good reason for doing so, and the result is a proliferation of amphigories that follow the whims of scholars, unprofitably lead readers in a number of directions, and tell us nothing about Jefferson.

 

Second, there is the tendency to read the secondary literature without reading much of Jefferson. This mistake occurs especially on the subjects of race and slavery, where having thoughts of one’s own might be taken as a signal of one’s own racism. On both topics, scholars characteristically remind themselves and other scholars that it is sufficient to glance at Query XIV of his Notes on the State of Virginia, without taking up Jefferson’s caveat that the views expressed on Blacks are based on limited and biased observations, and to read the writings of Gordon-Reed and adopt her views on both subjects. Jefferson is racist because he owned slaves and freed too few in his life. Furthermore, Jefferson is hypocritical because he politically preached austerity but lived high on the hog, and because he preached small government and strict constructionism but went forward with the Louisiana Purchase without constitutional sanction. The result of too much immersion in the secondary literature at the expense of reading Jefferson is scholarly moribundity. I have found in my own years of Jeffersonian study that many of the “contradictions” we find in Jefferson evanesce when one takes Jefferson at his word.

 

Third, there is an aversion or unwillingness to engage critically with others in the secondary literature. In essays and biographies on Jefferson, there is all too little critical engagement with the writings of others. The unfortunate result is a scholarly inertia in the field of Jeffersonian studies, which is somewhat of a fetid mishmash. Just about anything goes and there is little, if any, forward movement. It is one thing to recognize the right of authors to express idiosyncratic views on some issue, but that is not to say that all such idiosyncratic views carry the same weight. Some views are not well supported by evidence, and those views ought to be weeded out through scholarly critical appraisal. They are not.

 

Last, there is failure to read what Jefferson read and what shaped his thinking, other than the political literature to which Jefferson had access and that Jefferson assimilated. Jefferson was widely read. He studied the sciences, religion, law, philology, morality, political thinking, and the arts, inter alia—viz., anything that might improve the human condition.

 

It is said by a grandchild that he was more often seen with a book by a Roman or Greek author than by any other author. Authors such as Homer, Tacitus, Seneca, Cicero, Epictetus, and Demosthenes shaped his thinking more than others, and lack of acquaintance with that literature—especially in the original language—and with Greek and Roman culture leads to misapprehension of Jefferson’s political, educational, and moral views. I shall go so far as to say that anyone who wishes to be a competent Jeffersonian scholar should be trained also as a Classical scholar.

 

Again, there is Jefferson’s empiricism. He was a dyed-in-the-wool empiricist in the manner of Bacon, Locke, Kames, and Hume, and lack of acquaintance with philosophical empiricism often leads to egregious errors—especially when it comes to apprehension of Jefferson’s views in Notes on the State of Virginia. Empiricism in the manner of Bacon and Newton—e.g., use of hypothetico-deductive reasoning, appeal to simplicity, detailed description without critical commentary—appears in abundance in the book, especially in the early naturalistic queries. Again, no one without ample acquaintance with philosophical empiricism—e.g., Newton’s Principia and Stewart’s Elements of the Philosophy of the Human Mind—ought to enter seriously into Jeffersonian studies. That is why Jefferson is often said to be wishy-washy and confused in his Notes on the State of Virginia, when he claims that he is not afforded evidence sufficient to confirm a hypothesis or decide among competing hypotheses—e.g., the strange existence of petrified shells in the foothills of the Appalachian Mountains of Kentucky in Query VI.

 

Also, Jefferson took morality very seriously. I have argued in several publications that he was preeminently a moralist. His moral views were shaped mostly by ancient virtue ethics, but also by the New Testament, the moral-sense and moral-sentiment literature of his day like Kames’ Principles of Morality and Natural Religion and Hutcheson’s A Short Introduction to Moral Philosophy, religious sermons like those of Rev. Bourdaloue and Rev. Massillon, novels like Cervantes’ Don Quijote, poetry like Shakespeare’s plays and Homer’s Odyssey, and utopian literature like Mercier’s L’an 2440. His moral views were also the grounding of his political views, as the aim of a Jeffersonian republic was not only efficient governing, but also a happy and thriving citizenry in conformance with political liberalism—what I call liberal eudaimonism.

 

In sum, Jeffersonian scholarship is not a fool’s errand, but it is extremely arduous. It requires that a scholar be of large erudition and widely read in all, or almost all, subjects that Jefferson studied. When the groundwork is done, one might find that Jefferson was a man who was in vital respects much simpler, and less perplexing, than scholars typically portray him.

Barack Obama, Politics, and Presidential Rankings

 

As America celebrates Presidents Day, scholars once again rank presidents’ job performances. These rankings are often subjective and change along with the political climate. 

Former President Barack Obama was first judged in the Siena College Research Institute’s 2010 survey after he had been in office only a year. Scholars rated Obama the 15th best president. In 2014, the American Political Science Association survey rated Obama 18th. In 2017, upon leaving the Presidency, Obama was rated the 12th best president by historians in the C-SPAN Presidential Historians Survey.

Then in 2018, after one year of Donald Trump’s presidency, the American Political Science Association conducted its Presidents and Executive Politics Survey. The study questioned 170 political scientists who self-identified as liberals, Democrats, moderates, independents, conservatives, and Republicans. In this poll, Obama was ranked 8th, ten places higher than in 2014. Democrats rated Obama 6th; Independents rated him 12th; and even Republicans rated him 16th. Donald Trump ended up at the bottom—number 44. After the constant comparison with President Donald Trump, Obama was considered a better president historically. He just might reach the top ten when the next C-SPAN survey is conducted after Donald Trump leaves office.

When one revisits Obama’s record in office, he stands out for passing the Affordable Care Act, which has survived nine years of bitter attacks and attempts to destroy the Act’s coverage for 20 million Americans. His response to the 2008 financial crisis spurred economic recovery, including the greatest stock market growth in history and falling unemployment. The Dodd-Frank Act to regulate Wall Street created the Consumer Financial Protection Bureau, and the bailout of the auto industry saved an industry heavily based in the Midwest and the South. Obama also fought climate change and encouraged gun regulations, criminal justice reform, immigration reform, and increased gay rights, including marriage. He also protected large amounts of public lands as national monuments, making him one of the greatest environmental Presidents.

In foreign policy, Obama opened up relations with Cuba after a half century of isolation; made a nuclear deal with Iran that was backed by the United Kingdom, France, Germany, the Russian Federation, and China; ended US military intervention in Iraq; expanded the Iron Dome defense system that protects Israel from rocket attacks; ordered the operation that killed Osama bin Laden; and promoted more free trade agreements.

Many of these successes were reflected in how scholars evaluated Obama’s presidency and specific characteristics. In the 2017 C-SPAN survey, scholars rated Obama 3rd in Pursuit of Equal Justice For All; 7th in Moral Authority; 8th in Economic Management; 10th in Public Persuasion; 12th in Vision/Setting An Agenda; 15th in Crisis Management and Performance Within Context of the Times; 19th in Administrative Skills; 24th in International Relations; and 39th in Relations With Congress.

Obama's low ranking on Relations With Congress is, unfortunately, based on the hostile and uncooperative reaction of the Republicans in Congress. Republicans held a majority in the House from 2011 to 2015 and controlled both the House and Senate from 2015 to 2017. The party used this legislative power to block just about anything Obama pursued, including rejecting or delaying judicial and other appointments. If Obama had had a Congress similar to those enjoyed by Franklin D. Roosevelt, Lyndon B. Johnson, Abraham Lincoln, George Washington, Theodore Roosevelt, and many other Presidents, he would have been rated higher than Woodrow Wilson and possibly Lyndon B. Johnson and Ronald Reagan.

While certainly there is much to be researched and analyzed about the Obama Presidency, it is still clear that Barack Obama is highly regarded by scholars. In national public opinion, he has remained the most popular male leader of the past decade, and in polls of ordinary Americans he is seen as one of the best Presidents since World War II. While public opinion usually ranks John F. Kennedy and Ronald Reagan as the “best” and most “popular” Presidents on Presidents Day annually, it would not be surprising if Barack Obama is added to that list, or maybe even replaces JFK or Reagan in future years.

America Needs a Moral Leader in 2020

As we enter 2019, Democrats begin to debate who their nominee for president will be in 2020. Progressive or moderate; man or woman; white or minority; or simply someone who can beat Donald Trump? There is no shortage of potential candidates considering a run for the White House, but in our current political climate, I would argue there is one overriding consideration: Whoever is selected as the Democratic nominee must offer America moral leadership.

 

Previous presidents also found that moral leadership was essential. FDR wrote that the Presidency “is preeminently a place of moral leadership.”  Lyndon Johnson once said, “I recognized that the moral force of the Presidency is often stronger than the political force. I know that a President can appeal to the best in our people or the worst.” 

 

Moral leadership combines two important qualities. The first is the ability to distinguish right from wrong. A moral leader must not only know the difference between the two but be willing to act in the public square to implement a moral vision. The second is the ability to grow and change over time to address one’s own moral blind spots. The late James MacGregor Burns, in his book Leadership, writes that “at the highest stage of moral development persons are guided by near universal ethical principles of justice such as equality of human rights and respect for individual dignity.” Moral leadership “operates at need and value levels higher than those of the potential follower (but not so much higher to lose contact).” This is critical in democratic societies, where the power of a leader is grounded in the consent of the governed. As Burns notes in his introduction, “moral leadership emerges from, and always returns to, the fundamental wants and needs, aspirations, and values of the followers.” 

 

Moral leadership is sadly lacking in Donald Trump, as is any ability to grow into the job of president. Many of us had hoped that the awesome weight of responsibility inherent in the Oval Office would force Trump to grow into the role of president. Instead, he has regressed. Take Trump’s immigration policies as an example of this pattern. Since the day he announced his candidacy, Trump has been using the fear of outsiders to garner support. “When Mexico sends its people…they’re bringing drugs, they’re bringing crime, they’re rapists, and some, I assume, are good people,” Trump pronounced at Trump Tower in June of 2015. This initial position soon became the immoral policy of separating families at the border. At last count, the New York Times reports that over 3,000 children have been separated from their parents, including infants and toddlers. From promises to drain the swamp, to attacks on our democratic system, Trump’s pattern of regression has been consistent.

 

Abraham Lincoln, however, exhibits both elements of moral leadership. From an early age he knew right from wrong on the defining issue of his time, slavery, and he was willing to make clear statements to that effect in his public life. Yet, Lincoln initially espoused many problematic views on race. The fact that he learned and changed these views over time demonstrates his strength as a moral leader. 

 

Lincoln asserted in 1864 that he was “naturally anti-slavery. I cannot remember when I did not think and feel so.” Perhaps this was due to resentment of his father, who “regularly hired his son out to work for other farmers in the vicinity, and by law he was entitled to everything the boy earned until he came of age,” according to Lincoln biographer David Donald. At a campaign event in the 1850’s, Lincoln said “I used to be a slave,” a statement that another of his biographers attributed to his time working for his father without pay. Lincoln recalled how he was tormented by the sight of “ten or a dozen slaves, shackled together with irons” when he travelled by boat to New Orleans.

 

In the aftermath of the Kansas-Nebraska Act in 1854, which repealed the Missouri Compromise and opened additional territory to the expansion of slavery, Lincoln was stirred “as he had never been before.” While he had supported the Compromise of 1850, which had been in part fashioned by his political hero Henry Clay, he was aghast at any legislation that would allow slavery to expand beyond its then current boundaries. 

 

In 1854, Lincoln refuted Stephen Douglas’s position on popular sovereignty, under which Douglas maintained that the people of a territory had a right to vote for or against slavery. “The doctrine of self government is right—absolutely and eternally right—but it has no just application” to the issue of slavery, Lincoln said. The reason is that the “negro is a man” and “there CAN be no MORAL RIGHT in the enslaving of one man by another.”

 

Lincoln would return to this theme again during the famous Lincoln-Douglas debates in 1858 as part of the campaign for the Senate seat in Illinois. In the final three debates, Lincoln maintained that slavery was a moral issue. Lincoln argued that the difference between him and Douglas was “the difference between the men who think slavery wrong and those who do not think it is wrong.” Lincoln and the Republicans wanted to “prevent its [slavery] growing any larger, and so deal with it that in the run of time there may be some promise of an end to it.” Lincoln took a principled stand, not only maintaining that slavery was morally wrong, but also extending to all people, including the enslaved, the natural rights Jefferson recorded in the Declaration of Independence, especially the equal right to be paid for one’s work. “But in the right to eat the bread…which his own hand earns, he is my equal and the equal of Judge Douglas, and the equal of every living man,” Lincoln emphatically asserted to great applause at the Ottawa debate.

 

Yet Lincoln too had his own shortcomings as a moral leader when it came to race. In the Ottawa speech Lincoln said that he had “no purpose to introduce political and social equality between the white and black races,” and that physical differences would “forbid their living together upon the footing of perfect equality,” which was why he continued to support colonization of freed black people. Lincoln wanted both to oppose slavery and to remain acceptable to white voters. He supported colonization both because it was his personal opinion and because he wanted to be elected.

 

Yet Lincoln grew over time. “At the time of his death, he occupied a very different position with regard to slavery and the place of blacks in American society” than he did in the late 1850’s, writes Eric Foner. Frederick Douglass, who escaped slavery and became a leader in the abolitionist movement, was often critical of Lincoln. Yet when they finally met in 1863, Douglass said that Lincoln treated him as an equal, “just as you have seen one gentleman receives another.” James Oakes has written that even Frederick Douglass had come to realize the “skill, even the genius it had taken for a politician like Lincoln to maneuver the northern electorate” to accept emancipation and a more enlightened view on race relations. When evaluating Lincoln as a moral leader, one of his greatest attributes was his ability to change as new circumstances and information were presented to him.

 

There is one other element of moral leadership that is required in a political environment: in order to be successful, a president must be a politician. Jimmy Carter was certainly moral, yet he was a failure as president due in part to his lack of political skills. As Sean Wilentz writes in his book The Politicians and the Egalitarians, Lincoln “achieved historical greatness in his later years because of, and not despite his political skills.” Part of the political skill of a Lincoln or Franklin Roosevelt was an understanding of timing, of when the public and historical conditions warranted action. During the early years of the Civil War, Lincoln was criticized for placing the preservation of the Union above the elimination of slavery. In a letter responding to Horace Greeley in the summer of 1862, Lincoln admitted that his primary goal was “to save the Union” and that if he could do that “without freeing any slave” he would do it, and if he could save the Union “by freeing all the slaves” he would do that too. What Lincoln did not reveal, because the timing was not quite right, was that he had been working on a draft of the Emancipation Proclamation as he wrote his response to Greeley.

 

The balancing act we see in Lincoln should remind all of us that we are not looking for perfection in our next political leader, a quality which is simply impossible to find even in a moral leader. Let us try to evaluate the candidates on a moral scale, always looking for a person who can change and grow based on experience. While a president cannot make us a moral people, I still believe that the great majority of Americans are moral and yearn for leadership that elicits our finest instincts. The candidate who can appeal to “the better angels of our nature,” as Lincoln framed it, should be our next president. 

What I’m Reading: An Interview With Historian Nikki Taylor

Dr. Nikki M. Taylor is a Professor of U.S. History and Chair of the Department of History at Howard University. She earned her PhD in U.S. History (and a certificate in Women’s Studies) from Duke University. Her research focus is 19th Century History, with a special focus on the History of Black Freedom in the Age of Slavery, Women’s History, and Urban History. Dr. Taylor has authored three monographs, including Frontiers of Freedom: Cincinnati’s Black Community, 1802-1868 and Driven Toward Madness: The Fugitive Slave Margaret Garner and Tragedy on the Ohio (2016). Her current project is about enslaved women who waged armed resistance to slavery. She has been awarded several prestigious fellowships and grants throughout her career, including a Fulbright to Ghana and a Woodrow Wilson Career Enhancement Grant. In 2017, she successfully secured a half-million-dollar institutional grant to establish the prestigious Mellon Mays Undergraduate Fellowship program at Howard University—the first HBCU with its own program.

 

What books are you reading now? 

 

1. Barracoon by Zora Neale Hurston
2. Chocolate City: A History of Race and Democracy in the Nation’s Capital by Chris Myers Asch and Derek Musgrove.

 

What is your favorite history book?

 

There is a River by Vincent Harding and any book by Darlene Clark Hine. 

 

Why did you choose history as your career? 

 

I chose history because I was inspired by a historian of African history in my first college History course at UPENN. He made history come alive. As an African American, the history of Africa raised my consciousness and shaped my identity. 

 

What qualities do you need to be a historian? 

 

I will focus on personal qualities, rather than skills. I think one must be committed to finding and amplifying the voices of the voiceless, disempowered, and marginalized. To do so, and do it well, one must value and respect those groups of people. Secondly, I believe good historians also must be able to see and write about the bad in good people and the good in bad people. In short, we must be able to see and write about the humanity in both our heroes and villains. All three of my books have taught me that lesson again and again.  

 

Who was your favorite history teacher? 

 

Dr. Lee Cassanelli at UPENN and Peter Wood and Syd Nathans at Duke. 

 

What is your most memorable or rewarding teaching experience?

 

I am always deeply rewarded by getting emails from students from years past who tell me how much my classes transformed their minds and changed their lives. I have been most satisfied by teaching at Historically Black Universities and the University of Cincinnati. I like these institutions because I see myself in their students. 

 

What are your hopes for history as a discipline? 

 

Although there is much to be gained by doing history the old-fashioned way, by digging in boxes at the archives and building relationships with the keepers of the records (archivists and librarians), I do believe in universal, unlimited access to those records. Hence, digitization becomes the great equalizer. So, my dream is to have historical records be fully digitized and accessible to all.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I own just a few rare books—and also some autographed copies of older books. I am not a collector of anything, except black angels. I like them because they defy the typical stereotypes of black women (as Jezebels, or immoral). They also make me feel protected by a higher power. I have them everywhere in my home. 

 

What have you found most rewarding and most frustrating about your career? 

 

I am most frustrated by what seems (to me) to be widespread miseducation about history. Society has been so misled and miseducated—mostly by the internet—that it makes it difficult for trained historians to teach…or re-teach. I am equally frustrated when students believe they already know American or African American history so they do not need to take it in college. In my Intro courses, I do a pre-test that asks them when the Civil War ended, or who was the first black woman to run for president on one of the 2 major parties, or to name 5 figures of the Civil Rights Movement besides MLK or Rosa Parks. Only after taking and failing that pre-test do they realize they really do not know U.S. history after all. Historians have written thousands of history books, and yet society largely remains miseducated about history. Another frustrating part about my career has been how few regular people outside of the academy actually read the books that I have sweat blood to research and write. The most rewarding part of my career has been and will always be teaching. I also feel an incredible high the day my bound book first arrives in the mail.   

 

How has the study of history changed in the course of your career?

 

I went from seeing hundreds of History majors in my departments; now, the typical number of history majors is less than 100. This decline saddens and troubles me because of what it reflects about society and our values. The discipline has become far more diverse over the years, especially as it relates to women and African Americans. That is a very welcome change. However, white men remain the guardians of certain types of history, namely Presidential history, History of the Civil War, and even the history of Slavery. 

 

Digitization of records and the shrinking of academic presses have brought the sharpest changes to how we conduct research and publish. I remain shocked by how many historians have secured agents and now publish with popular presses. I am a relic in that all 3 of my books have been through academic presses. If I want a wider audience, I understand what must happen. 

 

What is your favorite history-related saying? Have you come up with your own?

 

“Until lions have their own historians, the tale of the hunt will always glorify the hunter.” An African proverb (also repeated by Chinua Achebe). This one resonates deeply to my core. 

 

What are you doing next?

 

I am working on a monograph about enslaved women who used violence to resist slavery. 

UPDATED: What Historians Are Saying About Trump's National Emergency and Press Conference

Is the Green New Deal Impossible?

 

If you are sneering at advocates of the Green New Deal as impractical dreamers, misguided fanatics or headline-seeking demagogues, here is a lesson from our nation’s past that should make you think again. Not only are these agitators deadly serious, but they have substantial historical precedent to back them up.

 

Their demand that we abruptly abolish our carbon-driven economy in the space of a single decade and redesign it around renewable energy sounds wildly impractical at first blush. But a critical examination of our nation’s past strongly suggests that they have it within their reach to dramatically alter the way we choose to live as the likelihood of environmental catastrophe grows ever larger.

 

The history lesson that backs this bold assertion comes from the early 1830s when an unruly handful of people very much like the Green New Dealers, the fearlessly iconoclastic abolitionists, suddenly issued an extraordinary challenge to the political status quo. They demanded that the three or so million African Americans living in the South be permanently liberated, not sooner or later, but this instant. Their slogan was “Immediate Emancipation.” Not a dime should go in compensation to their former owners, whose collective investment in their “chattel property” amounted to the economy’s second largest capital asset. Only the aggregate value of the nation’s total holdings in real estate exceeded the value of their investments in the enslaved. No matter, insisted these abolitionists. The moral imperative to liberate the grievously oppressed far overbalanced any selfish concerns about profit and loss.

 

The mission the abolitionists set for themselves was to persuade their fellow citizens to embrace this truth and act on it. They called this technique “moral suasion.” Most of their fellow citizens initially dismissed them as hallucinatory, but their three decades of ceaseless agitation ultimately persuaded many to take stands of their own against slavery. Scholars today hold them in high regard as powerful advocates for racial equality and resourceful defenders of human rights.

 

Let us now whisk ourselves back into the present where awaiting us are the aforementioned Green New Dealers, every bit as much the unruly iconoclasts as those old-time abolitionists. Like the abolitionists they demand profoundly drastic changes that are, it seems, patently impossible: “convert 100% of the power demand in the United States” to “clean, renewable and zero-emission energy sources.” Upgrade “all existing buildings” to meet energy-efficiency requirements. Expand high-speed rail everywhere so as to eliminate air travel. All within the next ten years.

 

The moral obligation to prevent the catastrophe of unchecked global warming, they argue, overwhelms all concerns about economic consequences. The task these Green New Dealers set for themselves is to mobilize voters all over the country to demand enactment of their unprecedented legislative agenda. How? By appealing to them to open and change their minds. Their tactic is “moral suasion,” much like the abolitionists’, and so is the terrifying moral urgency of their cause.

 

So to summarize: Back in the day the abolitionists demanded the overthrow of “King Cotton” by liberating its labor force. Today Green New Dealers seek to overthrow “King Carbon” by transforming our energy resources, a goal at least as disruptive as immediate slave emancipation was. For precisely this reason, it is impossible to decide which group of insurgents faces (or faced) the more powerful opponents.

 

Staring down the abolitionists was the South’s all-powerful planter class, the wealthiest, most self-aware, best-organized and most politically potent interest group in all of pre-Civil War America (five of our first seven Presidents were notable slaveholders). For the Green New Dealers the equivalent is the formidable network of agencies and agents that promote, protect and consume “big energy.” To imagine the enormity of it all, think at once about the Koch Brothers, Exxon Mobil, and the fact that the United States is the largest oil-consuming nation in the world (18.9 million barrels per day). King Cotton then, King Carbon now, each fully arrayed against their most extreme ideological adversaries. Giving their full support to both juggernauts, most tellingly, was/is the heavy weight of public opinion.

 

During the 1830s the abolitionists’ opening campaign for immediate emancipation forcibly confirmed this brutal truth. Since the overwhelming majority of white Americans had convinced themselves long ago that black skin confirmed innate inferiority, the prospect of three million emancipated African Americans “let loose” in civil society drove them to extreme actions. They broke up the abolitionists’ meetings, attacked free black communities, intimidated abolitionist newspaper editors and voted en masse for politicians who supported legislation to suppress the abolitionists’ freedom of expression. When the riots subsided, most white Northerners continued to express hostility to abolitionism with sullen apathy. Below the Mason-Dixon line, any suspicion of abolitionist activity prompted brutal retribution.

 

Compared to the abolitionists, the prospects for success for Green New Dealers seem at this moment just as unpromising. Despite ongoing protests and lawsuits, oil and gas pipelines proliferate alongside rapidly multiplying fracking operations and drilling leases. Drivers spurn high-mileage small sedans for Ford F-150s, SUV crossovers, and Cadillac Escalades. Delta Airlines and enforceable air quality standards occupy alternate universes. Federal subsidies supporting solar technology are dwindling fast…and so forth. Overwhelmed by the enormity of the environmental crisis, many Americans simply despair. And the brute fact is that any hint of separating workaday Americans from their internal combustion transport would surely incite in-the-streets rebellion. Who faces (or faced) the more daunting prospects, abolitionists or Green New Dealers? It’s a toss-up.

 

So how can one claim that the Green New Dealers actually have history on their side? To understand why this is so, let us consider further the history of those old-time abolitionists. What do we learn when evaluating their war against slavery that foretells an empowered future for the Green New Dealers?

 

We learn, above all, that for all their deep moral insight, the abolitionists’ demand for immediate emancipation was all aspiration, no plan. It contained not a shred of down-to-earth politically engaged public policy. The problem of slavery was so all-encompassing that abolitionists found it impossible to undermine incrementally. The only options left to them were condemning, prophesying and resisting. As a result, during three long decades of struggling to change people’s minds, abolitionists watched helplessly while slaveholders doubled their portion of the national domain. Profits wrung from enslaved labor doubled, then doubled again. The enslaved population shot up by a third, from three million to four million. In response to “moral suasion,” slavery waxed fat and white supremacy endured.

 

Under these conditions, as historians Eric Foner, James Huston and James Oakes have all emphasized, slavery’s path to extinction was plotted not by the abolitionists, but by mobilized northern voters whose militant response to the continued expansion of slavery was to demand the construction of an “antislavery bulwark” to stop the plantation owners’ march westward, and a new Republican party pledged to maintain such a barrier. Theirs was a forceful expression of “Not in my back yard,” not a bold initiative to unshackle King Cotton’s labor force. Employing “moral suasion” in the hope of changing minds and moral outlooks in favor of African Americans had utterly failed. The highest aspiration of all, racial equality, moved ever further away as the abolitionists clung to the margins of the politics leading to the Civil War.

 

Attentive Green New Dealers reading this essay already know why this plunge into abolitionism’s history offers them such powerful reassurance. They’ve managed this feat by mapping the stark differences they discern between their movement and that of the abolitionists, not the similarities highlighted thus far. Each of these contrasts identifies one of the Green New Dealers’ significant strengths by setting it against a principal characteristic of the abolitionists. It’s a straightforward exercise and it makes it perfectly clear that history, applied comparatively, is on the side of the Green New Dealers.

 

--Abolitionists: Vaulting Aspirations, No Policies / Green New Dealers: Vaulting Aspirations Backed With Specific Policies

--Abolitionists: Highest Premium is Equalizing Races / Green New Dealers: Highest Premium is Securing People’s Future Irrespective of Race

--Abolitionists: Marginalized Politically / Green New Dealers: Ever More Highly Organized Politically

--Abolitionists: Strenuous Moralists / Green New Dealers: Strenuous Moralists and Strenuous Advocates of Empirical Science

--Abolitionists: Driven By Abstract Beliefs, Not By Pressing Deadlines / Green New Dealers: Driven Entirely By Scientifically Predicted Disaster Dates... and so forth.

 

In the end, as both this essay and this specific listing demonstrate, the Green New Deal exhibits all the vital strengths of the abolitionists, one of our history’s most dynamic, disruptive, egalitarian and ethically grounded social movements. At the same time it is burdened by none of abolitionism’s limitations even as it makes itself into a powerful hybrid of grassroots organization, political pressure group, guardian of the public good and legislative insurgency. In this particular historian’s best estimate, the Green New Dealers are the equivalent of the abolitionists reincarnated, but also so very much more than that, and are poised to exert an enormous impact. “All things are ready, if our mind be so.” – William Shakespeare, Henry V.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171258 https://historynewsnetwork.org/article/171258 0
What a Japanese-American soldier’s thirty-year secret can teach us about race, war, and loyalty

Masao Abe with his Congressional Gold Medal. Photo Credit: Kathie Abe

 

The Thirty-Year Secret

 

Sometimes, stories of heroism reveal themselves in the most unusual and humble ways. That was the case with Masao Abe, a second-generation Japanese-American, or Nisei. Masao served in World War II as an interpreter with the Military Intelligence Service (MIS), an operation credited with shortening the war in the South Pacific by two years. Masao recalled memories from World War II with clarity, as though events had taken place the week before. At the age of 91, Masao began to share his story, and he continued to do so until he passed away in 2013 at the age of 96.

 

Revealing his WWII experience wasn’t something Masao could have done when he was in his thirties, forties, or even fifties. The MIS operation was considered classified, so much so that discussion pertaining to the MIS was forbidden for years. In fact, these restrictions weren’t lifted until the 1970s. While other soldiers, even the Nisei soldiers who formed the 442nd, could speak about their part in American campaigns, MIS soldiers had to remain completely silent about their military achievements. Theirs was a sensitive operation that was kept secret from the day it was activated, a month before Pearl Harbor was attacked. And a handful of MIS soldiers, like Masao, had quite significant stories to share.

 

The Military Intelligence Service

 

The first class of the MIS began in November 1941. The MIS Language School, where soldiers were trained in interpretation and interrogation, was initially housed at the Presidio under a wing of the Fourth Army. The Presidio would become the Western Defense Command hub, with Lieutenant General John DeWitt at the helm, when the US declared war on Japan. When DeWitt ordered the evacuation of all persons of Japanese ancestry from the West Coast, that included the MIS soldiers on the Presidio, even though they were US Army and undergoing training on that very base. The MIS relocated to Minnesota in the spring of 1942. Approximately 6,000 MIS soldiers would be trained; most were sent to communications centers where they intercepted and interpreted Japanese radio chatter and translated documents and books.

 

Within the highly sensitive MIS operation was a deeper level of military strategy. A small fraction, as few as 250 of the 6,000 MIS soldiers, were embedded in infantry divisions and served directly in combat in the South Pacific. This was the case with Staff Sergeant Masao Abe who was attached to the 81st Infantry Division.

 

Considering how much hatred there was toward the Japanese during that time, to be Japanese-American and embedded in a 25,000-soldier, mostly Caucasian infantry division was a dangerous mission at the outset. So dangerous, in fact, that Masao had three bodyguards attached to him at all times. These bodyguards were assigned to protect Masao from Japanese forces, but they were also there to protect him from Allied Forces, including American GIs and Marines.

 

The mission of the MIS, once on the ground in the South Pacific, was cave flushing. Japanese forces had occupied island chains in the South Pacific prior to the war and had carved out an elaborate labyrinth of tunnels and caves throughout the hills, making these islands, such as the Palaus, nearly impenetrable. After the US Navy pummeled enemy strongholds with shell fire, ground troops moved in. Masao, attached to the 321st Regimental Combat Team, along with one other MIS soldier, was in the first wave of soldiers who landed on Angaur.

 

Masao was assigned to various battalions as they moved through the islands of Angaur and Peleliu to secure caves, humanely interrogate captured Japanese soldiers, gather documents to be translated, and determine military strategy such as ground tactics and the location of Japanese Imperial base camps as well as supply lines.

Amphibious assault on Angaur in the Palau Islands

(Paul J. Mueller Collection, U.S. Army Heritage and Education Center, Carlisle, PA)

 

Masao, an American citizen who had been raised for part of his childhood in Japan, was fluent in speaking, reading, and writing Japanese, and he was utilized to the fullest on the battle lines. Masao’s medals speak to his heroics during the war. Among the many medals he earned were an Army Combat Infantryman’s Badge, a Purple Heart Medal, three Bronze Star Medals, three Bronze Battle Stars, one Bronze Arrowhead, an Army Good Conduct Medal, the American Defense Service Medal, an Army Commendation Medal, an American Campaign Medal, an Asiatic-Pacific Campaign Medal, and many more.

 

There was no medal, however, for serving the US in isolation. Masao, and the other MIS soldiers who were on the ground in the South Pacific, served in a unique role. They were soldiers of Japanese descent in the midst of battle with Japanese Imperial forces and surrounded by Allied soldiers, many of whom harbored hatred toward any person of Japanese ancestry.

The ten-man Interrogation/Interpretation Team attached to the 81st Infantry Division

Courtesy of the Abe Family

 

As Masao recalled and shared his action-filled and sometimes poignant stories from World War II, I was reminded of why his is called the Greatest Generation. It wasn’t just that he was willing to sacrifice his life for his country, serving with integrity as part of a sensitive and perilous operation. It was that he, like thousands of other Nisei soldiers, served his country with honor while his extended family was interned, even imprisoned, back in the US.

 

And he, like thousands of others, couldn’t share what he’d endured. For thirty years.

 

“Never Again” is a phrase used among Japanese-Americans so that their history, the indignities they and their families once endured, is never repeated. That phrase has recently changed to “Never Again Is Now.” And the brave MIS soldiers, who waited so many years to share their memories, have an important message: it’s time we remember our history.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171022 https://historynewsnetwork.org/article/171022 0
The Worldwide Problem of Holocaust Ignorance – and The Barriers to Solving It

Steve Hochstadt is a writer and an emeritus professor of history at Illinois College.

 

I am not a Holocaust denier. Of course, the Holocaust happened. It remains one of the most important events of the 20th century, of modern history, perhaps of human history.

 

But if someone has never heard of the Holocaust, doesn’t know that it happened, then history doesn’t matter. The event is wiped out of history, not by denial, but by ignorance.

 

Between 1985 and 1995, some of the most populous states, covering nearly one-third of the US population, passed laws requiring the teaching of the Holocaust in public schools. In each case, the law specified that knowledge about the Holocaust ought to be connected to human rights issues. Prejudice and discrimination must be identified with genocide, leading to an emphasis on “the personal responsibility that each citizen bears to fight racism and hatred whenever and wherever it happens” (New Jersey) and “encouraging tolerance of diversity” (Florida). As the wording of these laws demonstrates, teaching about the Holocaust is a political act. Because encouraging diversity and fighting prejudice are politically controversial, Holocaust education is a partisan political act, and always has been.

 

Despite such laws, ignorance about the Holocaust is widespread in America, especially among young people. The millennial generation should have been exposed to Holocaust teaching in schools, especially in those states that require it. But they know little about the Holocaust. Two-thirds of millennials do not know what Auschwitz was; half cannot name one concentration camp; about 40% believe that fewer than 2 million Jews were murdered; 20% are not sure if they have ever heard of the Holocaust.

 

Ignorance about the Holocaust is a worldwide problem, even in Europe where it happened. In a recent poll, one-third of Europeans said they know little or nothing about the Holocaust.

 

There is overwhelming popular support for more teaching about the Holocaust. The same survey that showed the gaps in knowledge also found that 93% of Americans agreed that “All students should learn about the Holocaust while at school.”

 

Politicians are responding. Legislatures in Kentucky and Connecticut recently passed laws, with unanimous votes, to require teaching about the Holocaust in public schools. In 2017, the Anne Frank Center for Mutual Respect got commitments from legislators in 20 states to introduce bills to mandate Holocaust education, the beginning of its effort to get all 50 states to require Holocaust education.

 

But there are political problems for some in the implications of Holocaust history. The focus on human rights, the disastrous consequences of racial prejudice, the victimization of other minorities including gays, the hyper-nationalism of fascism and its deadly attacks on all leftists can all lead to a critical stance against typical conservative political positions, and in particular, against current policies of the Republican Party. Absorbing the moral significance of the Holocaust might well lead students to believe that monuments to Confederate white supremacy should be taken down, that denigration of immigrants is wrong, that loud claims that America is the greatest country ever sound like “Deutschland über alles”.

 

Holocaust deniers, avowed Nazis, self-proclaimed antisemites, and supporters of white supremacy appear occasionally on the fringes of the Republican Party, or even among Republican congressmen. Some Republican candidates in the recent midterm elections used antisemitic images against their Jewish opponents. David Duke, former KKK leader and former Republican legislator in Louisiana, said about Trump’s 2016 election, “This is one of the most exciting nights of my life.”

 

American conservatives sometimes use the Holocaust to spread inappropriate partisan messages. On Holocaust Remembrance Day two weeks ago, the Harris County (Texas) Republican Party posted a Facebook message with a yellow star-shaped badge and these words: “Leftism kills. In memory of the 6 million Jews lost to Nazi hatred in the name of National Socialism. We will never forget.” The Texas Republicans explained that they were connecting the name of the National Socialist Party with “leftism”, even though the extreme right-wing Nazis killed every socialist they could get their hands on.

 

The use of the Holocaust to argue against restrictions on gun ownership has a long history. Wayne LaPierre, executive vice president of the NRA, Ben Carson when he was a Republican candidate for President, and the senior Republican in the House have all claimed that Jews were killed because they had not armed themselves.

 

Some people on the left also have trouble with teaching the Holocaust. Because the Israeli government and many Jews across the world have used the Holocaust as a justification for the existence of Israel, supporters of the rights of Palestinians sometimes claim that there is too much emphasis on the Holocaust.

 

Sometimes leftists are criticized because they can be linked with other people who would like to see less attention paid to the Holocaust. For example, the two women who just became the first Muslim women elected to Congress, Rashida Tlaib and Ilhan Omar, are often accused by Republicans of being antisemitic because of their criticisms of Israeli policy. Their comments do sometimes veer towards condemnations of Jews as a group, and Omar just had to apologize for some of her tweets. But their criticisms of Israel are echoed by many Jews. I find such conservative attacks misleading, but I am one of those Jews who is critical of Israeli treatment of Palestinians.

 

Nevertheless there are some on the left who do not wish to push more Holocaust education, because more sympathy for Jews can lead to support for Israeli occupation policies and discrimination against Palestinians.

 

But the facts of the Holocaust are clear and they lead inexorably to important moral and political conclusions, which can be discomforting to ideologues of the right and left. Antisemitism has always been based on false ideologies, and it leads to discrimination and eventually murder, like all ethnic hatreds. Extreme nationalism is the twin of ethnic hatred, and leads to war. It is always important to juxtapose the authority of governments or leaders with basic moral precepts, to question authority.

 

Holocaust education is necessary. The Holocaust is one of the most significant events of our recent global past and was an important determinant of the contemporary European and Middle Eastern world. Its moral implications, lessons if you will, have universal significance. Learning about the Holocaust makes everyone uncomfortable. That is why we must keep teaching it.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/blog/154183 https://historynewsnetwork.org/blog/154183 0
Roundup Top 10!  

Why this year's Black History Month is pivotal

by Peniel Joseph

In 2019, slavery's aftermath hovers over contemporary American race relations in deep and profoundly disturbing ways.

 

The ‘Loyal Slave’ Photo That Explains the Northam Scandal

by Kevin M. Levin

The governor’s yearbook picture, like many images before it, reinforces the belief that blacks are content in their oppression.

 

 

Will Harvard continue to fail Asian Americans — or will it learn from the past?

by Renee Tajima-Peña

Harvard does not have an Asian American studies program.

 

 

When the Catholic Church’s prohibition on scandal helped women

by Sara McDougall

But why has scandal been systematically silenced in the church for so long? One answer lies in the medieval church's doctrine on scandal.

 

 

The Dark History of Anti-Gay Innuendo

by James Kirchick

The accusation that Lindsay Graham is susceptible to blackmail is historically groundless, predicated upon the same flawed assumption most people held about gays at the height of the Cold War: that they would commit treason in order to avoid being outed.

 

 

Democrats are invoking FDR in their Green New Deal. It’s historically misleading.

by Charles Lane

Politically powerful as the invocation of America’s great collective deeds under Franklin D. Roosevelt might be, however, it is historically misleading — deeply so.

 

 

The Democrats’ dilemma: two parties in one

by Niall Ferguson

In their eagerness to recruit a new generation of young voters, the Democrats have — not for the first time in their history — admitted a faction of radical ideologues into their midst.

 

 

What the Paris Peace Conference can teach us about politics today

by Anand Menon, Margaret MacMillan, Patrick Quinton-Brown

Many of the challenges that concern us today—ethnic nationalisms, building the foundations for peace and prosperity around the globe, managing and containing war, or the future of Europe—were discussed in Paris a hundred years ago.

 

 

America’s Original Identity Politics

by Sarah Churchwell

The good news for anyone feeling perturbed is that it simply isn’t true that identity politics represents the end of America or of liberal democracy.

 

 

Trump’s Trail of Fears

by Jamelle Bouie

The president, channeling his hero Andrew Jackson, continues to champion a particularly virulent form of reactionary white majoritarianism.

 

 

Eugene V. Debs and the Endurance of Socialism

by Jill Lepore

Half man, half myth, Debs turned a radical creed into a deeply American one.

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171260 https://historynewsnetwork.org/article/171260 0
Congress and the Trump Administration Are Using and Abusing History At Least Six Times a Day

Allen Mikaelian is a DC-based editor and writer who works with clients and partners in government agencies and think tanks. He received his history PhD from American University and served as editor of the American Historical Association’s magazine, Perspectives on History. His Political Uses of the Past Project (http://historychecked.com) collects and catalogs historical statements by elected and appointed officials.

 

I wasn’t prepared for the volume. When I started the Political Uses of the Past Project, based on my home-brewed code that scanned public records looking for historical references, I thought I’d find an occasional interesting appeal to the past among the political speeches and public statements of members of Congress and the current administration. I should have been better prepared. Almost all of our politicians, it turns out, are self-styled amateur historians. They operate from their vision of a future America, but they do so with an equally important vision of what the United States was in the past. This is evident in everything from Make America Great Again to the Green New Deal.

So when this project started, it tried to publish, via a blog, every single statement. This effort was quickly overwhelmed. Even after omitting references attached to anniversaries, tributes, eulogies, and casual references to particular dates, there was still too much. But this was interesting in itself, and opened up new possibilities for this still-evolving effort.

The timeline below is the Political Uses of the Past Project’s first attempt to visually display what it has collected. And it takes only a glance to see that the project has been busy. On average, the algorithms that make the project possible are capturing six references a day. When Congress is in full swing, they sniff out as many as 21. In one month, the project has captured and cataloged 180 political uses of the past. In reality there are doubtless many, many more.
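The essay does not reproduce the project’s “home-brewed code,” but a minimal sketch of the general approach it describes (scanning the plain text of public statements for likely historical references while skipping routine commemorations such as anniversaries and tributes) might look something like the Python below. The patterns, function name, and sample statement are hypothetical illustrations, not the project’s actual implementation.

# Hypothetical sketch only: the Political Uses of the Past Project's real
# scanning code is not published in this essay. This shows one plausible
# first pass -- matching public statements against simple historical
# patterns while filtering out the anniversary/tribute boilerplate the
# author says he omits.
import re

HISTORY_PATTERNS = [
    re.compile(r"\b1[0-9]{3}\b"),  # four-digit years from 1000 to 1999
    re.compile(r"\b(Founding Fathers|New Deal|Civil War|Cold War)\b", re.IGNORECASE),
    re.compile(r"\b(Lincoln|Roosevelt|Jackson|Debs)\b"),  # sample historical figures
]

SKIP_PATTERNS = [
    re.compile(r"\b(anniversary|tribute|eulogy|in memoriam)\b", re.IGNORECASE),
]

def find_historical_references(statement):
    """Return historical phrases found in one statement, or an empty list
    if the statement looks like routine commemoration."""
    if any(p.search(statement) for p in SKIP_PATTERNS):
        return []
    hits = []
    for pattern in HISTORY_PATTERNS:
        hits.extend(match.group(0) for match in pattern.finditer(statement))
    return hits

if __name__ == "__main__":
    sample = "As Franklin Roosevelt showed in 1933, bold federal action can work."
    print(find_historical_references(sample))  # expected: ['1933', 'Roosevelt']

A real pipeline would of course need richer pattern lists, deduplication, and a source of public records to scan; the point of the sketch is only that even crude keyword matching quickly surfaces far more references than one person could catalog by hand, which matches the volume the author describes.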

For more information on the data, why I’m doing this, and where the effort is going, please visit the project’s “About” page and the paragraphs underneath the timeline below. For now, the project hopes historians will consider whether they have anything to say about this abundant use of history by our elected and appointed leaders. Historians have the expertise to fill in the substantial gaps left behind by steamrolling political rhetoric. They have long been encouraging each other to engage more directly with the public. They have protested the decline of truth-telling in general, not just in history. Here, in this searchable timeline, are 180 opportunities for public engagement, sparks for discussions, and prompts for essays, blog posts, op-eds, and classroom assignments. And this represents merely one month in the nation’s politics.

The timeline will be updated frequently. More visual presentations will follow. And the project is eager for feedback and ideas about where to go next.

If the timeline below seems cramped, a larger version is available here.

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/blog/154184 https://historynewsnetwork.org/blog/154184 0
Alexander the Great and "The Republic of North Macedonia"

 

The Former Yugoslav Republic of Macedonia (FYROM) standoff is finally over. The Greek Parliament has agreed that the Former Yugoslav Republic of Macedonia – its name for the country to the northwest of Greece since the break-up of Yugoslavia in 1991 – can now be called the Republic of North Macedonia. The Greeks had been objecting to the use of the name “Macedonia.” Prefacing “Macedonia” with “North,” it turns out, makes all the difference.

 

The ultimate source of the problem – or at least the justification for the problem from the Greek perspective – has to be laid at the feet of Philip II of Macedon and, even more squarely, at those of his son Alexander the Great. If father and son hadn’t literally put Macedon on the map, modern-day Greeks wouldn’t have been able to claim copyright over the place name. But who were the Macedonians anyway? What was their ethnicity?

 

During the Greco-Persian Wars, some 2,500 years ago, an Athenian ambassador in Herodotus’ history states that Greeks are a single people because they share a common language, have common blood, and practice a common religion. We can forget about blood as an indicator of ethnicity because there’s no such thing as racial purity. Likewise, religion is a construct that is only marginally connected with race. So that just leaves language, and that raises a very thorny question: did the Macedonians speak a dialect of Greek, or a separate language, or something in between? They didn’t write literature and they haven’t left us any inscriptions in their own language, so we don’t actually know.

 

Moving on in time, to 1977 in fact, the Greek archeologist Manolis Andronikos claimed that the artefacts and paintings in what he identified as the tomb of Philip II at Vergina in northern Greece had proven beyond doubt that the ancient Macedonians were Greeks. When he died, Andronikos was awarded a state funeral, in which he was eulogized as “the shield of Greece.” The problem is that the artefacts might have been imported and the tomb painters might not have been Macedonians. So material evidence hasn’t settled the issue either, though it has incontrovertibly demonstrated that Macedon was inside the Greek cultural orbit.

 

Such complicating niceties are, of course, irrelevant when it comes to the bigger picture of using the past as propaganda. So, what’s in a name? Everything and nothing. But let’s at least praise the Greeks for burying the past.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171234 https://historynewsnetwork.org/article/171234 0
Old Concepts for New Concerns: the Railroad, the Internet, and Government Regulation

New technology ripping up and reshaping the economy and individual lives. Personal privacy sacrificed. Local businesses chased out of business by remote technology-wielding companies. Monopolies ascendant.

 

This may sound like a description of today, but it is also a portrait of America about 150 years ago, during the last great technology-driven revolution. We proceed at our own peril if we don’t understand the similarities of circumstances and solutions between today and that history.

 

Like today, the new tech companies of the late 19th century were rich and powerful. Like today, the companies took advantage of the new circumstances technology had created to make their own rules. Ultimately, however, the American people had enough and their representatives in government created rules responsive to the new reality. Those rules – replacing corporate good with common good – went on to provide the stability and security underpinning unparalleled growth and wellbeing.          

 

The common denominator we today share with 150 years ago is the emergence of a new network that upends everything it touches. For us it is the internet; a century and a half ago it was the steam railroad. We marvel at the changes imposed by the internet, but they pale in comparison to how the railroad – the first high-speed network – restructured the human experience by vanquishing distance.

 

From the beginning of time, geography had controlled destiny: lives were lived in proximity to where they began; economic activity was tied to what came from the land; the family unit was also an economic unit. The railroad destroyed these traditions. Expanding at an unprecedented pace to move people and products at unimagined velocity, the railroad enabled industrialization, pulling masses of workers from the farms into urban centers for the mass production of products that were then redistributed back out to an interconnected mass market.     

 

Operating typically as local monopolies, railroad companies were free to charge whatever the market would bear. Farmers especially found their economic well-being controlled by the railroad monopolies. After a decades-long fight for regulation of the railroads, in 1906 Congress gave the Interstate Commerce Commission (ICC) – the first federal regulatory agency – the power to determine whether rates were just and reasonable and to require railroads to haul all freight on a nondiscriminatory basis.

 

On the heels of the railroad came the first electronic network: the telegraph. If the locomotive drove the death of distance, sparks from the telegraph key were the end of time in the distribution of information. The lightning speed of telegraph messages created national-scope activities: a national media, national financial markets, national weather forecasting, and the creation of large managed-from-afar corporations.

 

The telegraph was also typically a local monopoly. Wary of a bottleneck to the free flow of information, Congress included in the Pacific Telegraph Act of 1860 a requirement that message traffic be carried without discrimination in the order it was received.

 

Preventing monopolistic bottlenecks through non-discriminatory network access on just and reasonable terms is a 150-year-old principle that remains fresh today. It is as applicable to the zeroes and ones of digital code as it was to the steam and sparks of 19th-century networks.

 

Beyond the behavior of networks, the principles developed to oversee the behavior of those who use the networks remain relevant today as well.

 

Like internet giants such as Google and Facebook, the 19th-century entrepreneurs who took advantage of the new technologies grew quickly, often at the expense of local businesses. When Gustavus Swift developed the refrigerated railroad car in 1878, for instance, his company did to local butchers what the internet companies would do to the local advertising business over a century later: provide an important new service that also had the effect of destroying a cornerstone of the local economy. For Swift it was slaughtering at scale. For Facebook and Google it was targeting information at scale.

 

As in the mid-19th century, the new networks have been harnessed to centralize economic activity in big corporations. Montgomery Ward and Richard Sears were the Amazon of the era, taking business from local merchants via mail order. Shortly after the turn of the century, Theodore Vail consolidated both telegraph and telephone networks to squeeze out local competitors. John D. Rockefeller leveraged the volume his oil business did with the railroads to crush small competitors.

 

Congress acted multiple times – the Sherman Act in 1890 and in 1914 the Clayton Act and Federal Trade Commission Act – to curb the abuses of market-destroying monopolies. The principles that reined in Rockefeller and others remain relevant to today’s digital Rockefellers.

 

The effects of today’s new technologies reprise the historical experience. The question remains, however: who will make the rules for the digital age? Thus far, the people’s representatives have largely been unwilling to follow history’s precedent to protect consumers and competition in the new environment.

 

Because the capabilities of digital technology are new, there are few rules protecting consumers and competition from its impact. Over a century ago, when the rules governing agrarian mercantilism proved insufficient for the industrial era, new policies were developed to reflect the new realities. For the last few decades we have relied on extrapolating those industrial rules to our new internet circumstances with little success. We now need specific rules for the internet era.

 

The principles underpinning the industrial-era rules remain valid today; they simply need updating to reflect the capabilities of the new technology. New guardrails are needed to protect against the natural tendency of institutions and individuals to maximize for their own welfare, rather than the welfare of consumers or the competitive market.

 

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171019 https://historynewsnetwork.org/article/171019 0
The American Tourist in Paris: A Retrospective

Screenshot from Woody Allen’s Midnight in Paris. Image courtesy of Imdb.com.

 

For many Americans, spending time in Paris is a chance to enjoy the same delights that some of the nation’s greatest writers did during the années folles of the 1920s: copious amounts of champagne and dancing to jazz at the all-night parties in the cafes and clubs of Montparnasse. Films like Woody Allen’s Midnight in Paris glamorize the idea of this ‘faux bohemian’ lifestyle in which “struggling” American writers, fleeing from the puritanical restrictions at home, were able to liberate themselves and indulge in all the pleasures Paris had to offer. This idea of a morally liberated city remains one of the many reasons Americans continue to flock to Paris on their vacations, be it for a few weeks, months, or years.

 

The presence of Americans in Paris stretches back to the days of the American Revolution, when diplomats and their families visited the fashionable salons of the Parisian nobility and haute bourgeoisie. Throughout the 19th century this trend continued with the wealthy members of American society taking in the sights, arts, and fashions of the city. The advent of easier, cheaper travel via steamship led to a broader range of upper-middle-class tourists coming to enjoy the pleasures of the city throughout the Belle Époque era at the end of the century. The haute-couture houses in the fashionable 2nd arrondissement became a hub for wealthy American women, while the decidedly less reputable maisons closes/maisons tolérées became a must-see for many young American men.

 

Paris’s reputation as a city of pleasures carried over into the First World War, with many American troops seeking and sharing information about where to spend an evening or two. After the Armistice was signed and the U.S. military no longer curtailed periods of leave, these evening gallivants through the city’s less reputable locations dramatically increased (as did the reported cases of various venereal diseases). The presence of large numbers of American soldiers enjoying their leave in Paris heightened long-present fears among French troops at the front about the infidelity of women on the home front.

 

While French troops might have resented the presence of Americans in Paris, civilians appreciated the energy (and money) the troops brought to the war-weary city. This was particularly true among those working in the luxury goods industries, as well-paid Americans looking for souvenirs to bring home helped to ease the financial burdens created by wartime restrictions. The Parisian love affair with Americans in the immediate post-war years remained strong as President Woodrow Wilson and his Fourteen Points became a symbol for peace and hope for the future of the world.

 

However, following the refusal of the U.S. Senate to ratify the Versailles Treaty and the joint security accords between France and the U.S. in 1920, relations between the two nations soured quickly. Compounding these diplomatic tensions, the importation of Americanized mass marketing, business practices, and five-and-dime stores was viewed with suspicion as it appeared to undercut traditional French values and business practices. Fears of the Americanization of French life and culture grew exponentially during the 1920s, and remain a source of concern for many French people today.

 

The importation of American culture occurred alongside a massive increase in American tourists. The 1920s saw the number of American tourists increase from 15,000 to over 400,000 annually and the number of expats living in the city jump from 8,000 to nearly 23,000 by 1923. The behavior of these American tourists and expats throughout the 1920s did nothing to ease the fears and tensions created during the First World War. The depressed value of the franc following the war made the luxuries associated with the “good life” easily attainable for the Americans living in and visiting the city, while for native Parisians it made basic household necessities difficult to afford. As Mary McAuliffe discusses in her book When Paris Sizzled: The 1920s Paris of Hemingway, Chanel, Cocteau, Cole Porter, Josephine Baker, and Their Friends, many Parisians viewed the ways in which Americans flaunted their wealth in fashion and extravagant soirées as extremely vulgar, and the avant-garde art movement they often patronized was considered by many to be a foreign intrusion and an undesirable addition to the French artistic repertoire. The difficulty of getting a table at the hottest clubs and cafes in Montparnasse due to the constant presence of Americans was a cause of annoyance among Parisians; the unpaid bills and broken or stolen furniture did nothing to endear Americans to the proprietors of the establishments.

Life for most Parisians during the 1920s was much less glamorous than that of the many American expats. Image courtesy of Luc Sante (lucsante.com)

Beyond the extravagant displays of wealth and general disregard for local property, one of the main areas of contention between the Americans and Parisians was the overtly racist attitudes Americans had towards members of the black community. Paris in the 1920s was enraptured with all things associated with African art and African-American culture (jazz, the Charleston, Josephine Baker, etc.). Those Parisians who partook in this négrophilie considered themselves to be the epitome of the modern and fashionable world. Although extremely problematic in that it sexualized and fetishized black culture (see Jennifer Anne Boittin’s Colonial Metropolis: The Urban Grounds of Anti-Imperialism and Feminism in Interwar Paris), the Parisian négrophilie provided opportunities unavailable in the United States for African-Americans, particularly for performers, with Josephine Baker being the best example of this phenomenon. White Americans, however, were often horrified at the ways Parisians openly appreciated and enjoyed the imported black culture. They went so far as to frequently, and at times violently, demand that club and restaurant owners expel black performers and patrons. Things eventually got so bad that the French Minister of Foreign Affairs had to issue a statement:

“some foreign tourists, forgetting that they are our guests and that, consequently, they should respect our customs and our laws, have recently, on numerous occasions, violently.... demanded [black persons'] expulsion in offensive terms. If this continued, sanctions will be taken.”

Sanctions were not taken. However, the onset of the Great Depression forced many Americans to pack their bags and head home as the good life, or any sort of life, in Paris was no longer affordable. The return of American troops during the Second World War and the boom period which followed the end of the war brought Americans back to Paris. These tourists once again indulged in the delights of the city, and throughout the second half of the 20th century helped to maintain Paris’s status as the home of fashion, art, and fine dining.

 

Today Americans remain one of the largest groups of international visitors to Paris. With over 2 million Americans annually descending upon the city, it is hard for locals to ignore the contributions they make to the local and national economy. Relations between Parisians and American tourists are much less tense today than they were throughout the 1920s, with American tourists presenting a constant source of amusement for Parisian inhabitants, particularly when it comes to their fashion choices (see below). Their large numbers, however, continue to contribute to the numerous tourist-centric problems facing the city. While crowded streets and metros are aggravating during weekday commutes, the inundation of short-term rentals like Airbnb has caused an already tight and expensive housing market to become even tighter and more expensive, forcing many Parisians to leave the city for good and leaving the tourist-centric arrondissements surrounding the Louvre in particular bereft of taxpayers and children to fill classrooms. Although no longer breaking furniture, running out on checks, and throwing racist fits about the evening’s entertainment, Americans remain an integral part of Paris’s continued debates over the benefits and detriments of being one of the world’s largest tourist destinations.

The signature “white socks with shorts” look of many American tourists is a constant source of amusement for Parisians. Image courtesy of The Telegraph.

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171179 https://historynewsnetwork.org/article/171179 0
Combatting Crimes against History

Many of us may remember Syrian archaeologist and historian Khaled al-Asaad. A director of the Antiquities Department in the ancient town of Palmyra for decades, he was detained in 2015 by Islamic State militants at age 81. They interrogated him for over a month because they wanted to know where his collaborators had hidden saleable artifacts. When he refused to reveal their location, he was beheaded in front of a crowd at a square in Tadmur, next to Palmyra. His body was first left on the ground, then suspended from a traffic light, his severed head underneath it. A handwritten placard tied to his body contained six charges: he served as the “director of idolatry” at Palmyra, he was an apostate, he was loyal to the al-Assad regime, he represented Syria at “infidel conferences,” he visited Iran, and he communicated with the security services. His body was then taken to Palmyra and hung from one of the Roman columns, while IS supporters circulated photos of the body online.

 

Crimes in history are often well studied; crimes against history are not. The latter have in common that they destroy either the messages or their producers. Crimes against history comprise any of the following acts when committed as part of a widespread or systematic attack pursuant to or in furtherance of a state or non-state policy: the assassination and disappearance of historians; public personal attacks on historians through hate speech, defamation, and malicious prosecution; the intentional destruction of cultural heritage; and disinformation, including fake news, genocide denial, and the censorship of history. Crimes against history attack the core of historical writing – its integrity.

 

At times, censors of history resort to extreme solutions. They do not “simply” suppress individual historical messages but destroy their vehicles or even kill their producers. Censors become hecklers, killers, and avengers. In Crimes Against History, I estimate that from the dawn of time 428 historians around the globe have been killed for political reasons. More than half of them were murdered after 1945, al-Asaad being one of the more recent cases. 

 

Alternatively, heads of state and government have tried to kill historians not physically but psychologically by publicly attacking them on account of their critical attitudes, often by spreading hate messages about them, defaming them, or unjustly prosecuting them. 

 

In the same vein, “fake news” and its corollary “disinformation” are threats to historians in many countries. Less recognizable than murder or character assassination and more insidious, disinformation is censorship’s twin.

 

Yet another form of attack against history is the intentional destruction of cultural heritage. Many political systems have tried to sweep the remnants of the past away and start from the year zero. After 1945, regimes who followed this path of destruction had a communist, nationalist, or Islamist signature. The Cultural Revolution in China was initiated by Communist Party leader Mao Zedong in 1966. It was set up to “smash the Four Olds” – old ideas, old culture, old customs, and old habits. One of the results was the demolition of over 6,000 monasteries (reportedly 95 to 97 percent of the total) in Tibet. During the wars that raged over the territories of Yugoslavia in 1991–1995 and led to its breakup, all sides in the conflict destroyed archives, monuments, and sites in a deliberate effort to achieve what was variously called cultural cleansing, crimes against culture, and bibliocide. In Afghanistan, the Taliban issued an edict to destroy the world’s two largest standing Buddha statues at Bamiyan on the Silk Road. The destruction went ahead in March 2001. 

 

Although iconoclastic regimes are transient phenomena, some take a long time to disappear. Meanwhile, they can destroy the entire texture of society. In any case, they are not easily forgotten and may haunt the public imagination for generations, if not centuries. In their relentless efforts to destroy the past, they achieve some of the immortality and posthumous fame they yearned for. But they survive not in recollections of pride but in collective memories of horror. Iconoclasm is a supreme form of censorship. It is for cultural heritage what killing is for people.

 

Sometimes, however, the censors organize their own funeral. The censorship of history has chilling effects, certainly, but it also generates backlash. It possesses the curious self-defeating tendency to highlight what it wishes to hide. The historians it targets invent techniques to evade it. When they cannot speak openly, they may introduce historical analogies through which they comment on the present by speaking about the past.

 

Take two causes célèbres. In the first controversy, prominent German historian Ludwig Quidde in 1894 compared the German Kaiser Wilhelm II to Roman Emperor Caligula. He was threatened with a defamation trial and professionally ostracized. In the second, Chinese historian Wu Han had written a biography of the first Ming Emperor, Zhu Yuanzhang. Upon reading it in 1949, Mao perceived it as a negative commentary on his own rule. Although Wu rewrote the biography several times, Mao was not satisfied. After he attacked Wu in public in 1965, the latter was thrown in prison, where he died almost three years later. Or take the lesser known case of dissident historian Ali Shariati, one of the outstanding ideologues of the Iranian Revolution. In the 1970s, he lectured against the Shah at a religious center in Tehran. These lectures had tremendous nationwide success and contained stories about “the Pharaoh” and “Umayyad Caliph Yazid I”. These were code words for the Shah and this was quickly perceived by the audience. In an act of stunning incomprehension, it took the secret police SAVAK six months to realize what was going on.

 

When such analogies stick in memory, they transform into templates for conversation, criticism, and action. In a unique reflex, historians throw light on recent injustice by talking about remote injustice and by referring to freedom in times past. In so doing, they arouse historical consciousness and briefly bring relief. If successful, historical analogies are small victories against autocratic power.

     

Many historians living in repressive contexts went beyond the tool of historical analogies and expressed solidarity with their persecuted colleagues. Italian historian Federico Chabod did just this. When his teacher Gaetano Salvemini was persecuted for his anti-Fascist activities in 1925, he helped him escape across the border to France. Chabod also actively assisted his colleagues who had fallen out of grace after the 1938 race laws. During the war, he became a partisan leader, but he had to go into exile himself in 1944–1945. In the 1950s, when he acted as the president of the International Committee of Historical Sciences, he was known for his support of Eastern European historians after the crises of 1956. 

 

In many creative ways, historians have resisted the assault of power on the past tacitly or openly. They exploited the smaller or larger margins of freedom, often at great risk. Over time, a surprising amount of these acts have become examples of moral courage. There is a stubborn tradition in the historical profession of holding high the standards of scholarly integrity in the face of likely censorship. Intrinsically fragile as they are, acts of moral courage possess one powerful characteristic: they can inspire long after the facts to which they refer have disappeared. As long as they are retold, stories of commitment and integrity inspire hope and pride.

 

Those who died while exercising their right to history are testimony to the fact that the historian’s craft is vulnerable, that its integrity needs constant vigilance and protection, that the defence of its basic principles is a duty for all, always, and everywhere. Those who resisted the censorship of history and fought organized oblivion safeguarded the integrity of the historian’s craft and often inspired or comforted others who otherwise felt alone and powerless. I am writing these lines when the very idea of human rights has come under sustained attack and when some talk about a “post-human-rights world.” Historians, however, should defend the best idea humanity ever had: human rights. Why? If not for the idea itself, then at least because history as a craft can only flourish in a democratic society that respects human rights and freedoms necessary to protect its responsible exercise. How? By speaking out for these rights and freedoms. By combatting the crimes against history. As al-Asaad did before he became the victim of just such a crime.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171033 https://historynewsnetwork.org/article/171033 0
The British, with Guitars and Drums, Invade America in 1963

 

You remember them from 1963 as vividly as you remember breakfast – the Beatles, the Rolling Stones, The Dave Clark Five, Lulu, Dusty Springfield, the Kinks, the Animals – the great British music invasion of the 1960s.

Oh, and there was Herman’s Hermits, too.

The lead singer for Herman’s Hermits was the lovable, 17-year-old Peter Noone, and his story is the centerpiece of My Very Own British Invasion: A Musical Fable of Rock N’ Love, the new, loud, and brash musical that had its world premiere Sunday at the Paper Mill Playhouse in Millburn, N.J. The performers have been tearing the roof off of the staid old theater ever since they hoisted the Union Jack across the stage and exploded into their first song.

The just-splendid British Invasion is a romping, stomping play about Mick Jagger, John Lennon and, most of all, Peter Noone. It has just about every hit song from England in it, including, thank you very much, a great medley of the Beatles’ work.

Over the last fifteen years or so there have been numerous plays about musicians. Most of them fail. The ones that succeed are winners because they not only present the hit songs of the performer or group, but tell a good story. Remember The Jersey Boys? Beautiful, the Carole King play? The British Invasion has that in the love story between Noone and a girl named Pam (based on singer Marianne Faithfull) and the love triangle formed because Pam also loves Trip (based on Mick Jagger). It is at times a sweet story and at times tempestuous, as all good love stories are. The play’s director, Jerry Mitchell, also directed The Jersey Boys.

The play, a fictional story based on Noone’s life, starts in the Bag O’ Nails, a real, raucous London club populated by the British rock stars of the day. Noone, underage, gets in because he knows John Lennon, who immediately buys him drinks (two Coca-Colas). There, amid an uproar of music and the Beatles medley, he meets gorgeous Pam, an eighteen-year-old singer looking for fame and fortune, as they all are. Noone’s problem is that Pam is in love with Trip, a singer who is an explosion of both music and personality. Who to follow, the raffish Trip or quiet, solid and charming Peter?

Noone is jealous of the parade of Brit stars who go to America for tours in the mid-1960s and finally convinces his manager to send him, too. This is after Pam has left for the States on her own tour. Here Noone’s love affair with Pam is strained when he can’t find her and the shadow of Trip continually looms over both of them.

 Finally, Noone finds Pam, but so does Trip. Who does she end up with? That’s the story of the play.

The story is so successful because book writer Rick Elice has made the characters of Noone, Trip and Pam (and others) deep, rich and captivating. They are complete people with all of their triumphs and tragedies. The audience can connect with them. Elice’s invasion story is a good one, too, and he tells it carefully.

The play’s director, Mitchell, does a magnificent job of staging a musical that is not just about music. There are a lot of songs in the story but they do not overwhelm the lovers. The successes and failures of the lovers do not overwhelm the music, either.

People will probably go for the music and wind up enjoying an involved love story, too. Mitchell very delicately both merges and separates the two.

The performers are quite good. Jonny Aimes is just adorable as young Peter Noone. You so want him to wind up with Pam. He sings many of the Hermits songs, such as “Mrs. Brown, You’ve Got a Lovely Daughter” and “I’m Into Something Good.” Ironically, Noone seems to have hated his biggest hit, “I’m Henry the Eighth, I Am,” and cringes whenever people make him sing it. That’s the one song for which he will always be remembered. Equally good is Conor Ryan as Trip, who bounds about the stage with endless energy and can sing up a storm. The third part of the love triangle, Pam, is played with great emotion by Erika Olson. She does some nice ballads and rocks the room when called upon. John Sanders is solid as the agent, Fallon. Daniel Stewart Sherman is The Hammer, a tough guy (anyone nicknamed the Hammer has got to be a tough guy). Travis Artz is a funky Ed Sullivan and Gemma Baird is enchanting as Peter’s mom. They are surrounded by a brilliant ensemble cast of actors/singers. The choreography by Mitchell is very good.

The theater rocks all night. Some of the songs are “For Your Love,” “She Loves You,” “Go Now,” “I Want to Hold Your Hand,” “Don’t Let the Sun Catch You Crying” and “House of the Rising Sun.” Not enough for you? How about “Let’s Spend the Night Together,” “Time of the Season,” “There’s a Kind of Hush (All Over the World),” “In My Life,” “She’s Not There” and “I Only Want to be with You,” the hit by the love of my life, Dusty Springfield.

The irony of the show, of course, is that these groups did not fade away in the ‘60s. They live on today on the radio and in TV documentaries. The Rolling Stones are about to embark on yet another world tour, Paul McCartney is still a huge star and good old Peter Noone has his own show on Sirius radio.

Why do they live on? It’s because, as the song goes, rock and roll is here to stay.

PRODUCTION: The play is produced by the Paper Mill Playhouse. Set Design: David Rockwell, Costumes: Gregg Barnes, Lighting: Kenneth Posner, Sound: Andrew Keister. The play is directed and choreographed by Jerry Mitchell. It runs through March 3.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171214 https://historynewsnetwork.org/article/171214 0
Why the Congress of Racial Equality Has Been Forgotten – And Why It Still Matters Today

Marv Rich (left) and Rev. Fred Shuttlesworth (SCLC)

 

Marvin Rich of the Congress of Racial Equality (CORE) passed away on December 29, 2018, in New York City (NYC). A month later, his death has gone unnoticed by any of the local NYC or major newspapers. This is despite the fact that as CORE's second in command during the early to mid-1960s, he was one of the most significant leaders of the Civil Rights movement. He helped lead CORE at its height when many of its members were imprisoned and even killed.

 

The absence of coverage on Rich's passing speaks to how CORE has often been left out of the history of the Black freedom struggle despite its significant contributions. According to historian Robyn Spencer, CORE "has been the most understudied of the four nationwide civil rights organizations (CORE, SNCC, SCLC, NAACP) associated with the early Black Freedom movement." There are far more books and academic essays on these groups than on CORE, and CORE's history is frequently left out of academic conferences and public commemorations celebrating the Civil Rights movement.

 

Even in the history of its most well-known campaign, the 1961 Freedom Rides, CORE’s role is often diminished in popular accounts. Stanley Nelson's documentary Freedom Riders, for example, depicts the Student Nonviolent Coordinating Committee (SNCC) as the leaders and heroes of the story for stepping in and continuing the Rides when CORE’s national office stopped them following the violence in Birmingham, Alabama. Members of New Orleans CORE, however, insist they never quit and were preparing to continue the rides, a side of the story barely heard.

 

CORE's history as a Black Power organization is even more neglected. While it may not have been the first Black Power group, it was definitely the second. CORE's national director Floyd McKissick was alongside SNCC head Stokely Carmichael in Mississippi when Carmichael made his historic cry for Black Power in 1966. Like SNCC, CORE officially changed from a Civil Rights to a Black Power organization two weeks later at its national convention. It moved away from non-violent direct action and the goal of integration and instead encouraged self-determination and the use of violence in self-defense.

 

Further, CORE had supported the ideology behind Black Power long before it had a specific name. According to Harlem CORE chairman Roy Innis, Carmichael just gave a name to what many in his and other CORE chapters had already been doing. CORE's involvement in the movement, however, tends to be ignored. For example, the Schomburg Center's recent Black Power 50 book and digital exhibitions neglected to discuss or even mention CORE. This is ironic given that CORE's headquarters were located right around the corner from the Schomburg on 135th Street.

 

 

CORE's exclusion from history can be traced back to Roy Innis, who succeeded McKissick as national director. Beginning in the 1980s, his politics, and therefore those of CORE, shifted 180 degrees, to the extent that he supported the same forces that CORE once opposed, such as Rudy Giuliani, Ronald Reagan, and both George Bushes. Over time, Innis became known as the Black leader used by the right wing to counter progressive Black leaders. The fact that he was successful at it for so long is the problem. In the eyes of former members and scholars, Innis not only ruined his own reputation, he took the entire organization to hell with him. His anti-movement activism and "shenanigans" dirtied the "brand" and made it unattractive. By 1988, Innis's actions were so embarrassing (such as his fist fight on the "Morton Downey, Jr. Show"), it was damn near impossible to give him, and by extension CORE, any credit for the good he did.

 

The SNCC Legacy Project, in partnership with Duke University, has worked to create educational programs and make its archival material open to the public online. Unlike groups such as SNCC, however, CORE has no legacy project. This is the other part of the problem. There is no one "pushing the brand." History is political. It requires promotion and marketing. Under Innis, CORE neglected its own history while rewriting parts of it, adding "alternative facts" and taking out references to having been a Black Power organization. Former members of CORE have been so demoralized by what Innis did that they distanced themselves and gave up.

 

The history of CORE, once a vanguard protest organization, is significant because it challenges the historiography of the Black freedom struggle by showing how the Civil Rights movement also happened in the north. Racism and the violence that accompanied it were a national issue, not just a southern phenomenon. CORE's history also speaks to the diversity of the Black Power movement and expands the conversation beyond the Black Panthers.

 

Historian Nishani Frazier argues CORE was the only organization to launch a nationally coordinated attack against Black economic inequality. Her book Harambee City discusses how "CORE attempted to radically restructure cities by transitioning from direct action protest to community development. CORE’s programs acted to stem urban decline by democratizing capitalism." Its Target City project is just one such example of how CORE created a "radical roadmap for economic development."

 

Another aspect of the Target City project is explored in Rhonda Y. Williams' essay "The Pursuit of Audacious Power: Rebel Reformers and Neighborhood Politics in Baltimore, 1966-1968." Her work connects former CORE strategies against housing discrimination to current attempts to increase accessibility and affordability in housing.

 

My own research focuses on CORE's attempts to transform New York City. In the process, CORE created a blueprint for how to gain control of local institutions in Black areas. Its focus on community control, independent Black politics, and police brutality has much in common with BLM's action platform.

 

CORE members led the fight for community control of the city's public schools in the Ocean Hill-Brownsville section of Brooklyn, as did Harlem CORE at I.S. 201 and Queens CORE at P.S. 40. These demonstrations led to a dramatic increase in the number of Black and Latino teachers and principals as well as the inclusion of Black history in the curriculum. An independent Black school movement evolved as CORE members either founded or played central roles in creating schools such as Uhuru Sasa in Brooklyn, the Dewitt School in lower Manhattan, Central Harlem High School, and the CORE School in the Bronx. Independent Brooklyn CORE created several community-based programs with after-school tutoring and eventually over a dozen senior centers and kindergartens.

 

  

CORE members also ran for political office, often by creating independent third parties. Many grassroots Black candidates won positions ranging from community school board seats to city council and state senate seats. Congressman Major Owens' political career started when Brooklyn CORE created the Brooklyn Freedom Democratic Movement and ran him for city councilman. As part of its Target City program, Cleveland CORE played a central role in getting Carl Stokes elected as the first Black mayor of a major American city.

 

CORE's anti-police brutality campaigns often made it seem like the fire starter. Its 1964 rallies in Harlem and Bedford-Stuyvesant set off citywide riots and the start of the 'long hot summers,' the urban rebellions in northern cities across the country during the mid to late 1960s. This led to an immediate increase in the number of Black police officers, captains, and commanders. Frazier's book also details how in 1965 Cleveland CORE had its own "counter commissions and investigations into police brutality." As she writes, "Their activities became part of a larger movement to replace the political powers which selected the local police chief."

 

At minimum, CORE was a starter kit for Black radicalism. It was through CORE that many future Black Power leaders found inspiration and were influenced to start their own organizations and sub-movements. According to Jitu Weusi, a founder of the cultural nationalist organization the EAST and its Uhuru Sasa school, "most of the skills that I obtained in my organizing and organization building came out of my relationship with Brooklyn CORE."

 

CORE also developed a lasting blueprint for broader social movements. Starting with its sit-ins in 1942, it was CORE that pioneered the non-violent direct action tactics that paved the way for Civil Rights organizations like the Southern Christian Leadership Conference (SCLC) and SNCC. CORE's strategies for fighting against racial discrimination and organizing around issues are relevant because they are still being used today. CORE's tactics, techniques and strategies were adopted and modified by Feminists, Gay Rights activists, Environmentalists, the Stop Mass Incarceration movement, the Occupy Wall Street protests, and Black Lives Matter (BLM).

 

CORE also produced two of the most progressive Presidential candidates in modern times: Jesse Jackson and Senator Bernie Sanders, both of whom started their careers in activism as leaders of their college CORE chapters.

 

In the larger sense, CORE created models for how people of different ethnicities, religions, and backgrounds can live and work together in peace and harmony, thus fulfilling the promise of American democracy, something very much needed today in a country so hostile to "the other."

 

Much remains to be learned about CORE. What was happening in CORE that attracted these activists to begin with? What forces within CORE drove or produced such revolutionary actions? How can CORE's strategies be used today by activists fighting against inequality?

 

What have we missed by not studying CORE? We need to go back to CORE to have a better understanding of the ongoing antiracist movement and of how our scholarship can help build sustainable cities and communities.

We Need to Re-think Our Characterization of Trump’s Trade War, and “Mercantilism” Just Doesn’t Cut It

In the past few weeks, investors have blamed the volatile U.S. stock market on Trump's ongoing "trade war" with China. Several opinion pieces from Big Think, The Hill, The New York Times, and The Washington Post have called this policy "mercantilist." President Donald Trump, they argue, is following seventeenth- and eighteenth-century European commercial ideology. As Catherine Rampell of The Washington Post has stated, Trump's trade policies "bear an uncanny resemblance to classical mercantilism."

 

While it is tempting to use some overarching term to explain Trump's unpredictable approach toward commerce, these pieces fail to accurately assess both early modern European trade policies and Trump's own actions. Their characterization deludes readers into thinking Trump follows some coherent commercial plan. To predict the President's behavior, we have to shed our tendency to place a label on him.

 

Based on these columnists’ definition of the term, “mercantilism” was a commonly-held ideology that assumed trade was a barbaric battle among competing empires for limited resources. These empires felt they needed to achieve a favorable balance of trade to maximize their wealth in the form of imported gold and silver. To augment state power and wealth at the expense of their rivals, rulers encouraged exports and discouraged imports other than hard cash. 

 

These writers assert that this attitude experienced a complete 180-degree shift in the nineteenth century. Economists such as David Ricardo began convincing policy-makers that unbridled, un-tariffed, international trade could benefit all. Based on the principle of “comparative advantage,” various countries can specialize in producing certain goods and trade for other products. Nations can thus take advantage of cost efficiencies and become richer than they would have been under autarky. 

 

After World War II, the “Western” world enshrined its commitment to such principles in institutions such as GATT (the General Agreement on Tariffs and Trade) which then became the WTO (the World Trade Organization). These organizations attempted to reduce tariffs and other restrictions on international free trade. President Trump, however, has apparently “brought mercantilism back,” thinking that the key to the United States’ economic future is to “balance” our trade relationship with China through high tariffs on Chinese imports.

 

The first problem with characterizing Trump as a "mercantilist" is that this definition of "mercantilism" rests on an ill-defined and inaccurate assessment of early modern European commercial policy. Recent historians have eschewed the word "mercantilism," arguing that it wasn't even used in the seventeenth and eighteenth centuries. The term also falsely conveys a sense of unified and perennial goals and practices. Many politicians, writers, traders, and clerics at the time thought that trade was not a battle among rival countries for a scarce pool of wealth.

 

Take England, for example, and the famous colonist of Jamestown, Virginia, John Smith. As the historian Steve Pincus has shown, unlike his near-contemporary Sir Walter Raleigh, Smith did not believe that commerce was a vicious competition for limited resources. Smith thought that colonies could create new wealth through proper organization, labor, and commerce with other empires.

 

“Mercantilism” was not a coherent ideology based on consensus, but rather vague ideas subject to intense debate and often rejection. European empires often enacted decrees to facilitate needed imports, going against the “mercantilist” logic. Throughout the seventeenth and eighteenth centuries, Spain allowed specific Spanish American ports to receive enslaved Africans from foreign merchants. The British Navigation Acts, first instituted in the 1650s, permitted British merchants to travel to foreign colonies for trade in specified goods, encouraging some foreign imports other than gold or silver. And in the 1760s, France established various “free ports” in the Caribbean that permitted foreigners to import goods such as lumber, corn, rice, oats, and bricks into France’s American colonies. 

 

Just as “mercantilism” was a debated and incoherent assessment of early modern European commercial policy, so too does this term inaccurately summarize Trump’s vision for the United States’ foreign trade. Rather than eschewing all foreign imports and attempting to export American-made goods only for hard cash, Trump is more than willing to buy foreign oil from Saudi Arabia and Iran rather than develop American sources. And while Trump calls for increasing tariffs on foreign steel and Chinese manufactured goods, most of the Trump apparel, Trump home items, Trump merchandise, and Trump beverages have been produced overseas. And rumors are now floating that Trump might try and roll back many of the Chinese tariffs in order to appease U.S. investors at a time of increasing political vulnerability for him. 

 

Characterizing Trump as a "mercantilist" misleads the public into thinking he follows a stable, precedented, and predictable ideology on trade. His actions are as varied as European commercial policy actually was in the early modern period. In reality, the only commercial mantra Trump subscribes to is "Trumpism." Based on past experience, Trump will do whatever he thinks will energize his base and benefit his political and economic future. Of only that can we be sure.

Why Our World Seems Out of Control

In a recent Turbo Tax ad, a man is frustrated with a "Robot Child."

 

Government shutdowns.  Brexit. Weird climate happenings. Dissolved arms agreements. Parents unable to control their teens' smartphone apps. Super Bowl ads that reflect “technological dread.” It seems harder than ever to understand and manage life. What is real and what is not? Who knows?  Our inability to distinguish between reality and fake Russian Internet postings helped elect Donald Trump. In his two subsequent presidential years he has constantly complained about "fake news," but has himself fabricated more than 8,000 falsehoods. 

 

The words the poet W. B. Yeats wrote a century ago seem strangely appropriate: 

 

Things fall apart; the centre cannot hold;  

Mere anarchy is loosed upon the world.

……………………………………….......

The best lack all conviction, while the worst   

Are full of passionate intensity.

 

Also relevant are historian Daniel Boorstin's 1962 book The Image: A Guide to Pseudo-Events in America and novelist Milan Kundera's words in Immortality (1990): "For contemporary man reality is a continent visited less and less often."

 

If things are more out of control than previously, how did we get in such a mess? Main answer: Our inability to wisely manage accelerating technological changes. 

 

During the 1930s, a distinguished Dutch historian, Johan Huizinga, noted the latest scientific and technological progress, and commented that "the masses are fed with a hitherto undreamt-of quantity of knowledge of all sorts." But he added that there was "something wrong with its assimilation," and that "undigested knowledge hampers judgment and stands in the way of wisdom."

 

Later in the century, General Omar Bradley warned: “Ours is a world of nuclear giants and ethical infants.  If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner.” 

 

But wait a minute. Do we not have more control over our world than in earlier times? Some optimistic scholars like Steven Pinker, in Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, would argue that we do. And it is true that we are less at the mercy of nature, diseases like the terrible Black Death, and political and religious authority than we were in medieval days. In many ways, reason and science, as Pinker argues, have made people "healthier, richer, safer, and freer, [and] more literate, knowledgeable, and smarter." Moreover, Nazism, communism, and colonial imperialism are not the scourges they once were.

 

But there is a serious problem that Pinker fails to acknowledge. Scientific advances and the eighteenth-century Enlightenment freed humans from many restraints, including religious, political, and mental ones, but presented no unifying goal for them to seek. Since then, various movements such as communism have attempted to fill the emotional needs once offered by religions, but these secular substitutes were deeply flawed and failed. And capitalism’s goal of maximizing profits has also been an insufficient ultimate aim.

 

In his 1970s books Small Is Beautiful and A Guide for the Perplexed, E. F. Schumacher identified our central problem (see here and here for sources of his quotes). Science, he wrote, "cannot produce ideas by which we could live." It conveyed "nothing about the meaning of life."

 

In the absence of any higher goal for life, technology and economics became dominant. As Schumacher wrote, “Whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out.” Technology, he feared, “tends to develop by its own laws and principles, and these are very different from those of human nature or of living nature in general.”

Regarding economics, he stated that it dominated government policies and the “whole of ethics” and takes “precedence over all other human considerations. Now, quite clearly, this is a pathological development.” He thought that industrial society’s obsession with the constant production of more and more goods, regardless of other consequences, was the main cause of earth’s rapidly increasing environmental damage. To stimulate such growth, advertising and marketing encouraged a “frenzy of greed and . . . an orgy of envy.”

In his A Guide for the Perplexed, Schumacher laments that modern higher education had become primarily career preparation for work in our modern industrial societies and that education in general left “all the questions that really matter unanswered.” What people needed were “ideas that would make the world, and their own lives, intelligible to them. . . . [otherwise] the world must appear . . .  as a chaos, a mass of unrelated phenomena, of meaningless events.” What education should be doing was clarifying “our central convictions,” teaching us wisdom. “The exclusion of wisdom from economics, science, and technology was something which we could perhaps get away with for a little while, as long as we were relatively unsuccessful; but now that we have become very successful, the problem of spiritual and moral truth moves into the central position.” (For a contemporary statement of similar ideas, see Robert Sternberg’s “It's Not What You Know, but How You Use It: Teaching for Wisdom.”)

In the decades after Schumacher’s death in 1977, technology, especially information technology including the Internet, expanded like never before. As Alvin Toffler predicted in his 1970 book, Future Shock, it propelled more rapid social change which in turn produced “increasing malaise, mass neurosis, irrationality, and free-floating violence.” He predicted that for the remainder of the twentieth century, many people in the most advanced technological countries would “find it increasingly painful to keep up with the incessant demand for change.”  

This adjustment pain was exacerbated by globalization, a particularly striking phenomenon of the last quarter century. It enabled large corporations to sell their products around the world, but also hurt more provincial and less innovative businesses. As one business columnist described it, "Innovation replaces tradition. The present—or perhaps the future—replaces the past. Nothing matters so much as what will come next, and what will come next can only arrive if what is here now gets overturned. While this makes the system a terrific place for innovation, it makes it a difficult place to live, since most people prefer some measure of security about the future to a life lived in almost constant uncertainty."

Reaction to the dislocations and anxieties created by globalization is one of the main reasons for the rise of populist movements in Europe and the United States, including Trumpian populism, as well as Trump's "America First"-ism.

 

A January 2019 New Yorker article, “How to Escape Pseudo-Events in America,” hints at how out-of-control matters have become. To indicate how confused we are, it mentions an article called “How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually,” and it details how Russian operatives used media sites such as Facebook and Twitter to help sway the 2016 presidential election. In the same month, a “Worldwide Threat Assessment” report stated that adversaries like Russia and China are becoming even “more adept at using social media to alter how we think, behave, and decide. As we connect and integrate billions of new digital devices into our lives and business processes . . .  [they] almost certainly will gain greater insight into and access to our protected information.”  

 

In the midst of all this pessimism, a more optimistic outlook appeared toward the end of last year in psychologist and futurist Tom Lombardo's valuable Future Consciousness: The Path to Purposeful Evolution. He believes that to create a better future "we need to feel that we are approaching a positive future, and not just that we are defending against an anticipated negative one, such as some great ecological catastrophe and the collapse of human civilization." Yet he also realizes that to prudently manage technology and create such a future we need to develop "a core set of character virtues, most notably and centrally wisdom." Whether we will do so is an open question, but past decades do not give us great grounds for hope.

 

One type of wisdom is political wisdom. In an earlier essay, I wrote that such wisdom dictated seeking the common good and that, as George Washington realized, “virtue or morality is a necessary spring of popular government.” 

 

At present, the United States and the world in general seem to be spinning out of control, with climate change especially threatening our collective future. Political wisdom is rarely demonstrated, especially by our own foolish president. Even columnist Thomas Friedman, an early champion of globalization, now fears that “recent advances in the speed and scope of digitization, connectivity, big data and artificial intelligence” are coming faster than ever, but our ethical abilities to manage such changes have lagged far behind.

 

Although wisdom is sometimes demonstrated—for example, by Pope Francis on capitalist failings and climate change—it is too rarely evident. What the world needs more than ever are wise political leaders who can redirect technology to serve the common good.  In the 1930s, Franklin Roosevelt refashioned government policies to that end. Controlling and redirecting today’s technology will require even more wisdom.

 


What I'm Reading: An Interview with Urban Historian Carl Abbott

 

Carl Abbott is an American historian and urbanist, specialising in the related fields of urban history, western American history, urban planning, and science fiction, and is a frequent speaker to local community groups.

Why did you choose history as your career? 

I’ll attribute it to Scrooge McDuck. In the early and middle fifties, writer/artist Carl Barks created a series of classic Donald Duck and Scrooge McDuck comic books that often sent Donald, Scrooge, and the nephews chasing a historical myth or artifact. Along with the story, you could learn about Coronado’s search for Cibola, Vikings in Labrador, and Walter Raleigh’s expedition up the Orinoco in search of El Dorado. There was plenty of fantasy in the comics, but also tantalizing historical nuggets that led naturally to the Landmark Books, a series of history books for kids that my dad brought home from the library, and to reading real accounts of archeology. At one point I thought it would be great to be an archeologist, until I figured that they spent their time roughing it in sweltering jungles and blazing deserts and decided that reading and writing about the past in reasonably comfortable libraries might be more pleasant.  

What was your favorite historic site trip? Why?

Chaco Canyon: There is nothing else like it to remind 21st century Americans about the depth and complexity of our continental history. Tour the grand houses on the canyon floor and then climb to the rim to imagine the trade routes that radiated from what was essentially a metropolitan complex. Hadrian’s Wall and Housesteads Fort provide something of the same imaginative transport into a different past, especially when visited in proper English weather with dark skies, wind, and rain showers. For those interested, Gillian Bradshaw, Island of Ghosts, is a very well done novel that is set at the wall in the 2nd century and speaks to Britain’s multiracial past.

 

If you’d asked me when I was six years old, it would have been Castillo de San Marcos at St. Augustine (a fort! with high stone walls!! and cannon!!!).

 

If you could have dinner with any three historians (dead or alive), who would you choose and why?

 

Charles Beard, Frederick Jackson Turner, and Carl Becker, foundational voices for United States history as a comprehensive scholarly endeavor embracing social, economic, and intellectual history.  I would be eager for their opinions on our current ideas about historical epistemology and on the vastly expanded range of our understanding of past lives and peoples. If I wanted to be provocative, I might add or substitute that historical gossip monger Suetonius to see what juicy stories he didn’t dare put into his Lives, although I’d have to brush up on my high school Latin.

 

What books are you reading now?

 

I just finished service on an OAH book prize committee that received ninety submissions, so I feel extremely caught up on certain aspects of U.S. history. I recommend Beth Lew-Williams, The Chinese Must Go, Susan Sleeper-Smith, Indigenous Prosperity and American Conquest, and Julian Lim, Porous Borders.  On the nonacademic side, I have just finished Margaret Drabble, The Dark Flood Rises, a deeply humane novel about the ways that we confront aging and death (it is not depressing).

 

What is your favorite history book?

 

William H. McNeill’s The Rise of the West has been dated by decades of global history scholarship from different regions and postcolonial perspectives but it was eye-opening in the mid-1960s for someone who had just been through the Western Civ approach to history. I had the stimulating experience of taking a course on the history of the Balkans from McNeill at the University of Chicago and hearing about his approach to teaching (read a lot of books, close them, talk about what you’ve learned) and to writing (read, take minimal notes, close the books, organize your thoughts, write). I’ve only been able to follow this very challenging sequence a couple times, but I’ve really liked the results. After you revisit The Rise of the West, read Kim Stanley Robinson’s alternative history novel Years of Rice and Salt, which imagines the course of world history if 99.9% of Europeans had died in the plagues of the 1300s.

 

What is your favorite library and bookstore when looking for history books?

 

Living in Portland, I have easy access to Powell’s City of Books, a great independent bookstore that’s chock full of interesting finds on its multiple color-coded levels. When looking to order a book online, consider visiting Powells.com rather than defaulting to the first letter of the alphabet.

 

As a researcher, I’ve been privileged to work in both the Newberry Library and the Huntington Library, two treasure houses for scholars. When you walk out of the Newberry you have all the vibrancy of Chicago to enjoy; when you take a break at the Huntington you can wander its gardens. Chicago may be urbs in horto but that chunk of San Marino is bibliotheca in horto. Because I might have ended up a historical geographer if I had not attended a very small college with no geography courses, let’s highlight the Newberry’s map collection and the Geography and Map Division at the Library of Congress. 

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I am not much of a collector. The old books in our house are Quaker journals and sermons from the 18th and 19th century, passed down in my wife’s family for several generations, which support her writing and teaching on the Society of Friends. 

 

Which history museums are your favorites? Why?

 

There’s no way I could show my face around here if I didn’t single out the Oregon Historical Society, where I’ve advised on numerous projects and whose library I’ve used for 40 years. Like all state historical museums, it has been responding to the need to broaden and deepen the “white guys” narrative with very good intent and generally successful results. I serve on a citizen committee to monitor a local tax levy that supports the Society, so I do get an inside view of their efforts to do a lot without enough money. We also need to acknowledge the valuable work of county historical societies and museums all over the country, where public historians are doing their best to add richness and nuance to the old pioneer stories—so a shout-out to the Deschutes County Historical Society (Bend, OR) and Crook County Historical Society (Prineville, OR) for recently inviting me to give talks and providing the impetus for weekends in sunny central Oregon. On the big museum scene, I learn new things on each visit to the National Museum of the American Indian.

 

Which historical time period is your favorite?

 

The right question can make any period interesting, but what originally engaged me as a graduate student were the early decades of the 19th century when the territory northwest of the Ohio River was being transferred and transformed from Indian to white occupation and development. This is the area where I grew up and where my family has roots to the 1820s, and I wanted to understand how it had changed with the Miami and Erie Canal, early railroads, and industrialization that would create the technological infrastructure that produced the Wright brothers.

 

What would be your advice for history majors looking to make history as a career?

Historians should make friends with geographers and other social scientists and take courses in policy research methods. Much discussion of non-academic careers for historians focuses on closely related areas like museums, archives, and historic preservation. Historians with social science and policy research skills have a wider range of options in government and think tank jobs that should not automatically be ceded to economists.

 

Who was your favorite history teacher?

 

The entire History Department at Swarthmore College when I was there in the 1960s—Robert Bannister, James A. Field, Jr., Paul Beik, and Lawrence Lafore. Specializing in U.S., French, and British history, they had distinct personalities but all nurtured a love of critical inquiry. In addition, I have a soft spot for the Chicago historian Bessie Louise Pierce. She was long retired from the University of Chicago by my time there, but attended my dissertation defense and told me quite firmly not to let myself be pushed around by my committee (Richard Wade, Robert Fogel, Neil Harris).

 

Why is it essential to save history and libraries? 

 

It’s a truism that we are all our own historians, a point made long ago by Carl Becker and recently reaffirmed by Edward Ayers. We understand our lives and world by the stories we tell about how we—and things in general—got to be as they are. It is the job of folks who are paid to study, write, and talk about history to help people tell accurate and inclusive stories that can be the foundation for a progressive and inclusive nation. That’s as important in Britain, India, Mexico, and every other country as it is in the United States.

 

The last couple decades have been a good time for public libraries. Cities all over the world have been investing in library systems that are true community learning centers as well as book-lenders.  The list of cities that have found it worthwhile to build new, architecturally exciting downtown libraries is long. Calgary is the most recent entry, but there’s Perth in Australia, Birmingham in the UK, Guangzhou, Amsterdam, San Diego, Seattle, Salt Lake City, Denver, Minneapolis, and Chicago (which helped to kick off this new wave). It is quite exciting for someone who grew up on trips of exploration to the downtown library in Dayton, Ohio, and it holds promise for civic life. Academic historians understandably  focus on university and research libraries, but public libraries nourish our audience.

 

Do you have a new book coming out?

 

I have two shorter books in process for series aimed to introduce topics to general readers. City Planning: A Very Short Introduction is for an Oxford University Press series.  Quakerism: The Basics, which I am writing with my wife, is for a Routledge series. I am trying to build up steam for a book on the ways in which speculative fiction uses and abuses history (galactic empires modeled on Rome, alternative history, etc.) but it is a long way from daylight.

 

 

Book Review: Andrew Roberts, Churchill: Walking with Destiny (2018)

On 15 March 1938, Adolf Hitler returned to the capital of his native Austria as a conqueror. Rather than with guns, the Fuhrer, who received an ecstatic welcome from tens of thousands of rabid followers, incorporated Vienna into the Third Reich (the Anschluss) through stirring irredentist rhetoric, revanchist propaganda and appeals to German-Aryan racial unity. Emboldened by his uncontested victory, Hitler turned to Czechoslovakia, a state created after World War I (1918), and demanded territorial concessions. In an attempt to contain Nazi ambitions, British Prime Minister Neville Chamberlain and French Prime Minister Édouard Daladier travelled to Munich and acquiesced to the Nazi annexation of the largely German-populated Sudetenland. As Hitler had publicly stated his willingness to forgo further geopolitical claims, Prime Minister Chamberlain returned home one day later and confidently professed to a crowd outside 10 Downing Street, "I believe it is peace for our time." (434) While most Britons praised Chamberlain and hailed the agreement, Winston Churchill (1874-1965), an established public servant with a somewhat checkered reputation, offered a sharply dissenting view of the diplomatic course of events on 5 October:

I will begin by saying what everybody would like to ignore or forget but which must nevertheless be stated, namely, that we have sustained a total and unmitigated defeat, and that France has suffered even more than we have. (435)

In stark contrast to the members of his own party's government, Churchill had long characterized the Fuhrer as a dangerous tyrant and a threat to both the Continent and civilization. For his supreme insight and his principled cry to thwart Hitler's ambitions by force, Churchill, who became prime minister on 10 May 1940, enjoyed unprecedented popularity for his courage, resilience in the face of defeat, and skills as a wartime leader, and he achieved iconic status thereafter. Although more than one thousand books have been written on his life, the recently published Churchill: Walking with Destiny (2018) by Andrew Roberts merits consideration as the new definitive one-volume biography of its subject because it presents 1) an unequaled scholarly analysis of the symbiotic relationship between Churchill, race, and the British Empire and 2) a superlative, balanced evaluation of Churchill as statesman.

Churchill, Race and the British Empire

From the outset, Roberts adroitly situates Churchill within the transcendent tide of race-based science and sociology in the mid- and late nineteenth century. Scholars of the period debated the capacity of "lower" races to advance educationally, socially, and technologically by Western standards. As a liberal imperialist and racial progressive relative to the regnant structure of thought, Churchill believed all races capable of "improvement" and worthy of respect and fundamental rights.

In July 1900, delegates from around the world convened the first Pan-African Conference in London to boldly assert: "The problem of the twentieth-century is the problem of the colour line." For Churchill and European-born individuals of the nineteenth century, however, the "colour line" existed as a concrete reality instead of a "problem." While Voltaire (1694-1778) and Immanuel Kant (1724-1804) segmented humanity into separate and distinct races, Samuel Morton (1799-1851) and his disciple Josiah Nott (1804-1873), who earned medical degrees at the University of Edinburgh and the University of Pennsylvania respectively, altered nineteenth-century intellectual thought by successfully promoting and widely disseminating the concept of an extant racial hierarchy in the state of nature from their scientific investigations. As notions of Anglo-Saxon superiority, African inferiority and other non-European races as middling or semi-developed became socially and intellectually normative, Churchill and his generation embraced the racial-ideological rationale behind the British Empire, which "covered more than one-fifth of the earth's land surface" at the fin de siècle, to "uplift" the peoples of the world.

In Chapter Two, Roberts perfectly captures Churchill’s formative worldview with a quote from his first public address on 26 July 1897: “then shall we continue to pursue that course marked out for us by an all-wise hand and carry out our mission of bearing peace, civilization and good government to the utter-most ends of the earth.” (46) Unlike many other Englishmen and Europeans, Churchill sought to extend the Empire as a vehicle of beneficent paternalism – a view that would later lose legitimacy due to an increasing rejection of race-based concepts in the scientific community and a wave of worldwide activism after the First World War.  

From his travels through Mumbai, Bangalore and other parts of India as a Second Lieutenant in the late 1890s, Churchill lauded British-sponsored infrastructure improvements, which included the creation of roads, bridges, a railway system and a modern communications network, and the enforced prohibition of sati (or suttee), an ancient Indian ritual whereby newly widowed women took their own lives after the death of their husbands. In the summer of 1898, Churchill, who had been redeployed to eastern Africa, fought with distinction at the Battle of Omdurman, outside of Khartoum, in a bid to avenge the heinous killing of Major-General Charles Gordon and to defeat the armies of Abdullah al-Khalifa. By employing Maxim (machine) guns to full effect, the British cut down the Khalifa's forces in short order. After General Kitchener refused to treat the enemy wounded, allowing the Khalifa's soldiers to suffer gruesome deaths, Churchill became privately outraged but withheld criticism out of political expediency. How did Churchill partly justify the imperial conquest of Sudan? From his prodigious research, Roberts produces a quote excised from the text of one of Churchill's books (1902):

The fact that in Mohammedan law every woman must belong to some man as his absolute property – either as a child, wife, or concubine – must delay the final extinction of slavery until the faith of Islam has ceased to be a Great Power among men.  Individual Moslems may show splendid qualities…but the influence of the religion paralyses the social development of those who follow it.(62) 

For Churchill and many (if not most) Britons, fanatical and hyper-patriarchal forms of Islam represented a form of “barbarism.” 

When Dutch Afrikaners rebelled against British rule in the Second Boer War (1899), Churchill traveled to South Africa and gained international renown by making an intrepid and improbable escape from a Pretorian jail, rejoining the fight and liberating the inmates of his former prison the following year.  In his account, Roberts notes a crucial dissimilarity between the antagonists.  Although both the English and the Afrikaners deemed black Africans inferior, the latter condoned their subjugation due to possessing more rigid racial views.  Churchill spoke for many of his countrymen by paternalistically stating that in the future: “Black is to be proclaimed as white…to be constituted his legal equal, to be armed with political rights.” (68) 

In April 1919, the decision of Brigadier General Reginald Dyer to unleash the firepower of his soldiers on an unarmed, peaceful assembly of men, women and children in Amritsar, India for refusing to disband also exposed the fissure between racial paternalists and non-paternalists in Britain.  Despite cruelly ending the lives of approximately one thousand innocent civilians, Dyer received widespread support from those unsympathetic to “inferior races.” Speaking for the government and the majority, Churchill declared: “It is an extraordinary event, a monstrous event, an event which stands in singular and sinister isolation.” (272)  After an official inquiry, Dyer was forced into an ignominious retirement.  In delivering a remarkably clear, accurate and incisive portrait of Churchill as a product of an era marked by archaic science (racial hierarchy) and high ideals (human rights), Roberts illuminates the contours of the British imperial mind.  

Churchill: Visionary and Flawed Statesman

Throughout the more than one-thousand-page tome, Roberts renders an extraordinarily judicious account of Churchill as both a prescient and, at times, a wholly misguided statesman. One-quarter of a century before Chamberlain's fateful trip to Munich (1938), Churchill began demonstrating his keen insight into the vicissitudes of Continental politics with particular respect to the rise of German militarism and questions related to the balance-of-power. Shortly after Berlin dispatched the gunboat Panther to the port of Agadir (Morocco) to challenge French influence in 1911, the young British Home Secretary interpreted the provocative act as a portentous sign of aggression. To counter the looming threat, Churchill not only predicted Germany would "violate Belgian neutrality" in order to defeat France in a future war, but he also urged Whitehall to check German power by forging closer alliances with Paris and Moscow. (154-158) When German Admiral Alfred von Tirpitz rebuffed a third offer by London to negotiate a treaty limiting the production of battleships in early 1914, Churchill, who had been appointed as First Lord of the Admiralty, urged increases in naval armaments to prepare for war. In August, the Kaiser fulfilled his prophetic warnings by ordering German armies to invade and plunder Belgium en route to Paris. (170-181) By 1924, Churchill began forecasting the possibility of a second world war as a result of the crippling sanctions imposed by the Treaty of Versailles (1920) and the concomitant rise of National Socialism (Nazism).

If the author commendably succeeds in portraying Churchill as a sage prognosticator with respect to Wilhelmine and Nazi Germany, Roberts also masterfully details his failed political appraisals.  In 1927, Churchill met Mussolini in Rome and publicly congratulated the Italian fascists for “[rendering] service to the whole world.” (325)  Six years later, Churchill called Mussolini “The greatest lawgiver among men” and woefully depicted Japan as “an ancient state, with the highest state sense of national honour and patriotism” despite its atrocity-laden invasion and occupation of Manchuria. (366)  From his faith in the paternalistic beneficence of the British Empire for “lesser races” (technological advances, parliamentary government, expansion of trade etc.), Churchill colossally misjudged nationalist aspirations on the Indian subcontinent and – out of frustration – blasted the Home Rule for India campaign as “criminally mischievous” and labelled Mohandas Gandhi as “seditious” and a “danger [to]…white people.” (342-352) How did Churchill politically survive these egregious miscalculations and gaffes?  In short, he possessed the capacity to admit error, alter his views to a degree, and temper his remarks.  

A Churchill for His Time…and Our Time

In measuring Churchill's quasi-progressive record as a Member of Parliament, his support for Irish Home Rule, his tenure as military strategist, his unyielding stance against Nazism, his sympathy for Zionism, his obstinate resistance to independence for India, his problematic relationship with Stalin, and his second term as Prime Minister (1951-55) against both the bounds of his character and the generationally accepted, tripartite prism of racial hierarchy, paternalism, and belief in the world-historical mission of the British Empire – a mission replete with coercion and violence against indigenous populations – Andrew Roberts has brilliantly reconstructed the life of a titanic figure of the twentieth century within the intellectual context of his times. As such, Churchill: Walking with Destiny (2018) constitutes a first-rate, authentic work of historical scholarship for our time.

Revisiting the Wright Brothers and the History of Human Flight

William Hazelgrove summarized the premise of his book, Wright Brothers, Wrong Story (Prometheus Books), in the title of his December 2, 2018, prepublication news release: "We've Been Celebrating Two Wright Brothers When We Should Be Celebrating Just One." By his measure, only Wilbur Wright deserves credit for inventing controlled heavier-than-air powered flight in 1903.

 

Hazelgrove’s hypothesis is emphatically wrong. Wilbur was the more articulate and cerebral of the Wright brothers, while Orville was more reflective and mechanically inclined, but essential contributions of both men made possible the December 17, 1903, flight at Kill Devil Hills that begat practical aviation.

 

Hazelgrove’s briefs for Wilbur and against Orville rely on elisions of facts that contradict his verdict. He resorts to fictional scenes, invented conversations, imaginary soliloquies, and misdirection throughout the book. He connives to prejudice readers by retelling salacious gossip about private indiscretions attributed to Orville, originally published nearly sixty years after they supposedly occurred, which, even if true, had no bearing on Orville’s part in the (much earlier) invention of the airplane.

 

Archived letters and published articles by both Wrights refute Hazelgrove's story. In a two-page letter dated January 8, 1904, on Wright Cycle Company stationery, Orville Wright gave Carl Dienstbach, the American correspondent for the German journal Illustrierte Aëronautische Mittheilungen (Illustrated Aeronautical Correspondence), a first-hand report of the December 17 events. The brothers had taken turns, Orville flying first, Wilbur last, in four flights. The test flights ended when Wilbur crashed on the fourth flight. In order to portray Wilbur as the more skillful pilot, and to diminish the importance of Orville's iconic first flight, Hazelgrove omits mention of Wilbur's crash landing.

 

So much for Hazelgrove's assertion (page 26), "Neither brother had ever put pen to paper. Neither brother had said this is the way it happened, so there was a vacuum. Orville could say this is how it happened, and no one could question him. The dead asked no questions." Hazelgrove also does not quote from the most important article the Wrights published, "The Wright Brothers Aeroplane," which the editors of The Century magazine called "the first popular account of their experiments…"

 

A great deal of that article could be excerpted to refute Hazelgrove's distortions of the brothers' story point by point, but this passage lends insight into the importance of collaboration to their eventual success:

 

As we were not in a position to undertake a long series of practical experiments to discover a propeller suitable for our machine, it seemed necessary to obtain such a thorough understanding of the theory of its reactions as would enable us to design them from calculation alone. What at first seemed a simple problem became more complex the longer we studied it. With the machine moving forward, the air flying backward, the propellers turning sidewise, and nothing standing still, it seemed impossible to find a starting-point from which to trace the simultaneous reactions. Contemplation of it was confusing. After long arguments, we often found ourselves in the ludicrous position of each having been converted to the other’s side, with no more agreement than when the discussion began.

 

It was not till several months had passed, and every phase of the problem had been thrashed over and over, that the various reactions began to untangle themselves. When once a clear understanding had been obtained, there was no difficulty in designing suitable propellers, with proper diameter, pitch, and area of blade, to meet the requirements of the flyer.

 

Hazelgrove is also wrong about the process the brothers went through to invent movable vertical rudders connected to the wing controls. No historian or biographer disputes that Wilbur’s vision, study, correspondence with aviation experts (Octave Chanute most of all), wind-tunnel experimentation, and glider flights at Kill Devil Hills between 1900 and 1902, solved the initial series of challenges, up to the need to add and test a movable vertical rudder. That invention has been credited to Orville for more than a century. Hazelgrove calls Orville’s contribution a myth (pages 146 and 147), which he says began with the 1943 publication of The Wright Brothers: A Biography Authorized by Orville Wright by Fred C. Kelly.

 

To support his argument, Hazelgrove quotes Wilbur’s deposition in a patent lawsuit, in which he didn’t mention Orville by name (page 149), and Wilbur’s 1903 presentation to the Western Society of Engineers, in which he again did not credit Orville by name (page 157).

 

On those arguments Hazelgrove rests his indictment. If Orville wasn’t responsible for the hinged rudder, he cannot be recognized as a co-inventor of the airplane. The corollary of his logic, however, is that because Orville did originate the idea, which became an essential element of controlled flight (and remains so), Orville must be recognized and honored for his indispensable contribution to the brothers’ invention.

 

In other parts of his book and in his bibliography Hazelgrove has cited the documents that verify Orville’s contribution — Orville’s 1902 diary and Wilbur’s 1915 Aeronautics journal article “The Story of Flight” — so one must draw the conclusion either that he was a careless reader or that he intentionally omitted the pertinent quotations in order to bolster his case against Orville. 

 

Hazelgrove also plays fast and loose with the details of the Wright brothers' personal lives. He wants his readers to regard Orville in later life as a lying hypocrite, not as the pitiable recluse described in Lawrence Goldstone's Birdmen: The Wright Brothers, Glenn Curtiss, and the Battle to Control the Skies. To support this contention he relies on a selection of 1990s newspaper columns written by a Dayton local historian that discuss the Wright brothers' personal secretary Mabel Beck. One of the columns quotes a source who claimed that as a young boy he had seen Orville Wright kissing Beck. Another column provides vague details of a cache of letters from Wright to Beck that might or might not have existed.

 

Excellent biographies of our aviation pioneers are widely available, most recently The Wright Brothers by David McCullough, published in 2015. My shelf always has room for authors who present fresh discoveries, perspectives, and insights, but Wright Brothers, Wrong Story fails to satisfy those criteria.

 

I wish I could end by writing something positive about Hazelgrove’s book — some stray Wright brothers episode I had not encountered previously — but I can’t. This is the proverbial “new good book,” whose new parts aren’t good and whose good parts aren’t new. Save your money for a better one.

Don’t Expect Rulers of Nuclear-Armed Nations to Accept Nuclear Disarmament―Unless They’re Pushed to Do So

Gorbachev and Reagan sign the INF Treaty in 1987.

 

At the beginning of February 2019, the two leading nuclear powers took an official step toward resumption of the nuclear arms race.  On February 1, the U.S. government, charging Russian violations of the Intermediate-Range Nuclear Forces (INF) Treaty, announced that it would pull out of the agreement and develop new intermediate-range missiles banned by it.  The following day, Russian President Vladimir Putin suspended his government’s observance of the treaty, claiming that this was done as a “symmetrical” response to the U.S. action and that Russia would develop nuclear weapons outlawed by the agreement.

 

In this fashion, the 1987 Soviet-American INF Treaty―which had eliminated thousands of destabilizing nuclear weapons, set the course for future nuclear disarmament agreements between the two nuclear superpowers, and paved the way for an end to the Cold War―was formally dispensed with.

 

Actually, the scrapping of the treaty should not have come as a surprise.  After all, the rulers of nations, especially “the great powers,” are rarely interested in limiting their access to powerful weapons of war, including nuclear weapons.  Indeed, they usually favor weapons buildups by their own nation and, thus, end up in immensely dangerous and expensive arms races with other nations.

 

Donald Trump exemplifies this embrace of nuclear weapons.  During his presidential campaign, he made the bizarre claim that the 7,000-weapon U.S. nuclear arsenal “doesn’t work,” and promised to restore it to its full glory.  Shortly after his election, Trump tweeted:  “The United States must greatly strengthen and expand its nuclear capability.”  The following day, with his customary insouciance, he remarked simply:  “Let it be an arms race.” 

 

Naturally, as president, he has been a keen supporter of a $1.7 trillion refurbishment of the entire U.S. nuclear weapons complex, including the building of new nuclear weapons.  Nor has he hesitated to brag about U.S. nuclear prowess.  In connection with his war of words with North Korean leader Kim Jong-un, Trump boasted:  “I too have a Nuclear Button, but it is a much bigger and more powerful one than his.”

 

Russian leaders, too, though not as overtly provocative, have been impatient to build new nuclear weapons.  As early as 2007, Putin complained to top-level U.S. officials that only Russia and the United States were covered by the INF Treaty; therefore, unless other nations were brought into the agreement, “it will be difficult for us to keep within the [treaty] framework.”  The following year, Sergey Ivanov, the Russian defense minister, publicly bemoaned the INF agreement, observing that intermediate-range nuclear weapons “would be quite useful for us” against China.  

 

By 2014, according to the U.S. government and arms control experts, Russia was pursuing a cruise missile program that violated the INF agreement, although Putin denied that the missile was banned by the treaty and claimed, instead, that the U.S. missile defense system was out of compliance.  And so the offending missile program continued, as did Russian programs for blood-curdling types of nuclear weapons outside the treaty’s framework.  In 2016, Putin criticized “the naïve former Russian leadership” for signing the INF Treaty in the first place.  When the U.S. government pulled out of the treaty, Putin not only quickly proclaimed Russia’s withdrawal, but announced plans for building new nuclear weapons and said that Russia would no longer initiate nuclear arms control talks with the United States.  

 

The leaders of the seven other nuclear-armed nations have displayed much the same attitude. All have recently been upgrading their nuclear arsenals, with China, India, Pakistan, and North Korea developing nuclear weapons that would be banned by the INF Treaty.  Efforts by the U.S. government, in 2008, to bring some of these nations into the treaty were rebuffed by their governments.  In the context of the recent breakdown of the INF Treaty, China’s government (which, among them, possesses the largest number of such weapons) has praised the agreement for carrying forward the nuclear disarmament process and improving international relations, but has opposed making the treaty a multilateral one―a polite way of saying that nuclear disarmament should be confined to the Americans and the Russians.

 

Characteristically, all the nuclear powers have rejected the 2017 UN treaty prohibiting nuclear weapons.

 

But the history of the INF Treaty’s emergence provides a more heartening perspective.

 

During the late 1970s and early 1980s, in response to the advent of government officials championing a nuclear weapons buildup and talking glibly of nuclear war, an immense surge of popular protest swept around the world. Antinuclear demonstrations of unprecedented size convulsed Western Europe, Asia, and North America.  Even within Communist nations, protesters defied authorities and took to the streets.  With opinion polls showing massive opposition to the deployment of new nuclear weapons and the waging of nuclear war, mainstream organizations and political parties sharply condemned the nuclear buildup and called for nuclear disarmament.

 

Consequently, hawkish government officials began to reassess their priorities.  In the fall of 1983, with some five million people busy protesting the U.S. plan to install intermediate-range nuclear weapons in Western Europe, Ronald Reagan told his secretary of state: “If things get hotter and hotter and arms control remains an issue, maybe I should . . . propose eliminating all nuclear weapons.”  Previously, to dampen antinuclear protest, Reagan and other NATO hawks had proposed the “zero option”―scrapping plans for U.S. missile deployment in Western Europe in exchange for Soviet withdrawal of INF missiles from Eastern Europe.  But Soviet leaders scorned this public relations gesture until Mikhail Gorbachev, riding the wave of popular protest, decided to call Reagan’s bluff.  As a result, recalled a top administration official, “we had to take yes for an answer.”  In 1987, amid great popular celebration, Reagan and Gorbachev signed the INF Treaty.

 

Although the rulers of nuclear-armed nations are usually eager to foster nuclear buildups, substantial public pressure can secure their acceptance of nuclear disarmament.     

 

      

What Historians Are Saying: State of the Union 2019

State of the Union: How Can We Assess Donald Trump’s First Two Years in Office?

 

On the morning of the State of the Union address, how can we assess Donald Trump’s first two years in office without being tainted by ideology?

In Trump: The First Two Years, just published by the University of Virginia Press, I try to do so from the perspective of more than two centuries of presidential history.

In the past, presidents almost always have become more effective during the first half of their first term.  Through what amounts to on-the-job training, they learn how to perform a role for which no previous position can prepare them.

At the same time, presidents typically become less influential.  Although each decision they make may gratify some people, it’s bound to disappoint others. Over time, the accumulation of decisions costs the president more in the form of disappointment than it gains in the form of gratitude.

Trump has defied these patterns.  If anything, he was even less effective in 2018 than in 2017.  Instead of learning from the experience of being president, he reverted to what he was long before he took office.  He became more Trumpian, both in his operating style and policy preferences.  

With growing but unwarranted confidence in his own ability, Trump banished advisers like Secretary of Defense James Mattis and White House Chief of Staff John Kelly who restrained his impulses.  He replaced them with new advisers who confirm him in his longstanding opinions, and trained those who remained to withhold advice or information he did not want to hear.  

The oft-quoted line offered by Trump’s more thoughtful supporters in the 2016 election was that he should be taken “seriously but not literally” when he talked about immigration, foreign trade, Russia, and other matters.  

As Trump begins his third year in office it has become necessary to take him literally.  He really believes—and is increasingly willing to act on this belief—that although we are in every way the most powerful nation on Earth, the rest of the world is taking advantage of us.

Yet for all his decline in effectiveness, Trump has not become less influential.  To be sure, his standing with the public is low, never approaching 50 percent approval.  But it’s been low from the start of his term and has not sunk much further.  Trump’s base of supporters has become, if anything, even more solid in its devotion to him.  

As for his influence in Congress, Trump has gotten little in the way of legislation—but he also asks for little.  Yet as with his voter base—indeed, because of his voter base—Trump’s support among Republican members has solidified.  Critics like Tennessee’s Senator Bob Corker simply left Capitol Hill.  “It’s becoming a cultish thing,” Corker lamented.

Losing the House of Representatives to the Democrats in the 2018 midterm elections was a blow because it increases the likelihood of impeachment.  But with congressional Republicans squarely in Trump’s corner the prospects of removal by the Senate are vanishingly small.  

That may remain true even if something like a “smoking gun” emerges implicating Trump in one or more criminal offenses.  

Other recent presidents who were either forced to resign (Richard Nixon) or impeached (Bill Clinton) had to make their case to the voters through the mainstream news media. 

Unlike them, Trump has a direct pipeline to his supporters through Twitter, Fox News, and multiple conservative outlets on the internet.  Using these channels, he has inoculated himself among his base with recurring accusations that the mainstream media is full of “fake news” and the special prosecutor is conducting a “witch hunt.”

Losing the Senate in 2018 would have been a different matter for Trump, denying him his greatest area of ongoing accomplishment: judicial appointments. These include Memphians Tommy Parker and Mark Norris of the federal district court for West Tennessee, 30 appellate court judges (a record), and two Supreme Court justices.

But instead the GOP increased its ranks in the Senate from 51 to 53, making the party less vulnerable to one or two defections on close confirmation votes.

As for the president’s prospects for reelection in 2020, the composition of the two-dozen-plus field of potential Democratic challengers may be the best thing going for him.  The Democrats’ roster skews far to the left, with all the pressure from the party’s activists pushing candidates even further in that direction.  

Center-left consensus builders like the only three Democrats to actually win a presidential election in more than half a century—Jimmy Carter, Bill Clinton, and Barack Obama—are far down the list in the emerging field of contenders.   

In all, Trump is what he was when we elected him: impulsive, braggadocious, ignorant (but far from stupid), cunning in some matters but gullible in others, open to flattery, insistent on loyalty bordering on devotion, enamored of Russian leader Vladimir Putin and other global strong men, incapable of rhetorical uplift but a brilliant verbal counter-puncher.

He’s also bored with substance but obsessed with image, tribal in tending his base of supporters, and convinced that the United States is always getting screwed, especially by its allies.  

Looking beyond 2020, the deeper question is: Will Trump’s leadership style—threatening the press, rubbing his supporters’ sores and turning Americans against each other, sacrificing American values such as liberty, equality, and the rule of law on the altar of blood-and-soil nationalism—be emulated in an ever more bitterly polarized political environment, or will it induce voters and political leaders to pull back from the brink? 

 

In other words, is Trump an aberration or does he portend a new and harsher style of presidential leadership than his 44 predecessors have offered since 1789?

 

The reassuring answer lies in that long history.  However aberrant and dangerous Trump or his successors may be, the original Constitution’s system of “separated institutions sharing powers” and the First Amendment’s protection of freedom of speech, assembly, and press remain no less resilient a safeguard of the Republic than ever.  

 

For as long as political power is sliced and diced among the three branches of the federal government, between the federal government and the states, and between all the institutions of government and a free people and free press, the constraints on how much good or harm a single person can do will remain.

 

If you enjoyed this piece from Michael Nelson, check out his latest book!

Roundup Top 10!  

The troubling history behind Ralph Northam’s blackface Klan photo

by Rhae Lynn Barnes

How blackface shaped Virginia politics and culture for more than a century.

 

Why we keep looking at the Northam photo

by Jonathon Zimmerman

Because we all know — in our hearts — that racism is America’s original sin.

 

 

Why it’s shocking to look back at med school yearbooks from decades ago

by Elizabeth Evans

They offer jaw-dropping examples of the sexism and racism that shaped professional cultures.

 

 

Eric Lott on Ralph Northam and the History of Blackface

by Isaac Chotiner

In an interview, Eric Lott discusses the subject of blackface and its historical role in American politics, culture, and racism.

 

 

How the Trump administration’s Title IX proposals threaten to undo #MeToo

by Ruth Lawlor

The changes would make campus sexual assault harder to punish, while increasing the burdens on black men.

 

When the Suffrage Movement Sold Out to White Supremacy

by Brent Staples

Historians like Glenda Gilmore, Martha Jones, Nell Irvin Painter and Rosalyn Terborg-Penn have recently revised the whitewashed depiction of the women’s rights campaign by rescuing black suffragists from anonymity.

 

 

The mistake NATO was formed to correct — and how President Trump is repeating it

by Gregory Mitrovich

Allies abandoned one another in the 1920s and 1930s. In the 1940s, they paid the price.

 

 

A New Americanism

by Jill Lepore

Why a Nation Needs a National Story

 

 

How to Make Graduate School More Humane

by David M. Perry

There's a mental-health crisis among graduate students, and it bears particularly hard on those with disabilities. Fixing it requires specific mental-health supports—and broad cultural change.

 

 

Is a history degree only accessible to students at elite universities?

by Eric Alterman

For the past decade, on American campuses, history has been declining more rapidly than any other major, even as more and more students attend college. But at Ivy League schools, the major is thriving.

 

 

Why You Should Dig Up Your Family’s History — and How to Do It

by Jaya Saxena

Learning your history is forced reckoning, asking you to consider whose stories you carry with you and which ones you want to carry forward.

 

 

Kruse and Zelizer: Watergate's lesson? If Democrats want to heal America, Trump must be held accountable

by Kevin Kruse and Julian Zelizer

Accountability is essential to the long-term health of our democracy, more important than even healing the nation’s partisan divisions.

 

 

As a historian, my instinct was to preserve Confederate monuments, but I changed my mind

by W. Marvin Dulaney

After taking a closer look at them and the historical lies that they present and perpetuate, and the reverence that they hold upon the nation's landscape, I was convinced that all of them need to come down.

 

 

Don’t Let Democrats Become the Party of War

by Trita Parsi and Stephen Wertheim

The gambit to out-hawk Trump is a dangerous one.

 

Don’t Invade Venezuela

Cuban defectors practicing parachute drops to prepare for the Bay of Pigs invasion.

 

The chaos in Venezuela has awakened the specter of U.S. military intervention. The U.S. government and several others have recognized the new president of the National Assembly, Juan Guaidó, as the South American nation’s legitimate president—a direct challenge to the rule of Nicolás Maduro. President Trump has renewed his vague threat that “all options are on the table” in dealing with Venezuela, including a U.S. invasion. Over the weekend he reiterated that an invasion was “an option.” But invasion should be off the menu.

 

First, a U.S. military intervention would be illegal according to all the conventions of the Inter-American System. It would also lose us allies in the region and elsewhere, such as Colombia, Brazil, and Argentina, and it would reinforce the hostility Washington has already whipped up in Bolivia, Ecuador, and Mexico. Cuba would benefit as a defender of its neighbors, as would the Russians and the Chinese, who have never intervened or threatened intervention. Back in July, Rex Tillerson and H. R. McMaster pushed back with these arguments, and Trump seemed placated. Now that Secretary of State Mike Pompeo and National Security Advisor John Bolton have respectively replaced these more cautious predecessors, there are no grown-ups left in the executive branch to steer the president away from certain disaster. 

 

History can be a reckless teacher in this case, offering deceivingly clear lessons. Trump has compared a hypothetical move against Venezuela to the U.S. invasions of Grenada in 1983 and Panama in 1989. In the latter, Operation Just Cause kicked out strongman Manuel Noriega and garnered the approval of most Americans and even Panamanians, and democratic (though corrupt) governments have been in place since. In the former, U.S. troops ousted an extreme leftist government and restored democracy. Righteous cakewalks, right? 

 

But the lesson gets blurrier if we look at the entire history of U.S. military interventions. For all the talk of their frequency, a takeover of Venezuela would actually be unprecedented.

 

Never has the U.S. military launched a full-scale invasion of a South American country. The targets of almost all U.S. troop landings—usually Nicaragua, Panama, Cuba, Haiti, and the Dominican Republic—have been in the Caribbean and Central America. These countries were small, their militaries tiny and poorly trained and equipped. Rarely could they put up much of a fight. When they did, such as Fidel Castro’s Cuba in 1961, it was to repel an invasion supported by U.S. resources but manned by exiles, not U.S. military personnel. The same was true for Guatemala in 1954: The invaders were mostly Guatemalans and other Central Americans supported by the CIA.

 

Washington likewise never attempted a takeover of Mexico. Yes, it took the northern half, but that was nearly two hundred years ago. And the troops, aware of the average Mexican’s hostility to them, marched on only part of the nation and left after scoring a quick victory. Decades later, U.S. marines and sailors took over Veracruz in 1914, but never left the city. The Punitive Expedition pursued Pancho Villa into northern Mexico in 1916, but never caught him and avoided a war against Mexican troops. U.S. war planners always knew that Mexico was too big to invade. 

 

Washington certainly did undermine South American leaders by supporting coup plotters—in Guyana under John Kennedy, in Brazil under Lyndon Johnson, and in Chile under Richard Nixon. But these were small-scale CIA operations or the mobilization of limited naval resources. None involved an invasion by the U.S. military. South American countries are larger, wealthier, and farther away than the usual suspects. Invading them would involve hundreds of thousands of U.S. troops, not thousands or even tens of thousands.

 

Take Venezuela—figuratively, not literally. Panama and Grenada are no Venezuela. On the eve of Operation Just Cause, Panama had a population of 2.4 million and a U.S. military base in the Canal Zone. It took only 27,000 U.S. troops to immobilize the Panama Defense Force. Grenada was even more ludicrous. Operation Urgent Fury’s 8,000 forces took a mere two days to topple the Grenadians and even the Cubans who helped them, occupying an island of 110,000 souls the size of Martha’s Vineyard. 

 

Venezuela, in contrast, has 32 million people. Its military stands at over 235,000 strong, with 1.5 million pro-government militia members waiting to mobilize. Those are Iraq circa 2003-level numbers. Speaking of Iraq, Venezuela is twice its size and has a similar population. Like Iraqis, Venezuela’s patriotic troops wouldn’t just cut and run. Many would fight. And Colombia, Venezuela’s neighbor, doesn’t seem to want to send troops of its own, though it looks like it would tolerate 5,000 U.S. troops. Do we really want an “Operation Venezuelan Freedom” that ends up costing hundreds of billions of dollars and thousands of U.S. and countless Venezuelan lives? Maduro himself likened an invasion to another Vietnam that would “stain with blood” the White House.

 

Even if the initial invasion succeeded in deposing Maduro and installing Guaidó, a foreign intervention might destabilize politics there for a generation. Bigger countries are harder to rebuild after an invasion or an occupation. They have more political factions. They have more weapons. They have more places for the opposition to hide and regroup. And they generally are more nationalistic and thus able to portray those who collaborate with outside invaders as, well, collaborators.   

 

To be sure, some outside intervention may be legitimate. In our globalized world, the international community has the right—and the institutions—to override national sovereignty in extreme cases such as that of Venezuela today. It can demand that Maduro step down, and it can make use of a host of diplomatic and economic tools to compel him. And of course, an overthrow could come from inside Venezuela, the most likely outcome. But as Venezuelans of all stripes have said, a “gringo intervention,” even one coordinated with the Venezuelan military, would invite death, destruction, and maybe even the perpetuation of Maduro’s nihilistic regime.

What Historians Are Saying: Ralph Northam, Blackface, and the KKK

Feisty Singer Nina Simone Is Back in New Play

 

We all remember Nina Simone, the extraordinarily gifted singer who was named one of the thirty greatest singers of all time by Rolling Stone magazine, championed numerous causes within the Civil Rights movement, marched with Martin Luther King and developed a highly individual performing style that rumbled from classical music to jazz to pure barroom.

The lyrical, likable Nina is back in Laiona Michelle’s new play, Little Girl Blue: The Nina Simone Musical, which opened last weekend at the George Street Playhouse in New Brunswick, N.J. It is not only a pretty good show but a vivid historical look at the life of a singer in the 1950s and ‘60s, and at the Civil Rights movement. It is a fond look back for one generation of Americans and a real eye-opening show for the younger generation who, in the Super Bowl halftime show, did not have the genius of Nina.

Nina Simone, says the actress who plays her in the show, Laiona Michelle, ran into a thick wall of racism as a teenager when, having finished Juilliard School, she was turned down for admission to the prestigious Curtis Institute of Music in Philadelphia because she was black. From there, she launched herself as a performer, striking gold right away with the single I Loves You, Porgy. She followed that with a successful first album, Little Girl Blue.

Her very original style made her a hit in Greenwich Village clubs in the same years that she joined the Civil Rights struggle. There, she had an immediate impact with a song she sings in the show, Mississippi Goddam, her scorching response to the 1963 murder of Medgar Evers and the bombing that year of a Baptist church in Birmingham, Alabama.

In the musical - the set is a nightclub - Nina charged that by 1970 she was fed up with racism in America and “blacks shot down in the streets” and moved to Europe, where she immediately carved out a nice career in the UK, France and Switzerland during the last years of her life.

Little Girl Blue is a good show, filled with much of her music. It has some minor troubles, though. First, it is very, very slow, Oh My God slow, in its first half hour. Then, when she gets married to an abusive husband, the pace quickens. Second, many of her hits, like I Loves You, Porgy, are not included in the show. Third, the events of her life are all out of order and get you mixed up. She joined the Civil Rights movement because of the Curtis Institute rejection as a teenager, but that incident does not come into the show until near the end. Huh? Fourth, she suffered from bipolar disorder, never mentioned in the show (on stage, Nina just says that she “stopped taking my medication”).

In the show, it is clear that she had mental difficulties. She had a quick temper, yelled at her musicians and stormed off the stage on occasion. But there were many incidents of violent behavior that are not mentioned, and they formed an important part of her life. She fired a gun at a record company executive whom she accused of stealing her royalties, and admitted to police that she tried to kill him but missed. She shot and hit her neighbor’s son with an air rifle in a dispute over the noise he was making. She pulled a gun on a shoe store employee to force him to refund her money for a pair of sandals. And she once ripped a phone out of a wall after her friend, singer Janis Ian, refused to lend her money.

Director Devanand Janki, who does an otherwise fine job, should have insisted that all or some of these incidents be included in the show.

The scenes of violence that are in the show are gripping, such as her husband’s verbal and physical attacks on her (he apparently beat her up several times and once punched her in the stomach when she was pregnant). These attacks, her exile from the United States and the difficulties she had getting along with people, racism, plus the problems of raising her daughter as a single mom, help to paint a portrait of her as a sad victim of bipolar disorder, the record business and, frankly, herself. The members of her band, who suffered her abuse for years, are played by Mark Fifer, Saadi Zain and Kenneth Salters.

Ms. Michelle, despite her own awkward book, is quite good as Nina, and in some scenes emotionally electric and very appealing.

Even with these difficulties, the play is pretty good. It needs to be tightened up a bit and Michelle should get rid of some songs and add others, though.

The one thing that excited me was the way that playwright Michelle drew such an engrossing portrait of entertainment history in that era and showed how it was so different for people of different racial backgrounds.

And oh, that Nina Simone could sing!

PRODUCTION: The show is produced by the George Street Playhouse. Scenic Design:  Shoko Kambara, Costumes: Ari Fulton, Lighting: Xavier Pierce, Sound: Karin Graybash. The play is directed by Devanand Janki. It runs through February 24.

Happy Belated Birthday, Jackie Robinson!

 

January 31 was Jackie Robinson’s 100th birthday. Thinking about him always brings tears to my eyes. Let me try to figure out why.

 

I was born a Brooklyn Dodgers fan. Like most kids, I didn’t have much choice about whom to root for. My parents were Dodgers fans, and so was I. I was born in Manhattan, spent two years in Queens, then moved out to the Long Island suburbs. I never asked them why they didn’t root for the great Yankees or for the Giants.

 

Was it my mother’s choice? She grew up in Queens, a girl athlete before girls were allowed to be athletes. She taught me how to throw a ball, play golf and tennis and ping-pong, and shoot pool. I don’t think she ever was allowed to be on a team, until she became captain of adult women’s tennis teams in California in the 1970s. Did she learn to root for Da Bums, just a couple of miles away at Ebbets Field, who had not won a World Series since 1890 and sometimes presaged the clownish ineptitude of the early Mets, who inherited many of their fans? She never explained her Dodger love.

 

It might have been my father, who probably had never heard of the Dodgers until he arrived in New York alone at age 18 in 1938, just escaping when the Nazis took over his home in Vienna. He grew up playing soccer and never was able to smoothly throw a baseball. Did my father hear of Jackie’s college exploits when he sold men’s hats in Los Angeles in 1939 and 1940? Jackie led the NCAA in rushing and in punt returns on the undefeated 1939 UCLA football team. He also lettered in basketball, baseball, and track, the first athlete to letter in four sports in UCLA history. He won the NCAA championship in the long jump in 1940, and would have gone to the Tokyo Olympics that year if they hadn’t been cancelled due to the outbreak of war.

 

I was born the year after Jackie played his first game for the Dodgers in 1947. By the time I understood anything about baseball, he was already a superstar with a super team: Rookie of the Year in 1947, MVP in 1949, All-Star every year from 1949 to 1954, leading the Dodgers to six World Series in his ten-year career.

 

My brother and I have puzzled over many of our parents’ ideas since they died. But I don’t think that their fondness for the Dodgers was about success. Although they never talked about it, I believe race was at the heart of their preference.

 

I doubt if my father ever met a black person growing up in Vienna. But the Nazis taught him about the evils of white supremacy as he was growing up and later when he returned to Europe with the US Army, interrogating prisoners of war and seeing concentration camps. In our home, Jackie Robinson was a moral hero, as was Branch Rickey, general manager of the Dodgers, for signing him in 1945. We were excited to watch catcher Roy Campanella, who entered the majors in 1948, and pitcher Don Newcombe, who was Rookie of the Year in 1949. When most major league teams were still all white, the Dodgers fielded three black men, all of whom played in the All-Star game in 1949.

 

I must have seen Jackie play in many games on our little black-and-white TV and listened to Vin Scully’s play-by-play on the radio. But Jackie retired when I was 8. I remember him more vividly in connection with Chock full o’ Nuts, that “heavenly coffee”:  “better coffee a millionaire’s money can’t buy.” Another enlightened white businessman, William Black, hired Robinson as vice president for personnel right after his baseball career ended. Chock full o’ Nuts was the Starbucks of the mid-20th century, with 80 stores in New York, most of whose staff were black. Robinson’s political activism for civil rights was fully supported by Black.

 

I understood little about the civil rights struggle and was too young for coffee, but I knew about Jackie’s role as business executive.

 

Jackie Robinson was a Republican. He supported Richard Nixon against John F. Kennedy in 1960 and worked for Nelson Rockefeller’s 1964 presidential campaign. My parents were Democrats and I’ve wandered further to the left since then. But partisanship mattered much less in those days than morality, and Jackie came to represent for me the high moral calling of activism for equality, not just in politics, but in life. I was born into a family that assumed that racial discrimination was immoral. It was immoral in Europe against Jews and immoral in America against blacks.

 

When I was young, I didn’t know about Jackie Robinson’s life before the Dodgers or about my father’s life before he left Vienna. I didn’t learn in school about either the Holocaust or segregation. I grew up in an antisemitic and anti-black society in the 1950s and 1960s, but barely realized it until I went to college. Even at an Ivy League university in the late 1960s, lessons about American prejudice and discrimination were not a regular part of the curriculum. Over the past 50 years, life and the study of history have taught me the facts behind my family’s moral certainties. Jackie Robinson has accompanied me along the way, as inspiration, role model, hero.

 

To see 100 photos of Jackie’s life, click HERE. If you’re like me, have a handkerchief ready.

 

Happy Birthday, Jackie.

Why Elliott Abrams is the Right Man for the Job in Venezuela

 

Under the subject heading “You can’t make this stuff up!” I got a message from a friend announcing Secretary of State Mike Pompeo’s appointment of convicted liar, Elliott Abrams, as the new special envoy to engineer the replacement of Venezuela’s elected president Nicolas Maduro.  Abrams is an apt choice for the job, as his career stretches back through some of the most sordid instances of US intervention and brutality. As Assistant Secretary of State for Human Rights and Humanitarian Affairs in the early 1980s under President Ronald Reagan and later as Assistant Secretary for Inter-American Affairs, Abrams dismissed allegations of a December 1981 massacre in El Mozote, El Salvador, testifying before a Senate committee that the hundreds of deaths "were not credible." Subsequent documentation from the Salvadoran Truth Commission revealed that 794 civilians were "deliberately and systematically" executed at the hands of an elite US-trained military battalion.  For his part, Abrams claimed that the long war in El Salvador that took the lives of thousands of civilians—including Archbishop Oscar Romero who was assassinated in broad daylight while saying Mass—was “a fabulous achievement.” 

 

Abrams was a key operative in the Iran-Contra scandal, a clandestine scheme during the Reagan Administration to supply backers of the former dictator in their quest to overthrow Nicaragua's Sandinista government.  After Congress cut off support for the Contras in response to proof of the group’s widespread human rights violations, Abrams helped out by flying to London in August 1986 where he met secretly with an emissary from the fabulously rich Sultan of Brunei to ask for $10 million to fund the Contras.  (The Sultan gave the money, but through a clerical error it was deposited in the wrong Swiss bank account.) 

 

Pardoned by President George H.W. Bush, Abrams went on to serve in the administration of George W. Bush, promoting the invasions of Iraq and Afghanistan, and undermining Iran in whatever way possible. Along with other neocon cheerleaders for regime change in the Middle East—a project that was supposed to be a “cake walk”—Abrams created a mess.

 

Undeterred by all that icing clinging to his shoes in the quagmire that has spread to engulf one country after another in America’s longest wars, Abrams is now at work scheming to depose the very vulnerable and unpopular Nicolas Maduro.  In addition to his sterling record of lying, deceit, and incompetence, Abrams brings to this job his previous experience in the George Bush administration where he is credited with giving the “green light” to taking out Maduro’s predecessor, Hugo Chavez, in 2002.  That adventure was so poorly planned that Bush had just barely recognized the new US puppet Pedro Carmona when it became embarrassingly clear that Chavez hadn’t been deposed at all and was back at the helm. 

 

In summary, everything Elliott Abrams has touched has turned out badly.  He and others like him have never seen a crisis that couldn’t be solved by throwing massive amounts of military hardware at regimes and leaders who have fallen out of favor with the US agenda.  After decades of war, the Taliban are negotiating their way back into power, with the resultant carnage reminiscent of a phrase from a Bruce Springsteen song about a friend who had died in another futile and wrongheaded war: “They’re still there and he’s all gone.”  In every country Abrams has touched, the “all gone” number in the hundreds of thousands: civilian children and parents, and American, Afghan, and Iraqi soldiers and personnel.

 

Abrams is now blithely looking toward another crisis, and given his extremely limited skill set, he’ll probably call for a military solution that will end in civil war and terrorizing death squads.  As in past conflicts in the Middle East and elsewhere, a disturbing number of centrist Democrats will fall into line with the hawkish Republicans, leaving us to ask: “If we’re using the same playbook, under the direction of the same incompetents like Abrams, how can we expect a different outcome?”

 

As liberals and conservatives fall over each other decrying the misery of the Venezuelan people, the usual re-writing of history is well underway.  Everywhere Venezuela is referred to as a prosperous, rich, stable democracy before it fell into the hands of former leader Hugo Chavez and his successor, Nicolas Maduro.  The tedious reality that undermines this picture of an oil-rich Garden of Eden is the fact that Venezuela has been a dramatically divided country since oil was discovered in 1935.  Under the military dictatorship of Juan Vicente Gómez, Royal Dutch Shell and Standard Oil were granted concessions to pump and refine the oil, while Gómez collected the money and spread it around to his military buddies.  In 1975 President Carlos Andrés Pérez nationalized Venezuela’s oil and established a state-owned enterprise, PDVSA, as a separate, parallel governing body to manage oil production.

 

Oil is the single source of Venezuelan revenue, and the entire system has always been managed through the military. Except for intermittent windows of democracy and reform, the bulk of the country’s export revenue was never spread to the impoverished majority.  Despite reasonable returns, from 1970 to 1998 per capita income fell by 35 percent, one of the sharpest and most prolonged declines in the world.  Although Venezuela today is definitely in a crisis far more severe than in years past, and one which Maduro and his party show no sign of resolving, the underlying cause is not new. Since the 1970s Caracas has been one of the most unlivable cities on the planet—rimmed by mountains of mind-bogglingly poor slums lacking rudimentary housing, sewage systems, water, and infrastructure. The city is choked by pollution, overbuilt with well-guarded high-rise apartments for an elite content to spend their oil wealth in shopping malls, restaurants, and amenities for everything but the public good.  In 1989 the city exploded in days of rioting when a newly elected president reneged on his campaign promise and struck a deal with the IMF that cut basic services to the millions of poor already living in abject poverty. Interestingly, Juan Guaidó, the legislative leader turned self-appointed president, promises to negotiate a deal with international banks and the IMF to bail out the cash-starved economy.

 

Hugo Chavez at least tried to change this dismal record.  As former US president Jimmy Carter wrote on Chavez’s death in March 2013, despite his deep disagreements with some of his policies, under Chavez “Venezuelan poverty rates were cut in half, and millions received identification documents for the first time allowing them to participate more effectively in their country's economic and political life.”  As Greg Grandin points out, Chávez submitted himself and his agenda to fourteen national votes, winning in elections that Carter certified as honest and fair, even better than most of the 92 elections the Carter Center supervised and whose results world bodies accepted.  

 

Chavez, and Maduro after him, relied on oil revenues to build schools, finance cooperatives, and institute healthcare, trading oil for Cuban doctors.  Oil went everywhere, even to the poor of Massachusetts and the Bronx.  The Venezuelan elite complained bitterly about their bold and uncouth leader, but as long as their privileges remained somewhat intact, and the military kept their share, the system limped along. But oil prices have been in a slump for nearly a decade.  Corruption permeates the system as it has in the past, but the money to grease the wheels has disappeared, along with the charismatic and energetic leader who seemed to have really cared.  Today, Venezuela is ruled by an unpopular authoritarian, who lacks the skills of leadership and the resources to paper over the deepening cracks.

 

In steps Elliott Abrams, the representative of a deeply flawed president who, according to his own intelligence services, won office with the assistance of a foreign, often hostile, power.  Abrams represents a president who repeatedly seeks to discredit the press, who lies unabashedly, who presides over a revolving door of advisors who are either under indictment or already in jail, who cares little for civility and the preservation of his own country’s natural resources, and appears to be in office in order to enrich his family business.  Despite record high levels of income inequality and unprecedented profligacy of the super rich, his administration’s greatest accomplishment is a tax reform that benefitted the top one percent and eased corporate tax rates. Ever since the Supreme Court ruled that a corporation has the same rights as a human being, the entire political process serves the interests of the highest bidder, including military contractors and arms manufacturers.  The regime has been met with disillusionment and sporadic street protests, including women-led marches of over a million people, the largest in the country’s history.  Nonetheless, a third of the populace avidly supports the president.

 

In a desperate attempt to appease his base, the president shut down the government in a fit of pique over money to build a medieval wall to keep out migrants, many of whom are held in cages and tents along the southern border.  For weeks the airwaves filled with stories of government workers lining up at soup kitchens, unable to pay their bills, worried about putting food on the table or meeting pressing healthcare needs.  The single strongest opposition in the government has come from the head of the democratically elected legislature. The legislature held firm and, for the time being, the authoritarian president has backed down.

 

I don’t know if Canada has contacted Nancy Pelosi to see if she wants to declare herself president, but it could.  

  

Obviously the situation in Venezuela is more extreme, but the comparison with the current crisis in the United States is not unfounded.  We have watched for over a month as the richest country in the world, some say the richest of all time, has pushed gainfully employed people to penury.  Obviously we are living in a deeply divided country, headed by a president with little knowledge of how most of the citizenry lives.  In the midst of this crisis, or as a result of it, the US government is playing with deposing the ineffective leader of another country, whose election is apparently marred by greater improprieties than our own.  Without a doubt, Elliott Abrams is the man for the job.

 

If you like this piece by Teresa Meade, check out her latest book.

 

Sleeping Giant: When Public Workers Awake

 

It was the radical African-American intellectual, W.E.B. Du Bois, who famously called the mass disaffection and migration of southern slaves to Union battle lines in the Civil War a “general strike.”  To be sure, Du Bois took some literary license with the concept of the general strike—as perhaps more classically exemplified in the mass walkouts in Seattle in 1919 and England in 1926—as well as the history of slave resistance during the Civil War.  But the flight of many slaves to Union lines and their willingness to take up arms against slavery likely spurred Lincoln’s Emancipation Proclamation, slowed cotton production, and ultimately augmented the Northern army with nearly 200,000 black volunteers.  While hardly a conventional labor action, it can be viewed, imaginatively, as a political strike which lay outside the logic of economic bargaining, a logic that would ultimately lead to the system of industrial relations that began to be erected in the 1930s aimed at resolving conflict short of worker strikes or employer lockouts.  We might consider that both political and economic strikes are bound by separate rules and logic, but that working people need to be prepared for each in its proper season.  It’s a lesson that also beckons from the most recent, at-least-temporarily-interrupted government shutdown.  

 

Certainly, the very laws created to address the arbitrary treatment of much of the private sector and, by the 1960s, the public sector workforce also closed the door to any actions, such as a wildcat strike, that threatened to burst the proceduralism defined in collective-bargaining contracts or government work rules.  Non-union, private-sector workers can generally be fired ‘at-will’ (or on any pretext except for civil rights infringements) by their employers, but action even by unionized and public-sector employees (now often one and the same) is also heavily legally restricted.  Federal government employees, for example, are forbidden to strike under any circumstances, and many state governments follow the same conventions: only ten states permit teacher strikes, while four states deny the most basic, internationally-sanctioned right of collective bargaining to public employees altogether.  

 

In practice, legal prohibitions against striking workers have always been tempered by concerns for public welfare and political pragmatism.   Take the victorious, albeit illegal, 2018 West Virginia teachers’ strike as an example.  Which official wants to take responsibility for unsafe classrooms, dangerous understaffing, or lack of basic materials—the causes of recent walkouts in districts across the nation?  Moreover, what good does it do to fine or fire striking teachers if the next day you have to staff their schools and answer to parents on the way to work?   On the other hand, President Reagan retained public support when he fired 11,000 striking air traffic controllers in 1981, but only because their actions appeared precipitous and their wage demands unwarranted.  Every modern-day public-sector strike, it seems, is a political strike to the degree that its resolution depends on the strength of its public support.  

 

The latest evidence on this subject, of course, emerges from actions, or non-actions, of workers who ‘called in sick’ during the government shutdown.  The number of workers refusing to join the ranks of those whom economics czar Larry Kudlow called “volunteers” working out of their “allegiance to President Trump” grew precipitously throughout the last week of the shutdown.  There is every indication that, beginning with the chaos at LaGuardia airport on Friday, January 25, the absence of emergency workers figured prominently in Trump’s temporary retreat from confrontation over the border impasse.

 

Denied access to any collective means of redress (with their elected union representatives effectively sworn to a virtual vow of silence), these protesters relied on deeper wellsprings of labor justice.  The 13th Amendment prohibiting involuntary servitude offered an irrefutable legal brace for such action.  Presenting themselves less as a unified workgroup than as individuals desperate to provide for their families, they seized the moral high ground against unfeeling politicians.  In the end, neither employment laws nor appeals for endless self-sacrifice from on high could stand in the way of citizens acting on the principles of “life, liberty, and the pursuit of happiness.”  

 

In one sense, perhaps what the government workers did was not so far removed from the actions of Du Bois’s protagonists or other civil rights heroes.  At the time when slaves struck a blow for their own freedom by fleeing slavery to seek refuge behind Union lines, it was manifestly illegal for slaves to run away.  Martin Luther King Jr. likewise led many illegal protest actions to which we now pay tribute.  What King once called the “fierce urgency of now” had begun to beckon in the shutdown crisis.  For more than a month, the president, politicians, and courts utterly failed federal and contract employees.  The workers and their families—both those forced to stay at their jobs and those on furlough—were desperate.  What alternative did they have?  Like the soldiers we celebrate upon return from difficult missions abroad, sick-out patriots had reason to think that their fellow citizens would have their backs when they disobeyed protocol in trying to bring a shutdown to an end.  There were no parades, but there were also no recriminations, and public opinion was clearly on their side.  Air traffic controllers, TSA employees, IRS workers, and others—your country surely thanks you for your service.

What Popular Histories Often Get Wrong About the Underground Railroad           

“The Underground Railroad was highly improvisational – like good jazz.”

 

When I’ve used that line over the years in speaking to audiences about the loosely-organized network that aided runaway slaves in the pre-Civil War U.S., it has usually elicited surprise.  Since slavery ended, most Americans have come to view the Underground Railroad as an extremely well-organized and thoroughly-routinized system.  In part, this image can be traced to the railroad metaphor that came to describe the network in the 1840s.  Railroads, after all, run on tracks and on time.  In part, it is due to the work of pioneering Underground Railroad historian Wilbur H. Siebert of Ohio State University, whose Underground Railroad from Slavery to Freedom (1898) still shapes our understanding of the network today. In the volume, he called the Underground Railroad a “great system,” “a series of interlocking lines,” and “a chain of stations leading from the Southern states to Canada.”  Of greater significance, perhaps, was a map in the volume, which showed the Underground Railroad as a series of distinct lines and approximated a railroad map.  That illustration has been widely reproduced in textbooks, in publications and museum exhibits on the Underground Railroad, and on the Internet.

 

In reality, Underground Railroad activists and fugitive slaves had to improvise in response to ever-shifting circumstances – the presence of slave catchers, road and weather conditions, and the availability of personnel and particular modes of travel.  All of this determined what road was traveled, or if a road was used at all. Sometimes, activists and freedom seekers cut through fields and woods and along rivers and creek beds.  At other times, beginning in the 1830s, they took passage on an actual railroad, or used the tracks as a footpath rather than roads.  One activist, the Rev. Henry Northrop of Ann Arbor, White Pigeon, and later Flint, Michigan, said of freedom seekers: “Hardly any two went over the same road and very little was known about the way lest the pursuers might follow.”  While this statement may involve a bit of hyperbole, it does make an important point: Underground Railroad activists regularly improvised.  Even Siebert, writing in his Underground Railroad, observed that activists often adopted “zigzag and variable routes” to “blind and throw off pursuit” and “avoid unfriendly localities.”

 

Several months ago, an enlightening example of the improvisational nature of the Underground Railroad arrived in a letter from my cousin Darlene Cassidy.  She had been going through the personal papers of her late father, my uncle Marvin Stanley, and found documentation of the Underground Railroad work of her great-great-grandparents Obadiah and Sarah Williams of Pickrelltown in Logan County, and later Hardin County, Ohio.  Some additional research on my part helped to establish other elements of their personal story.  In 1893, when Williams was “an elderly man,” he shared reminiscences of his activism with the daughter of another Underground Railroad activist, who passed this information along to Siebert.  It resides today in the Siebert Collection at the Ohio History Connection. Obadiah and Sarah, lifelong members of the Society of Friends, were motivated to perform this work as a divine calling.

 

As a young man, Obadiah transported a wagonload of grain from Pickrelltown to the Cincinnati market.  There he encountered runaway slaves and heard their story for the first time.  It made an indelible impression.  Not long after returning home, he befriended a fugitive named Meschach “Mose” Moxley and eventually traveled to Kentucky to buy the freedom of Moxley’s wife and children. The Moxleys settled in nearby Bellefontaine, where Meschach became a prominent gunsmith.  These were merely the first of many northbound African Americans aided by Obadiah at Pickrelltown.

 

Obadiah’s first experience as a “conductor” on the Underground Railroad took place in the summer of 1841 or 1842.  After hitching up the horses, he and a fellow activist carried a fugitive through Marysville to the home of activist John Cratty in Delaware, in a nighttime ride of more than thirty-five miles.  Cratty’s home was considered a “halfway point” on the way to Quaker safe houses in Mount Gilead in Morrow County.  This would become Obadiah’s usual route for transporting escaping slaves.  On this occasion, however, they took a slight detour, “fearing to go through Middleburg,” a nearby village.  After resting, they returned home the next day.  On another trip, they delivered a runaway some twenty miles to Marysville and left him at the home of a Presbyterian acquaintance, returning home the same night. Circumstances varied.

 

In October 1844, in a particularly illustrative case of improvisation, Obadiah conveyed a runaway slave couple and their three or four children from Pickrelltown to Mount Gilead, a distance of more than sixty miles.  One local furnished the team, another a covered wagon with flaps that closed in both front and rear.  Obadiah eschewed night travel and the usual route, observing that “the road through Marysville was now so well watched by Slave-hunters, it was not safe to go that way.” Instead, he concocted a plan to accompany two friends who were taking a load of produce northward to Marion. Obadiah followed them, posing as a farmer hauling grain to one of the ports on Lake Erie, thus avoiding suspicion. At Marion, he turned eastward toward Mount Gilead, completing the journey in three full days and successfully delivering his human cargo to Quaker activists there.  His wagon empty, he returned home by the usual route.

 

After Obadiah and Sarah married in 1845, they moved northward to Hardin County, where they continued their Underground Railroad work, with Sarah assuming a frequent and often major role.  According to family oral tradition shared by Evangeline Bealer, a local historian and another great-great-granddaughter of the couple, Sarah, too, was a master of improvisation.  On one occasion, a family of fugitives had spent the night at their Hardin County farm.  As their guests were finishing breakfast, two slave catchers approached the house.  Obadiah went outside to meet them under the guise of helping them water and rest their horses.  Meanwhile, Sarah cleared all evidence of breakfast from the table and hid their guests elsewhere in the house.  She then invited the slave catchers into the house and fed them a hearty meal.  As they talked, she listened carefully to their conversation, including their travel plans.  Once they left to pursue their human prey, she hitched up the horses to a wagon and took the runaways to safety by way of an alternate route. When the slave catchers returned from their unsuccessful mission, she bedded them down in the room where the slaves had slept the night before.

 

The work of Obadiah and Sarah Williams demonstrates the improvisational nature of the Underground Railroad in a very personal way. But they are merely two among hundreds of activists whose stories make this important point.  Greater attention to the way in which they and others assisted fugitive slaves entrusted to their care will revise and correct the public perception of the network as an extremely well organized and highly-routinized system.

Why Do Few People Know the History of Northern Slavery?

 

 

After many decades of neglect, the story of slavery in the North is reentering collective memories in the region. This motivated two questions that I write about in my recent book, Slavery in the North: Forgetting History and Recovering Memory: How and why was the enslavement of tens of thousands of people in the region forgotten, and how and why has it recently been partially recovered? 

 

I, for one, must admit that despite having grown up in New York—the colony and state with over 20,000 enslaved people (12% of its population) on the eve of the American Revolution—I had virtually no awareness of slavery’s existence in New York or in any of the other Northern colonies and early states. In high school and college, although I had taken a number of American History courses, I never learned anything about it. Clearly, its existence clashed with the widespread popular belief that slavery was exclusively a “southern problem.” As I learned, my case was far from unique. Few Northerners had any awareness of slavery’s earlier existence, and even when they did, what they knew was often very minimal.

 

In fact, Northern slavery was significant. It lasted from 1626 until 1865—239 years—and involved tens of thousands of people. Merchants from Rhode Island, New York and Boston were the largest North American slave traders in the colonies and early states. The wealth they earned produced the country’s early elite and funded the region’s first industries and universities.

 

Explaining how something that significant virtually disappeared from the region’s collective memory was at first challenging. Thinking about the South in comparison with the North helped begin to answer this puzzle. In the states of the Confederacy, there are at least three clear differences from the North that help us understand the difference in the collective memories of slavery in the two regions. The first is that the South has maintained narratives (often under the rubric of the “Lost Cause”) to explain the loss of the Civil War and of a way of life that included slavery. Second, there were many ritual expressions and enactments such as films, books, dramatic productions, museum exhibits, memorials, monuments, public holidays, parades, historical sites, and music that told the story of the Civil War and slavery from that perspective. Third, the region’s public and ceremonial landscape makes the antebellum period’s lifestyle visible and emphasizes the valiant battle for secession. Central to this are the memorials and monuments located throughout the region as well as the many Civil War battlefields that are now historical sites attracting many visitors each year. There are also many plantations, where hundreds of enslaved people once lived and toiled, that now offer tours. Until recently, few of these tours talked about the lives of the enslaved, but in recent years this has greatly changed in some places, as I learned on visits to Virginia, North Carolina, and South Carolina. 

 

In contrast, the North was almost completely devoid of all three—narratives, ritual expressions and enactments, and visibility on the public and commemorative landscape. Even the few rare examples were often very limited. The African-American Museum in Boston has long told the story of Crispus Attucks, a Black man who was the first casualty of the Revolutionary War. However, when I visited it and took the Black History Tour there in 2015, I heard nothing about the existence of slavery in Massachusetts from the 1630s until 1800. Instead there was an emphasis on 19th-century opposition to slavery and especially to the 1850 Fugitive Slave Act. In contrast, the Royall House in nearby Medford, Massachusetts, which in the 18th century was the home of the largest slave owner in New England, is now a historical site that incorporates the history of enslavement there and discusses what is known about the lives of the enslaved on the property. 

 

New Yorkers learned anew about slavery in the city in the 1990s with the rediscovery of the African Burial Ground in lower Manhattan, which was originally located outside the city walls when it opened in the late 17th century. The discovery of the site resulted from a mandatory archaeological dig undertaken when the Federal Government decided to construct a new General Services Administration building on the site just off lower Broadway in 1989. It was widely assumed that little of historical significance would be turned up because the area had been so actively developed and redeveloped since the early 1800s, but the dig found that the remains of between 15,000 and 20,000 people were buried there. Over 400 sets of remains were exhumed for study, which turned out to be very productive. There was a contentious battle, however, over how to memorialize the enslaved interred there. Eventually, the government funded a very impressive memorial, reinterred the remains of those who had been removed, and opened a National Park Visitor Center to tell the story of the site and of slavery in New York. When I visited it in 2011, I was struck by the number of white and Black visitors who were shocked to learn about New York’s history of slavery. This was the same reaction I heard at the New York Historical Society on a visit to see its impressive exhibit on “Slavery in New York” in 2005.

 

So why and how did the North manage to forget its long history of slavery? This happened in part due to gradual attrition as events become more distant over time. But this is not sufficient, as we see in the case of the South. There is also the destruction or modification of sites associated with slavery; intense shame and guilt that lead Black and white descendants to repress memories; fear of social sanctions from others for recounting painful memories; the reframing of narratives about the past in ways that omit what makes people and communities embarrassed and uncomfortable; and incentives to forget a painful past. Each of these helps us understand how history is a social construction that fits the needs of people in the present and is not stable in terms of what people know and how they know it.

 

 

 

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/170943 https://historynewsnetwork.org/article/170943 0
What I’m Reading: An Interview With Cultural Historian Surekha Davies

Surekha Davies is a cultural historian specializing in science, visual and material culture, and the entangled connections between Europe and the wider world, particularly the Americas. She is currently the InterAmericas Fellow at the John Carter Brown Library. She formerly taught at Western Connecticut State University, where she was awarded tenure and promotion to associate professor in 2018. Her first book, Renaissance Ethnography and the Invention of the Human: New Worlds, Maps and Monsters (Cambridge University Press, 2016; paperback, 2017), won the 2016 Morris D. Forkosch Prize for the best first book in intellectual history, awarded by the Journal of the History of Ideas, and the 2017 Roland H. Bainton Prize in History/Theology, awarded by the Sixteenth-Century Society & Conference. It was shortlisted for the 2018 Pickstone Prize by the British Society for the History of Science. Her website is surekhadavies.org.

 

What books are you reading now?

 

I’m reading two new books, among other things. One is Martha S. Jones’s Birthright Citizens: A History of Race and Rights in Antebellum America (Cambridge University Press, 2018), a gripping account of how African American activists in the nineteenth century reconfigured how Americans imagined the concept of citizenship: these activists argued that they were guaranteed by birth all the rights of US citizens. The other book is Molly A. Warsh’s American Baroque: Pearls and the Nature of Empire, 1492-1700 (UNC Chapel Hill, 2018). This book uses the movement of pearls around the Caribbean basin and through global markets to weave together an interconnected history of empire, labor, economies, ecologies, taste, and material culture.

 

What is your favorite history book?

 

Lorraine Daston and Katharine Park’s Wonders and the Order of Nature, 1150-1750 (Zone Books, 1998) is the ultimate book that keeps on giving. It offers an extraordinary window into European scientific thought through how people imagined the notion of wonder: as an emotion, as a cognitive faculty or thinking tool, and, in the plural, as wonders, those natural and artificial beings and things that prompted the emotion of wonder in the viewer. I first picked up the book in 2003, at the start of my doctoral program, and its principles still undergird my work. Daston and Park reveal how following historical sources no matter where they take you, rather than drawing arbitrary lines around disciplines, topics, or geographical fields which change over time, leads to a much richer understanding of how past cultures made sense of the world.

 

Why did you choose history as your career?

 

I became a historian because I watched too much Star Trek and Cosmos as a child. From about the age of seven, I wanted to become an intergalactic explorer, to meet extraordinary life-forms, to travel faster than lightspeed and then go back in time and meet myself and, generally, to have my mind blown. When I started out as an undergraduate at the University of Cambridge, I was a physics major seeking to specialize in astrophysics. When we got to quantum mechanics towards the end of the first year, I (and everybody else) conceded that faster-than-light travel was, alas, unlikely to be invented any time soon: anyone who got into physics in order to be captain of the USS Enterprise had to re-think their life choices. Many of us switched out of physics into disciplines that seemed more likely to offer the intellectual adventures we were after (as did biologists who had spent too much time watching Jacques Cousteau and imagining that being a biology major would be all scuba diving and shark cages). A lot of us switched to history and philosophy of science. Pretty soon I realized that the creative analytical brainstorming – and writing – that I most wanted to do fitted perfectly with currents in early modern cultural and intellectual history and history of science.

 

What qualities do you need to be a historian?

 

It varies with period and subfield (cultural/intellectual, social, political, and so on). If you work on cultural history or on the history of science or of ideas c.1200-1800, for example, it helps if you are invested in exploring how processes of close reading – of texts, images, or material artifacts – and practices of writing and re-writing constitute analytical work. Put another way, to better understand how and why people who lived half a millennium ago made sense of the world, you have to be prepared to experience your sources deeply, to go where they gesture, and not to go in with preconceived notions of how society, gender, science, or politics are supposed to work. To paraphrase the prolific cultural historian Keith Thomas, smart people believed in witchcraft in the early modern period; our job is to understand why. To do so, historians have to hunt in all manner of material traces of the past (books, manuscripts, images, objects…), to think elastically, and, most importantly, to write paragraphs, chapters, and dozens, even hundreds, of pages. Practicing history well is the opposite of the discrete ‘this, this or this’ of the world of multiple choice exams and lists of names and dates through which the layperson might imagine the discipline of history. Understanding the past involves mental stamina and breadth.

 

Who was your favorite history teacher?

 

My favourite history teacher wasn’t officially one of my teachers – it was Julian Swann, professor of early modern history at Birkbeck College, University of London, and a specialist in eighteenth-century French political history. Julian gave me my first regular teaching gig, as the teaching assistant leading the seminars for his European History, 1500-1800 undergraduate class, while I was writing my doctoral dissertation. At the time I found political history rather boring: I was much more interested in the history of ideas than in monarchs, battles, or statutes. I had trained as a historian of science as an undergraduate, rather than as a straight historian, so politics was context rather than the central object of study. Julian’s early modern Europe course integrated cultural, intellectual, social, political, and religious themes; his was a cultural and intellectual approach to political history, and it made themes that I had not found terribly exciting before – such as political revolutions or religious change – vivid in their own right.

 

What is your most memorable or rewarding teaching experience?

 

In 2012, when I was an assistant professor at Western Connecticut State University, I taught an upper-level class on the cultural history of monsters from antiquity to the eighteenth century in Europe; it became my most popular class. Something that was surprising, memorable, and rewarding was how the students bonded with two of the set readings. One was Merry Wiesner-Hanks’s The Marvelous Hairy Girls: The Gonzales Sisters and Their Worlds (Yale UP, 2009), which is about the lives of, and responses to, a sixteenth-century family, many of whose members had a genetic condition which covered their bodies, apart from their faces and the palms of their hands, with hair. Various family members were brought up at court, prized for the unusual physical condition that led viewers to ponder whether they were human, animal, or something monstrous and in-between. The other book was Daston and Park’s Wonders and the Order of Nature, which I mentioned earlier as my favourite historical work. Students talked about the effect of the physical books on their learning experience, on people who saw them reading the books – both of which carry illustrations of Lavinia Gonzales, one of the hairy children – and related conversations they had started up with friends, family, and strangers who were intrigued by the titles and by the hairy face, framed by a ruff, on the covers. The enormous Wonders book was lovingly nicknamed ‘the brick’; students would unconsciously stroke the cover in class; and their empathy for those individuals who were imagined as monstrous in pre-modern Europe was inspiring.

 

What are your hopes for history as a discipline?

 

I would like to see greater boldness in undergraduate curricula. At the cutting-edge of research, the discipline is tremendously varied in its questions, sources, methods, and scope. However, the overriding structuring elements of many curricula remain chronological period and geographical region. In practice, in the archive, the most meaningful chronological and geographical frames depend on the question you happen to be asking. For example, people didn’t wake up one day and switch from manuscript (handwritten) books to printed books; nor did everyone wake up one day in sixteenth-century England and decide to break away from the Church of Rome. Similarly, major historical phenomena did not unfold exclusively within today’s national boundaries or conventions of continental boundaries. One can teach students how to think deeply and analytically about the past, and about its continuing ramifications for the present, through questions and themes that transcend period and place, worked out through careful and detailed study of how particular things and places change over time. Understanding the relevance of historical thinking and knowledge to the present and to questions that transcend later geographical boundaries, and acquiring a deep body of knowledge, are not mutually exclusive curricular goals.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I’m not a collector; I’d sooner consult collections that other people have assembled and are carefully preserving in libraries, museums, and archives. I did, however, collect half a dozen H. G. Wells first editions as an undergraduate, at an outdoor market in Cambridge, England; the experience of owning them was unexpectedly underwhelming, although they were (fortunately) worth more than I had paid for them. I do have a collection of everyday children’s books, mostly published in the UK by Penguin/Puffin/Dragon between the 1950s and 1980s – such stylized cover illustrations, beautiful prose and distinctive plots! – which I assembled as an adult, entirely for my own reading rather than for the sake of forming a collection.

 

What have you found most rewarding and most frustrating about your career?

            

The last year of writing my first book was the most rewarding experience of my career so far. I was lucky enough to spend eleven months in Washington, DC, as a fellow at the Folger Shakespeare Library and, for part of that time, at the Library of Congress, while teaching two courses online for a semester (thus remaining in DC between a fall-semester fellowship and a summer fellowship). I spent the year making the 400-page book manuscript better: adding things, improving the argument, making the writing clearer, getting feedback on chapters from friends and mentors and incorporating their suggestions. I had a great group of fellow fellows at the two libraries: we went on endless walks, visited museums, cooked and ate well…. There was even a piano in my apartment (I had played seriously enough as a child to contemplate music school before college), and one of the other fellows was a composer and musicologist. I was thus able to play, to compose, and to go to several concerts a week (there are so many free ones in DC, plus all the free tickets my composer friend used to get). It was not the easiest year of my life – finishing a book is not easy! – but the combination of lifting and revising the book, cultural life, friendship, and sending the book into production at the end was wonderful.

 

One of the most frustrating things about my career has been how slow the academy’s hiring process has been, and continues to be, in coming to terms with the importance of work that transcends the disciplinary, geographical, and temporal boundaries established two hundred years ago. Many questions straddle the chasms between these divisions, as did past lives, events, trading routes, and ideas; there is much exciting work at the cutting-edge of the field. As the present becomes ever more entangled on a global scale, it becomes urgent that the discipline expands its visions in the context of hiring the next generation of historians.

 

How has the study of history changed in the course of your career?

 

My PhD is less than a decade old, but I have noticed a number of changes. Many more dissertation projects transcend geographical boundaries and traditional disciplinary divides. No individual can do everything, and much has been learned within the disciplinary framework of the past couple of centuries. Nonetheless, the past doesn’t happen within disciplines: it spills everywhere – life spills across space, time, peoples, cultures, activities… Only by asking questions and following them wherever they lead – even when they take you beyond the traditional historian’s purview of ‘written documents from X archive about Y country’ – can you make the most of the sources that survive. An increasing number of historians are drawing on visual, oral, and material sources. In the US, there has been a welcome expansion of interest and support for scholarship on Latin America, Africa, Asia, and the Pacific, regions that have traditionally received short shrift in comparison to the study of the US and Europe. At the same time, the reduced attention given to the pre-modern (say pre-1800) world, including pre-modern Europe, is troubling. Why? You cannot understand the structure of the world we live in without starting at least half a millennium ago. Europe was multicultural two thousand years ago and more; human populations had been travelling for many millennia before that; neither human migration nor cultural intermixing are anything new.

 

What is your favorite history-related saying? Have you come up with your own?

            

I’ve come up with my own, something that encapsulates the words of my dissertation advisors: ‘If this is the question, then at what sources should you be looking? And if these are the sources, then what questions can you most meaningfully ask of them?’ If we are to better understand the past, then it cannot be contained by either a question or a source alone; rather, there is a call-and-response relationship between questions that pique your interest and the sources you look at – and very quickly both start to spiral out into rich and messy stories.

 

That was certainly the case with my first book, Renaissance Ethnography and the Invention of the Human: New Worlds, Maps and Monsters (Cambridge UP, 2016; paperback 2017). When I started the dissertation I thought I was writing about the intersection of the history of reading and the history of knowledge-making. By the time the dissertation became a book, it was about the ways in which early modern Europeans were forced to re-imagine what it meant to be human in response to the visual arguments made by mapmakers who carefully synthesized information from travel accounts onto gridded maps which constituted new forms of knowledge about the world’s peoples. By analyzing manuscript and printed maps alongside prints, natural histories, geographies, costume books, and travel writing, I argued that new mapping techniques made the idea of monstrous peoples deformed by nature central to the category of ‘human’. In an age when scholars, missionaries, native peoples, and colonial officials debated whether peoples of the Americas could – or should – be converted or enslaved, maps were uniquely suited for assessing the impact of environment on bodies and temperaments. By revealing that map illustrations were scientific diagrams rather than fantasies, this book showed the centrality of spatial thinking for early modern science, and urged an expansion of the sources via which scholars study the prehistory of race.

 

Is this a story about Europe, the Atlantic world, the history of science, the history of art, the history of maps, biology, or geography….? I would say that it is all of them at once, and no less of one for being the other. Interdisciplinarity is not about the margins of one discipline or field intersecting with the edges of another in the manner of a Venn diagram; rather, it is about questions that sew together issues at the heart of one discipline to those at the heart of the next.

 

What are you doing next?

 

I’m working on my second book project. It sits at the intersection of cultural and intellectual history, history of science, and art history. Collecting Artifacts in the Age of Empire: Spaces of Disruption, 1550-1725 pursues the impact of New World artifacts on European science and medicine, examining the influence of overseas artifacts and knowledge that entered northern European cabinets of curiosities. Cabinets were epistemic installations: collections organized to generate knowledge and particular effects, strengthening some ideas about objects while eliding others. I am exploring the ways in which cabinets were conduits via which indigenous technical expertise and natural knowledge disrupted notions of the relationship between nature and culture, and shaped the disciplinary formations of biology, aesthetics, and anthropology. Inventing Europe required peoples who seemed not to shape nature; Collecting Artifacts offers a provocation to today’s essentialist approach to culture and to the perception of technology as an Old World invention. Skeins which extend from this project into the present also embody a conduit to public policy. Understanding the origins of classificatory systems is a prerequisite for decolonizing museums, providing context for interventions in debates on restitution, curation, and conservation.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/170960 https://historynewsnetwork.org/article/170960 0
The Inspiring Campaign We All Need to Remember at this Horrible Time in Politics

 

There are many reasons to worry about American democracy, and candidate conduct is among them. Much campaign behavior does little to elevate political discourse: slogans and sound bites replace thoughtful discussion, candidates flatter loyal audiences, and misleading messages and attack ads seem more common than thoughtful dialogue over public policy. 

Although these deficiencies appeared once again in 2018, they are not inevitable features of political campaigns. Such is the lesson of Edmund S. Muskie’s 1968 vice-presidential campaign, which marked its golden jubilee in 2018. Muskie outperformed not only the other candidates for national office that year but virtually every other national candidate in his lifetime, and probably in most lifetimes. He did so by discussing issues and basic American values in a thoughtful and candid way. Rather than tell audiences what they wanted to hear, Muskie tried to persuade them through rational discussion of what he thought they should think. Muskie’s ticket narrowly lost, but his vice-presidential campaign was highly effective and provides a model for 21st-century America.

Muskie’s campaign as Hubert H. Humphrey’s running mate began as an impossible undertaking. The Democratic ticket faced numerous obstacles, including division over the war in Vietnam, disillusionment after the assassinations of Dr. Martin Luther King, Jr. and Senator Robert F. Kennedy, and Humphrey’s tarnished reputation. The party was financially broke and fractured, with liberal supporters of Senator Eugene McCarthy disposed to stay home and ethnic and working-class Democrats drawn to Governor George Wallace’s third-party campaign. All Muskie had to do was introduce himself to America, win over the McCarthyites on the left and the blue-collar and ethnic Democrats drifting to the right, and make Humphrey look good.

As police and protesters violently clashed outside the Democratic convention with rocks, billy clubs, and tear gas, Muskie introduced himself to America with an acceptance speech that spoke of the foundations of a pluralistic society built on trust (“It means learning to trust each other, to work with each other, to think of each other as neighbors. It means diminishing our prerogatives by as much as is necessary to give others the same prerogatives.”). The speech, which Muskie dictated late that afternoon, was more college lecture than traditional convention speech, but it echoed themes of Muskie’s lifetime of public service.

Whereas Richard M. Nixon and Wallace, and their running mates, Governor Spiro T. Agnew and General Curtis LeMay respectively, embraced a “love it or leave it” mindset which associated dissent with disloyalty, Muskie championed constructive protest as inherent in the First Amendment and American tradition. He told Midwestern high school students that as Americans they were “privileged to kick the government around.” With that privilege comes “the duty and responsibility of using your heads, your hearts, your capacity for understanding, to do what is best for everyone concerned.”

Muskie’s performance at the county courthouse in Washington, Pennsylvania, on September 25, 1968, was a metaphor for his campaign. When angry anti-war protesters, many from nearby Washington and Jefferson College, exercised their constitutional right to verbally kick the Democratic vice-presidential candidate around, Muskie invited a heckler to speak from the podium for 10 minutes with the understanding that the crowd would then hear Muskie for equal time. Muskie listened as the designated speaker advocated the politics of the street over the ballot box, and intervened to restore decorum when traditionalists jeered the provocative message. Muskie then told the audience why he believed in the American democratic system and encouraged the students to participate as citizens “to contribute to what must be done that the rest of us have not yet been able to do.” 

To fat-cat groups, Muskie often defended the long-haired, unruly youths who challenged establishment values. Muskie told these Chablis-and-brie gatherings that the young had “honest doubts” about an American system which often closed doors to the disadvantaged, and that many “inequities” “are your fault and mine.” Citizens should listen seriously to youthful critics and encourage their meaningful participation to improve America. 

But Muskie did not coddle the college crowd. He told campus audiences of white, privileged students that he did not share their enthusiasm for the proposed all-volunteer army which would concentrate military service on the poor and blacks. He preferred a lottery which would spread the risk more equitably and contribute to democratic decision-making regarding war. And he admonished students to focus not just on the war but on other issues, to respect those who didn’t share their views, to accept, but work to change, democratic outcomes they dislike, and to build, not simply tear down.

Muskie frequently appeared before ethnic, working class audiences whose members, though traditionally part of the FDR coalition, were tempted by Wallace’s candidacy which spoke to, indeed fanned, their resentment about the racial policies of the Johnson administration and the decisions of the Warren Court. 

Muskie told those gatherings of his father, Stephen Marciszewski, a Polish immigrant who became a tailor in Maine in the early 20th century. He had come to America, Muskie said, looking for “equality and opportunity” and had seen his hopes realized because in America “[s]trangers were accepted.” To make America “the land of opportunity that we came here to find,” all Americans had to work towards “a united America,” not “a divided America.” Real safety and real security came from freedom, not from walls. Would we continue to have an America built on trust and freedom or one built on distrust and hatred?

“My father did not come here looking for fear; he came here to escape fear and to find freedom. He did not come here looking for hatred; he came here to escape it and to find freedom. He did not come here seeking to deny other people opportunity; he came here to find it for himself, thinking it was available to all people who lived in America. He came here believing that freedom here was freedom for everyone.”

By campaign’s end, Muskie was, in the words of presidential campaign historian Theodore White, “an almost immeasurable asset” to Humphrey’s campaign. The Democrats sent Muskie to crucial battlefields, ran televised ads using the vice-presidential choice as a reason to vote Democratic, and had Muskie fly cross-country to appear in the televised campaign finale. A prominent political cartoon depicted Muskie as a runner carrying Humphrey.

To be sure, Muskie brought an excellence to public service that few can match. Yet what also made his campaign special was his willingness to behave as a leader, not a supplicant, and his ability to give authentic voice to the best ideals of American constitutional democracy. 

Muskie treated his listeners as adults who deserved to hear, and were capable of responding to, rational, fact-based discussions. He was a teacher who tried to persuade listeners, not a panderer who found inspiration in the biases, emotions, and fears of his listeners. Muskie saw political communication not as a hierarchical activity in which candidates read applause lines from a teleprompter to supporters, but as an interactive exchange in which candidates spoke, listened, learned, and answered, not simply to acolytes but to skeptics, critics, and reporters. Muskie held frequent press sessions and answered audience questions. The press was not the enemy of the people, but a necessity of government of, by, and for the people. To Muskie, information and ideas were the tools of civil discourse, and civil discourse was the indispensable instrument of democracy and constitutional government. He often said that “It is better to discuss a question without settling it than it is to settle a question without discussing it.” 

Muskie had a deep understanding of, and commitment to, America’s best ideals, which he recognized as the national fabric, and he used his candidacy to express them regularly and with rarely matched eloquence. For Muskie, pluralism did not simply describe America but explained its strength. There was greatness in a country that welcomed immigrants, that let a Polish tailor named Marciszewski become Muskie. Those Americans by choice, and their children and grandchildren, made, sustained, renewed, and enhanced America’s greatness. 

Freedom was the prescription for national greatness. “We are perhaps the prime example of a society which was created on the assumption that if you allowed the individual citizen to develop his full capacities, that he can contribute to a rational discussion and understanding and solution to our problems,” Muskie said. Just as democratic leadership compelled accountability, democratic citizenship imposed continuing duties and demanded responsible behavior, not simply a willingness to accept compromises but a commitment to work towards sensible accommodations to create and preserve a union built on mutual trust.

Muskie’s vice-presidential campaign was one of the few shining moments of the difficult year of 1968. It left a legacy of what a campaign and candidate should be like and modeled values upon which the survival of American democracy depends.

© Copyright Joel Goldstein 2018

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/170754 https://historynewsnetwork.org/article/170754 0
To Free or Not to Free

In retirement, however, there were opportunities to do more, and Jefferson consistently refused them. In 1814, for instance, Edward Coles urged Jefferson to take the lead on the issue of emancipation. Jefferson declined. Such refusals have led to a critical backlash by scholars who argue that the reason for inaction was racism. Yet Jefferson consistently maintained that his reasons for refusing further efforts on behalf of emancipation were otherwise: they centered on generational sovereignty and timeliness.

 

First, there is generational sovereignty. In a letter to Joseph Willard (24 Mar. 1789), Jefferson articulates the notion of generational sovereignty and how it calls each generation to do its part. “It is for such institutions [of Natural History and Natural Science] … to do justice to our country, its productions and its genius. It is the work to which the young men, whom you are forming, should lay their hands. We have spent the prime of our lives in procuring them the precious blessing of liberty. Let them spend theirs in shewing that it is the great parent of science and of virtue; and that a nation will be great in both, always in proportion as it is free.” In his 1814 letter to Edward Coles (Aug. 25), Jefferson excuses himself from further actions apropos of emancipation. “I am sensible of the partialities with which you have looked towards me as the person who should undertake this salutary but arduous work. But this, my dear sir, is like bidding old Priam to buckle the armour of Hector…. I have overlived the generation with which mutual labors & perils begat mutual confidence and influence. This enterprise is for the young; for those who can follow it up, and bear it through to its consummation. It shall have all my prayers, & these are the only weapons of an old man.”

 

Second, there is the issue of timeliness. Jefferson always believed that one could push an issue too quickly and by doing that, do more harm than good.

 

“I have long since given up the expectation of any early provision for the extinguishment of slavery among us,” writes Jefferson to William Burwell (28 Jan. 1805). “There are many virtuous men who would make any sacrifices to affect it,” he continues, “many equally virtuous who persuade themselves either that the thing is not wrong, or that it cannot be remedied, and very many with whom interest is morality. The older we grow, the larger we are disposed to believe the last party to be.” There are, Jefferson is saying, a large number of citizens pro-eradication, a large number con-eradication, and also a very large number who feel morally engaged because they have interest in the topic without an opinion. The time is just not right, as eradication does not have the consent of the general citizenry.

 

Some 20 years later, Jefferson gives a similar reply in a letter to James Heaton (20 May 1826), who too urged Jefferson to act:

 

The subject of your letter of April 20, is one on which I do not permit myself to express an opinion, but when time, place, and occasion may give it some favorable effect. A good cause is often injured more by ill-timed efforts of its friends than by the arguments of its enemies. Persuasion, perseverance, and patience are the best advocates on questions depending on the will of others. The revolution in public opinion which this cause requires, is not to be expected in a day, or perhaps in an age; but time, which outlives all things, will outlive this evil also.

 

The sentiment in the two letters, and in other letters, is that the time is not right to act on slavery. Public opinion is too divided among some and there is torpor among others. That sentiment is ingeminated time and again. Ill-timed action even on a moral cause might lead to results which do more to retard than to advance that moral cause, because it will have been undertaken without the consent of the general citizenry. One can through intrigue aim to effect a revolution on behalf of a cause without general public support—viz., with the support of some part of the general public—but that effort, in opposition to the principle of government by the will of the majority, will be unjust and will likely fail, because it lacks general succor.  Whether Jefferson was right in asserting that generational sovereignty and timeliness were adequate justifications for inaction during retirement I decline to address. What I do address is a heretofore undisclosed tension in the axial principles of Jefferson’s political philosophy. That tension, to be resolved, requires some sort of axiological ordering of those principles.

 

That tension centers on the notions of generational sovereignty, of timeliness, and of government based on the will of the majority. The argument goes as follows. A Jeffersonian republic is a government based on the will of the majority of citizens, suitably informed. The suitably-informed qualification is not some addendum to justify discretionary government—that is, transgressions in governmental decisions at odds with majority opinion—but merely is a requirement that the citizenry have a certain basal level of education as well as access to new political happenings and new scientific disclosures. Government based on the will of the majority demands political timeliness—that nothing be pushed through Congress unless it has the sanction of the people. Yet—and herein lies the nodus—Jefferson was also committed to the principle of generational sovereignty, which entails as its corollary that each generation is unencumbered by the generation prior—the “unencumbrance” principle. The institution of slavery is an encumbrance that has been passed through the generations among the colonists since 1619. Regard for generational sovereignty demands that there be immediate action on slavery so that it is not passed down through the generations, thereby encumbering the generation subsequent with the problems of the generation prior. Yet if the generation in political control is unconvinced that slavery is the political problem or moral evil that it is, then obedience to the will of the majority necessitates that the will of the majority be respected and nothing be done to eradicate slavery until such time as the majority of citizens will it to be eradicated. And so, it seems, we unhappily have a scenario in which we are theoretically committed both to quick action and to not-so-quick action to end slavery—a most unpalatable scenario.

 

There are a couple of outs.

 

One is to reject the notion that Jefferson was inexorably tied to generational sovereignty or to timeliness. The difficulty here is that there is no evidence that Jefferson was ever anything but tightly committed to both principles.

 

The remaining and most reasonable option is some sort of axiological ordering (e.g., primary, secondary, tertiary, etc.) of those principles along with a justification of that ordering. The difficulty here is that Jefferson, because he was never philosophically pushed to do so, never subjected his political philosophy to any such axiological ordering. Hence, we can only speculate, in keeping with his moral commitments, on such an ordering. Such speculations will not aim to expose the whole system of political principles to critical analysis.

 

We cannot but suppose that obedience to government in keeping with the will of the majority was for Jefferson a true axial principle and that timeliness was a corollary of it. Yet we have also assumed a tight commitment by Jefferson to generational sovereignty and its corollary of unencumbrance. Nonetheless, that need not tie us to a tight commitment to generational sovereignty as an axial principle, but to a tight commitment to it as a secondary principle—viz., a principle to be applied ceteris paribus, or when it does not impede the will of the majority. In short, the will of the majority and generational sovereignty are both to be respected, but when the two principles clash, the will of the majority trumps generational sovereignty. Consequently, if the majority of citizens are not anti-slavery, then (and only then) is it legitimate to contravene the unencumbrance principle and pass on the institution, obliquitous as it is, to the next generation.

 

There is no direct evidence, however, that Jefferson considered generational sovereignty a secondary principle. Yet the several difficulties with the principle listed by James Madison in a letter to Jefferson (4 Feb. 1790), after Jefferson had articulated his commitment to it in a prior letter to Madison (6 Sept. 1789), show that Jefferson was aware of those difficulties, and that is some reason to regard generational sovereignty as a secondary principle of his political philosophy.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171017 https://historynewsnetwork.org/article/171017 0
Six Times the Failure of a Political Nomination Changed American History 

When one examines history, random circumstances often shape the future. Sometimes the failure of one intended outcome greatly alters the course of history. Here are six examples of times when the failure of a political nomination changed American history. 

 

1. Roger Taney 

President Andrew Jackson’s close adviser, Roger Taney, was the first cabinet officer rejected by the US Senate in American history. Taney served only as a recess appointment as Secretary of the Treasury from 1833 to 1834.  But Jackson then nominated Taney to succeed Chief Justice John Marshall.  Taney proceeded to have the second longest tenure as Chief Justice, behind only Marshall in duration (28 years to Marshall’s 34 years). Most infamously, Taney presided over the Dred Scott Supreme Court case, which determined that African Americans, whether enslaved or free, were not citizens and could not sue in court. His controversial decision helped to bring on the Civil War.

 

2. Earl Warren

In 1953, Chief Justice Fred Vinson died suddenly, and President Dwight D. Eisenhower chose California Governor Earl Warren as his successor. Eisenhower had no idea how liberal Warren would be on the Court, or the impact that Warren would have in promoting the legal end of segregation in Brown v. Board of Education (1954). Warren convinced a number of Justices who had been leaning the other way to join a unanimous decision in that case, an outcome that clearly would not have occurred under Vinson, who personally opposed the Court’s intervention in the case.

 

3. Harry Blackmun 

In 1970, President Richard Nixon selected Harry Blackmun as his third choice for Associate Justice of the Supreme Court, after the failed nominations of Clement Haynsworth and G. Harrold Carswell.  Who could have known that Blackmun, thought to be a conservative, would become more liberal as the years went by, and would be the prime author of the most controversial Supreme Court decision of modern times, Roe v. Wade, the abortion rights case, in 1973?

 

4. Anthony Kennedy 

In what became known as the “Saturday Night Massacre” in 1973, Attorney General Elliot Richardson and Deputy Attorney General William Ruckelshaus refused to follow President Richard Nixon’s order to fire Special Prosecutor Archibald Cox. Instead, they resigned immediately. Solicitor General Robert Bork, the third-ranking official in the Justice Department, followed Nixon’s order and fired Cox. Bork’s cooperation with Nixon would come back to bite him: when President Ronald Reagan nominated Bork to the Supreme Court in 1987, the Democratic majority in the Senate successfully opposed his nomination. After Bork’s defeat, Douglas Ginsburg, the next appointee, withdrew after the Senate learned he had used marijuana with students in his law school classes. Finally, Reagan turned to his third choice, Anthony Kennedy, who became the “swing vote” on the Court for much of his 30-year service until his retirement in 2018, affecting abortion rights and gay marriage.

 

5. Dick Cheney

In 1989, President George H. W. Bush chose former Texas Senator John Tower to be his Secretary of Defense.  However, when evidence of Tower’s heavy drinking and womanizing was revealed, his nomination went down in defeat.  Bush then turned to Republican House Whip Dick Cheney of Wyoming to head the Pentagon, and Georgia Congressman Newt Gingrich replaced Cheney as Whip.  Who could possibly have forecast that Cheney would gain favor for his leadership in the Gulf War of 1991, and that a decade later Bush’s son, George W. Bush, would pick Cheney to be his Vice President? And who could have imagined that Gingrich, with his strategy and tactics of “guerrilla warfare” and extreme partisanship outside the norm of traditional Republican leadership, would end up leading the Republican “Revolution” of 1994 and the open conflict between the Republican majority and Democratic President Bill Clinton, including the decision to seek impeachment in 1998-1999?

 

6. John Roberts

A final example is that of Chief Justice John Roberts, who is becoming more significant and influential in his now 14th year on the Supreme Court.  When President George W. Bush nominated Roberts for the Court, he was slated to replace retiring Associate Justice Sandra Day O’Connor, but the hearings on his nomination had not yet begun. Suddenly, Chief Justice William Rehnquist died, the first Chief Justice since Vinson in 1953 to pass away while in service.  Bush decided overnight to switch Roberts to that post, and then chose Samuel Alito to replace O’Connor.  With Roberts seen as a moderate conservative and the Court’s supposed “swing” vote, it is clear that had the more conservative Alito become Chief Justice instead of Roberts, the Supreme Court would be different in its character now and into the future.  

Roberts himself suffered a seizure on a dock in New Hampshire in 2007 while alone, two years after his confirmation to the Court. Fortunately, he did not fall into the water, which likely would have led to his death so early in his tenure on the Court.

 

So fate, chance, and random circumstances have had a vast impact on American history, and their role in human events certainly makes one humble.

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171126 https://historynewsnetwork.org/article/171126 0
Updated: Black History Month – What Historians Are Saying  

 

 

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/168318 https://historynewsnetwork.org/article/168318 0
Marie Colvin: an Amazing Woman Whose Life Was Cut Short Trying to Present Us with the Truth

2018 was an annus horribilis for freedom of the press.  Reporters Without Borders announced that 63 professional journalists were killed, of whom 49 were specifically targeted for death by an army or rebel group. 

 

The most famous journalistic killing came on October 2, when agents of the Saudi government murdered and dismembered Jamal Khashoggi, a Saudi dissident and columnist for The Washington Post. 

 

The conditions endured by journalists in war zones are portrayed in chilling detail in Lindsey Hilsum’s new book on war correspondent Marie Colvin. A staff reporter for The Sunday Times of London, Colvin died on February 22, 2012, aged 56, in Homs, Syria. The Syrian army, homing in on her satellite phone, targeted an artillery strike on the building from which she was reporting. 

 

At the time, Colvin was arguably one of the best-known war correspondents at work.  With her distinctive black eyepatch, flak jacket and wavy hair tied in a bun, she was familiar to many Americans through her frequent appearances on CNN and other news outlets. 

 

She lost her left eye after being hit by a rocket-propelled grenade while covering the Tamil insurrection in Sri Lanka in 2001.  She had been the first reporter in six years to report directly from the Tamil rebels’ besieged enclave on the island nation. International monitors estimated that more than 80,000 people had died from shelling, hunger, and disease. Colvin’s near death at the hands of the Sri Lankan Army drew world attention to a slaughter which had been conducted in secret. 

 

Although Colvin died seven years ago, her career and tragic death have gained new attention with the publication of Hilsum’s book and a recent biopic, A Private War, starring Rosamund Pike. 

  

Marie Catherine Colvin grew up near Oyster Bay, Long Island, one of five children in a middle-class Irish Catholic family. Her father, an ex-Marine, was an English teacher in the New York City public schools.  

 

After graduation from Yale, where she wrote for the campus newspaper, she went to work at UPI’s Washington Bureau. From her first days in the newsroom, she told her friends she wanted to emulate female journalists such as Martha Gellhorn, who had covered the Spanish Civil War, World War II, and Vietnam, and Oriana Fallaci, who covered the Middle East and obtained candid interviews with world leaders including Henry Kissinger, Deng Xiaoping, and Lech Walesa. 

 

As a young war correspondent in the 1980s, Colvin found that her looks and manner were a magnet for men, including those in power.  In 1986 she was the first reporter to interview Muammar Gaddafi after the U.S. air strikes on Libya. A dozen other western correspondents had traveled to Tripoli to see the “supreme leader,” but he chose Colvin for an exclusive sit-down discussion.  

 

Later, she became one of the few western reporters allowed to travel privately with PLO leader Yasser Arafat. He granted her a number of interviews and cooperated in a BBC documentary she produced about him (to the irritation of the Israeli government).   

 

From her base in London, Colvin covered the wars in Kosovo, Sierra Leone, and Chechnya. At one point, traveling with Chechen rebels and cut off by Russian forces, she hiked 40 miles through snow over the Caucasus Mountains, subsisting on canned milk and crackers, finally reaching safety in the Republic of Georgia. 

 

Author Lindsey Hilsum, writing from her own first-hand knowledge of war zones, weaves a compelling story. Hilsum, currently serving as the international editor for Channel 4 News in Britain, has covered conflicts in Ukraine, Syria, Iraq, and Kosovo. She knew Colvin socially and worked alongside her in Syria.  In researching this book, she gained the cooperation of Colvin’s family and had access to the reporter’s personal diaries and notebooks. 

 

The result is a deeply moving, very personal biography. Colvin was as adventurous in her love life as she was in her reporting. She was married three times and had numerous, stormy love affairs. She tried twice unsuccessfully to conceive a child. Several of her lovers openly betrayed her, leaving her humiliated and deeply depressed for long periods of time.

 

Her difficult emotional life was exacerbated at times by an increasing dependence on alcohol. She tried on several occasions to stop drinking but was never successful. Despite her heavy drinking, she was an empathetic interviewer and a skilled writer with a keen eye for detail. She rarely missed a deadline, even when writing under fire. 

 

Underlying the glamor of celebrity and the adrenaline rush of dodging gunfire was a committed humanitarian. In a diary entry written in Afghanistan in October 2010, she recorded her wish to “Use my skills as a writer to help those who can’t find justice anywhere else. Acting without fear – it matters. Show world what it cannot see firsthand. No death wish. I have too much to live for.”

 

Colvin rose to fame during the 1990s, before the rise of the Internet. During that time, she competed primarily with print journalists and television reporters from CNN and other international outlets. She covered the Egyptian revolution of 2011 from Tahrir Square and noted the new importance of social media in rallying opposition to authoritarian governments. She continued to believe in the importance of print journalists going out to the battle zone, conducting interviews, and presenting compelling stories to British and American readers.

 

As avid consumers of news, we often take the difficult work of journalists for granted. The murder of Jamal Khashoggi briefly focused attention on the dangers faced by reporters working in hostile environments. In Extremis is an instructive look at the motivations and operating methods of these men and women.  Readers will come away with a deeper understanding of the brutality of modern dictatorships and an appreciation for an amazing woman whose life was cut short trying to present us with the truth. 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171110 https://historynewsnetwork.org/article/171110 0
What History Can Teach Us About Capitalism, Socialism, and Inequality

 

 

Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

Capitalism based on the profit motive and private ownership is supposed to provide the best economic outcomes for everybody. As the owners of industry and commerce enrich themselves, they provide employment for the rest of the population. Everyone shares in economic growth, even if owners reap a greater share. The metaphor “a rising tide lifts all boats” describes this explanation of how capitalism should work.

 

Proponents of socialism argue that only the owners of capital profit from such a system. The great majority of people labor for the profit of a few. They propose an economic system based on social ownership and more equal distribution of profits.

 

These competing theories tend to leave out the role of government in shaping an economy and influencing the distribution of wealth. In every economic system, the state encourages and restricts economic activity, and funnels economic advantages to selected population groups. 

 

In all of the real existing systems that have called themselves socialist, social ownership has meant in practice government ownership. In every case since the creation of the Soviet Union, socialist governments have been dominated by a single political party that has not allowed any challenges to its power. Inevitably, this has led to the corruption of the ideal of popular ownership of the economy. Those in charge of socialist governments have given themselves and their close supporters economic privileges denied to the wider population, from the special access to goods and services enjoyed by members of the Communist Party in the Soviet Union and its Eastern European satellites to the accumulation of wealth by the leading families of Communist China.

 

The absence of democracy, the brutal repression of critical ideas, and the continuing economic weaknesses of the Soviet systems led to their collapse in 1989. But not all socialist states were so unsuccessful. China, which had one of the world’s poorest populations through the first half of the 20th century, has nearly eradicated extreme poverty, according to the World Bank. Although the Cuban economy is one of the most government-controlled in the world, the poverty level is very low, and education and health care rank high.

 

In the US, capitalism has sometimes worked to make all boats rise. A remarkable study last year of the history of national income, written by the foremost French researchers on income inequality, Thomas Piketty, Emmanuel Saez, and Gabriel Zucman, shows that from 1946 to 1980, real income doubled across the economic spectrum. That was also a period of extraordinary economic growth: gains of over 5% in gross domestic product in most years, and occasionally more than 10%. The top income tax rate for the richest Americans was higher than 85% until 1964, and then 70% until 1980. Nevertheless, the top 0.01% tripled their income after taxes in this period.

 

But since 1980, the story has been very different. The income of the poorer half of Americans has remained completely stagnant. The upper half has seen its income grow, but most of that growth has been at the very top: incomes of the top 1% have tripled, and that tiny rich slice earns almost twice as much before taxes as the whole bottom half. The few thousand families in the top .001% have multiplied their income 7 times. Our graduated income tax, along with other income-based payments like Medicaid, does redistribute money toward the bottom, but that hardly dents the huge inequality.

 

That’s due to political choices. The top tax rate has fallen steadily, to 50% in 1982, to 40% in 1993, and to 35% in 2003. The tax rate on capital gains from stocks, nearly all of which go to the wealthiest Americans, has also fallen from 40% to 20%. The real value of the minimum wage nearly tripled from 1940 to 1970, but has fallen ever since. One of the least discussed but most important policies contributing to growing inequality is the ability of the very rich to hide their income in international tax shelters. The leak of the so-called Panama Papers brought the illegal use of tax havens into the international spotlight: the anonymous leaker said he was motivated by “income inequality”. It is estimated that 10% of the world’s GDP is held in offshore banks, including about 8% of American GDP.

 

Corporations have contributed to rising inequality by boosting the incomes of top management. CEOs earned about 30 times the income of a typical worker in 1980. That ratio has since skyrocketed to about 300 times average wages.

 

Political choices continue to widen the economic gulf between the few and the many. The Republican tax reform of 2017 mainly benefitted the rich, notably by doubling the amount of money that can be left in an estate without being taxed, helping only a few thousand families.

 

Growing inequality is not only an American problem, but a global problem that keeps getting worse. Between 2010 and 2016, the total wealth owned by the poorest half of the world’s population fell by over one-third. At this moment, the world’s top 1% owns more than all the rest of us. The world’s economy keeps growing, but the yachts of the wealthiest are disappearing from view. Since 2000, the bottom half of the world’s population has gotten about 1% of the increase in global wealth. The top 1% took in half of that growth. The 8 richest men in the world now own as much as the poorer half of the global population, 3.6 billion people.

 

Rising inequality in the US has provoked louder discussion. Conservatives try to derail political discussions about economic inequality by talking about the “politics of envy”. Mitt Romney as presidential candidate in 2012 criticized President Obama’s concern for the poor: “I think it’s about envy. I think it’s about class warfare. When you have a President encouraging the idea of dividing America based on the 99 percent versus 1 percent—and those people who have been most successful will be in the 1 percent—you have opened up a whole new wave of approach in this country which is entirely inconsistent with the concept of one nation under God.” Scary diatribes about the failures of “socialism” are designed to support the status quo.

 

It’s a common mistake of both left and right to talk about capitalism and socialism as if there were only two choices. One-party socialist systems in less developed countries have not worked well over the past century. Capitalism as practiced in the United States and many other nations has mainly benefitted those who already are wealthy. The nations in which all citizens gain from economic growth have combined elements of market economies, private ownership, and political policies that mitigate inequality. In western Europe, public health care, nearly free university education, stronger progressive taxation, higher minimum wages, and inclusion of trade unions in corporate decision-making result in much lower inequality and much happier populations.

 

No American politician argues for replacing capitalism. The political choices of the past 40 years have weakened our national economy and our political unity by favoring the wealthy. The rising tide is swamping too many American boats. It’s time for a different politics.

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/blog/154177 https://historynewsnetwork.org/blog/154177 0
Bold Experimenter FDR Was Born 137 Years Ago

This caricature is by George Wachsteter (1911-2004) and ran in the April 14, 1938 edition of the New York Times. Drawing courtesy of the author. 

 

Franklin D. Roosevelt, who was born one hundred thirty-seven years ago today, is not among the four giants on Mount Rushmore. But that’s only because he had not yet been president when Gutzon Borglum began carving the monument in 1927. Unquestionably, Roosevelt is the most consequential American president since Abraham Lincoln.

The ambitious cousin of Theodore Roosevelt led the nation through the greatest trials since the Civil War: the Great Depression and World War II. FDR was a complex, even devious, man, and yet his presidency was one of extraordinary accomplishment.

Roosevelt embodied many characteristics of leadership that we should expect of a president. One of them was a sense of empathy and appreciation for people less fortunate. An important influence on him, of course, was Eleanor Roosevelt.

Also significant were his own health challenges, both public and private, in dealing with the effects of polio. He showed his personal commitment to helping others similarly afflicted at the facility he established in Warm Springs, Georgia.

Of course, FDR was a premier communicator.  His first inaugural address and his Fireside Chats helped rally the nation during the depths of the Great Depression, something that his predecessor was unable to do.  The nation had confidence in him when it sent its soldiers and sailors to fight in Europe and the Pacific in the 1940s.

But what is so extraordinary about Roosevelt’s leadership is his embrace of experimentation, something which he identified even before he became president, and he did it in Atlanta. Oglethorpe University’s president, Thornwell Jacobs, had invited the New York governor and part-time Georgia resident to speak in 1931, but FDR postponed the visit to May 22, 1932, when he addressed the graduating class at the Fox Theatre.  

He used the platform to outline his approach to the crisis gripping the nation. Following the stagnant response by Herbert Hoover, Roosevelt presented a general but aggressive approach. Crucial to his address that Sunday night was a call for “bold, persistent experimentation.” Implicit was a call for greater engagement at the national level.

He offered hope by suggesting that central, coordinated planning could address the problem. The soon-to-be-nominated presidential candidate laid out the basis for the New Deal and provided the groundwork for citizens looking to the chief executive as a focal point of American governance.

In the following years, Roosevelt tried one new program after another, seeking to bolster the economy, provide a safety net for the most vulnerable, and create ways to employ people while enhancing the nation. Among the greatest achievements of the latter were the Civilian Conservation Corps, the Works Progress Administration, and the Federal Writers’ Project.

The Oglethorpe speech to a small graduating class in downtown Atlanta not only presaged the national response to a great challenge; it also suggested the beginnings of the modern presidency. Roosevelt would be elected president four times, a unique achievement, but he was also a problem-solving leader, not hamstrung by ideology but liberated by effective pragmatism. That quality should be compelling in the twenty-first century as we confront what appear to be a variety of intractable problems.

 

For more by Joseph A. Esposito, read his book: 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171095 https://historynewsnetwork.org/article/171095 0
Roundup Top 10!

Voter suppression carries slavery's three-fifths clause into the present

by Imani Perry

The Georgia governor’s election was the latest example of how James Madison’s words continue to shape our views on race.

 

Kevin Kruse and Julian Zelizer: Howard Schultz could re-elect Donald Trump

by Kevin Kruse and Julian Zelizer

Schultz could easily play the same spoiler role in 2020.

 

Jeremi Suri: The Case for Howard Schultz

by Jeremi Suri

"Third-party figures did not play spoiler to particular candidates; they were reflections of the weaknesses of the main-party candidates from the start."

 

 

The Women’s Suffrage Movement Included More Than Two Women And So Should The Monuments

by Michelle Duster

During a time when Black women’s votes are more pivotal than ever, our leadership and contributions to the Suffrage Movement must be honored.

 

 

Trump's Troop Ban Is Part of a Long, Dark History of Accusing Trans People of Threatening National Stability

by Julio Capó Jr.

People gendered differently from what was assigned to them at birth have served in the American military since at least the 18th century.

 

 

After Donald Trump, we need a Washington outsider like Jimmy Carter in the 2020 race

by Jonathan Zimmerman

Carter wasn't a particularly effective president. But he was the right candidate for the moment, a decent man who healed the hole in our national heart.

 

 

The State of the Union shifts power to the president. Pelosi took it back.

by Kathryn Cramer Brownell

The shutdown upended the ‘bully pulpit’ Trump’s predecessors have used.

 

 

Xi’s China Is Steamrolling Its Own History

by Pamela Kyle Crossley

“Historical nihilism” is nothing more than a denial that the past is fundamentally a resource to be plundered by the present.

 

 

The Green Book’s Black History

by Brent Staples

Lessons from the Jim Crow-era travel guide for African-American elites.

 

 

Our Man From Boeing

by Mandy Smithberger and William D. Hartung

Has the Arms Industry Captured Trump’s Pentagon?

 

 

Why the decision to wear MAGA hats matters

by Matthew A. Sears

Political symbols signal political beliefs — and their usage can shape the course of history.

 

]]>
Fri, 22 Feb 2019 20:42:59 +0000 https://historynewsnetwork.org/article/171111 https://historynewsnetwork.org/article/171111 0