Photographs From My First Trip to China

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

I went to China in 1989 with a group of former Jewish residents of Shanghai to celebrate the first open Passover Seder in Shanghai in decades. Our visit, which included the first Israeli journalists allowed into Communist China, was part of the more general liberalization in Chinese politics and economics. Passover began on April 20.

 

I was very happy to be on this trip, because I was just beginning to research the flight of about 18,000 Jews from Nazi-occupied Europe to Shanghai after 1938. My grandparents had been among them, leaving Vienna in 1939. I was able to do eight interviews, see the neighborhoods where my fellow travelers had lived, and even get into my grandparents' former apartment. We met Chinese scholars who were doing research on Jews in Shanghai.

 

I like to wander around new cities with my camera. In Shanghai, and a few days later in Beijing, I saw things I didn't expect: students protesting the lack of freedom and democracy in China. Nobody I asked knew what was going on. I took these photographs, and a few others. I wish I had taken many more. These images were originally on 35mm slides, and were printed by Jim Veenstra of JVee Graphics in Jacksonville.

 

A Shanghai Street Protest, Shanghai, April 22, 1989

 

I was wandering in downtown Shanghai when a wave of young people came out of nowhere and marched past me. I had no idea what they were doing. Chinese streets are often filled with bicycles and pedestrians, but this was different. Protests had begun in Beijing a week earlier, and had spread to other cities already, but we knew nothing about them.

 

 

Tiananmen Press Freedom, Beijing, April 24, 1989

 

After a few days of personal tourism and memories in Shanghai, our group flew to Beijing to be tourists. Tiananmen Square is in the center of the city, right in front of one of the most important tourist sites, the Forbidden City, home of the emperors for 500 years. Students had been gathering there for over a week already. The sign in English advocating “A Press Freedom” was surprising, since there were very few signs in English at that time. The flowers on the monument are in memory of Hu Yaobang.

 

 

Summer Palace, Beijing, April 25, 1989

 

The next day we visited the Summer Palace on the outskirts of the old city, built in the 18th century as a lake retreat for the imperial family. Buses filled with schoolchildren and adult tourists poured into the parking lot. At the entrance, students displayed these signs in English for every visitor to see, requesting support for their movement. Our guides could not or would not comment on them.

 

 

 

Inside Summer Palace, Beijing, April 25, 1989

 

Inside the grounds of the Summer Palace, students were collecting funds for their cause: democracy and freedom in Chinese life. The bowl is filled with Chinese and foreign currency.

 

 

Beijing Street March, Beijing, c. April 25, 1989

 

I'm not sure exactly when or where I took this photograph. We were taken around to various sites in a bus, including some factories in the center of Beijing. We were not able to keep to the planned schedule, because the bus kept getting caught in unexpected traffic. I believe I took this photo out of the window of our bus, when it was stopped. The bicyclists and the photographer in front of the marchers show the public interest in these protests.

 

 

 

Our Chinese trip was supposed to last until April 30, but the last few days of our itinerary were suddenly cancelled, and we were flown to Hong Kong. There was no official public reaction to the protests we saw, but government leaders were arguing in their offices over the proper response. I was struck by the peaceful nature of the protests I had seen and the interest shown by the wider Chinese public. The protests spread to hundreds of Chinese cities in May, and Chinese students poured into Beijing.

 

On May 20, the government declared martial law. Student protesters were characterized as terrorists and counter-revolutionaries under the influence of Americans who wanted to overthrow the Communist Party. Those who had sympathized with the students were ousted from their government positions and thousands of troops were sent to clear Tiananmen Square. Beginning on the night of June 3, troops advanced into the center of the city, firing on protesters. Local residents tried to block military units. On June 4, Tiananmen Square was violently cleared. "Tank Man" made his stand on June 5.

 

All the Communist governments in Eastern Europe were overthrown in 1989. The Soviet Union collapsed in 1991. The Chinese government survived by repressing this protest movement. Since then, all discussion of the 1989 protests has been forbidden. Western tourists on Tiananmen Square are sometimes asked by local residents what happened there.

 

I wonder what happened to the students pictured in these photos.

Roundup Top 10!  

What Naomi Wolf and Cokie Roberts teach us about the need for historians

by Karin Wulf

Without historical training, it’s easy to make big mistakes about the past.

 

Free Speech on Campus Is Doing Just Fine, Thank You

by Lee C. Bollinger

Norms about the First Amendment are evolving—but not in the way President Trump thinks.

 

 

Don't buy your dad the new David McCullough book for Father's Day

by Neil J. Young

McCullough appears to have written the perfect dad book, but its romantic view is the book's danger.

 

 

Voter Restrictions Have Deep History in Texas

by Laurie B. Green

Texas’ speedy ratification of the 19th Amendment represents a beacon for women’s political power in the U.S., but a critical assessment of the process it took to win it tells us far more about today’s political atmosphere and cautions us to compare the marketing of voting rights laws with their actual implications.

 

 

What Does It Mean to be "Great" Amidst Global Climate Change?

by David Bromwich

How can Robert Frost, Graham Greene, Immanuel Kant, and others help us understand values and climate change?

 

 

How the Central Park Five expose the fundamental injustice in our legal system

by Carl Suddler

The Central Park Five fits a historical pattern of unjust arrests and wrongful convictions of black and Latino young men in the United States.

 

 

The biggest fight facing the U.S. women’s soccer team isn’t on the field

by Lindsay Parks Pieper and Tate Royer

The history of women in sports and the discrimination they have long faced.

 

 

I Needed to Save My Mother’s Memories. I Hacked Her Phone.

by Leslie Berlin

After she died, breaking into her phone was the only way to put together the pieces of her digital life.

 

 

How to Select a Democrat to Beat Trump in 2020

by Walter G. Moss

In a Democratic presidential candidate for 2020 we want someone who possesses the major wisdom virtues, virtues that will assist him/her to further the common good. In addition, we need someone with a progressive unifying vision.

 

 

Warren Harding Was a Better President Than We Think

by David Harsanyi

An analysis of presidential rankings and a defense of Warren G. Harding.

 

Women Have Fought to Legalize Reproductive Rights for Nearly Two Centuries

Image from "Marriage and Its Discontents at the Turn of the Century"

 

 

Mississippi state representative Douglas McLeod was arrested last week for punching his wife when she didn’t undress fast enough for sex. When deputies arrived, he answered the door visibly intoxicated, with a drink in his hand, and yelled, “Are you kidding me?” Police found blood all over the bed and floor, and had to reassure his frightened wife, who stood shaking at the top of the stairs, that they would protect her. In January, McLeod had co-sponsored a bill making abortions in Mississippi illegal after detection of a fetal “heartbeat” beginning at six weeks, before most women even know they are pregnant.  

 

In both of these scenarios, one thing is clear – Douglas McLeod believes he has such a right to his wife’s body (and other women’s bodies) that he is willing to violently and forcefully impose it. 

 

Even more clear is the fact that for nearly two centuries, women’s rights reformers have fought to make reproductive rights legal, precisely because of men like Douglas McLeod. 

 

Beginning in the 1840s, women’s rights reformers openly deplored women’s subjugation before the law – which, of course, was created and deployed by men. Temperance advocates in the nineteenth century pointed especially to alcohol as a major cause of the abuse and poverty of helpless women and children. As one newspaper editor wrote, “Many… men believe that their wife is as much their property as their dog and horse and when their brain is on fire with alcohol, they are more disposed to beat and abuse their wives than their animals…. Every day, somewhere in the land, a wife and mother – yes, hundreds of them – are beaten, grossly maltreated and murdered by the accursed liquor traffic, and yet we have men who think women should quietly submit to such treatment without complaint.”(1)

 

But of course women were never silent about their lowly status in a patriarchal America. As one of the first practicing woman lawyers in the United States, Catharine Waugh McCulloch argued in the 1880s that “Women should be joint guardians with their husbands of their children. They should have an equal share in family property. They should be paid equally for equal work. Every school and profession should be open to them. Divorce and inheritance should be equal. Laws should protect them from man’s greed…and man’s lust…”(2)

 

Indeed, the idea of “man’s lust” and forced maternity was particularly abhorrent to these activists. In the nineteenth century, most women did not publicly support the legalization of birth control and abortion, but there were complex reasons for this reluctance. In a world where women had little control over the actions of men, reformers rightly noted that legalizing contraceptives and abortion would simply allow men to abuse and rape women with impunity and avoid the inconvenient problem of dependent children.

 

Instead, many suffragists and activists embraced an idea called voluntary motherhood. The theoretical foundations of this philosophy would eventually become part of the early birth control movement (and later the fight for legal abortion in the twentieth century). Simply put, voluntary motherhood was the notion that women could freely reject their husbands’ unwanted sexual advances and choose when they wanted to have children. In an era when marital rape law did not exist, this was a powerful way for women to assert some autonomy over their own bodies. As scholar Linda Gordon has written, it is thus unsurprising that women – even the most radical of activists – did not support abortion or contraception because “legal, efficient birth control would have increased men’s freedom to indulge in extramarital sex without greatly increasing women’s freedom to do so even had they wanted to.”(3) But the ideas underpinning voluntary motherhood promised to return a measure of power to women. 

 

Of course, the nineteenth-century criminalization of abortion and birth control in state legislatures was openly about restricting women’s freedom altogether. As Dr. Horatio Storer wrote, “the true wife” does not seek “undue power in public life… undue control in domestic affairs… or privileges not her own.”(4) Beginning in the 1860s, under pressure from physicians like Storer and the newly organized American Medical Association (who wanted to professionalize and control the discipline of medicine), every state in the union began passing laws criminalizing abortion and birth control. Physicians saw their role as the safeguard not only of Americans’ physical health, but the very health of the republic. They, along with other male leaders, viewed the emergent women’s suffrage movement, rising immigration, slave emancipation, and other social changes with alarm. Worried that only white, middle-class women were seeking abortion, doctors and lawmakers sought to criminalize contraceptives and abortion in order to ensure the “right” kind of women were birthing the “right” kind of babies.

 

The medical campaigns to ban abortion were then bolstered by the federal government’s passage of the 1873 Comstock Act, which classified birth control, abortion, and contraceptive information as obscenity under the law. Violating the Act carried steep fines and prison time. Abortion and birth control then remained illegal for essentially the next century, until the Supreme Court finally ruled in two cases, Griswold v. Connecticut (1965) and Roe v. Wade (1973), that both were matters to be considered under the doctrine of privacy between patient and physician. The second-wave feminist movement simultaneously transformed the older idea of voluntary motherhood, which asserted that women did not have to have sex or bear children, into the more radical notion that women could, and should, enjoy sex without fear of becoming pregnant.

 

Anti-abortion activists today thus know that they cannot openly advocate for broadly rescinding women’s human and legal rights. Instead, in order to achieve their agenda, they cannily focus on the rights of the unborn, or “fetal personhood,” and the false pretense of “protecting” women’s health. But it is crystal clear that the recent spate of laws criminalizing abortion in states like Georgia, Ohio, Alabama, and Douglas McLeod’s home state of Mississippi has nothing to do with babies or health. Instead, these laws flagrantly reproduce the long history of men’s legal control over women. It is not a coincidence that women make up less than 14% of Mississippi’s legislative body – the lowest proportion in the country. McLeod’s behavior and arrest may have taken place in May of 2019, but his actions – both at home and in the legislature – look no different from those of his historical male counterparts. The only difference is that neither he nor his colleagues are willing to admit it.

 

(1) The Woman’s Standard (Waterloo, IA), Volume 3, Issue 1 (1888), p. 2. 

(2) “The Bible on Women Voting,” undated pamphlet, Catharine Waugh McCulloch Papers, Dillon Collection, Schlesinger Library. 

(3) Linda Gordon, The Moral Property of Women: A History of Birth Control Politics in America (University of Illinois Press, 2002).

(4) Horatio Robinson Storer, Why Not? A Book for Every Woman (Boston: Lee and Shepard, 1868). Quoted in Leslie Reagan, When Abortion Was a Crime.

The President is Disrupting the U.S. Economy

Donald Trump has been president for only two of the ten years of America’s economic expansion since the Great Recession, yet he eagerly takes full credit for the nation’s advancement. It has been easy for him to boast because he had the good fortune to occupy the White House during a mature stage of the recovery. The president’s fans attribute booming markets and low unemployment to his leadership even though Trump’s words and actions at the White House have often broken the economy’s momentum. In recent months, especially, Trump’s interference in business affairs has put U.S. and global progress at risk.

 

An article that appeared in the New York Times in May 2019 may offer some clues for understanding why the American president has been less than skillful in managing the country’s financial affairs. Tax records revealed by the Times show that from 1985 to 1994 Donald Trump lost more than a billion dollars on failed business deals. In some of those years Trump sustained the biggest losses of any American businessman. The Times could not judge Trump’s gains and losses for later years because Trump, unlike all U.S. presidents in recent decades, refuses to release his tax information. Nevertheless, details provided by the Times are relevant to a promise Trump made during the 2016 campaign. Candidate Trump advertised himself as an extraordinarily successful developer and investor who would do for the country what he had done for himself. Evidence provided by the Times suggests that promise does not inspire confidence.

 

Trump’s intrusions in economic affairs turned aggressive and clumsy in late 2018. An early sign of the shift came when he demanded $5.7 billion from Congress for construction of a border wall. House Democrats, fresh off impressive election gains, stated clearly that they would not fund the wall. The president reacted angrily, closing sections of the federal government. Approximately 800,000 employees took furloughs or worked without pay. Millions of Americans were not able to use important government services. When the lengthy shutdown surpassed all previous records, Trump capitulated. The Congressional Budget Office estimated that Trump’s counterproductive intervention cost the U.S. economy $11 billion. 

 

President Trump’s efforts to engage the United States in trade wars produced more costly problems. Trump referred to himself as “Tariff Man,” threatening big levies on Chinese imports. Talk of a trade war spooked the stock markets late in 2018. Investors worried that China would retaliate, inflating consumer prices and risking a global slowdown. Then Trump appeared to back away from confrontations. The president aided a market recovery by tweeting, “Deal is moving along very well . . .  Big progress being made!”

 

Donald Trump claimed trade wars are “easy to win,” but the market chaos of recent months suggested they are not. When trade talks deteriorated into threats and counter-threats, counter-punching intensified. In May 2019, China pulled away from negotiations, accusing the Americans of demanding unacceptable changes. Trump responded with demands for new tariffs on Chinese goods. President Trump also threatened to raise tariffs against the Europeans, Canadians, Japanese, Mexicans, and others. U.S. and global markets lost four trillion dollars during the battles over trade in May 2019. Wall Street’s decline wiped out the value of all gains American businesses and citizens realized from the huge tax cut of December 2017. 

 

President Trump’s confident language about the effectiveness of tariffs conceals their cost. Tariffs create a tax that U.S. businesses and the American people need to pay in one form or another. Tariffs raise the cost of consumer goods. They hurt American farmers and manufacturers through lost sales abroad. They harm the economies of China and other nations, too (giving the U.S. negotiators leverage when demanding fairer trade practices), but the financial hits created by trade wars produce far greater monetary losses than the value of trade concessions that can realistically be achieved currently. 

 

Agreements between trading partners are best secured through carefully studied and well-informed negotiations that consider both the short and long-term costs of conflict. The present “war” is creating turmoil in global markets. It is breaking up manufacturing chains, in which parts that go into automobiles and other products are fabricated in diverse countries. Many economists warn that the move toward protectionism, championed especially by President Trump, can precipitate a global recession.

 

An approach to trade much like Trump’s had unfortunate effects early in the Great Depression. In 1930 the U.S. Congress passed the protectionist Smoot-Hawley Tariff Act that placed tariffs on 20,000 imported goods. America’s trading partners responded with their own levies. Retaliatory actions in the early 1930s put a damper on world trade and intensified the Depression. After World War II, U.S. leaders acted on lessons learned. They promoted tariff reduction and “free trade.” Their strategy proved enormously successful. Integrated trade gave nations a stake in each other’s economic development. The new order fostered seventy years of global peace and prosperity. Now, thanks to a president who acts like he is unaware of this history, the United States is promoting failed policies of the past.

 

It is not clear how the current mess will be cleaned up. Perhaps the Chinese will bend under pressure. Maybe President Trump will agree to some face-saving measures, accepting cosmetic adjustments in trade policy and then declaring a victory. Perhaps Trump will remain inflexible in his demands and drag global markets down to a more dangerous level. Markets may recover, as they did after previous disruptions provoked by the president’s tweets and speeches. Stock markets gained recently when leaders at the Federal Reserve hinted of future rate cuts. It is clear, nevertheless, that battles over tariffs have already created substantial damage.

 

Pundits have been too generous in their commentaries on the president’s trade wars. Even journalists who question Trump’s actions frequently soften their critiques by saying the president’s tactics may be justified. American corporations find it difficult to do business in China, they note, and the Chinese often steal intellectual property from U.S. corporations. Pundits also speculate that short-term pain from tariff battles might be acceptable if China and other nations accept more equitable trade terms. Some journalists are reluctant to deliver sharp public criticism of Trump’s policy. They do not want to undermine U.S. negotiators while trade talks are underway.

 

American businesses need assistance in trade negotiations, but it is useful to recall that the expansion of global trade fostered an enormous business boom in the United States. For seven decades following World War II many economists and political leaders believed that tariff wars represented bad policy. Rejecting old-fashioned economic nationalism, they promoted freer trade. Their wisdom, drawn from a century of experience with wars, peace and prosperity, did not suddenly become irrelevant after Donald Trump’s inauguration. Unfortunately, when President Trump championed trade wars, many Americans, including most leaders in the Republican Party, stood silent or attempted to justify the radical policy shifts. 

 

Since the time Donald Trump was a young real estate developer, he has demonstrated little interest in adjusting beliefs in the light of new evidence. Back in the 1980s, when Japan looked like America’s Number One economic competitor, Donald Trump called for economic nationalism, much like he does today. “America is being ripped off” by unfair Japanese trade practices, Trump protested in the Eighties. He recommended strong tariffs on Japanese imports. If U.S. leaders had followed Donald Trump’s advice in the Eighties, they would have limited decades of fruitful trade relations between the two countries.

 

America’s and the world’s current difficulties with trade policy are related, above all, to a single individual’s fundamental misunderstanding of how tariffs work. Anita Kumar, Politico’s White House correspondent and associate editor, identified Trump’s mistaken impressions in an article published May 31, 2019. She wrote, “Trump has said that he thinks tariffs are paid by the U.S.’s trading partners but economists say that Americans are actually paying for them.” Kumar is correct: Americans are, indeed, paying for that tax on imports. This observation about Trump’s misunderstanding is not just the judgment of one journalist. Many commentators have remarked on the president’s confusion regarding who pays for tariffs and how various trading partners suffer from them.

 

The United States’ economy proved dynamic in the decade since the Great Recession thanks in large part to the dedication and hard work of enterprising Americans. But in recent months the American people’s impressive achievements have been undermined by the president’s clumsy interventions. It is high time that leaders in Washington acknowledge the risks associated with the president’s trade wars and demand a more effective policy course. 

Political Corruption Underwrites America’s Gun-Control Nightmare

Reprinted from The Hidden History of Guns and the Second Amendment with the permission of Berrett-Koehler Publishers. Copyright © 2019 by Thom Hartmann.

At bottom, the Court’s opinion is thus a rejection of the common sense of the American people, who have recognized a need to prevent corporations from undermining self-government since the founding, and who have fought against the distinctive corrupting potential of corporate electioneering since the days of Theodore Roosevelt. It is a strange time to repudiate that common sense. While American democracy is imperfect, few outside the majority of this court would have thought its flaws included a dearth of corporate money in politics.

—Justice John Paul Stevens’s dissent in Citizens United 

 

Parkland shooting survivor and activist David Hogg once asked, when Sen. John McCain, R-Ariz., was still alive, why McCain had taken more than $7 million from the NRA (not to mention other millions that they and other “gun rights” groups spent supporting him indirectly). 

 

McCain’s answer, no doubt, would be the standard politician-speak these days: “They support me because they like my positions; I don’t change my positions just to get their money.” It’s essentially what Sen. Marco Rubio, R-Fla., told the Parkland kids when he was confronted with a similar question. 

 

And it’s a nonsense answer, as everybody knows. 

 

America has had an on-again, off-again relationship with political corruption that goes all the way back to the early years of this republic. Perhaps the highest level of corruption, outside of today, happened in the late 1800s, the tail end of the Gilded Age. (“Gilded,” of course, refers to “gold coated or gold colored,” an era that Donald Trump has tried so hard to bring back that he even replaced the curtains in the Oval Office with gold ones.) 

 

One of the iconic stories from that era was that of William Clark, who died in 1925 with a net worth in excess, in today’s money, of $4 billion. He was one of the richest men of his day, perhaps second only to John D. Rockefeller. And in 1899, Clark’s story helped propel an era of political cleanup that reached its zenith with the presidency of progressive Republicans (that species no longer exists) Teddy Roosevelt and William Howard Taft. 

 

Clark’s scandal even led to the passage of the 17th Amendment, which let the people of the various states decide who would be their U.S. senators, instead of the state legislatures deciding, which was the case from 1789 until 1913, when that amendment was ratified. 

 

By 1899, Clark owned pretty much every legislator of any consequence in Montana, as well as all but one newspaper in the state. Controlling both the news and the politicians, he figured they’d easily elect him to be the next U.S. senator from Montana. Congress later learned that he not only owned the legislators but in all probability stood outside the statehouse with a pocket full of $1,000 bills (literally: they weren’t taken out of circulation until 1969 by Richard Nixon), each in a plain white envelope to hand out to every member who’d voted for him.

 

When word reached Washington, DC, about the envelopes and the cash, the US Senate began an investigation into Clark, who told friends and aides, “I never bought a man who wasn’t for sale.” 

 

Mark Twain wrote of Clark, “He is as rotten a human being as can be found anywhere under the flag; he is a shame to the American nation, and no one has helped to send him to the Senate who did not know that his proper place was the penitentiary, with a chain and ball on his legs.” 

 

State Senator Fred Whiteside, who owned the only non-Clark-owned newspaper in the state, the Kalispell Bee, led the big exposé of Clark’s bribery. The rest of the Montana senators, however, ignored Whiteside and took Clark’s money.

 

The US Senate launched an investigation in 1899 and, sure enough, found out about the envelopes and numerous other bribes and emoluments offered to state legislators, and refused to seat him. The next year, Montana’s corrupt governor appointed Clark to the Senate, and he served a full eight-year term. 

 

Clark’s story went national and became a rallying cry for clean-government advocates. In 1912, President Taft, after doubling the number of corporations being broken up by the Sherman Anti-Trust Act over what Roosevelt had done, championed the 17th Amendment (direct election of senators, something some Republicans today want to repeal) to prevent the kind of corruption that Clark represented from happening again. 

 

Meanwhile, in Montana, while the state legislature was fighting reforms, the citizens put a measure on the state ballot of 1912 that barred corporations from giving any money of any sort to politicians. That same year, Texas and other states passed similar legislation (the corrupt House majority leader Tom DeLay, R-Texas, was later prosecuted under the Texas law).

 

Montana’s anticorruption law, along with those of numerous other states, persisted until Justice Anthony Kennedy, writing for the five-vote majority on the U.S. Supreme Court, declared in the 2010 Citizens United decision that in examining more than 100,000 pages of legal opinions, he could not find “. . . any direct examples of votes being exchanged for . . . expenditures. This confirms Buckley’s reasoning that independent expenditures do not lead to, or create the appearance of, quid pro quo corruption [Buckley is the 1976 decision that money equals free speech]. In fact, there is only scant evidence that independent expenditures even ingratiate. Ingratiation and access, in any event, are not corruption.”

 

The US Supreme Court, following on the 1976 Buckley case that grew straight out of the Powell Memo and was written in part by Justice Lewis Powell, turned the definitions of corruption upside down.

 

Two years later, the Court overturned the Montana law itself in the 2012 American Tradition Partnership, Inc. v. Bullock ruling, essentially saying that money doesn’t corrupt politicians, particularly if that money comes from corporations that can “inform” us about current issues (the basis of the Citizens United decision) or billionaires (who, apparently the right-wingers on the Court believe, obviously know what’s best for the rest of us).

 

Thus, the reason the NRA can buy and own senators like McCain and Rubio (and Thom Tillis, R-N.C./$4 million; Cory Gardner, R-Colo./$3.8 million; Joni Ernst, R-Iowa/$3 million; and Rob Portman, R-Ohio/$3 million, who all presumably took money much faster and much more recently than even McCain) is that the Supreme Court has repeatedly said that corporate and billionaire money never corrupts politicians. (The dissent in the Citizens United case is a must-read: it’s truly mind-boggling and demonstrates beyond refutation how corrupted the right-wingers on the Court, particularly Scalia and Thomas—who regularly attended events put on by the Kochs—were by billionaire and corporate money.)

 

So here America stands. The Supreme Court has ruled, essentially, that the NRA can own all the politicians they want and can dump unlimited amounts of poison into this nation’s political bloodstream. 

 

Meanwhile, angry white men who want to commit mass murder are free to buy and carry all the weaponry they can afford. 

What We Can't Forget As We Remember Anne Frank

 

On grocery store checkout lines around the country this month, amidst the candy bars and zany tabloids, shoppers will find a glossy 96-page magazine called “Anne Frank: Her Life and Legacy.” Unfortunately, it fails to explain one of the most important but little-known aspects of the Anne Frank story—how her life could have been saved by President Franklin D. Roosevelt. 

 

The new Anne Frank publication, compiled by the staff of Life magazine, is filled with photographs of Anne and her family, and a breezy overview of her childhood, tragically cut short by the Nazi Holocaust. Today, June 9, would have been her 90th birthday. 

 

Little Anne, “thin as a wisp, curious, mercurial, and a know-it-all” at first enjoyed an idyllic life, “but outside the family circle, the world was changing,” Life recounts. Economic and social crises in Germany propelled Adolf Hitler to power in 1933, and Anne’s father, Otto, quickly moved the family to neighboring Holland for safety.

 

When World War II erupted in 1939, Life reports, Otto “frantically searched for ways to get his family away from the growing conflict” and “he hoped to emigrate to the United States.”

 

That’s all. No accounting of what happened when the Franks sought to emigrate to the United States. No explanation as to why the Roosevelt administration refused to open America’s doors to Anne Frank or countless other Jewish children. 

 

Just the one vague allusion to Otto’s “hope,” and then quickly back to the famous story of Anne hiding in the Amsterdam attic and writing entries in her diary.

 

Here’s the part of the story that Life left out.

 

Laws enacted by the U.S. Congress in the 1920s created a quota system to severely restrict immigration. Roosevelt wrote at the time that immigration should be sharply restricted for “a good many years to come” so there would be time to “digest” those who had already been admitted. He argued that future immigration should be limited to those who had “blood of the right sort”—they were the ones who could be most quickly and easily assimilated, he contended.  

 

As president (beginning in 1933), Roosevelt took a harsh immigration system and made it much worse. His administration went above and beyond the existing law, to ensure that even those meager quota allotments were almost always under-filled. American consular officials abroad made sure to “postpone and postpone and postpone the granting of the visas” to refugees, as one senior U.S. official put it in a memo to his colleagues. They piled on extra requirements and created a bureaucratic maze to keep refugees like the Franks far from America’s shores.

 

The quotas for immigrants from Germany and (later) Axis-occupied countries were filled in only one of Roosevelt’s 12 years in office. In most of those years, the quotas were less than 25% full. A total of 190,000 quota places that could have saved lives were never used at all.

 

Otto Frank, Anne's father, filled out the small mountain of required application forms and obtained the necessary supporting affidavits from the Franks’ relatives in Massachusetts. But that was not enough for those who zealously guarded America's gates against refugees. 

 

Anne’s mother, Edith, wrote to a friend in 1939: "I believe that all Germany's Jews are looking around the world, but can find nowhere to go."

 

That same year, refugee advocates in Congress introduced the Wagner-Rogers bill, which would have admitted 20,000 refugee children from Germany outside the quota system. Anne Frank and her sister Margot were German citizens, so they could have been among those children.

 

Supporters of the bill assembled a broad, ecumenical coalition--including His Eminence George Cardinal Mundelein, one of the country’s most important Catholic leaders; New York City Mayor Fiorello La Guardia; Hollywood celebrities such as Henry Fonda and Helen Hayes; and 1936 Republican presidential nominee Alf Landon and his running mate, Frank Knox. Former First Lady Grace Coolidge announced that she and her neighbors in Northampton, Massachusetts, would personally care for twenty-five of the children.

 

Even though there was no danger that the children would take jobs away from American citizens, anti-immigration activists lobbied hard against the Wagner-Rogers bill. President Roosevelt’s cousin, Laura Delano Houghteling, who was the wife of the U.S. Commissioner of Immigration, articulated the sentiment of many opponents when she remarked at a dinner party that “20,000 charming children would all too soon grow up into 20,000 ugly adults.” FDR himself refused to support the bill. By the spring of 1939, Wagner-Rogers was dead.

 

But Wagner-Rogers was not the only way to help Jewish refugees. Just a few months earlier, in the wake of Germany’s Kristallnacht pogrom, the governor and legislative assembly of the U.S. Virgin Islands offered to open their territory to Jews fleeing Hitler. Treasury Secretary Henry Morgenthau, Jr. endorsed the proposal. 

 

That one tiny gesture by President Roosevelt—accepting the Virgin Islands leaders’ offer—could have saved a significant number of Jews. But FDR rejected the plan. He and his aides feared that refugees would be able to use the islands as a jumping-off point to enter the United States itself.

 

At a press conference on June 5, 1940, the president warned of the “horrible” danger that Jewish refugees coming to America might actually serve the Nazis. They might begin “spying under compulsion” for Hitler, he said, out of fear that if they refused, their elderly relatives back in Europe “might be taken out and shot.” 

 

That's right: Anne Frank, Nazi spy.

 

In fact, not a single instance was ever discovered of a Jewish refugee entering the United States and spying for the Nazis. But President Roosevelt did not shy away from using such fear-mongering in order to justify slamming shut America’s doors.

 

The following year, the administration officially decreed that no refugee with close relatives in Europe could come to the United States.

 

Anne and Margot Frank, and countless other German Jewish refugee children, were kept out because they were considered undesirable. They didn’t have what FDR once called “blood of the right sort.” One year after the defeat of Wagner-Rogers, Roosevelt opened America’s doors to British children to keep them safe from the German blitz. Those were the kind of foreigners he preferred.

 

Life magazine’s tribute to Anne Frank is touching. The photos fill our hearts with pity. But by failing to acknowledge what the Roosevelt administration did to keep the Jews out, Life’s version of history misses a point that future generations need to remember: pity is not enough to help people who are trying to escape genocide.

What the Feud and Reconciliation between John Adams and Thomas Jefferson Teaches Us About Civility

 

Donald Trump did not invent the art of the political insult, but he has driven vitriolic public discourse and incivility to a new low unmatched by other presidents. Still, in a tainted tradition that has permeated our history, other presidents have not been immune to dishing out acerbic insults against one another.

 

John Quincy Adams was livid that Harvard University planned to award President Andrew Jackson with an honorary degree. He wrote in his diary that Jackson was “a barbarian who could not write a sentence of grammar and hardly could spell his own name.”

 

Franklin Pierce was not as impressed with Abraham Lincoln as history has been, declaring the day after Lincoln issued the Emancipation Proclamation that the president had “limited ability and narrow intelligence.” 

 

The list of spicy presidential insults goes on and on. While such statements are often laugh-aloud funny, they are also shocking and sobering. How can these men who have reached the pinnacle of political power be so crude and demeaning? We can learn a valuable lesson from the friendship and feud between John Adams and Thomas Jefferson, and their ultimate reconciliation.

 

In 1775, the 32-year-old Virginia born-and-bred Jefferson traveled from his mountain-top Monticello mansion to the bustling city of Philadelphia to serve as a delegate to the Second Continental Congress.

 

Sometime in June that year after Jefferson arrived in the City of Brotherly Love, he met for the first time one of the most prominent and outspoken leaders of the resistance to British domination – John Adams. The Massachusetts attorney was the soft-spoken Jefferson’s senior by seven years. But neither their opposite personalities, age differences, or geographical distance separating their homes stood in the way of the start of a remarkable relationship that would span more than a half-century. 

 

They forged a unique and warm partnership, both serving on the committee to draft a declaration of independence from British rule. According to Adams, Jefferson had “the reputation of a masterly pen,” and was therefore tasked with using his writing skills to draft the document. Jefferson was impressed with how Adams so powerfully defended the draft of the document on the floor of the congress, even though he thought Adams was “not graceful, not elegant, not always fluent in his public addresses.”

 

In the 1780s, they found themselves thrown together once again as diplomats in Europe representing the newly minted United States. These collaborators and their families were friends.

 

But by 1796, their friendship was obliterated by the rise of political parties with starkly different visions of the new American experiment. With his election that year as the nation’s second president, the Federalist Adams found himself saddled with Jefferson as his vice president representing the Democratic-Republican Party. Tensions were high between the two men. 

 

Just three months after their inauguration as the embryonic nation’s top two elected officials, Jefferson privately groused to a French diplomat that President Adams was “distrustful, obstinate, excessively vain, and takes no counsel from anyone.” Weeks later, Adams spewed out his frustration, writing in a private letter that his vice president had “a mind soured, yet seeking for popularity, and eaten to a honeycomb with ambition, yet weak, confused, uninformed, and ignorant.” 

 

When Jefferson ousted Adams from the presidency in the election of 1800, Adams was forced to pack his bags and vacate the newly constructed Executive Mansion after just a few months. At four o’clock in the morning on March 4, 1801, Jefferson’s inauguration day, the sullen Adams slipped out of the Executive Mansion without fanfare, boarded a public stage and left Washington.  The streets were quiet as the president left the capital under the cover of darkness on his journey back home. He wanted nothing to do with the man who had publicly humiliated him by denying him a second term as president, nor in witnessing Jefferson’s inauguration and moment of triumph. 

 

For the next dozen years these two giants of the American revolution largely avoided one another, still nursing wounds inflicted by the poisonous partisan politics of their era. But on July 15, 1813, Adams made an overture, reaching out to his former friend and foe, writing that “you and I ought not to die until we have explained ourselves to each other.” That letter broke the dam and began a series of remarkable letters between the two men that lasted for more than a dozen years, until death claimed them both on July 4, 1826 – the 50th anniversary of the Declaration of Independence.

 

Not all such presidential feuds have resulted in such heart-warming reconciliations. But the story of Adams and Jefferson serves as a model of what can happen when respect replaces rancor, friendships triumph over political dogma, and we allow reconciliation to emerge from the ashes of fractured friendships. 

 

Adams and Jefferson ultimately listened to one another, explaining themselves. Listening to someone who thinks differently than we do can feel threatening and scary – almost as if by listening to their thoughts we might become infected by their opinions. So we hunker down and lob snarky tweets to attack the humanity and patriotism of others, foolishly hoping such tactics will convince them to change.

 

But what would it look like if we could agree on core values we share in common with one another? Patriotism, a safe country, a stable society, economic well-being that promotes health, education, food, and housing, ensuring that people are treated with dignity and respect.

 

We could then have vigorous and civil debates about the best policies to implement our values. We won’t always agree with everyone. There will be a wide diversity of opinions. But if we could “explain ourselves” to one another, listen deeply, forge friendships, and understand the hopes and fears and humanity of others, we might actually solve some of the problems that seem so intractable in our polarized society – a society that seems to thrive on extremism on both ends of the political spectrum.

 

Adams and Jefferson ultimately allowed their humanity and deep friendship to triumph over their politics. We can thank them, and even the candid and often irreverent barbs our presidents have aimed at one another, because these insults cause us to reflect on how we should treat one another – not only in the public square, but around the family dinner table, in our marriages, and in the workplace.

 

Our survival as a nation depends on our ability to listen to those with very different political philosophies, to “explain ourselves” to one another, to search for broad areas of agreement with those of different political philosophies, and to reject the acidic politics of personal demonization in which we attack the humanity or patriotism of others.

Whatever Happened to an Affordable College Education?

Image: Pixabay

 

As U.S. college students―and their families―know all too well, the cost of a higher education in the United States has skyrocketed in recent decades.  According to the Center on Budget and Policy Priorities, between 2008 and 2017 the average cost of attending a four-year public college, adjusted for inflation, increased in every state in the nation.  In Arizona, tuition soared by 90 percent.  Over the past 40 years, the average cost of attending a four-year college increased by over 150 percent for both public and private institutions.  

 

By the 2017-2018 school year, the average annual cost at public colleges stood at $25,290 for in-state students and $40,940 for out-of-state students, while the average annual cost for students at private colleges reached $50,900.

 

In the past, many public colleges had been tuition-free or charged minimal fees for attendance, thanks in part to the federal Land Grant College Act of 1862.  But now that’s “just history.”  The University of California, founded in 1868, was tuition-free until the 1980s.  Today, that university estimates that an in-state student’s annual cost for tuition, room, board, books, and related items is $35,300; for an out-of-state student, it’s $64,300.

 

Not surprisingly, far fewer students now attend college.  Between the fall of 2010 and the fall of 2018, college and university enrollment in the United States plummeted by two million students.  According to the Organization for Economic Cooperation and Development, the United States ranks thirteenth in its percentage of 25- to 34-year-olds who have some kind of college or university credentials, lagging behind South Korea, Russia, Lithuania, and other nations.

 

Furthermore, among those American students who do manage to attend college, the soaring cost of higher education is channeling them away from their studies and into jobs that will help cover their expenses.  As a Georgetown University report has revealed, more than 70 percent of American college students hold jobs while attending school. Indeed, 40 percent of U.S. undergraduates work at least 30 hours a week at these jobs, and 25 percent of employed students work full-time.

 

Such employment, of course, covers no more than a fraction of the enormous cost of a college education and, therefore, students are forced to take out loans and incur very substantial debt to banks and other lending institutions.  In 2017, roughly 70 percent of students reportedly graduated college with significant debt.  According to published reports, in 2018 over 44 million Americans collectively held nearly $1.5 trillion in student debt.  The average student loan borrower had $37,172 in student loans―a $20,000 increase from 13 years before.

 

Why are students facing these barriers to a college education?  Are the expenses for maintaining a modern college or university that much greater now than in the past?

 

Certainly not when it comes to faculty.  After all, tenured faculty and faculty in positions that can lead to tenure have increasingly been replaced by miserably-paid adjunct and contingent instructors―migrant laborers who now constitute about three-quarters of the instructional faculty at U.S. colleges and universities.  Adjunct faculty, paid a few thousand dollars per course, often fall below the official federal poverty line.  As a result, about a quarter of them receive public assistance, including food stamps.

 

By contrast, higher education’s administrative costs are substantially greater than in the past, both because of the vast multiplication of administrators and their soaring incomes.  According to the Chronicle of Higher Education, in 2016 (the last year for which figures are available), there were 73 private and public college administrators with annual compensation packages that ran from $1 million to nearly $5 million each.

 

Even so, the major factor behind the disastrous financial squeeze upon students and their families is the cutback in government funding for higher education. According to a study by the Center on Budget and Policy Priorities, between 2008 and 2017 states cut their annual funding for public colleges by nearly $9 billion (after adjusting for inflation).  Of the 49 states studied, 44 spent less per student in the 2017 school year than in 2008.  Given the fact that states―and to a lesser extent localities―covered most of the costs of teaching and instruction at these public colleges, the schools made up the difference with tuition increases, cuts to educational or other services, or both.

 

SUNY, New York State’s large public university system, remained tuition-free until 1963, but thereafter, students and their parents were forced to shoulder an increasing percentage of the costs. This process accelerated from 2007-08 to 2018-19, when annual state funding plummeted from $1.36 billion to $700 million.  As a result, student tuition now covers nearly 75 percent of the operating costs of the state’s four-year public colleges and university centers.

 

This government disinvestment in public higher education reflects the usual pressure from the wealthy and their conservative allies to slash taxes for the rich and reduce public services.  “We used to tax the rich and invest in public goods like affordable higher education,” one observer remarked.  “Today, we cut taxes on the rich and then borrow from them.”     

 

Of course, it’s quite possible to make college affordable once again.  The United States is far wealthier now than in the past, with a bumper crop of excessively rich people who could be taxed for this purpose.  Beginning with his 2016 presidential campaign, Bernie Sanders has called for the elimination of undergraduate tuition and fees at public colleges, plus student loan reforms, funded by a tax on Wall Street speculation.  More recently, Elizabeth Warren has championed a plan to eliminate the cost of tuition and fees at public colleges, as well as to reduce student debt, by establishing a small annual federal wealth tax on households with fortunes of over $50 million.

 

Certainly, something should be done to restore Americans’ right to an affordable college education.

The Challenges of Writing Histories of Autism

Image: Julia is a character on Sesame Street who has autism. 

 

This is a version of an article first published in the May 2019 issue of Participations.  It is reproduced here with the kind permission of the editors.

 

Autism is a relatively new (and increasingly common) disability, and we don’t yet fully understand it.  The symptoms vary enormously from individual to individual. Severity can range from barely noticeable to totally debilitating. The condition often impairs the ability to read but can also result in “hyperlexia”, a syndrome which involves precocious reading at a very early age but also difficulties in reading comprehension. 

 

We have just begun to write the history of autism. Frankly, some of the first attempts stumbled badly, especially over the question of whether “It was there before” – that is, before the twentieth century. That mantra was repeated several times by John Donvan and Caren Zucker in In a Different Key: The Story of Autism (2016). But they and others have found precious few halfway plausible cases in history, nothing remotely like the one in 40 children afflicted with autism reported by the 2016 National Survey of Children's Health. Donvan and Zucker claimed that the “Wild Boy of Aveyron”, the feral child discovered in France in 1800, “had almost certainly been a person with autism.” But autism impairs the ability to perceive danger, and frequently results in early deaths from drowning and other accidents, so it’s not likely that an autistic child could survive long in the wild. And there are barely a dozen examples of feral children in history, so even if they were all autistic, the condition was vanishingly rare.

 

In Neurotribes (2015) Steve Silberman also argued that autism had been a common part of the human condition throughout history.  His book celebrated Dr. Hans Asperger as a friend and protector of autistic children, even placing his benevolent image on the frontispiece. Critics hailed that version of history as “definitive”. But recently Edith Sheffer, in Asperger’s Children: The Origins of Autism in Nazi Vienna (2018), confirmed that Asperger had been deeply implicated in the Nazi program to exterminate the neurologically handicapped. 

 

Surely if we want to write a full and honest account of the recent history of the autism epidemic, we should interview members of the autism community, defined as including both autistic individuals and their family members. This, however, presents  a number of special obstacles that I encountered when I conducted research for an article that was eventually published as “The Autism Literary Underground." Here I want to explain how we as historians might work around these barriers.

 

For starters, about a third of autistic individuals are nonspeaking, and many others experience lesser but still serious forms of verbal impairment.  But at least some nonspeakers can communicate via a keyboard, and can therefore be reached via email interviews. Email interviews have a number of other advantages: they save the trouble and expense of travel and transcription, they avoid transcription errors and indistinct recordings, and they allow the interviewer to go back and ask follow-up and clarification questions at any time.  This is not to rule out oral interviews, which are indispensable for the nonliterate. But email interviews are generally easier for autism parents, who are preoccupied with the demands of raising disabled children, many of whom will never be able to live independently. These parents simply cannot schedule a large block of time for a leisurely conversation.  When I conducted my interviews, the interviewees often had to interrupt the dialogue to attend to their children.  Perhaps the most frequent response to my questions was, “I’ll get back to you….” (One potential interviewee was never able to get back to me, and had to be dropped from the project.) Ultimately these interviews addressed all the questions I wanted to address and allowed interviewees to say everything they had to say, but in email threads stretching over several days or weeks.

 

Recent decades have seen a movement to enable the disabled to “write their own history”. In 1995 Karen Hirsch published an article advocating as much in Oral History Review, in which she discussed many admirable initiatives focusing on a wide range of specific disabilities – but she never mentioned autism. Granted, autism was considerably less prevalent then than it is today, but the omission may reflect the fact that autism presents special problems to the researcher.  In 2004 the Carlisle People First Research Team, a self-governing group for those with “learning difficulties”, won a grant to explore “advocacy and autism” but soon concluded that their model for self-advocacy did not work well for autistic individuals. Though the Research Team members were themselves disabled, they admitted that they knew little about autism, and “there was an obvious lack of members labelled with autism or Asperger’s syndrome” in disability self-advocacy groups throughout the United Kingdom.  The Research Team concluded that, because autism impairs executive functioning as well as the ability to socialize and communicate, it was exceptionally difficult for autistic individuals to organize their own collective research projects, and difficult even for nonautistic researchers to set up individual interviews with autistic subjects.

 

Self-advocacy groups do exist in the autism community, but they inevitably represent a small proportion at the highest-performing end of the autism spectrum: they cannot speak for those who cannot speak.  We can only communicate with the noncommunicative by interviewing their families, who know and understand them best. 

 

One also has to be mindful that the autism community is riven by ideological divisions, and the unwary researcher may be caught in the crossfire.  For instance, if you invite an autistic individual to tell their own story, they might say something like this:

As a child, I went to special education schools for eight years and I do a self-stimulatory behavior during the day which prevents me from getting much done. I’ve never had a girlfriend. I have bad motor coordination problems which greatly impair my ability to handwrite and do other tasks. I also have social skills problems, and I sometimes say and do inappropriate things that cause offense. I was fired from more than 20 jobs for making excessive mistakes and for behavioural problems before I retired at the age of 51.

Others with autism spectrum disorder have it worse than I do.  People on the more severe end sometimes can’t speak. They soil themselves, wreak havoc and break things. I have known them to chew up furniture and self-mutilate. They need lifelong care.[7]

 

This is an actual self-portrait by Jonathan Mitchell, who is autistic. So you might conclude that this is an excellent example of the disabled writing their own history, unflinchingly honest and compassionate toward the still less fortunate, something that everyone in the autism community would applaud. And yet, as Mitchell goes on to explain, he has been furiously attacked by “neurodiversity” activists, who militantly deny that autism is a disorder at all. They insist that it is simply a form of cognitive difference, perhaps even a source of “genius”, and they generally don’t tolerate any discussion of curing autism or preventing its onset. When Mitchell and other autistic self-advocates call for a cure, they are often branded “self-haters” and accused of advocating “genocide”. So who speaks for autism? An interviewer who describes autism as a “disorder”, or who even raises the issues that Mitchell freely discussed, might well alienate a neurodiversity interviewee. But can we avoid those sensitive issues? And even if we could, should we avoid them?

 

Mitchell raises a still more unsettling question: Who is autistic? The blind, the deaf, and the wheelchair-bound are relatively easy to identify, but autism is defined by a complex constellation of symptoms across a wide spectrum – and where does a spectrum begin and end? You could argue that those with a formal medical diagnosis would qualify, but what about those who are misdiagnosed, or mistakenly self-diagnosed? What if their symptoms are real but extremely mild: would an oral historian researching deafness interview individuals with a 10 percent hearing loss? Mitchell contends that neurodiversity advocates cluster at the very high-functioning end of the spectrum, and suspects that some aren’t actually autistic:

Many of them have no overt disability at all.  Some of them are lawyers who have graduated from the best law schools in the United States. Others are college professors. Many of them never went through special education, as I did. A good number of them are married and have children. No wonder they don’t feel they need treatment.

 

Precisely because neurodiversity advocates tend to be highly articulate, they increasingly dominate the public conversation about autism, to the exclusion of other voices. Mitchell points to the Interagency Autism Coordinating Committee, an official panel that advises the US government on the direction of autism research: seven autistic individuals have served on this body, all of whom promote neurodiversity, and none favor finding a cure.  The most seriously afflicted, who desperately need treatment, are not represented, and they “can’t argue against ‘neurodiversity’ because they can’t articulate their position. They’re too disabled, you might say.”

 

The severely disabled could easily be excluded from histories of autism, unless the researcher makes a deliberate effort to include them, and in many cases we can only include them by interviewing their families. My own research relied on email interviews with self-selected respondents to a call for participants I had posted on autism websites. Though I made clear that I wanted to communicate with autistic individuals as well as with other members of their families, only the latter responded. As Jan Walmsley has rightly pointed out, consent is a thorny issue when we interview the learning disabled. I specified that I would only interview responsible adults -- that is, those who were not under legal guardianship -- but that proviso effectively excluded a large fraction of the autism community. For researchers, that may present an insurmountable difficulty.

 

Yet another ideological landmine involves the causes of autism, for many in the autism community believe it is a disorder that results from an adverse reaction to vaccination. In my own research, this was the group I chose to focus on. The mainstream media generally treat them as pariahs and dangerous subversives, denounce them repeatedly, and almost never allow them to present their views. But that kind of marginalization inevitably raises troubling questions: Are these people being misrepresented? What is their version of events? And since they obviously aren’t getting their ideas from the newspapers or television networks, what exactly are they reading, and how did that reading shape their understanding of what has been inflicted on them?

 

So I started with a simple question: What do you read? Unsurprisingly, many of my subjects had read the bestselling book Louder Than Words (2007) by actress Jenny McCarthy, in which she describes her son’s descent into autism and argues that vaccination was the cause. Doctors have expressed horror that any parent would follow medical advice offered by a Playboy centerfold, but a historian of reading might wonder whether the reader response here is more complicated. Are readers “converted” by books, or do they choose authors with whom they already sympathize? My interviewees reported that, well before they read Louder Than Words, they had seen their children regress into autism immediately following vaccination. They later read Jenny McCarthy out of empathy, because she was a fellow autism parent struggling with the same battles that they had to confront every day.

 

Granted, my sample was quite small, essentially a focus group of just six self-selected parents. Occasionally oral historians can (through quota sampling) construct large and representative surveys, for instance Paul Thompson’s landmark 1975 study of Edwardian Britain, but it would be practically impossible to do the same for the fissured and largely nonspeaking autism community. What oral historians can sometimes do is to crosscheck their findings against large statistical surveys. For instance, my respondents said that they read Jenny McCarthy not because she was a celebrity, but because she was an autism mom. They were corroborated by a poll of 1552 parents, who were asked whom they relied on for vaccine safety information: just 26 percent said celebrities, but 73 percent trusted parents who reported vaccine injuries in their own children. To offer another illustration: vaccine skeptics are often accused of being “anti-science”, but my interviewees produced lengthy bibliographies of scientific journal articles that had shaped their views. They were supported by a survey of 480 vaccine skeptic websites, of which 64.7 percent cited scientific papers (as opposed to anecdotes or religious principles).

 

I often describe autism as an “epidemic”. This is yet another flashpoint of controversy. Public health officials generally avoid the word, and many journalists and neurodiversity activists fiercely argue that autism has always been with us. As a historian who has investigated the question, I have concluded (beyond a reasonable doubt) that autism scarcely existed before the twentieth century, and that it is now an ever-spreading pandemic. To explain the evidence behind this conclusion would require a very long digression, though I can refer the reader to a robust demonstration. The essential point here is that any interviewer who refers to autism as an “epidemic” may alienate some of his or her interviewees.

 

So how do we handle this situation – or, for that matter, any other divisive issue? All oral historians have opinions: we can’t pretend that we don’t. But we can follow the ethic of an objective reporter. A journalist is (or used to be) obligated to report all sides of an issue with fairness, accuracy, and balance. Journalists may personally believe that one side is obviously correct and the other is talking nonsense, but in their professional capacity they keep those opinions to themselves and assure their interviewees that they are free to express themselves. One has to accept that not everyone will be reassured. I found myself variously accused of being (on the one hand) an agent of the pharmaceutical companies or (on the other) an antivaccinationist. (I am neither.) But most of my subjects were quite forthcoming, once I made clear that the article I was writing would neither endorse nor condemn their views.

 

Of course, if any of the voices of autism are stifled, then the true and full story of the epidemic will be lost. Some honest and well-researched histories of autism have been produced, notably Chloe Silverman’s Understanding Autism and Edith Sheffer’s Asperger’s Children. Although Silverman only employs a few interviews, her work is distinguished by a willingness to listen closely to autism parents. And in her chilling account of the Nazi program to eliminate the mentally handicapped, Sheffer uncovered the voices of some of its autistic victims in psychiatric records. What both these books suggest is that we could learn much more about autism as it was experienced by ordinary people simply by talking to them. Many of them protest that the media only report “happy news” about autism (e.g., fundraisers, job training programs) and prefer not to dwell on the dark side (neurological damage, unemployment, violent outbursts, suicide), and these individuals are usually eager to tell their stories. To take one striking example, in 2005 the New York Times dismissed the theory that thimerosal (a mercury-containing preservative in some vaccines) might cause autism in a front-page story headlined “On Autism’s Cause, It’s Parents vs. Research” (suggesting that parents did no research). One of my interviewees had herself been interviewed by Gardiner Harris, one of the reporters who filed the Times story, and she offered a very different version of events:

Harris misidentified one of the two women in his opening anecdote. He described an autistic child’s nutritional supplements as “dangerous,” though they had been prescribed by the Mayo Clinic for the child’s mitochondrial disorder—facts he did not disclose. Three times Harris asked me, “How do you feel?” rather than, “What scientific studies led you to believe thimerosal is harmful to infants?"

 

Rather than rely solely on “the newspaper of record” (or any other newspaper), historians can find correctives and alternative narratives in oral interviews. Oral history has made an enormous contribution to reconstructing the history of the AIDS epidemic and the opioid epidemic, and it will be no less essential to understanding the autism epidemic.

 

 

 

 

 

 

 

On the Eve of Pride 2019, D.C. LGBT Community Reflects on Its Own History with the Lavender Scare

 

“I really think it is so important to remember that there were people who were taking a stand in the years before Stonewall and people who really had the courage to get the movement rolling in the 1960’s. Their efforts should be recognized.”

 

As the question and answer session after Wednesday night’s screening of The Lavender Scare was wrapping up, director Josh Howard reminded the audience of the focus of his documentary: the systematic firing of and discrimination against LGBT people under the Eisenhower administration, told from their perspective. The Q&A featured Howard, David Johnson – the author of the book that inspired the film – and Jamie Shoemaker, who is featured in the film as the first person to successfully resist the policy. The screening was timely, as D.C.’s Pride parade is Saturday, June 8, and the 50th anniversary of the Stonewall riots is Friday, June 28. The Lavender Scare will premiere on PBS on June 18.

 

Most of the seats in the Avalon Theatre were filled. After the film and applause ended, Howard asked a question he likes to ask every audience at a screening: how many of you were personally affected or knew someone who was affected by the Lavender Scare? Almost everyone in the audience raised their hands. 

 

The Q&A was an open dialogue, with several people standing and telling stories of how they were personally tied to the events of the film and the movement in general. Several were connected to the central figure of the documentary, former prominent activist Frank Kameny. One man who had grown up with another prominent activist, Jack Nichols, explained, “when Jack was picketing in front of the White House, I was quite aware. In fact, Frank and Jack did some of the planning in my apartment at the time; but because I was a teacher, I couldn’t have anything to do with it, because if my picture was in the paper, then my career would’ve been over.”

 

The policy harmed the careers of some in the audience, though. “I had gone to Frank for guidance before my interview at NSA,” one gentleman recalled, “and he told me ‘don’t say anything, don’t answer anything that you’re not asked,’ and so forth. Anyway, I was not hired and I’m frankly very glad now that I was not hired.” Experiences such as those reflect just how wide-reaching the policy was; it not only drove gay people out of government jobs, but also discouraged them from applying for positions in the first place.

 

Frank Kameny’s impact on the D.C. community was evident. In attendance was his former campaign manager from 1971, who recalled that the day after they announced the campaign, “we received a check in the mail for $500 from actors Paul Newman and Joanne Woodward. We used that money to travel to New York to meet with Gay Activist Alliance of New York.” Similarly, one of his former colleagues on the board of the ACLU in Washington recounted that as they defended their license to meet, “the issue was whether names [of gay members] would be revealed, and while Frank was very happy and very brave to have his name revealed, he didn’t feel that he could just turn over names of other people. That’s what he was fighting against in the agencies.” 

 

While the film successfully showed the struggle faced by the LGBT community, the conversation afterwards reflected the hope that many in the community feel today. Jamie Shoemaker, who was once almost fired from the NSA, pointed to the progress that he’s seen. “All of the security clearance agencies now have LGBT groups that are very active, including the NSA. One year after I retired, they paid me to come out to give a speech about my experiences… they (the groups) are very active and it’s really a good scene in these agencies now. What a difference,” he said. The theatre was immediately filled with applause.

 

Many expressed a desire for reparations in some form or another. David Johnson, who authored The Lavender Scare: The Cold War Persecution of Gays and Lesbians in the Federal Government, highlighted the LOVE Act, a bill introduced in the Senate that would “mandate that the State Department investigate all of its firings since 1950. They would collect information from either fired employees or their families, and I think most importantly, though, it would mandate that their museum, the US Diplomacy Center, actually have a permanent exhibit on the Lavender Scare.” Once again, the room broke into applause.

 

The Capital Pride Parade will take place on Saturday, June 8th across multiple locations in Washington. The 50th anniversary of the Stonewall riots is Friday, June 28. The Lavender Scare will premiere on PBS on June 18.

Arbella Bet-Shlimon Got Into History to Counter False Perceptions of the Middle East

 

Arbella Bet-Shlimon is Assistant Professor in the Department of History at the University of Washington, a historian of the modern Middle East, an adjunct faculty member in the Department of Near Eastern Languages and Civilization, and an affiliate of the Jackson School’s Middle East Center. Arbella’s research and teaching focus on the politics, society and economy of twentieth-century Iraq and the broader Persian Gulf region, as well as Middle Eastern urban history. Her first book, City of Black Gold: Oil, Ethnicity, and the Making of Modern Kirkuk (Stanford University Press, 2019), explores how oil and urbanization made ethnicity into a political practice in Kirkuk, a multilingual city that was the original hub of Iraq’s oil industry. She received her PhD from Harvard University in 2012.

 

What books are you reading now?

 

I just wrote an obituary for my Ph.D. advisor, Roger Owen, in the latest issue of Middle East Report, and I read his memoir A Life in Middle East Studies prior to writing it. It proved to be a fascinating retrospective on the development of our field over the twentieth century. At the moment, I am digging into the work of the multilingual Kirkuki poet Sargon Boulus, and scholarship about him, as I write an article about the idea of the city of Kirkuk as a paragon of pluralism in northern Iraq. This is a topic I became interested in when I was researching my book on Kirkuk’s twentieth-century history, City of Black Gold, just published by Stanford University Press.

 

Why did you choose history as your career?

 

I decided to make history a career after I was already in graduate school in an interdisciplinary program. I started that program with a goal: to counter inaccurate and stereotyped perceptions of the Middle East among Americans. These spurious ideas were fostering cruelty to Middle Easterners at home and prolonging destructive foreign policy abroad. I concluded that researching, writing, and teaching the modern history of the region would be the best way to meet that goal. The way I stumbled into this conclusion was essentially accidental, but I’ve never looked back.

 

It was an unexpected change of direction, because I hadn’t taken a single history class in college. And history, according to most college students who haven’t taken a history class, is boring. We have an image problem. Just look at the most famous secondary school in the world: Hogwarts (from the Harry Potter universe). This is a school where one of the tenure lines has a jinx on it that leaves professors fired, incapacitated, or dead after one year, but its worst course isn’t that one. Instead, its worst course is a plain old history class, taught by a droning ghost professor who bores even himself so thoroughly that he doesn’t realize he died a long time ago. High school students (real-life ones, I mean) will frequently tell you that they hate history because it’s just memorizing lists of things, or their teacher just makes them watch videos. That’s not what history is beyond the K-12 realm, of course—neither college history nor popular history is anything like that—and there are some great K-12 history teachers who don’t teach that way. But it’s a widespread stereotype rooted in some truth. I didn’t actively dislike history prior to pursuing it full time, but it hadn’t even occurred to me to consider it a possible career.

 

What qualities do you need to be a historian?

 

Qualities that are central to any research career. For instance, a high tolerance for delayed gratification, because you can knock your head against a research question for years before the answers start to come to fruition in any publishable form. And you need to be willing to be proven wrong by the evidence you find.

 

Who was your favorite history teacher?

 

My dad was my first history teacher. I learned a lot about the history of the Middle East riding in the car as a kid.

 

What is your most memorable or rewarding teaching experience?

 

Once, at a graduation event, a graduating student told me that a conversation he’d had with me during office hours was one of the main reasons he did not drop out of college. I had no idea my words had had that impact at the time. I think we professors are often not aware of the small moments that don’t mean much to us but change a student’s life (both for the worse and for the better).

 

What are your hopes for history as a discipline?

 

Institutional support; support from the parents or other tuition funders of students who want to pursue history as their major; and stable, contracted teaching positions with academic freedom protections for those who have advanced degrees in history and wish to work in academia.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I don’t collect artifacts, but I’ve used my research funds to acquire a few things that are hard to find and have been indispensable to my work. For instance, I have copies of the full runs of a couple of rare periodicals from Kirkuk that I acquired while writing my book. They’re almost impossible to find even in global databases—and when you come across something like that in someone’s private collection, you have to get a copy somehow.

 

What have you found most rewarding and most frustrating about your career? 

 

The most rewarding thing about being a historian is when a student tells me that their perspective on the world has been transformed by taking my class. The most frustrating thing is the pressure from so many different directions to downsize humanities and social science programs.

 

How has the study of history changed in the course of your career?

 

That’s a very broad question, but I can speak specifically about my own field of Middle Eastern history. When I set out to write a PhD dissertation on Iraq, some colleagues in my cohort reacted with surprise because, they pointed out, it would be extremely difficult to conduct research there. One fellow student told me that he’d started grad school interested in Iraq but realized after the 2003 US invasion that he wouldn’t be able to go there, so he switched his focus to Egypt. Since then, though, many more conflicts have developed and brutal authoritarian rulers have become more deeply entrenched. Nobody researching the history of the Middle East today can assume that the places they are interested in will be freely accessible or that any country’s archives are intact and in situ. And even if we can visit a place, it may not be ethical to talk to people there about certain sensitive topics. At the same time, we know that we can’t just sit in the colonial metropolis and write from the colonial archives, as so many historians of a previous generation did. So I think many Middle East historians have become more methodologically creative in the past decade, asking new sorts of questions and tapping into previously underappreciated sources.

 

What are you doing next?

 

Right now, I’m trying to understand Iraq’s position in the Persian Gulf, shifting my focus toward Baghdad and its south. Historians of Iraq have written extensively about its experience as a colonized, disempowered country, but have less often examined how expansionist ideas were key to its nation-building processes throughout the twentieth century. This becomes clear from the perspective of Kirkuk. It’s also clear when looking at Iraq’s relationship with Kuwait, which Iraq has claimed as part of its territory at several points. I’m in the early stages of gathering sources on this topic.

Here Comes the D-Day Myth Again

 

Last Friday (May 31, 2019), the NPR radio program “On Point” conducted a special live broadcast from the National WWII Museum in New Orleans entitled “75th Anniversary Of D-Day: Preserving The Stories Of WWII Veterans.” The host was NPR’s media correspondent David Folkenflik and the segment featured Walter Isaacson, professor of history at Tulane University, and Gemma Birnbaum, associate vice president of the World War II Media and Education Center at The National WWII Museum, as guests. This writer was not only looking forward to an engaging discussion of the successful Allied landings at the Normandy beaches on June 6, 1944, but also hoping that the guests would present the contemporary state of military history research on the significance of D-Day.

 

I was sorely disappointed. Instead of placing the invasion within the wider context of the war against Nazi Germany, Folkenflik and his guests revived the “Myth of D-Day,” that is, they reinforced the erroneous belief that D-Day was the decisive battle of the Second World War in Europe, that it marked “the turning of the tide,” and that it sealed the doom of the German Army, the Wehrmacht. Had D-Day failed, so the argument goes, Germany could have still won the war, with nightmarish consequences for Europe, the United States and the world as a whole. This myth is a legacy of the Cold War, when each side accentuated what it did to defeat Nazi Germany, the most monstrous regime in human history, and played down the contributions of the other side. Russian students today, for example, are taught that the Soviet Union won the “Great Patriotic War” practically single-handedly, without having previously cooperated with Nazi Germany and without having committed any atrocities – which is to take a creative approach to interpreting the history of World War II, to say the least. But it also remains the case that far too many American, British and Canadian students are taught that the victory over Nazi Germany was mostly the work of the Anglo-American forces, which is likewise a distortion of the truth.

 

This “Allied scheme of history,” as the Oxford historian Norman Davies calls it, was most consistently presented by Gemma Birnbaum on the On Point broadcast. She not only reiterated the belief that D-Day was necessary to defeat Nazi Germany, but her words also suggested that, until then, Germany was somehow winning the war. Before the Allies invaded France, she said, the Wehrmacht “was moving all over the place.” According to her, it was only after the German defeat in Normandy that “fatigue began to set in” among German soldiers. But “fatigue” had already begun to spread throughout the Wehrmacht in the late fall of 1941, when the Red Army stopped the Germans at the gates of Moscow. It is true that the Germans continued to “move all over” Europe afterwards, but they increasingly began doing so in a backwards motion. It is depressing to consider that Birnbaum co-leads the educational department of the World War II museum in New Orleans, where she has the opportunity to pass her myopic views of the war on to countless young people, thus ensuring the perpetuation of the D-Day myth. Not much has changed in the museum, it would seem, since 2006, when Norman Davies commented: “Yet, once again, the museum does not encourage a view of the war as a whole. Few visitors are likely to come away with the knowledge that D-Day does not figure among the top ten battles of the war.”

 

Many military historians would now contend that, if there was indeed any “turning point” in the European war, it took place before Moscow in December 1941. For it was then that Germany lost its chance to win the war it had hoped to win. It was also at that point that the Soviets forced upon the Germans a war of attrition. As the Stanford historian James Sheehan points out, there are no decisive battles in wars of attrition, but rather milestones along the way to victory, as the enemy is slowly but surely reduced to a condition of weakness in which it can no longer continue the fight. In that sense, the other important milestones were Stalingrad (February 1943), after which it became increasingly clear that Germany was going to lose the war, and Kursk (July 1943), after which it became increasingly clear that the Russians were coming to Berlin, with or without the help of the Western Allies.

 

Any objective look at the human and material resources available to Nazi Germany by the spring of 1944, especially compared to those available to the Allies, makes the claim that D-Day saved the world from a Nazi-dominated Europe preposterous. Such arguments are not history but science fiction. We need only consider that in May 1944, the German field army had a total strength of 3.9 million soldiers (2.4 million of whom were on the Eastern front), while the Soviet Red Army alone had 6.4 million troops. Moreover, while the Wehrmacht had used up most of its reserve troops by 1942, Joseph Stalin could still call up millions more men to fight. While Germany was rapidly running out of the food, fuel, and raw materials an army needs to fight a protracted war, the stupendous productive capacities of the United States, through the Lend-Lease program, made sure that the Soviet soldiers were well-fed and equipped for their final assault on Germany. Add to this the continual pounding that German industry and infrastructure were taking from the Anglo-American air war, which also forced the German military to bring invaluable fighters, anti-aircraft artillery, and service personnel back to the home front, and it becomes obvious that Germany was fated to lose the war long before any Allied soldiers reached the beaches of Normandy. The German army was defeated on the Western front, to be sure, but it was annihilated in the East. Until almost the very end of the war, somewhere between 60 and 80 per cent of the German divisions were stationed in the East, and that was where they were wiped out. But the Soviets paid a horrific price for their victory. According to the Military Research Office of the Federal German Army, 13,500,000 Soviet soldiers lost their lives in the fight against Nazi Germany. The United Kingdom lost some 326,000 soldiers. The United States lost 43,000 men in Europe.

 

In light of such statistics, one can only imagine how offended many Russians, Ukrainians and Byelorussians must feel today when they hear Americans congratulating themselves for having been the ones who defeated the Nazis. Nevertheless, the host of the On Point broadcast, David Folkenflik, introduced one segment with the claim that the United States had played the “dominant” role in achieving victory in World War II. Regarding the Pacific theater, there is no doubt about this. But after considering the scale of the fighting on the Eastern front of the European war, Folkenflik’s contention becomes absurd. Unfortunately, such comments are still all-too common. The English historian Giles Milton, for instance, has recently published a book entitled D-Day: The Soldiers’ Story, in which he writes that the tide against Nazi Germany “had begun to turn” by the winter of 1942, but he still reserves the final turning for D-Day. So it is no wonder that many Russians today feel that people in the West fail to give them the credit they deserve for achieving victory in World War II.

 

This is important to contemporary politics: if the tensions between Russia and the United States are ever to be overcome, then there will have to be more American recognition and appreciation of the sacrifices of the Soviet peoples in World War II. Otherwise Americans will continue to make it easier for Vladimir Putin to engage in his own historical myth-making to help legitimize his increasingly authoritarian rule. To be fair, had David Folkenflik included a full discussion of the Eastern Front in his broadcast, it would have run too long and lost focus. Moreover, it is only to be expected that, when a nation reflects on the past, it concentrates on its own historical achievements. But that cannot be a license for spreading false historical beliefs. At least a brief mention of the Eastern front would have been merited.

 

To acknowledge that D-Day was no “turning of the tide” in no way implies that it was not an important, or even a crucial, battle of the Second World War. Had the landings failed, as the American Allied Supreme Commander Dwight D. Eisenhower feared they might, the war could have dragged on for several more years. In that case, the Nazis would have come much closer to their goal of exterminating every last Jewish man, woman and child in Europe. Not to mention the hundreds of thousands, perhaps millions more military and civilian casualties that would have ensued. Victory in Normandy saved countless lives. In the final analysis, however, the greatest strategic consequence of the battle lies elsewhere.

 

The true significance of D-Day was briefly mentioned during the On Point episode by Walter Isaacson. (He was also the only participant who did not engage in overt exaggeration of D-Day’s importance for defeating Nazi Germany.) Isaacson made the most sensible comment of the entire program when he pointed out that, had D-Day failed, a lot more of Europe would have fallen under the control of the Soviet Union than actually did. In truth, without D-Day, Soviet T-34 tanks would almost certainly have crossed the river Rhine, and they most likely would have reached the French Atlantic coast as well. As the English military historian Anthony Beevor has discovered, “a meeting of the Politburo in 1944 had decided to order the Stavka [Soviet High Command] to plan for the invasion of France and Italy. . .  The Red Army offensive was to be combined with a seizure of power by the local Communist Parties.” D-Day may not have been necessary to defeat Nazi Germany, but it was needed to save western Europe from the Soviet Union. As Beevor observes, “The postwar map and the history of Europe would have been very different indeed” if “the extraordinary undertaking of D-Day had failed.”

 

By all means, then, we should commemorate the heroism and sacrifices of the Anglo-American soldiers who fought and died on D-Day. They all made important contributions to liberating western Europe and achieving victory over Nazi Germany. But national pride must never be allowed to distort historical reality. The successful Allied landings in Normandy accelerated Germany’s defeat, but they didn’t bring it about. The German military historian Jörg Echternkamp puts it well: “From the beginning of the two-front war leads a straight path to the liberation of Europe from Nazi domination roughly one year later. Nevertheless the German defeat had already at this time long since been sealed on the eastern European battlefields by the Red Army. This is all-too easily concealed by strong media presence of D-Day today." The credit for vanquishing Adolf Hitler’s armies should go first and foremost to the Soviet Red Army. Again, Norman Davies is correct when he writes: “All one can say is that someday, somehow, the present fact of American supremacy will be challenged, and with it the American interpretation of history.” For now, however, as the On Point broadcast has shown, popular understanding of D-Day in the United States continues to be more informed by myth than reality.

 

How Should Historians Respond to David Garrow's Article on Martin Luther King, Jr.?

 

 

Pulitzer Prize winner and noted historian David Garrow made headlines last week after Standpoint published his article on the FBI’s investigation of Martin Luther King, Jr. The FBI documents allege King’s involvement in numerous extra-marital affairs, relations with prostitutes, and presence during a rape. In response, scholars have questioned the reliability of the FBI records Garrow used to make such claims. These documents, and the resulting controversy, should also lead scholars to ask questions about the ways in which historians can and should address the history of gender and sexuality when it intersects with the histories of the civil rights movement, American religion, and the development of the surveillance state.

 

First, King and many of the clergy involved in the civil rights movement took a different approach toward interacting with women than some other well-known preachers, particularly Billy Graham. In 1948, evangelist Billy Graham and his staff agreed to a compact known as the Modesto Manifesto. This informal compact dealt with a number of issues from distributions of revival offerings to relations with local churches to what would become more colloquially known as the Billy Graham rule: men on the team would never be alone with a woman who was not their wife. While the rule may have kept the evangelist, who was noted in the press for his fair looks, and much of his team on the straight and narrow, it no doubt limited the opportunities of women within his organization and marked women as dangerous to men, particularly preachers.

 

The Billy Graham rule would have been impractical for the civil rights movement. The work of women was essential to the growth and success of the movement, and it would have been nearly impossible for civil rights leaders such as King to avoid being in contact with women and still have had a thriving movement. Sociology professor Belinda Robnett established that, in the civil rights movement, it was very often women who linked the leaders of organizations, such as King, to supporters of the movement on the local level. These bridge leaders recruited more activists to the cause and ensured the general running of civil rights organizations. Some of the women named in Garrow’s essay served as bridge leaders, and as a consequence were especially vulnerable to such charges in an era when Graham’s rule was influential.

 

Those with traditional moral values reading David Garrow’s recent article on the alleged sexual proclivities of Martin Luther King Jr. might come to the conclusion that if King had instituted the Billy Graham rule, he never would have had the opportunity for extramarital affairs. They might imagine that there would have been no possibility that the FBI could have made such allegations, true or false. That however is unlikely to have been the case. While King’s moral failings are perhaps best left for him and his creator to resolve, it is certain that, given the climate at the FBI at the time and given J. Edgar Hoover’s special animus toward King, as Garrow described in this work, there would have been continual attempts to try to establish some kind of moral failing with which to undermine one of America’s two most famous preachers.

 

The most controversial claim in these documents is a reference to an oddly edited document purporting to be a summary of electronic surveillance in which an unnamed Baptist minister forcibly raped a female parishioner while King looked on. While Garrow questions some documents, according to a Washington Post article, he seems to have fewer questions about the authenticity of this summary. King advisor Clarence Jones points out that while this rape, if it did occur, should be condemned, it also raises a question: why did Hoover not turn over the evidence to other officials? It would have provided Hoover with the opportunity he had been seeking to undermine one of America’s most recognized preachers.

 

Jones, of course, is asking a question that all civil rights historians should ask, but we should also ask other questions. How do these documents often reflect a callous disregard for women? If this incident was true, why did the FBI not seek justice for this unnamed woman? And, if it is not true, how little did Hoover’s men value women that they thought an incident like this could be easily invented and the duplicity go unnoticed, and how did that impact their investigation of King? We should also ask if the Billy Graham rule set American expectations for the private behavior of very public clergy.

 

Women’s bodies are often sexualized, and black women’s bodies even more so. In these documents, it is clear that the FBI placed an emphasis on what they deemed these women’s immoral or abnormal sexual choices, ranging from oral sex to adultery to prostitution to lesbian partners. Even when they perhaps should have, the agents express little to no concern for the women, but rather the concern is for the state. These women’s bodies mattered to the FBI only when they may have been in a position to play a part in compromising America’s foremost black preacher and make him susceptible to communist influence, or when those same bodies offered the FBI an opportunity to expose that preacher’s failings.

 

For some of the women named in the documents used in the Garrow article, the evidence of sexual activity is scant, merely referring to them as girlfriends or women who wanted to know why King hadn’t come by when he was in their area. In another instance, an orgy between King, a prostitute, and a well-known female gospel singer is described. For historians to focus on these instances now, with so much of the evidence from biased sources, and some of it still under seal, feels a bit like participating in historical slut shaming. For these women, whatever their sexual choices were over fifty years ago, there is no escape. Salacious details, real or fictional, lie forever in the National Archives.

 

In this case, much of what we’d like to know regarding these controversies will not be revealed until the court order sealing the records expires in 2027, and may not be resolved even then. T. E. Lawrence once wrote that “the documents are liars.” It is the task of every historian to determine to what extent that is true, but it is also the task of every historian to examine the ways in which the documents may tell unplanned truths about our past, even if that makes us uncomfortable.

Roundup Top 10!  

A Black Feminist’s Response to Attacks on Martin Luther King Jr.’s Legacy

by Barbara Ransby

We should not become historical peeping Toms by trafficking in what amounts to rumor and innuendo.

 

About the FBI’s Spying

by William McGurn

What’s the difference between surveillance of Carter Page and Martin Luther King?

 

 

What D-Day teaches us about the difficulty — and importance — of resistance

by Sonia Purnell

For four years, a few French citizens fought a losing battle. Then they won.

 

 

After Tiananmen, China Conquers History Itself

by Louisa Lim

Young people question the value of knowledge, a victory for Beijing 30 years after the crackdown on student protests.

 

 

How True-Crime Stories Reveal the Overlooked History of Pre-Stonewall Violence Against Queer People

by James Polchin

The history of such crimes tends to be lost.

 

 

Hitler told the world the Third Reich was invincible. My German grandfather knew better

by Robert Scott Kellner

As a political organizer for the Social Democrats, Kellner had opposed the Nazis from the beginning, campaigning against them throughout the duration of the ill-fated Weimar Republic.

 

 

How racism almost killed women’s right to vote

by Kimberly A. Hamlin

Women’s suffrage required two constitutional amendments, not one.

 

 

Who Will Survive the Trade War?

by Margaret O’Mara

History shows that big businesses profit most when tariffs reign.

 

 

Of Crimes and Pardons

by Rebecca Gordon

The United States was not always so reluctant to put national leaders on trial for their war crimes.

 

 

Trump Is Making The Same Trade Mistake That Started The Great Depression

by John Mauldin

Similar to today, the Roaring 1920s saw rapid technological change, namely automobiles and electricity.

 

 

 

The Making of the Military-Intellectual Complex

by Daniel Bessner and Michael Brenes

Why is U.S. foreign policy dominated by an unelected, often reckless cohort of “the best and the brightest”?

Why 2019 Marks the Beginning of the Next Cycle of American History

 

A century ago, historian Arthur Schlesinger, Sr. argued that history occurs in cycles. His son, Arthur Schlesinger, Jr., furthered this theory in his own scholarship. As I reflect on Schlesinger’s work and the history of the United States, it seems clear to me that American history has three 74-year-long cycles. America has had four major crisis turning points, each 74 years apart, from the time of the Constitutional Convention of 1787 to today.

 

The first such crisis occurred when the Founding Fathers met in Philadelphia in 1787 to face the reality that the government created by the Articles of Confederation was failing. There was a dire need for a new Constitution and a guarantee of a Bill of Rights to save the American Republic. The founding fathers, under the leadership of George Washington, were equal to the task and the American experiment successfully survived the crisis. 

 

For the next 74 years, the Union survived despite repeated disputes over American slavery. Then, in 1861, the South seceded after the election of Abraham Lincoln and the Union’s refusal to allow this secession led to the outbreak of the Civil War. In this second crisis, exactly 74 years after the Constitutional crisis of 1787, two-thirds of a million people lost their lives and, in the end, the Union survived.

 

The war was followed by the tumultuous period of Reconstruction, and the regional sectionalism that had led to the Civil War continued. As time passed, with the growth of the industrial economy, the commitment to overseas expansion, and widespread immigration, the United States prospered over the next three-quarters of a century until the Great Crash on Wall Street and the onset of the Great Depression under President Herbert Hoover in 1929. The economy was at its lowest point as Franklin D. Roosevelt took the oath of office in 1933.

  

World War II broke out in 1939—exactly 74 years after the end of the Civil War (1865). While America did not officially enter the war for two years, it is clear that the danger of the Axis Powers (Nazi Germany, Fascist Italy, Imperial Japan), on top of the struggles of the Great Depression, marked another crisis in American history. Fortunately, America had Franklin D. Roosevelt to lead it through the throes of the Great Depression and World War II.

 

Once the Second World War ended in 1945, America entered a new period that included the Cold War with the Soviet Union and tumult in America due to the Civil Rights Movement and opposition to American intervention in wars in  Korea, Vietnam, and the Middle East.  The saga of Richard Nixon and Watergate seemed to many to be the most crisis-ridden moment of the post World War II era. But the constitutional system worked, and the President’s party displayed courage and principle and accepted that Nixon’s corruption and obstruction of justice meant he had to go.  Certainly, Watergate was a moment of reckoning, but the nation moved on through more internal and external challenges.

 

2019 is exactly 74 years after 1945 and it is clear that America is once again in a moment of crisis. As I have written before, I believe that today’s constitutional crisis is far more serious and dangerous than Watergate. Donald Trump promotes disarray and turmoil on a daily basis, undermines our foreign policy and domestic policy, and is in the process of working to reverse the great  progress and accomplishments of many of his predecessors going back to the early 20th century. The past 74 years have produced a framework of international engagement – the World Trade Organization and free trade agreements, the United Nations and conflict resolution, and a series of treaties like the Non Proliferation Treaty and Paris Climate Agreement. Nearly all of these accomplishments of the past 74-year cycle are now under threat. 

 

The rise of Donald Trump is not an isolated phenomenon as similar leaders have come to power in much of the world in the past couple of years. This has occurred due to the technological revolution and the climate change crisis.  Both trends have convinced many that the post-1945 liberal world order is no longer the solution to global issues and that authoritarian leadership is required to deal with the economic and security challenges that the world faces. Charismatic figures claim to have the solutions to constant crisis by stirring racism, nativism, anti-Semitism, Islamophobia, misogyny, and xenophobia.  

 

In some ways, this is a repeat of what the world faced in the late 1930s, but as this is the present instead of the past, we have no certainty that the major western democracies can withstand the crises and preserve democratic forms of government.  As America was fortunate to have George Washington, Abraham Lincoln, and Franklin D. Roosevelt in earlier moments of turmoil and crisis, the question now is who can rise to the occasion and save American prosperity and the Constitution from the authoritarian challenge presented by Donald Trump.

Beijing’s Tiananmen Square Massacre

Image: A rally of more than a million people who swarmed the city’s Happy Valley Race Course, to protest the killings.  To the right, young Hong Kongers in white (the Chinese color of mourning); to the left, at a private club, indifferent expats enjoy tonics by the pool. By then, many of them had already packed their bags.

 

This week marks the 30th anniversary of the infamous June 4th massacre in Beijing — when People’s Liberation Army troops, under the command of the Chinese Communist Party, murdered an unknown number of students in the Chinese capital. Estimates of the death toll range from 200 to 2,500, according to various independent accounts. Most were killed by automatic gunfire, but many were crushed beneath the steel treads of Army tanks.

 

The sudden, merciless crackdown strangled a blossoming democracy movement led by university students and workers and sent shock waves around the world.  But nowhere felt the sheer terror of the mass murders more than the then-British colony of Hong Kong,  where I was working as a young reporter.

 

The knowledge that the slaughter in Beijing could happen again in Hong Kong shattered the city’s confidence in its own future. For 147 years, Hong Kong had been a British colony that encouraged a free-wheeling capitalist system. It became a showcase of social and economic freedom juxtaposed against the historically brutal communist China. Under British rule, Hong Kong had enjoyed a laissez-faire economic system, creating a capitalist economy that was the envy of the world. The people had enjoyed freedoms of the press, speech, assembly and movement in and out of the territory. While local Hong Kong Chinese tycoons had largely run the city’s booming business community, real political power rested with the governor, who had always been appointed by Britain’s Prime Minister, a situation which suited the Hong Kong Chinese just fine.

 

Under the terms of the historic 1984 Joint Declaration, signed by Britain and China, the British Crown Colony and its 5.6 million residents would revert to Chinese rule in July 1997. But the prospect of now being under the direct control of the People’s Liberation Army was one which deeply frightened the people of Hong Kong — especially since most of their parents or grandparents had fled China after the communist takeover in 1949. Barring an unforeseen political turnaround by the Beijing regime, experts predicted a massive outflow of people and investments from Hong Kong.

 

In the weeks leading up to the Beijing bloodbath, and in the dark days that followed, a normally non-political Hong Kong underwent an immense groundswell of cultural pride, and an almost overnight political awakening. As millions marched and swarmed to the city’s early optimistic rallies in proud support of the students’ democratic movement in late May, the long-held belief that Hong Kong people only cared about money was put to rest.

 

When a crowd of 300,000 packed the city’s Happy Valley Race Course on May 27th, where I attended a day-long concert to raise funds for the Beijing students, organizers hoped to raise $250,000. By evening the total take was a generous $3 million.

 

As the many thousands waved yellow ribbons—borrowed from the Philippines’ 1986 People Power Revolution, which had toppled Filipino dictator Ferdinand Marcos—Hong Kong’s leading singers switched from their usual sappy love songs to passionately patriotic tunes such as For Freedom, Heir of the Dragon, Be a Brave Chinese! and Standing as One.

 

Normally hard-hearted taxi-drivers and mini-van owners refused to accept fares from people heading to the rallies, while street-side fishmongers and vegetable hawkers donated part of their day’s earnings to the Beijing students’ democratic movement. All the leading Chinese and English-language newspapers got involved, publishing emotional editorials supporting the aims of the Beijing students. The city’s many glossy magazines replaced the usual pouting pop stars and beauty queens with the handsome face of Beijing student leader Wu’er Kaixi.

 

But as the days of proud defiance turned into a single night of horror on June 4th, Hong Kong reacted to the Beijing bloodbath with shock, sadness, anger, and finally, outrage. The marches and rallies continued, swelling in size to two million. But the mood was now somber and grim. White—the traditional Chinese color of mourning—replaced yellow. And hundreds of marchers—now old as well as young—openly cried in the tropical heat of the teeming streets, something I had never seen before or since. The public was shaken by the senseless slaughter they’d all seen on television.

 

The Hong Kong Red Cross pleaded for blood donations for the many injured in Beijing, and the call was answered by over 1,000 people per day. Catholic residents of the British territory, including many resident Filipinos, attended a special mass in memory of the slain students, celebrated by Cardinal John Wu and 300 priests. Buddhist services were held in temples across the territory.

 

In a unique waterborne protest, 200 fishing boats assembled in busy Victoria Harbour, forming the largest flotilla ever seen in Hong Kong. For five hours fishing captains and their crews circled the harbour to pay respects to the young victims of the massacre in China’s ancient capital. Huge black banners reading “For democracy, for freedom—the fishermen have come!” lay draped across the boats’ wheelhouses.

 

 

Newspaper headlines frighteningly alluded to a possible second Chinese civil war. The Hong Kong stock market plunged 300 points—perhaps the sharpest single-day fall since 1949. And a massive brain drain began, eventually swelling into a human flood as close to a million Hong Kong families steadily fled their homes. The best and the brightest of the middle class left for Australia, Canada, the UK and the United States. The less affluent acquired passports that suddenly became available—for a fee—from remote, obscure and poverty-stricken nations in Africa and the South Pacific.

 

Today, as the world marks the massacre’s 30th anniversary, post-Handover Hong Kong is ruled from Beijing and the city’s 7.6 million people still sit in the historical shadow of that slaughter.

 

China has never officially released a realistic death toll, and Hong Kong stopped asking long ago. Most of the local media is now controlled by Beijing, including the leading English-language newspaper, the South China Morning Post, owned by pro-Beijing billionaire Jack Ma, founder of Alibaba. Even public references to the Tiananmen Square massacre have been watered down, gradually giving way to the more politically acceptable “Tiananmen incident,” or merely a mumbled “Tiananmen…”

 

Self-censorship is now rampant in Hong Kong’s once-vibrant press, but sometimes more direct pressure is applied. In 2014, the editor of Ming Pao—a popular, leading liberal paper seen as not supportive enough of Beijing—was attacked on his way to work by several people wielding meat cleavers. He barely survived and will walk with a limp for the rest of his life. Some 13,000 people marched in the streets in protest, and young reporters carried signs reading “You can’t kill us all.”

 

 

Beijing is increasingly working to strip away Hong Kong’s freedoms, and Hong Kongers have become like the proverbial frog sitting in a pot of water on a stove as the heat is slowly increased. Beijing has called for Hong Kong’s high court judges to act not as independent magistrates but as “civil servants” for the government, acting on its behalf rather than that of justice. Two-thirds of the members of the respected 112-year-old Law Society of Hong Kong angrily protested. But the other third of the lawyers remained silent.

 

One poignant link still binds Hong Kong to the tragic events in Beijing all those years ago: the Goddess of Democracy statue. The original, modeled on America’s Statue of Liberty, was created by the Beijing students in just a few days out of fragile foam and papier-mâché on a metal frame. It stood 35 feet tall—high enough to be seen from any point in the vast 109-acre square. The People’s Liberation Army destroyed it on the night of June 4th.

 

Every year, on June 4th,  hundreds of thousands of Hong Kong people remember the Beijing students by gathering for a candle-lit memorial at the city's beloved Victoria Park.

 

In 2010, a full-scale bronze replica of the Goddess was created in Hong Kong to become part of those annual memorial services in Victoria Park, in the very heart of the city, where as many as 200,000 people gathered each year. After the 2010 gathering, however, the Hong Kong government decided to move the politically embarrassing statue to the Chinese University of Hong Kong’s campus for permanent display, far outside the city.

 

The hope being: Out of sight, out of mind. Just the way that Beijing wants it.

Remembering Murray Polner (1928-2019)

 

I knew Murray Polner, HNN’s book editor until May 2017, long before I began submitting reviews.  This was his second stint as my editor, the first being in the 1990s, when I became a contributing writer for PS: The Intelligent Guide to Jewish Affairs.  

 

PS was a newsletter he created with co-editor Adam Simms, as a postscript to his time as editor-in-chief of Present Tense, published by the American Jewish Committee from 1973 to 1990 as a liberal counterpoint to Commentary.  (That other AJC publication had evolved under Norman Podhoretz from a forum for liberals and moderate leftists to become the earliest exponent of a new intellectual and political current known as neoconservatism.)    

 

Working with Murray at PS, I got to know him and his wife Louise personally.  I recall visiting their home in Great Neck a couple of times, including for a vegan Passover Seder.  And I was privileged, over 20 years ago, to be included in a big bash at a Manhattan venue in honor of his 70th birthday.   

 

His career trajectory was astonishingly distinctive. He served in the US Naval Reserve from 1947 to 1952 and then in the US Army from 1953 until ’55, but he became disillusioned with the military and evolved into a pacifist. In that spirit, he worked with a local antiwar group to oppose the possible renewal of military conscription after the Carter administration reinstituted compulsory registration for young men. And he served as the editor of Fellowship, the organ of the pacifist Fellowship of Reconciliation, from 1991 to ’93.

 

In the late 1950s and early ‘60s, Murray taught at Thomas Jefferson High School in Brooklyn, and then in adjunct capacities at Brooklyn College, Queens College and Suffolk Community College.  He also served as executive assistant to Harvey Scribner, the first chancellor of New York’s public school system.     

After graduating from CCNY in 1950, he pursued graduate studies in the late 1960s, earning an MA in history from the University of Pennsylvania and a Ph.D. in Russian history from Union Institute and University in 1972. In the meantime, he published the first of his eight books, No Victory Parades: The Return of the Vietnam Veteran, in 1970. Then came a work on amnesty for draft resisters, When Can I Come Home? A Debate on Amnesty for Exiles, Antiwar Prisoners, and Others, in 1972.

The first of his books with a Jewish theme was Rabbi: The American Experience, published in 1977. It was followed by two anthologies: The Challenge of Shalom: The Jewish Tradition of Peace & Justice, co-edited with Naomi Goodman in 1994, and Peace, Justice, and Jews: Reclaiming Our Tradition, co-edited with Stefan Merken in 2007.

 

He linked his passion for baseball with his devotion to social justice in Branch Rickey: A Biography. Published in 1982, it told the story of the sports executive who broke the color line in Major League Baseball by bringing Jackie Robinson to the Brooklyn Dodgers in 1947.

 

Returning to the theme of pacifism, Murray felt the need to enlist a knowledgeable Catholic as co-author of Disarmed and Dangerous: The Radical Lives and Times of Daniel and Philip Berrigan, the militant anti-war priests, published in 2007.  His choice was Jim O’Grady, a biographer of the Catholic political radical, Dorothy Day, and a reporter for WNYC public radio.    

 

Murray’s pacifism drew admiration from the antiwar right as well as the left. In 2008, he co-edited We Who Dared to Say No to War: American Antiwar Writing from 1812 to Now with Thomas E. Woods, Jr., a libertarian. And from 2001 until 2015, Murray wrote numerous scathing opinion pieces on US foreign policy for the right-wing antiwar website LewRockwell.com, which mourned his passing immediately upon news of his death with a piece by Mr. Woods.

 

This is what Rick Shenkman, HNN’s founder, emailed to Murray’s son Rob upon learning of his passing: 

 

“Murray went back almost to the beginning of HNN nearly 20 years ago. I marveled at his productivity into his nineties and his subtle grasp of the key issues facing the country.  

 

“I knew he was slowing down when he asked to retire as HNN’s book editor, but then he surprised me by indicating he wanted to keep up the blog.  And so he did! 

 

“Murray played a big role at HNN and was instrumental in our success.  

 

“I admit I was always jealous because he made writing seem easy.  But when you looked deeply into his complex sentences you realized he was an old-fashioned wordsmith who worked over his paragraphs until they sang.”

 

And this is from Rob’s message to Rick:

 

“Just a few days before we lost him, he dictated to me a letter to the editor of the NY Times, asking why the editorial page had not seen fit to warn about a possible US war with Iran. That was my dad, a lion to the end, yet also as sweet as a pussycat.”

 

Murray is survived by Louise, his wife of over 68 years, their daughter Beth Polner Abrahams, their two sons, Rob and Alex, and six grandchildren.  May his memory long endure.  

D-Day 75 Years Later and the Quest for Peace

D-Day planning map, used at Southwick House

 

My father Vincent was wounded clearing the mines on Omaha Beach following the D-Day invasion 75 years ago. At the hospital next to him was one of the D-Day paratroopers who had been dropped into France to fight the Germans. When my father told the paratrooper about removing mines, his reply was "Gee, that's dangerous." My father could not believe that a daring paratrooper would consider that a risky job! Both my father and the paratrooper were lucky in the sense that they did not suffer the worst of injuries.

But so many others lost their lives in the invasion to retake Europe from the hold of Nazi Germany. The D-Day landings of June 6, 1944, and the invasion that followed brought about the end of the German war machine. D-Day led to freedom for millions who had been suffering under Nazi occupation.

It's important to remember D-Day because of what so many brave soldiers were sacrificing for: to build peace. As General Dwight Eisenhower told Walter Cronkite, Americans and the Allies came together “to storm these beaches for one purpose only. Not to gain anything for ourselves, not to fulfill any ambitions that America had for conquest, but just to preserve freedom. I think and hope, and pray, that humanity will have learned ... we must find some way ... to gain an eternal peace for this world.”

It is especially important to remember these ideals because so often today we hear leaders talking about war recklessly. When discussing a potential military campaign, some even say things like it would only take a matter of days, as if it were all so easy. Some leaders flaunt military might and spending to the extreme. It’s scary when our leaders seem to have no concept of what war is or of its human cost.

Not only are soldiers at risk in war, but civilians too. Part of the Allied invasion of Europe was civil affairs units bringing relief supplies to feed the hungry. War always leads to food shortages and hunger, and this relief had to continue for years across the continent.

We need our leaders to be thoughtful, like Eisenhower, about the critical issues of war and peace. Eisenhower, as president, avoided war. He was deeply concerned about too much military spending and sought arms control. Today, we need to pursue disarmament among the militaries of the world. It is a tragedy in itself when nations have to commit so many of their precious citizens and resources to war. My father remarked how his fellow soldier, Lou Siciliano, was a really educated man who could be doing so many other things to help society, "but this is what happens in war."

And for the families back home there is pain during war and in its aftermath, especially for those wounded. My father, after being hurt by pellets from a mine that exploded 50 feet away, was lucky that a letter he sent reached his mother before the War Department’s telegram. Her brother had been killed during World War One, and the trauma of that War Department telegram arriving first would have been extreme. My father had to live with pellets in his legs the rest of his life, but it did not cause him too much trouble except perhaps toward the end, when his mobility was extremely limited.

Many other families were not so fortunate after the D-Day invasion. Cemeteries in France mark the fallen soldiers, and their families are not alone in their grief. To this day many military families suffer that terrible news about loved ones lost. For some families of injured soldiers, care is needed for a lifetime. When you think of these families, it can reinforce the mission for peace.
American and Allied soldiers lost their lives on D-Day so that others might live free. The best way to honor the D-Day veterans’ sacrifice is to work for that elusive, but achievable, eternal world peace.

Remembering Rome's Liberation

Celebrations as Rome is liberated

 

 

Amid the justifiable hoopla this week surrounding the anniversary of the D-Day invasion, it is important to note, too, an event that unfolded just two days earlier, on June 4, 1944—the Allied liberation of Rome.

 

Despite Churchill's promises of quick victory in the “soft underbelly” of Hitler's Fortress Europe, the prize was hard-won indeed—Italy proved a “tough old gut,” in the words of the GIs' master chronicler Ernie Pyle. The Germans were intent on fighting for every inch of their southern flank.

 

It began in summer '43 with Sicily, a campaign famous for the race that developed between General George Patton and his British rival, Field Marshal Bernard Montgomery, to be first into Messina, the stepping-stone to the Italian mainland. Things got no easier after Italian dictator Benito Mussolini was arrested and Italy dropped out of the war. “All roads lead to Rome,” lamented theatre commander Sir Harold Alexander, “but all the roads are mined.” There were landings at Salerno, then chaotic Naples, where, as if on cue, looming old Vesuvius blew its top, giving the boys from Allen Park and Kalamazoo something to write about in their V-mail to the family back home.

 

Twin bloody stalemates followed over the fall and winter: at the swollen Rapido River in the Apennine Mountains south of Rome, near Cassino, and at Anzio, the seaside resort where Nero once fiddled as his eternal city burned. Stars and Stripes cartoonist Bill Mauldin captured the dilemma facing the Yanks, Tommies, Poles and other infantry units as they struggled to overcome fierce German resistance: “[The infantry] seemed to find itself generally looking up, not down, at the enemy.”

 

The breakthrough finally came in May of 1944, as advance troops pushed across the Alban Hills and then to the southern outskirts of Rome. Fearing another Stalingrad, Hitler agreed to allow his forces to pull back to positions 150 miles north, where the war would rage on for another full year. Fifth Army chief Mark Clark's jeep convoy got lost on the Via della Conciliazione near St. Peter's, until a priest from Detroit offered directions to the city center. “They didn't even let us have the headlines for one day,” Clark was heard to complain when the gaze of the press shifted, suddenly and overwhelmingly, to Normandy.

 

It was in any case a glorious moment of triumph, paid for by the sacrifice of the 7,860 mostly young men who lie today buried in the American Cemetery at Nettuno, Anzio's sister city. “My God, they bombed that, too!” a member of the 1st Special Service Force exclaimed as columns in olive drab marched past the Coliseum. Ex-pat Jane Scrivener recorded the sight as they moved down the posh Via Veneto, marking for Rome's citizens the end of months of brutal Nazi occupation:

 

They were dusty, battle-worn and unshaven, but they smiled and waved in response to the greetings of the crowd. They had roses in the muzzles of their rifles and miniature Italian flags, which had been thrown to them. They had roses stuck in the camouflage nets of their helmets and in their shirts.

 

“One has read these things in books, and accepted them as fiction,” Scrivener added, “never dreaming of witnessing them as we did today.” 

 

Depicting the Devil: How Propaganda Posters Portrayed Nazi Ideology

In 1925, a bellicose Adolf Hitler understood that he needed the power of mass persuasion to push his political ideology on the German people. Citing propaganda as an essential component of statecraft in Mein Kampf, he wrote that propaganda must “awaken the imagination of the public through an appeal to their feelings, in finding the appropriate psychological form that will arrest the attention and appeal to the hearts of the national masses.” In its early phases, the Nazi party largely depended on Hitler’s own oratorical gifts and stage presence to gather interest and support. This changed dramatically with the party’s rise to political prominence and Hitler’s partnership with chief propagandist Joseph Goebbels. 

 

Goebbels immediately went to work weaponizing German history and mythology. In 1933, the Nazi government set up the Ministry of Propaganda with Goebbels at its helm. The Ministry identified two primary threats that needed to be brought to public attention and eliminated: so-called internal and external enemies of Germany and the German people. Jews, communists, Roma, homosexuals, religious groups such as Jehovah's Witnesses, and other minority groups were labeled as subversives: domestic enemies who were working actively against the country and its success. The perceived injustices of the Treaty of Versailles and the myth of the “stab in the back” were popular rallying cries and served nationalists as supposed historical proof. Propaganda was wielded like a weapon, painted in broad strokes and used for both domestic suppression and international aggression.

 

Initially, radio, newspapers, and even films such as Leni Riefenstahl’s Triumph of the Will (1935) were popular vectors of transmission, but over time, less expensive visual means were also employed. As in other parts of the world, the propaganda poster became a popular and easily mass-produced misinformation tool. The striking nature of posters, coupled with psychological messaging, created an emotional response in viewers. The fact that they were unavoidable, plastered across public spaces, was also a major convenience for Goebbels and the Nazi propaganda machine. 

 

One of the most important themes found in Nazi propaganda posters, and in Nazi propaganda in general, was the adoration of Adolf Hitler. Der Führer (the leader) was likened to a god-king and a messianic human representation of the will of Germany. Nazi propagandists created numerous artifacts that were not only on public display but were also pushed into the private realm of Germans, who were encouraged to keep images of Hitler in their homes. The phrase “Ein Volk, ein Reich, ein Führer” (One people, one empire, one leader) was embedded in the minds of the people and pushed across various media to cement the idea that the party was in complete charge of the German state and its people.

 

 

Obsessed with imagery of a national community, the Nazis employed artists to create emotionally stirring and provocative pieces of art. To strengthen the idea of domestic enemies, the vilification of Jews became one of the most common themes in Nazi propaganda posters. Jews were often depicted as bloodthirsty demons feeding on pure-blooded Germans, as Bolshevik infiltrators, as a plague upon the land, or as puppet masters secretly controlling the levers of power across national borders, especially in the economic sphere. This vile dehumanization used various motifs to encapsulate Nazi ideology.

 

Many posters focused less on vilification and more on the glorification of the Aryan body and the idea of a pure-blooded society. Aryan superiority was showcased through triumphant imagery of unrealistically strong and perfect male and female bodies. Often, so-called subhuman classes of people were drawn alongside to highlight their supposed natural inferiority to the Aryan. Jews were further racialized in this way with caricature-style depictions such as overly large noses and claws instead of hands. The Nazi obsession with the family and the traditional roles of men and women was also intertwined with depictions of what constituted the perfect German individual and, in turn, the German population as a whole. This was closely linked to the Nazi ideology of a Volksgemeinschaft (national community) which transcended class and religious differences to create a sense of racial camaraderie and national pride. 

 

 

Nazi propagandists naturally needed to sell their military aggression to the civilians at home. Weary and exhausted from World War I, the German public was in no mood for the triumphalism of war on the eve of the invasion of Poland in 1939. Propagandists turned to defining wars of aggression as “self-defense” and as territorial acquisition that was completely necessary and justified for the preservation of the Aryan race. Upon reaching the age of 18, boys were required to perform military service or join the Reich Labor Service. Recruitment posters claimed that military service was for “freedom and life.”

 

Propaganda extended into other realms of everyday life as well, including education. Students were routinely encouraged to become the Führer’s “little propagandists” at every turn. Educators were expected to join the National Socialist Teachers League, and by 1936 close to 97% had done so, one of the highest percentages in any profession in the country. Boys and girls between the ages of 10 and 17 were expected to join the Hitler Youth. Originally established as a youth training program to prepare young men to become part of the Sturmabteilung (SA), it morphed into a mandatory after-school program to mold children to be faithful to the Nazi party and its leadership. A popular poster from the time reads: “Leader – all 10-year-olds into the Hitler Youth.”

 

 

A recurring problem that plagued the Nazi leadership and its attempt at total control of the media in Germany was foreign broadcasts. Artists tackled this issue by portraying listeners to broadcasts from London, New York, Moscow and other ‘hostile’ locations as traitors. Drawing on guilt and the idea of racial and national disloyalty, the Nazis used psychological manipulation to set their agenda and attempt to eliminate any threat to their control of the media landscape. 

 

Nazi propagandists targeted virtually every segment of German society, from the individual to the overarching state, from the inner family sanctum to international policies of aggression and Aryanism. Nazi propaganda attempted to normalize the dehumanization of entire groups of people deemed unworthy under the strict racist policies implemented at the national level. The poster became a cheap transmitter of these various messages, combining the visual arts with psychological methods to incessantly repeat Nazi ideology to the German public. 

Witch Hunt! How Europe’s Witch Mania Came to the New World

 

When I decided to write Poison in the Colony, a historical novel about the Jamestown colony, I was up against a problem. It was 2014 and young readers had made their desires clear: if a book didn’t contain a fantasy world, vampires, or at least a few wizards, they did not want to read it. I pondered how I might incorporate the other-worldly aspects that young readers wanted in a novel about real people and real historical events. I decided to blend in a believable and historically accurate bit of magical realism—my main character would have the gift of the “second sight,” i.e. strong intuitive powers.

 

I figured that a young person with a strong sense of intuition—she would know things she wasn’t supposed to know, she would have small peeks into the future—might well be accused of witchcraft. Even better, I thought. The plot thickens!

 

The fear of witches was at fever pitch in Britain in the early 1600s, and that fear had traveled across the ocean with the colonists. Witchcraft was condemned by the church, and at the time church doctrine taught that witches had willingly entered into a pact with the devil. In Europe, between 1500 and 1660, an estimated 80,000 people were put to death for allegedly practicing witchcraft. Eight out of ten of them were women. Death was normally meted out through strangling and burning at the stake, beheading, or hanging. 

 

I started to research the years the book would cover, 1613 to 1622, and the family of Anne Burras and John Laydon and their daughter Virginia. Virginia was to be my main character, blessed (or cursed) with the second sight. In my research, I stumbled upon two stories that both shocked me and let me know that I was on the right track.

 

Anne Burras and Jane Wright, a left-handed midwife, were assigned to make shirts for the colony. The thread they were provided turned out to be rotten, and so the shirts were not made properly. They were accused of stealing from the colony. This was during the period of Martial Law, known for brutal punishments. For this transgression the two women were whipped severely. Anne was pregnant at the time and miscarried. Were Anne and Jane purposely given bad thread? Were they framed for the crime of stealing from the colony? Midwives and left-handed people were vulnerable to accusations of being in consort with the devil. Did someone accuse these women of witchcraft and was this their punishment? 

 

My research led me to another story: the first witch trial ever recorded in the New World. It happened in 1626 and took place in the Jamestown settlement in Virginia. The accused witch? This same Jane Wright. I felt as though my hunches, and my characterization of Virginia, might well be grounded in historical reality.

 

Witch hunts in the New World began in Virginia, though Virginia’s alleged witches were imprisoned rather than executed. The first American witch executions occurred in Windsor, Connecticut in 1647 (46 accused, 11 executed). The mania reached a peak during the famous Salem, Massachusetts witch trials in 1692: over 200 people were accused, 150 were arrested, 20 were executed and another 4 died in jail awaiting trial.

 

These hapless women and men were accused of causing a variety of problems: storms that wrecked ships, the death of humans and livestock, illness, and crop failure. Some “witches” were even accused of entering people’s rooms at night in apparition form and biting, scratching, and/or trying to strangle them. Any unexplained difficulty or calamity, or, it seems, strange dreams, could be blamed on a witch.

 

Social class figured prominently in these matters, with those of high status more often having the authority and credibility to do the accusing, and the poor most often bearing the brunt of the accusations. Acting outside the norm in any way, even by simply being left handed, was suspect, as was knowing how to use herbs to heal, being a midwife, or, as a woman, being unmarried. Sometimes the alleged witch had quarreled with neighbors over land boundaries or given birth to a child out of wedlock. Often, men who were accused of sorcery were those who had tried to defend their wives or other female family members when they were accused.

 

There does seem to be some truth to the accusations that these “witches” correctly predicted events and deaths, making it appear as though something was going on beyond the imaginations of the accusers. When Jane Wright was brought to trial, the court records state that she had correctly predicted the death of several colonists and used witchcraft to kill a newborn as well as destroy crops and livestock. Knowing when a death will happen and predicting it, and actually causing the death, are of course two very different things, but this conflation certainly contributed to the belief in sorcery.

 

In Poison in the Colony, I wrote about intuitive abilities that were passed down through women. Virginia’s grandmother, a character I invented, also had the gift of the second sight. She was convicted of sorcery, strangled, and burned at the stake in England, making it all the more chilling for Virginia to know the specifics of what could befall her. 

 

This belief that being in consort with the devil was passed down from mother to daughter has its roots in history. As part of the finger-pointing in Salem, Massachusetts, the four-year-old daughter of Sarah Good, one of the accused witches, was also arrested and put into prison. The child lingered in prison for between six and eight months, and by the time she was released she was so traumatized that she was never able to take care of herself. 

 

When her mother, Sarah Good, stood on the platform ready to be hanged, the Reverend Nicholas Noyes urged her to confess to being a witch. Good replied, “You are a liar. I am no more a witch than you are a wizard, and if you take away my life God will give you blood to drink!” 

 

In 1717, Reverend Nicholas Noyes died of an internal hemorrhage, choking on his own blood.

Donald Trump, the Humanities, and the Decline of American Values

The main reading room of the Library of Congress

 

During April 2019, several pieces appeared on the HNN website dealing with the decreasing interest in the humanities, including history. One of them was entitled “US declining interest in history presents risk to democracy.” Commenting on President Trump’s poor knowledge of history, it observed that he “is a fitting leader for such times.” Another article, abridged from The New York Times, was “Is the U.S. a Democracy? A Social Studies Battle Turns on the Nation’s Values.” These essays stirred me to ask, “What is the connection, if any, between President Trump, the decline of the humanities, and U.S. values?”

 

Let’s begin with American values. While any generalizations present difficulties, they can at least help us get closer to important truths. A valuable indicator of American values, first published in 1950, is historian Henry Steele Commager’s The American Mind.  Regarding “the nineteenth-century American,” he wrote, “Often romantic about business, the American was practical about politics, religion, culture, and science.” In the next several pages, Commager also generalizes that the average American’s culture “was material”; there “was a quantitative cast to his thinking”; “theories and speculations” disturbed him, and “he avoided abstruse philosophies of government or conduct”; his “attitude toward culture was at once suspicious and indulgent,” and he expected it (and religion) to “serve some useful purpose”; and “he expected education to prepare for life — by which he meant, increasingly, jobs and professions.” “Nowhere else,” the historian noted, “were intellectuals held in such contempt or relegated to so inferior a position.” 

 

A dozen years after the publication of Commager’s book, Richard Hofstadter’s Anti-Intellectualism in American Life (1962) appeared. Over a year ago, I discussed that historian’s insights as they applied to present-day U.S. culture and President Trump. Hofstadter noted that “the first truly powerful and widespread impulse to anti-intellectualism” arose during the Jackson era. This anti-intellectualism was common among evangelicals, and it was reflected in the Horatio Alger rags-to-riches myth, the increasing emphasis on vocational training, the popularity of self-help gurus like Norman Vincent Peale, and the strong impact of McCarthyism in the early 1950s. 

 

I then indicated how all these points were connected to Trump, that he “epitomizes the anti-intellectual strain in American culture,” and that he has never “evidenced any interest in the humanities or liberal arts. Literature, history, philosophy, the arts, and any interest in foreign cultures have remained alien to him.”

 

 

At the end of Commager’s book he asked a number of questions about the future. What would U. S. education educate people about? How would Americans use their increasing leisure?  Increasingly abandoning “traditional moral codes,” would “they formulate new ones as effective as those they were preparing to abandon?” “Would they preserve themselves from corruption and decadence?” “Could they preserve their pragmatism from vulgarization?”

 

In the seven decades that have passed since the publication of The American Mind, the answers we have provided to these questions regarding education, leisure, morality, corruption, decadence, and vulgarization have been more negative than positive. 

 

In 1985, Neil Postman wrote in Amusing Ourselves to Death, “Our politics, religion, news, athletics, education and commerce have been transformed into congenial adjuncts of show business, largely without protest or even much popular notice. The result is that we are a people on the verge of amusing ourselves to death.” In 1993, Zbigniew Brzezinski, former National Security Adviser to U.S. President Jimmy Carter, stated that television had “become prominent in shaping the [U.S.] national culture and its basic beliefs,” that it “had a particularly important effect in disrupting generational continuity in the transfer of traditions and values,” and that it helped produce “a mass culture, driven by profiteers who exploit the hunger for vulgarity, pornography, and even barbarism.” By 2005, Postman’s son Andrew noted that entertainment had broadened considerably, now including the Internet, cell phones, and iPods. By 2018, there was further broadening, and historian Jill Lepore wrote that “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. Social media “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”

 

How appropriate, then, that in 2016 we elected a TV celebrity (star of The Apprentice) who, in the words of historian Niall Ferguson, “is the incarnation of the spirit of our age. His tweets–hasty, crude and error-strewn–are just one symptom of a more general decline in civility that social media have encouraged.”

 

If Trump tells us something ugly about ourselves, what does that have to do with the present state of the humanities? In a 2018 essay, “Why Trump's Crassness Matters,” I indicated that “Trump’s crassness and lack of aesthetic appreciation reinforces an unfortunate tendency in our national character—to undervalue beauty.” The Frenchman Alexis de Tocqueville observed this already in the early nineteenth century, noting that we tended to “cultivate the arts which serve to render life easy, in preference to those whose object is to adorn it. . . . [We] will habitually prefer the useful to the beautiful, and . . . . will require that the beautiful should be useful.” 

 

At times, some of our leaders have demonstrated an appreciation of beauty. Historian Douglas Brinkley has written long books on both of our Roosevelt presidents’ appreciation of nature’s beauties, and John Kennedy once said, “I look forward to an America which will not be afraid of grace and beauty, which will protect the beauty of our natural environment, which will preserve the great old American houses and squares and parks of our national past, and which will build handsome and balanced cities for our future.”

 

Unfortunately, however, Donald Trump’s philistinism and disrespect for our environment are all too common, as is his lack of aesthetic appreciation—note his constant budget proposals to kill the National Endowments for the Arts and Humanities. Does not the fact that fewer university students are selecting courses in history and the other humanities and arts reflect some of the same reasons we elected our liar-in-chief Donald Trump? We overvalue such things as making money, “getting ahead,” glitz, and celebrity status and undervalue what the humanities and arts emphasize—beauty, truth, and goodness.  

 

A month after Trump’s election I wrote that he reflected the “ugly side of American life.” A comparison that didn’t occur to me then, but does now, is that our culture is like the popular understanding of a Robert Louis Stevenson character—Dr. Jekyll and his alternative personality, Mr. Hyde. Trump is the diabolical Mr. Hyde of our national personality. We also have a good Dr. Jekyll side, represented by such individuals as Carl Sandburg, Dorothy Day, and Martin Luther King, Jr. Like the Jekyll/Hyde multiple personality, the two sides are battling for our soul.  

The Bund Was Far from Perfect. It Still Matters to Jewish History

A Bundist demonstration, 1917

 

 

From 25 to 27 September 1897, thirteen activists from various Jewish radical organizations in the Russian Empire met in Vilna (now Vilnius, Lithuania). The date was no accident: it coincided with the Jewish holiday of Yom Kippur, a day when many Jews were traveling to be with family, and a day when these activists could travel without arousing the authorities’ suspicion. The need for secrecy overrode all other concerns. 

 

At no point were all of the activists in the room at the same time, and no official minutes of the meeting were kept. But the party they founded, the General Jewish Labor Bund, would play a leading role in Jewish politics for the next 50 years. Staunchly secular and socialist, hostile towards Zionism but fiercely committed to the Jewish community, the Bund insisted that if the revolution liberated Jewish workers as workers, but allowed them to suffer continued persecution as Jews, it would not have liberated them. Rather, the party dreamed of a socialist federation of nations, including Jews; autonomous in cultural matters but politically and economically united.

 

The Bund occupies something of a paradoxical place in Jewish history. Judged by its own criteria, the Bund failed. Largely destroyed in the Holocaust (small chapters survive today), the closest it came to its goal of Jewish autonomy within a socialist federation of nations was the Soviet Union, an experiment few would consider a successful resolution of the Jewish Question. And yet, despite this inability to realize its goals, the Bund played an outsized role in the lives of Eastern European Jews. When the Jews of Russia and Poland suffered from popular and state-sponsored antisemitism and economic displacement, the Bund gained the admiration of many—including many of its staunchest opponents—for its role in organizing workers and defending Jewish communities during pogroms. The Bund played a central role in cultural matters too, embracing Yiddish, a language often derided as a folksy jargon, as one worthy of serious discourse. We cannot easily dismiss this party from Jewish history.

 

After slowly fading from Jewish communal consciousness following the Bund’s near-destruction in the Holocaust, the Yiddish socialism the Bund espoused is today undergoing a revival of interest, especially among young, left-leaning Jews. In part, this is spurred by a declining interest in Zionism, itself the result of Zionism’s conflation, rightly or wrongly, with the politics of Benjamin Netanyahu, politics opposed by many American Jews. Combined with the rising antisemitism on both the right and left wings in the US that has robbed American Jews of a domestic political home, the Bund has emerged as an important symbol. Articles about the Bund specifically and Yiddish socialism in general have appeared in the New York Times, the New York Review of Books, the Jewish Daily Forward, Jacobin, and elsewhere, with many of these going viral. Jewish Currents, a left-wing Jewish magazine founded in 1946, has successfully relaunched in pursuit of younger Jews. Organizations such as The Jewish Worker, Jewdas, and Jewish Solidarity have joined with Jewish Currents in claiming the Bund’s legacy while growing the conversation about the Bund on Facebook and Twitter. Coinciding with a moment when campaigns by Elizabeth Warren, Bernie Sanders, Alexandria Ocasio-Cortez, and others have brought socialist ideas back into the mainstream, young Jews have found the idea of a proudly Jewish form of progressive politics that embraces cultural specificity while rejecting particularism attractive indeed.

 

Ironically, the Bund’s history of failure only adds to its mystique. Virtue is easy when one lacks the power to act, and the Bund was powerless for most of its existence. Its program existed only in the world of “what could be.” While the Zionists, like so many national liberation movements before and after, disappointed in power, the Bund’s dreams, perpetually deferred, lived on as potent symbols. The party is easily reimagined as a kinder, purer alternative for Jewish politics, representing everything Zionism was not: while the latter was masculine and national, the Bund is imagined as egalitarian and cosmopolitan in spirit. 

 

However, the lionization of the Bund depends in large part on the Bund’s historical powerlessness. This is problematic. Hannah Arendt once noted that beauty and humanity are luxuries afforded only the oppressed, luxuries that “have never survived the hour of liberation by even five minutes.” The Bund did not prove Arendt wrong. The Bund did experience one moment of power, during the Russian Revolution. Despite claims to the contrary by many antisemites, Bundists did not initiate the Red Terror (1917-1922). They did, however, participate. Spurned at the ballot box by the Jewish masses in favor of their Zionist archrivals, the Bund swiftly learned the value of being able to arrest its rivals on political charges. Dissenting party members suffered as well. Sara Foks, a seamstress from Kiev and one-time rising star of the Bund in Ukraine who opposed Soviet rule, was arrested and interrogated repeatedly by one-time comrades now wearing the uniform of the Cheka until, on July 24, 1919, she jumped off a bridge into the Dnieper River. Others were simply executed.

 

None of this is to say that Bundists were evil, but that they were human. Like all movements, the Bund reflected the environment from which it emerged, and late Imperial Russia was as harsh an environment as one can imagine. Its actions were driven by a desperate conviction that the future of the Jewish people depended on the successful realization of its program. This was at a time when the Jewish future was very much in doubt. The debate as to whether the Bundists were angels or demons misses the point: they were human, a status they had to fight time and time again to defend. 

 

The importance of the Bund lies not in whether it succeeded or failed, or in whether it provided a kinder path for Jewish politics than Zionism. What does matter is what the Bund represented during the half-century it contended as a major force in Jewish politics: leading strikes, organizing defense against pogroms, and advancing new ideas in Jewish political life. The Bund embodied the aspirations and identity of millions of Jews for five decades and provided serious answers to the questions Jews faced then and now—even its mistakes are valuable lessons, warnings to good people from across the political spectrum who are convinced with absolute certainty that they are right. 

 

Moreover, the Bund represents a model for diaspora existence that should prove inspiring to Jewish communities around the world, pioneering the idea of meaningful Jewish existence beyond Zion. It offered a political language deeply committed to the Jewish community along with an equally uncompromising commitment to the values of freedom, justice, and societal fairness. It is in this legacy that the Bund remains essential for Jews today, a legacy at once more difficult and more helpful than the callous erasure of our past or a rose-tinted nostalgia for lost causes. 

A Brief History of the Theory Trump and Barr Use to Resist Congressional Oversight

 

The unitary theory of the presidency may be reaching its logical conclusion under President Donald J. Trump. That theory, which is referred to as the unitary executive, holds that presidents have broad, close to unlimited, powers over the executive branch. At its extreme, the theory holds that the president cannot be checked “by Congress or the Courts, especially in critical realms of authority,” as John P. MacKenzie wrote in his book Absolute Power

 

The Unitary Executive, as put forward by Attorney General Barr, holds that presidential power over executive branch functions can only be limited by the voters at the next election, or by Congress through its impeachment power. This was essentially the position Barr took in his June 8, 2018 memo to the Justice Department. “Thus, under the Framer’s plan, the determination whether the President is making decisions based on ‘improper’ motives or whether he is ‘faithfully’ discharging his responsibilities is left to the People, through the election process, and the Congress, through the Impeachment process,” Barr wrote. Although Barr does not say it, a president who acted in an improper or faithless way, but who is reelected or who escapes impeachment, could indeed be above the law. Is this really what the Framers intended?

 

It is first important to recognize that the words “unitary executive” do not appear anywhere in the Constitution, although supporters of the theory claim to be originalists. The first known use of the term occurred during the Reagan Administration, when Attorney General Meese first put the theory forward. It was later used to justify much of President George W. Bush’s War on Terror, including extreme measures like torture in the post-9/11 world. Yet even Assistant Attorney General John Yoo, who advanced the theory during the Bush years by writing the infamous memo enabling the torture of terrorists, recently said in an interview with NPR that “the Constitution grants him [the president] a reservoir of executive power that’s not specifically set out in the Constitution.”

 

What Article II of the Constitution does provide is a broad statement that “the executive Power shall be vested in a President of the United States of America.” Alexander Hamilton, perhaps the foremost defender of presidential power, wrote in Federalist No. 70 that “energy in the executive is a leading character in the definition of good government.” Hamilton in part equated energy with unity and believed the presidency should be occupied by one person who could act decisively. The Constitutional Convention, which met in Philadelphia over the summer of 1787 and in which Hamilton had participated, debated and then rejected an executive council. But it was not a decision that was reached lightly, and numerous members of the Convention feared that a single executive could begin to resemble the British monarch. 

 

Those who feared a strong executive were influenced by the experiences of the colonists in the 1760s and 1770s during the buildup to the eventual break with Great Britain. During that time, royal governors, appointed by the King, had often dissolved local colonial assemblies when they disagreed with their decisions and regularly vetoed bills. The opponents of a strong executive now feared a return to monarchy, which they had fought to overturn during the Revolutionary War. Their concerns, which focused largely on the concentration of power in the hands of one individual, had led to the weakening of executive power at the state level in the constitutions approved immediately following the Declaration of Independence. 

 

Yet the lack of a strong executive had led to numerous problems, both during the Revolutionary War and during the years the new republic was governed by the Articles of Confederation. The Convention finally settled on a single executive, but that decision was affected by the presence of Washington at the Convention. Franklin, who opposed a single executive and preferred some form of an executive council, seemed to allude to this when he said, “The first man put at the helm will be a good one. Nobody knows what sort may come afterwards.” Pierce Butler of South Carolina wrote in a 1788 letter that “many of the members cast their eyes towards General Washington as President and shaped their ideas of the powers to be given to a President by their opinions of his virtue.” It was clear that most members of the Convention, although concerned about placing too much power in the hands of any one man, were willing to place much more power in the new office of president because of their great respect for Washington. One historian has argued that “had Washington been absent, it is entirely possible that the framers of the Constitution would have created a multiple executive,” or at least an office that the legislature would select.

 

We need a balanced approach to our governmental institutions, just as the Framers intended. An energetic head of state is certainly part of this formula. As the political scientist Judith Best has observed, “the ship of state cannot do without the pilot who sets the course, who knows where the shoals and reefs lie, and who can direct all hands.” What we do not need is an Imperial President, in Arthur Schlesinger Jr.’s words. Presidential overreach is especially dangerous when the ship of state is being guided by a man who lacks Washington’s sense of virtue. Even Hamilton feared an overly powerful executive and thought “the executive power is more easily confined when it is one,” since it is easier to find misconduct when one person bears responsibility for the office of the presidency. 

 

It is not unusual for Congress and the President to butt heads. All presidents chafe at oversight by the legislative branch, which can sometimes be overbearing. Madison fully expected this, writing in Federalist No. 51 that “ambition must be made to counteract ambition. The interest of the man must be connected with the constitutional rights of the place.” Out of these conflicts each branch would, it was hoped, remain within its orbit. 

 

Yet we have also learned that the branches of government must find ways to work together with a certain degree of mutual forbearance. A good example occurred early in our history during the Jefferson Administration. During the treason trial of Aaron Burr, Chief Justice John Marshall had a subpoena served on President Jefferson to produce documents. “The English principle that the King could do no wrong, Marshall said, did not apply to the United States where the President…was subject to the same law as every other American,” as Schlesinger has written. But Marshall did not fully press his authority, and Jefferson was not required to appear in court. Jefferson’s view was that Marshall wanted him to “abandon superior duties” to inferior ones. “Both men were surely correct,” according to Schlesinger, and in the future courts would try to strike a balance between enforcing the law equally upon everyone and recognizing the official duties a president must fulfill. 

 

The concept of Congressional oversight over the executive branch is a long-established precedent in the United States, a practice that traces back to our British roots. As early as 1792, the House established a special committee to investigate certain executive branch actions, and Madison and four members of the Constitutional Convention voted for the inquiry, indicating they thought this was a core function of the Congress. In a 1927 Supreme Court decision, the Court found that “the power of the Congress to conduct investigations is inherent in the legislative process [and] that power is broad.” It has often been the Supreme Court that has required presidents who overstep their bounds to comply with Congressional mandates. When Richard Nixon refused to turn over his tapes during the Watergate crisis, the Supreme Court ordered him to do so, leading to his eventual resignation from office. 

 

The Supreme Court has in fact ruled twice on the unitary executive theory, and both times rejected the concept. In Morrison v. Olson, decided in 1988, the Court majority decided that the special counsel statute did not violate the separation of powers. Justice Scalia, alone among the justices, issued a scathing dissent largely along the lines of the theory of the unitary executive. “Morrison shattered the claim that the vesting of ‘the executive power’ in a president under Article II of the Constitution created a hermetic unit free from the checks and balances apart from the community,” MacKenzie wrote in Absolute Power. In 2006, the Supreme Court again issued a stinging rebuke to executive overreach in Hamdan v. Rumsfeld, a case that dealt with the use of military commissions to try terrorists at Guantanamo Bay. As Justice Breyer wrote for the majority, “The Court’s conclusion ultimately rests upon a single ground: Congress has not issued the Executive a ‘blank check’ to create military commissions,” and told the Bush Administration that they should seek Congressional approval, which they ultimately received.

 

Not every adherent of the unitary executive theory accepts that the president has absolute power. Steven Calabresi, a major supporter of the theory, has written that “there are some people who believe that the President has the prerogative powers of King George III in foreign and domestic policy,” but that he does not “fall into that category.” Still, others have embraced the theory, or at least the concept, of unfettered presidential power. Richard Nixon, whose presidency predated the use of the term, once told David Frost in the aftermath of Watergate that “when the President does it that means it’s not illegal.” Dick Cheney, while a member of the House in 1987, was even more blunt when he dissented from the majority report on the Iran-Contra affair: “The Chief Executive will on occasion feel duty bound to assert monarchical notions of prerogative that will permit him to exceed the laws.” 

 

What is so shocking today is President Trump’s absolute refusal to comply with Congressional requests for information and testimony from some of his top aides regarding the recently released Mueller Report. One must wonder what advice he is receiving from his Attorney General, and whether Barr’s support of the unitary executive affects such advice. Lawyers for Donald Trump seem to adhere to the more extreme version of the unitary executive theory. In a letter dated May 15, 2019 to Chairman Jerrold Nadler of the House Judiciary Committee, Trump’s legal counsel questioned whether the Committee’s inquiry was designed to “further a legitimate legislative purpose” or was designed to harass and embarrass “political opponents.” Nadler’s response went to the heart of the matter, noting that first the Justice Department said it “cannot indict” a sitting president and “now it adds the extreme claim that Congress cannot act either…this flies in the face of the American idea that no one is above the law, and I reject it.” That claim is also inconsistent with the Mueller Report, which found that “Congress may apply the obstruction laws to the President’s corrupt exercise of powers” since it “accords with our constitutional system of checks and balances and the principle that no person is above the law.” In the meantime, lower courts have begun to act, requiring that the President comply with certain demands for information.    

 

Part of the tragedy of recent events is that William Barr came into the job of Attorney General with a solid reputation. It now appears to many of us that he has decided to protect Donald Trump at all costs, and not the office of the presidency, as he claims. The unitary executive theory that Barr supports is a dangerous doctrine when applied in the most extreme manner. Now it has been put at the service of a man with clear autocratic tendencies who knows no limits and respects no norms, a man who wants to use the power of the presidency to punish his enemies. If Barr really wants to save the presidency, he might start by rethinking his support for unlimited presidential power under the guise of the unitary executive. Otherwise he may leave the House of Representatives with little choice but to open an impeachment inquiry in order to do its job. But then perhaps that is what his boss really wants.  

https://historynewsnetwork.org/article/172098
The Occupation of The Atlantic Mind

 

 

It should come as no shock that in the latest issue of The Atlantic (May 14, 2019), Benny Morris, an Israeli historian turned propagandist, attacks Rep. Rashida Tlaib (D-MI), the first Palestinian-American woman in Congress and a Muslim. Morris’s implacable hostility to Arabs—as Atlantic editor-in-chief Jeffrey Goldberg once put it—“sometimes leads him to inflammatory conclusions,” including Morris’s lament that Israel’s founder David Ben-Gurion failed to carry out a “full expulsion—rather than a partial one,” which could have “stabilized the State of Israel for generations.” The professor’s article is titled “Rashida Tlaib Has Her History Wrong.” Morris proceeds to show that he is indeed an expert at getting the history wrong.

 

Morris condemns Tlaib for pointing out that European Jews escaped anti-Semitism by occupying her family’s homeland. He continues with a hoary account ascribing to Palestinians “direct” responsibility for the Nazi genocide, citing the anti-Semitism of the Islamic leader of the era and the efforts to impede Jewish migration into Palestine. Despite local opposition, thousands of European Jews flooded into Palestine in the 1930s, precipitating the 1936-39 Arab revolt, which Morris blames on the Palestinians rather than the European migrants.

 

In Morris’s unapologetic and at least borderline racist reading of the past, “most Palestinians still hope for Israel’s disappearance”—this offered with no supporting evidence—whereas “the Zionist side over the decades has repeatedly agreed to a compromise based on partitioning Palestine into two states” only to have “the Arab side” reject all proposals. So, there you have it—peace-loving Israel always making generous offers and the Allah-worshipping fanatics always turning them down in deference to their murderous plots to destroy the Jewish state. History made simple--and loaded with Zionist apologetics.

 

For the record, it is a historical fact that since the June 1967 war Israel has repeatedly rejected opportunities to trade land for peace and has instead pursued colonial domination of the West Bank, the Golan Heights, Egyptian territory, and the Gaza Strip, the latter two being territories it did belatedly relinquish. Until the 1990s Israel vigorously opposed any discussion of, and refused to negotiate toward, the creation of any sort of “Palestinian entity.”

 

Morris understands that an effective propagandist must offer a counter-argument or two, if only deftly to dismiss them. Thus, he allows, it is true that since 1967 “the Israeli side has oppressed the Palestinian inhabitants and denied them various civil rights,” but sad to say, “such is the nature of military occupation.” (Imagine Morris, or anyone else, writing, “Jews were beaten in the streets and had their shops closed, but such is the nature of anti-Semitism.”)

 

What has happened in Palestine is not a “military occupation”; it is rather a settler colonization, one that, as it pertains to the West Bank and East Jerusalem, is illegal and thus illegitimate, as well as being an ongoing and highly destabilizing human rights atrocity. Jewish settlements are scarcely mentioned by Morris, for the obvious reason that they are illegal and had already rendered a two-state solution unworkable by 2000, the time of the mythical “generous offer” at Camp David, which Morris the propagandist resurrects as the best example of unregenerate Palestinian hostility to peace. For the record, a scholarly consensus holds that the offer of a bisected, non-contiguous state replete with Jewish-only roads, checkpoints, and ultimate Israeli control of state security represented no real opportunity for a viable independent Palestinian state, let alone a “generous offer.”

 

Sadly, at one time, as Norman Finkelstein has pointed out on several occasions, Benny Morris was an accomplished historian and in fact played a key role in a much-needed post-Zionist scholarship in Israel. His research revealed that the ethnic cleansing of 1948 had been deliberate. Only later did he become a cheerleader for it. Unlike Ilan Pappe, Avi Shlaim, Nurit Peled, and many other Israeli historians who have their integrity intact, Morris, as Finkelstein accurately charges, became a court historian--a propagandist for the State of Israel.

 

The problem isn’t just Morris, however—it’s the Atlantic too. Founded as the Atlantic Monthly in Boston in 1857, The Atlantic is a venerable American publication with a distinguished record of literary and cultural criticism and political reportage. Today the magazine is distinguished by the intensity of its Zionist distortions of the past and present of the Palestine conflict.

 

The Atlantic’s palpable pro-Israeli bias should come as no surprise, as its editor-in-chief since 2016 is Jeffrey Goldberg, a citizen of both Israel, which he served as an IDF prison guard, and the United States. Goldberg is a staunch Zionist and is quick to equate criticism of the Israel lobby with anti-Semitism, as he did in a notorious review in The New Republic (October 8, 2007) in which he directly linked the book by the distinguished political scientists John Mearsheimer and Stephen Walt, The Israel Lobby and U.S. Foreign Policy (2007), with Osama bin Laden’s brand of virulent anti-Semitism.

 

The brilliant 2016 documentary film “The Occupation of the American Mind” identifies just such an occupation at the root of misperceptions, imbalanced reporting, and outright disinformation on the Israel-Palestine issue. The case of The Atlantic and of Benny Morris reminds us of the pernicious and monolithic nature of Zionist discourse and of this ongoing “occupation.” It places the noble professions of journalism and history in the service of a crude propaganda regime that seeks to perpetuate the occupation of the American mind.

https://historynewsnetwork.org/article/172095
What I’m Reading: An Interview With Historian Mark Weisenmiller

 

 

Born in Pittsburgh in 1963 (four days after the assassination of President John F. Kennedy), Mark Weisenmiller graduated from the Pennsylvania State University in 1985. He is an author-historian-reporter. Previous employers include United Press International (UPI); Deutsche Presse Agentur (DPA); Agence France Presse (AFP); Inter Press Service (IPS); the Beijing-based international news wire service known as the Xinhua News Agency (XNA); and The Economist.

 

Regarding his history-themed writing, he has had articles published in the Canada-based “History Magazine”; the London-based “History Today”; “America In WWII”--and many articles for History News Network. He has written articles about the following, which will be published in future issues of “History Magazine”: a profile of Ivan The Terrible; a report on the Gang of Four Trial in China in 1980; a story about the famous 1967 U.S. Supreme Court case of “Loving v. Virginia,” and a story about the Grand Ole Opry.

 

When D-Day, 75th Anniversary: A Millennial’s Guide (to which he contributed two chapters) is published in 2019, it will be the fifth non-fiction book that he has either contributed to or written solely. These books have been about, respectively, ice hockey; capital punishment; a famous American newscaster; and the cultures, current events, leaders, news, and politics of 15 North African, Middle Eastern, and Southern European countries.

 

Divorced, he is the father of one son and one daughter. 

 

 

What books are you reading now?

 

Long has it been a policy of mine, in my work, to read books that do not directly relate to a book or story or project that I currently am working on. I find that, once I read said stuff, and then go back to the book or story or project that I am working on, this gives me a fresh perspective. In other words, my six senses (the five senses, plus kinesiology--i.e., the study of movement) become more alert and super-sensitive. Also, I am one of those people that is usually reading three or four books at a time. To directly answer your question: John Gunther’s Inside Africa; Robert K. Massie’s great Nicholas and Alexandra; Bob Gibson’s From Ghetto To Glory, and James Mustich’s 1,000 Books To Read Before You Die: A Life-Changing List.

 

What is your favorite history book?

Impossible to answer; rather, let me answer this question by way of the following: technically, I am a reading prodigy. When I was six, I was reading at a level common for fourth graders. So, I was reading non-fiction and history books at an unusually early age. The first history book of merit that really made an impression on me was Barbara W. Tuchman’s The Guns of August. Her sometimes slog-like prose is that of an academic historian but then again she was an academic historian. For some readers, the pace of her prose is as slow as molasses in Massachusetts in March. I have had friends who told me that they have tried to read a book of Tuchman’s but simply could not finish it. I am not a bibliophile or historical snob; I understand their viewpoints. However, regarding The Guns of August, when she gets her narrative motors running, in my estimation no history writer can keep up with her pace. 

Why did you choose history as your career?

From my mid-teens to age 50--that is, for 35 years--I worked for international news wire services. It was great fun but as I grew older I noticed at the end of the work day I didn’t have as much pleasure doing it as I once did. Maybe it was simply the fact that I did the same job--albeit, for different news wire services--for more than one third of a century. I loved working for international news wire services and, under certain conditions, would go back to it. Even when I was doing this news-wire service work, I also wrote innumerable stories for history magazines and websites. This fact, coupled with the fact that I have had a lifelong interest in world history, led me to what I am doing now. I don’t feel that I ran away from international news wire service work as much as I ran towards writing history articles, books, book reviews, etc.

 

What qualities do you need to be a historian?

Perseverance and a strong set of eyes; be prepared to do more reading than can be imagined. Actually, to my way of thinking, reporters becoming historians is a natural progression, and people who work in either profession, besides sharing those two traits, also share the following: sometimes being intractable; possessing a mental and physical toughness; having large vocabularies; being friendly and gregarious, and finally, being open-minded and non-judgmental. All of this helped me write the two chapters that I contributed to D-Day, 75th Anniversary: A Millennial’s Guide.

 

 

Who was your favorite history teacher?

Professor Sidney Elkin, who was my instructor for the Political Science class when I was attending the Pennsylvania State University. Again, to my way of thinking, journalism, history, and political science--not always, but often--are interchangeable. 

 

What is your most memorable or rewarding teaching experience?

 

The most rewarding experience is, in retrospect, also the most memorable, and it came via Professor Elkin. In his class, students had to pair off and visit poor people in their homes to conduct surveys asking them all sorts of--what I thought then and still very much think now--personal questions about their lowly economic status. I forget what we did with all the information we collected from the surveys; probably collated it. These people, many of whom were African-American, lived in squalid destitution in Aliquippa, Pa. This was in the days when Western Pennsylvania was the home base for U.S. steel manufacturing; now it is not. Many of these people were embarrassed by their economic status; I was quick to note that not many of them made eye contact with me when I asked them the survey questions. What all of this taught me was, and is, that, in regard to history and journalism, statistics dealing with human beings are not just cold, sterile numbers. Each digit is a person with dreams, desires, emotions, and feelings.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I have owned, or do own, literally thousands of history books; many of them I’ve given away to charities, friends, or libraries. I have a first edition of Lowell Thomas’ With Lawrence in Arabia (“Lawrence of Arabia” is one of my all-time favorite films) and a first edition of Gunther’s Inside Europe. Regarding artifacts, a main hobby is collecting American presidential campaign buttons. Some I bought, but most I got when I covered Presidential election campaigns. Interestingly, both Thomas and Gunther were reporters before focusing more on history writing than on hard-news reporting. Maybe, subconsciously, this is where I got the idea of becoming an author-historian-reporter.

 

What have you found most rewarding and most frustrating about your career?

 

Rewarding: simply the fact that I was, and am, given an opportunity to work in what I was trained to do. Frustrating: I am not sure that the following is a point of frustration or simply a phenomenon that goes with being a writer of history articles and books (it does not apply to my journalism career). To wit: this can be a very, very lonely life. I read three to four times as much, in toto, as the average person, simply because that is required to do this job well. Consequently, I am by myself for very, very long stretches of time. To anyone considering going into this field, I cannot stress the following enough: make sure to set aside plenty of “down time” to share with family and friends. Also, be prepared to work holiday seasons as well. Deadlines for author-historian-reporters simply do not recognize such things.

 

How has the study of history changed in the course of your career?

 

Yet another difficult question. My immediate response is that, due to the average attention span of people decreasing (if we are to believe sociologists’ studies), I am noticing that the study of history--and particularly the writing about it--may be becoming more driven by interest in personalities than in events. I consider this a positive; in general, it puts things into human perspective. Most people would rather read about people than events, and I tend to agree with them.

 

What is your favorite history-related saying? Have you come up with your own?

 

Being a decades-long international and domestic political reporter, I have, and know of, dozens of stories, so I can easily provide one. This is listed on my Facebook profile and I will also repeat it here. One time during the 1964 American Presidential election, Senator Barry Goldwater, the Republican Party’s Presidential candidate, said something moronic in criticizing a social program of the incumbent, President Lyndon B. Johnson. It was in itself unusual for Goldwater to say a moronic thing, because I once interviewed him and, although bull-headed, he was not at all moronic. Anyway, the following day, a reporter repeated what Goldwater said--never mind, for our purposes here, what it was--to LBJ while the President was sitting with his feet propped up before him behind his work desk in the Oval Office. LBJ rolled his eyes, shook his head from side to side, and said in that unique Texan drawl of his, “Any jackass can kick down a barn but it takes a carpenter to build one.”

 

What are you doing next?

Besides continuing to write articles for history magazines and websites, I currently am at work on the multiple stages of self-editing my e-book of reportage about China. Afterwards, the e-book will be professionally edited. This will be the second in a series of books that I plan to write chronicling the cultures, histories, leaders, news, and politics of the world’s countries and regions. I am not yet sure what country will be the focus of Book Three in the series; maybe Brazil, maybe Canada, maybe some other country or region. These books will be somewhat similar to Gunther’s so-called “Inside” books. One thing that I have drifted away from, and did much of in the past, and which I very much miss doing, is writing profiles of the leaders and rulers of the world’s countries and regions. I have not yet been able to find the Internet, media, or press outlet for which these profiles would be a good fit. Attention, editors of said outlets: please be aware that, for commission, I am available for such work. In short, I have already accomplished much, but I have so much more to do.

https://historynewsnetwork.org/article/172117
What Should Historians Do About the Mueller Report?

 

 

The Mueller Report, officially entitled Report on the Investigation into Russian Interference in the 2016 Presidential Election, is an extraordinary document: systematic, detailed (472 pages), and thoroughly researched and documented (2,381 footnotes). News media pundits have summarized its high points, congressional Democrats say they need more information, and President Trump has asserted that it proves there was no collusion with the Russians and no obstruction of justice.

But the report itself deserves wider reading and more scrutiny and analysis than it has received. So far, for instance, no member of Congress seems to have announced that he or she has read the entire report. The media have moved on to the confrontation between the president and the House of Representatives over who can testify in House hearings.  The report itself has moved into the background too soon. Historians can help encourage people to read, reflect on, and perhaps take action on, one of the most important documents in American history. We need to be more emphatic in asserting history's role in bringing clarity to complex public issues. 

A few strategies:

Demonstrate how to draw concrete conclusions from the evidence in the report.  The Mueller report is in essence a detailed assemblage and presentation of evidence, somewhat comparable to a grand jury report. Historians can guide the public by analyzing and weighing the information in the report, analogous to what historians routinely do with evidence from the past, and reach conclusions. Historians help people understand cause-and-effect relationships, analyze and weigh evidence, and discern underlying patterns in complexity. They are adept at presentations that are fair, objective, and fact-based. Those skills can be put to work on the Mueller report both to enlighten the public and to demonstrate how people can use the report themselves. Presentations at public meetings, op-ed pieces for newspapers, and broadcasts via social media are all potential forums.  For instance, there is a great deal of information about Trump's Russian ties and why and how the Russians supported his campaign. There are dozens of pages with evidence about the president's determined attempts to sidetrack, slow down, discredit, or stop the investigation. Appendix C, 21 pages long, presents the president's written response to the Special Counsel's questions (Trump refused to be interviewed), with more than 30 uses of the phrase "does not recall" or other phrases that the report calls "incomplete or imprecise."  This evidence, appropriately analyzed, can lead to informed conclusions about Russian meddling, the Trump campaign's actions, and the role and responsibility of the president himself.

Add a historical dimension. Special Counsel Mueller's charge was to examine the 2016 election. He did not look back into history, i.e., previous attempts by foreign nations to meddle in U.S. presidential elections. Historians know that 2016 was not the first time.  In 1960, the Russians secretly approached Adlai Stevenson with an offer to support his campaign. (Stevenson rebuffed them and decided not to seek the Democratic nomination that year.) In 1888 the British ambassador indicated a preference for incumbent Grover Cleveland, running for re-election. (The move backfired, alienating voters and probably costing Cleveland the election.) If the Russians tried in 1960, and got caught in 2016, it seems like a good bet that they may have tried in some of the 13 elections in between.

Compare the report to previous high-stakes reports. Historians can provide an important perspective by comparing the Mueller report to previous reports on critical national issues. In particular, they can remind the public why the investigations were undertaken, what the reports said, their strengths and weaknesses, public reaction, and what (if any) action they engendered. Examples might include the 9-11 Commission Report (2004), which analyzed intelligence failures before the attacks; the six-volume reports (1976) of the Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities, which led to reform of federal intelligence agencies; the reports of the National Commission on the Causes and Prevention of Violence, particularly Violence in America: Historical and Comparative Perspectives (1969), which emphasized the lack of employment and educational opportunities in inner-city neighborhoods but did not lead to concrete actions, and the issues continue today; the President’s Commission on National Goals’ Goals for Americans (1960), which emphasized individual initiative, called for elimination of barriers to equal justice and opportunity, and engendered considerable public discussion but no solid policy initiatives; the Congressional Joint Committee on the Investigation of the Pearl Harbor Attack’s Report (1946), which exonerated President Franklin Roosevelt from blame for the surprise 1941 attack and led to policy changes, including the National Security Act and the creation of the Department of Defense in 1947; and the House of Representatives’ investigation and report (1792) on the defeat of General Arthur St. Clair by Indians in the Northwest Territory, which occasioned the first assertion of what we now call executive privilege, by President George Washington, in response to the House’s request for documents on the military campaign.

Use the report to call attention to the relevance of the past to the present. The Mueller report and the issues it probes are good examples of where historical insight would help. This points toward the need for more of what might be called "historical mindedness" in public debates of critical issues these days. "So long as the past and the present are outside one another, knowledge of the past is not of much use in the problems of the present,” wrote British philosopher and historian Robin G. Collingwood in his 1946 book The Idea of History. “But suppose the past lives on in the present; suppose, though encapsulated in it, and at first sight hidden beneath the present’s contradictory and more prominent features, it is still alive and active; then the historian may very well be related to the non-historian as the trained woodsman is to the ignorant traveler. ‘Nothing here but trees and grass,’ thinks the traveler, and marches on. ‘Look,’ says the woodsman, ‘there is a tiger in that grass.’” There is more than one "tiger" in the Mueller report.

Monitor preservation of the investigation's records.  Mueller's two-year investigation amassed a great volume of interviews, court filings, testimony, FBI investigative material, staff reports, and other records. These are official records of the Justice Department.  Historians (and others) need to call attention to the need for them to remain intact and at some point be transferred to the National Archives and Records Administration. The files should be opened to researchers in a timely fashion and with minimal restrictions. They can be expected to reveal a fuller picture of Russian interference, presidential obstruction, and the impact of both on the 2016 election. 

https://historynewsnetwork.org/article/172093
UPDATED Mueller Investigation: What Historians Are Saying  

 

 

 

https://historynewsnetwork.org/article/168317
Roundup Top 10!  

 

Adam Cohen: Clarence Thomas Knows Nothing of My Work

by Adam Cohen

The justice used my book to tie abortion to eugenics. But his rendition of the history is incorrect.

 

In our tumultuous times, history offers hope

by Katrina vanden Heuvel

In his book, my father ceaselessly reminds us that hard work and idealism can create change.

 

 

Billionaires can't fix college: Here's the real crisis in higher education

by Jim Sleeper

In this conversation between historian Matthew Frye Jacobson and Professor Jim Sleeper, they discuss how to reclaim college from market ideology.

 

 

Tony Horwitz’s Greatest Book, Confederates in the Attic, Seems Even More Crucial Today

by Rebecca Onion

Confederates in the Attic is a gift to teachers of American history. It’s wryly funny but sneakily profound.

 

 

Socialists Don’t Know History

by Joseph Epstein

Young people don’t remember the Soviet nightmare. But what’s Sanders’s excuse?

 

 

The Indian Law That Helps Build Walls

by Maggie Blackhawk

The Supreme Court’s legal abuse of Native Americans set the stage for America’s poor treatment of many of its vulnerable populations.

 

 

Open Forum: Are public schools ‘inclusive’? Not for those who oppose abortion

by Jonathan Zimmerman

Children need to learn how to discuss abortion — and other controversial political questions — in a fair and mutually respectful way. And that won’t happen if the adults in the room tell them the right answers, right off the bat.

 

 

Why we’re letting Americans vote, marry and drink far too young

by Holly N.S. White

Age is not a perfect qualifier of ability or maturity.

 

 

The “Forever Wars” Enshrined

by Andrew J. Bacevich

The memorial to American soldiers who were sent into the wars in the Middle East and died is essentially hidden away in a small Midwestern town, which tells you what you need to know about the value Americans actually place on those wars.


 

My Zionism is Personal and Complicated

by Ralph Seliger

Today’s an increasingly exasperating time for progressives who care about Israel’s future.

https://historynewsnetwork.org/article/172127
Good Intentions, But Still A Long Way To Go Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

 

 

 

One of the most momentous changes in my lifetime has been the broad social recognition that discrimination against women and people of color in American society is wrong. The idea that women and African Americans were deservedly inferior was a fundamental belief in Western society for so long that the seemingly sudden rejection of discrimination made the 1960s movements for equality seem revolutionary.

 

The revolution didn’t happen. Instead, gradual shifts in gender and racial relations have moved our society toward more equality in fits and starts over the past 50 years. Powerful resistance to change has slowed down the movement to equality at every point.

 

But lately a basic change in the arguments of the resistance demonstrates at least some ideological success. While the initial opposition to equality claimed that inequality was natural and God-given, those who oppose further change now often say that equality has been achieved, or even that the balance has shifted so far that white men are now at a disadvantage.

 

Daily life proves otherwise. The city of Boston is currently in an uproar over one of the innumerable daily incidents that show how persistent prejudice resists good intentions. The premier Boston art museum, the Museum of Fine Arts, long ago recognized that urban high culture tended to serve mainly the interests of white people. To counteract the legacy of racism, the MFA produces extensive programs to highlight the cultural contributions of black artists and to attract a diverse community. For 7 years, the MFA has celebrated Juneteenth, the oldest national commemoration of the end of slavery. The largest film festival in New England “celebrating film by, for, and about people of color”, the Roxbury International Film Festival, will also be held in June for the 21st year.

 

These laudable initiatives came from the Museum’s leadership. But below the top level, racial resentments have not been eradicated. When a group of black 7th-graders from the Helen Y. Davis Leadership Academy, a local middle school whose students are not white, visited the MFA last week as a reward for good behavior and good grades, they were greeted almost immediately with open expressions of racism. A museum staff member told the children how to behave: “no food, no drink, and no watermelon.” Security guards ostentatiously followed them around. Other museum patrons felt it was necessary to make racist remarks, including “Never mind, there’s fucking black kids in the way.” The MFA apologized, launched a wide investigation into this particular incident, and pledged to keep trying to improve its services to communities of color.

 

Only those who insulate themselves from the daily news would find this incident surprising. The broad social acceptance of the idea that discrimination is wrong has meant that the blatant daily transgressions against equal treatment have been splashed across the national media over and over again. That’s both useful and discouraging.

 

While continued instances of racism often make the news, the persistence of gender inequality is less visible, because it mostly occurs in private spaces. A remarkable recent book shows the stubborn tenacity of male resistance to equality, despite the profession of good intentions by men to relinquish a key privilege: letting women do most of the work of child care. The psychologist Darcy Lockman wrote “All the Rage: Mothers, Fathers, and the Myth of Equal Partnership” after she realized that her husband kept finding ways to avoid participating equally in child care, such as saying he needed to go to the gym after work. She found that equal parenting is mainly a myth.

 

While many men believe they carry equal weight at home, in fact women who work outside of the home still take on two-thirds of child care, a proportion that has not budged over the past 20 years. The time-use studies by the Bureau of Labor Statistics detail what men and women actually do every day. In families with a child where both parents work full-time, women spend 50% more time on “household activities”, 65% more time on “purchasing goods and services”, and 50% more time taking care of children. Men spend their extra time watching TV and in recreation.

 

I don’t get it. Watching TV or going to the gym is more interesting than caring for your child? Changing diapers is too difficult for men to master?

 

Lockman offered a set of interlocking explanations: men had generally been raised to think less about the needs of others; some people believe that women are by nature better suited to caring for children; men are more reluctant to let family responsibilities interfere with work; women are reluctant to demand that their partners take an equal role. But she ends the book with a more forceful insight: men resist giving up their privilege. Lockman cites the NY Times opinion column by philosophy professor George Yancy entitled “I am sexist” as an example of what most of the men she interviewed would not do: admit their privilege.

 

Elizabeth Cady Stanton wrote to a male cousin in 1855, “Did it ever enter into the mind of man that woman too had an inalienable right to life, liberty, and the pursuit of her individual happiness?”

 

We might broaden this plea to apply to both racism and sexism. Only once those who have enjoyed the privilege of belonging to dominant groups ask themselves whether other people also deserve the same rights will our society get beyond good intentions to equal results.

https://historynewsnetwork.org/blog/154216
What the Fugitive Slave Law Can Teach Us About Anti-Abortion Legislation

 

Neo-Confederate apologists have long claimed, with spurious reasoning, that the American Civil War was simply about “states’ rights,” and in one ironic sense they were arguably right. The war was precipitated by a violation of regional sovereignty – of northern states’ rights. Case in point: on the afternoon of June 2, 1854, Bostonians gathered along the harbor’s embankments to watch a ship bound southward. Onboard was a man named Anthony Burns, but for the captains of that ship and for those who owned it he was simply cargo to be returned to his “owners” in Virginia. A formerly enslaved man, now a Baptist minister, Burns had escaped from a Virginia plantation to the freedom which Massachusetts offered, only to find that with the 1850 passage of the Fugitive Slave Act the seemingly limitless and probing powers of the slave states had followed him to Beacon Hill’s environs, and that, whether willingly or unwillingly, the new law would implicate all of his white neighbors in the enforcement of that slavocracy’s laws.

 

According to the language of the Fugitive Slave Act, a New Yorker might dislike slavery, but if southern slave catchers did “summon and call to their aid the bystanders” witnessing a kidnapping, then the New Yorker was required to aid in that kidnapping. If a Vermonter or Rhode Islander found slavery distasteful, that was of no account if slave catchers were collecting a bounty in Montpelier or Providence, as all “citizens are hereby commanded to aid and assist in the prompt and efficient execution of this law, whenever their services may be required, as aforesaid.” As historian Jill Lepore writes in These Truths: A History of the United States, “Nothing so brutally exposed the fragility of freedom or the rapaciousness of slavery.”

 

History might not repeat exactly, but there are certain themes that can be detected throughout, and there’s something clarifying in identifying those correspondences. Slavery was, among other things, an issue of violating bodily autonomy; of the state enabling one section of the population to declare to another “You do not have sovereign right over your own body – I do.” If one of the consequences of the Fugitive Slave Act was that it enlisted the unwilling to aid the violator in that violation of another person’s bodily autonomy, then Georgia’s HB 481, passed by the state legislature and signed into law by Governor Brian Kemp last week, does something similar. Under that draconian law, and allowing few exceptions, a woman who elects to have an abortion can be charged with “murder,” not to mention the possibility of criminal charges against women who’ve suffered a miscarriage. The possibility for such punishment does not end at Georgia’s borders.

 

Historian Mark Peterson explains in The City-State of Boston: The Rise and Fall of an Atlantic Power, 1630-1865, that the Fugitive Slave Act “demanded the active cooperation of every US citizen, on pain of severe punishment.” Harriet Jacobs, a formerly enslaved woman living in New York, noted that the passage of the law was the “beginning of a reign of terror” for African-Americans living free in the north. While anyone who understands slavery knows that this is the nature of that evil institution, for the vast majority of northerners such an arrangement south of the Ohio River had been of little concern. With the passage of the 1850 legislation, such willful ignorance could no longer be countenanced, for those who wished no part in slavery were now implicated in its practice. Northerners could no longer pretend that slavery was simply a “regional issue”; the plantations which supposedly stopped at the Mason-Dixon Line now had a federally invested power that allowed them to arrest formerly enslaved people not just in Georgia or Alabama, but in Pennsylvania and Massachusetts as well.

 

In a similar manner, both patient and doctor, whether the abortion was performed in New York, Massachusetts, or California, could theoretically be charged with murder under Georgia law if the procedure involved a state resident. Mark Joseph Stern writes in Slate that if a “Georgia resident plans to travel elsewhere to obtain an abortion, she may be charged with conspiracy to commit murder… An individual who helps… or transports her to the clinic, may also be charged with conspiracy.” Antebellum northerners were content to pretend that bondage was simply a southern problem; in a north that may have found slavery unsavory, arrests such as that of Burns had a galvanizing effect. As the activist Amos Adams Lawrence would write, “We went to bed one night old-fashioned, conservative, compromise Union Whigs & waked up stark mad Abolitionists.”

 

Reactionary rhetoric excels at euphemism, obfuscation, and a cagey manipulation of language which is best described as its own form of “political correctness.” A nineteenth-century advocate for slavery could define a human being like Burns as “property”; today those who advocate forced birthing are the intellectual descendants of the pro-slavery faction, and they perform the inverse rhetorical maneuver: they define a zygote or fetus dependent on the life of the mother as being a “person.” Both rhetorical maneuvers depend on defining things as that which they are not, but the result is the same – to deny the bodily integrity, autonomy, and rights of a segment of the population. Certain obvious declarations must be made – a fetus is not a human being, an abortion is a medical procedure, and no woman should be compelled by the state to carry a pregnancy to term for any reason.

 

The so-called “Pro-Life” movement, it should also be said, is relatively modern, formed in reaction to Roe v. Wade and cleverly used as a political organizing tactic for the right. Theological arguments against abortion are thin and ahistorical – figures as varied as Thomas Aquinas and Maimonides acknowledged the occasional necessity of the procedure centuries ago – and scientific justifications against the procedure are non-existent. So strained and tortured has the conversation around abortion become that these truths are continually defended by those of us who are pro-choice, but as Rebecca Traister has written in The New Republic, “the conversation around abortion has become… terribly warped.” A consequence of this warped conversation is the promulgation of outright lies, such as Donald Trump’s insistence that infanticide is practiced – a complete and utter fabrication. Centrist Democrats have long ceded the rhetorical ground to the forced birthing movement, and the result has been the authoritarian legislation enacted this past week in not just Georgia, but Ohio, Alabama, and Missouri as well. In 1851, three years before Burns’ arrest, Frederick Douglass said, “I am sick and tired of arguing the slaveholders’ side”; in a similar vein Traister writes, “Let’s just say it: Women matter more than fetuses do.”

 

Because what the new legislation in Georgia, Missouri, Ohio, and especially Alabama threatens isn’t just a return to a pre-Roe world, it’s to establish horrifically authoritarian new laws which would serve to redefine a woman’s medical decisions as the exclusive purview of the state, and which furthermore establish women and their bodies as the effective property of the state. While some rank-and-file Republican voters are perhaps motivated by conviction that could be called “pro-life” (even while such sentiments rarely extend to empathy beyond the life of the fetus), let there be no doubt about what legislators hope to accomplish with this and related legislation. In Alabama, exception is made for embryos that are destroyed as part of IVF procedures, with a state senator explaining “The egg in the lab doesn’t apply. It’s not in a woman. She’s not pregnant.” Anti-abortion legislation is not “pro-life,” it’s about one thing – policing women’s decisions, imprisoning women (and their doctors), and disenfranchising women. Conservative pundit Andrew Sullivan, who is personally opposed to abortion but is politically pro-choice, astutely observes that this new legislation “is not about human life. It’s about controlling women’s bodies.”

 

As the Fugitive Slave Act was motivated by an inhuman racism, so are these new laws mandating forced pregnancy defined by hideous misogyny. They situate human beings and their bodily rights in relationship to the state rather than the individual, and they unequally define certain segments of society – whether African-Americans or women – as not fully deserving those rights which are actually the common treasury of humanity. Furthermore, the legislation of the slavocracy and the forced birthers alike implicates us in their noxious and anti-human designs. There are signs of hope, however – this weekend Senator Elizabeth Warren unveiled concrete legislative and policy proposals as part of her presidential campaign which would nationally protect abortion rights.

 

In the nineteenth century a majority of southerners, including both enslaved African-Americans and poor whites, had no say in the political process. Today, the voters of Georgia are similarly disenfranchised, as Kemp questionably became governor after he supposedly defeated the progressive Democratic candidate Stacey Abrams. Those of us in blue states have no cause to simply demonize some mythic “South”; that’s to play on the terms of the Neo-Confederates who govern in many of those states. During the Civil War the Confederates slurred a multitude of pro-Union southerners as “Scalawags.” Such women and men who supported abolition and reconstruction in the midst of the slavocracy were courageous, and when it comes to reproductive freedom their descendants are still operating in places like Georgia and Alabama. In addition to national organizations like the ACLU and Planned Parenthood, there are groups like Access Reproductive Care – Southeast, Alabama’s Yellowhammer Fund, and Ohio’s Women Have Options, among dozens of other groups. Like Amos Adams Lawrence, if liberals in blue states haven’t woken up already, it’s time we became “stark mad Abolitionists” in the case of reproductive rights, which are women’s rights, which are human rights.

https://historynewsnetwork.org/article/172058
Clarence Thomas is Wrong: It’s Restrictions on Abortion that Echo America’s Eugenics Past

U.S. eugenics poster advocating for the removal of genetic "defectives" c. 1926

 

In the last few months, five states have enacted harsh new laws that place severe restrictions on abortion and are intended to provoke a legal challenge to overturn Roe v. Wade. Georgia, Mississippi and Ohio would ban abortion as early as the sixth week of pregnancy, when physicians can detect a “heartbeat” but before many women know they are pregnant. Alabama would make performing or attempting to perform an abortion a felony unless the mother’s life is at risk—without any exceptions for rape and incest. This week, in a Supreme Court decision upholding part of Indiana’s abortion law, Justice Clarence Thomas wrote a 20-page concurring opinion that draws on history to assert “a State’s compelling interest in preventing abortion from becoming a tool of modern-day eugenics.” Thomas is wrong. It is not women’s right to choose abortion but the new state laws restricting abortion that are an echo of America’s eugenics past.

 

Between 1907 and 1937, thirty-two states passed  laws permitting state officials to remove the reproductive abilities of individuals designated “feebleminded” or mentally ill in order to improve the human population. By the 1970s, more than 63,000 Americans—60 percent of them women—were sterilized as a result of these laws. In the first decades of legal sterilization, eugenic sterilization was controversial and a number of state laws were struck down in court.  Then in 1927, the US Supreme Court upheld Virginia’s eugenic sterilization law in an 8-1 decision.  Using language astonishing for its cruelty, Justice Oliver Wendell Holmes, Jr. affirmed the state’s power to prevent those who were “manifestly unfit from continuing their kind,” adding that “three generations of imbeciles are enough.”  Buck v. Bell established the constitutionality of state-authorized sterilization for eugenic purposes, and it has never been overturned.  

 

Opponents of abortion rights like to emphasize the eugenic roots of birth control and abortion, especially the role of Margaret Sanger and Planned Parenthood. The “heartbeat” laws do not explicitly mention eugenics, but the Alabama law actually likens abortion to the murder of Jews in the Holocaust and other 20th century genocides. It also compares the defense of the “unborn child” to the anti-slavery and woman suffrage movements, the Nuremberg war crimes trial, and the American civil rights movement.  

 

These comparisons present a serious misunderstanding of American history and eugenics. State sterilization laws generally had bipartisan support and were more varied—and more rooted in local politics and welfare systems-- than abortion opponents suggest.  Despite some eugenicists’ mean-spirited rants about the menace of the unfit, many also described involuntary sterilization as, in the words of California Republican Paul Popenoe, a “protection, not a penalty.”  Sterilization would “protect” the unfit from the burden of childrearing and protect the unborn from being raised by incompetent parents. 

 

Eugenic ideas echo in the anti-abortion movement today.  Like sterilization crusaders in the past, abortion foes see private decisions about reproduction as an urgent matter of state concern. They justify their interventions using dubious science—crude theories of human heredity in the eugenics case, and, to use the Georgia law as an example, inaccurate claims that “modern medical science, not available decades ago” proves that fetuses are living persons who experience pain and should be recognized in law. Moreover, the logic behind abortion restrictions, like eugenic sterilization, is deeply evaluative: some lives have more value than others. Low-income women and women of color will be hardest hit by an abortion ban, and the fetus is accorded more value than a pregnant woman.  

 

State sterilization laws, too, targeted the most disadvantaged members of society. Young women who became pregnant outside of marriage; welfare recipients with several children; people with disabilities or mental illness—as well as immigrants, people of color, and poor rural whites—were at risk of being designated “feebleminded” or “insane” and sterilized under eugenics laws. Survivors of rape, incest, or other types of trauma (like institutionalization) were especially vulnerable. Carrie Buck, whose 1927 sterilization was authorized by the Supreme Court, became pregnant at age seventeen after being raped by a relative of her foster parents. They responded to her ordeal by petitioning a local court to commit her to the Virginia Colony for Epileptics and Feeble-minded. As an added indignity, they also took Carrie’s daughter and raised her as their own. Forty years later, North Carolina sterilized Elaine Riddick Jessie, a fourteen-year-old African American girl whom social workers considered promiscuous and feebleminded because she became pregnant as the result of rape by an older man. Buck’s and Jessie’s stories are just the tip of the iceberg, rare public accounts of shame and suffering that have mostly remained private. Indeed, about one-third of all those for whom the North Carolina Eugenics Board authorized eugenic sterilization were minors – some as young as 11. The vast majority were victims of rape or incest.

 

Memories of eugenic sterilization have focused almost exclusively on race, class, and the American eugenics movement’s connection to Nazi Germany.  Yet these are selective memories that sever forced sterilization from the issue of reproductive rights. In fact, coerced sterilization proliferated when sex education, birth control, and abortion were illegal or inaccessible to women and girls who were poor. The women most likely to become victims of state sterilization programs were those who lacked access to reproductive health care. Tragically, our research shows that some poor women with large families actually sought “eugenic” sterilization because they had no other way of ending unwanted pregnancies. 

 

The animating belief of eugenics—that the state should control the reproduction of poor people, immigrants, and women of color—is central to current abortion politics. It is hardly a coincidence that Alabama’s new abortion law permits no exception for rape and incest and imposes harsher penalties on the doctors who perform abortions than on rapists, but the state still refuses to expand access to Medicaid or take steps to bring down the state’s infant mortality rate, the second highest in the nation. The double violation at the core of Alabama’s abortion restrictions—discounting the pain of sexual violence and elevating the fetus over the pregnant woman—perpetuates the dehumanizing logic of eugenics.

 

Still, it is crucial to remember the significant differences between the current fight over abortion and America’s eugenics past.  State sterilization laws once had bipartisan support from Republicans and Democrats, and no social movement ever arose to contest them.  In contrast, Republican-led efforts to recriminalize abortion are rigidly partisan, and a movement to defend the reproductive and human rights of pregnant women is rising.

https://historynewsnetwork.org/article/172066
Edwardian England’s My Fair Lady is Fairly Wonderful

 

I don’t know where it was in the new Broadway production of the musical My Fair Lady that I decided it had to be the greatest musical of all time. Was it the early song Wouldn’t It Be Loverly, which introduced you to the lower-class world of spritely and lovable cockney flower girl Eliza Doolittle? Was it the rowdy With A Little Bit of Luck, in which you meet Eliza’s boisterous dad? The forever rhyming The Rain in Spain? On the Street Where You Live? I’ve Grown Accustomed to Her Face? Get Me to the Church on Time? The rousing, fabulous oh-my-God spectacular I Could Have Danced All Night?

Was it all of them?

I don’t know, but somewhere amid those unforgettable songs, sensational acting and magnificent sets at the revival of My Fair Lady at New York’s Lincoln Center, I made up my mind. I fell in love with the musical, now celebrating its first anniversary at Lincoln Center, all over again.

The musical, with book and lyrics by Alan Jay Lerner and music by Frederick Loewe, is set in London during Edwardian England (1900 – 1910) and based on George Bernard Shaw’s 1913 play Pygmalion. It is the story of street-wise Eliza, she of the irritating shrill voice and pronounced accent, and her efforts, under the tutelage of the esteemed, sophisticated phonetics professor Henry Higgins, to “become a lady.” Higgins is so sure he can turn raffish Eliza into a proper lady and debut his new creation at the elegant Embassy Ball that he makes wagers on it.

He succeeds. He brings Eliza into his house to live and study, keeps her up until all hours of the night repeating word and phrase pronunciations hundreds of times, buys her dresses and hires her carriages. He brings her to the upper crust races at Ascot, the Embassy Ball and other public places. In the end, he has made the frisky, unkempt flower girl a member of high society, with a gorgeous wide-brimmed Merry Widow hat, expensive gowns and exquisite taste.

At what cost, though? Can the “new” Eliza, the belle of the ball and consort of Queens, ever go back to the streets, flowers bunched up in her hands? Can she ever hang out with her Dad and his raucous drinking buddies again?

Will she marry Higgins, who by the middle of the play is obviously in love with her? Will she eventually run his household full of expensive furniture and hard-working maids? Will her new-found friendship with Colonel Pickering continue?

This revival of My Fair Lady is a stunner. The play has it all. First, the music. Was there ever a more memorable song than I Could Have Danced All Night? A better love song than I’ve Grown Accustomed to Her Face? A better scene stealer, play stealer, history of the theater stealer than the voluptuous, bouncy Get Me to the Church on Time, which sprawls all over the stage at Lincoln Center as the roars of the audience get louder and louder?

The roles of Eliza and Henry Higgins are two of the best written roles in theater history. Eliza is the old street girl who, introduced to the high life, embraces it, but with a desperate yearning for the old gang from the neighborhood. She works hard at becoming a lady and achieves her goal. Henry Higgins is an intellectual, a devoted bachelor and a man who has been successful – all alone – for his entire life. Can he survive the commotion caused in his life by the whirling dervish Eliza? Could anyone?

Laura Benanti is nothing short of dazzling as Eliza and is at her best as she belts out I Could Have Danced All Night. She grows in the role, and by the end of the play you never, never think she had been a blustery flower girl. She has a gorgeous voice and is a superb actress. She also plays well in scenes with others, particularly those with Higgins and with his mother. Harry Hadden-Paton is a complicated Higgins. He acts well and sings well, but his strength is the way he looks at Eliza and dotes on her, even if a bit gruffly. He shows her off and is proud of her, but at the same time just does not know what to do with her. Hadden-Paton has a moment early in the play when he looks at Eliza and keeps moving, with small steps, to see more of her. His uncertainty about her is the perfect sign of his love, his surprised love, for the flower girl. Hadden-Paton and Benanti are a delightful pair.

Other fine performances are from Christian Dante White as Freddie Eynsford Hill, who falls for Eliza, Allan Corduner as Colonel Pickering, Higgins’ assistant in the make-Eliza-a-lady college, and Alexander Gemignani as Eliza’s father, Alfred P. Doolittle, who prances, dances and, in high spirits, boldly struts across the stage, hat in hand. Rosemary Harris is a delight as Higgins’ mom, who chastises Eliza when the girl throws slippers at her son: “I would not have thrown slippers at him; I’d have thrown fire irons.”

Some critics have written that it is now Eliza’s play, that director Bartlett Sher has made Eliza the centerpiece at last. No. It is still the story of both Higgins and Eliza, an historical pair if there ever was one. What director Sher has done, brilliantly, is highlight all of the characters in the play, large and small, to create a richer and deeper portrait of London just before the First World War. He is aided by the remarkable choreography of Christopher Gattelli.

The play opened on Broadway in 1956 and was an immediate hit. It was turned into an Oscar-winning movie in 1964, starring Rex Harrison as Higgins and Audrey Hepburn as Eliza. It was then, and is now, a rich history lesson for patrons about Edwardian England and its high society, low society, customs and traditions.

The play, and the 1964 movie, produced a perfect re-creation of Ascot race course traditions and dress, right down to the wide-brimmed Merry Widow hats the ladies wore (yes, that’s the scene in which prim and proper Eliza, unable to take the slow pace of the horse she bet on any longer, shouts out at the horse in her best high society language, “Move your bloomin’ arse!”).

You learn of the customs and style of the Embassy Balls, with their personal introductions, stilted etiquette, chats with the Queens who turn up, men’s and women’s fashions, and music. You also get a terrific lesson in the look of the street, with the grimy flower girls and men, many out of work, who hung out around the theaters, bars and other public buildings. Edwardian clothing at its best is showcased in the play, right down to the fancy off-white suits and hats of the men, who cavort through the night, too, in their black tuxedoes with white silk scarves draped around their necks.

Everyone sees the enormous difference between the first-class folks at the Embassy Ball in 1910 and the scruffy street people in third-class London. The gap between them is three million miles, and no song could bridge it - then or now.

Those were the heady days just before the start of World War I in England and America, a war that changed everything.

There has to be much applause for the magnificent revolving stage set by set designer Michael Yeargen. He built a scrumptious, dark wood finish home for Higgins, complete with a spiral staircase and rooms everywhere. As the massive stage turns slowly, you see the tiny bedrooms of the maids float past you, along with gardens and trees with maids hanging on to them. It’s one of the best sets I have ever seen. 

If you can get yourself to Lincoln Center, and to Edwardian England, and need some flowers, go see Eliza, Henry Higgins, Colonel Pickering and even the Queen of Transylvania. They will have you dancing all night, and adoring it.

PRODUCTION: The play is produced by Lincoln Center. Sets: Michael Yeargen, Costumes: Catherine Zuber, Lighting: Donald Holder, Sound: Marc Salzberg, Music Direction: Ted Sperling, Choreography: Christopher Gattelli. The play is directed by Bartlett Sher. It has an open-ended run.

A Day to Remember: Memorial Day 2019

 

On May 30, 1963, I urged citizens to remember a day 42 years earlier, when locals dedicated a granite monument in Ashland, Oregon, as “a permanent memorial, reminding those that come after us of the glory of the republic, the bravery of its defenders and the holiness of its ideals.” This monument, dedicated in 1921, shortly after World War I, remembered those who had not been brought back alive on our troop ships. They had died in the trenches, of poison gas, or in tank warfare, maybe side by side with the British in the fields of bloody France.

 

When preparing to speak to more than a hundred locals, I read up on war and peace, suffering and victory, and the joy found in winning. Often I reflected on that emotional World War I against the Kaiser, with its sacrifices in the trenches and on sunken ships.

 

It had been a War to Make the World Safe for Democracy. Woodrow Wilson, with his Ph.D. from Johns Hopkins and an academic life lived at Princeton, had chosen Herbert Hoover to be “Food Czar,” with the mandate to unite the farmers of America behind the mission of making sure Europe (the part in Allied hands, at least) did not starve.

 

At home, Germans still fond of their old homeland and agitating Socialists found a minimal audience for their protests. In the Navy Department, the Assistant Secretary, hale and hearty Franklin D. Roosevelt, was charged with creating a minefield to keep Germany out of the North Sea. He worked in the capital with my engineer father, Vaughn Taylor Bornet of the Budd Company, to make a success of it.

 

A few decades earlier, in 1898, I pointed out, we had fought Spain to free Cuba “in the cause of humanity and to put an end to the barbarities, bloodshed, starvation, and horrible miseries” that colony was felt to be suffering.

 

In half a century it would be time to invoke the memory of Midway and Okinawa, of D-Day. We ensured the survival of Britain and France and occupied Japan! Plenty there to memorialize! It was indeed true that World Wars I and II had been victorious after the Yankees came to the rescue of democratic regimes fighting the Kaiser, the Nazis, and the Fascists…. 

 

On February 2, 1966,  I raised the question—as Vietnam was still being actively fought over—whether there was “an ethical question” in that war we were waging so seriously, yet so unsuccessfully.  I didn’t do very well, I thought in retrospect, so in 1976 I revised my remarks.  Looking back, I wrote this emotionally  trying paragraph:  

“We can now look back upon Vietnam, that tortured spot on the planet, and we look hopefully for signs that Good and Evil were clearly defined and readily identifiable to those who undertook the long crusade by force of arms.”  

A world of jungles surrounded us back then.

 

Today we look back full decades. We visit and walk pleasantly about in today’s Vietnam. We regret we didn’t “win.” We still deplore Communism—that is, after departing by plane or ship. And, especially, we regret all those deaths—on BOTH sides. As we take pleasure in the happiness now visible on the sidewalks, we know that while the economy thrives, freedom is in short supply. We also know full well that the war waged from Kennedy to Nixon, yes, should have been curtailed long before it was!

 

We do have a right to ask bluntly, “Did we have to wage it with such extraordinary vigor (just because we weren’t winning)?” Did we find Honor in not stopping? We sought, it must be said, a victory of the kind we had won earlier, in the 19th and 20th centuries. It was unaccountably being denied us in jungles way off somewhere. It was humiliating!

 

In my book on celebrating our patriotic holidays I pointed out that “The literature that attempts to evaluate the Vietnam War is thoughtful and immense.”  Competing with it here is out of the question—although I must admit to having been, as a working historian,  very much a part of it as I defended “patriotism” back when.  I  devoted maybe 200 pages to President Johnson’s turmoil when deciding what in Hell to do in Southeast Asia. 

 

He could see that the Communists were not going to prevail in the rest of Southeast Asia! Not in Indonesia, Thailand, Malaysia, Singapore, the Philippines, the Republic of China, or Sri Lanka. Whatever North Vietnam and China might want, South Vietnam was to be their limited prize. We had been content with what had “worked” in South Korea, but South Vietnam, it turned out, was a different ballgame.

 

The Vietnam disaster had an effect on the kind of patriotism that prevailed earlier; no doubt about it. This time, we had Lost! For a while, we just wouldn’t think about it too much or too often. Find something else to consider when reflecting on our body politic.

 

I will dare, as I conclude this troubled essay, to quote from my book’s page 149: “The anti-patriotic among us sometimes descend to portraying the United States in the role of an ‘empire’ engaged routinely in ‘imperialist’ invasions and dedicated to ‘conquest’ for only ‘economic gain.’”

 

For some among us, Patriotism sometimes seems just “old hat.”  Not for everybody.  One thinks back on what can easily be termed “Great Causes” supported by us in the Past. Some are still part of our active heritage. There is a free Europe.

 

Partly from what we did in our past emerged a new Commonwealth, an independent British Empire. Bad as it is sometimes, Africa could be worse. We have helped, overall—not always wisely, but aided by philanthropy centered in the U.S., by Gates, Rotary, and others, and by sometimes doing the right thing. Maybe we’re a little better than we sometimes think!

 

Yet our Nation’s prestige has suffered severely in the past two years. Leadership has lost us the affection of far too many countries that were once so close as to show their pride in us routinely. Beginning with that inexcusable withdrawal from the Paris accords on climate, we have from our top office displayed misunderstanding, even contempt, for other Lands.

 

This must stop; the end of “going it alone” cannot come too soon.  Surely this mostly verbal misbehavior is a temporary and transitory thing.  All in office in the Executive Branch need to bear in mind at all times that they are trustees for our evolving reputation.  We must, and we will, strive to do better, very soon.  Downhill is not the right direction for the United States of America!

 

This Memorial Day is a good time to think back, bring our minds up to date, and fly that beautiful flag while humming or singing one of our moving, patriotic songs.  For this quite aged American, it remains “God Bless America” all the way.

How About a Peace Race Instead of an Arms Race?

 

In late April, the highly-respected Stockholm International Peace Research Institute reported that, in 2018, world military expenditures rose to a record $1.82 trillion.  The biggest military spender by far was the United States, which increased its military budget by nearly 5 percent to $649 billion (36 percent of the global total). But most other nations also joined the race for bigger and better ways to destroy one another through war.

 

This situation represents a double tragedy.  First, in a world bristling with weapons of vast destructive power, it threatens the annihilation of the human race.  Second, as vast resources are poured into war and preparations for it, a host of other problems―poverty, environmental catastrophe, inadequate access to education and healthcare, and more―fail to be adequately addressed.

 

But these circumstances can be changed, as shown by past efforts to challenge runaway militarism.

 

During the late 1950s, the spiraling nuclear arms race, poverty in economically underdeveloped nations, and underfunded public services in the United States inspired considerable thought among socially-conscious Americans.  Seymour Melman, a professor of industrial engineering at Columbia University and a peace activist, responded by writing The Peace Race, a mass market paperback published in 1961.  The book argued that military spending was undermining the U.S. economy and other key aspects of American life, and that it should be replaced by a combination of economic aid abroad and increased public spending at home.

 

Melman’s popular book, and particularly its rhetoric about a “peace race,” quickly came to the attention of the new U.S. President, John F. Kennedy.  On September 25, 1961, dismayed by the Soviet Union’s recent revival of nuclear weapons testing, Kennedy used the occasion of his address to the United Nations to challenge the Russians “not to an arms race, but to a peace race.”  Warning that “mankind must put an end to war―or war will put an end to mankind,” he invited nations to “join in dismantling the national capacity to wage war.”

 

Kennedy’s “peace race” speech praised obliquely, but powerfully, what was the most ambitious plan for disarmament of the Cold War era:  the McCloy-Zorin Accords.  This historic US-USSR agreement, presented to the UN only five days before, outlined a detailed plan for “general and complete disarmament.” It provided for the abolition of national armed forces, the elimination of weapons stockpiles, and the discontinuance of military expenditures in a sequence of stages, each verified by an international disarmament organization before the next stage began.  During this process, disarmament progress would “be accompanied by measures to strengthen institutions for maintaining peace and the settlement of international disputes by peaceful means.”  In December 1961, the McCloy-Zorin Accords were adopted unanimously by the UN General Assembly.

 

Although the accelerating nuclear arms race―symbolized by Soviet and American nuclear testing―slowed the momentum toward disarmament provided by the McCloy-Zorin Accords and Kennedy’s “peace race” address, disarmament continued as a very live issue.  The National Committee for a Sane Nuclear Policy (SANE), America’s largest peace organization, publicly lauded Kennedy’s “peace race” speech and called for “the launching of a Peace Race” in which the two Cold War blocs joined “to end the arms race, contain their power within constructive bounds, and encourage peaceful social change.”

 

For its part, the U.S. Arms Control and Disarmament Agency, created by the Kennedy administration to address disarmament issues, drafted an official U.S. government proposal, Blueprint for the Peace Race, which Kennedy submitted to the United Nations on April 18, 1962.  Leading off with Kennedy’s challenge “not to an arms race, but to a peace race,” the proposal called for general and complete disarmament and proposed moving in verifiable steps toward that goal.

 

Nothing as sweeping as this followed, at least in part because much of the subsequent public attention and government energy went into curbing the nuclear arms race.  A central concern along these lines was nuclear weapons testing, an issue dealt with in 1963 by the Partial Test Ban Treaty, signed that August by the U.S., Soviet, and British governments.  In setting the stage for this treaty, Kennedy drew upon Norman Cousins, the co-chair of SANE, to serve as his intermediary with Soviet Premier Nikita Khrushchev.  Progress in containing the nuclear arms race continued with subsequent great power agreements, particularly the signing of the nuclear Nonproliferation Treaty of 1968.

 

As is often the case, modest reform measures undermine the drive for more thoroughgoing alternatives.  Certainly, this was true with respect to general and complete disarmament.  Peace activists, of course, continued to champion stronger measures.  Thus, Martin Luther King, Jr. used the occasion of his Nobel Peace Prize lecture in Oslo, on December 11, 1964, to declare:  “We must shift the arms race into a ‘peace race.’”  But, with important curbs on the nuclear arms race in place, much of the public and most government leaders turned to other issues.

 

Today, of course, we face not only an increasingly militarized world, but even a resumption of the nuclear arms race, as nuclear powers brazenly scrap nuclear arms control and disarmament treaties and threaten one another, as well as non-nuclear nations, with nuclear war.

 

Perhaps it’s time to revive the demand for more thoroughgoing global disarmament.  Why not wage a peace race instead of an arms race―one bringing an end to the immense dangers and vast waste of resources caused by massive preparations for war?  In the initial stage of this race, how about an immediate cut of 10 percent in every nation’s military budget, thus retaining the current military balance while freeing up $182 billion for the things that make life worth living?  As the past agreements of the U.S. and Soviet governments show us, it’s not at all hard to draw up a reasonable, acceptable plan providing for verification and enforcement.

 

All that’s lacking, it seems, is the will to act.

Leadership and Mimicry: What Plutarch knew about Elizabeth Holmes

 

Elizabeth Holmes, founder of the biotech company Theranos, is currently awaiting trial for cheating investors and deceiving her clients. She claimed that her company was building a device that would revolutionize healthcare by running dozens of lab tests on a single drop of blood. This device, called the Edison, was to become widely available in a nation-wide chain of drug stores, providing nearly every American with quick, affordable access to important information about their health. Holmes appeared to be doing the impossible, and nearly everyone believed in her, from seasoned Silicon Valley entrepreneurs to wealthy investors to former Secretaries of State. By the time she was thirty she had accomplished one of her childhood dreams: she had become a billionaire. But quick and easy blood testing, it turns out, really is impossible. While a legal decision about her behavior as CEO lies in the future, the verdict on her character appears to be in. Elizabeth Holmes is a fraud.

 

In the last year alone, Holmes has been the subject of a book (soon to be a movie), countless newspaper and magazine articles, an HBO documentary, and an ABC News podcast (soon to be a television series). This entrepreneur, once celebrated as a genius, is now more often called names like “disgraced fraudster,” and her career has repeatedly been cast in highly moral terms, with a rise-and-fall trajectory that seems already to have completed its arc. The way to explain the collapse of Theranos, it seems, is to study the deficiencies in Holmes’ character.

 

This approach to telling Holmes’ story calls to mind the Greek philosopher Heraclitus, who claimed that “character is destiny.” This ancient saying remains popular in our modern world. The New York Times editorial board used it just last year, for instance, to describe the downfall of Eliot Spitzer and to speculate about the future of Donald Trump. John McCain selected it as the title for his 2005 book, which contains stories of successful historical figures who demonstrated admirable character. Character alone, McCain argues in the introduction, determines the course of one’s life and career. And so, according to both the ancient philosopher and the modern statesman, there is no pre-ordained path that we are obliged to follow, nor should we look for external guidance as we navigate our careers. We deserve full credit for our successes, but we must also take full responsibility for our failures.

 

Long before the rise and fall of Elizabeth Holmes, however, philosophers and ethicists were contemplating the implications of Heraclitus’ dictum. Plutarch of Chaeronea, for one, knew this principle well and wrote at length about the fundamental importance of character, especially for people in positions of power. He composed treatises on leadership, but his most ambitious project was the Parallel Lives, a lengthy series of biographies of Greek and Roman leaders that demonstrate how their virtues and vices affected their political and military careers.

 

For Plutarch, good character was fundamental to becoming an authentic leader. In his essay To an Uneducated Leader, he laments that most people who aspire to positions of power fail to realize that they must prepare themselves ethically. “And so,” he writes, “they imitate the unskilled sculptors who believe that their colossal statues appear great and strong when they fashion their figures with a mighty stride, a straining body, and a gaping mouth.” By emphasizing appearance over character, such leaders fool everyone, including themselves, into thinking they are the real thing because they “speak with a low-pitched voice, cast a harsh gaze, affect a cantankerous manner, and hold themselves aloof in their daily lives.” In fact, such leaders are just like the statues, “which on the exterior possess a heroic and divine facade but inside are filled with earth and stone and lead.” Plutarch is imagining statues made of bronze, which were cast over a clay core that remained inside. The statues, at least, could rely on this internal weight to keep them upright, while uneducated leaders “are frequently tripped up and toppled over by their innate foolishness, because they establish their lofty power upon a pedestal that has not been leveled, and so it cannot stand upright.” That pedestal, in Plutarch’s view, is character, and so a leader who forgoes ethical development is destined to fail.

 

Plutarch believed he could show that character was destiny by examining historical examples. In his biography of the Athenian orator and politician Demosthenes, for example, he presents an inauthentic leader who is publicly exposed as hollow. Demosthenes modeled himself on Pericles, an Athenian leader of an earlier generation who in both ancient and modern times has been portrayed in ideal terms. Demosthenes was selective in following his model, however, imitating only his style of speaking, his public demeanor, and his habit of getting involved in only the most important matters, “as though Pericles had become great from these practices alone” (Dem. 9.2). Now Demosthenes did indeed become a great speaker, and he used his oratorical prowess to organize resistance to Philip of Macedon, whose military might posed an existential threat to the independent Greek cities. He talked his way into a leadership position, but when the united Greek armies met Philip’s forces in battle, Demosthenes could not live up to the image he had created. “To this point he had been a brave man,” Plutarch explains. “In the battle, however, he did nothing that was honorable or that corresponded with his words, but he abandoned the formation, running away most shamefully after casting off his arms” (Dem. 20.2). Throwing away one’s arms, especially the heavy shield, was the proverbial sign of cowardice in Greek warfare. Thus, in this single act, Plutarch found all the proof he needed of Demosthenes’ deficiency in character.

 

 

 

 

The modern story of Elizabeth Holmes is one that Plutarch would surely have recognized. ABC News in particular has focused on Holmes’ efforts to shape her public persona and so to conceal the clay inside. When the company was new, the young entrepreneur had no shortage of admirers. “Don’t worry about the future. We’re in good hands,” declares Bill Clinton in the podcast’s first episode. He is followed by an exuberant newscaster who compares Theranos to Amazon, Intel, Microsoft, and Apple, before gushing, “It could be that huge.” But Holmes was not who she pretended to be. In order to make her company more like Apple, she hired away Apple’s employees. And then she went a step further, donning a black turtleneck in deliberate imitation of Steve Jobs, “as though Jobs had become great by wearing the turtleneck alone,” Plutarch would have added. The black shirt, it turns out, was a metaphor for the black box that was supposed to be testing blood but never really had the right stuff inside. In ABC’s version of the story, neither Holmes nor the Edison was ever more than a shell.

 

In business and in politics, then, philosophers and reporters tell us that no one can hide deficiencies in character forever. “It is, of course, impossible for vices to go unnoticed when people hold positions of power,” Plutarch writes in To an Uneducated Leader (7). Then he adds this example: “When jars are empty you cannot distinguish between those that are intact and those that are damaged, but once you fill them, then the leaks appear.” So how do we avoid giving our money to an Elizabeth Holmes, or putting a Demosthenes in charge of our government, only to find out too late that they are not up to the challenge? The answer for jars is to fill them with water and check for leaks before we use them to store expensive wine or oil. Just so, Plutarch, and before him, Heraclitus, would surely have suggested that we ought not give millions of dollars to a first-time entrepreneur, or place an untested politician in high office. In those situations, their character may be their own, but their destiny is ours.

On Toad-Eating, Tyranny, and Trump

 

 

Exactly two hundred years ago, following two decades of war between Napoleonic France and Britain, and the restoration of the crowned kings of Europe, William Hazlitt wrote an insightful and plain-spoken essay, “The Times: On the Connexion between Toad-Eaters and Tyrants.” There he argued that English conservatives’ fierce defense of the absolute right of kings was founded on a base submissiveness toward power and hope for rewards. He gave sycophantic defenders of autocratic rule the traditional name for the charlatan’s assistant who swallowed live, supposedly poisonous toads in order to demonstrate the effectiveness of the con-man’s cure-all.

 

Hazlitt’s argument about the worship of absolutism can help us understand why current commentators may be waiting in vain for Republican national office-holders to exercise oversight or check an unfit and almost certainly felonious President. 

 

The 80% of Republican voters and 10% of Democrats who approve of the President will not change their minds if House Democrats or a prosecutor from the Southern District of New York presents further evidence that Trump has committed bank fraud, evaded income taxes, laundered money, tampered with witnesses, suborned perjury, and obstructed justice—not to mention defrauded the US  by coordinating with an unfriendly foreign power the release of stolen documents and the sharing of sensitive polling data during an election. These charges and more are now all but certain. Most people know, even if they decline to admit, that Trump is a player of confidence games whose word is not his bond. 

 

More to the point, his supporters love him for having gotten away with being such a dishonest character and operating in the shadows of illegality his whole career. In the American tradition that equates money with success, Trump is a success, and many Americans worship success, however achieved. These people cheer him because he is uninformed but unashamed, having no ethics but billions of dollars, at least according to his varying and less than reliable claims. 

 

It does not matter to them, as it does not matter to him, that hundreds of millions of that money were funneled his way in the last twenty years by Russian oligarchs and other shady foreign nationals who own the majority of the condos in buildings such as Trump Soho. It does not matter that every business venture he initiated has failed—as the transcripts of his taxes from 1985-1994 show—nor that he has declared bankruptcy to evade creditors so many times in his career, nor that he has underpaid thousands of contractors who resemble his supporters. They do not support Trump despite his venality, immaturity, and obvious intellectual incapacity, but because of his failings of character.    

 

They have been encouraged in their adulation by electronic and print media. The parallel function of the media in Hazlitt’s time explains why the title of his essay begins with the name of the most virulently conservative newspaper of the day. The Times served as the propagandistic arm of the governing Tories, slanting its reporting against any who attempted to reform the corrupt English election system. The paper used its power to mock and launch personal attacks portraying reformers and government critics as lunatics and terrorists. 

 

The parallels between The Times and Fox News could not be stronger. The second is waging, as the first once waged, a culture war by ridiculing and demonizing new ways of thought and policy proposals based on a desire for social justice. In the 2010s, as in the 1810s, cultural conflict serves as a vehicle and a surrogate for political conflict.

 

During the election, many Republican office-holders kept their distance from the candidate of their party because of his habitual mendacity, his vulgarity, and his general untrustworthiness. When he became President, however, Republican office-holders and the majority of conservative commentators revealed their remarkable capacity for toad-eating.

 

Consider the Senate, designed to be one of the principal checks on the power of a lawless Chief Executive. The majority now exhibits an automatic subservience to Mitch McConnell’s increasingly anti-democratic schemes for seizing and maintaining power for his party for the next generation. Every day they countenance the Executive’s untold violations of constitutional constraints and criminal law, any two or three of which would have led to their removal of a Democratic President from office. 

 

Senator McConnell will grant with a smile and a twinkle in his eye that Trump is indeed an “unusual” politician. So is McConnell an unusual Majority Leader—a former institutionalist who tears away any shred of independence and legitimacy the upper chamber ever possessed. It now resembles the Roman Senate under the Empire, whose sole business was to inquire what were the wishes of the Emperor—how high would he like the Senators to jump, and which of them would he like to see commit suicide today?   

 

As a Representative, Lindsey Graham prosecuted Bill Clinton in order to “purify and cleanse” the Presidency because of one inconsequential lie. As a Senator, he now adamantly defends the unending stream of untruths that flows from Donald Trump’s mouth and through his Twitter account every day. There is little point in multiplying examples. Is there any need to mention Bill Barr, Rudy Giuliani, Paul Ryan, or Kellyanne Conway? 99% of Republican legislators? The number of compromised followers is depressingly incalculable.

 

But how, specifically, does our current pathological political condition illustrate the connection Hazlitt draws between toad-eaters and tyrants? As Hazlitt observed, legislators and media defenders are motivated by both fear and self-interest—fear of retaliation by the ruler on the one hand, and, on the other, the chance of retaining or improving their position and increasing their wealth and influence during his reign.    

 

Hazlitt’s analysis also throws light on why despots are able to attract followers. Their appeal derives from the consolations of unity and adoration of the One. In Hazlitt’s day, that respect was still based on the idea of a “divinity that hedged a king,” what he viewed as the false and ridiculous belief that a king was different from all the rest of humankind. Hazlitt anticipates the characterization of the authoritarian personality—a weak character that needs to identify itself with a strong leader in order to feel secure—as elaborated by Theodor Adorno and Max Horkheimer after the defeat of the Nazis in 1945.

 

In our day, the attitude of irrational enthusiasm for the President is also based on the doctrine of the “unitary executive” advanced by Dick Cheney and others in the Reagan years. This doctrine rules out all dissension or differences of opinion within the executive branch concerning policy, judgment, or facts. The executive branch must speak with one voice and one will—those of the Chief Executive. As the determined xenophobe Stephen Miller asserts: “He will not be questioned!”

 

That is to say, any Republican Chief Executive will not be questioned. This doctrine clearly and quickly leads toward consolidation of power in the Chief Executive, as capricious as he may be. In the right circumstances—if his party is in charge of the legislature and the Supreme Court—it enables him to be a lawless ruler.

 

Representative Jerry Nadler and other Democrats have said that Trump is behaving like a king, but he is actually acting as something worse—a tyrant. A constitutional monarch in fact acts within the law, but a tyrant disregards the laws and the common good while pursuing self-interest and personal pique. Representative Jackie Speier has more accurately asserted that the President “has in many respects become a dictator.” As the self-proclaimed One who “alone can solve” the country’s problems, he has the unconditional support that a tyrant needs. A host of Fox & Friends explained on May 9 that because he once bought a $28 million yacht, “he’s different from you and me.” Perhaps so, but not in any way that qualifies him for a high office of public trust.   

 

The paradox that Hazlitt’s psychological analysis exposes is that, like their base, Republican legislators and commentators find it easier and more exciting to defend a manifestly absolutist Executive who cares for nothing except money and power than to support a more moderate and thoughtful one. Partisans of absolutism are able to retail the most absurd justifications or explanations, freed by the example of their leader from logical constraints or concern for consistency and right. The more extreme and offensive the policy, the more transparent the excuse for misbehavior, the more contradictory the reasoning, the more delusional the thinking, then the louder and more ferocious is the defense of the leader’s moral vacuity, spitefulness, and ineptitude. The supporters are defending the ruler’s power, not his character, policy proposals, or arguments. 

 

The early Church Father Tertullian said that he believed that his Lord was both God and man not despite the idea’s absurdity, but because of it. Republican office-holders, commentators, and the base support Trump not despite his being a reality television star who is playing President, but because he has no qualifications, no knowledge of history, no understanding of the Constitution, no care for anyone’s interest but his own. They thus demonstrate their devotion to the pure pursuit of party power, unmixed with principle. That is the mark of pure and true toad-eating: “A rogue’s obeyed in office.”

 

It is not a pretty sight—the accelerating dissolution of a constitutional democracy that worked fairly well for a few decades in the mid-twentieth century. But at least we who witness the tragic and farcical spectacle can call the actors in it by their proper names.    

Where Did the Indigenous Community Mothers Go?

 

I have spent the past two decades researching and writing biographies of nineteenth century indigenous women who married and lived in a cluster of cross-cultural couples in northwest Washington State. Some of their husbands were county officials, while others were military officers and agrarians. These women and their husbands composed 80 to 90 percent of Whatcom County’s married couples during its first twenty years of legal existence. And yet, when local historians wrote their county and city histories in the 20th century, they ignored these indigenous community mothers. The contributions and legacies of these wives and mothers were never explored. The same pattern exists in other places, leaving a conspicuous hole in Western history. 

This should not be surprising, given the iconic heroines mythologized by families and fans of the American westward movement: a courageous sad-eyed wife who left her home forever to trek the Oregon Trail across the continent and help her husband tame a wilderness. Or, the mail-order bride who braved the ocean’s dangers to marry a virtual stranger and help him build a community. Or, a spinster schoolmarm who brought literacy to pioneer children in a dusty town.  Or, the saloon girl with a heart of gold whose for-sale femininity kept a settlement from exploding in violence until real ladies arrived. Annie Oakley and Calamity Jane provided fierce examples of women equal to men in the face of the West’s challenges. Sacagawea remains the only native wife of a non-native man in the West that most Americans have ever heard about.

These archetypes appear again and again in national, regional, and local histories, as well as over two centuries of fictionalized versions of westward expansion. Until Elizabeth Jameson, Susan Armitage, and other women historians established the critical need to examine women’s contributions, most nineteenth century western women made appearances as accessories to male accomplishments or tragic sacrifices to Manifest Destiny. 

Almost never memorialized, even briefly, were the young indigenous women who lived near forts and in new settlements that displaced native communities, and whose husbands were army officers, Indian agents, merchants, local officials, and legislators. Historians did not consider that elite Native women’s families might have had their own agenda when they married their daughters to men they considered to be of equal status. Indigenous community mothers seem to have been an uncomfortable truth for historians and other writers that did not fit with the Euro-American mythology they sought to build around “the first white woman” in town. The result was their now-conspicuous absence.

Husbands in histories and literature have generally been portrayed as men on the fringe who contributed little or nothing to their town’s development. Historical accounts reference wives only briefly: “He married an Indian.” The husband often has been said to have “bought” his bride, revealing the writer’s ignorance of her family’s status or of the local wedding customs of Native Americans. For example, the Coast Salish people of western Washington State saw marriage as a family decision: the family considered how the groom would fit in and contribute to their extended family economy. Families chaperoned young elite women until marriage. His wedding gifts recognized her loss to her family, and a year or more of obligations from both sides were part of the arrangements. Whether or not the husband saw his tribal custom marriage as legitimate, the bride’s family did. These marriages took on the pattern of all marriages: long or short, happy or unhappy.  

This is not to imply that all young wives were from elite Native families. Some were not, and sometimes a slave was offered to a man who sought only a housekeeper and sexual partner and who was likely oblivious to class distinctions among native people.  

While ignoring the presence of native wives of “pioneer town fathers,” generations of historians showed no respect for the women’s contributions to their community’s development. As current historians of western women have shown, a contribution was not always a man establishing a mill, a mine, or a business. At Bellingham Bay, four waves of bachelors poured into the settlement in the 1850s, and those who stayed married the neighborhood women in the absence of eligible ones from their own society. These young indigenous wives taught the half-dozen white women how to make use of unfamiliar food sources, and cared for their children. Wives used their healing knowledge and delivered babies for isolated women. They took bachelors into their homes as boarders, providing meals, laundry, and a family atmosphere. They learned to bake the bread and apple pies that their husbands and boarders missed. 

Children of cross-cultural marriages in the large cluster from which my eight biographies were written found success or failure in whichever identity they chose, whether they lived in the white-dominant society or joined relatives on a reservation. Writers continue to portray them as misfits who found no place in either society, but this stereotype is false. Legacies to their new communities and even wider society are found in the children and generations that were to come. 

One example is the legacy left by the young Alaskan Haida wife of Captain [and soon-to-be Confederate general] George E. Pickett, commander of Fort Bellingham. Her personal name remains unknown, but soldiers and locals addressed her respectfully as “Mrs. Pickett.” She lived with him in the fort commander’s in-town home where they often hosted visiting territorial officials and army “brass.” She died in the months after their son was born. Her son, James Tilton Pickett, became a much-lauded marine and landscape artist whose works reside in museums. He was the first Washingtonian to attend a professional art school, and one of the founders of Portland, Oregon’s fine arts community. 

Jenny Wynn, an elite young Lummi woman, wed a Philadelphia Quaker blacksmith who arrived at the bay to work at the mill and new coal mine. After the couple moved to a farm, her skill at smoking superior hams turned into a profitable business. The support she and her husband gave to education for rural children resulted in at least four generations of teachers in their family, educators who see that as the family identity.

Historians across the nation might enhance the view of early western communities, as well as many cities in the Old Northwest and along the Mississippi River, by looking beyond the culturally biased accounts to include this long-overlooked group of founding mothers.  

The Chinese Threat to American Higher Education

Tsinghua University in Beijing

 

 

Amid the Trump administration’s destabilizing trade war with China, President Xi Jinping is determined to make his nation into an unrivaled economic and military superpower. China has taken the lead in the development of green energy, 5G networks, and artificial intelligence and robotics—and Xi’s “Belt and Road” initiative will soon link the economies of Asia and Europe by road, rail, and water. 

 

The rise of China also poses significant challenges to American higher education. Although the U.S. remains a destination of choice for international students and scholars, China’s share of global scientific papers has increased from 6 to 18 percent from 2003 to 2013; the number of Chinese universities nearly doubled from 2005 to 2015; and eight million students graduated from Chinese institutions of higher learning in 2017 alone (a tenfold increase in a generation). As a result, twenty-two Chinese universities now rank in the top 100.

 

As President Trump’s nationalistic rhetoric and restrictive immigration policies have resulted in a 10 percent drop in international student enrollments on American campuses, China’s “Double First Class” plan aims to make forty-two of its universities into world-class institutions by 2050. To that end, the Thousand Talents Plan brings leading scientists, academics, and entrepreneurs to teach and research in China, and it provides financial incentives for Chinese scientists living abroad to return home. The government also doles out tens of thousands of full scholarships to attract international students from more than 170 countries.

 

Alarmed by ominous signs that American higher education was losing its edge, I penned Palace of Ashes (Johns Hopkins University Press) in the summer of 2014 to illustrate how the forces of globalization were helping rapidly developing Asian nations—particularly China—to transform their major universities into serious contenders for the world’s students, faculty, and resources. I stand by the central claims of that work but could not anticipate the chilling effect the subsequent formulation of “Xi Thought” would have on higher learning.

 

To create a culture based on “market socialism with Chinese characteristics,” Xi Jinping asserted political control over higher education in 2016 in a widely publicized speech that put ideological and political work at the heart of university education to promote socialism. Xi also expressed a desire that Chinese colleges and universities be “guided by Marxism” to become “strongholds that adhere to Party leadership.” As he began a second term as president in 2018, an enhanced system of monitoring faculty members through classroom informants emerged to identify those who engaged in “improper speech.” Although censorship and classroom informants have remained a persistent part of higher learning for generations, Xi Thought challenges widely held assumptions in the West that economic development would result in more social and intellectual freedom.

 

To encourage adherence to the core values of Chinese socialism, Xi advanced the “Chinese dream,” a vision of economic progress that will culminate in a “great renaissance of the Chinese nation.” To export that vision and to promote the country’s image as a global leader, the government funded Confucius Institutes to encourage the study of Chinese language and culture on campuses around the world. Concerned that these institutes erode academic freedom and institutional autonomy by functioning as a propaganda arm of the Chinese government (since they often recruit and control academic staff, influence the curriculum, and restrict free debate), American universities have recently closed a spate of them. Perhaps in retaliation, President Xi recently initiated a campaign against Western values that encourages communist party members in academia to redline innocuous and apolitical expressions of American culture. Today, in place of the free flow of information, China’s "Great Firewall" blocks thousands of websites (including Facebook, Twitter, YouTube, and Google Scholar). Even science is not immune from censorship in Xi’s China—a situation that has raised alarms in Hong Kong, Britain, Australia, and the United States.

 

In my view, banning topics such as constitutional democracy, civil society, income inequality, freedom of the press, human rights, and historical critiques of the communist party from university classrooms, research seminars, and publications is regressive. President Xi’s ambition to make China “a global leader in terms of comprehensive national strength and international influence” by 2050 is jeopardized by the ideological control of innovation, China’s slowing economy, the trade war with the U.S., and the strict regulation of information. It is also hard to imagine that heightened authoritarianism, emphasis on party ideology and socialist morality, and censorship and surveillance will contribute to the establishment of world-class research universities. Ultimately, an emphasis on ideological agendas produces strong incentives for researchers to value the quantity of research over its quality. By contrast, less governmental intervention in higher education might generate results more in line with Chinese aspirations. Until such a moment arrives, institutions of higher education around the world should remain wary of China’s academic model, in which free inquiry plays little part.

Yes, you should have free speech on Facebook.

 

 

Facebook purged hundreds of political pages and accounts last fall when responding to criticism that it too easily allowed the spread of “fake news.” It’s removed posts about racism and about breastfeeding. It’s blocked content at Russia’s demand. Now it’s taking on the anti-vaxxers and political extremists. Users who post questionable items or material that otherwise violates Facebook’s rules can find themselves in “Facebook Jail.” 

 

As the largest player in a multi-billion-dollar industry, Facebook increasingly faces complaints that it has illegally censored content and has violated users’ constitutional rights to freedom of speech. Free speech protections only apply to governmental bodies and not businesses, people argue back. While that is currently the law, political realities demand that free speech protections evolve.

 

First some background.

 

Laws always evolve, even when they involve something seemingly permanent like the United States Constitution. As conceived, the Bill of Rights only protected citizens from the federal government. Beginning after the Civil War, with the ratification of the Fourteenth Amendment and a series of Supreme Court cases over the next half-century, these protections, including free speech, were extended to place limitations on the power of state and local governments.

 

Simultaneously, interpretations of “free speech” have varied greatly. A century ago, due to the Christian Great Awakening (beginning in the 18th century and continuing well into the 20th century) and the push for morality, censorship was an everyday part of life. Various images, topics, and words—now often deemed fully acceptable—were illegal in art, film, and literature. Using the U.S. Postal Service to mail “obscene” material—including birth control and love letters—was illegal during such efforts to reform society. There have also always been gaps between the ideals of free speech and the realities of day-to-day life, especially during the world wars and even more so when the person “speaking” is a minority because of class, gender, race, or sexuality and is challenging authority in the United States.

 

Moving closer to the present, neoliberalism has guided policy moves by nearly all politicians in the United States since the 1980s. It favors free markets, deregulation, low taxes for the richest, and privatization—a society guided by Social Darwinism. Neoliberalism further holds that government is bad, business good. Through these processes, power shifts to massive, ultra-wealthy corporations and their leaders—like Facebook. 

 

Facebook (and Tumblr, Twitter, WordPress, etc.) is now how everyday people receive news and freely express themselves through sharing memes, posting pictures, commenting on posts, or interacting with their friends and relatives. And this is a beautiful thing: judging by the billions of Facebook Messenger messages sent daily, the millions of blog posts and tweets published daily, and the hours people spend on social media, people are arguably reading and writing more than ever before. Elected officials, city governments across the nation, even the White House all run multiple social media accounts and encourage communication from the public. In the previous century, the postal service and Southwestern Bell delivered messages between people; today Facebook performs these tasks, and that role requires that freedom of speech protections expand in order to continue fostering a healthy democracy.

 

Thus, when blocking content or banning users, Facebook uses its ubiquity, power, wealth, and monopoly—made possible by neoliberal transfers of power from “real” governments to private businesses that function as unelected and unaccountable de facto governments—to silence people. Such censorship deserves concern. As a result of its authority, capital, everyday role, and size, Facebook is a kind of governmental institution. Private businesses are today’s governments.

 

Put differently, the United States and neoliberalism effectively enable such restrictions on “free speech” by allowing Facebook (with full rights of personage) to stand-in and to do the dirty work of suppressing content deemed immoral, unpopular, too radical, or incompatible with the status quo. 

 

But the internet should not give people free rein. Yelling “FIRE” in a theater just to cause panic is illegal. Restrictions are necessary. Using Facebook or other social media platforms to incite harm warrants appropriate silencing measures and should be accompanied by thoughtful questions about what exactly “harm” means and about enforcement mechanisms. The proliferation of propaganda-spreading bots remains an under-recognized threat and also needs addressing.

 

There are always trade-offs with freedom of speech, too. As much as most people despise queerphobic or xenophobic rhetoric and don’t wish to encounter it online, any restrictions on free speech open the door to restrictions on all speech. Luckily, content filters available through household routers, or simple “block” buttons, make it relatively easy for a person to screen out material they find offensive or harmful.

 

An additional note on neoliberalism and social media is necessary: globalization is another characteristic of neoliberalism. Currently, 2.38 billion individuals across the globe use Facebook, and less than 20 percent are in the United States. Yet all of these people are subject to Facebook’s rules, developed primarily by white men based on mores in the United States—prudish by European standards and too freewheeling for governments in China, Iran, and elsewhere, which block the platform. All users are also subject to Facebook’s whims as a new global power, as well as its security flaws. With a third of the globe Facebooking, what Facebook allows and doesn’t allow shapes everyone’s interactions in some way or another.

 

The internet facilitates opportunities for people to speak in 2019. In contrast to those who seek to end Net Neutrality, the United Nations has declared access to the internet and its rich libraries of information a basic human right. The American Civil Liberties Union continues to argue in and out of court that digital freedoms of speech, defined as broadly as possible, are vital. I cannot legally ban your ability to speak; neither should Mark Zuckerberg. 

The Expansion of Presidential Power Since 1973

 

Nearly a half century ago, famed historian and scholar Arthur Schlesinger, Jr. published The Imperial Presidency. This path-breaking work described the growing centralization of power in the executive branch of the American government since the 1930s. The Imperial Presidency was published at the height of the Senate Watergate hearings in 1973 and brought essential attention to the need to prevent further abuses in the office of the Presidency.

 

Congress reasserted its authority after Watergate: it passed the War Powers Act of 1973 over Nixon’s veto, tried to limit FBI and CIA activities through the Church Committee investigations of the mid-1970s, and passed the Ethics in Government Act to create Special Prosecutors to investigate accusations of illegal activities in the executive branch. Unfortunately, these actions didn’t have the impact many in Congress hoped for: the War Powers Act was ignored by future Presidents, who intervened regularly without giving notice under the law, and the Church Committee investigations had no substantial long-range impact.  

 

Presidents continued to expand their executive power. Republican President Ronald Reagan, despite his promotion of conservatism and the goal of making the federal government smaller, expanded the power of the presidency not through law but through precedent: because his substantial unilateral actions were not challenged, he set a precedent for future presidents. This was particularly evident in foreign policy, most notably the Iran Contra Affair. Congress had banned any involvement or intervention in the civil war raging in Nicaragua against the leftist Sandinista government. Reagan’s administration nonetheless arranged secret arms sales to Iran and used the funds from those sales to support the anti-government “Contras” in Nicaragua. Although some members of Congress called for impeachment proceedings, impeachment was avoided because Reagan was in his second and final term, and because his warm personality and great oratorical ability made him widely popular. Reagan also used his executive power to authorize a secret intervention in Afghanistan against the Soviet Union and to support Iraq and Saddam Hussein in their war against Iran, even as he simultaneously sold arms to Iran. 

 

Reagan was then succeeded by his Vice President, George H.W. Bush. Bush, with his experience as Director of the Central Intelligence Agency under President Gerald Ford, also intervened internationally without Congressional authority. Bush authorized  the invasion of Panama in 1989 and organized a coalition to force Iraqi dictator Saddam Hussein out of Kuwait during the brief Persian Gulf War of 1991. Bush did not seek a Congressional declaration of war, and instead simply gained authorization to use force. At the end of his first and only term in the Oval Office, Bush, with his Attorney General William Barr, pardoned the major figures who had been convicted or were still facing trial as part of  the Iran Contra scandal.  This prevented any further investigation of the possibility that Bush himself was involved in that scandal. Bush followed in Reagan’s footsteps: he continued to take unilateral action in foreign policy and acted to ensure that Reagan never was held responsible for his presidential actions, confirming that presidential powers had expanded. 

 

When Democrat Bill Clinton came to office, and once the Republican opposition gained control of both houses of Congress in the midterm elections of 1994, the Republicans were less supportive of unchecked presidential power. While they had been unconcerned about Presidential power under Reagan and Bush, they complained that Clinton abused executive orders on domestic issues, including the environment. Clinton was also heavily investigated and even impeached over his personal behavior with women, including Paula Jones and Monica Lewinsky.

 

When Republican George W. Bush (2001-2009) came to power after the contested Presidential Election of 2000, he brought into government two figures who were particularly keen to expand executive power: Vice President Dick Cheney and Secretary of Defense Donald Rumsfeld. After the attacks of September 11, 2001, Cheney and Rumsfeld used national security concerns to justify the use of surveillance and “enhanced interrogation techniques” (torture). The Patriot Act was passed with very few dissenting votes, and the Department of Homeland Security was created. 

 

The decision to go to war in Afghanistan and Iraq based on faulty information caused some to call for the impeachment of George W. Bush, and a bill was introduced by Congressmen Dennis Kucinich of Cleveland, Ohio, and Robert Wexler of Boca Raton, Florida in 2008. The charges lodged against Bush in the impeachment resolution included misleading the nation on the need for the invasion of Iraq; his conduct of the Iraq War; the treatment of detainees; the National Security Agency’s warrantless surveillance program; and failure to comply with Congressional subpoenas. Democratic Speaker Nancy Pelosi resisted any move toward impeachment, with Bush’s time in office nearing its end.

 

Once the Democrats lost control of the House of Representatives in the 2010 midterm elections, and control of the Senate in the 2014 midterms, Republicans worked mightily to block President Barack Obama’s agenda. They argued that Obama was abusing his power with his “excessive” use of executive orders on issues such as the creation of numerous commissions, boards, committees, and task forces, along with his actions on environmental protection, health care, the opening of relations with Cuba, and the Iran Deal to prevent nuclear development. Republicans further curtailed his agenda by refusing even to consider the nomination of Merrick Garland to replace Antonin Scalia on the Supreme Court after the latter’s death in early 2016, and by blocking other judicial confirmations to the lower courts. But Obama’s administration was scandal-free: no cabinet officers or other high-ranking figures were indicted or convicted for corruption, which had been endemic under Reagan and the second Bush in particular.

 

Now Republican President Donald Trump has made the controversies under earlier Republican Presidents Reagan and the two Bushes look minor by comparison. Some even consider his abuse of power more scandalous than that of Richard Nixon. Many are concerned over the involvement of Russia in the 2016 election, Trump’s violation of the Emoluments Clause, abuse of power, obstruction of justice, and the massive corruption and incompetence of so many cabinet officers and other high officials under Trump, all of which make him seem unfit for office. 

 

The crisis is greater than Watergate in many respects because Trump has now made it clear he will not cooperate with any Congressional committee demands for evidence or witnesses. He has asserted, perhaps jokingly, perhaps seriously, the right to an extra two years as president because he believes he was mistreated in his first two years by the Robert Mueller investigation. His Attorney General William Barr, the same man who assisted George H. W. Bush in his move toward blanket pardons at the end of his term in 1992, is refusing to give Congressional committees the unredacted Mueller report. And now Trump has declared he will not cooperate on any legislative action by Congress until the “witch hunt” he sees against him comes to an end, which is not about to happen.

 

With Trump using his executive powers to attempt to reverse the accomplishments of Obama and of many other Presidents of the past century, unchecked presidential power has never seemed more of a threat. Arthur Schlesinger Jr.’s book from 1973 now reads as just the prelude to a far greater constitutional crisis, one that may be permanently transforming the Presidency and destroying the separation of powers and checks and balances created by the Founding Fathers in 1787.

 

For Anti-Racist Educator, Teaching History Was a Calling

 

 

Tobin Miller Shearer is the Director of the African-American Studies Program at the University of Montana and an Associate Professor of History. He conducts research into the history of race and religion in the United States with an emphasis on prayer, the civil rights movement, and white identity. 

 

What books are you reading now?

 

I just finished a week-long reading marathon while my partner was out of town. Of the eight books I plowed through in a week’s time, my three favorites were Laila Haidarali’s Brown Beauty: Color, Sex, and Race from the Harlem Renaissance to World War II (NYU Press, 2018); Tera Hunter’s Bound in Wedlock: Slave and Free Black Marriage in the Nineteenth Century (Belknap, 2019); and Mark Whitaker’s Smoketown: The Untold Story of the Other Great Black Renaissance (Simon & Schuster, 2018).

 

What is your favorite history book?

 

Albert J. Raboteau’s A Fire in the Bones: Reflections on African-American Religious History (Beacon Press, 1995). His lyrical prose, historical insight, and personal passion are stunning.

 

Why did you choose history as your career?

 

This is a second career for me. After working for fifteen years in the non-profit sector as an anti-racism educator and organizer, I realized that the thing I loved most was the occasional guest lecture I got to give when consulting with colleges and universities. Every time that I explored historical questions with groups of students, I left wanting more. That continues to be the case today.

 

What qualities do you need to be a historian?

 

A love of minutiae, a passion for reading, the mind of a detective, the imagination of a storyteller, and a commitment to making the past relevant to the present.

 

Who was your favorite history teacher?

 

One of my graduate school instructors, Nancy MacLean, demanded more of me as a writer than any other history teacher I’ve ever had. I often think of how she taught us to write when I am working with students today.

 

What is your most memorable or rewarding teaching experience?

 

A special topics course on the history of the White Supremacy movement. Even though (or perhaps because?) we held the class in an undisclosed location with police protection due to the death threats I received for teaching the class, the students were amazing. I learned as much as they did.

 

What are your hopes for history as a discipline?

 

That we as a guild would continue to find ways to be relevant and engaging to students who seek out a liberal arts education and to a broader public interested in connecting the present with the past.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I have an 1857 copy of The History of Slavery and the Slave Trade, Ancient and Modern, The Forms of Slavery That Prevailed in Ancient Nations, Particularly in Greece and Rome, The African Slave Trade and the Political History of Slavery in the United States, Compiled from Authentic Materials by W. O. Blake that is every bit as ponderous and heavy as its title implies. The only thing that would fall into the historical artifact category is an original typeplate of hymn number 587, “O Send Thy Light,” from one of the early Mennonite hymnals.

 

What have you found most rewarding and most frustrating about your career? 

 

Most rewarding: the daily balance of being able to spend several hours on my research (on a good day) and several hours engaging with students in and outside of the classroom. Most frustrating: negotiating the bureaucratic hoops that come with operating inside an institution of higher learning.

 

How has the study of history changed in the course of your career?

 

The biggest sea change has been in the realm of technological advances. Word processing, databases, and document digitization have revolutionized the craft and discipline of research and writing. The impact of post-modernism has been more mixed: it has offered important challenges to the production of grand master narratives, while those very challenges have made our ability to engage in public-facing historical work all the more difficult.

 

What is your favorite history-related saying? Have you come up with your own?

 

I like Marcus Garvey’s take for its simplicity: “A people without the knowledge of their past history, origin and culture is like a tree without roots.” In my African-American history survey class I sometimes make the observation that the most valuable history is often the most difficult to find; it is what we don’t even know that we need to know that gets us into trouble.

 

What are you doing next?

 

I am about three-quarters of the way through a new book – currently entitled Devout Demonstrators (Routledge – forthcoming) – that explores the role of religious resources in historical social change movements. I am studying four domestic and four international protest movements to better understand what has happened in the past when prayer, vestments, fasting, pilgrimage, and song became part of the arsenal of activists’ tactics.

The Fall of Communism in TV’s The Weissensee Saga

 

Among the numerous TV offerings available for streaming is The Weissensee Saga, a first-rate 24-episode German production now available on MHZ. Like Thomas Mann’s German novel Buddenbrooks (1901), it has all the qualities required of a good multi-generational saga. The TV episodes (each about 49 minutes) contain love and jealousy, good acting, suspenseful plotting, and picturesque and interesting settings. But they also offer viewers, especially those interested in history and communism, excellent insights into the final decade of East German communist rule, including its collapse in 1989-1990, and the reunification of Germany (1990). Because Communist parties collapsed throughout Eastern Europe and the Soviet Union from 1989 to 1991, the saga depicted in East Germany provides valuable reflections on an even wider scale.

But first, many viewers who are not very knowledgeable about Eastern European history can benefit from knowing a few background facts. 

  • The East German and other communist governments in Eastern Europe came to power in the years following the end of World War II in Europe (mid-1945).
  • These governments owed their existence mainly to Soviet military might and support in the region, which came about as a result of USSR military victories in 1944-45.
  • The East German government was the last to assume power (in 1949) because prior to that year East Germany was ruled directly by the USSR. In 1945 the defeated Germany was divided into four zones ruled by the USA, Britain, France, and the USSR. Berlin, geographically within the Soviet zone, was also divided into four sectors. In 1948-49, the USSR blocked railway, road, and canal access to the three western Berlin sectors, but the western allies countered with the Berlin airlift to deliver supplies to their zones.
  • In 1949, the three Western countries combined their zones, setting up the Federal Republic of Germany, and the Soviets responded by establishing the German Democratic Republic (GDR) in East Germany.
  • Soviet troops squashed uprisings in East Germany in 1953, Hungary in 1956, and Czechoslovakia in 1968.  
  • In 1955, West Germany joined NATO, and the USSR responded by establishing the Warsaw Pact, a military alliance between the USSR and its communist East European allies. 
  • In 1961 the GDR, with Soviet approval, constructed the Berlin Wall, dividing East and West Berlin, partly to prevent more East Berliners from pouring into West Berlin. Only in December 1972 did the two Germanys sign a treaty diplomatically recognizing each other.
  • Mikhail Gorbachev became the head of the Soviet communist party, and thus de facto leader of the USSR, in 1985 and soon thereafter initiated economic and cultural reform policies at home, accompanied by measures to end the Cold War with Western democracies.  
  • In 1989-1990 the USSR, now under Mikhail Gorbachev’s leadership, was no longer willing to use Soviet troops to put down opposition to East European communist governments. 
  • The GDR, under Erich Honecker from 1971 to October 1989, resisted following Gorbachev’s example of initiating widespread domestic reforms. (In his Memoirs Gorbachev wrote that trying to convince Honecker to reform was like “speaking to a brick wall.”)

 

A leading expert on German-Russian relations, Angela Stent, has written that the GDR was “a state that never enjoyed popular legitimacy and whose most successful industry was spying, not only on West Germany, but on its own people.” This is a fitting introduction to The Weissensee Saga because the main family around which the series revolves, the Kupfers, contains two members of the Stasi, the German equivalent of the Soviet KGB: father Hans and oldest son Falk. Another prominent family, the Hausmanns, includes a mother (famous singer-songwriter Dunja) and a daughter (Julia) who are victimized in different ways by the infamous Stasi. (An interesting coincidence is that from 1985 to 1990, when most of the TV saga is set, KGB agent Vladimir Putin was stationed in Dresden, East Germany, but the series does not mention him.)

 

What happens to the singer Dunja (played by the East German actress Katrin Sass) is typical. Like major cultural figures in other communist countries, she is forced to cooperate with communist authorities if she wishes to enjoy certain benefits, like traveling and performing in the West (West Germany in her case). The Stasi, including Falk Kupfer, employ various methods like bugging her apartment to make her more accommodating. These security police also employed torture and other means of “persuasion.” Actress Sass had her own personal experiences with the Stasi, no doubt aiding her in presenting her convincing Dunja portrait. Another major actor—Uwe Kockisch, who plays Hans—was once imprisoned for a year for trying to escape from East Germany.  

The experiences of Sass and other actors in the saga bring to life the words of historian Tony Judt:

The Communist regimes did not merely force their rule upon a reluctant citizenry; they encouraged people to collude in their own repression, by collaborating with the security agencies and reporting the activities and opinions of their colleagues, neighbours, acquaintances, friends and relations. . . . The consequence was that while the whole society thus fell under suspicion—who might not have worked for the police or the regime at some moment, even if only inadvertently?—by the same token it became hard to distinguish venal and even mercenary collaboration from simple cowardice or even the desire to protect one’s family. The price of a refusal to report to the Stasi might be your children’s future. The grey veil of moral ambiguity thus fell across many of the private choices of helpless individuals. 

Living in a nice house on a picturesque lake, the Kupfer family enjoys the privileges typical of the communist elite—father Hans has a high position in the Stasi and son Falk is an ambitious and rising force within it. The other son, Martin (East-German-born Florian Lukas), begins the series as a member of the regular police force, much less prestigious than the Stasi, but eventually quits, having soured on his duties, which included suppressing any activities disapproved of by the communist authorities. 

As social turbulence increases in East Germany in the late 1980s, partly due to the Gorbachev effect on Eastern European communist countries, members of the Kupfer family react differently. In one scene Hans approvingly watches Gorbachev on TV. Falk, both a careerist determined to get ahead and a defender of hard-line communist ways, is much less sympathetic to the Soviet leader’s reforming ways. Martin is the least political of the three. The disorienting effect on youth of the collapse of communist power in East Germany, coupled with German reunification, is also well illustrated through two of the youngest generation, Falk’s son Roman and Martin’s daughter Lisa. 

The three older Kupfer men all have their marital difficulties. Hans once had an affair with Dunja Hausmann, and his wife, Marlene (Ruth Reinecke), remains suspicious of her.  Falk’s wife, Vera (Anna Loos), grows increasingly unhappy with him and eventually leaves him and becomes a dissident against the dying communist regime. From the beginning, Martin is divorced from his wife and begins a romance with Julia Hausmann. All three Kupfer women play important roles in the saga, as do two other women who will later become involved with Falk and Martin.

The new woman in Falk’s life in later episodes is physiotherapist Petra Zeiler (Jördis Triebel), who earlier in her life was interrogated by the Stasi and imprisoned. Martin becomes involved with Katja Wiese (Lisa Wagner), a West German journalist. 

In the saga’s last dozen episodes, set in the crucial years 1989-90, East German relations with West Germans become increasingly important—the Berlin Wall is demolished in November 1989. For example, as the communist government collapses and with it the Stasi, Falk goes to work for a West German insurance company wishing to expand into eastern Germany, and is also blackmailed into providing information to the CIA. Martin’s furniture company, adversely affected by currency changes connected with German reunification, turns to a West German financier for advice. Martin’s former police partner, Peter Görlitz (an often humorous Stephan Grossmann), starts selling used cars.

As in other parts of the collapsing communist world of 1989-1991, financial reforms and what to do with all the state-owned assets becomes a crucial question, and Falk’s ex-wife, Vera, works for an agency dealing with privatizing some of these assets. Prior to this, she had become active as a dissident and later political candidate in 1990 elections. (She had also become involved with a Lutheran pastor, who falls into the hands of the Stasi and her ex-husband Falk.)  Attempting to retain East German communist assets are Falk’s mother, Marlene, and Hans’s former Stasi boss, the evil Günther Gaucke (Hansjürgen Hürrig).

The most impressive educational aspect of The Weissensee Saga is that it presents a realistic and convincing portrait of the differing lives and reactions of East German individuals in the final decade of the GDR’s existence. Because the experiences of East German politicians, officials, and other citizens were similar to those of individuals in other eastern European communist countries, including the USSR, we also gain insights into what mattered to them: for example, financial changes; more freedoms, including fewer government restraints; and the personal transition from one economic-political-cultural system to another (like the East Germans, Russians such as Vladimir Putin had to make such a transition). Along with MHZ’s even longer series, the 72-episode World War II drama A French Village, The Weissensee Saga now springs to the top of my favorite TV fictional historical sagas. 

All the President’s Humility: What We Can Learn From Young George Washington

 

The Founding Father whom Americans revere as the incarnation of steady, selfless leadership – George Washington – was, in his early twenties, a remarkably self-centered young man.  This poses an interesting question:  Can today’s leaders – beginning at the top – make a similar transformation from self-centered to steady and selfless?  Or is it just too late?

 

After four years of immersing myself in George Washington’s life between the ages of twenty-one and twenty-six, what I find most surprising is not that he eventually grew to become a great leader.  Rather, it’s that he became a great leader despite where he started. As a young man this guy was a mess. 

 

Washington is certainly not the first young man to be selfish, egotistical, vain, thin-skinned, ungracious, whiney, petulant, and brazenly ambitious.  Most young men who feel underappreciated, however, don’t quit or threaten to quit their job at least seven times in the first few years.  Nor are most obsessed with their best friend’s wife.  Nor do most twenty-somethings inadvertently set off a global war.  What distinguishes George Washington’s youthful follies in the 1750s is that his relentless ambition happened to coincide with the many unsettled territorial claims to North America, creating a volatile mixture.  As it combusted, his youthful self-centeredness played out on a stage that quickly expanded from local, to regional, to international – with disastrous consequences.

 

From his mid-teens onward, Washington’s ambition shows.  It accelerates to a relentless upward clawing as he enters his twenties.  His father, Gus, had died when George was eleven, setting back George’s prospects for a secure future.  As a younger son in a fourth-generation family of middle-level Virginia tobacco planters, George, unlike his older half brothers, was not sent off to Britain to receive a polished boarding-school education, nor did he inherit enough land from Gus to support himself.  After his forceful and cantankerous mother, Mary Ball Washington, shot down his plan at age fourteen to go to sea, George, needing a way to make a living, polished up his father’s old surveying instruments, taught himself to operate them, and set himself up as a freelance surveyor.

 

By age eighteen, he’d earned enough money to start buying his own pieces of frontier land beyond the Blue Ridge Mountains.  By age twenty-one, still not rising fast enough in the Virginia aristocracy to sate his driving ambition, Washington took a part-time post in Virginia’s colonial military and volunteered for a dangerous winter mission.   He was to carry a message from Virginia’s British governor, Robert Dinwiddie, over the Appalachian Mountains and deep into the Ohio wilderness, delivering it to the commandant of a newly built French fort.  

 

The message said in essence, Stay out! All these lands belong to King George.        

 

This launched Washington on five years of harrowing adventures in the Ohio wilderness, its dangers further fueled by his heedless push to make a name and his almost utter lack of experience.  He came within an inch of dying on that first mission – pitched off a makeshift raft into an icy river then nearly freezing to death during the frigid night on a snowy island.  On his second mission into the wilds he rashly ambushed a French diplomatic party that was breakfasting in a wooded glen.  Not surprisingly, this triggered a massive retaliation by hundreds of French soldiers and Indian warriors, during which Washington’s outnumbered men perished in a pouring rain in blood-and-mud-filled trenches.  He had to surrender (although he refused to use that word) the claptrap fort he had thrown together, appropriately named Fort Necessity for the desperate circumstances he had created for himself and his troops.  This resulted in deep humiliation for the British Empire and its authorities in London, touching off tensions that exploded into the French and Indian War (and spread to Europe and around the globe as the Seven Years War).

 

Young Washington fervently wanted a British Royal Army officer’s commission – instead of his much less prestigious Virginia colonial commission – and he rode great distances to petition various aristocratic British generals to give him one. But he was a hayseed by their standards, an uneducated rube, and a military loser besides.  He was never granted a “king’s commission,” cementing his lasting resentment toward the British whom he felt treated him as second class.

 

He is mostly remembered today, of course, as the immortal embodiment of sound leadership. So how does one evolve from a festering mass of insecurities and perceived injustices to become a great leader? Not easily, and not all at once. It took Washington many years to metamorphose from a self-centered, impetuous young man burning with ambition to gain personal “honor” into a steady, selfless, seemingly unflappable leader.

 

Yet one catches glimpses from his early twenties that hint at what he might become – transformative moments that show a young man beginning to extend his emotional and intellectual reach beyond himself.  There is a moment when he literally gets down off his high horse – the living embodiment of a Virginia gentleman’s status – and walks the muddy trail beside his men, freeing the animal to haul armaments over a steep mountain pass.  He shows an almost desperate sense of helplessness when Virginia frontier settlers, whose safety has been entrusted to his care, plead with him to save them from roving bands of Indians who scalp their loved ones and burn their homesteads, offering to give his life to save theirs if only it would help.  One senses in these moments his growing empathy for the plight of others.  As a young aide-de-camp to British General Braddock, he barely survives the wilderness ambush by Indian warriors and French soldiers of a large column of the general’s Redcoats.  Washington’s narrow escape from death was marked by the multiple bullet holes through his coat and hat.

 

“[T]he miraculous care of Providence,” he wrote his younger brother after the battle, “…protected me beyond all human expectation….” 

 

Implicit in this remark is that Providence may have chosen him for some greater role. Perhaps his destiny is not simply all about George Washington.

 

One sees steps toward a more mature style of leadership.  After he makes a series of heedless blunders in his rush to prove himself in his first engagements, Washington learns to listen carefully to intelligent and trusted advisors and weigh their words judiciously before making a decision. 

 

Young George Washington was not immune from his own era’s culture of voluble denial and dexterous shifting of blame, the same that besets us today.  He initially denied his mistakes or obfuscated his moments of failure.  The surrender of Fort Necessity comes to mind, when the twenty-two-year-old colonel’s less-than-complete public recounting of the bloody debacle reads as if his forces and the French simply agreed to stop fighting and walk away, rather than the reality of a slaughter leading to Washington’s forces’ surrender and signed documents to that effect.

 

As he grew older, he learned to cultivate his image and project a sense of dignity. Many commentators have remarked that he seemed to see himself as an actor on a stage.  He rarely revealed his deepest emotions, at least in public. But as he matured into leadership he clearly learned to accept his failures, take responsibility for them, and acknowledge his own human frailty, if sometimes only to himself and his beloved Martha.  When, at age forty-three, he was asked by the Continental Congress in June of 1775 to command the newly formed Continental Army against British forces, Washington responded, “…I this day declare, with utmost sincerity, I do not think myself equal to the command I [am] honoured with.”

 

What leader would say that today?  Who has that kind of humility?  Maybe it’s simply too late for most of our current leadership.  Self-centeredness and driving ambition have always played a role among American politicians.  One wonders, however, if today those qualities are amplified in our leaders – even encouraged – by instant polling, social media accounts that precisely measure “popularity” by counting followers or hits, and a media environment that thrives on volatile, off-the-cuff political commentary.   Is it just too hard for our leaders to embrace humility in that churning vortex, to acknowledge their own weaknesses?

 

Or do the lessons of humility have to come from somewhere far deeper, a place where the penalties for arrogance land far more severely?  Amid all the noise among our leaders today – the posturing, the blaming, the denying – how powerful are the consequences of self-centeredness and ambition, how immediate, how graphic, how frightening?  

 

The young George Washington, by contrast, suffered horrendous consequences for his self-centeredness and driving ambition, such as seeing his dead comrades-in-arms sprawled in the bloody trenches of Fort Necessity.  While rain cascaded down and darkness fell, it was a sobering reality.

 

Much later in life, during his presidency and after, Washington worried about extreme partisanship literally tearing the young nation into pieces.  Maybe we as a nation nearly two-and-a-half centuries later have felt invulnerable in our unity – fearing no consequences for our sniping fractiousness – until suddenly we find that unity shattered into unfixable, razor-edged shards.  Washington did not take unity for granted, in the least.  As commander-in-chief of the Continental Army and then as president, he understood that the greatest task facing him was not to enhance his stature but to unify the troops, then the nation.  The battering he received in his early twenties in the Ohio wilderness helped him arrive at this realization.  He learned – in the hardest way – that it was not only about him.  It was about everyone.  He learned to settle his anger, open his ears, subsume his hefty ego to a greater good.

 

In Washington’s era, as today, much of the responsibility for leadership fell on the citizenry and their honesty with themselves.  They could see the man and judge him for what he was.  He learned to welcome that, instead of ducking from it.  Washington’s growing self-assurance allowed him to acknowledge his own weaknesses and imperfections and, as a result, maintain his dignity rather than assuming a reflexive position of bluster, muscle-flexing, and blame.  Those who looked to him for leadership recognized his humility as a sign of wisdom and strength.  They saw a leader who was sincerely trying to do his best for a struggling nation.  They rallied behind him.  When they did, his humility ultimately became a source of the citizenry’s wisdom and strength.

Roundup Top 10!  

 

We Don't Have to Imagine the Consequences of Abortion Bans. We Just Have to Look to the Past

by Leslie J. Reagan

Making abortion illegal never meant abortion didn’t happen. For the entire century of criminalized abortion, women of every class, marital status, religion and race still obtained them.

 

Why nuclear diplomacy needs more women

by Elena Souris

Historically, a homogeneous group of policymakers makes innovation less likely.

 

 

Rashida Tlaib’s critics have Palestinian history all wrong

by Maha Nassar

The decades-long process that led to the creation of Israel involved plenty of Palestinian suffering.

 

 

The Real Reason Iran’s Hardliners Don’t Want To Talk To America

by Shireen T. Hunter

Tehran’s reluctance to engage in direct talks with America at a normal state-to-state level within a bilateral framework long predates the Trump administration.

 

 

Calhoun statue should not stand in prominent public space

by Joseph A. Darby

The only good “compromise” is to take it down and involve those who cherish his memory in choosing a suitable venue for its more appropriate display.

 

 

We need to stop focusing on the mental health of mass shooters

by Deborah Doroshow

Mentally ill Americans are already stigmatized — and wrongly so.

 

 

Living in a Nation of Political Narcissists

by Tom Engelhardt

American election exceptionalism from 1945-2019.

 

 

How Democrats can win the abortion war: Talk about Roe's restrictions as well as rights

by Jonathan Zimmerman

Republicans are lying when they paint us as the party of death and infanticide. Fight back by championing both the right to abortion and limits on it.

 

 

On the Recent Executive Order on "Free Inquiry" in Higher Education

by James Grossman and Edward Liebow

President Donald Trump’s executive order of March 21 on “free inquiry, transparency, and accountability in colleges and universities” is a textbook example of a classic negotiating ploy—misdirection.


 

Reclaiming History From Howard Zinn

by Naomi Schaefer Riley

The left’s portrait of America’s past has triumphed thanks to the abdication of serious historians. Wilfred M. McClay offers an antidote.

Susan Sarandon Shines in Happy Talk

 

The musical South Pacific, the Richard Rodgers and Oscar Hammerstein show about life in the Pacific during World War II, debuted in 1949 and was a huge hit. Now, all these years later, Lorraine has been cast as Bloody Mary in South Pacific in a production of the show at her small town’s Jewish Community Center. She sees herself as a glowing celebrity in yet another starring vehicle in her local theater group. She is, as she constantly says, loved by all. The musical comes to represent her life, and the life she wishes she lived.

 

Lorraine’s mother is dying and receives 24-hour care from Ljuba, a Serbian caregiver with problems galore. Lorraine has been in a hollow marriage for years with her husband, Bill, who rarely speaks to her and is absorbed daily in a book about the Civil War. Lorraine bristles that Bill has turned into an old man, because all old men in America find that they must read a book about the Civil War before they pass on. 

 

The travails of Lorraine are the material of Jesse Eisenberg’s very funny and very moving new play, Happy Talk, with a tremendous performance by Susan Sarandon as its star. The play opened Thursday at the Griffin Theater at the Pershing Square Signature Theater Complex on W. 42nd Street, New York.

 

What do you get from a 1949 musical such as South Pacific? Everything, according to Lorraine. In the play, you continually hear the song Bali Hi, a song about hopes and dreams and a special place in a troubled world. That’s Lorraine’s world. She has built a self-centered, egomaniacal world for herself and refuses to recognize the odd and painful life in which she exists. She constantly goes back into the past, and to mythical Bali Hi, to try to re-discover herself, continually failing.

 

She never did have a good marriage and, of course, blames her husband, who can’t stand her. She and her mother never got along and for that she blames – mom. She never had friends and for that she blames all the people she says are her friends but will have nothing to do with her. As an example, after each rehearsal of the play all the actors go to a nearby bar to have a drink, but never ask Lorraine to join them.

 

Lorraine’s daughter, Jenny, whom she raised to be the same fantasy world chaser as her, arrives in mid-play and harangues her mother in long, hateful dialogues. Her chickens have not only come to roost, but to harass her.

 

She does have a wonderful relationship with Ljuba, the Serbian caregiver, who tells the audience a bit about the history of Serbia over the last few decades. Ljuba loves Lorraine, but for no apparent reason (you find out soon enough).

 

Ljuba has a time-honored American historical problem involving immigration. She has been in the U.S. illegally for years and must find someone to marry her so she can stay. She’s willing to pay the going black-market rate for arranged marriages, $30,000, and asks Lorraine to find her a hubby.

 

Lorraine recruits her actor pal Ronny, perhaps her lone friend. He agrees, but he is in it for the money and the money alone, even as he leads Ljuba to believe he likes her and that she could find happiness with him.

 

The road Ljuba and Ronny go down, urged on by the smiling and encouraging Lorraine, is a slippery slope, and it is the same road thousands of illegal immigrants have followed for a century in America. The marriage scam is an old one. The government permits many women, or men, from foreign lands to stay in America if they marry an American. Whole industries are involved in this. Men get off a plane and are married to a total stranger a few days later for a specified amount of money. Some women marry dozens of men, all at a fixed price. It is a marriage mill that has been churning out legal couples who barely know each other for generations. When it came up in the play, numerous members of the audience nodded knowingly because the scam is so familiar to all.

 

The play is very, very funny and playwright Eisenberg takes the audience along on a comedic roller coaster with ups and downs and spins around dangerous curves. Then, later, there is a dramatic change in the story. His script is brilliant when it is funny and deep and provoking when it is dramatic.

 

Eisenberg’s work is smartly directed by Scott Elliott, who gets full use out of the music in South Pacific, particularly the song Bali Hi, using it as a backdrop to tell the story.

 

All of the actors do fine work. Ronny, the bubbly gay actor eager to collect his money, is played by the delightful Nicci Santos. Grumpy Bill, so enchanted by Lee, Grant and Gettysburg, is played well by Daniel Oreskes. Marin Ireland gives an enchanting and memorable performance as Ljuba.

 

The centerpiece of the show is Lorraine, played by Ms. Sarandon. The well-known screen actress (Bull Durham, Thelma and Louise, etc.), the star of so many movies, is just as comedic, and powerful, here on stage as she has been in any film. Her character takes both slow and sharp turns as the play progresses and Ms. Sarandon masters all of them. She is lovable and embraceable when she is funny and menacing when she is angry. She is hateful, and yet very vulnerable. She turns Lorraine into a memorable character, a pathetic middle-aged woman you will never forget. 

 

PRODUCTION: The play is produced by the New Group. Set Design: Derek McLane, Costumes: Clint Ramos, Lighting: Jeff Croiter, Sound: Rob Milburn and Michael Bodeen. The play is directed by Scott Elliott. The show has an open-ended run.

      

The History of Black Women Championing Demands for Reparations

 

 

The American media has paid increasing attention to the legacies of slavery. The new National Museum of African American History and Culture features a huge exhibition on the history of slavery. Many US universities are studying their links with slavery and the slave trade. In several cases, schools decided to provide symbolic reparations by renaming buildings and/or creating memorials and monuments to honor enslaved men and women. 

 

But these measures do not seem to suffice: several activists and ordinary citizens are calling for financial reparations. Students at Georgetown University recently voted to pay a fee to finance a reparations fund to benefit the descendants of the 1838 sale of enslaved people owned by the Society of Jesus. The Democratic presidential candidates are routinely asked if they would support studies to provide financial reparations for slavery to African Americans. What is often missed is that these calls started long ago. Writers and readers also forget that black women championed demands for reparations for slavery.

 

Belinda Sutton was among the first black women to demand reparations for slavery in North America. Her owner, Isaac Royall Junior, fled North America in 1775, during the American Revolutionary War. He left behind his assets, but his will included provisions to pay Belinda a pension for three years.

 

After Royall Junior’s death, Sutton presumably received the pension determined in his will. When three years passed, the payments stopped. Belinda petitioned the Massachusetts legislature and requested that her pension continue. Emphasizing that she lived in poverty and had contributed to the wealth of the Royalls, Sutton successfully obtained an annual pension. Belinda’s story is memorialized at the Royall House and Slave Quarters in Medford, Massachusetts.

 

Like today, the political context shaped these early demands for reparations and the responses petitioners received. Sutton’s odds of obtaining restitution were greater than those of other former slaves because her former owner was a British Loyalist. Moreover, he had already determined in his will to pay her a pension. 

 

Freedwomen and their descendants continued fighting for reparations in later years. They knew better than anyone the value of material resources because they lacked them. They were the ones doing the hard work of maintaining their households and raising children and grandchildren.

 

Sojourner Truth also demanded reparations for slavery through land redistribution. Following the end of slavery, during Reconstruction, Truth argued that slaves helped to build the nation’s wealth and therefore should be compensated. In 1870, she circulated a petition requesting Congress to provide land to the “freed colored people in and about Washington” to allow them “to support themselves.” Yet, Truth’s efforts were not successful. US former slaves got no land or financial support after the end of slavery.

 

The brutal end of Reconstruction, which cut short the promises of equal access to education and voting rights for black Americans, created a context that favored the rise of calls for reparations. And once again black women took the lead.

 

Ex-slave Callie House fought for reparations. A widow and mother of five children who worked as a washerwoman, she saw many former slaves who were old, sick, and unable to work to maintain themselves. House became one of the leaders of the National Ex-Slave Mutual Relief, Bounty and Pension Association, which gathered tens of thousands of former slaves to press the US Congress to pass legislation awarding pensions to freedpeople. 

 

Soon the federal government started accusing the association of using the mail to run a fraud scheme. Callie House responded that the association’s goal was to obtain redress for a historical wrong. She reminded federal authorities that former slaves had been left with no resources and had the right to organize themselves to demand restitution. She bravely charged that government hostility against the pension movement was motivated by racism. 

 

In 1916, the Post Office Department charged Callie House with using the US mail to defraud. She spent one year in prison.

 

Black women had good reasons to fight for reparations. Until the 1920s, black women were deprived of voting rights. More than black men, they were socially and economically excluded. With less access to education, they were the ones running the households, even in old age. To most formerly enslaved women, expectations of social mobility were impracticable. In contrast, pensions and land were tangible resources that could supply them with autonomy and possible social mobility.

 

Audley Eloise Moore, from Louisiana, also became an important reparations activist. Influenced by Marcus Garvey, she became a prominent black nationalist, Pan-Africanist, and civil rights activist. 

 

In 1962, Moore saw the approaching one hundredth anniversary of the Emancipation Proclamation of 1863 as an occasion to discuss the legacies of slavery. To this end, she created the Reparations Committee for the Descendants of American Slaves (RCDAS), which filed a claim demanding reparations for slavery in a court of the state of California. She also authored a booklet underscoring the decades of unpaid work that slaves provided to slave owners. She emphasized the horrors of lynching, segregation, disfranchisement, rape, and police brutality. Yet the litigation was not successful.

 

Moore defended the payment of financial reparations to all African Americans and their descendants and argued that each individual and group should decide what to do with the funds. She contended that the unpaid work provided by enslaved Africans and their descendants led to the wealth accumulation that made the United States “the richest country in the world.”

 

 

In later years, Moore continued participating in organizations defending reparations for slavery. In 1968, she joined the Republic of New Africa and later supported the efforts of the National Coalition of Blacks for Reparations in America (N’COBRA). She made her last public appearance in her late nineties at the Million Man March held in Washington DC in October 1995, when she still called for reparations.

 

In 2002, Edward Fagan filed a class-action lawsuit in the name of Deadria Farmer-Paellmann and other persons in similar situations. An African American activist and lawyer, Farmer-Paellmann founded the Reparations Study Group. Fagan’s lawsuit requested a formal apology and financial reparations from three US companies that profited from slavery. Among these corporations was Aetna Insurance Company, which held an insurance policy in the name of Abel Hines, Farmer-Paellmann’s enslaved great-grandfather. Although the case was dismissed in 2004, the US Court of Appeals for the Seventh Circuit later allowed the plaintiffs to pursue consumer protection claims exposing the companies named in the lawsuit for misleading their customers about their role in slavery. 

 

Years marking commemorative dates associated with slavery favor the rise of demands for reparations. This year marks the four hundredth anniversary of the landing of the first enslaved Africans in Virginia. It is also the kickoff of the 2020 presidential campaign. 

 

For black groups and organizations that now fully engage in social media, it is time to renew calls for reparations that have been around for several decades. For potential presidential candidates, the debate on reparations is an opportunity to gain the black vote.

 

But for black women, no matter the commemorative and election calendars, the fight for reparations is not a new opportunity; it is rather a long-lasting battle for social justice.

Remembering Jackie Kennedy for More than Her Fashion Sense

 

It’s been 25 years since the death of Jacqueline Kennedy Onassis. When we remember the former first lady, specific images often come to mind: the fashionable young woman in the pillbox hat sitting atop her bouffant hairdo at her husband’s inauguration, the first lady in her beautiful gowns hosting world leaders and artists, the shocked wife in the pink suit covered in her husband’s blood, the grieving widow in the black veil holding the hands of her two young children, or the New York socialite and book editor in her Hermès scarves and signature large black glasses trying to hide her famous face. It’s like she’s frozen in time, preserved in still photographs that focus on her beauty, grace, strength, and perseverance. 

 

But there was so much more to Jackie than a pretty face and fashion sense. She was a reluctant yet supportive political spouse who helped her husband charm foreign dignitaries. She was a history and art aficionado who turned the White House into a living museum. She was a loving mother to her children and ultimately became a role model for American women.

 

But she didn’t stop there. Jackie also helped shape the first lady institution. She was the first presidential wife to have an official press secretary charged with handling the media’s insatiable appetite for news about her and her children. She was one of the first to focus her advocacy on a specific area – supporting the arts – which she did by inviting famous performers to the White House and through her oversight of the White House restoration. That legacy lives on at the Kennedy Center for the Performing Arts.

 

Her sophisticated fashion sense set a high bar for future first ladies. Some, like Michelle Obama and Melania Trump, have even emulated her style in their own fashion choices. She also provided a model for future first ladies on how to protect their children from the glaring media spotlight. First ladies including Rosalynn Carter, Hillary Clinton, Laura Bush, and Michelle Obama all followed Jackie’s example as they tried to give their children a somewhat “normal” life in the White House. 

 

And most notably, she was the architect who meticulously crafted President John F. Kennedy’s lasting legacy.

 

Just one week after her husband’s assassination, and only days after his funeral, she arranged an interview with Life magazine reporter Theodore White, with a goal of preserving her husband’s memory. During the interview, Jackie recalled that JFK often played the title song to the popular Broadway musical Camelot just before going to bed, noting that his favorite line was, “Don’t let it be forgot, that once there was a spot, for one brief, shining moment that was Camelot.” She went on to say, “There will be great presidents again…. But there will never be another Camelot again.”

 

Since then, the Camelot myth has been inextricably linked with the Kennedys. They are remembered as American royalty who led the country with youthful optimism and a noble purpose during uncertain times. For better or for worse, Jackie cemented JFK’s image as the apex of American liberalism when she fought White’s editors to keep the Camelot reference in his article. Although this interview was one of the only times that she spoke on record about the events surrounding her husband’s death, it forever shaped the way she and her husband are remembered.  

 

Jackie was a reluctant celebrity. Shortly after her husband’s assassination, she told a reporter that she was going to “crawl into the deepest retirement there is.” For the most part, she tried to stay out of the public eye and avoid the media. She even attempted to keep her 1968 marriage to Greek tycoon Aristotle Onassis private in spite of the media frenzy surrounding the couple. One of the few exceptions was utilizing her celebrity status in the mid-1970s to save New York’s Grand Central Station from demolition, leading the fundraising efforts to restore the historic landmark.

 

She remained the target of the tabloids and paparazzi throughout her life. And the public is still fascinated with her. She ranks as one of the most popular and well-remembered first ladies according to a 2018 YouGov poll. Thanks to movies like 2016’s Jackie and the countless books written about her, we know a bit more about the woman behind the image. But, even 25 years after her death, there is still so much we don’t know about this very private woman. And that’s probably the way she would like it.

From 37 to 79: Age and Presidential Campaigns

A portion of the Democrats running for president. 

 

Age is rarely an issue in presidential elections. Most candidates are neither too young nor too old. The average age of the last ten presidents upon taking office was 57.

The 2020 election, however, bristles with age issues: Five candidates will be in their 70s on Election Day, four will be in their 40s and three will be in their 30s.

Donald Trump, at 70, was the oldest candidate ever to win the presidency. If re-elected, he’d leave office at 78, the oldest president ever to serve––beating Ronald Reagan by nearly eight months.

But Trump, now 72, is one of those people who isn’t measured by age. He even calls himself a “young, vibrant man.” While that may be something of a fudge, polls do show that Trump is perceived as strong and bold, traits rarely associated with geezers.

Among other septuagenarians running are three Democrats and a Republican. When the new president is elected, Bernie Sanders will be 79, Joe Biden will be 77, Elizabeth Warren will be 71 and Trump’s GOP challenger, William Weld, will be 75.

The political trap for older candidates is not age, in a narrow sense, but more widely, the appearance of generational disconnect. Are they in touch with the modern world? Do they understand the needs of younger generations? Little wonder that 50-year-old Bill Clinton’s re-election slogan against 73-year-old Bob Dole was “A Bridge to the 21st Century.”

Seventy-two-year-old John McCain lost to 47-year-old Barack Obama in 2008 not so much because of his age, but because the country wanted change, and Obama’s youth perfectly embodied a “Hope and Change” message.

When candidates are young, on the other hand, the issue becomes experience and maturity of judgment. Have they seen enough of the world to master national leadership?

Theodore Roosevelt was the youngest U.S. president. At 42, he moved up from the vice presidency when President William McKinley was assassinated. John F. Kennedy was the youngest to be elected, at 43. In one of history’s touching parallels, he replaced the nation’s oldest president at that time, Dwight Eisenhower, who was 70 when he left office.

Kennedy’s entire career symbolized generational renewal, particularly apt in the years after World War II when young veterans were climbing increasingly steep career ladders. Kennedy won his first race for Congress at 29, and campaigned on the slogan “A New Generation Offers a Leader.” In his inaugural address, he emphasized that “the torch has been passed to a new generation of Americans.”

Besides JFK and TR, America has had five other presidents in their 40s. The first three––Ulysses Grant, James Garfield and Grover Cleveland––were elected within a 16-year period, 1868-1884. The two most recent––Bill Clinton and Barack Obama––also won within 16 years, 1992-2008.

On the Democratic roster this year, five candidates are in their 40s and three are in their 30s. Former U.S. Rep. Beto O’Rourke will be 48 by Election Day. U.S. Rep. Tim Ryan and former mayor and HUD secretary Julian Castro will be 46, entrepreneur Andrew Yang will be 45 and U.S. Rep. Seth Moulton will be 42. U.S. Reps. Tulsi Gabbard and Eric Swalwell will be 39. The youngest candidate, Mayor Pete Buttigieg, will be 38––although he’ll become 39 the day before the next president takes the oath.

To offer perspective: When Buttigieg was born, Biden had already served nine years in the U.S. Senate. When Sanders was born, Franklin Roosevelt was president.
America has never elected a president in his 30s, although William Jennings Bryan won the Democratic presidential nomination at the tender age of 36.

The world has seen old leaders full of wisdom––Winston Churchill was 80 when he retired as British Prime Minister––and young ones brimming with new ideas. Emmanuel Macron was elected President of France at 39.

Mark Twain once said, “Age is an issue of mind over matter. If you don’t mind, it doesn’t matter.” As this campaign plays out, we’ll see about that.

Civics 101: Instilling Constitutional Literacy in Tomorrow’s Strategic Leaders

National Defense University (NDU) Faculty walk out of Roosevelt Hall for the Graduation ceremony at

Fort Lesley J. McNair, Washington D.C. DoD Photo by U.S. Army Sgt. James K. McCann.

 

 

“You should try teaching political science in this town with a straight face.” That has been my longtime lament to anyone who engages me on the enduring turbulence, divisiveness, inertia, and dysfunction of politics and governance in Washington. Now, though, the situation has become so massively fraught that my standing lament assumes new saliency. When catastrophe, calamity, debacle, disaster, fiasco, and chaos are words that seem best to characterize the functioning of the federal government today, it makes my job especially daunting.

 

I’m a professor – at one of the U.S. military’s senior colleges. My students aren’t your average student nor even your average graduate student. They’re experienced government professionals – military officers at the rank of lieutenant colonel and colonel (or the Navy equivalent) and federal civil servants and Foreign Service Officers of comparable grade, each with 15-23 years of professional experience – who have been specially selected by their parent service or federal agency for a year-long graduate-level educational experience designed to groom them for future positions of executive authority and responsibility. 

 

The Constitutional Oath

As the price of their admission to public service, these individuals have all sworn an oath of allegiance to the Constitution, thereby assuming the obligation, willingly and without mental reservation, to support and defend it against all enemies, foreign and domestic. That means, in my estimation, that they have agreed, uncoerced, to embrace, protect, and remain loyal to the precepts, prerogatives, institutional arrangements, and rights embodied in the Constitution, its amendments and, arguably to be sure, the Constitution’s underlying philosophical foundation, the Declaration of Independence.

 

With regrettably few exceptions, though, most of these individuals haven’t given more than passing thought to the Constitution since they first took the oath. So, where there should be intimate familiarity and understanding, there is pronounced ignorance – civic illiteracy – that could signal danger ahead as these individuals advance to senior levels. On top of that, when the only role models they have at the highest levels of government discredit, sully, and even jeopardize the values the country claims to represent, civic consciousness, literacy, and competence assume overriding significance.

 

What, then, should the public expect such future senior leaders to learn? Let us note at the outset that these are public servants charged with serving the American public – professionals who, because of their specialized expertise and preparation, standards of conduct and performance, and presumed internal self-policing, are accorded a great deal of unquestioned discretionary license by the public they serve in return for competence, integrity, and accountability. For me, the message is clear: If the public is to be properly served, professional development at this level necessarily becomes an exercise in civic development.

 

As such, I would want these individuals, for starters, to address that most fundamental of questions: What is the very purpose of the government they inhabit and operate? Is it merely to preserve property (a la John Locke), to facilitate the happiness of the people (a la John Adams), to provide justice (a la James Madison), or to ensure peace and security (a la Thomas Hobbes)? Is it, in the wise words of Abraham Lincoln, “to do for a community of People, whatever they need to have done, but can not do, at all, or can not, so well do, for themselves – in their separate, and individual capacities”? Or is it, as America’s founders contended in the Declaration of Independence, to secure the natural rights (including, but not limited to, life, liberty, and the pursuit of happiness) all humans (not just citizens) possess and deserve to enjoy simply by virtue of being human?

 

I would want them to ponder the other parts of that seminal second paragraph of the Declaration of Independence, so that they are duly sensitized to the importance of government legitimacy being derived from the consent of the governed (popular sovereignty) and the associated right, indeed the duty, of the people (inside and outside government) to express dissent (possibly leading even to overthrow) in the face of abuse by those in power. And then there’s the part about all of us being created equal. Does that mean that even though we obviously aren’t equal in our attributes, talents, and abilities, we are equal in the sense that we have the same rights? Or, on the contrary, do we have only those rights granted to us by government?

 

I would want them to address the Constitution’s Preamble as not simply hortatory, aspirational literary frill, but as an imperative for action, America’s Security Credo, encapsulating as it does the full range of imperatives that define security for individuals and society beyond just providing for the common defense: national unity, justice, domestic tranquility, general well-being, and liberty.

 

I would want them to recognize the Constitution as the supreme law of the land, the ultimate statement of the rule of law (which we preach incessantly to others the world over) over the rule of men, an anchor to guide us especially in the face of populist demagoguery. “In questions of power,” Jefferson said, “let no more be heard of confidence in man, but bind him down from mischief by the chains of the Constitution.”

 

I would want them to consider the ordering of the Constitution’s articles: why the legislative branch, as the people’s representatives, is listed first; the executive, as the president of all the people, second; and the judiciary, the protectors of the law, third; this, even though these are coequal, coordinate branches of government that necessarily – and desirably – share many powers. Is this just syntactic necessity or a reflection of more meaningful underlying purpose?

 

Diagnosing Congress

I would want these future senior leaders to scrutinize Article I’s treatment of Congress, starting with the basics: Is our republican form of government – representative democracy – actually the one we should want, for reasons including but also transcending the “efficiency” necessitated by our size and population? Aren’t the separation of powers and checks and balances designed to be intentionally inefficient? Is such inefficiency compatible with the strategic imperatives of unity – unity of purpose, unity of effort, unity of action – called for in the international affairs of state? Is representative democracy actually consistent with popular sovereignty – popular rule – especially when those who represent us have chosen to be a full-time political class? Is the implicit premise of republican government that the best of us govern the rest of us (notwithstanding ample evidence to the contrary)? If so, are the prescribed qualifications for office – age, citizenship, and residency alone – all that should be required, leaving the voters to make their own judgments about such things as competence, intelligence, integrity, trustworthiness, and public-mindedness?

 

I would want them to confront key questions about what we should expect from our representatives in Congress: Should the primary responsibility of congressional representatives be to their constituents or to the country? Should they make their own reasoned judgments in office or be essentially a mouthpiece for their constituents? Should they check and balance or rubber stamp and enable the president and the executive branch? Should they be loyal to Congress and its constitutionally prescribed mission or to their political party?

 

I would want them to pay close attention to the specific wording of the Article I powers conferred upon Congress – “All legislative powers herein granted shall be vested in a Congress” – at the same time they note Article II’s more expansive and vague wording for the President – “The executive power shall be vested in a President” – as well as the 10th Amendment’s provision that “the powers not delegated to the United States . . . are reserved to the States respectively, or to the people.” By the same token, I would want them to note the countervailing implied congressional powers suggested by Article I’s so-called elastic clause: “To make all laws which shall be necessary and proper for carrying into execution the foregoing powers. . . .”

 

I would want them to recognize that Article I gives Congress – not the executive – the power to “provide for the common defence,” and that “no money shall be drawn from the Treasury, but in consequence of appropriations made by law.” Just as important is Congress’s role in exercising civilian control of the military (beyond that accorded the President in Article II as “commander in chief of the army and navy . . . and of the militia”): raising and supporting armies; providing and maintaining a navy; making rules for governing and regulating land and naval forces; providing for calling forth (mobilizing) the militia – and for organizing, arming, and disciplining the militia when thus mobilized. Most important, almost certainly, is the power accorded Congress to declare war – which we don’t do anymore because it’s too hard (perhaps too provocative); which Congress has the power to do but isn’t obligated (nor, increasingly, even expected) to do; and which we avoid by calling wars something other than wars, using “authorizations for the use of military force” instead, relying increasingly on publicly deniable covert military operations, and falling back on the 1973 War Powers Resolution, which, rather than reasserting proper congressional prerogative, provided an excuse for congressional inaction on the use of force until after the fact.

 

Diagnosing the Presidency

I would want these future senior leaders, belonging as they do to the executive branch, to make exacting judgments about Article II’s treatment of the president and the presidency, not least the precise wording of Section 1: “The executive power . . . shall be vested in a President.” What does that really mean? Is he an executor who is expected to carry out the direction of Congress, or is he the presider – the issuer of direction? Are the President, the presidency, and the executive branch a unitary body (in the manner of a “unitary executive,” endowed with not only expressed powers but also a wide range of inherent powers); or should we expect and want internal checks and balances (State vs. Defense, Army vs. Navy)? Was Alexander Hamilton right in his famous Federalist #70 call for “energy in the executive,” a metaphorical unitary force to overcome the inertia of the popular representative mass that is Congress? On what basis, then, should we judge a President (and, by association, determine how binding his direction should be): by his accomplishments (domestic and/or international), by his behavior (public and/or private), by his attributes (charisma, character, vision, courage)? 

 

Of most salient immediate concern to this audience is the President’s designation as commander in chief, this being at the very heart of the hallowed democratic precept of civilian control. This raises numerous questions, especially in conjunction with the presidential oath of office, which swears him to “preserve, protect and defend the Constitution” – to the best of his ability. Is this license for the President to order the military to do anything he wants; and is the military obligated in turn to dutifully obey any order that isn’t demonstrably unlawful? Considering that the Constitution details how laws are to be passed and treaties ratified via shared powers, how legitimate are recurring presidential actions to circumvent both – through executive orders, signing statements, and international executive agreements? What, therefore, do we and should we expect the relationship between the executive and Congress to be: confrontational? competitive? cooperative? collaborative? collusive? Recall Justice Robert Jackson’s well-known concurring opinion in the 1952 Youngstown Sheet & Tube Co. v. Sawyer case: “While the Constitution diffuses power the better to secure liberty, it also contemplates that practice will integrate the dispersed powers into a workable government. . . .”

 

Don’t Forget the Judiciary

Lest the judiciary be overlooked in a fit of casual neglect, I would want these future strategic leaders to be sensitive to the judiciary’s crucial role: a formally independent, non-political arm of government, whose mission is to interpret and apply the law – not to make or enforce it. Apolitical judicial independence in the service of “equal justice under law” is the normative ideal, though the selection of judges and justices is driven in very large measure by political and ideological considerations. There are no prescribed qualifications for these lifetime, non-elected appointees, though virtually all are lawyers whose inclinations for judicial activism or judicial restraint reflect inner ideological and political leanings. 

 

Two issues specifically mentioned in Article III – impeachment and treason – and two whose provenance lies outside the Constitution – judicial review and judicial deference – warrant particular attention. With regard to impeachment, a recognizably political rather than legal act addressed more directly in Article II, the most pressing question is what constitute “high crimes and misdemeanors.” With regard to treason, defined in Article III as a wartime act, the question, in light of the world of hybrid, asymmetric conflict we now face, is what constitutes war. Judicial review, codified in the 1803 Marbury v. Madison case, raises questions about the extent to which, and under what circumstances, the judiciary should have the final say on the legality of executive and legislative actions. And then there is judicial deference, the Court’s selective, not always consistent practice of declining to take up certain types of cases (e.g., defense, foreign affairs, war powers) it considers to be the proper purview of the “political branches.” 

 

And, Finally, the Amendments

Yes, finally, I would want these individuals to address the amendments to the Constitution head on, precisely because that is principally where the rights they have sworn to uphold are most clearly enumerated. Indeed, there is much to be discussed with regard to the meaning and scope of gun rights and gun control, unreasonable search and seizure, due process and equal protection, double jeopardy and self-incrimination, speedy and public trial by jury, citizenship, and the protection of rights not otherwise specified in the Constitution. Perhaps most salient and most potentially controversial, though, are the rights enumerated in the First Amendment: religion (church-state separation, persecution, religiosity in public office), speech (dissent, hate speech, incitement, slander), press (secrecy, propaganda and disinformation, censorship, libel, leaks and whistleblowing, public accountability, informed citizenry), peaceable assembly and redress of grievances (civil society, protest movements and events, public awareness, access to public facilities).

 

 

If this sounds like Civics 101, it is – for good reason. It would be a massive mistake to conclude that uniformed military officers, federal civil servants, and Foreign Service Officers – professionals all – who aspire to future responsibilities as senior leaders should be judged by standards no different from those of the past: basically, technical expertise and operational know-how. Now, though, they are en route to becoming tomorrow’s generals, admirals, and senior diplomats and federal executives. If they are to earn the continued trust and confidence of the public, they must fully expect to be judged anew by how much and how well they demonstrate understanding of and commitment to the higher-order ideals of the Constitution they have sworn to support and defend.

History Does Not Bode Well for President Trump’s Peace Plan

 

 

President Donald Trump’s Peace Plan aimed at solving the conflict between the Palestinian Arabs and Israel appears to be doomed to failure, based on historical precedents. 

 

History has taught us that every attempt by the United States to settle the Arab-Israeli conflict by advancing its own peace plan has failed.

 

From the Alpha Plan in the mid-1950s, through the Rogers Plan in 1969, to the Reagan Plan of 1982, to the Clinton Parameters in 2000 – none have succeeded in producing peace.

 

The Alpha Plan devised by the United States and Britain at the end of 1954 specifically called on Israel to make territorial concessions in the Negev, in southern Israel. In addition, Israel had to agree to a land corridor in the Negev so as to connect Jordan with Egypt. Last but not least, the Alpha Plan urged Israel to accept the inflow of Arab refugees into its sovereign territory. 

 

Israel stated that it could not accept the terms of the Alpha Plan. Egypt, for its part, refused to negotiate with Israel as it was unwilling to recognize it as a sovereign state. 

 

In 1969, US Secretary of State William Rogers advanced a peace plan which called on Israel to withdraw to the boundaries existing prior to the Six Day War of June 1967, with minor territorial modifications.

 

Although Israel made it clear it was ready to negotiate with its Arab neighbors and make peace with them, the terms of the Rogers Plan were unacceptable to it, as they required a withdrawal to the lines existing prior to the Six Day War, with only minor border changes.

 

The Arab countries, for their part, rejected the Rogers Plan as it entailed Arab official recognition of Israel. 

 

The Reagan Plan of 1982, which was proposed by the United States in the wake of the First Lebanon War, called on Israel to agree to the establishment of a Palestinian autonomous entity to be linked to Jordan. President Ronald Reagan had discussed the terms of the plan in advance with some Arab allies, but not with Israel. Israel had been informed of the plan only hours before it was made public.

 

Feeling betrayed by this treatment, Menachem Begin, Israel's prime minister, said to US ambassador Samuel Lewis that Israel was not a banana republic and would not consent to being treated as such. 

 

The Reagan Plan was also rejected by the leadership of the Palestinian Arabs, who thought it fell short of their minimum demands of an independent Palestinian state in the West Bank and Gaza, including East Jerusalem, and the Right of Return of the Arab refugees to Israel. 

 

The Clinton Parameters, drawn up by President Bill Clinton in the wake of the failed Camp David Summit in the year 2000, called for the establishment of a Palestinian state on most of the West Bank and Gaza, leaving under Israeli sovereignty the main blocs of existing Israeli settlements. This failed to lead to a peace agreement between Israel and the Palestinian Authority.

 

Although peace plans advanced by the United States have invariably failed, efforts at mediating have been more successful when no detailed proposals are laid out in advance. 

 

The United States successfully played the role of mediator in the aftermath of the Yom Kippur War in 1973. Then Secretary of State Henry Kissinger's Shuttle Diplomacy led to three interim agreements, two between Israel and Egypt and one between Israel and Syria.  This diplomatic feat was achieved by third party mediation, which was not preceded by a US public announcement of the precise conditions the sides concerned were supposed to accept.

 

The same applies to President Jimmy Carter, who in September of 1978, at the Camp David Summit, played the role of mediator between Egypt and Israel. The Camp David framework agreement for peace, which laid the basis for the Egypt-Israel peace agreement, was a corollary of that diplomatic effort. Again, Carter did not present a blueprint for peace or specific terms for an agreement, but helped bring it about by actively mediating between the Egyptians and Israelis.

 

It must be stressed: not every effort at mediation has been successful; but every successful effort by the United States to achieve an agreement between Israel and its Arab neighbors has been attained by mediation, without putting forward in advance either a peace plan or detailed terms for them to accept.

 

To be sure, the fate of the Trump Peace Plan might be different. We do not know yet what it contains. Also, history may be a general guide to the future, not necessarily a certain compass to it. However, if history is anything to go by, the chances that the Trump Peace Plan will succeed are slim.

Into the Teeth of the Dragon’s Jaw in Vietnam

 

What’s worse: a wall of antiaircraft artillery fire and surface-to-air missiles, a relentless number of enemy MiGs on your tail, or the reality that the war being waged is unwinnable? How about a target that just can’t be taken down for the duration of an entire long conflict? Many young US airmen during the Vietnam War dealt with these harsh conditions for seven years as they carried out efforts to destroy the heavily defended and strategically important bridge called the Thanh Hoa, or “dragon’s jaw” in Vietnamese. The bridge was located in the Thanh Hoa Province of North Vietnam and endured hundreds of attacks from the US Air Force and the US Navy before it finally gave way. The campaigns required intense perseverance, unguided and laser-guided weapons, and many sacrifices to eliminate it from the battlefield in 1972. Many American airmen were shot down, killed, or captured and taken to the infamous “Hanoi Hilton” POW camp.

 

The bridge became a symbol of unbeatable spirit for the North Vietnamese. US war planners fought hard and plotted for years to uproot it from the Song Ma River. Veterans of the Vietnam War who remember it shared their stories about dogfights, losses, desperate conditions, valor, and lessons learned in air combat. In an interview, best-selling author and Vietnam War veteran Stephen Coonts and military aviation historian Barrett Tillman spoke with us about their new book, Dragon’s Jaw: An Epic Story of Courage and Tenacity in Vietnam, which is available now.

 

 

First, can you both talk about the courage and tenacity it took to take down the Thanh Hoa bridge?

 

Barrett: If you have time, Erik, I would refer you to a book I co-authored more than 30 years ago. It was called On Yankee Station: The Naval Air War Over Vietnam, and it was about one of the three best friends I ever had, Commander John Nichols, a three-tour F-8 Crusader pilot, and we included a chapter in that on professionalism, and I know Steve will agree with this wholeheartedly. The motivation that kept that generation of American aircrews flying into literally the teeth of the Dragon throughout Southeast Asia was professionalism, and they had one another. Steve, do I remember correctly that the original title of Flight of the Intruder was For Each Other?

 

Stephen: That is correct. Barrett hit the nail right on the head. It should impress anyone who sits down with Dragon’s Jaw and reads about hundreds of young aviators, some of them reservists, but most of them regular Air Force or Navy. They kept going back again and again, not because it’s Lyndon Johnson’s war or anything else; it’s because they’re professionals, it’s just what they do, and they owe it to each other. It’s the old story: “If I don’t go, somebody else will have to, so I’m going.” I think that’s the essence of what military professionalism is all about.

 

Barrett: One of the most impressive people I’ve ever known was Vice Admiral Jim Stockdale, who got sidelined into politics after seven years in Hanoi as a prisoner of war and who is best known, unfortunately, as Ross Perot’s running mate. But Jim was a consummate professional, an aviator and a philosopher at the same time. At a Tailhook Association banquet in 1988, he relayed that in 1965 or so (which was the year he got shot down and captured), then-Secretary of Defense Robert McNamara came out to Yankee Station and was aboard the carrier Oriskany for a short time. He just flat out told the pilots and aircrew of Air Wing 16, “You are expected to take unlimited losses in pursuit of limited goals.” And Jim let that sink in for a moment -- just a hush in the room. Then he said, “What you must remember: there’s nothing limited about your efforts when you’re over the target,” and that speaks eloquently to the concept of professionalism.

 

Stephen: I certainly second that. We managed to put that vignette Barrett mentioned in the book and that was a powerful moment. 

 

If military brass on both sides of the Vietnam War were somehow still alive and got a chance to review this book thoroughly from a battle strategy standpoint, what do you think their reactions might be?

 

Stephen: The North Vietnamese did the very best they could with the assets they had. The American military certainly realized that. I don’t think the American politicians truly understood [the advantage] that absolute dictatorship gave the North Vietnamese. From a military standpoint, the North Vietnamese were darn tough soldiers and they did the best they could with what they had, as did the Americans. There was mutual respect on both sides.

 

Military history tends to always be relevant and timeless, especially while the American public is both drawn to and repelled by a controversial presidential administration which is running multiple theaters of war. But why the Dragon’s Jaw bridge, and why now?

 

Stephen: Barrett and I started talking about this book about five years ago this month. The Thanh Hoa bridge was the most notorious target in North Vietnam; it was almost indestructible, as if the thing were made out of kryptonite. The weapons during the early stages of the war were absolutely inadequate to knock it down, and American airmen went against it for seven long years. About a dozen planes were shot down; people were killed, imprisoned and so on. Millions of dollars worth of airplanes, tens of millions of dollars worth of fuel and ordnance and all that were expended against that bridge. The story had never been told, and we thought we ought to do this book while these people are still alive to talk about it.

 

If we had waited another 10 years, these guys who flew these missions in the 60s and early 70s either wouldn’t remember or would no longer be capable of talking about it. We thought we’d better get busy and do this before life or other projects get in the way. Finally, we said, “I don’t care, we’re going to do it.” Barrett agreed to do the research and I agreed to write the book, and that’s basically what came down. Fortunately, Barrett is the premier military aviation historian alive today in America, so boy, you talk about aces up, we had a guy who knew everybody, knew the American military, and made a career out of writing about military aviation, and he just dove right in. It gave us a wealth of material; I had to sit down to try to write the English side of it and put the pronouns in the right places. This is why we did it now: we thought it was a story worth telling and we wanted to get it out there while the people who lived it were there to tell it to us.

 

Barrett: That’s a big part of it, believe me, because going in, Steve and I realized this was a rare opportunity to focus on a primary topic of the entire Vietnam War. We approached the bridge almost as if it were a character among the human participants, and we decided to treat the campaign, which as Steve said was off and on for seven years, as a microcosm of that crazy Asian war. It’s all there: the tactics, the strategy, the politics, the courage, the losses. It all comes together over Thanh Hoa, which is about 70 or 80 miles south of Hanoi. It’s well into North Vietnam, and it’s the belly of the beast that became such a focus for so many years for hundreds of American aircrew.

 

What will hardcore historians find useful about this book -- from all walks of the discipline, from military history to even Southeast Asian studies and historical fiction?

 

Barrett: The major advantage for the readership you are addressing is the breadth of the material that we’ve assembled. Not only is this the first book about the Thanh Hoa bridge, but it’s also a top-to-bottom, left-to-right, in-and-out assessment from not just the American side -- we had about 70 contributors and they represent the US Air Force, the Navy, the Marine Corps, some civilian contractors, and also, we had some tremendous material out of Vietnam that as far as I can tell has never been accessed before. It took these five years since Steve first called me back in April of 2014 to learn the lay of the land. If we had tried to complete the book and publish it any sooner, we would have lost an awful lot of that benefit. For instance, I had contacted our embassy in Vietnam and their embassy over here, asking about sources and contacts, and never got a reply from either of them. It’s not that I really expected it, but in the meantime there’s a very well-connected assembly of Southeast Asia researchers and scholars in this country and elsewhere. One of our main contributors is a lieutenant colonel in the Hungarian Air Force, so that assemblage made all the difference, and if we had just tried to tell the story from the American viewpoint, honestly I think we would have less than half the story we’re telling.

 

Stephen: I would add that from a historian’s standpoint, one of the major themes of the book is the development of precision weapons, or guided weapons. It went from World War II-type dumb bombs (where you just point the airplane at the target and drop the bomb) to what are now precision-guided weapons. They were all born during that era, and from American frustration with the Thanh Hoa bridge and its seeming invincibility.

 

One of the problems with the Thanh Hoa bridge was that to deliver a weapon you had to get into the heart of the anti-aircraft envelope, exposing the plane and the pilot to death or capture, or whatever. The drive was not only for accurate weapons but for weapons that could be launched from outside the antiaircraft envelope defending the target. All these themes came together during the Gulf War in 1991 and later on. From a historical standpoint, in this book you see the driving force, the driving feature, that led the military and industry to develop smart stand-off weapons.

 

Did this book project help to open up any new doors of research that might allow you to write a future book about Vietnam in more detail than you have been able to in the past?

 

Barrett: That’s a very good question. I haven’t given any specific thought to another Vietnam book, but as Steve lightly notes, now is the time to do that. I’ll back up forty years to when the Naval Institute published my first book. It was the history of the Douglas dive bomber my father flew, and at that time it was basically thirty years after World War II. There were hundreds of thousands of living, breathing, remembering WWII veterans, but now we’re beyond that same place in regard to Vietnam. My Facebook tagline is “Do It Now,” and if I get the opportunity to write another Vietnam book, undoubtedly it would be aviation-oriented, and as Steve notes, I have had two tactical missions in A-6s. I’d love to write the definitive history of the Intruder, so that might be another possibility.

 

Stephen: It might be. [laughter]

 

Can you talk about some of the differences between the Johnson administration and the Nixon administration, and how each leader and his war planners set strategy and priorities and made decisions that affected America’s approach to the Vietnam War and to relations with China and the Soviet Union?

 

Stephen: Well, wars don’t get developed in a vacuum. It’s the geopolitical milieu at the time that causes these conflicts to spark and sustain themselves. The Vietnam War was really launched in the heart of the Cold War by the Kennedy administration, which was scared to death of having a nuclear confrontation with Russia and, to a lesser extent, with China.

 

President Kennedy was looking for a way to stand up to the spread of what they thought was world communism, and that whole era is sort of hard for a millennial today to understand. They talked about how many square miles of the Earth’s surface were going communist every year, as if this scourge was going to eat the whole Earth. People believed that. Politics is all about perception. The Johnson Administration inherited the Vietnam War, and nobody had ever accused Lyndon Johnson of being an intellectual. He was just a log-rolling politician, an arm twister, and he never asked the basic questions about Vietnam: Was it in the national interest? What are the upsides and downsides? Should we be there? Is it worth the treasure we’re committing?

 

Further, the problem was that Johnson never bothered to figure out an exit strategy. He kept feeding men and arms into Vietnam, expanding the war, thinking he could leave at any time. That was never the case; it was total fantasy. When he finally realized he wasn’t willing to apply the military pressure it would take to get a military victory, he was in too deep.

 

Richard Nixon got elected, and Nixon, on the other hand, had more backbone and realized, I think with Henry Kissinger’s help, that the solution to this war, like all wars, had to be political. Nixon went and tried to open up a relationship with China, and what he found out was that China wasn’t going to war over Vietnam under any circumstances. The United States got the license to talk about the Strategic Arms Limitation Treaty and a better detente, a better relationship. All these things took the threat of nuclear war away and allowed Nixon to get us the heck out of Vietnam in a way that the Johnson Administration had never been able to see how to do.

 

It was a human tragedy; 58,000 Americans lost their lives in Vietnam, over a million Vietnamese died, and the Communists won. It was the first war America actually lost, and maybe even the championship team needs to get its butt kicked occasionally, and we did. Maybe we learned something from that. We’ll see.

 

Barrett: Fairly early on in the book, a portion describes the reality in China, and in Russia to a lesser extent, versus the perception inside the Beltway in DC. Anybody who is reasonably well-informed today would look back and wonder how in the world the Johnson Administration -- sometimes regime -- looked at what was happening in China and thought that China was going to get involved on the ground in a major way like it had in Korea in 1950. Throughout the 60s and well into the 70s, Mao’s China was in turmoil. They had the so-called Great Leap Forward, which was a cultural topsy-turvy. They had massive starvation; we still don’t know how many million Chinese died of malnutrition. Additionally, Russia and China had both political and philosophical differences. I forget the name of the island, but there was combat with casualties on both sides between China and Russia. At this remove, you have to look back and wonder what Johnson and McNamara and Rusk were thinking, because they had to know what was actually happening between Russia and China, but they seemed to ignore it.

 

What was the most eye-opening aspect of writing this book? For example, was it just how stubborn the bridge was, that it wouldn’t fall, or was it something more subtle which revealed itself as the project came together?

 

Stephen: Well, it was all of the above. The political stupidities, the military difficulty of knocking down a grossly overbuilt steel and concrete bridge with the weapons available, in the teeth of fierce defenses. When it all came together, we thought it was a very powerful tale. We thought it was worth our time and effort, and we gave it the best we could.

 

You’ve both conducted original interviews with many combat veterans and have made reference to insights and testimonies that veterans and politicians gave in the past. Did you use a combined effort in reaching out to the community of veterans, or was the research stage also dependent on others’ assistance in consulting a network of witnesses who were there, who played a part in what happened at the Dragon’s Jaw? Secondly, what were those interviews like? Was it painful for the pilots to re-enact those life-or-death scenes?

 

Stephen: Obviously, Barrett is our expert. He talked to, I would say, 90 to 95% of the people who are quoted in the book. I talked to several, but I also put out appeals to the A-6 Intruder Association, and I think Barrett did the same with the Tailhook Association: everybody who had ever bombed the Thanh Hoa bridge, we want to hear from you. Drop us an email, write us a letter. And we got a great many responses from that. Barrett did most of the interviews and he’s an expert at that. He knows the technology, he knows the people, he knows what they’re talking about. He’s a historian; that’s his thing.

 

Barrett: Thank you, Steve. I’ll just add briefly that coming from a naval preference in my work going back to the 60s and 70s, I knew quite a few people -- people like Jim Stockdale and Wynn Foster and so many of the others who are quoted in the book -- but I was not so well-connected on the Air Force side. However, through the Red River Valley Fighter Pilots Association (they called themselves the River Rats) and a couple of other contacts, I started learning about some wonderful sources. They included the Air Force Phantom crew that didn’t destroy the bridge in the main 1972 mission but did dislodge the span, and that pilot had a cockpit recording. It’s interesting, he asked me out of the blue during a phone conversation, “I still have this recording, would you like to have it?” and I thought, “oh my Lord, this is the Big Rock Candy Mountain,” and it gives a sense of immediacy that just isn’t possible otherwise.

 

You'll see in one of the later chapters where the pilot and his backseater are exchanging comments because the mission was slowing, and fog and haze reduced visibility; one of my favorite lines in the book is, "Where are ya, bridge?" and "Oh! There it is, 11 o'clock right," so that type of immediacy would not have been possible if we hadn’t been able to talk to so many of the actual participants.

 

Stephen: Barrett listened to that particular cassette tape a million times and transcribed it. He got it all written down, but I’m sure the background noise, the calls, the countermeasures and the emotional voices of the crew must have put you right in the cockpit, Barrett, because you did a great job.

 

Barrett: Well, thank you!

 

You wrote that, “It is not our purpose in this book to write a history of the Vietnam War but to illuminate Americans’ efforts to destroy and, to the extent we can, North Vietnamese efforts to defend just one bridge, the Dragon’s Jaw at Thanh Hoa.” The book delivers fully on the first part of that purpose. American efforts to destroy the bridge are clearly and painstakingly defined and explored, down to the bullets, the cigars in the cockpits, the casualty statistics, and the flight hours.

 

How difficult was it to investigate the extent to which the North Vietnamese defended the bridge? What did this type of research entail? Were you both limited by language barriers or by barriers to trustworthy information?

 

Barrett: Originally, other than the already published sources and existing literature, I was fortunate years ago, before I ever thought of writing the book, in meeting a guy named Gary Wayne Foster. He’s a structural engineer who has worked all over the Far East, and he had a particular interest in the bridge because he knew a Navy Phantom crew that had been shot down. They were captured while trying to bomb the bridge. Gary’s interest in the bridge went beyond the historical aspect. He started looking at it from an engineering viewpoint, and he was so intrigued that he went up to Hanoi and tracked down the architect who is credited with designing the famous Dragon’s Jaw bridge in 1964 that resisted all of the American ordnance.

 

Through a couple of almost casual comments he made, I started looking elsewhere and filled in the blanks -- essentially built a matrix of the North Vietnamese air defense network. I identified the 238th People’s Anti-aircraft Artillery Regiment, which was defending the bridge for most of that time. Things expanded into the surface-to-air missile category, and I already knew a good deal about the MiG jet fighters that were involved in defending the bridge early on. It was essentially a building-block process that provided not only information but personal accounts, and as you see in the book, we have more than a few passages quoting either individual Vietnamese or official documents. To me, that was probably the most satisfying portion of the research, before I wrote a rough draft and Steve took that and ran with it. Having that kind of immediacy was more than I expected we might have going in, and so it was almost as if this project was just waiting for Steve and me to discover it, and once we started, it just blossomed.

 

Stephen: We both had a good time writing it. Writers write, that’s what they do, and Barrett’s a terrific historian and writer, and I’ve been doing novels for most of my career. Putting it all together meant writing about something so immediate and so powerful, something that meant so much to our generation. I flew in A-6s in Vietnam for the last two cruises of the Enterprise during the war. I never bombed the bridge, but I bombed everywhere else, and these are my guys, man. I know these guys, I lived with them, I went on liberty with them, and so it’s not only their story, it’s my story too, and it was fun to tell it.

 

Barrett: I’d like to add that Steve’s skills as a novelist shine throughout, because to me it’s so much more than a campaign history. It’s an immediate, pounding, ‘you were there’ treatment; you almost expect Jake Grafton to roll in on the bridge at any moment. I know that’s a big part of the strength and the appeal of Dragon’s Jaw.

 

How was the experience of joining forces to write about the Vietnam War -- a conflict in which Stephen received a Distinguished Flying Cross, in which Barrett is an expert, and which carries strong political and foreign policy currents? Was it okay that perhaps your experiences and political beliefs didn’t align exactly? For example, one author is a Nixon and Kissinger supporter, while the other has a few reservations?

 

Stephen: I don’t think at this point that our political views are very far apart on this war. The more you study the Vietnam War, the more you realize the tragedy from any angle: how many families lost sons and husbands and fathers and so on. It probably was a war that should never have been fought. The stupidities of the politicians -- I think Barrett and I are joined at the hip on that. We both thought that the Johnson Administration was inept, incompetent, and really stupid. We thought the people who did the fighting actually did the best they could under very difficult circumstances. America just gave up because it was trying to do something that just couldn’t be done, which was to defend a nation that wasn’t a nation and turn South Vietnam into a real nation state, and that was fantasy.

 

Barrett: Steve has very generously included me in two or three of his anthologies, including a couple of original fiction compilations, so we’ve been acquainted since before Flight of the Intruder, when it was originally For Each Other, because we had the same publisher, Naval Institute Press in Annapolis. I remember commenting to the editor who had sent me the manuscript for my opinion, and I said, “This is so good, if you don’t publish it, I will!” and Steve and I have been, as he said, ‘joined at the hip’ ever since.

 

Stephen: It’s been an amazing adventure over the years; it’s really amazing that this is our first book together!

 

When Stephen was a guest on Oliver North’s radio show in May of 1998 (at the 20 min. mark), he got a call from a fellow tailhooker, Barrett, who asked him about a contradiction in the publishing business, where agents and publishers decided there was no longer a market for military-themed books. Stephen, your book, Flight of the Intruder, got rejected by publishers 34 times before it was published. What’s the state of military fiction today in comparison to back then?

 

Stephen: I didn’t realize that was Barrett, but military history well told does find an audience. Now, it isn’t going to be bestseller fiction, but if it’s an important subject well worth writing, then there’ll always be a market for it -- not just for the people who were there, but for students of politics, students of our national identity, people who are worried about the future. If you want to learn about the future, read about the past.

 

How does military fiction today compare to back in the late 90s?

 

Stephen: Well, talking about military fiction, I think it’s worse than it was back then. When I was shopping Flight of the Intruder around in the mid-80s, they told me there was no market at all for Vietnam fiction. “Nobody wants to hear about a war we lost” and “we’re not going to publish it” -- they literally said their corporate decision was, “we’re not going to publish anything about Vietnam.” So times change. Military fiction, per se, is certainly not as big as it was when Tom Clancy and I were writing the so-called techno-thrillers, and those sort of died out as a genre of fiction. The big fiction today is still the same old stuff: sex, murder, whodunnits, the usual.

 

Barrett: There is a cycle to what the publishing business perceives as viable. I remember in about 1993-94, I was discussing the future of World War II history with two of my colleagues, and they both had multiple superb WWII books to their credit. All three of us had heard this emerging conventional wisdom that after the 50th anniversary of WWII in 1995, the market was going to drop off, and none of us believed it, because we knew there was tons of material out there that still waited to be revealed and deserved to be told.

 

Here we are 25 years later, and there’s still a market for good WWII material, whether it’s Rick Atkinson or Adam Makos or Christopher Shores in Britain. As long as the WWII generation continues breathing (and that’s shifting), there will always be a market for it, and I really believe the same for Vietnam, because that was the defining event of our generation and I just don’t think it’s going to dissipate anytime soon.

 

Stephen: The second generation, the children of WWII veterans, are buying WWII histories now to see what their dads and their parents went through. It’s going to be the same with Vietnam veterans. Their children are going to be interested in what their parents went through, a natural progression, but we’re talking history, not fiction. Fiction and history are two different things.

 

Is it a good idea to keep track of oral history databases already out in the public domain as you interview pilots? 

 

Barrett: Oh, sure. Oral histories have only relatively recently become common references, even though they go back to at least the 1950s. Several years ago, on one of the C-SPAN programs on the History channel, Rick Atkinson was asked about his research procedures. He said he almost never interviews WWII veterans -- he won the Pulitzer Prize for his World War II U.S. Army trilogy -- because of slipping memories, and that is a factor. Atkinson specifically mentioned that the enormous depth and variety of oral histories are going to be increasingly important, because more often than not, those were conducted when the subjects were relatively young, frequently within 20 and sometimes 30 years of the events they are describing. There’s a lot to be said for that. However, if Steve and I had accepted Atkinson’s attitude at face value, saying, “no, we’re not going to interview any of the veterans because it has been fifty years now,” the book would not be anywhere near as worthwhile and it certainly wouldn’t have a sense of immediacy. I’m a firm believer that if a conscientious writer/historian looks for the best and most reliable people to interview, and you can determine that without much effort, individual interviews still have a major role to play in recording history.

 

Are your collaborative efforts symbolic in some way of what you hope to achieve in both areas, or about how you want to bridge the gap to entertain readers and educate and engage the public?

 

Stephen: I personally think that Dragon’s Jaw is good, solid history. It’s factually based and it’s written to be as immediate as we could make it. We want to reach out and grab people by the shirts and say, “this is what the people who serve our country do; they risk their lives and their futures and their families to do whatever it is politicians ask them to do.” I think that comes through. In fact, one of my friends was an F-8 pilot in Vietnam; he went on to become chief of naval operations, and he looked at this book and told me, “this just wasn’t for the guys who were there, this is for all the guys in the future who are going to be asked to lay it on the line for the United States of America.” That’s why we did the dedication to all those American military airmen, past, present, and future, who have been or will be called upon to fight in the defense of freedom.

 

Is there any expectation that this will be translated into Vietnamese and released on the market there?

 

Stephen: The problem is, in the book, in some ways we burst a lot of North Vietnamese bubbles. For example, they grossly exaggerated their claims about the shoot-down rate for propaganda purposes, and so when you read a North Vietnamese government-approved account, it’s usually just BS. I can’t imagine that the communist government in North Vietnam is going to want a book like this floating around that in effect points out all of the lies they’ve told through the years.

 

Barrett: On the other hand it wouldn't surprise me to see, say, Japanese and maybe even Chinese rights purchased for Dragon’s Jaw. 

 

Stephen: That's true!

 

Thank you both for your time and I look forward to the book’s release!

Presidential Moral Character and Teddy Roosevelt

 

It was Saturday morning, September 14, 1901, and President William McKinley was dead, eight days after being shot by a crazed assassin. Americans were aghast—this was the third murder of a president in thirty-six years. Everything had been going so well. The economy had rebounded from the 1893 Panic, our industry was the most productive in the world, technological innovation had made life easier, and America had just won a war, gaining a new global empire with unlimited commercial possibilities. 

 

Suddenly, the historically do-nothing office of vice-president was in the spotlight as its occupant was sworn in as the 26th president of the United States. In the cigar- and whiskey-reeking backrooms of big city political bosses, the august boardrooms of Wall Street moguls, and the genteel verandahs of Newport aristocrats, the nation’s elite was anxious about what kind of president Theodore Roosevelt would turn out to be. Many already had an idea and they didn’t like it. This was because by 1901, Roosevelt was anything but an enigma to America. Though he was only forty-two when he became president that Saturday, his moral character and intellectual ability were already widely known.

 

His moral character was illuminated from his first political office at age twenty-three in 1881, when he became the youngest person ever elected to the New York State Assembly. Roosevelt stood out from other politicians because of his fearless quest for honesty and efficiency in government. Over his next three terms in the statehouse, he took on powerful foes like financier Jay Gould, who had attempted to corrupt officials, and Judge Theodore Westbrook, who had a shady relationship with Gould. He not only believed, but showed, that honest government transcended party politics, working with Democratic Governor Grover Cleveland to pass a civil service reform bill. Within two years of his arrival, Roosevelt was chosen to be the Republican minority leader in the state assembly. These years in the gritty mechanics of legislative process provided him with solid experience in how government works, and how to craft arguments to advance his agenda. This knowledge helped greatly once he was president. It also gave him the impetus to fully use presidential power, as shown by the fact he issued more executive orders (over 1000) than any of his 25 predecessors. 

 

His intellect showed early on, as well. During his life, Roosevelt was one of the most prodigious readers and writers in America. By age twenty-four he had written The Naval War of 1812, a book which was soon required reading for naval officers around the world. To this day, it is considered the definitive history of that naval war. Over his lifetime, Roosevelt wrote (no ghostwriters for him!) dozens of magazine articles, essays, thousands of letters, and no fewer than forty-five books. His topics were diverse: hunting, social responsibility, travel, history, biography, politics, living the strenuous life. Naturally, his literary work spread his name and ideas, but it had another benefit. It introduced him to many of the famous journalists and authors of the day, several of whom became lifelong friends who promoted his political programs. It was not by coincidence that Colonel Roosevelt had his own press entourage while trudging through the jungles of wartime Cuba in 1898.

 

His impressive mental capacity was manifested in another way—he was an outstanding orator. He could converse in French and German, though with a pronounced American accent. Roosevelt’s experiences out west with cowboys, in the tenement slums of New York, and with soldiers in the army, people who were completely outside his social norm, taught him their language styles. It gave him confidence in public speaking with different cultures. It also gave him the ability to size up an audience’s pride, hopes, and fears, allowing him to personalize his message to them. Sometimes he even turned adversaries into advocates with his candid sincerity, as he did with German immigrants irate over his decision to enforce the no-alcohol-on-Sunday laws as New York Police Commissioner in 1896. After speaking to them in German, he had them laughing with him. As recordings of his speeches show, Roosevelt’s voice was high pitched and not what we would consider stentorian, but his passion for the topic and audience emerged loud and clear. In short, he knew how to bond with the audience.

 

An obvious sign of Roosevelt’s remarkable self-discipline was his physical fitness, which greatly influenced his character. A sickly boy with severe asthma, as a teenager he transformed his frail body into that of an athlete. He became a devotee of daily practice in martial arts (judo, boxing, and single-stick fighting). Roosevelt always seemed in motion. He never strolled—he strode. He didn’t walk up steps, he leaped two at a time. The tragic deaths of his father, and later his first wife and mother, and after that his brother, taught him perseverance through plunging into hard work, both physical and mental. His time with rough men in the Dakota Badlands, facing enemy fire in Cuba, and in rugged sports, gave him a determination which no one doubted. His clenched jaw and narrowed eyes could give pause to the fiercest opponent, either physical or political. And, of course, there was the other manifestation of his personality, a sense of gentle humor displayed in that famous ear-to-ear enameled grin, accompanied by a true belly-slapping laugh that was impossible not to join in. Often, it was directed at himself.

 

We mustn’t think Roosevelt perfect, however. His moral strength sometimes failed. One of the most prominent political causes which withered was his initial presidential support of black civil rights in the South in the face of increasingly oppressive Jim Crow laws and KKK violence. That support faded as he dealt with the considerable southern political power in Congress. An example is his early friendship with Booker T. Washington, whom he invited to be the first African-American to dine at the White House only three weeks after being sworn in as president. The backlash was immediate and vicious, and Mr. Washington never got another such invitation. During his second term in 1906, Roosevelt made the decision to rely on and support racist army officers’ evaluations and adjudication of a black regiment’s alleged rioting in Brownsville, Texas. A total of 167 soldiers were dishonorably discharged and humiliated, though later they were shown to be innocent. Many in the nation were disappointed by Roosevelt.

 

But when viewed overall, Roosevelt’s life was an extraordinary preparation for the presidency. For over twenty years, he devoted much of his life not to personal gain but to public service. By the time he became president, he’d worked in legislative and/or executive branches of municipal, state, and federal governments. He’d been in appointed positions like U.S. Civil Service Commissioner (under Republican and Democratic administrations) and New York Police Commissioner; and elected positions like assemblyman, governor, and vice-president. He’d won elections and lost them. He’d served in the military as assistant secretary of the U.S. Navy and as a volunteer colonel in the U.S. Army who endured combat. It was a remarkable resumé of service beyond oneself which has seldom been equaled by other presidents.

 

Among the upper class in the Gilded Age, Theodore Roosevelt was an anomaly. Though he came from their class, he didn’t act like them. He didn’t want to change their lives. He wanted to change life for the rest of America, making citizens’ lives safer, fairer, and more hopeful. Roosevelt’s “Square Deal for Every Man” centered on consumer protection, corporate regulation, and conservation of America’s natural wonders.

 

His life experiences and intellectual ability helped frame Roosevelt’s moral character and thus, his political goals. His considerable stamina and skills were used to achieve those goals. Battling the political bosses, corporate moguls, and social elitists, he made progress in surprisingly diverse areas: The Pure Food and Drug Act, Meat Inspection Act, and food safety programs. Protection of labor rights. Promotion of American commerce. Veterans benefits. Rural free postal delivery. Breaking up of commerce, finance, and utility monopolies. From 1902 to 1905 alone, 190 indictments against corrupt government officials. Regulation of railroad rates to ensure access for all. Stressing personal physical fitness and literacy. Support of child labor laws. Construction of the Panama Canal with affordable transit rates for all nations. Modernizing the U.S. Navy and Army. Protecting Latin America against European military attacks. Founding the U.S. Forest Service. Creating 4 game preserves, 5 national parks, 18 national monuments, 51 bird preserves, and 150 national forests.  

 

A century later, America seems to be searching for a leader with moral character, intellectual ability, proven sacrifice for the nation, personal bravery, and genuine sincerity—a new version of Theodore Roosevelt. Someone with whom you might disagree on policy, but still personally admire and trust. Someone who can laugh at themselves, and even get you to join in. Someone the world will respect for speaking softly while carrying that big stick.

 

I know that person is out there, because even though so much has changed over the last 110 years, this is still the America of Theodore Roosevelt.

Anton Chekhov: Environmental Prophet for Our Planet

A still from the trailer for Netflix's Our Planet. 

 

As I was watching Netflix’s wonderful eight-part Our Planet series narrated by Sir David Attenborough, I often thought of Anton Chekhov. Like the series, he often displayed a deep love of nature in his hundreds of stories and plays. A doctor by training, he died 115 years ago, at a mere 44 years of age.

 

Our Planet documents the danger humanity poses to our oceans, sea creatures, and rivers. In Our Planet’s Episode 6, “The High Seas,” Attenborough criticizes modern fishing practices and warns that “if we continue to harvest the seas in this way, it's not just fisheries that will collapse. The whole ocean system could follow. One hundred million sharks are killed every year, just to make shark fin soup. Ninety percent of all large ocean hunters have disappeared.” In Episode 7, “Fresh Water,” we hear, “Today, Pacific salmon number less than one percent of the numbers they used to, and that's causing problems for many other animals.” “Until 30 years ago,” Attenborough notes, “rivers in this [eastern] part of Africa never ran dry. Now . . . during the dry season, the rivers shrink into isolated pools.” (All Our Planet quotes are taken from the film scripts.)

 

 

In Chekhov’s short story “Panpipes” (1887), an old shepherd in the Russian southern steppe region bemoans the diminishing animals, drying up rivers, and deforestation he sees all around. “What will it be like,” he asks, “if the whole world goes to wrack and ruin?” He mentions birds, cattle, bees, and fish and tells his listener, “If you don't believe me ask any old man. Every one of them'll tell you that fish ain't anything like what they used to be. Every year there's less and less fish in the sea, lakes and rivers.”

 

Regarding the rivers, the shepherd says, “Every year they get shallower and shallower, there's no longer those nice deep pools there used to be . . . . In my father's day that's where the Peschanka flowed, but now look where the devil's taken it! It keeps changing course and you see, it'll keep changing course till it dries up altogether. . . .  And what became of all them little streams? In this very wood there used to be a stream with so much water in it the peasants only had to dip their creels in it to catch pike, and wild duck used to winter there. But even at spring flood there's no decent water in it now.”

 

The old shepherd also states that the forests are “being cut down, they catch fire or dry up and there's no new growth. What does grow is felled right away. One day it comes up and the next it's chopped down and so it goes till there's nothing left.”

 

Our Planet devotes Episode 8 to the “Forests,” and we hear that “a third of Madagascar's forests have disappeared in the last 20 years, a result of the continued destruction of their forests by people. Since these pictures were recorded, this forest, and the unique life it once contained, have disappeared altogether. Only three percent of Madagascar's dry forest remains.” 

 

The effect of this disappearing forest and its life is suggested by other Attenborough comments such as, “There are at least 40 different kinds of lemurs, all unique to Madagascar and all endangered. Lemurs are crucial to the forest. Without them, some species of tree cannot survive.” In Episode 1, “One Planet,” we hear, “In the last 50 years, wildlife populations have, on average, declined by 60 percent.” In May 2019, a UN panel of experts concluded that up to one million animal and plant species are threatened by extinction, endangering ecosystems and eroding “the foundations of our economies, livelihoods, food security, health and quality of life worldwide.”

 

In Chekhov’s play Uncle Vania (1899), the writer’s views on deforestation are reflected in the views and words of Dr. Astrov.  In Act I, young Sonia introduces some of his ideas:  He “watches over the old woods and sets out new forests every year. . . . He says that forests are the ornaments of the earth, that they teach mankind to understand beauty and attune his mind to lofty sentiments. Forests temper a stern climate.” 

 

Astrov himself says: “You can burn peat [rather than wood] in your stoves and build your sheds of stone. Oh, I don't object, of course, to cutting wood from necessity, but why destroy the forests? The woods of Russia are trembling under the blows of the axe. Millions of trees have perished. The homes of the wild animals and birds have been desolated; the rivers are shrinking, and many beautiful landscapes are gone forever. And why? Because men are too lazy and stupid to stoop down and pick up their fuel from the ground.”

 

He then adds, “Who but a stupid barbarian could burn so much beauty in his stove and destroy that which he cannot make? Man is endowed with reason and the power to create, so that he may increase that which has been given him, but until now he has not created, but demolished. The forests are disappearing, the rivers are running dry, the wild life is exterminated, the climate is spoiled, and the earth becomes poorer and uglier every day. . . . When I pass village forests that I have preserved from the axe, or hear the rustling of the young trees set out with my own hands, I feel as if I had had some small share in improving the climate, and that if mankind is happy a thousand years from now I'll have been a little bit responsible for their happiness. When I plant a little birch tree and then see it budding into young green and swaying in the wind, my heart swells with pride.” (See here for more on deforestation and global warming.)

 

Much as we see at times in Our Planet, in Act III Astrov shows Elena a district map indicating the forests, vegetation, and animal and human life then existing and as it was fifty and twenty-five years earlier. He itemizes the environmental degradation and concludes: “It is, on the whole, the picture of a regular and slow decline which it will evidently only take about ten or fifteen more years to complete. You may perhaps object that it is the march of progress, that the old order must give place to the new. . . . So it destroys everything it can lay its hands on, without a thought for the morrow. And almost everything has gone, and nothing has been created to take its place.”

 

Astrov’s comment “the climate is spoiled” does not mean that Chekhov, who died in 1904, foresaw today’s climate-change crisis. As Attenborough suggests, it did not yet exist a century ago. In Episode 1, speaking of Antarctica and the Arctic, he states that “in just 70 years, things have changed at a frightening pace. The polar regions are warming faster than any other part of the planet.” And in a new BBC production, Climate Change: The Facts, he states, “In the 20 years since I first started talking about the impact of climate change on our world, conditions have changed far faster than I ever imagined.” (He also notes that climate change is “our greatest threat in thousands of years.”) But if not yet able to foresee the full extent of today’s crisis, Chekhov was at least perspicacious enough to realize that deforestation would affect climate conditions.

 

 

One final environmental problem that Our Planet and Chekhov both refer to is pollution. In Episode 6, “The High Seas,” Attenborough tells us that “plastic pollution is a grave issue for our oceans.” In Episode 7, “Fresh Water,” he refers to polluted springs and the “badly polluted” rivers of Eastern Europe. Much of that latter pollution is due to factories and other producers of industrial waste.

 

In his long story “In the Ravine” (1900), Chekhov writes of a village where “there was always a smell from the factory refuse and the acetic acid which was used in the finishing of the cotton print. The three cotton factories and the tanyard were not in the village itself, but a little way off. . . . The tanyard often made the water in the little river stink; the refuse contaminated the meadows, the peasants' cattle suffered from Siberian plague, and orders were given that the factory should be closed. It was considered to be closed, but went on working in secret with the connivance of the local police officer and the district doctor, who was paid ten roubles a month by the owner.”

 

Beyond the specific environmental concerns that unite Chekhov with Our Planet’s Attenborough, there lies a philosophy of nature. As one scholar has noted, to Chekhov “man and nature are one, they form a cosmic unity,” and “in his mature period Chekhov increasingly uses attitudes and behavior toward nature as a measure of the character and moral stature of individuals and groups.” (See my long essay “The Wisdom of Anton Chekhov” for the source of this and some other Chekhov quotes.)  

 

This Chekhovian approach to nature characterizes not only Attenborough, but also Pope Francis, who began his environmental encyclical of 2015 by stating that “our common home is like a sister with whom we share our life and a beautiful mother who opens her arms to embrace us.” Like that encyclical, the Our Planet website offers advice to deal with our environmental problems in a more enlightened way than has been done from Chekhov’s time to the present.  

 

Chekhov once stated that “in three or four hundred years all the earth will become a flourishing garden. And life will then be exceedingly light and comfortable.” Despite his environmental criticisms, he realized as Attenborough and Pope Francis do, that we must keep hope alive, for as one recent writer observes, “Pessimism would be an ethical catastrophe. It leads only to despair.” 

Why Do People Join Extremist Organizations?

St. Andrew's Church in Sri Lanka, one of the targets of the Easter bombings.

 

It didn’t take long after the suicide bombings that hit Sri Lanka over Easter Sunday for an old question to resurface: What motivated the attackers?

Analysis of similar events in Europe, Africa, and Asia reaches contradictory conclusions. A paper on “Radicalisation and al-Shabaab recruitment in Somalia” found that people joined extremist organizations “for economic benefits.” In fact, the authors write, research from “Somalia showed that 27 percent of respondents joined al Shabab for economic reasons, 15 percent mentioned religious reasons, and 13 percent were forced to join.”

Meanwhile, a World Bank study based on leaked Islamic State records indicated no link between poverty or educational levels and radicalization. A joint study by Northwestern University and the Hebrew University concurred. “Poor economic conditions do not drive participation in ISIS,” the authors found. In fact, many recruits came from wealthy countries with low inequality. Instead, the study concluded, “the flow of foreign fighters to ISIS is driven not by economic or political conditions but rather by ideology and the difficulty of assimilation into homogeneous Western countries.”

In Sri Lanka, too, the extremists weren’t poor people in search of economic improvement. Two were members of a very wealthy family involved in copper mining and the spice trade. Their father founded Colombo-based Ishana Exports, the largest exporter of spices from the island nation. Another of the bombers had studied in England and was a graduate student in Australia before returning to Sri Lanka. According to the Sri Lankan government, most of the attackers were similarly well-educated and had come from “middle- or upper middle-class” families.

Assimilation no factor in radicalization

Yet assimilation problems can’t fully explain the attacks either. Of Sri Lanka’s 22 million people, 70 percent are Buddhist, 13 percent Hindu, 10 percent Muslim, and 7 percent Christian. The groups have been living together for at least 1,000 years, and where religious schisms have turned violent, the bad blood has been between Buddhists and Muslims, not between Christians and Muslims.

To be sure, there are grievances. In Bangladesh, for example, professor Zia Rahman, chairman of the criminology department at Dhaka University, suspects that the rise of extremism resulted from a conflict in internal politics, especially the trials of war criminals who opposed Bangladesh's independence. But Monirul Islam, the country’s counter-terrorism chief, says local militants are inspired by global militant activities.

Those grievances are compounded by a sense that, around the world, Muslims face injustice. Many young people believe that the West has been suppressing Muslims for centuries: the current fighting in Syria, Iraq, and Libya is against Muslims by Muslims, but there is a widespread perception that behind this mayhem is a Western conspiracy to weaken Islam.

Many people also believe that America, especially, works against Muslims because of Israel. No other issue riles up Muslims as much as Palestine. Pakistan's founder, Mohammad Ali Jinnah, warned President Truman in 1947 about this danger and urged him not to divide Palestine. Muslims cannot fathom why Christians support Israel, since Jews do not honor Jesus, whereas Muslims hold him in high regard as one of God’s prophets.

“A feeling of marginalization as a greater community of Islamic Ummah is encouraging even socially affluent people to get involved in ‘Jihad,’” noted a study by the Bangladesh Institute of Peace and Security Studies.
Meanwhile, a study conducted by Anneli Botha at the Institute for Security Studies in South Africa found that 87 percent of respondents cited religion as the reason they joined al Shabab. The extremists deploy the banner of religion to lure followers, as radical leftists once used communism. Religion promises Muslim youth power and prosperity — as Marxism guaranteed freedom from exploitation. Islam offers another reward that makes it a uniquely potent force — heaven after death.

Islamic extremists resemble communists

Educated and wealthy youths join radical movements — be it Islam or communism — because of their desire to create an ideal world. Muslim youths become suicide bombers because they think they are doing their part to make this world a better place. They turn to Islam because there is no other progressive ideology available to them.

This dynamic is longstanding. After all, many communist leaders belonged to the upper class, too, and were highly educated. W. E. B. Du Bois, Ho Chi Minh, and Che Guevara are just a few examples. They were all driven by an understanding that society was leaving some people behind, an earnest desire to end social injustice, and the means to do something about it.

In Bangladesh, hammer and sickle graffiti was ubiquitous in the 1970s—painted by students in their teens and early 20s who fancied themselves romantic revolutionaries. They came from the upper echelon of society, but they thought of themselves as the saviors of their hapless countrymen.

Right or wrong, many Islamic extremists hold similar views; they tend to believe they are on the right side of the equation. Muslims do not want to destroy the West, as the myth goes in Europe and America, even though many of them consider it unfair and unjust. Yet they refuse to be insulted by the West, and they wish to be as wealthy as Americans, if not more so. Above all, they want a seat at the table of equals. Until this happens, Islam and the West will remain mired in sporadic fights.

Accommodation—not confrontation—is the solution. Protestants and Catholics had to accommodate each other to end the Thirty Years' War, one of the most destructive conflicts in human history.
Thomas Harriot and the lost North Carolina Algonquian Language

 

 

Thomas Harriot was the English contemporary and peer of Galileo Galilei and Johannes Kepler, although he’s unknown to most people. That’s because his busy and dramatic life meant that he never got around to publishing his mathematical and scientific work, which is a pity for history: his manuscripts show that he was one of the most brilliant forerunners of modern science and mathematics. In this UN International Year of Indigenous Languages, however, it is especially significant that he was also a pioneering linguist and ethnographer. 

 

He worked for Sir Walter Ralegh as an astronomer and navigational theorist – his first job was to train Ralegh’s sea captains and pilots so that they could make their way safely across the uncharted Atlantic to America. Then, in 1585-86, he spent a year in “Virginia” (today’s North Carolina), with Ralegh’s First Colony on Roanoke Island. 

 

His job in America included a remarkable innovation by Ralegh: that of a kind of diplomat. Harriot had already learned some of the local language – North Carolina Algonquian dialects – from two indigenous men, Manteo and Wanchese. They had spent six months living in Ralegh’s home after sailing to England with his initial reconnaissance fleet in late 1584. Harriot lived in Ralegh’s mansion, too, and had plenty of opportunity to exchange language lessons with the two Americans, who returned home with the First Colony fleet. 

 

Fortunately, Harriot had the right temperament for his diplomatic role – he was open, curious, and notably non-judgmental. He made friends with the people, and clearly enjoyed much about their way of life. We know this because he left a remarkable record, A brief and true report of the new found land of Virginia. It was the only work he published. 

 

Although Ralegh had commissioned this report in order to show the commercial benefits of a trading colony in America, Harriot’s personal empathy and interest in the people and their way of life shines through. He doesn’t just offer a detached list of plants, foods, and so on; he describes the way the people went about their agriculture, hunting, and fishing, noting their abundant crops and their clever ways of building fishing weirs. Their staple crop, corn – called pagatowr in their language – yielded “a very white and sweet flour [that] makes a very good bread”. The people also roasted or boiled the corn for use in stews, for which purpose they used earthenware pots. Harriot commented, “Their women know how to make [these] vessels […] so large and fine that our potters with their wheels can make no better.”

 

He listed many other native foods, giving details of the way they were cooked and how they tasted. He was quite at home with their Algonquian names, which he didn’t obliterate by using Anglicized terms instead. And he was impressed that amid all this abundant food the people ate moderately, while at the same time “making good cheer together”.

 

It’s these kinds of little details of daily life – of shared meals, festivals, making of canoes and pots, planting and hunting – that bring to life a thriving, fascinating, and relatively harmonious community. So much so that when A brief and true report was published in a deluxe illustrated edition in 1590, it became a best-seller, published in four languages. It’s a landmark in American ethnology – a remarkable record of a remarkable way of life, in which Harriot’s report is accompanied by his captions to engravings of John White’s illustrations. Harriot and White had worked together in America: White sketched the people as they went about their daily lives, while Harriot’s knowledge of the language enabled him to converse with the people themselves.

 

 

He had a gift for languages, both verbal and mathematical. He later acted as a Greek language consultant to his friend George Chapman while Chapman was making the first English translation of Homer’s Iliad, and he produced the first fully symbolic algebra. But the exotic Algonquian language opened up a whole new world of sounds and expressions.

 

Harriot was so inspired by these new sounds that he created the world’s first complete phonetic alphabet. Four centuries before today’s global telecommunications revolution, he envisaged a different kind of globalism: the sharing of languages via a phonetic system designed to represent all the possible sounds of human speech. To make his system truly universal, he expressed it in unique trans-cultural symbols. 

 

Once again, he didn’t publish his discovery, and the manuscript of his alphabet was lost for many centuries. Fortunately, he’d used Latin letters to represent Algonquian words in his report on Virginia, which includes one of the earliest written records of indigenous North American words.

 

Neither Harriot nor Ralegh foresaw the disastrous consequences that followed their initial attempts to found a trading base in America – the diseases, the greed, the racism and rapaciousness of many English settlers and administrators. It is a tragic, heartbreaking story that began to unfold even in the First Colony. The way of life brought so vividly to light in the illustrated Brief and true report no longer exists, and the particular North Carolina dialects that Harriot knew have not survived.

 

In 2014, however, historian Scott Dawson turned to Harriot’s work (and that of John Lawson a century later) when he wrote a paper, “The Vocabulary of Croatoan Algonquian”, which was published in the Southern Quarterly. Dawson is a descendant of the Croatoan people of Hatteras Island, which was also Manteo’s homeland – it is not far from Roanoke Island, and it is the last known destination of the famous Lost Colony, which vanished in the year or so after its establishment in 1587. Dawson noted that “a substantial portion” of what we know today about the Croatoan communities and their language comes from Harriot. 

 

Dawson’s paper includes a list of 120 Croatoan words and phrases, which, together with Harriot’s notes and White’s illustrations, offer a precious link to his people’s past. It’s something to celebrate, although this Year of Indigenous Languages also reminds us of how much was needlessly lost. 

To Prevent Brain Drain, Kosovo Must Eradicate Corruption

The Academy of Sciences and Arts in Pristina.

 

On the eve of the twentieth anniversary of the end of the Kosovo war, the country is facing a dramatic, large-scale brain drain. Every day, young professionals wait in long lines in front of EU embassies to apply for visas to legally leave Kosovo in search of job opportunities and more promising futures. While it may be argued that massive brain drain is a problem that some European and Western Balkan states are facing, Kosovo’s out-migration is becoming increasingly acute, especially because of the endemic corruption among the political elite, much of the business sector, and many private and government institutions. The massive emigration of nearly 100,000 people in 2013 alone is so alarming that Kosovo’s government must tackle the problem head-on, with the support of the US and the EU, if Kosovo is to remain a viable country with a secure future.

According to Balkan Insight, a 2016 report from the German Interior Ministry listed Kosovo and Albania as the top countries whose citizens requested asylum in 2015. Kosovars filed 37,095 requests. Only Albanians, with a total of 54,762 requests, filed more. “Unlike the previous migrations of Albanians from Kosovo over the last 50 years, this new wave is different in that these young people are leaving for good, never to return to the country ruled by the elites who stole their future,” says Ilir Deda, Member of Parliament of the Republic of Kosovo and Vice-President of the Liberal-Democratic centrist party Alternativa. According to him, this trend will continue until Kosovo matures and takes decisive political and practical steps to end two decades of endemic corruption among its leaders and their parties. “Kosovo political elites are engaged in unchallenged nepotism, sleaze, misusing of public funds, and impunity that have aroused the feeling of weakness, lack of perspective, and depressed citizenry,” says Lulzim Peci, former Ambassador to Sweden and current Executive Director of the Kosovo Institute for Policy Research and Development.

What is particularly worrisome is that the new emigrants are mostly professionals who have lost hope and accuse their deeply corrupt government of showing complete indifference to their needs. They feel trapped, and leaving the country appears to them to be the only viable option. In Kosovo, where unemployment has reached an alarming 30%, politicians are the richest class in the country. Many big businesses have greatly expanded thanks to the support of politicians, who received millions in return for “their efforts.” Although the EU has deployed a police and civilian mission in Kosovo (EULEX) to prosecute corruption, it has largely failed. In fact, corruption has only become worse under the mission’s watch.

The current US Ambassador to Kosovo, Philip Kosnett, said during the ‘Week Against Corruption’ that government officials continue to accept bribes, interfere in the justice system, and employ their relatives in public institutions. The EU representative in Kosovo, Nataliya Apostolova, reminded Kosovo’s citizens that corruption is ruining their country’s image. US and EU “pressure” to fight corruption and deal with the country’s socio-political and economic ailments has largely failed. The US and the EU must now change their approach, because their strategic interest aligns with the Balkans’ and Kosovo’s strong desire to integrate with the EU and NATO.
It is common knowledge in Pristina that the US has directly interfered in Kosovo’s domestic affairs with little or no opposition, because the US is seen as a reliable friend. In 2011, Kosovo’s parliament elected its first female president, Atifete Jahjaga, who was proposed by the US. In 2015, under US pressure, the Kosovo Parliament passed a law to create the Kosovo Specialist Chambers and Specialist Prosecutor’s Office, a court based in The Hague that has jurisdiction over Kosovo war crimes. Last December, Kosovo created an Army, defying Serbia and even NATO, but with the full support of the US.

There are many steps the US and EU should take to assist Kosovo in revitalizing its economy, encouraging social involvement, and pushing for political reform that would, over time, substantially reduce the number of young people who are leaving the country and precipitating this most disturbing brain drain. They can help Kosovo leave behind the doldrums in which it finds itself and chart a new path, one that Kosovo’s government and institutions should fully embrace, leading the country to a better and more promising future.

To send a clear signal to entrenched corrupt Kosovar officials, US officials should regularly meet with trustworthy politicians and refuse to engage crooked officials in any social setting, while preventing high-level businessmen from receiving EU and US visas. This will send an unambiguous message to the public that there is no international support for those self-serving officials who are undermining Kosovo’s future wellbeing.

To nurture an independent judicial system, the US and EU should expand training programs for young judges, lawyers, and prosecutors, expose them to the ways the US and EU prosecute corruption, and push for anti-corruption legislation. In addition, the US and EU should exert all necessary pressure on the government to reform the educational system, including technical training to provide new job opportunities and prepare a new generation to assume leadership positions.

Since Kosovo wants to join the European Union, the EU is in a position to demand that the government begin a systematic process of cleaning up its act by fully adhering to the EU’s requirements for membership and complying with democratic principles, human rights, freedom of the press, and an untainted judiciary.

Moreover, the US can help Kosovo develop commercial opportunities by creating a better business climate for foreign investment while encouraging business interaction among western Balkan economies. A healthy economy allows employers to raise salaries – currently the lowest in the region – which can, at least in part, help stem the emigration of youth, especially young couples who can hardly make ends meet. Of particular importance, the US ought to insist that at least 20 percent of its financial aid to Kosovo be dedicated to participatory sustainable development projects. Communities can choose their own projects, in which young people would be directly involved, develop a strong sense of belonging, feel needed, find meaning in their work, and gain a vested interest in their projects and thus the motivation to stay.

For these initiatives to work well, top officials must commit to protecting human rights, ending arbitrary incarceration and police brutality, preventing human trafficking, and protecting free speech and free media outlets, while undertaking social and political reforms to strengthen the democratic foundation.
In the final analysis, however, every single official ought to remember that Kosovo has emerged from the ashes of many thousands of men and boys who were slaughtered by the Serbian military to prevent the rise of an independent and free Kosovo. They have a moral responsibility and a sacred duty to put the country’s national interest above their own and prevent brain drain, as the future of Kosovo rests on the vitality of its youth, in which every single Kosovar has a stake.

The Coming of American Fascism, 1920–1940

 

Fascism is usually thought of as a quintessentially and almost exclusively European phenomenon that began with Mussolini, culminated with Hitler, and was eradicated in World War II. The U.S., in particular, is thought to have been largely immune to it, given the absence of mass movements similar to Nazism or Italian Fascism. But a different narrative exists, or at least did in the 1930s, before it was buried under an avalanche of patriotic American propaganda and liberal historiography. According to this alternative understanding, the U.S. was falling victim to fascism as early as the 1920s—though of a different sort than the European variety. Long-forgotten Marxist journals such as The Communist, The New Masses, and Labor Notes (unrelated to the current publication of the same name), and newspapers like the Daily Worker and the Industrial Worker, analyzed with great insight the nature of this distinctive American fascism, until the struggle against the Nazis shifted their priorities to supporting a more liberal and “patriotic” Popular Front.

 

In his new book, The Coming of the American Behemoth: The Origins of Fascism in the United States, 1920–1940, Michael Joseph Roberto has resurrected the old Marxian conception. Aside from its interest as a work of history, Roberto’s book is particularly timely, as the old structures of American fascism have deepened in the last generation and colonized much of the world. 

 

The essence of fascism

            

Roberto’s book reconstructs the arguments outlined in pioneering works of the 1930s and ’40s, such as Lewis Corey’s The Decline of American Capitalism, Mauritz Hallgren’s Seeds of Revolt, Robert Brady’s The Spirit and Structure of German Fascism and Business as a System of Power, Carmen Haider’s Do We Want Fascism?, and A. B. Magil and Henry Stevens’ The Peril of Fascism. These authors and others, whose insights were ignored by subsequent liberal scholarship, understood, first, that fascism was not uniquely European and, second, that it had already arrived in the United States. For example, Brady noted in 1938 that “business is going political as it never has before, and it has learned to funnel its funds and pressures through highly centralized, interest-conscious, informed and exceedingly well-manned, united front organizations.”

            

After World War II, liberal understanding of fascism focused on the German and Italian characteristics, notably their one-party nature, their reliance on readily identifiable paramilitary groups and their violent nationalist, racist, and anti-Semitic ideology. While it’s perfectly reasonable to consider such phenomena as one manifestation of fascism, the analysis tends toward superficiality insofar as it obscures the class roots and class functions of the regime. Roberto believes the Marxist approach, which looks beneath the surface, is more penetrating, resulting in a “dynamic definition of fascism as an inherent function of monopoly-capitalist production and relations whose telos was and remains the totalitarian rule of capitalist dictatorship.” Or as Carmen Haider said: American fascism was the “attempt to introduce a collective form of capitalism in the place of individualism.”

 

The Marxists were not alone in this view. As Brady notes, in the 1930s, “many persons strategically placed in American business confidentially argue that [fascism] is already here in both spirit and intent.” In a 1937 speech Harold Ickes, Franklin Roosevelt’s Secretary of the Interior, argued that “fascist-minded men” had “a common interest in seizing more power and greater riches for themselves, and ability and willingness to turn the concentrated wealth of America against the welfare of America. It is these men who, pretending that they would save us from dreadful communism, would superimpose upon America an equally dreadful fascism.” Roosevelt himself sounded the same note in a speech a year later when he said “I am greatly in favor of decentralization, and yet the tendency is, every time we have [a recession] in private industry, to concentrate it all the more in New York. Now that is, ultimately, fascism.”

 

The New Deal’s corporatism

            

Roberto tells the history of the American political economy in the 1920s and ’30s through this lens, exploring how the fascist structures of our own day were forged in the interwar years. Much of his book, in particular the long expositions of Marxian economics, will be familiar to readers versed in left-wing literature. He devotes a chapter to the ideologists of fascism, or business rule, in the conservative 1920s, notably Thomas Nixon Carver, Harvard professor of economics, and Charles Norman Fay, vice-president of the National Association of Manufacturers and author of Business in Politics. He also examines the role of Edward Bernays, father of public relations and believer in the necessity of “regimenting the public mind every bit as much as an army regiments the bodies of its soldiers.” 

            

However, by 1930 the Great Depression had exposed the fallacy of those who believed in the concentration of power and wealth in the hands of the few. It turns out that when all the money goes to the top, the people on the bottom don’t have enough money to keep the economy growing. According to the leaders of business and politics, the answer to this problem was more fascism. Many of them pined for a Mussolini. Even liberal newspapers like the New York Times advocated “some sort of Council of State” that could rule by decree. In the end, the oligopolists only partially got their way, with the establishment of Roosevelt’s National Recovery Administration (NRA) in 1933.

            

At the time, Marxists and socialists argued that the New Deal was simply a higher stage of fascism; Roberto concurs. “Conceived as a means to create common ground between government and industry,” he writes, “the NRA marked a decisive move toward state monopoly capitalism in the United States.” The real power was left in the hands of big business, which wrote hundreds of “codes” to regulate prices, wages, work hours, etc., all to restore profits and eliminate overproduction. The NRA was a move towards a planned, state capitalist economy, of which big business was the sole beneficiary. Small businesses suffered, workers were not really empowered, income was not redistributed, and the economy remained sluggish. But the profits of big business recovered. 

            

The early New Deal “bore strong resemblances,” Roberto notes, “to the corporatist state established in Italy in its approach to reconciling the antagonism between capital and labor.” Both Mussolini and Roosevelt had made clear their commitment to maintain and strengthen capitalism in their respective nations. Roosevelt himself admired Mussolini: “I don’t mind telling you in confidence,” he wrote an American envoy in 1933, “that I am keeping in fairly close touch with the admirable Italian gentleman.”

 

Huey Long and Charles Coughlin

            

Roberto is on shakier ground when discussing the “small-fry fascisti” who populated America’s political landscape during the Depression. His argument that Huey Long and the “radio priest” Father Coughlin were reactionaries and fascists is particularly weak. Long was a famously populist, albeit dictatorial, governor of Louisiana in the early 1930s who later became a U.S. senator, from which perch he criticized the New Deal for its conservatism and proposed his own wildly popular “Share Our Wealth” program. Had he not been assassinated in 1935, he might have posed a serious challenge to Roosevelt’s reelection. Coughlin, on the other hand, was never a political leader, though his radio broadcasts made him a political force. He, too, criticized the New Deal for its conservatism.

 

My own research on U.S. politics during the Depression has led me to conclude that, despite what some historians (including Roberto) have argued, Long and Coughlin were more left-wing than right-wing, at least until Coughlin in later years turned decisively toward anti-Semitism. Certainly, they were politically ambiguous. But it’s inarguable that their massive following was due to the far-left character of their rhetoric—as may be judged by the Principles Coughlin laid out for the National Union of Social Justice, the political organization he founded. He went so far as to condemn the economic system itself: “Capitalism is doomed and not worth trying to save.” 

 

Roberto’s characterization of those who were attracted to Long and Coughlin is also wrong.

 

Amid the swirl of change, dislocation, and anxiety about the present and fears for the future, [the petty bourgeoisie] made up the great wave of political reaction during the mid-1930s… Not understanding how and why those above them were responsible for the crisis that threatened them, they blamed most of it on the enemies lurking below, the Negroes, Jews, Catholics, Mexicans, anarchists, socialists, and, of course, the communists—all enemies of True Americanism.

 

As I have argued elsewhere, there was no “great wave of political reaction” in the mid-1930s except among big business. The middle and lower classes were generally far to the left of Roosevelt—and pushed him to the left in 1935, with the so-called Second New Deal that partially repudiated the fascist tendencies of the first. Long and Coughlin themselves played an important part in this swing to the left, since Roosevelt’s popularity was waning in 1934 under the barrage of left-populist criticism. As a result, in 1935 he supported the Wagner Act, the Social Security Act (which was more conservative than most Americans wanted), and the establishment of the Works Progress Administration. In 1936 he ensured his overwhelming reelection by taking a page from Long’s book and denouncing “economic royalists” who were callous to the suffering of Americans. 

            

The truth, then, is that Long and Coughlin, together with the influential Communist Party and other leftist organizations, helped save the New Deal from becoming genuinely fascist, from devolving into the dictatorial rule of big business. The pressures towards fascism remained, as reactionary sectors of business began to have significant victories against the Second New Deal starting in the late 1930s. But the genuine power that organized labor had achieved by then kept the U.S. from sliding into all-out fascism (in the Marxist sense) in the following decades.

 

The struggle to come

 

As we confront a polarized and oligarchical political economy so redolent of the factors that precipitated the Depression, The Coming of the American Behemoth offers lessons for the present. All the debate about whether Donald Trump is fascist, or whether society is in danger of succumbing to fascism, can be seen, from one perspective, as missing the point. Fascism in the materialist sense is already here and would be here even if Hillary Clinton had won the presidency.

 

The danger isn’t so much that “paramilitary formations of brown shirts or black shirts” will take over America. It’s that Americans will fail to overturn the class foundations of fascism that are at this moment racing to destroy life on Earth. Roberto is right to emphasize this deeper structural reality.

 

The American Behemoth rose in the 1920s and ’30s. In the twenty-first century, “the beast is at full strength.” The Coming of the American Behemoth can serve not only as a useful problematization of the liberal understanding of fascism but also as an effective primer on the historical background for activists committed to fighting the beast that threatens to destroy us all.

What Trump Could Learn About Immigration from Teddy Roosevelt

An example of anti-Japanese sentiment.

 

Recently, Donald Trump virtually gutted the Department of Homeland Security with the forced resignation of DHS Secretary Kirstjen Nielsen and key deputies because they were ‘too soft’ on immigration. He then made a bad situation far worse with the de facto appointment of the xenophobic Stephen Miller to take over immigration policy. 

Anti-immigrant sentiment has been at the forefront of Trump’s politics since he announced his run for president. At his campaign announcement in 2015, he labeled Mexicans as rapists. Once in power, he has labeled immigration a national crisis and has demanded a wall between the U.S. and Mexico. He initially asserted Mexico would pay for such a wall, but when Congress denied him funding for it he shut down the government for a record-breaking 35 days. Most recently, he threatened to shut down the U.S.–Mexico border but ultimately did not because of the loss in trade. 

President Trump could learn a valuable lesson from Theodore Roosevelt on immigration. At the turn of the 20th century, Asian immigrants were demonized by Americans. The Chinese laborers brought to the west to work on the construction of railroads fueled the hatred of Asians and the “Yellow Peril” that some thought threatened to take over America and destroy Western civilization. 

After the Chinese Exclusion Act of 1882, the focus turned to the relatively small number of Japanese coming to America. Anti-Japanese sentiment was particularly virulent in San Francisco, where Mayor Eugene E. Schmitz formed the Japanese and Korean Exclusion League in 1905. Schmitz demanded the segregation of the tiny fraction of Japanese children in the public schools in order "to save white children from being affected by association with pupils of the Mongolian race." The Board of Education agreed and the children were forced to attend a segregated school. 

In much the same way Americans protested the treatment of Amanda Knox in the travesty of a murder trial, Japan interpreted the discrimination as an insult to its national pride. Japan had recently been fortified by military victory over Russia and the acceptance of Japan by Western nations as an emerging world power. A series of diplomatic notes passed between Japan and the United States, and tensions mounted. 

In order to defuse and resolve the problem, Roosevelt brought the mayor and the school board to the White House and cajoled them into reversing the decision. He secured a promise that the segregation would be lifted if Japan restricted emigration. The Japanese government agreed and stopped issuing passports to the United States, although some emigrants were allowed to go to the Hawaii Territory. With that guarantee, the school board relented. 

The resolution became known as the Gentlemen’s Agreement of 1907 or in Japanese Nichibei Shinshi Kyoyaku. Without rancor or inflammatory rhetoric Roosevelt solved an immigration crisis. The agreement was not perfect. Some Japanese people granted entry into Hawaii could and often did make their way to the mainland.  An exception for family members, a practice now denounced by the current president as ‘chain migration’, still allowed some Japanese to come to the United States. Japanese nationals escaped the feared exclusion acts until the Immigration Act of 1924 which cut Asian immigration to near zero.  

Even if the current occupant of the White House knew this part of history, he would not be able to learn from it. Intelligent foreign policy often requires the delicate touch of a scalpel rather than the pounding of a sledgehammer. But the rest of us can know from this history that it is possible to handle immigration policy sensibly and even compassionately. And there is hope that a better president will soon be repairing the damage. 

What If Donald Trump Resigned?

Two-thirds of the American public (give or take a little) now believe that it is time for our President to stop being President. Trump should no longer have the power to take us to war on a whim or to ruin the careers of our leaders. 

 

There has been endless, somewhat idle discussion of “Impeachment” in Congress. It hasn’t proven, so far at least, to be the answer to our dilemma. Considerable agreement has developed that a case for an Exodus needs to be made—and soon. While that case can be made (by lawyers, by partisans, by the impatient, and by those who take our foreign affairs exceptionally seriously), there is a plain truth: we’re getting nowhere. 

 

Tempers have risen as the convoluted months have passed. Countless speeches have been made urging change—and not just in favor of immediate action.  There are among us political party members who pause, consider, maybe show some sadness, and dwell a bit drearily on the theme:  “Yes, I know he really has to go.  But we’re getting nowhere.”

 

I have slowly arrived at a point of view.  Oh, I’ve done what I can:  I’ve written three substantial articles that unreservedly  attack President Donald J. Trump’s performance in office.  It was a pleasure to write, then read, them—if frustrating.  To the extent there has been a reaction, it has been favorable enough, but mostly ineffective.  “Yes,” vast numbers say, “he does have to go.” 

 

If we agree pretty much on the need for Trump’s departure, the time is very much at hand to ask, essentially, What does he think about it?  What does he want, mid-term in the White House? Does he think there has  been enough roughhousing, yelling, defiance, repudiation of  important leaders at times and for reasons that are bound to be embarrassing?  Persecution, really rudeness, to the Press? Could it be that our peerless leader is agreeable to returning himself to a variety of estates and golf courses?

 

Thinking about his “situation” and the unpleasant circumstances that are slowly developing for us and for him, it does seem to this observer that a moment of crisis is approaching.  What, then, has become the Path I see to some kind of solution?

 

Since writing the initial draft of this article our good Nation has sent an aircraft carrier squadron to the Persian Gulf as an all too obvious threat to the Iranian government.  This aggressive action has been taken entirely on the initiative of the one who has other choices!  Military engagement is not the option that will bring him a true and lasting  sense of well being.  He need not suffer legal confrontations, speech and rebuttal, partisan challenges, and never ending indignities to family members (deserved or not).  As the days drag on it is so very apparent there is a tenable solution:

 

The Honorable leader of the executive branch of the United States should RESIGN at a very early opportunity. The President should not drag his feet until the Situation gets too hot to handle.

 

Yes, the owner of “the Trump estate,” that husband of a lovely lady, parent of stalwart children, and regular commuter to Mar-a-Lago and traveler to random places worldwide in government airplanes, should once and for all  take the terrible pressure off his mind and his health by JUST DEPARTING.

 

When President Richard Nixon finally decided the time had come, he wrote a one line notification of what he was doing.  It sufficed then.  But noticeably more than that is needed now. The President will want to offer his point of view to Posterity!  Believe it or not, we the Public will be receptive to thinking and weighing his final point of view.

 

 I have thought about it.  Here is a tentative draft resignation that I think might serve presidential needs and history as well:

 

“I am today resigning the position of President of the United States, effective at the time of transmitting this letter to the Congress.  The never ending turmoil surrounding daily and weekly events is beginning to be a considerable strain on my  well-being.  I fear that it will affect my physical condition before too long. 

 

“The position I have been occupying is one of never ending, constant responsibility. It has had its rewards, for me and members of my family. I feel I have served my Country well.

 

 “I could continue—waging the never ending political battles that so entrance those for whom such political activity is a lifetime activity.  But I am increasingly aware that Life has other rewards in store for me—provided I treat it with careful regard. 

 

“As I say goodbye, I trust that observers will weigh with proper regard the several aspects of my presidency—partisan or not—and arrive at a balanced verdict on my shortened career as President.

 

“I wish my successors well.  Overall,  I am quite certain that my impact on the Presidency of the United States has been positive.”

 

DONALD J. TRUMP                                   

 

The letter above, drafted clear across the Nation cautiously and respectfully (yet still a Draft),  is the best I can offer for consideration at this point in time. It should not bear my name.   “Draft Letter for consideration” is intended as a title and should suffice.

 

I am suggesting this avenue as a possible way—sometime in the near future--to bring an end to the several  crises into which  our beloved Country has gradually worked itself,  and to avoid any and all wars which may ominously be waiting out there!   Our Leader will write his own letter, of course—and by no means do I expect it will be more than a tiny bit  influenced by my ordinary citizen’s prose—if indeed that. (I have no illusions that my prose will be the words finally chosen!)

 

Do be of good faith, fellow citizens of whatever persuasion.  We must avoid additional unpleasantness—and far worse!  Keep calm on the domestic front, and by all means be patient.  Rise above partisanship.  Let’s meet our Leader halfway on the course I suggest which, if taken, may  just be the direction to improving the future of all Americans.

Cassius Marcellus Clay and Muhammad Ali: What’s in a Name?

 

A new documentary on Muhammad Ali, What’s My Name?, is debuting on HBO, depicting the life and career of the man once known as Cassius Marcellus Clay Jr. 

 

What is in a name? 

 

To Ali his name meant everything. 

 

Said Ali: 

 

“Cassius Clay is a slave name. I didn’t choose it and I don’t want it. I am Muhammad Ali, a free name – it means beloved of God, and I insist people use it when people speak to me,” said a newly converted Ali when addressing the media.

 

Ali was not joking. During a pre-fight interview with Ernie Terrell before their February 1967 fight at the Astrodome, Ali, as he was called by ABC’s Howard Cosell, regaled viewers with one of his patented poems to taunt Terrell, who responded by calling him Clay. Ali was not amused, asking why Terrell insisted on calling him Clay and warning that he was going to pay.

 

“What’s my name?” yelled Ali in the eighth round as he pounded Terrell with jab after jab, breaking his eye socket, on the way to a 15-round decision, one of the more merciless beatings in boxing history not to end in a knockout. He had only one more fight before he lost his boxing license for refusing to be drafted into the United States Army on religious grounds. 

 

When asked what his new Muslim name meant, the man heretofore known as Cassius Clay responded, “Worthy of praise, the most-high.”

 

Ali is more than an icon of sport. Ali’s life was emblematic of so many social identities: race, capitalism, war and peace, civil disobedience, freedom of religion, ostracization and redemption. He transcended sport; he was overtly political. Ali became a cultural touchstone and symbol of change during a time when race and religion, then as now, were defining paradigms of national discourse. Most importantly, he spoke his truth. 

 

But what about the name Cassius Marcellus Clay? Said a young Ali when interviewed before the Olympic trials: 

 

“I am Cassius Marcellus Clay VI; my great grandfather was a slave and was named after some great Kentuckian…Cassius Marcellus Clay is [a] great name in Kentucky and really where he was from, I couldn't tell you. Now that [I've] obtained a little fame people want to know where I am from now, I am going to or have to look it up and see what it's all about now that I am getting a [few] interviews.”

 

The man for whom Ali was named, Cassius Marcellus Clay, also risked his livelihood and even his life to stand up for what he believed. 

 

Cassius Marcellus Clay turned his back on his own culture, put himself at the fore of social change and became one of the leading Southern Abolitionists of the 19th century. Like Ali, he was born in Kentucky and like Ali, it was American racial inequality and social unrest that changed Clay’s life and sent him on a course of political activism. Like Ali he was steadfast in his beliefs and had the force of personality to match.

 

He was a cousin of the famed Whig politician Henry Clay, who espoused antislavery ideas but owned slaves throughout his life. His father was the largest slaveholder in Kentucky, and it was in that milieu his conscience was first awakened to the evils of slavery. Abolitionism became the defining theme of Clay’s political career and life. 

 

As a Yale student with political connections he had the fortune to encounter many of the leading Northern Abolitionists, first meeting Daniel Webster and then William Lloyd Garrison whom he heard speak. Garrison’s rhetoric and unrelenting political action served as a catalyst to inspire the young Clay:  “the good seed which Garrison had watered, and which my own bitter experience had sown, aroused my whole soul.”

 

When he went back to Kentucky he continued to fight for the cause of abolition. Kentucky was at the epicenter of the debate over slavery and union. 

 

Clay was elected to the Kentucky legislature as a Whig in 1836 and served three terms, but he eventually followed in the footsteps of Garrison and started the abolitionist newspaper the True American. The newspaper was repeatedly threatened and denounced by decree. Clay wrote in his 1885 memoir: 

 

My object was to use a State and National Constitutional right—the Freedom of the Press — to change our National and State laws, so as, by a legal majority, to abolish slavery. There was danger, of course, of mob-violence…and I determined to defend my rights by force, if need be. 

 

In the 1850s he joined the newly formed Republican Party, though he didn’t always see eye to eye with its leaders. He eventually aligned himself with Abraham Lincoln, with whom he shared many views. Clay vigorously campaigned for Lincoln, rousing audiences with speeches and shouting down those who wanted to silence him. In one of the hotbeds of political unrest, on the precipice of Civil War, Clay stood for what he believed in: republicanism and the abolition of slavery.

 

Clay, as his memoirs make plain, was, like Ali, never one for humility. He notes that his name was bandied about for vice president and that, had he been present at the Republican convention of 1860, he might have been chosen over Hannibal Hamlin of Maine. 

 

As it was, he was promised a position in what Doris Kearns Goodwin dubbed the “Team of Rivals,” but the cabinet was full.

 

Eventually he was given the post of United States minister to the Russian Empire, where he was instrumental in securing Russian support for the Union and in helping to keep countries like Britain from recognizing the Confederacy for economic gain. Though seldom spoken of, his contribution was essential to the war effort. 

 

Clay also advocated emancipation as an act of war as early as 1856, and he writes that he urged Lincoln to issue the Emancipation Proclamation in 1862. He objected, however, that the Proclamation applied only to areas still in rebellion, not to those under Union control. Although he was in Russia and not there to see it, Clay received many laudatory letters from men like Garrison and Wendell Phillips when emancipation became a reality. 

 

For Ali, his stand against the Vietnam War nearly ended his sporting career; for Clay, his political stance was a matter of life and death. While debating the merits of abolition (he opposed the annexation of Texas because of slavery, even though he later fought in the Mexican War), what began as a peaceful engagement turned violent. Clay was shot by a member of a mob planning to kill him, and he had to defend his life with his knife, killing one of his assailants in self-defense. 

 

It is ironic that Ali, who made his living as a pugilist, took a peaceful political stance, while his namesake, who made his living as a political figure on the soapbox, nearly had his life and career cut short by violence. Yet they share a common bond: each was willing to risk ostracism for what he believed.

 

For Ali, that meant standing up for his religious beliefs and, for a time, becoming something of a national pariah among the many who didn’t understand his conversion or agree with his refusal to fight in the war. Eventually Ali was vindicated by the law of the land, an 8-0 Supreme Court vote overturning his conviction on the grounds of conscientious objection. He became one of the most beloved and recognized men on earth, and many see him as a symbol of greatness and national pride. Ali lit the torch at the 1996 Olympic Games and received the Presidential Medal of Freedom. 

 

Like Ali, Clay would not be silenced.

 

Said Pulitzer’s New World: 

 

Cassius M. Clay won another victory for free speech, and struck a good blow in behalf of Republicanism…Mr. Clay had publicly announced, through both the papers issued at Richmond, that he intended to speak on this occasion, and the subject was much canvassed in the streets. The more violent portion of the Revolutionary Committee, we learn, were for silencing him.

 

Each felt a call to action that changed his life. Each defied public opinion and mounting vitriol to assert his ideals and stand for what he believed, using his gift of rhetoric to let people know just what he thought. Each man left a marked impact on some of the pervading narratives of American history: race, social equality and national identity.

 

The two men born Cassius Marcellus Clay have a lot in common, showing that name, birth and background don’t necessarily dictate one’s impact; rather, it is acculturation and moral courage that do. Both Ali and his namesake are connected by one moniker, and while one man eschewed the name Cassius Clay, the abolitionist and the athlete alike are synonymous with courage and social change.

Roundup Top 10!

 

It’s time to stop viewing pregnant women as threats to their babies

by Kathleen Crowther

How Georgia is continuing a centuries-long tradition, and why it must stop.

 

Why we can — and must — create a fairer system of traffic enforcement

by Sarah A. Seo

Its discretionary nature has left it ripe for abuse.

 

 

If judicial nominees don’t support ‘Brown v. Board,’ they don’t support the rule of law

by Sherrilyn Ifill

Few of us — no matter our race, color or creed — would recognize our democracy or legal system without the changes touched off by this momentous civil rights case.

 

 

How anti-immigrant policies thwart scientific discovery

by Violet Moller

By hindering international collaboration, the Trump administration has triggered a “brain drain.”

 

 

Why We Still Care About America’s Founders

by Rick Atkinson

Despite their flaws, their struggle continues to speak to the nation we want to become.

 

 

Rashida Tlaib Has Her History Wrong

by Benny Morris

The representative’s account of the Arab-Israeli conflict relies on origin myths about the birth of Israel.

 

 

A Whitewashed Monument to Women’s Suffrage

by Brent Staples

A sculpture that’s expected to be unveiled in Central Park next year ignores the important contributions of black women.

 

 

Redacting Democracy

by Karen J. Greenberg

What You Can’t See Can Hurt You

 

 

Men Invented ‘Likability.’ Guess Who Benefits.

by Claire Bond Potter

It was pushed by Madison Avenue and preached by self-help gurus. Then it entered politics.

 

 

 

Special Focus: Impeachment

What Democrats Can Learn About Impeachment From the Civil War

by Jamelle Bouie

Lesson One: Don’t let Trump take the initiative.

 

How the Mueller report could end the Trump presidency without impeachment

by Jasmin Bath

Democrats should run on a message from 1860: You need a president you can trust.

 

 

An Open Memo: Comparison of Clinton Impeachment, Nixon Impeachment and Trump Pre-Impeachment

by Sidney Blumenthal

The facts and history indicate that the Clinton case bears little if any relevance to the Trump one, while the Nixon case shows great similarity to Trump’s.

 

 

The Precedent for Impeachment: Nixon, Not Clinton

by Kevin Kruse and Julian Zelizer

"Blumenthal, who had a front row seat to the Clinton drama, understands that there are major differences between these two instances."

 

The D-Day Warriors Who Led The Way to Victory in World War II

 

From THE FIRST WAVE: The D-Day Warriors Who Led The Way to Victory in World War II by Alex Kershaw, published by Dutton, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2019 by Alex Kershaw.

 

The clock in the war room at Southwick House showed 4 a.m. The nine men gathered in the twenty‐five‐by‐fifty‐foot former library, its walls lined with empty bookshelves, were anxiously sipping cups of coffee, their minds dwelling on the Allies’ most important decision of World War II. Outside in the darkness, a gale was blowing, angry rain lashing against the windows. “The weather was terrible,” recalled fifty‐three‐year‐old Supreme Allied Commander Dwight Eisenhower. “Southwick House was shaking. Oh, it was really storming.” Given the atrocious conditions, would Eisenhower give the final go‐ahead or postpone? He had left it until now, the very last possible moment, to decide whether or not to launch the greatest invasion in history.

Seated before Eisenhower in upholstered chairs at a long table covered in a green cloth were the commanders of Overlord: the no‐nonsense Missourian, General Omar Bradley, commander of US ground forces; the British General Bernard Law Montgomery, commander of the 21st Army Group, casually attired in his trademark roll‐top sweater and corduroy slacks; Admiral Sir Bertram Ramsay, the naval commander who had orchestrated the “miracle of Dunkirk”—the evacuation of more than 300,000 troops from France in May 1940; the pipe‐smoking Air Chief Arthur Tedder, also British; Air Chief Marshal Sir Trafford Leigh‐Mallory, whose blunt pessimism had caused Eisenhower considerable anguish; and Major General Walter Bedell Smith, Eisenhower’s chief of staff.

A dour and tall Scotsman, forty‐three‐year‐old Group Captain James Stagg, Eisenhower’s chief meteorologist, entered the library and stood on the polished wood floor before Overlord’s commanders. He had briefed Eisenhower and his generals every twelve hours, predicting the storm that was now rattling the windows of the library, which had already led Eisenhower to postpone the invasion from June 5 to June 6. Then, to Eisenhower’s great relief, he had forecast that there would, as he had put it with a slight smile, “be rather fair conditions” beginning that afternoon and lasting for thirty‐six hours.

Once more, Stagg gave an update. The storm would indeed start to abate later that day.

Eisenhower got to his feet and began to pace back and forth, hands clasped behind him, chin resting on his chest, tension etched on his face.

 

 

What if Stagg was wrong? The consequences were beyond bearable. But to postpone again would mean that secrecy would be lost. Furthermore, the logistics of men and supplies, as well as the tides, dictated that another attempt could not be made for weeks, giving the Germans more time to prepare their already formidable coastal defenses.

Since January, when he had arrived in England to command Overlord, Eisenhower had been under crushing, ever greater strain. Now it had all boiled down to this decision. Eisenhower alone—not Roosevelt, not Churchill—had the authority to give the final command to go, to “enter the continent of Europe,” as his orders from on high had stated, and “undertake operations aimed at the heart of Germany and the destruction of her armed forces.” He alone could pull the trigger.

Marshaling the greatest invasion in the history of war had been, at times, as terrifying as the very real prospect of failure. The last time there had been a successful cross‐Channel attack was 1066, almost a millennium ago. The scale of this operation had been almost too much to grasp. More than 700,000 separate items had formed the inventory of what was required to launch the assault. Dismissed by some British officers as merely a “coordinator, a good mixer,” the blue‐eyed Eisenhower, celebrated for his broad grin and easy charm, had nevertheless imposed his will, working eighteen‐hour days, reviewing and tweaking plans to launch some seven thousand vessels, twelve thousand planes, and 160,000 troops to hostile shores.

Eisenhower had overseen vital changes to the Overlord plan. A third more troops had been added to the invasion forces, of whom fewer than 15 percent had actually experienced combat. Heeding General Montgomery’s concerns, Eisenhower had ensured that the front was broadened to almost sixty miles of coast, with a beach code‐named Utah added at the base of the Cotentin Peninsula, farthest to the west. It had been agreed, after Eisenhower had carefully managed the “bunch of prima donnas,” most of them British, who made up his high command—the men gathered now before him—that the attack by night should benefit from the rays of a late‐rising moon.

In addition, it was decided that the first wave of seaborne troops would land at low tide to avoid being ripped apart by beach obstacles. An elaborate campaign of counterintelligence and outright deception, Operation Fortitude, had hopefully kept the Germans guessing as to where and when the Allies would land, providing the critical element of surprise. Hopefully, Erwin Rommel, the field marshal in charge of German forces in Normandy, had not succeeded in fortifying the coast to the extent that he had demanded. Hopefully, the Allies’ greatest advantage—their overwhelming superiority in air power— would make all the difference. Hopefully.

Not even Eisenhower was confident of success. “We are not merely risking a tactical defeat,” he had recently confided to an old friend back in Washington. “We are putting the whole works on one number.” Among Eisenhower’s most senior generals, even now, at the eleventh hour, there was precious little optimism.

Still pacing, Eisenhower thrust his chin in the direction of Montgomery. He was all for going. So was Tedder. Leigh‐Mallory, ever cautious, thought the heavy cloud cover might prove disastrous.

Stagg left the library and its cloud of pipe and cigarette smoke. There was an intense silence; each man knew how immense this moment was in history. The stakes could not be higher. There was no plan B. Nazism and its attendant evils— barbarism, unprecedented genocide, the enslavement of tens of millions of Europeans—might yet prevail. The one man in the room whom Eisenhower genuinely liked, Omar Bradley, believed that Overlord was Hitler’s “greatest danger and his greatest opportunity. If the Overlord forces could be repulsed and trounced decisively on the beaches, Hitler knew it would be a very long time indeed before the Allies tried again—if ever.”

Six weeks before, V Corps commander General Leonard Gerow had written to Eisenhower outlining grave doubts, even though it was too late to do much to alter the overall Overlord plan. It was distressingly clear, after the 4th Division had lost an incredible 749 men—killed in a single practice exercise on April 28 on Slapton Sands—that the Royal Navy and American troops were not working well together. Apart from the appallingly chaotic practice landings—the woeful yet final dress rehearsals—the defensive obstacles sown all along the beaches in Normandy were especially concerning.

Eisenhower had chided Gerow for his skepticism. Gerow had shot back that he was not being “pessimistic” but simply “realistic.” And what of the ten missing officers from the disaster at Slapton Sands who had detailed knowledge of the D‐Day operations, the most important secret in modern history? They knew about “Hobart’s Funnies,” the assortment of tanks specially designed to cut through Rommel’s defenses—including flail tanks that cleared mines with chains, and DUKWs, the six‐wheeled amphibious trucks that would take Rangers to within yards of the steep Norman cliffs—and they knew exactly where and when the Allies were landing. Was it really credible to assume that the Germans had not been tipped off, that so many thousands of planes and ships had gone unseen? 

Even Winston Churchill, usually so ebullient and optimistic, was filled with misgivings, having cautioned Eisenhower to “take care that the waves do not become red with the blood of American and British youth.” The prime minister had recently told a senior Pentagon official, John J. McCloy, that it would have been best to have had “Turkey on our side, the Danube under threat as well as Norway cleaned up before we undertook [Overlord].” The British Field Marshal Sir Alan Brooke, chief of the Imperial General Staff, had fought in Normandy in 1940 before the British Expeditionary Force’s narrow escape at Dunkirk. Just a few hours earlier, he had written in his diary that he was “very uneasy about the whole operation. At the best it will fall so very, very far short of the expectation of the bulk of the people, namely all those who know nothing about its difficulties. At the worst it may well be the most ghastly disaster of the whole war!”

No wonder Eisenhower had complained of a constant ringing in his right ear. He was almost frantic with nervous exhaustion, but he dared not show it as he continued now to pace back and forth, lost in thought, listening to the crackle and hiss of logs burning in the fireplace. He could not betray his true feelings, his dread and anxiety.

The minute hand on the clock moved slowly, for as long as five minutes according to one account. Walter Bedell Smith recalled, “I never realized before the loneliness and isolation of a commander at a time when such a momentous decision has to be taken, with the full knowledge that failure or success rests on his judgment alone.”

Eisenhower finally stopped pacing and then looked calmly at his lieutenants.

“OK. We’ll go.”

‘What’s Up, Doc?’ Bugs Bunny Takes on the New York Philharmonic, Carrots and All

 

That wascally wabbit, Bugs Bunny, the notorious carrot-chomping, sarcastic cartoon rabbit who first leaped onto the nation’s movie screens in 1940 and has starred in 800 cartoons, four movies and 21 television specials, is back again, this time as the star of a special concert, Bugs Bunny at the Symphony II, in which the New York Philharmonic plays, live, the music of a dozen full-length cartoons from the Looney Tunes and Merrie Melodies series, most starring Bugs, while the audience watches the cartoons themselves on a large movie screen. The production is at David Geffen Hall, the Philharmonic’s home at Lincoln Center in New York, this coming weekend as part of its national tour. 

The concert, created by conductor George Daugherty and David Ka Lik Wong and co-sponsored by Warner Bros., has traveled the United States under different names for about 20 years and has been seen by 2.5 million Bugs enthusiasts. In addition to the show, patrons at Lincoln Center will get to meet a number of furry and colorful Looney Tunes characters who will be roaming the lobby before the curtain. If Wile E. Coyote is there, watch out for him!

Among the cartoons to be screened are Baton Bunny, Show Biz Bugs, Rhapsody Rabbit, Tom and Jerry at the Hollywood Bowl, The Rabbit of Seville, Rabid Rider, Coyote Falls, Robin Hood Daffy and What’s Opera, Doc?

Conductor Daugherty was a Bugs fan as a kid, but not because of the rabbit’s zany onscreen antics. No, it was because the Bugs Bunny cartoons, and most of the work from the Looney Tunes and Merrie Melodies cartoon factories, used the music of the great classical composers, such as Wagner, Rossini, Liszt and Donizetti. “I was a classical music fan as a boy and I reveled in listening to this great music used as the backdrop for these cartoons. I also appreciated the fact that millions of American kids were being introduced to classical music through Bugs Bunny,” he said.

The Bugs Bunny shows are like no other.

Fans at the Bugs concerts go wild. They cheer the good guys and jeer the bad guys. They applaud. They whoop. The juxtaposition of one of the world’s great orchestras playing the music of Richard Wagner as patrons of all ages shout and scream is both puzzling and wonderful.

“You go to a typical classic music concert and everybody is very quiet and respectful of the music. You go to a Bugs Bunny cartoon concert, though, and you lose all abandon. That’s what happens at these performances,” said conductor Daugherty with a big smile. “The same thing happened in the 1950s and it will happen forever.” 

He adds that most older people saw Bugs and Looney Tunes cartoons in a movie theater, while kids saw them on a small-screen television set. “The chance to see the cartoons in a movie-like setting, the Philharmonic concert hall, repeats that old feeling for adults and is all new for kids,” he said.

Daugherty and Wong started the production in 1990 and called it Bugs Bunny on Broadway. Since then the show, also billed as Bugs Bunny at the Symphony and Bugs Bunny at the Symphony II, has been staged by more than 100 major orchestras, including the Boston Pops, the Los Angeles Philharmonic and the Philadelphia Orchestra. It has been performed at the Hollywood Bowl and the Sydney Opera House. 

In 1990, of course, Bugs was already a huge Hollywood star. He began his career as a character in the Merrie Melodies cartoon series, making his star debut in A Wild Hare in 1940. He was an instant hit, along with dopey Elmer Fudd, wily Daffy Duck and others. His popularity soared during World War II, when millions flocked to the movies and the cartoons served as an escape from wartime pressures. Bugs Bunny was turned into a flag-waving patriotic character during the war, even appearing in a dress blue U.S. Marine uniform in one cartoon. His popularity grew after the war and he remained the number one cartoon character in America for years, chomping on carrots in movie theaters all across the country.

The really big advantage of the Philharmonic’s hall, Daugherty said, is the sound of the orchestra in the concert hall.

“Back in the 1940s and ‘50s, when these cartoons first came out, the sound equipment in places where the cartoons were made, and in movie theaters, was limited. At the Philharmonic at Lincoln Center, and other halls where we stage the concerts, the sound is beautiful. That’s why people go to these shows,” said Daugherty.

He is always amazed at the people he meets at his productions. “I meet very old and very young people and music lovers, and cartoon lovers, from every walk of life,” he said. He once met a couple who had met at a Bugs Bunny concert eight years earlier, fallen in love and married.

People are getting used to this type of movie/performance show. The Philharmonic has staged a number of them, among them Fantasia and Star Wars. It will stage movie/concerts of Close Encounters of the Third Kind and Psycho in September, Harry Potter and the Sorcerer’s Stone in December, and Singin’ in the Rain and Mary Poppins in May 2020. The idea of a movie with a live orchestra is gaining ground in America – fast.

Surprisingly, the audience for the Bugs Bunny productions is neither kids nor parents with kids, but individual adults. “I’d say 90% of our audience are adults without kids,” said Daugherty. “They are all coming back to see the cartoons they loved as children.”

And Bugs? Daugherty, the founder of the Looney Tunes concert show, thinks that the hyperactive gray and white rabbit, getting on a little over the years, would love it.

When I ended my interview with the conductor, I was tempted to assume my very best Bugs Bunny voice and ask him “What’s up, Doc?” I could not do that, though, because the New York Philharmonic is so distinguished...

Really? Wait until this weekend, when Bugs fans pour into the Geffen concert hall at Lincoln Center and roar for Bugs and the cartoon pals who starred with him in all those wonderful old Looney Tunes and Merrie Melodies cartoons. The roar will be louder than the traffic in Times Square.

 

PRODUCTION: The Lincoln Center shows are Friday at 8 p.m. and Saturday at 2 p.m. and 8 p.m.

That’s All, Folks!

Citizenship and the Census Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

 

Citizenship is becoming an ever bigger political issue. After some years of heated arguments about undocumented immigrants and whether they ought to be allowed to become citizens, a new front in the citizenship war has broken out over the census. The Trump administration wants to include the following question on the 2020 census form: “Is this person a citizen of the United States?” Possible answers include: born in the US, born abroad of US parents, naturalized citizen, and “not a US citizen”. 

 

It certainly is useful to have accurate data on the citizenship status of our population. But political calculation lurks behind this question, based on the following chain of reasoning. In the midst of a Republican campaign against immigrants and immigration, a citizenship question might frighten immigrants, both legal and not, away from responding to the census, thus lowering total population counts. The census results are used to apportion Congressional seats and Electoral College votes, counting everyone, whether citizen, legal resident or unauthorized resident. Many federal spending programs distribute funds to states based on population. Places with large numbers of immigrants tend to be Democratic-leaning big cities, so there could be long-range political power implications if the count is skewed. Counting citizens and non-citizens connects to counting votes, the most important constitutional issue of our time.

 

The biggest impact could be in Democratic California, one of Trump’s most persistent adversaries: 27% of Californians are immigrants and 34% of adults are Latino. Studies have already shown that Latinos were undercounted in the 2010 census and non-Hispanic whites were overcounted, according to the Census Bureau itself. The amount of federal funds that California could lose if a citizenship question causes even larger undercounting could reach billions of dollars.

 

Commerce Secretary Wilbur Ross, Steve Bannon (then a White House advisor), Kris Kobach (then Kansas Secretary of State), and others decided in early 2018 to add the citizenship question, last asked in the 1950 census. Ross claimed the impetus came from a concern in the Department of Justice about protecting voting rights, but journalists uncovered an email trail proving he lied. The chief data scientist of the Census Bureau, John Abowd, opposed the addition of a citizenship question, which he said “is very costly” and “harms the quality of the census count”, and would result in “substantially less accurate citizenship status data than are available” from existing government records.

 

Nevertheless, Ross decided to include the question. Democratic attorneys general for 17 states, the District of Columbia, and many cities and counties have mounted a legal challenge in federal courts across the country. Judges in three federal courts, in California, New York, and Maryland, have already ruled that there should be no citizenship question. One judge described the argument by Commerce Secretary Ross as “an effort to concoct a rationale bearing no plausible relation to the real reason.” Another judge called the Republican case a “veritable smorgasbord of classic, clear-cut” violations of the Administrative Procedure Act, a 73-year-old law which makes the simple demand that decisions by federal agencies must be grounded in reality and make logical sense.

 

The Supreme Court has agreed to take the case on an expedited basis. So the census absorbs considerable political weight and becomes itself a constitutional issue, pitting Democrats against Republicans on the stage of the Supreme Court. A lawyer for the Democratic-controlled US House of Representatives will be one of the four attorneys arguing against the citizenship question. He will repeat the political power argument on which the local Democratic authorities based their case: they have standing to sue, because they would lose House seats and federal funds due to deliberately skewed results. 

 

The pure political weight of each seat on the Supreme Court has never been made so clear as in the past three years, when one seat in 2016 became the prize in a naked display of Republican senatorial power: we can do this, so we will. Now 5 Republican-appointed justices and 4 Democratic-appointed justices will decide the case. The decision will soon have consequences, when the 2020 Census results are used to allocate state and federal representation by Republican and Democratic legislatures for the next election, and, even before that, to allocate federal dollars.

 

If you are interested in a fuller discussion of the significance of this case, go to the website of the National Constitution Center. It is rare to find a detailed, logical, clear and unbiased description of the facts on such a politically charged issue.

 

While technical legal issues determine who is a citizen, each party has been proclaiming their version of a good citizen. Republicans have been clear about their version of how a good citizen should act. Hate the free press, because they only tell lies. Physically attacking journalists is okay for a Republican citizen, and elected Republicans will defend your right to do that. The government elected by the citizens is evil, not a democratic institution, but one run by an unelected hidden “deep state”. Nothing is wrong with manipulating the tax system, because taxes are bad, the government wastes the money it collects, and the IRS is an ideological ally of the deep state, anyway. Citizens not only have the constitutional right to resist an oppressive government, but a good citizen treats our federal government as oppressive, and ought to resist it now, with the exception of everything the current President does.

 

It’s not necessary to be a violent white supremacist to be a good Republican citizen, but that’s not a disqualification. Disqualifications have to do with paperwork, with color, with where one was born, and with ideological viewpoints. Liberals are traitors to America, the worst kind of a citizen. People who believe in the right of a pregnant woman to control her own body are murderers, still citizens, but belonging in jail. Various other crimes of the mind disqualify Americans as good Republican citizens: advocating gun control, believing in climate change, and demanding that we protect the endangered environment.

 

Democrats need to tell Americans how we think about citizenship, not just the paperwork and the legalities, but the ethics and good behavior. I think a good American citizen:

1) Prizes the diversity of viewpoints that an ethnically and religiously diverse society produces;

2) Believes in the power of government to make people’s lives better;

3) Believes that government should act in the interests of all citizens, especially those who have the least resources;

4) Wants the government to protect the rights of minorities;

5) Believes that personal religion should be a free choice, but that the religious beliefs of no particular group should determine government policy.

 

If that is not a winning argument about what it means to be an American, then there will be no progress toward creating an equal and just democracy.

 
