History News Network - Front Page

American Jews Versus Israeli Politics

Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

Knesset chamber

 

 

Benjamin Netanyahu just won a record fifth term as Prime Minister of Israel. He has dominated Israeli politics for ten years. His reelection shows the widening gap between the ideas and politics of American and Israeli Jews.

 

The Israeli Attorney General announced at the end of February that Netanyahu would be indicted for bribery and fraud. Just days before the election, Netanyahu said that Israel would annex Jewish settlements on land in the West Bank taken in the Arab-Israeli War of 1967. About 400,000 Israelis live in West Bank settlements. He said, “I will impose sovereignty, but I will not distinguish between settlement blocs and isolated settlements. From my perspective, any point of settlement is Israeli, and we have responsibility, as the Israeli government. I will not uproot anyone, and I will not transfer sovereignty to the Palestinians.”

 

Netanyahu’s electoral opponents were a new coalition of centrist and conservative Israeli politicians. Thus the choice for voters was between a continued hard line against Palestinians and Netanyahu’s even harder line. His victory demonstrates the preference of Israeli voters for an ethically dubious politician who offers no path toward peace with the Palestinians, only the continued seizure of formerly Arab land.

 

In 2009, Netanyahu made the following programmatic statement about the most pressing issue in the Middle East: “I told President Obama in Washington, if we get a guarantee of demilitarization, and if the Palestinians recognize Israel as the Jewish state, we are ready to agree to a real peace agreement, a demilitarized Palestinian state side by side with the Jewish state.” Since then he has gradually been moving away from this so-called two-state solution. In 2015, he employed harsh anti-Arab rhetoric during the last days of the election campaign, for which he apologized after winning. He seemed to move away from support of the two-state idea, but said after the election that this idea was still viable.

 

The election of Donald Trump pushed Israeli politics further right. Although Trump repeatedly claimed to have a bold plan to create a peace settlement between Israelis and Palestinians, in fact, he has openly supported Netanyahu’s movement away from any possible settlement. A year ago, Trump announced that the US officially recognized Jerusalem as the capital of Israel. Trump announced last month that the US recognizes Israeli sovereignty over the Golan Heights, seized from Syria during the 1967 war. Netanyahu used giant billboards showing him shaking hands with Trump.

 

To support his election bid this time, Netanyahu offered a deal to the most radical anti-Arab Israeli parties, which had thus far failed to win enough votes to be represented in the parliament, the Knesset. He orchestrated the merger of three far-right parties into one bloc, the “Union of Right-Wing Parties”, and promised them two cabinet posts if he won. One of those parties, Jewish Power, advocates the segregation of Jews and Arabs, who make up 20% of Israelis, and economic incentives to rid Israel of its Arab citizens. Jewish Power holds annual memorials for Baruch Goldstein, who murdered 29 Muslims at prayer in 1994. Imagine an American politician allying with a party which celebrates the murderous accomplishments of Dylann Roof.

 

Netanyahu recently said, “Israel is not a state of all its citizens,” but rather “the nation-state of the Jewish people alone.” That makes a “one-state solution” impossible, because non-Jews would automatically be second-class citizens. Netanyahu’s victory shows that the creation of a Palestinian state is less and less likely, as the land for such a state is increasingly seized by Israel.

 

While most Israelis also say they support a two-state solution, their real politics makes this support meaningless. A poll of Israelis in 2017 showed Jews leaning heavily to the right and extreme right. A more recent poll showed greatly increasing support for annexation: 16% support full annexation of the West Bank with no rights for Palestinians; 11% support annexation with rights for Palestinians; 15% support annexation of only the part of the West Bank that Israel currently fully controls, about 60% of it. About 30% don’t know and 28% oppose annexation.

 

Meanwhile, the uprooting of Arabs and confiscation of their land continue as Jewish settlements expand. While the West Bank is highlighted in the news, the Israeli policy of expelling native Arabs from their homes has also been taking place for decades in the Negev desert in southern Israel. Bedouin communities, many of which predate the founding of the Israeli state, have been systematically uprooted as part of an Israeli plan of concentrating all Bedouins into a few towns, in order to use their land for Jewish settlements and planned forests. The Bedouin communities are “unrecognized”, meaning that the Israeli government considers them illegal. Illegal Jewish settlements in that region have been recognized and supported, while much older Bedouin communities have been labeled illegal and demolished or slated for demolition. Essential services, like water and electricity, have been denied to the agricultural Bedouin villages in order to force their citizens to move to the new urban townships.

 

American Jews are overwhelmingly liberal. Polls since 2010 show over two-thirds supporting Democrats for Congress, rising to 76% in 2018. This long-standing liberalism meant broad support among American Jews for the civil rights struggle during the 20th century. Now the open discrimination against Arabs by the Israeli state, which in some ways resembles the former South African apartheid system, reduces sympathy for Israel.

 

Surveys of American Jews have demonstrated consistent support for a two-state solution. Since 2008, about 80% of American Jews have supported the creation of a Palestinian state in Gaza and the West Bank. 80% also agree that a “two-state solution is an important national security interest for the United States.” Many factors have been moving American Jews away from support of Israel. The close family connections between Jews in America and Israel after World War II have diminished over the past half-century. The continued dominance of Israeli politics by ultra-Orthodox religious policies has worn out the patience of more secular American Jews in Conservative and Reform congregations.

 

In fact, the greatest support for hard-line Israeli policies has not been from American Jews, as Ilhan Omar recently implied, but from evangelical Christians who support Trump. After Netanyahu talked about annexing West Bank land, nine major mainstream American Jewish groups wrote to Trump asking him to restrain the Israeli government from annexation, saying that “it will lead to greater conflict between Israelis and Palestinians.”

 

The drifting apart of American Jews and Israelis is a tragic development, but perhaps an inevitable one. As Jews gradually assimilated into American democracy, they congregated at the liberal end of the political spectrum, feeling kinship with other minorities which experienced discrimination. American Jewish religious politics affirmed the traditional Jewish ethical ideas of justice, truth, peace, and compassion. Israeli Jews have faced a radically different environment. Although many of the early Israeli settlers and leaders came from the leftist European labor tradition, decades of conflict with Arab neighbors, in which both sides perpetrated countless atrocities, have led to hardening attitudes of self-defense and hatred for the other.

 

Jews in Israel support politicians and policies that I reject as abhorrent. That is a personal tragedy for me. The larger tragedy is that there appears to be no solution at all to the Israeli-Palestinian conflict.

I Stuck with Nixon. Here’s Why Science Says I Did It.

Richard Nixon surrenders to reality and resigns, August 9, 1974

Rick Shenkman is the former publisher of the History News Network and the author of Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books, January 2016). You can follow him on Twitter. He blogs at stoneagebrain. This article was first published by the  Daily Beast.

Will Donald Trump’s supporters ever turn on him? I think I know the answer. It’s partly because I’ve been in their place.

During Watergate I was a die-hard Nixon dead-ender. I stuck with him after the Saturday Night Massacre in the fall of 1973 and the indictments of Nixon aides H.R. Haldeman and John Ehrlichman in 1974. Not until two months before Nixon resigned did I finally decide enough’s enough.

What was wrong with me? I’ve been haunted by that question for decades. 

I can clear up one thing immediately. I didn’t support Nixon out of ignorance. I was a history major at Vassar during Watergate and eagerly followed the news. I knew exactly what he’d been accused of.

The fact is the facts alone didn’t matter because I’d already made up my mind about him. My fellow Vassar students—all liberals, of course—pressed me to recant. But the more they did, the more feverish I became in my defense. I didn’t want to admit I was wrong (who does?) so I dreamed up reasons to show I wasn’t—a classic example of cognitive dissonance in action. 

A pioneering study by social psychologist Elliot Aronson conducted in the 1950s helps explain my mental gymnastics. Young college women invited to attend a risqué discussion of sexuality were divided into two groups. One group was put through a preliminary ritual in which they had to read aloud a list of words like “prostitute,” “virgin,” and “petting.” The other group had to say out loud a dozen obscenities including the word “fuck.” Afterwards, the members of both groups were required to attend a discussion on sex, which is what had been the draw. But it turned out they had all been duped. The discussion wasn’t risqué. The subject turned out to be lower-order animal sexuality. Worse, the people leading the discussion spoke in a monotone voice so low it was hard to follow what they were saying. 

Following the exercise the students were asked to comment on what they had been through. You might expect the students who went through the embarrassing rite of speaking obscenities to complain the loudest about the ordeal. But that isn’t what happened. Rather, they were more likely to speak positively about the experience.

The theory of cognitive dissonance explains why. While all of the subjects in the experiment felt unease at being duped, those for whom the experience was truly onerous felt a more compelling need to explain away their decision to take part. The solution was to reimagine what had happened. By rewriting history they could tell themselves that what had appeared to be a bad experience was actually a good one. Dissonance begone.

This is what I did each time one of my Vassar friends pointed to facts that showed Nixon was lying. 

Neuroscience experiments in the 21st century by Drew Westen show what happens in our brain when we confront information at odds with our commitments. In one study, supporters of President George W. Bush were given information that suggested he had been guilty of hypocrisy. Instead of grappling with the contradiction they ignored it. Most disturbing of all, this happened out of conscious awareness. MRI pictures showed that when they learned of Bush’s hypocrisy, their brains automatically shut off the “spigot of unpleasant emotion.” (It’s not a uniquely Republican trait; the same thing happened with supporters of John Kerry.) 

In short, human beings want to be right and we want our team to win. But we knew all that, right? Anybody who’s taken a Psych 101 class knows about confirmation bias: that humans seek out information that substantiates what they already believe; and bounded rationality: that human reason is limited to the information sources to which we are exposed; and motivated reasoning: that humans have a hard time being objective. 

But knowing all this isn’t enough to understand why Trump voters are sticking with Trump.

What’s required instead is a comprehensive way to think about the stubbornness of public opinion and when it changes. Until a few decades ago no one had much of a clue what a comprehensive approach might look like. All people had to go on was speculation. Then scientists operating in three different realms — social psychology, neuroscience, and political science — began to delve into the workings of the human brain. What they wanted to know was how we learn. The answer, most agreed, was that the brain works on a dual-process system, a finding popularized by Daniel Kahneman, the Nobel prize-winning Princeton psychologist, in the book Thinking, Fast and Slow.

One track, which came to be known as System 1, is super-fast and happens out of conscious awareness, the thinking you do without thinking.

There are two components to System 1 thinking. One involves what popularly is thought of as our animal instincts, or what social scientists refer to, with more precision, as evolved psychological mechanisms. Example: the universal human fear of snakes. The other involves ways of thinking shaped by habit. The more you perform a certain task, the more familiar it becomes and the better you get at it without having to think about it.

Donald Trump likes to say that he goes with his gut. What he’s saying, likely without knowing it, is that he has confidence in his System 1. This is not exceptional. Most of us trust our instincts most of the time. What distinguishes Trump is that he seems to privilege instinct over reason nearly all of the time.

The second track, System 2, is slower and allows for reflection. This mode, which involves higher-order cognitive thinking, kicks in automatically when our brain’s surveillance system detects a novel situation for which we aren’t prepared by experience. At that moment we shift from unconscious reaction to conscious thinking. It is System 2 that we rely on when mulling over a difficult question involving multiple variables. Because our brain is in a sense lazy, as Kahneman notes, and System 2 thinking is hard, our default is System 1 thinking.

One thing that’s worth noting about System 1 thinking is that our brains are essentially conservative. While humans are naturally curious about the world and we are constantly growing our knowledge by, in effect, adding books to the shelves that exist in our mind’s library, only reluctantly do we decide to expand the library by adding a new shelf. And only very rarely do we think to change the system by which we organize the books on those shelves. Once we settle on the equivalent of the Dewey Decimal System in our mind, it’s very hard to switch to another system. This is one of the main reasons why people are almost always reluctant to embrace change. It’s why inertia wins out time and time again.

But change we do, thanks to System 2. But what exactly triggers System 2 when it’s our politics that are on the line? Social scientists finally came up with a convincing explanation when they began studying the effect of emotion on political decision-making in the 1980s.

One of the pioneers in this research is George Marcus. When Marcus was starting out as a political scientist at Williams College he began to argue that the profession should focus more on emotion, something it had never done, mainly because emotion is hard to quantify and count and political scientists like to count things. When Marcus began writing papers about emotion, he couldn’t find editors who would publish them.

But it turned out his timing was perfect. Just as he was beginning to focus on emotion so were neuroscientists like Antonio Damasio. What the neuroscientists were learning was that the ancient belief that emotion is the enemy of reason is all wrong. Rather, emotion is the handmaiden of reason. What Damasio discovered was that patients with a damaged amygdala, the seat of many emotions, could not make decisions. He concluded: The “absence of emotion appears to be at least as pernicious for rationality as excessive emotion.” 

If emotion is critical to reason, the obvious question became: which emotion triggers fresh thinking? Eventually Marcus and a handful of other political scientists who shared his assumption that emotion is important to decision making became convinced that the one that triggers reappraisals is anxiety. Why anxiety? Because it turned out that when people realize that the picture of the world in their brain doesn’t match the world as it actually exists, their amygdala registers a strong reaction. This is felt in the body as anxiety.

Eventually, Marcus and his colleagues came up with a theory that helps us understand when people change their minds. It became known as the Theory of Affective Intelligence (later: the Theory of Affective Agency). The theory is straightforward: The more anxiety we feel the more likely we are to reconsider our beliefs. We actually change our beliefs when, as Marcus phrases it, the burden of hanging onto an opinion becomes greater than the cost of changing it. Experiments show that when people grow anxious they suddenly become open to new information. They follow hyperlinks promising fresh takes and they think about the new facts they encounter.

How does this help us understand Trump supporters? It doesn’t, if you accept the endless assertions that Trump voters are gripped by fear and economic anxiety. In that case, they should be particularly open to change. And yet they’re as stuck on Trump as I was on Nixon.

The problem isn’t with the theory. It’s with the fear and anxiety diagnosis. 

Humans can hold multiple feelings at odds with one another simultaneously, but research shows that only one emotion is likely to affect their politics. The dominant emotion characterizing so-called populist voters like those attracted to Trump is anger, not fear. This has been found in studies of populists in France, Spain, Germany, and Britain, as well as the United States.

If the researchers are right that populists are mostly angry, not anxious, their remarkable stubbornness immediately becomes explicable. One of the findings of social scientists who study anger is that it makes people close-minded. After reading an article that expresses a view contrary to their own, people decline to follow links to find out more information. The angrier you become, the less likely you are to welcome alternative points of view. 

That’s a powerful motive for ignoring Trump’s thousands of naked lies.

Why did I finally abandon Nixon? For months and months I had been angry over Watergate. Not angry at Nixon, as you might imagine, but angry at the liberals for beating up on him. Nixon fed this anger with repeated attacks on the people he perceived as his enemies. As long as I shared his anger I wasn’t prepared to reconsider my commitment to his cause. 

But eventually there came a point when I stopped being angry and became anxious. 

I would guess that what happened is that over time Nixon’s attacks came to seem shopworn and thin. Defending him became more of a burden than the cost of abandoning him.

If I am right about the circuitous path I took from Nixon supporter to Nixon-basher, there’s hope that Trump supporters will have their own Road to Damascus epiphany. Like me, they may finally tire of anger, though who knows. Right-wing talk radio and Fox News have been peddling anger for years and the audience still loves it.

It took me 711 days from the Watergate burglary to come to my senses and break with Nixon, when I resigned from a committee defending him. As this is published, it has been 812 days since Trump became president. And there’s little indication that Trump voters have reached an inflection point.

Any of a number of disclosures could disillusion a substantial number of them. We have yet to read the full Mueller report. Nor have we yet seen Trump’s tax returns, which might prove politically fatal if they show he isn’t really a billionaire or if they prove his companies depended on Russian money. (As Mitt Romney suggested, the returns likely contain a bombshell.) 

If Trump’s disclosures suggest to his supporters that they were chumps to believe in him, his popularity no doubt would begin eroding. And already there’s evidence his support has weakened. In January 51 percent of GOP or GOP-leaning voters said they considered themselves more a supporter of Donald Trump than of the Republican Party. Two months later the number had declined to 43 percent. If this slippage is because more supporters are embarrassed to come out as full-blown Trumpies, he may be in trouble come election day.

In the end, politics is always about the voters. Until now, Trump has made his voters by and large feel good about themselves by validating their anger. But there remains the possibility that in the coming months disclosures may make them feel that they have been conned, severely testing their loyalty. If the anger they feel either wears off or is redirected at Trump himself, their amygdala should send them a signal indicating discomfort with the mismatch between the known facts and their own commitments.

This presupposes that they can get outside the Fox News and conservative talk bubble so many have been living inside. Who knows if they will. It is worth remembering that in Nixon’s day, millions remained wedded to his lost cause even after the release of the smoking-gun tape. On the day he resigned, August 9, 1974, 50 percent of Republicans still supported him even as his general approval dropped to 24 percent.

To sum up: Facts finally count if enough loyalists can get past their anger to see the facts for what they are. But people have to be exposed to the facts for this to occur. And we can’t be sure that this time they will be.

 

 

The Sorrow of Watching Notre Dame Burn

 

On the 20th day of Brumaire during the Second Year of the revolutionary order, a spectacle was held inside the newly consecrated Temple of Reason. Upon the altar of what had once been the magnificent cathedral Notre-Dame de Paris at the very heart of the greatest city in Christendom, the religious statues were stripped away (some decapitated like the heads of the overthrown order) and the whole building was turned over to a festival for what the most radical of Jacobins called the “Cult of Reason.” In the hopes that this atheistic faith-of-no-faith would become the state-sponsored religion of the new regime, the revolutionaries staged their own observance, with young girls in tri-colored sashes performing a type of Morris dance about a statue of the Goddess Reason. Such was just another episode in the struggle among the competing factions of the Revolution, which saw the dechristianization of France, including not just the iconoclasm of smashed stained glass and white-washed images, but the execution of perhaps 30,000 priests. Less than a decade later, Mass would once again be celebrated upon Notre-Dame’s altar.

Within the shadow of its spire – which as of today no longer stands – the great Renaissance essayist Montaigne would have walked. By the massive rose window which filtered natural light into the ring of cobalt blue and emerald green, solar yellow and fire red, Rene Descartes may have contemplated his Cogito. By its gothic flying buttresses and underneath its simultaneously playful and disquieting gargoyles, the novelist Victor Hugo celebrated her stone walls and arches while advocating for her 19th-century restoration. In 1323 the scholastic theologian John of Jandun would write of the cathedral that she “deservedly shines out, like the sun among stars.” And through it all, over a millennium of Parisian history, the cathedral stood guard from its island in the Seine. Which is not to say that the cathedral hadn’t been damaged before, and that it wouldn’t be damaged again. Notre-Dame withstood the Wars of Religion which burnt across France during the sixteenth century and Hitler’s orders to leave not a stone of Paris standing when the Nazis retreated at the end of the Second World War, and yet the cathedral endured. Since the twelfth century Notre-Dame has survived, and while we watch with broken hearts as her spire collapses into the burning vaulted roof during this mournful Holy Week, we must remember that Notre-Dame will still be standing tomorrow.

Sorrow for the destruction of something so beautiful, so perfect, must not obscure from us what a cathedral is. A cathedral is more than the granite which composes her edifice, more than the marble which lines the nave. More than the Stations of the Cross and the statues; more than the Crucifix which punctuates the altar. A cathedral is all of that, but it is also an idea; an idea of that which is more perfect than this fallen world of ours. More mysterious, and more powerful, and more beautiful. When we see push notifications alerting us to the fire of this April 15th, when we see that tower which points to the very concept of God collapsing above her nave, it can feel as if civilization itself is burning. As if watching the Library of Alexandria be immolated on Facebook Live, or reading the live tweeting of the dissolution of the monasteries. In this age of uncertainty, of rage, of horror, and of violence; of the decline of democracy and the heating of the planet; it can feel as if, in Notre-Dame’s fire, we are watching the very world itself be engulfed. Which is why it’s so important to remember what a cathedral is, what Notre-Dame is.

Skeptics can reduce that which is associated with the phrase “High Church” to an issue of mere aesthetics, as if in our post-Reformation, post-secular world the repose of a cathedral is simply a mood or a temper and not a profound comment in its own right. An allegiance to the sacredness of silence, of the holiness of light refracted onto a cold stone floor. Minimalism makes its own offers and promises, and requires its own supplication, and the power of simplicity and thrift should not be dismissed. But a cathedral makes its own demands – a cathedral is beautiful. The intricacy of a medieval cathedral is not simply an occasion for art historians to chart the manner in which the romanesque evolved into the gothic, or for engineers to explicate the ingenuity of the flying buttress. Notre-Dame isn’t simply a symbol of Paris, nor a landmark by which a tourist can situate themselves. A cathedral is larger than the crowds which line up to take selfies in front of it; a cathedral is more significant than the gift shops and food trucks which line the winding cobble-stoned streets that lead up to it. A cathedral is an argument about God, but also about humanity and the beauty which we’re sometimes capable of.

Tomorrow the world will be less beautiful than it was this morning, and this in a world which has precious little beauty to spare. That Notre-Dame should be burning this April evening is a calamity, a horror. It is the loss of something that is the common treasury of humanity, which belongs not entirely to the people of France, nor only to those who are Roman Catholics, but which rather sings of those yearnings of all women and men, living in a world not of our own creation but trying to console each other with a bit of beauty, a bit of the sacred. To find that meaning in the cathedral’s silence, in that movement of light and shadow upon the weathered wooden pews and the softness of the grey walls. The 17th century English poet George Herbert wrote of “A broken ALTAR… Made of a heart and cemented with tears,” as indeed may describe the crowds gathering along the Seine and singing hymns to our burning cathedral this spring night. Herbert’s poem is an apt explanation of what a cathedral is. A cathedral is a person. Her spine is the nave, and the transept her arms; the window her face, and the spire her head – the altar a heart. And though a cathedral is as physical as our finite bodies, threatened by incendiary and crowds, by entropy and fire, its soul is just as eternal.

If there is something to remember, it’s that in the era before steel and reinforced concrete an anonymous mason would begin work with his brothers on a cathedral that his children would most likely never see completed. Perhaps his grandchildren would never live to see its full height either. To work on a cathedral was a leap into a faith that we can scarcely imagine in our era, to work towards a future you’d never see, and yet to embrace that which is greater, more sublime, more perfect than you are. Our attitude of disposable consumerism and exploitive capitalism makes such an ideology a foreign country to us, yet if we’re to solve any of those problems that face us today – from climate change to the restoration of democracy – it must be with the faithful heart of a medieval mason who toils with the knowledge that a spire will rise above Paris – again.

Benny and Joon and a Good Look at Schizophrenia

 

How do you stage a charming musical about schizophrenia? Was there ever a dimmer, sadder, more troubling topic for a play?

Just ask the folks who run the Paper Mill Playhouse, in Millburn, New Jersey, where the new musical Benny and Joon opened on Sunday. It is a delightful look at modern schizophrenia with three stars who are not only entertaining, but work hard to examine schizophrenia and talk about it on stage with candor, and with smiles, too.

Benny and Joon is the musical version of the 1993 movie of the same name that starred Johnny Depp. Schizophrenia was such a controversial topic in that year that the word schizophrenia was never mentioned in the script. Now, thankfully, it is.

Benny and Joon are brother and sister (he’s in his early twenties and she’s 20 or so). Joon suffers from schizophrenia and was a handful for her parents. They were killed in a car crash when Benny was 18. Now, with them gone, Benny, who runs a car repair shop, has to raise her. All of a sudden, after a bad night playing poker, Benny has to provide room and board for a kooky young man, Sam, who comes to live with them. Sam, who wears an odd-looking hat, sees himself as the re-incarnation of Charlie Chaplin and Buster Keaton and mimics them. In one nice bit he uses dinner rolls as people and has them dance.

The story starts with Sam’s arrival. Joon is relentless in her schizophrenic behavior and although Benny loves her to death, she drives him crazy. He faces the very real possibility of putting her into a group home with other mentally ill people. Joon, of course, wants to keep living with him and continue her amateurish career as a painter. He does not know what to do and consults Joon’s psychiatrist.

The story of the play is Benny’s fear of putting Joon into a home or, later in the play, a mental institution. Throughout the story, Joon exhibits numerous schizophrenic tendencies. She is moody, very happy and then very sad, convinced people are trying to hurt her, fearful of what will happen to her. She’s impulsive. She doesn’t listen to people. She’s argumentative, possessive. There is no typical schizophrenic, but Joon exhibits the qualities of many people seen as such.

Yet, through all of this, you love her.

Kirsten Guenther wrote the book, and the music and lyrics are by Nolan Gasser and Mindi Dickstein. They use their words and songs to suggest that while Joon might need help, she may not need all of the help that people suggest. They also get you to root for Joon. Isn’t she like so many quirky people we all know? Don’t put her away, people will say, just put up with her.

Sam, as he bops around the stage in a very goofy way, starts to admire, and then love, Joon. It’s an improbable relationship, to be sure, but so what? Where will they live, Benny asks his sister? The answer, as she frets, is well, who knows. We’ll get by.

Big brother Benny is scared to death. He is so, so worried about his sister and needs to protect her. What is he going to do?

The success of the play is the work of the three stars, Claybourne Elder as Benny, Hannah Elless as Joon and Bryce Pinkham as the slightly nutty but thoroughly adorable Sam. They play their characters as lovable people trying to ward off schizophrenia.

The story is not about schizophrenia, but how it affects the families of its victims. It is the story, too, about how all mental illnesses affect families. We need more of these stories. There are, and have been throughout history, tens of thousands of moms and dads, brothers and sisters who have to live with and cope with mentally ill people. It is a struggle and Benny and Joon shows that in a majestic way. You need to love and support the victims of mental illness, not just toss them into a group home.

The music in Benny and Joon is OK, but none of the songs are memorable. Together, though, they create a nice atmosphere for the story. Some of the songs are painful, as they help to tell the story of the brother and sister and their wacky friend Sam.

The show’s director, Jack Cummings III, gets fine work from his stars, Elder, Elless and Pinkham, but also gets fine performances from the other actors in the play - Colin Hanlon, Paolo Montalban, Conor Ryan, Natalie Toro, Jacob Keith Watson, and Tatiana Wechsler.

Schizophrenia is a relatively recently identified illness, not named by doctors until 1908. Despite the popular image of schizophrenics as people with split personalities, medical specialists today describe them as, in general, wildly eccentric: they believe other people are trying to get them to do things, feel slightly paranoid and see themselves embattled against just about everybody.

Benny and Joon, in the end, is both a sobering look at schizophrenics and a wonderful look at a pair of siblings who fight and feud, with the troubles of schizophrenia added, but, through it all, love each other.

We need more plays like this one. And more Bennys and Joons in this world, too.

 

PRODUCTION: The play is produced by the Paper Mill Playhouse. Scenic and Costume Design: Dane Laffrey, Sound: Kai Harada, Lighting: R. Lee Kennedy, Choreography: Scott Rink. The play is directed by Jack Cummings III. It runs through May 5.

President Donald Trump, HIV/AIDS, and Black Lesbian and Gay Activism

ACT UP Protestors in New York

 

 

In his State of the Union address on February 5th, 2019, President Donald Trump surprisingly included a plan to eliminate HIV/AIDS in his budget: “My budget will ask Democrats and Republicans to make the needed commitment to eliminate the HIV epidemic in the United States within 10 years. Together, we will defeat AIDS in America.”  The inclusion of HIV/AIDS in his address came as a surprise to many because one of President Trump’s first actions upon arriving at the White House was firing all 16 members of the Presidential Advisory Council on HIV/AIDS.

Though President Trump reinstated this council 15 months later, his initial actions were indicative of his longer record on HIV/AIDS. The AIDS Coalition to Unleash Power (ACT UP)--New York held many direct-action protests, including one at Trump Tower in October 1989. Roughly 100 protestors gathered to protest the 6.2 million dollars in tax abatements Trump received to build the mixed-use, high-rise property at a time when those stricken with AIDS were increasingly vulnerable to homelessness. Protestors saw Trump Tower as a symbol of corporate greed and argued that state monies could have been used to build more housing facilities for those impacted by AIDS.

Creative writer, activist, and scholar Sarah Schulman has written that the rise in sudden deaths of gay men during the early era of AIDS hastened gentrification in New York City—their absences from rent-controlled apartments and their partners’ lack of access to inheritance claims accelerated the conversion of these apartments to market-rate rents. The early AIDS crisis facilitated changes in the constitution and character of New York City neighborhoods, linking it to larger trends in gentrification that have shifted the racial demographics of inner cities from ethnically and class diverse to more homogenous, middle-class, and increasingly white enclaves. 

Trump’s plan to end AIDS within this decade also came as a surprise given his abandonment of his mentor Roy Cohn after rumors spread publicly that Cohn was dying of AIDS. It was Cohn’s ruthless business tactics and genius maneuverings around legal loopholes that helped Trump secure the tax abatements to build Trump Tower. Cohn had cut his teeth in politics as Senator Joseph McCarthy’s chief counsel during the Army-McCarthy Hearings in 1954. Cohn became a power broker in local New York City and federal politics, and in 1971 represented Trump when he was accused of violating the Fair Housing Act in 39 of his properties. Trump’s organization was accused of quoting different rental terms and conditions and asserting false claims of “no vacancy” to African Americans looking to rent apartments in his Brooklyn, Queens, and Staten Island properties. Under Cohn’s direction the Trumps countersued the government for $100 million for defamation, and were able to settle the lawsuit against the Trump corporation by agreeing to stipulations that would prevent further discrimination, thereby not having to admit guilt.

 

 

Trump’s record on AIDS and racial and sexual discrimination makes his 10-year plan even more surprising since the face of the U.S. AIDS epidemic is primarily black and Latina/o, especially gay, bisexual, and transgender blacks and Latina/os. In January 2019, the Black AIDS Institute (BAI), a Los Angeles-based, national HIV/AIDS think tank focused on black people, expressed their dismay when the Trump Administration proposed a change in “protected class status” under Medicare, which has allowed people living with HIV to access better medical care. In their response to his State of the Union address, BAI questioned President Trump’s intentions, since he has repeatedly sought to cut the President’s Emergency Plan for AIDS Relief, better known as PEPFAR, a multi-billion-dollar initiative which has been credited with saving 17 million lives around the world. Moreover, they indicted the President for his racist and homophobic rhetoric, which has fueled an increase in violence against black and LGBTQ communities. One of the suggestions BAI made to move Trump’s plan from words to action was to center leadership from communities most impacted by HIV.

Some of the earliest leadership from communities impacted by HIV/AIDS emerged from black lesbian and gay artists and activists during the early era of AIDS. Beginning in the late 1970s, black lesbian and gay arts and activist movements—which political scientist Cathy Cohen has identified as the first stage of AIDS prevention efforts in black communities—placed collectivity, self-determination, creativity, and radical love at the center of their political practice. They saw the elimination of racism, homophobia, and economic inequality as essential to the elimination of AIDS in black communities. In 1986, the Philadelphia-based black gay journalist, creative writer, and activist Joseph Beam published the editorial “Caring for Each Other” in Black/Out magazine, the official publication of the National Coalition of Black Lesbians and Gays. The essay is a meditation on placing community responsibility ahead of reliance on the state. Beam believed that the state had never been concerned about the lives of black people. State apathy, he argued, extended to black gay men and IV drug users dying of AIDS; Beam wrote that “it would be a fatal mistake if we were to relinquish our responsibility for AIDS in the black community to such an external mechanism.”

Indeed, Trump’s proposal to end AIDS by targeting geographic and demographic “hot spots” in seven states, 48 counties, Washington, D.C., and San Juan, Puerto Rico, comes as part of a budget plan that would eliminate funding for global AIDS programs, slash expenditures on the Centers for Disease Control and Prevention, while transferring the management of Medicaid through block grants to states, comprising an overall cut to spending on health and human services. This plan proposes to end health inequalities at the local level while threatening to reproduce broader social inequalities at the state, national, and global levels. 

Though Trump’s plan of action challenges Beam’s narrative of state apathy by continuing the contradictory record of state action that began with President Ronald Reagan when AIDS first appeared, Beam’s caution suggests that our efforts to end HIV/AIDS in poor communities and communities of color across the globe must not depend solely on federal or state bureaucracies. Instead, this history suggests that plans to eliminate HIV/AIDS must be centered on community care and responsibility, and political action aimed at transforming the conditions of structural inequality that President Trump has perpetuated throughout his career.

Health Care for All – A Cautionary Tale from the 1970s

 

With the 2020 presidential election around the corner, both parties appear headed, once again, for a train wreck on health care. While scores of Democrats in Congress and on the presidential campaign trail advocate a single-payer health care system for all Americans immediately, other Democrats embrace the idea of universal coverage as the ultimate goal, but believe it should be achieved incrementally. To some this seems like a repeat of the late 1970s when Democrats allowed the perfect to become the enemy of the good, and nothing was done on health care---for another 30 years. Meanwhile the unrelenting opposition of Republicans to the Affordable Care Act suggests that the GOP has no serious interest in offering an affordable health care plan. The voters punished them for it last year. “Those who cannot remember the past are condemned to repeat it,” George Santayana famously said, offering an immutable truth that should be embedded in the mind of every member of Congress.

 

Health care coverage in the United States has had a compelling but sometimes fraught history that is essential to understand before it is reconsidered. Theodore Roosevelt first proposed national health care in his 1912 platform but he lost that election. Subsequent Democratic presidents including Franklin Roosevelt, Harry Truman and John Kennedy supported the idea but it was Lyndon Johnson who achieved Medicare for seniors with the Medicare Act of 1965.  At last every American 65 and over became eligible for federal health insurance regardless of income or medical history; it also included coverage for low-income Americans in the form of Medicaid. It was a landmark achievement, made possible by a unique moment in history and the tenacity of Democratic presidents in keeping the Republican Roosevelt’s 1912 idea alive. 

 

The next Democratic president, Jimmy Carter, was in step with his predecessors as he wanted to extend health care to all Americans, but the economic conditions of that time were very different from 1965. While both houses of Congress were Democratic in 1977-78, inflation was out of control and the economy as a whole was weak, straining the resources of the federal budget. Carter had been a progressive governor of Georgia but a fiscal realist; he believed the country couldn’t afford such an enormous cost at that time without serious economic consequences.

 

While Carter embraced universal coverage as the ultimate goal, he believed it should be achieved incrementally, not only for affordability but also for feasibility. An incremental approach, Carter contended, would aid the federal government’s ability to digest and administer such a huge and complex new system. Additionally, proposing a stepped approach would make it more likely to attract bipartisan support, which he believed was important for its long-term sustainability. 

 

Not everyone agreed. Eight years after Johnson’s Great Society was enacted, there were still pent-up demands among congressional Democrats for new federal spending.  Senator Edward M. Kennedy (D-MA) was the most vocal spokesman, and he was also, many suspected, planning to challenge Carter for the Democratic presidential nomination in 1980, using national health care as a defining issue.   

 

In 1977 Carter’s White House reached out to Kennedy to find a middle ground. It became clear early on that there was a significant difference between the two camps. Over many months, the two parties tried to compromise, but the talks eventually faltered over the specific phasing-in of Carter’s proposal. The unbridgeable gaps were fully revealed at the final meeting between Carter, Kennedy, and their staffs in the Oval Office on July 28, 1978. When they first appeared, Carter, according to one participant, told Kennedy, “It will doom health care if we split . . . I have no other place to turn if I can’t turn to you . . . I must emphasize fiscal responsibility if we are to have a chance.” Kennedy left the White House and soon announced he couldn’t support whatever the Administration offered on health care and he would write his own comprehensive bill, which he unveiled on May 19, 1979.

 

A month later, Carter delivered a message to Congress calling for catastrophic coverage for all Americans so that families who incurred severe and costly injuries or illnesses would not be financially destroyed. He also called for “comprehensive” coverage of 16 million low-income Americans (Medicaid). It was a thoughtful, generous and responsible proposal, and it won significant early support on Capitol Hill, not least because many Democrats saw it as an essential step toward universal coverage.

 

The previous fall, in 1978, Kennedy had addressed the Democrats’ mid-term convention in Kansas City and thrown down the gauntlet to Carter: “There are some who say we cannot afford national health insurance . . . But the truth is, we cannot afford not to have national health insurance.” Tensions between the two men, already high, came to a boil when Kennedy formally announced his candidacy for president on Nov. 7, 1979. With no major issues dividing the candidates -- save for the timing but not the goal of universal coverage -- Kennedy’s campaign got off to a faltering start. It was apparent he needed strong support from the more liberal trade unions, and some unions did sign on with Kennedy, including the United Auto Workers, which had been a long-time supporter of national health care. The UAW’s leadership pledged it would use its clout to see the plan enacted. Even after Carter captured sufficient delegates to win the nomination following a brutal series of primaries, the UAW would not back down from its all-or-nothing position. Neither would Kennedy.

 

The hard-fought contest took its toll on both candidates and, tragically, on the issue of health care. In short, the dynamics of the 1980 primary campaign inevitably precluded the kind of legislative process that might have enabled universal catastrophic coverage to become law. An important opportunity was lost; the American people would have to wait another 30 years for major health care reform.

 

It finally arrived in 2009 when President Barack Obama unveiled the Affordable Care Act as his highest legislative priority. The ACA or, as it became known, Obamacare, bore a striking resemblance to Carter’s proposal three decades before. New to the presidency, Obama was sometimes hesitant in his leadership and failed to articulate a strong and consistent public case for his proposal, an omission that made passage more difficult. At a joint session of Congress in September 2009, the president read an endorsement from Senator Kennedy, written before his death the previous month. Obama rallied the congressional Democrats and, with the indispensable help of Speaker Nancy Pelosi, the ACA finally became law in 2010. It was an historic achievement, representing the most significant regulatory overhaul and expansion of coverage since 1965.

 

With few Republicans supporting Obamacare, GOP leaders made its repeal their rallying cry for nearly a decade. Yet, they failed even when Republicans controlled both houses of Congress and the White House.  With Democrats now in control of the House of Representatives, the ACA finally appears secure--except that President Trump’s Justice Department is trying to overturn the ACA altogether.  

 

Republican control of the Senate and White House makes it a prohibitive time to attempt any major expansion of health care.  There is nonetheless an opportunity for Democrats -- and hopefully Republicans -- to prepare for the future by working together during the next two years to fix and strengthen the ACA so that it actually delivers the care it is meant to deliver.  They should also come together to significantly reduce the cost of medications, for which there is an undeniable bipartisan public mandate. Who knows where this could lead?  If led by serious people on both sides, it could yield yet more success stories like criminal justice reform and conservation of public lands.  Whatever it is, it’s better than polarized stalemate.

 

Thus, if the ultimate goal is to expand affordable health care to every American, history offers important lessons. It tells Democrats that in the next two years they must be politically savvy, and in some instances, uncharacteristically restrained, if they want to be poised to offer a viable form of expanded health care in 2021. They must be honest that 2021 is the first time a plan realistically can be considered. Before then, they must avoid the public perception of “over-reach,” a deadly political sin for politicians who appear to offer grand proposals that are hugely expensive, complex and unwieldy. “Medicare for All” comes to mind as something many people already see as over-reach. Voters have finely attuned antennae, and most can tell when they’re being played by a slogan.

 

On the other hand, Americans will respond favorably to reasoned proposals even for aspirational goals, as they did in 2018. They will do so again if a plan is couched in language they can understand, such as supporting a proposal for 2020 that offers “affordable health care for every American regardless of income or existing conditions.” At the same time, liberal Democrats should resist the siren song of ideological purity and embrace instead a pragmatism that will assure ultimate success. The run-up to 2020 will be better than the 1970s unless Democrats take their eye off the ultimate goal and again allow a deep division within the party to preclude the outcome most Americans seek.

 

As for Republicans, history tells them that if they want to help shape America’s health care of the future, they should 1) accept the legitimacy, if not every detail, of the ACA, which is, after all, a direct philosophical descendant of the thinking of the conservative Heritage Foundation, as well as the first cousin of Republican governor Mitt Romney’s plan for Massachusetts, and 2) abandon their blind opposition to any expansion of health care. They should engage in a constructive and serious conversation with Democrats so that by 2021 we will have something approaching a national consensus on how to care for our health.

Ilhan Omar is a Reconstruction Reformer

 

I stand with Ilhan Omar. As a historian of Reconstruction, I must. 

 

Omar embodies the best of Reconstruction-era reformers. She articulates a robust and inclusive vision of civil rights. She is a vocal advocate for the dispossessed and an outspoken opponent of racism and bigotry. She opposes Donald Trump’s nativist and Islamophobic “Muslim ban” and supports paid family leave and raising the minimum wage. In fact, she even co-sponsored the “Never Forget the Heroes Bill” that would permanently authorize the September 11th Victims Compensation Fund.

 

I did not run for Congress to be silent. I did not run for Congress to sit on the sidelines. I ran because I believed it was time to restore moral clarity and courage to Congress. To fight and to defend our democracy.

— Ilhan Omar (@IlhanMN) April 13, 2019

 

That last part might come as a surprise to those who know Omar primarily from the wave of race-baiting unleashed by conservative politicians, press, and agitators. Indeed, the president himself has repeatedly Tweeted lies about Omar paired with images of the 9/11 attacks obviously designed to make Omar out to be a terrorist. 

 

WE WILL NEVER FORGET! pic.twitter.com/VxrGFRFeJM

— Donald J. Trump (@realDonaldTrump) April 12, 2019

 

But we should recall that this has been a Republican strategy for quite some time now. The Republican Party of West Virginia implied that Omar was a terrorist last month, suggesting that Americans, by electing a Muslim, had “forgotten” the 9/11 attack. Again, this wasn’t some far-Right website. It was the WV state Republican Party. 

 

Nor is Omar the first woman of color to be targeted by Trump. Last year, Trump launched similar attacks against California Congresswoman Maxine Waters. These and other racist and Islamophobic attacks on Omar and Waters have inspired death threats against both women.  

 

As a scholar of Reconstruction, this recent surge in racist propaganda has me worried. It is precisely the tactic that conservatives used to subvert Reconstruction-era reforms. They publicly targeted politicians in their newspapers and incited violence as a tool to regain political power after having been defeated during the Rebellion.

 

I recently wrote about an eerily similar campaign of terror against Victor Eugène Macarty, an Afro-Creole politician, for the Journal of African American History. Like Omar, Macarty was an outspoken advocate for equality. He had attended the voting rights convention on July 30, 1866 at the Mechanics Institute in New Orleans when it was attacked by police. He escaped death by hiding under the porch while New Orleans police officers, at the head of an angry mob of whites drummed up by the local press, attacked members of the convention and mangled their corpses.

 

I became interested in Macarty while researching his time as a member of the Orleans Parish School Board as part of a project examining the impact of racial science on state institutions after slavery. But the more I read about Macarty—who was singled out by the white-supremacist New Orleans Bulletin as “extremely offensive to the white people of this city”—the more I became intrigued by his story. During an era when the white press was reluctant even to print the names of African Americans, the Anglo papers in New Orleans routinely targeted Macarty, almost begging readers to attack him. They did.

 

After he confronted a white woman fired from her teaching position for supporting the White League—a white supremacist terrorist organization—the Bulletin repeatedly called for Macarty’s head. When the woman’s brothers attacked and left him for dead on September 16, 1875, the paper cheered the outcome and warned that the other Black school board members should “rememb[er] the fate of Macarty.” His attackers pleaded guilty and were “sentenced to each pay a fine of Ten Cents or one minute in the Parish Prison.” The court system in New Orleans functioned as an institution of racial control, letting Macarty’s attackers off the hook while signaling to African Americans that they would find no justice before the law. The continued media campaign and threats against Macarty played an outsized role in his political life and eventually led him to leave the city.

 

Macarty was not alone as a victim of media-initiated racist attacks. The white press regularly named targets for white vigilantism. White elites pioneered this form of racist terrorism after emancipation as a means of controlling African Americans and subverting working-class politics.

 

The consequences of the media campaign against Macarty should give us pause as the president and large portions of our national media engage in blatant race-baiting against Ilhan Omar and Maxine Waters. Indeed, it is hardly a coincidence that following this highly public, racist coverage, both Omar and Waters received death threats. As an activist and citizen, it is terrifying to see the resurgence of this Reconstruction-era tactic of racial oppression today.

 

What frustrates me as a scholar is that we’ve created a historiographic landscape in which African American contributions to American history are overlooked. We too often take a teleological approach to Reconstruction and spend too little time allowing ourselves to be surprised by the profound commitment to equality made by many of the era’s reformers. This act of intentional mis-remembering strengthens the foundation of white supremacy in our country. As we’re seeing right now, that’s incredibly dangerous. 

 

Macarty was a revolutionary figure about whom little was known until my recent article, despite his having brought the first lawsuit against segregated seating in federal court in 1869. In fact, the same few lines had been written and rewritten about Macarty since James Trotter’s 1880 Music and Some Highly Musical People, published the year before Macarty’s death. 

 

We need to better remember the stories of African American reformers and visionaries to counterbalance a field that remains plagued by Lost Cause categories, periodization, and imagery. We need to know more about those who led prior movements for equality. We need to celebrate their martyrs and understand the cause we inherit from them. And perhaps most crucially at this moment, we must become intensely aware of the tactics that their white supremacist opponents used to subvert equality.

 

Biography helps us accomplish these ends and we should pursue it vigorously and unapologetically. My friends and family are consistently surprised when they learn about my research into Macarty and his contemporaries. That should not be the case, at least not if we hope to live in a society that values justice and equality.

 

Biography is a key pillar of historical instruction from grade school through high school. It helps students recognize themselves in historical figures large and small. Well-executed biographies allow them to better understand the debates of the past and relate them to those of the present. They also enable students to approach the past with humility and to see that our forebears grappled with many of the same issues we face today. This is one of the central “lessons of history” and among the most important that we can offer. 

 

Further, biographical approaches to historical actors not only show African American resistance to white supremacy, but also avoid flattening African Americans into vehicles of resistance. Indeed, the view that African American liberty implies a rejection of (white) authority is a core belief of white supremacists. By telling the stories of African American men and women as whole persons, we can combat this racist lie.

 

In researching Macarty, I realized the need for more African American biographies in Louisiana and, I suspect, throughout the 19th-century U.S. At least in south Louisiana, I came across many prominent African Americans about whom little or nothing is known. Take T.M.J. Clark, who, after having been enslaved, taught himself to read and became the president of the State Insane Asylum. Or John Gair, who helped write the Louisiana Constitution of 1868 and survived numerous threats and an assassination attempt before being gunned down while in police custody in 1875. Our histories have either completely ignored these radicals or, in cases where they’ve been mentioned in passing, gotten them almost entirely wrong.

 

Moreover, like Macarty, Gair and Clark were subjected to race-baiting coverage in the media that effectively ended their careers. The white press slandered and vilified both men and each of them suffered brutal attacks by white supremacist vigilantes. Like Macarty, Gair and Clark demanded equality. It was the cause for which Gair was martyred and Clark forced to flee for his life, a permanent exile from his hometown.

 

This wave of media-inspired white supremacist violence effectively ended Reconstruction. No one was ever held accountable for the massacre of voting rights activists in New Orleans in 1866. Macarty’s attackers, after nearly beating him to death, faced no consequences. And though Gair was assassinated while in police custody in 1875, none of his attackers were ever charged. It was this failure to hold the race-baiting press, politicians, and vigilantes responsible that undermined any semblance of equality for more than 100 years. 

 

Politicians like Macarty, Gair, and Clark took incredible risks and made enormous sacrifices to fight for equality 150 years ago. Their contemporaries failed to hold their attackers responsible. We cannot make that same mistake.

 

 

Oh, What a Beautiful Piece of American History

 

Oklahoma!, one of the great musicals of show business history, often loses its own history amid all of those gorgeous Richard Rodgers and Oscar Hammerstein songs. The play is a straightforward, and yet very complex, story of ranch hands and their women on farms in the bustling Oklahoma territory in 1906, just before Oklahoma became the 46th state. The simplicity and beauty of that life is the basis for the marvelous, new and different version of the play that opened last week in New York at the Circle in the Square Theater at 1633 Broadway.

The play starts with ranch hand Curly, played superbly by the multi-talented Damon Daunno, a cowboy in the Oklahoma territory who is desperately infatuated with farm girl Laurey. He stands up and, with a gorgeous voice, sings one of the signature songs in the musical, Oh, What A Beautiful Morning. It kicks off a play that is full of new romances, busted romances, patched up romances, a lot of violence, dark conversations, threats and a wild and woolly battle for the middle of America in a very divided country (sound familiar?). It is the men vs. the women, the good vs. the bad and the cowboys vs. the farmers, all scrambling for a piece of the Oklahoma territory just after the turn of the century, and all of the promises and dreams within it.

This new version is pretty much the same as all the other plays and movies (the 1955 film version won three Oscars) and yet, at the same time, it is distinctly different. The others were grand sprawling sagas with lots of props, such as the time-honored surrey with the fringe on top, farmhouses and barns. None of that is in this new play, majestically directed by Daniel Fish. All the director gives the audience here is an empty stage with chairs, some spectators on the periphery, a small orchestra (all happily wearing cowboy boots) placed carefully in a shallow pit and that luscious music that drifts through the air and soothes the hearts of everyone in the theater.

The story (Hammerstein also wrote the book) develops nicely. Curly wants to take Laurey to the local dance but she has already promised to go with Jud Fry, a menacing, malevolent cowboy whom nobody likes. She only did it, she tells friends, to spite Curly. This sets off a battle between Curly, Jud and Laurey, in addition to the fight between cowboy Will Parker and traveling salesman Ali Hakim for the hand of the boisterous cowgirl Ado Annie. There is a lot of back and forth and the plot is told with the wonderful songs as well as dialogue. Those tunes include Oh, What a Beautiful Morning, The Surrey with the Fringe on Top, People Will Say We’re in Love, Kansas City, I Can’t Say No, and the rousing, burn-down-the-barn title song, Oklahoma!

Even though this is a barebones show, it has some marvelous special effects. At one point, Curly and Jud are arguing over Laurey with some pretty dangerous and threatening dialogue. Curly even suggests that Jud hang himself. The whole scene is presented in the dark, so that you hear only the voices of the two men. Part of that confrontation is a huge, haunting, slightly out-of-focus film of Jud talking. It fills the stage wall.

Many of the conversations in the story are done with dark lighting and stirring music to add a sense of foreboding to the drama. There is some gunplay, pretty authentic for the era. An anti-gun theme is evident around the walls of the theater, where over a hundred rifles stand in wall racks, ready to be fired at any moment if there is trouble somewhere in the territory of Oklahoma.

The story of the land and the people battling over it, the tale of yet another new frontier in U.S. history, is absorbing, and it is the same story that played out in every other U.S. territory, whether Arizona, Alaska or Oklahoma. The play tells the tale of an America that, out there in the cornfields, is bursting at the seams. And, at the same time, it tells the story of Oklahoma’s ranchers, cowboys and city folk.

In the play you learn about all the hard work the cowmen and ranchers put in to make their ranches successful, the social customs of Oklahoma and the Midwest in 1906, the dances, the dating, the generational battles, and the marvel of country folks at city folks, told so well in the tune Kansas City.

Amid all of this history is the story of the young people, helped and guided by the older ones, as they try to find their place in Oklahoma, America, and the world. It is a saga nicely told within all of those memorable tunes.

Stetsons off to director Fish for not just re-staging, but re-inventing this classic musical. He used all of his genius to create a sensational new play out of an equally sensational old one. He gets significant help from a gifted group of actors, including Daunno as Curly, Mary Testa as Aunt Eller, who holds the chaotic life of the prairie together through all of its storms, Rebecca Naomi Jones, a fine singer and whirling dervish of a dancer as Laurey, James Davis as the stoic, hunkered down Will Parker, Ali Stroker as his beloved girlfriend Ado Annie, Patrick Vail as the villain Jud Fry, Anthony Cason as Cord Elam, and Will Brill as salesman Ali Hakim.

The play started its musical journey in 1931 as Lynn Riggs’s Green Grow the Lilacs. It wound up with Rodgers and Hammerstein, who in 1943 made it into the first of their many shows together. In 1944 it won a Pulitzer Prize. The play was a huge commercial hit and ran on Broadway for more than five years. Revivals of it over the years have won numerous Tony Awards. The 1955 movie, starring Gordon MacRae, Shirley Jones and Rod Steiger, garnered three Oscars.

The folks connected to the original play really should have taken some time to give people in the audience a little history about sprawling, ever green and inviting Oklahoma that was so central to the show. The big push for statehood started in the 1889 Oklahoma Land Rush, in which 50,000 energetic settlers raced across the territory’s plains in wagons, carriages and on horseback to claim two million acres of free land, a race into history sanctioned by the U.S. government as a way to populate the huge piece of Midwestern landscape. As the new settlers developed it, the need for statehood grew. Ironically, after the success of the play, the state of Oklahoma made the musical’s title song its official state song.

I’m sure they voted for it on a beautiful morning at the start of a beautiful day.

PRODUCTION: The play is produced by Leve Forward, Eva Price, Abigail Disney, and others. Scenic Design: Lara Jellinek, Costumes: Terese Wadden, Lighting: Scott Zielinski, Sound: Drew Levy, Choreography: John Heginbotham. The play is directed by Daniel Fish. It has an open-ended run.

   

When Women Ran Hollywood

 

Hold on. When did women—who produced only 18 percent of the 100 top-grossing movies of 2018, whose screenplays constituted a mere 15 percent, and who directed a microscopic 4 percent—ever run Hollywood?

 

Here’s how I found out about this little-known history. While researching a novel set in 1919 about vaudeville, the live variety shows that were America’s favorite form of entertainment at the time, I learned its demise was caused in part by the growing success of silent movies. The obvious question was, what could make silent movies—with their melodrama, bad acting and, you know, silence—more desirable than a live performance of, for instance, a regurgitator who could swallow and then upchuck items in the order the audience determined? (Audiences loved regurgitation, by the way; also sword swallowing and fire breathing. In addition to a wide variety of acts, there was a lot of ingesting stuff that one just shouldn’t.)

 

What was so great about silent movies? I soon found myself wandering down a succession of internet rabbit holes. (When it comes to research, most writers can get so rabbit-y we practically sprout long floppy ears.) First, I sampled a few films and found that they were more complex, well-acted, and creatively filmed than I’d expected. Mary Pickford’s movies or Gloria Swanson’s, for instance, are surprisingly subtle.

 

But far more fascinating was the fact that women were a driving force in early filmmaking. Up until about 1925, “flickers” weren’t considered terribly respectable, so if you could get a real job, you avoided a career based on these flights of fancy. Conversely, if you were shut out of most employment because of, say, your gender, Hollywood beckoned. 

 

Consider the following:

  • Women worked in almost every conceivable position in the industry, from “plasterer molder” (set construction) to producer.
  • There were a few popular actors, but actresses were the stars.
  • In 1916 the highest salaried director was female.
  • In 1922 approximately 40 production companies were headed by women.
  • An estimated half of all screenplays produced before 1925 were written by women. 
  • For over twenty years, the most sought after and highest paid screenwriter was female. 

 

Why had I never heard of any of this? I was familiar with directors D. W. Griffith and Cecil B. DeMille, producers Sam Goldwyn and Jack Warner, writer-director-actor Charlie Chaplin, but had heard of almost none of the following sample of brilliant and powerful women.

 

Studio Chiefs

Alice Guy-Blaché began as a secretary for a French motion picture camera company. Fascinated by the medium’s possibilities, in 1896, she asked her boss if she could make a short story film—the first ever!—to promote the camera. He agreed as long as she didn’t shirk her secretarial duties. By the time she moved to America in 1907, she had produced 400 such films. She founded a new studio, Solax, served as president, producer, and chief director, and produced 300 more films by the end of her career. Her feature-length films were quite sophisticated, focusing on subjects of social import such as marriage and gender identity.

 

Mary Pickford, best known as “America’s Sweetheart,” was the most successful and highest paid actor of her time. She was also a shrewd businesswoman. Along with D. W. Griffith, Douglas Fairbanks, and Charlie Chaplin, she co-founded United Artists in 1919, and was arguably the most financially astute among them. Chaplin recalls that at a meeting to form the studio, “She knew all the nomenclature: the amortizations and the deferred stocks, etc. She understood all the articles of incorporation, the legal discrepancy on Page 7, Paragraph A, Article 27, and coolly referred to the overlap and contradiction in Paragraph D, Article 24.”

 

Screenwriters

Known for her sharp wit and snappy dialogue, Anita Loos had a career as a screenwriter, playwright, and novelist that spanned from 1912 to the late 1950s. Douglas Fairbanks, as much an athlete as an actor, relied upon her to accelerate his career and devise an ever-expanding list of “spots from which Doug could jump.” Most famously, she adapted her bestselling novel Gentlemen Prefer Blondes as a silent film in 1928, which was the basis for the 1953 version starring Marilyn Monroe.

 

Frances Marion is hard to top even by today’s standards of output and success. Until 1935, well after women’s influence in Hollywood had waned, she remained the most sought after and highest paid screenwriter in America, male or female. She acted, directed, produced, and is the only woman to have won two Academy Awards for screenwriting. She was mega-star Mary Pickford’s preferred writer (and best friend) and saved many careers in the tumultuous years when the industry was converting to sound. Above all, Frances was a generous collaborator, hosting famous “hen parties” at her house as a sort of support group for Hollywood’s female filmmakers.

 

Directors

Lois Weber was also a studio head, producer, screenwriter, and actress, but as a director she was as well-known as D. W. Griffith and Cecil B. DeMille. In 1916 she was the highest paid director, male or female, earning an unprecedented five thousand dollars a week. She was the first woman member of the Motion Picture Directors Association, with 138 films to her name. They were often morality plays on social issues such as birth control, drug addiction, and urban poverty, particularly as these affected the plight of working class women.

 

Dorothy Arzner, the most prolific American female director of all time, started in the scenario department in 1919 typing up scripts. In the fluid Hollywood work environment, she quickly progressed to cutting, editing, and writing, and by 1927 had directed her first film. With the advent of sound, she invented the boom mic to allow actors to move about the set without bumping into sound equipment. Arzner was gay and fairly open about her personal life, wearing men’s clothing, and living with choreographer Marion Morgan for 40 years. Despite her gender and orientation, she was able to work steadily as a director until she retired in 1943.

 

Renowned early film historian Anthony Slide has said, “Women directors were considered equal to, if not better than, their male colleagues.”

 

Actresses

Florence Lawrence is credited with being the world’s first movie star. In 1908 she was making 100 flickers a year with D. W. Griffith at Biograph, the world’s top studio at the time. However, she was known only as “the Biograph Girl” because the studio didn’t want to increase her burgeoning fame, and thus ability to demand a higher salary, by naming her. She moved to upstart IMP studios, and was involved in perhaps the first wide-scale publicity stunt. The studio quietly fed the papers a story that she’d been killed by a street car, then took out ads declaring “WE NAIL A LIE” claiming that other studios were trying to ruin her career. Fans went crazy for the story, and a public appearance shortly thereafter resulted in mayhem as a huge throng rushed her, pulling buttons from her coat and the hat from her head.  

 

Mabel Normand was a brilliant comic actress, starring in approximately 200 films, most at Mack Sennett’s Keystone studio, and was the first actor to be named in a film’s title (e.g. Mabel’s Lovers in 1912). She also directed many of her own films, including those in which she was featured with a young Charlie Chaplin. Though Chaplin erroneously claimed directorship of several of them, Mack Sennett said that Chaplin “learned [to direct] from Mabel Normand.”

 

Every one of these women was a multi-talented powerhouse, committed to the success of her films, the industry, and other female filmmakers. And for each of those named above there were many, many more.

 

Unfortunately, with a few notable exceptions their careers were generally over by the end of the 1920s. As Hollywood historian Cari Beauchamp said, “Once talkies arrived, in the late 20s, budgets soon tripled, Wall Street invested heavily, and moviemaking became an industry. Men muscled into high-paying positions, and women were sidelined to the point where, by the 1950s, speakers at Directors Guild meetings began their comments with ‘Gentlemen and Miss Lupino,’ as Ida Lupino was their only female member.”

 

Their names may no longer be widely recognizable, but these were among the many women who built and ran early Hollywood, shaped the industry in myriad ways, and influenced what we see on the silver screen even today.

 

Will women ever “run” Hollywood again—or even advance to relatively equal numbers as studio heads, producers, directors, and screenwriters? This remains to be seen, of course. But powerful leaders, like executive producer, showrunner, and director Shonda Rhimes, Amazon Studios head Jennifer Salke, Disney TV Studios and ABC Entertainment chair Dana Walden, producer-director Ava DuVernay, and director Patty Jenkins, among many others, offer hope.

 

“Demanding what you deserve can feel like a radical act,” Rhimes has said. Radical, perhaps, but not new. All it would take is a return to the good old days of early Hollywood.

 

Trump’s War on Civil Rights and Beyond: A Conversation with Acclaimed Political Analyst and Civil Rights Historian Juan Williams

 

 

Republican presidential candidate Donald Trump urged black voters to ditch the Democratic Party and “try Trump” at a campaign rally on August 19, 2016, in the predominantly white suburb of Dimondale, Michigan. He said of black Americans: "You're living in poverty. Your schools are no good. You have no jobs. Fifty-eight percent of your youth is unemployed.” Trump then asked, “What the hell do you have to lose?"

            

As it turned out, African Americans—among others—are losing a great deal under President Trump, as acclaimed commentator, journalist and historian Juan Williams argues in his timely and illuminating new book, “What the Hell Do You Have to Lose?”: Trump’s War on Civil Rights (Public Affairs). 

 

Mr. Williams contends that Trump’s now-infamous campaign speech and other statements on race have conveniently ignored African American history and progress in the decades since the passage of the 1964 Civil Rights Act and the 1965 Voting Rights Act. He denounces the president’s ingrained tendency to intentionally distort history to fuel racial tensions for his political advantage.

 

In “What the Hell Do You Have to Lose?” Mr. Williams deftly weaves the remarkable story of the struggle for civil rights into his account of how the Trump Administration has been bent on turning back the clock and undoing or threatening advances in voting rights, school integration, equal employment, fair housing, and other areas. He describes the unprecedented threat to civil rights under Trump as he chronicles the president’s personal and family history of discriminating against people based on race and his record of hostility to African Americans, including President Barack Obama.

 

In describing the losses for African Americans under Trump, Mr. Williams also provides glimpses from the struggles of heroic pioneers who fought for civil rights and for a better life for all Americans. He shares the stories of activists such as Bob Moses of the Student Nonviolent Coordinating Committee, who braved the violent Jim Crow South to register African American voters; James Meredith, a US Air Force veteran, who became the first black student to enter the University of Mississippi in 1962 in the wake of bloody riots at “Ole Miss”; A. Philip Randolph, a union leader who made strides for equal employment rights in the Jim Crow era; and Robert Weaver, who championed fair housing programs and became the first black cabinet secretary as the head of the Department of Housing and Urban Development.

 

Mr. Williams takes pains to explore the past in the belief that knowledge of history is the key to understanding the present and to shaping the future as he explains how the principles of equality, tolerance, and justice today are at stake for all citizens.

 

Mr. Williams is an award-winning journalist, political analyst and historian who has covered American politics for four decades. He has written several other books, including Eyes on the Prize: America’s Civil Rights Years 1954-1965; Thurgood Marshall: American Revolutionary; This Far by Faith: Stories from the African American Religious Experience; My Soul Looks Back in Wonder: Voices of the Civil Rights Experience; and Enough. His articles have appeared in the New York Times Sunday Magazine, Time, Newsweek, Fortune, The Atlantic Monthly, Ebony, Gentlemen’s Quarterly, and The New Republic. Mr. Williams is currently a columnist for The Hill, and was a longtime correspondent for The Washington Post and NPR. He also cohosts the Fox News Channel’s debate show The Five, and appears on other Fox shows where he regularly challenges the orthodoxy of the network’s right-wing stalwarts.

Mr. Williams generously spoke by telephone about his new book, his work, and his commitment to sharing historical context when discussing current events. Following our conversation, he added this opening update for readers on his historical perspective and recent events.

 

Juan Williams: I want to thank Robin for the opportunity to talk to history lovers on the History News Network. When I wrote “What the Hell Do You Have to Lose?: Trump’s War on Civil Rights,” my goal was to answer the question that then-candidate Donald Trump posed to Black America: ‘What do we have to lose from a president who doesn’t care about African Americans?’

 

My book dissects Trump’s unprecedented assault on everything America has achieved over the last half century to move forward on race relations--from voting rights to integrated schools to equal opportunity in employment and fair housing. These changes were achieved by people who made sacrifices, put themselves at risk of being expelled from school, losing jobs, losing their mortgages, constant threats of violence and some even faced death.

 

I tell stories of these courageous civil rights heroes so that we can better understand that progress came at great cost. Starting from that baseline helps the reader to understand how much the nation has gained, and how much we have to lose from Trump’s effort to return to the past or, in his infamous words, “Make America Great Again.”

 

Since I finished writing What the Hell Do You Have to Lose? in 2018, very little has changed. The president continues to tell lies about blacks, Latinos, and immigrants. He makes racial minorities and immigrants out to be a threat to America; we become the enemy, all lumped together as barbarians who commit crimes, take advantage of social programs, and abuse affirmative action policies.

 

These lies are aimed at the ears of white America at a time when pollsters report that large numbers of older whites are anxious about the growing number of black and brown people, and immigrants of all colors, in the USA.

 

Trump’s most frequent refrain is that life is better for minorities with him as president. He dismisses talk about increasing racism and anti-Semitism as overwrought. Even FBI reports on the increase in hate crimes since he has been president are waved away as liberal nonsense. Instead, he frequently tells interviewers, for example, that the black unemployment rate is currently “the lowest in the history of the country.”

 

This is a distortion.

 

First, black unemployment under Trump has never reached its lowest point in history. Though it did hit 5.9 percent last May, Labor Department data indicates that black unemployment dropped to 4.5 percent in 1953. According to the Washington Post Fact-Checker, this distortion earned Trump three out of four Pinocchios for his unfounded claim.

 

In addition, the president fails to mention that black unemployment has been increasing. As recently as February 2019 it reached 7 percent. And throughout, black unemployment has remained more than double white unemployment.

 

Unfortunately, these are the kinds of distractions from the truth about race relations that Americans--black and brown Americans in particular--have come to expect from our president.

 

He’s a man who couldn’t condemn the unique horrors of white supremacy that resulted in the death of a woman in Charlottesville last summer.

 

Trump won’t talk about the white supremacy that led to the death of Heather Heyer in Charlottesville, eleven Jews in Pittsburgh, and fifty Muslims in New Zealand. But he couldn’t be happier to talk about Congresswoman Ilhan Omar, whose recent treatment by Trump and the Republican Party has less to do with condemning anti-Semitism than it is a political ploy to silence an immigrant, black and Muslim woman who dares to wear a Hijab in Congress and speak her mind about controversial subjects.

 

He’s a man who, hours after it came out that a white supremacist in New Zealand slaughtered fifty Muslims during their Friday Prayers, said that white nationalism was “not really” a major threat, even as the killer’s manifesto described Trump’s 2016 victory as “a symbol of renewed white identity and common purpose.”

 

Indeed, Chicago Mayor Rahm Emanuel, even after condemning the courts for dropping the charges against disgraced actor Jussie Smollett, slammed Trump for speaking on the issue, ordering him to “stay out” because “the only reason Jussie Smollett thought he could get away with this hoax is because of the environment President Trump created.”

 

Previous Republican Administrations made good faith efforts to improve relationships with African Americans.

 

Presidents Reagan and Bush made a point of speaking at the NAACP, seeking out advice from prominent black intellectuals, and appointing African Americans to the highest positions in government. And under President Obama black and white members of both parties were willing to start having the messy, yet necessary conversations about issues that continue to prevent us from moving forward on race as a nation.

 

On the other hand, President Trump has just one African American in his Cabinet. Despite agreeing to some criminal justice reform measures, Trump has failed to deal with issues of police brutality that have led to persistent tensions with black America and the creation of the Black Lives Matter movement. Instead, he ran a campaign, and now a government, fueled largely by white American fears that the country is being stolen from them by ungrateful African Americans, undocumented immigrants, and radical Muslim terrorists.

 

According to Trump, the problem is not the harsh, unfair reality of high levels of segregation in neighborhoods, schools, and jobs. The problem in his eyes is a football player, Colin Kaepernick, kneeling in protest during the playing of the national anthem.

 

Trump also is easy to anger when prominent black people challenge his policies. He will go out of his way to tongue-lash black critics, including insulting LeBron James, Steph Curry, Jay-Z and other black celebrities. He regularly disparages black women in the Congress who disagree with his policies.

 

To get away from the day-to-day static around Trump’s mishandling of racial issues, the American people need to know about the civil rights heroes like Bob Moses, James Meredith, A. Philip Randolph, and so many others, because we need to understand how much blood, sweat, and tears it took to create the thriving Black America of today and protect us from those who, like President Trump, couldn’t care less.

 

 

Robin Lindley: Congratulations Mr. Williams on your powerful new book on Trump’s war on civil rights. You take pains to weave history into your reporting, and you are a historian in your own right with your acclaimed books such as Eyes on the Prize, a study of the Civil Rights Movement, and your renowned biography of Justice Thurgood Marshall. In your new book you share the story of civil rights advances that are now threatened under Trump. Your efforts as a journalist and historian are refreshing in this era of fake news. 

 

Juan Williams: I love history. I find it eye-opening because it tells me so much about the present and also gives me a structure for thinking about the future. For me, history has always been a revelation. Even as a child, when I learned about the past, I thought, Oh, my goodness. Who knew?

 

Robin Lindley: Did you have training in history when you went to school or did history just naturally come into your writing when you were a reporter?

 

Juan Williams: No, my love of history is an extension of my interest in the news, a fascination I had from my days as an immigrant child in a city with close to a dozen newspapers, New York. I found newspapers and daily journalism on radio and television to be a reason to look into history. The rest of the story, the back story if you will, was the history of the characters and events, and the ideas that animated the politics of the day. I would see something that happened in a prior period in American life and I would go to the library in New York City, where I grew up, and I’d read a book to investigate the story and to understand how we came to the point where we were then and how that article that I was reading in fact was representative of a larger and longer vein of history.

 

Robin Lindley: There's a new twist in the news every day concerning our history, and particularly about race. Attorney and Trump “fixer” Michael Cohen called Trump a racist, a con man, and a cheat at a public Congressional hearing. I don't think that was news to many of us, including the Republicans on the committee. You certainly delve into the history of Mr. Trump's racial insensitivity as well as his lack of historical knowledge as he attempts to erase the past.

 

Juan Williams: I write of the reality of the sacrifices, even people giving their lives, to accomplish racial justice in this country. I'm not suggesting this book is a complete telling of the civil rights movement; I structured the book to include the history as an introduction to the background for young people and a reminder for people who may have forgotten the past. My premise is that we have traveled such a distance on race, going back to our origins as a nation with legal slavery and then government-enforced legal segregation that extended well into the 20th century.

 

I had written some of that story in my first book, Eyes on the Prize. More of it is in my second book, a biography of former Supreme Court Justice Thurgood Marshall.

 

That brings me to this book and why I was offended by Trump telling white audiences that black people had nothing to lose by voting for him. The quote, "What the hell do you have to lose?" came from him during the 2016 election campaign. He argued to whites that these black people live in such bad neighborhoods in terms of the violence and crime, with bad schools and a lack of jobs. And, some white person driving through a troubled black neighborhood might say, "Well, it looks like he has a point."

 

There's so much missing context in terms of that distorted picture of black American life. First, no fake news, just the facts: The nation’s black population is doing better than ever before by so many measures in terms of income, education, business ownership, occupying political office, and the like. I could go on. But Trump doesn't seem to plug into that part of the story. Instead, he takes a perverse delight in poverty and crime among blacks, Latinos and immigrants. Again, this is why I think the history is so important.

 

The history of progress for American minorities is needed to inform someone hearing Trump’s indictment so they are not fooled. With history in mind they will know what the hell striving minorities in this country have to overcome and a history lover knows how far minorities and immigrants have come despite those obstacles.

 

That indictment of black people by Trump is undermined by the history of all the struggle and sacrifices made to bring black people to this point. And also, it opens eyes to the idea that the African American community is not all poor and poorly educated. In fact, black America in 2019 is at historic heights in terms of income and education. Almost 40 percent of black people make between $35,000 and $100,000 per year. Another 11 percent are earning between $100,000 and $200,000. So that's half of the black population living in the American middle class. And then you have the reality of black executives who have led very successful American companies like Time-Warner, Merrill Lynch, American Express, and Xerox.

 

Those stories of black achievement are not part of Trump telling whites that blacks have nothing to lose. An informed listener will know they are being misled by Trump because they know the history of black trailblazers, beating the odds to make new paths in American society, a society that not only enslaved black people but legally segregated them and still discriminates against them.

 

And once voters – including Trump voters – are aware of that history, I think their attitude might shift from contempt to admiration. They might say, ‘Oh gosh, look at these previously disenfranchised people who have made their way. Wow.’

 

We should celebrate these people who have remained loyal Americans and hardworking people who believe that they can make it in this society and that they can achieve the American dream. But to the contrary, they're vilified and made out to be a bunch of people with nothing to lose by the man who then becomes President Donald Trump.

    

Robin Lindley: It seems too that old stereotypes have re-emerged under Trump. By old, I mean before the Civil War, such as the recent controversy about the governor and attorney general of Virginia appearing in blackface as young men. How do you see this issue of political leaders who engaged in this racist mockery?

 

Juan Williams: Part of "Make America Great Again," Trump's campaign slogan, was to create nostalgia for some time before the civil rights movement, before the women’s movement, before America became more diverse, an earlier social hierarchy in which white people, especially white men, were at the top and people of color were below them. Black people fit into that picture as happy-go-lucky people, singing, dancing, and even white people feeling free to put on blackface and mock black Americans with minstrel shows. Apparently, we are to believe racial minorities were happy before all these northern agitators came down here. This is a modern version of segregationists telling each other that “Our black folks are happy folks.”

 

That was delusional thinking on the part of racists who didn't want to hear anything about equal rights or civil rights. So, when you look at this generation of white leadership in Virginia, which held the capital of the Confederacy in Richmond, you see that they were in school [after the Civil Rights Movement]. You look at Governor Northam, and he was [wearing blackface] in the 1980s and then you look at the attorney general, and again, that was also the eighties. So even educated white men in the 1980s felt free to join in the mockery of their fellow Americans.

 

For young white men in Virginia and in fraternities, it was just acceptable behavior to replicate old Jim Crow dancing happily, making themselves fools for the entertainment of other whites. Blacks were portrayed as less intelligent, less hardworking, less trustworthy, as liars and cheats--people that you wouldn't want to be around, except to laugh at as fools. Certainly, not anybody you would trust as an employee or as a public official.

 

So here you have in modern America of the 21st century a reminder of how even the best educated whites were also party to this longstanding dehumanization of black people by putting on blackface. It speaks again to the power of history to inform our understanding of who we are and who we elect today. Remember, both of those Virginia officials won the black vote in Virginia. What's curious about this is that Gov. Northam has continued to receive support from black Virginians who say, “that's just the way it is, and let's look at his policies now,” and hope, in fact, that this might raise the race issue to the point where he and others feel as if they have some responsibility to make amends.

 

Robin Lindley: I can't help but think too that this ties also to the eugenics movement in early twentieth century America and white supremacy. Attorney Bryan Stevenson, who founded the Equal Justice Initiative and the Legacy Museum in Alabama, said that he believes that the worst legacy of slavery isn’t involuntary servitude, but it is the legacy of white supremacy that has echoed through time.

 

Juan Williams: That's my point. So, I say amen to Mr. Stevenson because this extends not just to what you describe as the legacy of white supremacy in terms of our political institutions. It extends also into our assumptions about what we accept as normal in America. And I wrote this new book to open our eyes to see discrimination and inequality across the years so we can better understand it in the current context.

 

You think about the contemporary American standards of success, standards of beauty, the exercise of power, the standards of intellectual achievements. President Trump, without any sense of history, acts as if black people are not contributing to the country, not up to these standards. 

 

Even if Trump wants to focus on higher level of poverty among blacks today, he can’t fool people who know their history. They can say to themselves, “Hey, wait a minute. Black people were kept out of our institutions beginning with schools. Black people were kept from buying real estate in neighborhoods that would have allowed them to amass wealth through the value of homes and property. Black people were kept out of American business and had no access to bank capital. Black people were kept out of the American military. Black people were kept out of our sports.”

 

For a young person today, you have to think about all these things when you hear someone like Trump putting down black people. He is appealing to what Mr. Stevenson said is the legacy of white supremacy.

 

Robin Lindley: I don't know if it matters whether we call Trump a racist, but he is a person who has a personal and family history of discriminating against people based on their race, which you document in your book.

 

Juan Williams: Yes, I think it's very important for people to know Trump’s personal history. You can inform people about the power of American history, but when you talk about it in terms of individual history you see the roots of his kind of leadership; the basis of so much of a person’s thinking as they grow to adulthood and then into power. That is a very revealing and illuminating backdrop for readers of American history. I think that's why biography, by the way, is such an important branch of American history. 

 

So, when I talk about Trump, I start with his father and his father was arrested at a Klan rally. We don't know if he was a Klan member, but we know he was arrested at a Klan rally in New York in the early part of the 20th century. 

 

And then you come forward to the Trump family business agreeing to a deal with the US government, which had charged them with housing discrimination in their New York City housing units. In fact, in this book, I write about Woody Guthrie and others who were writing songs about the rank discrimination at Trump properties in New York City in the mid-20th century.

 

Once people have an understanding of Trump's upbringing and experience with race, they then might also come to understand why he's the guy who, when five black and Latino boys were charged with beating and raping a white woman in Central Park, leapt to assume their guilt. Subsequently, when they were found not guilty of this crime on the basis of DNA evidence, Trump did not recant or apologize for having run a full-page newspaper ad calling for the death penalty for these boys. He said nothing.

 

And Trump of course engaged in the whole birther argument against President Obama, trying to diminish the first black president by making him an illegitimate president. Some people might say, well, you know, what's the big deal, where is the birth certificate?—without understanding that it fits into this ongoing pattern in Trump’s life of appealing to racist sentiments that vilify people of color as strange, alien, foreign, dangerous, and coming to take your job or to disrupt your neighborhood. This is part of who Trump is and it's part of American history.

 

Robin Lindley: Trump's efforts to destroy any of the legacy of President Obama seems pathological. He and his party have tried to erase everything that President Obama accomplished. What's your sense of this obsession?

 

Juan Williams: A lot of the anger at President Obama again comes back to the idea that Obama was the first black president and did not rise to success through established white business, military or even political hierarchy. 

 

The argument, especially from a lot of the Trump people is, who is this guy? How did he get here? Who were his patrons? They derided him as a “community activist.” And even more, they expressed fear that Obama was going to take a special interest in caring for black Americans—that he was going to be the black president and neglect white America or even get revenge on whites.

 

That is a very interesting twist on reality. Historically it was black people who were excluded by white politicians from programs like the GI Bill, the programs to get people into schools and to help them buy housing. Even parts of the New Deal were not open to workers in unskilled, non-union jobs dominated by black people. This is the history of the country. Yet now we are told to focus on white, working class fear that the blacks are being given something for nothing. This is ridiculous if you know history.

 

Of course, President Obama's response to this psychological twisting by some people was interesting too because he was always on guard against the notion that he was only the president of the black people – not the president of all the people. Black activists on the far left often criticized him for not doing enough for black folks. It was kind of a Catch-22 in my mind.

 

But back to the Trump perspective. Again, it's the idea that if you undo Obama policies, what you are doing is making America great again by reorienting all the policies back to big business, and taking them away from trying to make amends for prior discrimination or high levels of inequality in American society. It's less about raising up those who have been left behind and more about rewarding those who are in power or who are at the top of the economic ladder. 

 

To me, it has as much to do with symbolism as it has to do with the actual undoing of the policy. If you think about things like reversing environmental regulation or refusing to put in place consent decrees to deal with police accused of being brutal in their treatment of blacks, it is unbelievable. It is striking that Trump is able to convey to his political base of support that, in order to make America great again, you don't want to lift up those who have been left behind, especially people of color.

 

We haven't mentioned this, but I think it's very important to include immigrants, people of all colors but especially immigrants of color that Trump infamously described as coming from “Shithole” countries. Right from the start of his campaign, he demonized immigrants and specifically Mexicans as rapists and criminals. And again, the idea is these people are coming and taking advantage of the USA, when in reality these people are often times working in industries that can't find workers. And these are people who are trying and striving so hard to achieve their American dream.

 

Robin Lindley: The hate speech has been deafening. The events in Charlottesville had to be shocking to you. I never thought that I would see Nazi and KKK rallies in 21st century America, and you now have a president and a political party, the Republican Party, silent about this sort of racism and violence.

 

Juan Williams: Yes. Let me just say when Trump talks about fine people on both sides, the most forgiving interpretation you can give to him is to say that he sees these self-identified white supremacists as fine people who are simply standing up for Confederate statues and monuments to soldiers who fought to break up the United States. The phrase “fine people” assumes that these neo-Nazis were generally good Americans, just with different points of view about historical markers. In fact, they were celebrating racists and traitors to the American flag.

 

History can help you to stop and look through the fog of words from the president, and you will see that the history and tradition being celebrated is, first, one of the Confederate Army attempting to secede from the United States and break apart our country and, second, one of defending slavery and imposing that kind of oppression of human beings in the United States of America, a country based on the proposition that all men are created equal.

 

And again, only history can inform you of this distortion, which might fly by your ears while listening to the president of the United States. Fine people on both sides. Well, no. These are people who are celebrating monuments that are in fact intended as reminders of that legacy of white supremacy. Even if you were to say, and in many cases I can understand someone saying, the Confederacy is part of American history, like it or don't like it, and it's important that we know about it for better or worse.

 

Absolutely. It's also true that in Germany they do not celebrate with markers and monuments to the Third Reich.

 

Robin Lindley: I was thinking about the German example too. Trump and the Republican Party support the rollback of the Voting Rights Act and the undermining of democracy with voter suppression, trumped-up investigations of voter fraud, and gerrymandering. How do you see these efforts and why are Republicans virtually unwilling to contest Trump's racism and other faults?

 

Juan Williams: I expect it's a matter of pure politics; in a contemporary context, that is how to understand the president and his support from self-identified Republican voters in the country. Any Republican who would challenge Trump's racist rhetoric and his other flaws would lose the Republican base. It has become the case that the Republican Party of 2019 is truly better described as the Party of Trump.

 

And, when it comes to issues like race, I think Republicans don't see that much benefit to stand up and speak honestly about Trump's racial attitudes despite the fact that 49 percent of Americans think the president of the United States is a racist, according to a Quinnipiac poll last year. It's incredible that nearly half of us would make that statement to a pollster. But that's repeatedly been the case. The same poll has 22 percent of Republicans saying this president has incited white supremacist behavior and actions in the country.

 

I'm very disappointed in where we are in terms of our nation’s civic morality and commitment to the historical premise of our country, equal opportunity for all. Somehow tribal political allegiances are failing to hold to those values. We seem to be going in the other direction, intent on exclusion instead of inclusion. You can't not see it if you open your eyes, but [Republicans] choose to ignore it, and in some cases to use it. 

 

You mentioned Michael Cohen’s testimony in which he called the president a racist in front of the Congress and the nation. And there was a congressman who then introduced a black person who was a friend of the Trump family and had been promoted into a position in government by Trump. And he had this person stand up, as if her presence were evidence that Trump is not racist. Well, again, this requires that you lose sight of not only Trump's personal history, but our nation's history. Trump appeals to elements with a grievance against minorities and uses that to elevate himself into power. So you'd have to ignore all of that. But again, using somebody as a prop to excuse it may be worse even than ignoring it.

 

Robin Lindley: That was a chilling moment, wasn't it? As that Republican member of Congress displayed this young black woman, it reminded me of antebellum images depicting slaves displayed on auction blocks.

 

Juan Williams: I hadn't thought of that one. Gosh.

 

Robin Lindley: If Trump has done anything positive, it seems he has sparked a conversation on slavery, cruel Jim Crow laws, racism, mass incarceration and more, and you have added to the dialogue. I may be in a bubble, but I've heard much more about this history in the past couple of years.

 

Juan Williams: I think so and we even see with the candidates campaigning for president on the Democratic side. They were in Selma, Alabama [observing the 54th anniversary of the Bloody Sunday, Selma March for voting rights], looking at America and the civil rights era and how we've come out of it. Again, the history informs our understandings about what's going on now and why there's such concern about white nationalism and negative racial attitudes flourishing under Trump.

 

Robin Lindley: You provide context in your book by weaving the civil rights history into your discussion of rollback of programs in the past two years. You focus on activists such as Bob Moses who risked his life to register black voters in the brutal Jim Crow South.

 

Juan Williams: I have respect for people who stand up for principle, but to sacrifice their lives is unbelievable. People don't understand the kind of courage it took to go up against segregationists who had guns and the authority to put you in jail or beat you without consequence if you said that all Americans should have the right to vote. It almost sounds ridiculous that you would have to fight for such a thing.

 

Bob Moses is still alive and living out a kind of second chapter of his life and his legacy as a civil rights pioneer. Now he's involved with something called the ‘Algebra Project.’ It teaches math skills to young people of color, giving them a chance to gain equality in terms of preparation for this high-tech economy. I wanted to focus on his work in Mississippi and in the South in general on voting rights because there were people, even civil rights heroes, who would not have been as courageous as Bob Moses in going down and directly challenging the Southern white segregationist power structure for failing to allow black people to vote. And it wasn't just challenging the white power structure. He had to challenge black people in a way that they had to put themselves on the line to stand up against that power structure. They faced the risk of violence, risk of jailing, and risk of loss of their jobs and mortgages, and all the rest, in order to take part in what Bob Moses was promoting. There's a lot of people who deserve similar credit for leading that effort and that movement.

 

But when we come to today and the Republican Party, you start to see that they view minority voting, and especially black voting, as a threat. Then you come to understand the historical roots and you start to look at what is identified often as voter suppression efforts by today's Republican Party, and you say we've lived through this as a country before in a different form in terms of outright denial of the vote. But now we come to efforts to push people off the voting rolls, limit the polling places available in minority neighborhoods, and limit the time available to vote at those polls. And you're reminded of things like literacy tests for the right to vote and limited time to register to vote for blacks. There were crazy tests [in Jim Crow states] about how many bubbles are in a bar of soap, or how many marbles are in a jar. These tests were all intended to deny black people the right to vote in earlier African American history.

 

Robin Lindley: Under President Trump, former Attorney General Jeff Sessions was curtailing enforcement of civil rights laws.

 

Juan Williams: This is such an interesting history. Now of course, that goes beyond Jeff Sessions, who is gone as Attorney General. 

 

Sessions didn’t want to enter into consent decrees with local police departments that encouraged making peace between local black communities, the Black Lives Matter movement, and police departments that were finding themselves charged with racism on the basis of either young men being shot and killed or police brutality and, of course, high rates of incarceration, and all that. On all of these counts, the idea was that the federal government under Obama tried to have a healing, salutary effect. Then, in comes the Trump administration with Jeff Sessions as Attorney General, and they say, “No, we're not interested in that.” What kind of signals does that send? To me, that's a pernicious signal. It's saying, "Trump supporters, Republicans, are not interested in that." 

 

In fact, they doubled down by giving police more military-style equipment. Again, this was a real signal of their antipathy towards racial peace in the country, in my mind.

 

Robin Lindley: You also comment on the Administration’s undermining of public education. You recount the history of civil rights and education with your story of James Meredith who enrolled as the first African American student at the University of Mississippi in 1962. You capture the extremely violent atmosphere at that time, and the story may surprise many younger readers in particular.

 

Juan Williams:  Instead of simply being a polemic aimed at how Trump has enabled racist sentiments to rise up, I think it's important to tell people exactly what we have to lose again in reference to Trump's statement: "What the hell do you have to lose?" Part of that speaks directly to the idea of James Meredith, the US Air Force veteran who enrolled at the segregated University of Mississippi, and wanting people to understand that it took the federal government to go down there and defend this one person who simply wanted to go to school--this one person again, an Air Force veteran who had served his country. And this one person was the target of such animus that there were riots and people were killed. 

 

It's hard to tell this story without explaining the level of violence that was mounted, and mounted with the encouragement of the governor at the time, Ross Barnett, with the intent of building political support for the white segregationists who would stop a black person from attending a state-funded university. 

 

And federal troops were sent into Oxford, Mississippi, so James Meredith could enroll. There was a pitched battle. Segregationists attacked US troops. Some people didn't make it through the night. Some people did not survive. Even a reporter was killed. And again, for so many people today, it would be hard to understand.

 

And recently, at the University of Mississippi, some of the basketball players knelt during the national anthem because there was a show of support for a rally by white supremacists coming to the campus.

 

Robin Lindley: I didn't know about that recent racial incident. That’s heartbreaking. You also detail cutbacks in funds for low income and fair housing and how these housing programs are now imperiled under Trump. And you share the story of Robert Weaver, a pioneering advocate for fair housing and the Secretary of Department of Housing and Urban Development. 

 

Juan Williams: The HUD building here is named for Robert Weaver and lots of times I point this out when I have guests in town and they say they had no idea. 

 

He's a figure who was obviously not lost to history, but I think he's not valued or celebrated in the way that he should be, now especially given Trump's background as a real estate developer. It's an interesting point of contact that there was a civil rights pioneer in just that area who insisted on equal housing rights in a country that had not only practiced housing segregation but also engaged in legal tactics to exclude blacks, Jews and other minorities from owning property in certain neighborhoods. 

 

Robin Lindley: I did want to ask about your role on Fox News. We have cheap cable, so we don't get Fox News or CNN or MSNBC. I wondered how you decided to appear on Fox News after your work on PBS and your work as a journalist. I'm heartened to learn that you present a counterpoint to the right-wing zealots who dominate Fox News.   

 

Juan Williams: Fox News is the number one cable network in the country. They don't tell me what to say and they don't censor me. 

 

For me, what's important is that I speak to an audience that otherwise wouldn't hear a different perspective. They wouldn't hear the historical background that I bring to discussions of contemporary events. The current politics in America is so partisan and so polarized and I offer people trapped in their own ideological bubble a breath of a different kind of air.

 

I think of myself as a foil to some of Fox’s leading hard-right personalities, but it can be very difficult for me. And even with this book that we're discussing, What the Hell Do You Have to Lose: Trump's War on Civil Rights, you have people on the far right who immediately attacked the book, even before it was published. Their aim was to undercut its value and its attractiveness to readers. I find it just so alarming that you can't even have an honest discussion, an honest debate, or people will try to silence you. Anyway, I'm up against it in many ways, but I think this is, for my time, the most important fight to be in.

 

Robin Lindley: That’s an act of courage now. I saw your Twitter feed and was stunned by the barrage of hateful and racist remarks you receive.                     

 

Juan Williams: Yes. That's what happens. You have these trolls and then the bots pick up and they never stop. They try to bury you alive in terms of American history. Can you believe that?

 

Robin Lindley: I’m sorry you’re the target of these vicious attacks.

 

Juan Williams: I just think that's an important point for you and for the History News Network to be aware of. In the current political climate, there are people who don't want to hear and will kill off any attempts to raise up American history, to help us better understand who we are and where we are today. I think that's what happened in terms of what you saw on that Twitter feed with this book.

 

Robin Lindley: Were there efforts to kill the book?

 

Juan Williams: No. They couldn't stop me from writing the book and they couldn't stop the publisher from publishing the book, but what I'm saying is, when you see those remarks that you described on Twitter and the like, some of those came before the book was published. And then you see this onslaught of people who say they were reviewing it and they didn't even have the book. But they are so harshly critical because they don't want that message to be given any attention. They want to dismiss it out of hand or, as I was saying earlier, they want to kill it in the cradle. And that hasn't stopped. I just know that the bots or the trolls have been intent in trying to undercut this discussion of Trump and his policies on race.

 

Robin Lindley: I regret that I haven't seen your commentaries on Fox News. Are you able to talk about the war on civil rights and share your opinions on Trump and his administration?

 

Juan Williams: Typically, the racial issues come up time and again in American society. Recently, we talked about the blackface issue. We also talked about Jussie Smollett, the actor in Chicago who claimed that he was attacked by people wearing MAGA hats. And we talk about the spike in hate crimes since Trump has been in office, including the incident in Charlottesville.

 

My historical bent informs the comments that I make about these news events. But again, it's one voice and oftentimes people don't want to hear what I have to say and it becomes contentious. But, I'm in there and I'm trying to do my best.

 

Robin Lindley: I appreciate your unique perspective and your efforts to provide historical background in your work. Congratulations on your timely and informative book on the rollback of civil rights progress under Trump. 

 

Juan Williams: I am so grateful that you guys [at HNN] love history as much as I do. It doesn't have to be about race. On any subject, I think that the more people are aware of our history, the more they'll love this country and the more they’ll understand the true purpose of this country, which is opening doors, building bridges. We have a great country but we have to protect the ideals aggressively.

Teacher Pay, Presidential Politics, and New York’s Modest Proposal of 1818

 

With her recent pledge to raise teacher salaries, Senator Kamala Harris has guaranteed the issue will be a key part of the 2020 Democratic Party primary debates. Since the slew of teacher strikes began last year, the question of teacher pay has shoved its way to the front in American politics. We shouldn’t be surprised; teacher pay has always been one of the thorniest issues in public-school administration. The first generation of amateur administrators considered one solution that will seem shocking only to those who do not understand the desperate politics of school budgets.

 

The fact that teacher pay is a difficult issue should not come as any surprise. Public schools, after all, generate no profit and teacher salaries make up the biggest portion of budgets. Even a small decrease—or a reduced increase—in teacher pay can create a lot of wiggle room in a struggling school-district budget. It is always a temptation for administrators in straitened circumstances to dream of cutting salaries.

 

The most experienced and cynical teachers, then, won’t be surprised to hear of one plan from the earliest days of public education in the United States. Two centuries ago, well-heeled philanthropists of the New York Free School Society (FSS) operated two schools for the city’s least affluent children. 

 

Money was always tight. As the trustees noted in 1820, the FSS cobbled together funds from “the donations and Legacies of charitable Individuals, the bounty of the Corporation [i.e. city government] and the munificence of the Legislature.” Their funding was never guaranteed and it was never enough; the organization lurched from one financial crisis to the next. 

 

The dilettantish leaders had not expected this. They thought that a new plan of organizing their teachers would prevent such financial shortfalls. Under this plan, called “monitorial” or “Lancasterian” education, the charismatic English school founder Joseph Lancaster promised that thousands of children from low-income families could learn to read and write without expensive teacher salaries. One teacher would supervise “monitors,” student helpers who would do all the hard work of teaching. 

 

Experienced school leaders might have seen the problem coming, but New York’s administrators had more enthusiasm than experience. They were surprised to find that their monitors did not want to work for nothing. When the monitors were asked to do so, many of them simply left to open entrepreneurial schools of their own or to take jobs in other fields. The ones who stayed demanded salaries like the other teachers.

 

The FSS leaders were in a bind. Their budget was based on the free labor of monitors. They could not afford to pay more teacher salaries. In 1818, they wondered if they might import a traditional English solution. In October, 1818, the FSS board of trustees considered a plan to turn their young teachers—the monitors—into formal apprentices. The financial benefits would be enormous. 

 

According to the long tradition of apprenticeship, young people bound by indentures could be forced to work without salary until they turned twenty-one. In exchange for learning the art and mysteries of the trade of teaching, these young apprentices would be legally bound to serve the FSS for free until they aged out. 

 

The trustees created a model indenture form, one that would bind teachers until they turned twenty-one,

 

during all which time the said apprentice shall faithfully serve said [Free School] Society, obey their lawful commands and the lawful commands of such teacher and teachers and their agent or agents . . . for that purpose.

 

Beyond saving money on salaries and guaranteeing a committed, if temporary, workforce, the traditions of apprenticeship would have allowed the board to exert control over all elements of young teachers’ lives. As was common in apprenticeship agreements, the FSS indenture agreement forbade apprentices from having sex, getting married, gambling, drinking, or attending plays. Most important, the agreement legally prohibited apprentices from leaving.

In exchange, the FSS offered food, housing, and clean clothes. When the apprentice teachers aged out, they would be granted a leaving bonus—a parting gift of cash, with the amount to be determined later by the FSS. 

 

Temporary enslavement like this might seem shocking these days, but it had been a common practice among teachers in England. In 1813, for instance, Joseph Lancaster took on young men—never women—to serve as his teaching apprentices in his thriving London school. 

 

As was common in apprenticeship systems, parents often heartily supported the “masters.” One father told his grumbling son and Joseph Lancaster in 1813, 

Your master may dispose of you in any way he pleases—he has my hearty concurrence beg leave again to repeat—Sir do with my Child as you see fit my liberty and blessing you have For ever.

 

The power to deal with instructional staff in any way they pleased was enormously tempting for New York’s early school leaders. In the end, however, New York’s FSS trustees decided against turning their employees into apprentices. They realized that their young teachers would not likely accept the plan and could veto it, in practice, by simply walking out. 

 

The lessons of the early 1800s resonate with our teacher-pay politics today. The pressures on any school budget have always been intense. There is never enough money for every program and every need. When underpaid teachers threaten to walk away, school leaders have always considered desperate plans to fix desperate financial problems.

 

How did President Reagan Deal with Violent Radicals In His Own Party?

 

President Trump’s reaction to the tragedy in New Zealand was very much in line with his take on Charlottesville—a wink and a nod accompanied by a few ambiguous platitudes. That Brenton Tarrant, the terrorist who perpetrated the massacre, would explicitly acknowledge Trump as a “symbol of White identity and common purpose” comes as little surprise. After all, Trump the candidate urged supporters to beat demonstrators at his rallies. The violent participation of explicitly neo-Nazi groups like the Rise Above Movement at pro-Trump rallies around the country has also been met with a conspicuous silence from the White House, as has the rise of racist violence globally.

Scholars of terrorism have long understood that there is a connection between the perception that national leaders condone or tacitly support violence and the decision of perpetrators to move from dreams of slaughter to the reality of cold blooded murder on the grandest possible scale. 

The tragedy in New Zealand and the series of church and synagogue shootings here bring to mind an earlier day and a very different president. In the 1980s, President Ronald Reagan entered the White House as the first explicitly pro-life president. To the rescue movement, the radical fringe of the pro-life community, the news was electric, and to some, a sign of Divine grace. Until then, the deeply religious rescue movement had followed Operation Rescue—the first national rescue organization, led by Randall Terry—in pledging non-violence at clinic-level demonstrations. 

From the staid confines of the annual White Rose Banquet to the increasingly acrimonious confrontations at abortion clinics, the talk was of the new President and how he was with them, even if he could not say so openly.

The rescue movement soon became bolder, holding massive demonstrations in cities across the country. Violent confrontations with police soon followed and the movement’s faithful—white, middle and working class churchgoers all—became acquainted with the realities of urban jails and the less than gentle ways police maintained order in many of these lockups.

Newer and more militant groups soon emerged with names like the Lambs of Christ and Missionaries to the Preborn. At the same time, demonstrations gave way to attacks, first on property as clinics were firebombed or rendered inoperable with chemicals like butyric acid poured through locks and keyholes. 

The violence turned deadly with the murder of Dr. David Gunn in Pensacola, Florida in 1993. Other killings followed. In my interviews with members of the rescue movement and my fieldwork at clinic-level demonstrations, it became clear that the turn to lethal violence reflected a disillusionment with America, which many protesters had come to identify with Nazi Germany based on abortion statistics and the perceived support of the nation’s leadership for the rule of law over that of conscience.

Much of this disillusionment occurred when President Reagan strongly condemned attacks on abortion clinics. On January 4, 1985, for example, the Washington Post reported that the president “condemned such violence ‘in the strongest terms,’ and ordered an all-out federal effort to find those responsible for the ‘violent, anarchist activities.’”

The rescue movement could understand the political necessity of issuing statements against abortion violence, but unleashing the federal government against the movement sent a clear signal that violations of the law would not be tolerated. The federal effort was eventually successful in crushing the rescue movement when the Justice Department under the Clinton Administration used RICO statutes [the Racketeer Influenced and Corrupt Organizations law, which had been designed for the Mafia] to effectively drive rescuers from the streets.

By contrast, under President Trump the rule of law is useful only when he deems it personally convenient. Given Reagan’s actions aimed at bringing the rescue movement to heel, it is safe to assume that he would never have tolerated either the extremists at Trump rallies or the Alt-Right extremists who perpetrate violence. Nor would President Reagan have had the slightest hesitation in expressing both sympathy and empathy for the Muslim victims of the murders in Christchurch.  

In the end, what is left is the methodical workings of the law and the election cycle. Many years ago, the great historian Arthur Schlesinger noted the cyclical nature of American politics. The country goes through phases of activism, which unleash populist and leftist extremes, and phases of retrenchment, which produce a tentative unity and a period of economic acquisitiveness. Recent election results have brought a Democratic majority with an investigative fervor to Congress. Trump policies from the border wall to the withdrawal from Syria are being increasingly challenged by both parties. 

Moreover, the second Unite the Right rally, in August 2018, fizzled amid a wave of disgust over what had occurred in Charlottesville. Two dozen marchers turned up, only to be dwarfed by hundreds of counter-demonstrators and a strong police presence.

All signs point to the wisdom of Schlesinger’s observations. Americans may for a time be entranced by the bromides of the populists or the voices of racism and division, but it soon passes and a twenty-first century version of Reagan’s ‘Morning in America’ will invariably follow. 

Is the Western World Declining and Russia Rising?

 

Is the western world declining and Russia rising? Yes, according to Glenn Diesen’s 2018 book The Decay of Western Civilization and Resurgence of Russia (hereafter Decay). Such a conclusion might come as a surprise, but it was a common one among nineteenth-century Russian nationalists. But paraphrasing Mark Twain’s reputed comment about his falsely-reported death, we might say that the report of the Western world decaying—either in the nineteenth century or today—is “greatly exaggerated.” 

 

The main reason that Diesen thinks “the geoeconomic foundations for the West's primacy for the past 500 years is ending” is that the West has overemphasized the individual, rational, impersonal, and contractual to the detriment of community, which places more value on the irrational, instinctive, and spiritual.  (“Western civilisation prospered by embracing liberalism and rationalism, yet condemned itself to decay by self-identifying solely by these values.”)  In reaction to this overemphasis, Western right-wing populism has emerged, spurred on by resentment toward globalization and immigration and encouraged by the likes of Donald Trump and France’s Marine Le Pen.

 

The West’s modern-day ills, according to Decay, also reflect the influence of cultural and political postmodernism, which “produces devastating nihilism by discarding traditional identities, cultures, nations, religions, the family units, genders, and civilisation itself as arbitrary . . . . Tolerance is corrupted by being translated into an embrace of cultural and moral relativism.” Diesen also believes that identity politics has converted the melting-pot ideal into that of the “salad bowl.” Unlike historian Carl Degler, who viewed the latter term favorably because it did not reject diversity and pluralism, he links it with “divisive society,” the politics of “victimhood” and evaluating equality as equal outcome rather than equal opportunity. Emphasizing “responsibilities towards sub-groups” rather than state loyalty “undermines the rule of law, legitimacy of government, and democratic rule.” 

 

Diesen’s criticism of identity politics fails to acknowledge, as Jill Lepore has indicated, that such “politics, by other names, goes all the way back to the founding of the Republic.” And “the identity politics of slaveholders and, later, of the Klan, and of immigration restrictionists, had been the work of more than a century of struggle—for abolition, emancipation, suffrage, and civil rights.” Although some fault might be found with the identity politics of recent decades for diluting “Rooseveltian liberalism” (Lepore’s view),  Diesen’s simplistic critique lacks context and nuance and undervalues the virtues of pluralism.

 

He goes on to opine that Western decay has sped up since the end of the Cold War. As globalization and new technology “centred on communication, automation, and robotics” have quickened, so too have growing inequality, “loss of civility,” political polarization, immigration, and hostility toward immigrants.

 

He also believes that Russia is rising, partly because it has “repositioned to the heart of Eurasia,” which has allowed it to reduce its reliance on the West, “while concurrently increasing dependence by others on Russia.” But also because it has balanced economic development with “culture and traditions to address the innate and imperishable in human nature, and to maintain a distinctive identity in a globalising world.” 

 

Although he barely mentions nineteenth-century Russian nationalist thinkers, Diesen’s ideas are similar to theirs in many ways.  Many of them, including Dostoevsky, thought that the West (by which they generally meant Western Europe and the United States) overemphasized rationalism, individualism, and money-making. Conversely, Russia was more religious and appreciative of non-rational elements and a spirit of community. 

 

In the mid-nineteenth century, the historian M. P. Pogodin wrote that the USA was “no state, but a trading company” that “cares solely for profit . . . She will hardly ever bring forth anything great.” In his book Russia and Europe (1869) the botanist Nikolai Danilevsky claimed that European civilization was not the only type of civilization and that there were no universal values, but that different historical-cultural types existed and that Europe and Russia belonged to two very different types. Western Europe, he believed, was decaying and an emerging Slavic civilization led by Russia was the great future hope. 

 

In his Winter Notes on Summer Impressions (1863), Dostoevsky criticized the West for its individualism and materialism. Later, in an 1880 speech he gave on the Russian poet Pushkin, he suggested that Russia might aid the West in helping it to regain a more spiritualistic basis for society.

 

Diesen states that Russia offers the possibility of doing something similar today. He asks, “Will Western decadence result in completely eviscerating the West as a civilisation . . . or does Russia intend to assist in rejuvenating the traditional and spiritual in the West?” In a December 2013 address President Putin declared that “today, many nations are revising their moral values and ethical norms,” destroying “traditional values,” accepting “without question the equality of good and evil.” 

 

Shortly after this speech, Diesen notes, American conservative Pat Buchanan wrote “Is Putin One of Us?”  Buchanan suggested that in “his stance as a defender of traditional values” Putin is very much in tune with U.S. conservatives.  He added, “Peoples all over the world, claims Putin, are supporting Russia's ‘defense of traditional values’ against a ‘so-called tolerance’ that is ‘genderless and infertile.’ . . . Putin is not wrong in saying that he can speak for much of mankind.” Later, columnist William Pfaff wrote that “the resemblance of President Putin's ambitions for his Russia to those of the neoconservatives in the contemporary United States bear a striking formal resemblance.”

 

According to Diesen, Russia’s appeal to Western conservative “populists” is its “bold and unapologetic commitment to preserve traditions, national culture, Christian heritage, national identity, and the family structure.” Besides Trump and France’s Le Pen, the author mentions other leaders, as well as Western parties, who generally share “a remarkable degree of empathy towards Russia and the belief that they have a common cause.” The group includes the Brexit-advocating UK Independence Party, the Alternative for Germany (AfD) party, and right-wing populist leaders who have come to power in Hungary, Poland, the Czech Republic, and Austria. In a late 2018 interview, Diesen reaffirmed this view of Russia as a defender of traditional values, stating that “Russia has returned to its pre-communist role as the go-to country for Western classical conservatives.”

 

Despite some nineteenth-century Russian nationalists’ predictions of Western decline, few people in the western world before World War I (WWI) believed the West was declining. Prior to 1914, nine western European countries controlled over four-fifths of the earth’s lands, and the United States had also expanded the lands it controlled by annexing Puerto Rico, the Philippines, and Guam and making Cuba a protectorate. In his memoir, The World of Yesterday, the Austrian Stefan Zweig wrote of the widespread late-nineteenth-century belief in progress—e.g., electric lights, telephones, automobiles, improved sanitation and medical treatment, expanded voting, justice, and human rights, reduced poverty, and even the hope for more peace and security.  

 

WWI, however, changed such optimism. Over 15 million Europeans, soldiers and civilians, lost their lives in what seemed to many a senseless war. France regained some territory it had lost to Germany in the Franco-Prussian War of 1870-71, but 3 of every 10 Frenchmen between the ages of 18 and 28 paid for the gains with their lives. In addition, the peace treaties that followed the war were unsatisfactory to many, especially in Germany, which helped give rise to Hitler. The mere title of the German Oswald Spengler’s Decline of the West (1918-1922) was just one of many signs that a fundamental change had occurred in Western confidence.

 

The rise of communism in Russia, with Stalin eventually succeeding Lenin (d. 1924) as the Soviet leader; the rise of Italian fascism in the 1920s; the worldwide Great Depression of the early 1930s; Japanese aggression in Manchuria; and Hitler’s assumption of power in 1933 further sapped confidence in Western progress. Membership in western communist parties increased during the Depression, and not a few Western intellectuals, often fooled by Soviet propaganda, believed that more hope lay in communist Russia than in the capitalist West.

 

But, as happened before and has happened since, Western decline was only temporary. From 1933 until 1945 President Franklin Roosevelt (FDR) brought the United States out of the Depression and led a coalition of powers that defeated Germany, Japan, and other nations. In 1955, one of the twentieth century’s most astute political philosophers, Britain’s Isaiah Berlin, had this to say about the 1930s:  “The most insistent propaganda in those days declared that humanitarianism and liberalism and democratic forces were played out, and that the choice now lay between two bleak extremes, Communism and Fascism . . . . The only light that was left in the darkness was the administration of Roosevelt and the New Deal in the United States. At a time of weakness and mounting despair in the democratic world Roosevelt radiated confidence and strength. . . . Mr. Roosevelt’s example strengthened democracy everywhere.”  

Contrasting today with late 1932, when Depression gloom was at its height, it is difficult to believe the Western world is now in worse shape than back then. Is a Russia led by an opportunistic Vladimir Putin, allied with conservatives who have groveled to the likes of Donald Trump, really one of the world’s greatest future hopes? Can the West still not produce leaders of FDR’s caliber? Can it still not reinvigorate values that once helped make it great like freedom, democracy, equality, social justice, and tolerance? 

History has demonstrated too many zigs and zags, stops and starts, setbacks and advances for anyone to predict the future with assurance. The nineteenth-century Russian exile Alexander Herzen was closer to the truth than Danilevsky, Spengler, Diesen, and others who thought they could foresee the future. In an essay on the Utopian socialist Robert Owen, Herzen wrote that “nature and history are . . . ready to go anywhere they are directed,” and “having neither program, nor set theme, nor unavoidable denouement, the disheveled improvisation of history is ready to go along with anyone. . . . A multitude of possibilities, episodes, discoveries in history and nature, lies slumbering at every step.” 

The Spies' Marathon before Patriots Day

The Boston Marathon

 

Long before there was the Boston Marathon, a couple of British spies in Massachusetts stumbled into one of their own. Captain John Brown and Ensign Henry De Berniere were sent by British General Thomas Gage on a long journey to survey the countryside outside Boston in February 1775. Gage was planning to move troops against the rebellious colonists, but needed intelligence on roads and towns for the dangerous mission. Things were tense between the armed Massachusetts colonists and the occupying British forces in Boston.

Brown and De Berniere disguised themselves to look like the typical Massachusetts resident of the time, in brown clothes and reddish handkerchiefs. Their mission was on foot through Eastern Massachusetts. These spies had their own colonial version of the Boston Marathon, traveling through Suffolk and Worcester counties. They needed lots of carbohydrates.

So Brown and De Berniere, along with their servant John, stopped to eat at a tavern in Watertown early in their route. They of course wanted to go unnoticed, finish their meal, and rest for the night. But when the restaurant's waitress kept eyeing them “very attentively,” this was not a good sign for the spies. The two men tried to make casual conversation with the waitress, remarking on the fine land of Massachusetts. The waitress replied, “So it is, and we have got brave fellows to defend it, and if you go up any higher you will find it so.”

Uh-oh. That is the word spies never want to say while on duty. Getting recognized a few minutes into your mission is a tough start for an undercover operation. De Berniere wrote in his account, “This disconcerted us a good deal, and we imagined she knew us...” They conferred with their servant John, who overheard that the waitress had indeed recognized the British officers. Their mission was compromised and they were in danger. They decided not to go back to Boston, though, because they would look foolish.

The British spies did cancel their plans to stay overnight at the tavern. It was not clear if they left a tip for the waitress. Brown and De Berniere continued on to a safer inn. They eventually made it all the way to Worcester “very much fatigued” from their journey. But as they continued on their “secret” tour of Massachusetts, more people were taking notice of them. They did keep collecting information from all the towns.

On the way back to Boston a classic Nor'easter arrived, adding snow and rain to their obstacles. De Berniere wrote “it snowed and blew as much as ever I see it in my life.” Welcome to Massachusetts! They walked very fast, fearing pursuit. This was a race for their lives. De Berniere wrote that they were exhausted “after walking thirty-two miles….through a road that every step we sunk up to the ankles, and it blowing and drifting snow all the way.”

Gage should have given them a medal for their marathon spy mission when they returned to Boston. He received good intelligence from his officers. They showed courage and endurance under tough conditions. In March he even sent the pair on a sequel to gather intelligence on a town called Concord. We know what happened next. On April 19, 1775, the first shots of the Revolutionary War were fired at Lexington, followed by the fight at Concord Bridge.

We celebrate Patriots’ Day to commemorate the battles of Lexington and Concord. The Boston Marathon is run on that day. Hopefully, not in the snow that Brown and De Berniere endured.
Patriots’ Day can also be a celebration of the peaceful relations between Britain and the U.S. after many years of war. The building of that peace was a marathon in itself, spreading over many decades and treaties. That is something we should be proud of, and we can hope it stands as a symbol of peace for all nations.

If you are running the Marathon, or exercising on your own, you can use the Charity Miles app to raise funds for the World Food Program or Save the Children. This will help bring food and comfort to children who are living in war zones. The Boston Marathon is a great race and can rally support for this and many other social causes. Happy Patriots' Day!

The Israeli Elections and What We Can Learn from History

 

The results of the Israeli elections that took place on April 9 have shown that history could be a good guide to assess political processes, but not necessarily to predict how events will unfold. 

On two previous occasions, a center-left coalition was able to defeat the center-right in Israeli elections. 

In 1992, the Labour Party, headed by former Chief of Staff of the Israeli Defence Forces (IDF), Itzhak Rabin, managed to defeat the governing Likud Party, headed by Itzhak Shamir. 

In 1999, another former Chief of Staff of the IDF, Ehud Barak, heading a political alignment centered on the Labour Party, secured a comfortable election victory against the incumbent Prime Minister, Benjamin Netanyahu. 

Many in Israel thought that these two historical precedents might be repeated in the elections Tuesday. 

The logic of the argument ran like this: For any political challenge mounted by the center and center-left against the governing Likud Party to succeed, a former military leader would have to lead it. In a country beset from its inception by security-related threats, the aura of a distinguished military career could be a vote-winner. 

The above could be proved empirically. After all, both in 1992 and in 1999, the governing Likud Party lost to two former generals, heading a center-left coalition.

However, in 2019, a newly created, centrist political grouping, Blue and White, led by a former Chief of Staff of the IDF, Benny Gantz, and a triumvirate of leading figures, two of them also former Chiefs of Staff of the IDF, Moshe “Bogy” Yaalon and Gabi Ashkenazi, was defeated at the polls by Benjamin Netanyahu and his Likud Party.

Why?

To answer this question we would need to stress, first and foremost, the singular circumstances surrounding each electoral event. 

For instance, in 1992, Rabin was thought to be a very experienced politician, having already served as Prime Minister between 1974 and 1977, and as Defence Minister between 1984 and 1990. In 2019, Gantz had no political credentials at all. He had served in no ministerial post, nor had he even been a Member of the Knesset (the Israeli parliament). 

Further, whereas Rabin in 1992 faced a serving prime minister, Shamir, who many considered uncharismatic, Gantz in 2019  wanted to unseat Netanyahu, one of the most charismatic leaders Israel has ever known. Shamir might have been respected by his followers; Netanyahu was adored by them. 

In 1999, Barak, the most decorated military leader in Israel’s history, wanted to unseat a young prime minister, who, after having served for three years as prime minister, was seen as inexperienced and rather unsuccessful: Netanyahu. 

Certainly, what the examples of Rabin (1992) and Barak (1999) demonstrate is that a rather hawkish platform, coupled with a distinguished military career, can help a center-left candidate in securing political support among center and center-right voters in Israel. 

Beyond that, voting in Israel follows, more often than not, a deep-seated sociological trend. Most Israeli Jews of European descent tend to vote for center and center-left parties; most Israeli Jews of North-African and Asian descent are inclined to vote for center-right parties. The more affluent an area in Israel, the more its residents tend to vote for a center or center-left candidate. Certain cities in Israel are identified with either the center-left or the center-right: thus in these recent elections, cities like Tel Aviv voted overwhelmingly for opposition parties, whereas Jerusalem voted mostly for center-right and religious parties. 

Therefore, apart from the personal and political characteristics of the leaders involved, and the concrete circumstances in which they have operated, to understand the Israeli electoral process one has to delve more deeply into Israeli society and the way it has evolved.

In Israel, one can learn from history by assessing these more profound sociological processes, while being careful not to reach unequivocal conclusions from similar events that have taken place in Israel’s history. 

Roundup Top 10!  

 

 

The Man Who Saw Trump Coming A Century Ago

by Ann Jones

A Reader’s Guide for the Distraught.

 

2 Minutes and Counting

by Oliver Stone and Peter Kuznick

Crises that seemed contained not long ago have now spiraled out of control—and the prospects for resolving them peacefully look depressingly bleak.

 

 

Harvard's Communist Uprising, 50 Years Later

by Daniel Pipes

That takeover and bust culminated my political education.

 

 

‘Not a racist bone in his body’: The origins of the default defense against racism

by Christopher Petrella and Justin Gomer

The rise of the colorblind ideology that prevents us from addressing racism.

 

 

What Donald Trump Doesn’t Get About George Washington

by Peter Canellos

“If he was smart, he would’ve put his name on it. You’ve got to put your name on stuff or no one remembers you.”

 

 

Anti-vaxxers are comparing themselves to Holocaust victims — who relied on vaccines to survive

by Helene Sinnreich

The comparison is offensive. It’s also historically wrong.

 

 

People Used to Hate the Electoral College for Very Different Reasons

by Justin Fox

A half-century ago, the House voted to replace the Electoral College with a direct vote and the Senate came close. The arguments made then are enlightening.

 

 

How Trump finally turned Republicans against McCarthyism

by Jonathan Zimmerman

After nearly 70 years, Republicans have stopped defending Joe McCarthy.

 

 

The End of the American Century

by George Packer

What the life of Richard Holbrooke tells us about the decay of Pax Americana.

 

Why the Second Summit Between Donald Trump and Kim Jong Un Failed

 

The second summit between U.S President Donald Trump and North Korean leader Kim Jong Un in Hanoi failed due to a relatively unknown quantity: North Korea’s domestic politics.  

 

Despite the passage of more than sixty years since the armistice that halted the Korean War, the U.S. and North Korea seemed closer to peace than ever before. As evidenced by the cases of post-World War Two Germany and Japan, and communist Vietnam, the U.S. is no stranger to making former enemies into friends. However, few have asked how this thaw in hostilities affects North Korea’s messaging to its own people. It is too politically risky for Kim Jong Un’s regime to have friendly relations with Washington.

 

As an autocratic regime, North Korea remains politically stable by limiting the flow of information into and out of the country. This information blockade has allowed the Kim family regime to stay in power longer than the Soviet Union ever existed. North Korean propaganda has strategically positioned the “U.S. imperialist bastards” as the forever enemy of the Korean people since 1950. Propaganda posters feature North Korean missiles hitting the U.S. capitol and hook-nosed U.S. soldiers brutally massacring Korean peasants. There is a museum in Sinchon, North Korea, dedicated to U.S. atrocities during the Korean War. Despite the fact that many of these atrocities are made up by the North Korean propaganda apparatus, schoolchildren take regular trips to this museum for political education purposes. At festivals, shooting games feature portraits of U.S. soldiers as targets and children practice bayonetting U.S. soldiers at recess.

 

The U.S. boogeyman serves as a mobilizing force for the Kim family regime and inspires the North Korean people to endure harsh living conditions for the good of the nation. The North Korean people boast about their country’s nuclear program and its ability to defend them against the much stronger and larger U.S. military. As North Korean propaganda explains, once the U.S. leaves the Korean peninsula, the golden era of Korea will begin. Peaceful reunification with the South and a strong economy awaits those who now sacrifice for the fatherland.

 

However, can the Kim family regime continue its brutal ways without the U.S. boogeyman? As the 19th-century military theorist Carl von Clausewitz explained, “primordial violence, hatred, and enmity” are a “blind natural force” in conflict and primarily motivate the people. Take this hatred away and what does a militaristic state such as North Korea have left to mobilize its people? The manufactured fear of a U.S. invasion has left North Korea in a near-warlike state for decades and permitted the hereditary dictatorship of the Kim family. This siege mentality and anti-American sentiment will not vanish once Trump and Kim sign communiqués. It is too politically dangerous for the North Korean leadership.

 

With more peaceful relations between Washington and Pyongyang, a lifting of sanctions and an increase in living standards for the North Korean people would likely occur.  This would most likely boost domestic support for the Kim family regime. However, the information blockade and authoritarianism of North Korea’s political system would continue. In the future, who does Pyongyang then blame for its troubles?

 

Recently, there have been signs in Pyongyang that anti-American propaganda is fading away. Much of this seems to be a spectacle for foreign journalists and tourists visiting the North Korean capital city. The regime cannot wholly remove anti-Americanism from the national consciousness without the fear of domestic instability. Once the U.S. boogeyman is gone, the people’s passions and grievances would turn toward the North Korean government that has kept its people impoverished and literally in the dark for the last several decades. The North Korean leadership cannot have that.

 

While friendlier U.S.-North Korea relations are less dangerous for the whole world, neither the autocratic nature of the Kim family regime nor its many human rights abuses should be forgotten. As long as the Kim family regime remains in power, anti-Americanism and human rights violations will continue north of the DMZ. It is the only path North Korea’s leadership knows and can permit. Rapprochement with the United States is too politically dangerous for Kim Jong Un’s regime.

The Long History of Anti-Immigration Legislation and "Crimes Involving Moral Turpitude"

 

In light of the Supreme Court’s recent ruling on the Nielsen v. Preap case, a return to the history behind “crimes involving moral turpitude” and the concept’s unique relationship with immigration exclusion and deportation is useful. As many have noted, the Court’s ruling that immigrants who committed crimes and served their sentences (in some cases years or decades ago) can still be detained for deportation without bond hearings raises questions about the constitutionality of indefinite detention. But, as Justice Ruth Bader Ginsburg noted, the ruling also concerns the lingering and vague definition of “crimes involving moral turpitude” and its role in immigration.

 

In her dissenting opinion, Ginsburg referenced the dangers of interpreting crimes involving moral turpitude (or CIMT) as a means to curtail immigration. Under the Illegal Immigration Reform and Immigrant Responsibility Act of 1996, CIMT is a wide-ranging category of activities including serious crimes such as rape, incest, murder, and larceny, but also more minor offenses like petty theft or, as Ginsburg stated, “illegally downloading music or possessing stolen bus transfers.” 

 

While Ginsburg used modern examples of small crimes in her dissent, “moral turpitude” has been a staple in immigration law for over a century and has rarely been questioned as a useful tool for exclusion and deportation. In fact, the concept of “moral turpitude” and the litany of crimes that are evidence of an individual’s lack of “good character” (a requirement for naturalization rooted in the Naturalization Act of 1790) has historically applied primarily to immigration law. When an immigrant commits and serves time for a crime, immigration officials and courts can interpret that crime as one of “moral turpitude,” which indicates depravity, immorality, recklessness, or maliciousness on behalf of the perpetrator. 

 

The idea of using morality to target specific migrant groups first appeared in formal immigration policy in 1875 when Congress used its plenary powers to create an immigration law to exclude Chinese women suspected of being “undesirable” or engaging in prostitution—a response to a growing wave of anti-Chinese sentiment. “Moral turpitude” as an explicit phrase, however, made its first appearance in the Immigration Act of 1891. By the late nineteenth century, “new” immigrants from southern, eastern, and central Europe (as well as Mexico and the Caribbean) were beginning to arrive in larger numbers, as they fled political, social, and economic upheaval in their homelands and sought job opportunities in the industrializing United States. Many Americans, however, were alarmed by the arrival of a “horde” of immigrants who were not white Anglo-Saxon Protestants. Politicians and nascent anti-immigrant associations argued that the new immigrants were prone to criminal behavior and liable to become public charges. In response, the 1891 Act listed classes of migrants who were unfit to naturalize and therefore unfit to enter or remain in the United States. The list included “idiots,” “insane persons,” the diseased, “paupers,” polygamists, and those who had been convicted of a felony, misdemeanor, “or other infamous crime or misdemeanor involving moral turpitude…” 

 

Congress continued to pass laws that solidified the idea that immigrants were held to a higher degree of morality than most American citizens. In response to more radical political and social movements including socialism, anarchism, and labor organizing during the early twentieth century, the Immigration Act of 1907 expanded the list of offenses that qualified for exclusion. The 1907 Act banned “persons who have been convicted of or admit having committed a felony or other crime or misdemeanor involving moral turpitude; polygamists, or persons who admit their belief in the practice of polygamy, anarchists, or persons who believe in or advocate the overthrow by force or violence of the Government of the United States, or of all government, or of all forms of law, or the assassination of public officials; persons coming for immoral purposes…” 

 

The Immigration Act of 1917 further consolidated the groups of “undesirables,” but added specific references to “constitutional psychopathic inferiority” and “abnormal sexual instincts” as reasons for exclusion as well as deportation. The 1917 Act allowed for immigration inspectors and other immigration officials to deny entrance to and call for deportation of confirmed and suspected homosexuals who fell under these categories at the time or committed sodomy, a “crime of moral turpitude.” Throughout the early twentieth century, “undesirability” and “crimes involving moral turpitude” reinforced one another and served as a means to expand immigration exclusion and deportation at a time of heightened nativism and xenophobia. 

 

On June 25, 1952, President Harry S. Truman issued his veto of House Bill 5678 (or the McCarran-Walter Act), a proposal to amend the United States’ immigration policies. Truman not only insisted that the bill did little to address the discriminatory quotas targeting migrants who did not hail from Western European nations, but he also argued that it made it easier to deport immigrants who might be an asset to the United States during the Cold War rather than a threat or liability. Under HB5678, refugees fleeing the Soviet Union or those already living within the United States could be excluded or deported if “convicted of a crime involving moral turpitude” beyond a “purely political offense.” 

 

The bill’s provision for excluding and deporting immigrants based upon their ties to subversive practices and organizations is not surprising considering Cold War tensions. However, the bill also made “crimes involving moral turpitude” usefully vague during a time when many politicians and government officials argued that the U.S. was under constant threat from Soviet influence. But Truman believed that America’s reputation as a beacon of freedom for the tired, poor huddled masses would be challenged by an unnecessarily harsh immigration policy. The bill was poised to criminalize immigrants whom the U.S. should be sheltering rather than excluding. 

 

“We have adequate and fair provisions in our present law to protect us against the entry of criminals,” Truman wrote in his veto memo. “The changes made by the bill in those provisions would result in empowering minor immigration and consular officials to act as prosecutor, judge and jury in determining whether acts constituting a crime have been committed.” With such leeway in interpreting crimes of moral turpitude, the power to discriminate against immigrants while denying their rights could fall into misguided hands.

 

Despite Truman’s objections, Congress overrode the President’s veto and HB5678 eventually became the Immigration and Nationality Act of 1952. Under the 1952 Act, an immigrant who committed a “crime involving moral turpitude” within five years of their admission to the U.S. was deportable (those who committed two or more CIMTs were deportable regardless of their admission date). The Immigration Act of 1996, however, added the ambiguous language on when immigrants can be detained after being released from custody and also allowed for local law enforcement agencies, state-level officials, and immigration judges to interpret CIMTs broadly.

 

Truman’s warnings against criminalizing immigrants and allowing crimes involving moral turpitude to be interpreted widely ring true today under President Trump’s administration. Moral turpitude is a phrase that has a long legacy of being used for exclusionary and discriminatory practices in deciding who is and is not fit to be an American. Perhaps it is time for a more careful examination of this component of American immigration policy in light of the questions over detention and due process.

A Response to Rebecca Spang's "MMT and Why Historians Need to Reclaim Studying Money"

 

 

Historian Rebecca Spang’s latest History News Network piece on MMT and history is both timely and thought-provoking. In addition to its biting critique of economic orthodoxy and other valuable insights, the essay sets into relief a productive ontological debate about money and its historical manifestations. Part of the present breakdown of the neoliberal consensus, the insurgent popularity of MMT in contemporary discourse has enlivened conversations about the nature of money and its role in shaping social life. As Spang rightly claims, this discourse requires historical context. As such, I welcome and applaud Spang’s intervention. However, I also wish to underscore some crucial differences between Spang's vital work (particularly on the French Revolution and the rhetoric of inflation) and the historical work being done within the MMT movement. 

 

In her concluding paragraph, Spang spells out how in her opinion, history proves the relevance of MMT to today’s politics. She writes: “MMT, along with the euro crisis and awareness of austerity’s social effects, has done much to open monetary and fiscal debates to wider audiences. Simply recognizing that money is political and historical (central, as Harvard Law Professor Christine Desan likes to say, to how a polity constitutes itself) is a difficult breakthrough for most people. On the other hand, seeing money in this way doesn’t—in a fractured polity characterized by demagoguery and high levels of inequality—make policy any easier to write or implement.” The opening of this paragraph is spot on, especially as Spang connects MMT with Desan’s constitutional history of money, a history that insists upon a legal foundation for monetary relations. 

 

(Shameless plug: we at the Modern Money Network (MMN) created an awesome episode for the Money on the Left podcast with Desan last year.)

 

Her concluding paragraph, however, also reveals a difference between her and many MMTers. More specifically, many following MMT’s insurgence in D.C. disagree with her conclusion that MMT doesn’t make policy “any easier to write or implement,” given the fractured, unequal and demagogic nature of this political moment. This is the case for a few reasons. One is that a central theme of MMT’s political and financial project is the introduction of a non-zero-sum rhetorical framework for legislative and social finance. As noted MMT economist Stephanie Kelton has repeatedly argued, MMT frees the Left from relying on rich people’s money. Instead, she argues that the left should mount its case for confiscatory taxation on moral, rather than budgetary grounds. As well, MMT can change the perception that “taxpayer money” (often code for white people’s money) is what funds welfare and jobs programs for the disenfranchised, as MMT lawyer and legal scholar Raúl Carrillo has written. In insisting that fiscal allocations be labeled as “public money,” Carrillo and others challenge flawed neoliberal notions of money as not only private and scarce but also inherently white. All in all, this makes policy easier to imagine and implement because we can focus on what needs to be done rather than how it would be funded and who would oppose it. Instead of the zero-sum contests that presume “there’s no free lunch,” MMT says free lunches for everyone, as long as the food is producible! 

 

Perhaps more important, Spang’s argument about the current political fracture in America betrays tacit assumptions that MMT's understanding of money seeks to problematize. In her excellent book, she argues that money represents a sort of performance of our ongoing social bonds. À la Judith Butler, Spang writes that money is “not fixed or made once and for all but something that exists thanks only to its repeated enactment (not one interpellation but a whole series of them).” Furthermore, she claims that “monetary transactions are therefore characterized by what we might call ‘involuntary trust’—a trust itself resulting from involuntary, even unconscious, memory.” (6) Putting a Butlerian twist on Enlightenment social contract theory, Spang defines money as a process of ongoing consent between issuers and users, as well as buyers and sellers, one which is malleable and contestable. 

 

I take a different approach and think some other MMTers do too. From an MMT perspective, money is an asymmetrical and ongoing legal obligation between government and society and not “involuntary trust” among creditors and debtors. Take, for example, Scott Ferguson’s 2018 book Declarations of Dependence: Money, Aesthetics, and the Politics of Care (Ferguson, like Carrillo, serves on the board of the Modern Money Network). In the book, he argues for money’s inalienable public nature. “A political relationship between centralized governments and people, money, according to MMT,” Ferguson writes, “is an inalienable utility ever capable of expansion and reconstruction. Money obliges the public to a political center, socializing productive and distributive processes rather than organizing them locally and privately.” (184) Rather than being an ongoing form of trust in a credit relation, as Spang argues, Ferguson claims that money is always a centralized political mechanism for provisioning asymmetrical and reciprocal public obligations. In other words, money actualizes the polity’s indebtedness to its governing authorities as well as those authorities’ indebtedness to their polity.

 

Instead of imagining a polity as always-already connected in its participation in the public money relationship, Spang’s conception of money as ongoing consent leaves politics attempting to unite, through consent, a polity imagined as unrelated or fractured. Throughout history, such projects sadly often take the form of imagining some other universal, like culture, race, nationality, or sometimes (if we are lucky) liberal consensus, under which those who are fractured can become one again. As Spang’s book argues about the rise of Napoleon, that sort of social wrangling often takes on an authoritarian color.

 

However, when one brings together these two seemingly opposed ontologies of money, one begins to see a new place for consent in money. Spang’s work traces the crises of political contestation and authority during the French Revolution among both a government and a polity sharing in economic and legal obligations. It is for precisely this reason that her documentation of the rise and fall of the Assignats (the revolutionary paper currency) demonstrates the problem with her own Liberal assertion of consent as the initial basis of money. Instead of the crises being caused by political fracture, they were caused by the assumption that consent would be enough to cement the revolution. Therefore, in not recognizing the tax obligation as a prime factor in the maintenance of money’s social role, revolutionary France created precisely the fracture Spang laments today. 

 

This insight allows us to bring together these two ontologies of money to cast the contemporary rise of MMT in a particularly interesting light. MMT introduces democracy (public consent) into the budgetary process. Rather than rely on the economists who used to tell us that we can’t afford to care for people, MMT gives us the precise tools to do so. Therefore, our initial relation through money allows us to mobilize our consent for progressive or leftist ends.

 

For these reasons, I laud Spang’s call to develop and complicate MMT’s approach to history. Still, I argue that it is equally important to problematize historians’ unquestioned ontological assumptions about money and legal mediation through the MMT framework. If historians take this up in earnest, we might finally be able to overcome our austere imagination of money’s role in history.

The Golan Heights: Its History and Significance Today

A UN-controlled border crossing point between Syria and Israel at the Golan Heights, Wikipedia Commons

 

Donald Trump’s decision to recognise Israel’s sovereignty over the Golan Heights might have serious repercussions for the future, but to understand exactly why, and how, it’s necessary to first understand the past.

                     

The Golan Heights is an elevated plateau stretching across some 932 square miles that shares a border with Lebanon, Syria, and Jordan. Several Jewish communities unsuccessfully tried to settle in the southern parts of the region during the late nineteenth century, and the pre-independence Zionist movement claimed the area at the post-World War I Peace Conference of 1919. Rather than emphasising a historical connection to the area, the movement’s leaders chose to stress its strategic importance as a barrier against invading forces from the east (citing Bedouin tribes as the main threat) and as a site of reliable irrigation sources for a potential Zionist polity. With the creation of the mandate system in the early 1920s the area was eventually placed under French custodianship, and it became part of Syria when that state was established in 1946.

  

Following the 1948 War, Israel and Syria battled over border delineation, diversion schemes for the Jordan River, and control of the Sea of Galilee. Artillery fire, infantry raids, and aerial dogfights became routine along the border during the 1960s, causing considerable casualties and widespread destruction of property. The Golan became heavily militarised and was riddled with bunkers, minefields, and outposts. At the same time, it was also a bustling province with a population that in 1967 reached almost 150,000 civilians who lived in 270 villages and towns. The population’s majority (85%) were Arab Sunni Muslims and the rest belonged to a variety of ethnic groups such as Circassians, Turkmen, Maronite Christians, Bedouin, Alawites, Isma’ilis, and Druze.

 

Despite the area’s civilian life, the Syrian Plateau came to represent a threat to Israeli Jews. Significant pressure came from residents of the Kibbutzim (small agricultural communities) in northern Israel, which at the time held significant political power. They demanded that the “Syrian plateau”, as it was colloquially known then, be pacified. Syria played a minor part during the June 1967 war, but Israeli leaders decided that Syria’s involvement in the run-up to the war provided a pretext to occupy the Golan. The Golan’s occupation thus became part of an expansion strategy propelled by the perception (real or constructed) that Syria would continue to use the Golan to launch artillery attacks on Israeli settlements. This fear-fuelled geopolitical “Israelification” of the Golan was enacted through three main tactics.

 

First, the Israeli military displaced the Syrian population who remained in the territory after the fighting and prevented the return of civilians who fled during the war. Only 6,500 people from the Druze sect, which was historically considered “loyal” to Israel, were permitted to remain. The rest were either forced to leave or were not allowed to return to their homes. Second, the Israeli military, with supervision by archaeologists, architects, and rural planners, systematically demolished villages, farms and houses. All structures were to be demolished except for those with architectural, archaeological or aesthetic significance.

 

The population’s displacement and the remaking of the landscape through demolition enabled the third tactic: recreating the Golan as a tourist haven with a cool climate, open spaces, fertile soil, wineries, and attractions such as Mount Hermon, Israel’s only ski resort. Most Israelis do not live in the Golan (the total Jewish population of the region is about 20,000) but rather visit it: between 1.5 and 3 million Israelis visit every year. In the words of Israeli journalist Chemi Shalev, during the years in which the Golan Heights has been in Israel’s possession, it has become “more Israeli than Israel itself. An ideal version as we would like it to be: without Palestinians but with marvellous views, delicious wines, sympathetic residents, horses, crocodiles and ski sites.”

 

In 1981 Israel extended its civilian law to the Golan Heights, a move regarded as de-facto annexation. The decision was part of the attempt to secure Israel’s territorial conquest of June 1967 at a moment when the then prime minister, Menachem Begin, was about to give back the Sinai Peninsula to Egypt as part of the peace agreement between the two countries. The extension of civilian law to the Golan was a political decision made to garner support for Begin from the Israeli right, who fumed over the withdrawal from Sinai.

 

Similarly, Trump’s decision to recognize Israeli sovereignty over the Golan is an attempt to bolster Benjamin Netanyahu, who is facing an uphill electoral battle in the coming elections in April. Both in 1981 and now, the pledge to the Golan was intimately connected to popularity contests and not to any overall confidence of Israel’s leaders about the unquestioned “Israeliness” of the Golan. Indeed, Israeli policymakers have refrained from admitting that the Golan was officially annexed and, from the 1990s up to the outbreak of civil war in Syria in 2011, probed the possibility of giving back the territory. Thus, when Prime Minister Netanyahu lauds President Trump for recognising Israel’s sovereignty over the Golan, it goes against Israel’s strategy of leaving the question of the Golan’s formal status shrouded in opacity.

 

Of course, the general sentiment that the Golan is supposedly part of Israel predates Trump’s recognition. Jon Stewart’s The Daily Show is a case in point. A segment covering Israel’s 2014 military campaign in Gaza presented a map which did not encompass the Palestinian West Bank and Gaza Strip but depicted the Golan Heights as an integral part of Israel. The fact that Stewart’s team chose this map is, however, hardly coincidental, since many existing maps have erased the line separating the Golan. Can we blame Donald Trump, if even Jon Stewart “recognises” Israel’s control of the Golan?

 

Nonetheless, Trump’s recognition should have generated more attention than it got, especially because his earlier decision to recognise Jerusalem as Israel’s capital triggered violent protests and worldwide condemnations. Unlike Jerusalem, the connection of the Golan to Zionism’s historical and theological narratives is peripheral at best and never played a dominant part in the reasons for controlling it. Besides, the area was never part of Mandatory Palestine and was internationally recognised, including by Israel, as part of Syria. Much more than Jerusalem, the Golan represents a clear case of a territory belonging to one sovereign state that is illegally occupied by another in clear violation of international law.

 

So, what does the relative quiet with which Trump’s recognition has been accepted demonstrate? The silence can be attributed to the announcement’s timing, as it coincided with (another) violent escalation between Israel and Gaza and, perhaps more critical, the headline-grabbing completion of the Mueller Report. The brazen flouting of international law also utilised the tragedy of the Syrian civil war as cover. There is almost no one left in Syria to challenge Israel’s claim to the Golan (although we can be sure that Putin and Iran weren’t too pleased). At the same time, any Israeli policymaker who openly supports an attempt to cede the Golan risks being branded as a lunatic. Fear mongering has always been a potent political currency in Israel, and the notion of ISIS or Hezbollah taking over in Israel’s absence has significant public gravity.

 

But as much as Israelis would like to claim that the Golan is needed for its strategic value or for its bucolic scenery, this cannot change the fact that there is no legal case for annexing the territory. Security threats cannot serve as a legitimate reason for annexing the territory of another nation. The position of some legal experts that the territory rightfully belongs to Israel since it was occupied as an act of self-defence stands on shaky ground. It was Israel that invaded Syria, which, until the moment of the incursion, did not play a significant part in the actual fighting of the June 1967 war. Moreover, a new historiography of the June war suggests that Israel's decision to take over the Golan was part of a strategy developed years before the actual occupation. Trump’s decision is simply another validation of one of the most successful land grabs of the twentieth century, one based on displacement, demolition and the paradoxical transformation of the Golan into a peaceful warzone. A piece of “Europeanesque” tranquillity in the heart of the Middle East’s most volatile region.

 

And this brings us to the question of what we can learn from the past for the future of the Israeli-Palestinian conflict. Some have suggested that Trump’s move is, in fact, a way to sweeten the bitter pills that Israel will have to swallow once his ostensible “deal of the century” is published. Trump, so the rumor goes, will force Netanyahu, or whoever is Israel’s Prime Minister, to accept a Palestinian state, to relinquish land and evacuate settlers. But so far Trump, like the Golan, has proved to be more Israeli than Israel itself. The re-imposing of sanctions on Iran and the recognition of Jerusalem point to his complete lack of reservations about realising Israel’s wildest dream, which now might become even wilder. This is especially true since the Golan is already developing into a generic model for what the West Bank should “become.” Put differently, the Israelification of the Golan, which entailed massive population displacement, spatial demolition and European rebranding, has now become a battle-tested template for what annexation could look like in the West Bank.

 

Yet we should also note that Trump’s blatant flouting of international law in Iran, Jerusalem, and now the Golan is working at the moment, but it might also indicate America’s weakening position, as it can no longer claim to lead an international community that opposes the notion that military might should translate into political rights. When Britain and France tried to take over the Suez Canal in 1956, a scheme that also involved Israel’s active participation and occupation of the Sinai Peninsula, they were forced by a new world order to back down, ending their time as colonial empires. While it is hard to draw immediate comparisons, Trump’s move might put America on the same path. In other words, Trump’s Golan might eventually become America’s Suez.

Senator Chuck Schumer says corporations used to care; here's how historians responded

Allen Mikaelian is a DC-based editor and writer. He received his history PhD from American University and served as editor of the American Historical Association’s magazine, Perspectives on History. The Political Uses of the Past Project collects and checks statements by elected and appointed officials. This is the second installment of what will hopefully become a regular feature of the project. Read more about the project here.

 

Sen. Chuck Schumer: American corporations used to believe they "had a duty not just to their shareholders but to their workers, to their communities, and to their country"

When more than 80 percent of corporate profits are going to stock buybacks and dividends, something is really wrong in the state of corporate America and the state of our economy. It wasn't always this way. From the mid-20th century up until the seventies and even into the eighties, American corporations shared a belief that they had a duty not just to their shareholders but to their workers, to their communities, and to their country, which helped them grow and prosper, along with our schools, our roads, and everything else. That created an extremely prosperous America for corporate America but also for American workers in the broad middle of this country. But over the past several decades, workers' rights have been diminished, and corporate boardrooms have been obsessed, slavishly, to shareholder earnings. —Sen. Charles Schumer, Stock Buybacks, Senate Floor, February 4, 2019

Historians say...

Bottom Line: Most historians who responded agree that Senator Schumer is on solid ground, but their caveats and the statements of the historians who strongly disagree should not be ignored, especially if we want to use this history to help formulate policy. Scroll down for links to the historians' full responses.

Senator Chuck Schumer delivered the above statement while discussing the Republican tax cuts; he charged that corporations are not using their tax savings to create jobs or pay higher wages, but are instead buying up their own shares. This can drive up stock prices by creating scarcity, and shareholders naturally love it. But the GOP's tax cuts were granted, we were told, to create jobs, not merely to further enrich investors.

Schumer proposes legislation to force corporations to do good—investing “in workers and communities first”—before they can buy their own stock. And to set the stage for his proposal, he points to a past in which American corporations had a heart. Maybe that history makes his idea seem not so radical. Or it raises hopes that maybe we don’t have to be in constant battle with corporate America. That maybe our expectations for more socially responsible corporations aren’t so unreasonable. Or perhaps even that the CEOs want to do the right thing but have to be legislated into it.

Regardless of why Schumer decided this piece of business history was a “useful past,” most of the historians who answered our request for input thought Schumer was on solid ground. However, we should not overlook their caveats or the dissents of historians who disagreed with Schumer’s view of history; these are perhaps more deserving of policymakers’ attention, if they really want to learn from the past.

Summary

Several historians responded by mentioning the stakeholder model that captured at least the imaginations, if not the actions, of many mid-twentieth century executives: “Two competing models of corporate ownership through stocks were evident in the twentieth century: shareholder and stakeholder. The former model asserts that the leaders of corporations must make decisions based solely on the best interests of people who actually own stocks, while the latter maintains that other interested parties like workers and their communities have an interest in corporate actions equal to those of shareholders” (Jason Russell). “Earlier in the twentieth century, some management scholars such as Peter Drucker argued that corporations had different stakeholders, including the community, employees, and consumers” (Gavin Benke). “They've always cared about the bottom line, but back then felt compelled to consider the needs of multiple ‘stakeholders’” (David B. Sicilia).

This was, of course, easier to do when the economy was booming. The strength of unions was also a factor—they were relatively harder to ignore (Jonathan Bean)—and higher taxes made large investments in infrastructure possible (Rosemary Feurer). All this started to change at least by the 1970s (Benjamin Waterhouse; Jason Russell pegs it to the 1960s). And with this change came a large-scale shift in thinking.

Milton Friedman argued in the New York Times in 1970 that a corporation’s sole responsibility is to “increase its profits,” giving permission and intellectual heft to executives who, in the midst of globalization and declining profits, wished to focus on shareholders rather than stakeholders. Friedman was objecting “to a very real sense, both within and beyond business leadership circles, that corporations had a clear social responsibility” (Benjamin Waterhouse), but Friedman did not limit his thinking to profits and business culture: “He argued that ‘the cloak of social responsibility ... does clearly harm the foundations of a free society’” (David Hochfelder). And even further, he accused executives who took up social responsibility of “preaching pure and unadulterated socialism” and being “unwitting puppets” of the collectivist left.

Schumer may be on solid ground, but if we pay close attention to these historians’ caveats and to the historians who think he is dreaming of a “golden age” (Jonathan Bean), we might ask whether the CEOs who preached social responsibility were leading the charge or merely reflecting what the public expected and what legislation demanded. “Corporations thought in wider terms about stakeholders because regulations compelled them to do so” (Jason Russell). And insofar as some corporations “may have felt a sense of civic duty” and others contributed to the public good, “they did so to comply with the much more progressive tax code at the time” (David B. Sicilia). Schumer is right to offer legislation at the same time he speaks of a now-distant past when corporations did the right thing, but his case would be stronger if he made note of how corporate virtue had to be cajoled by legislation.

Schumer also leaves out a key aspect of the history of corporate responsibility, one that most of the historians here take up. Corporations had to contend with strong unions. This helped reinforce stakeholder responsibility when the morality of CEOs failed. The demise of unions was no accident, and it was not coincidental to the demise of corporate responsibility. While even the historians who agree with Schumer mention unions, the historians who disagree move unions to the center of the discussion.

Specifically, the rise of so-called right-to-work states lured corporations to the south and the sunbelt, where their stakeholder responsibilities were far less (Jonathan Bean). Rosemary Feurer has questions about this: “Ask textile communities in the North how much corporations cared about the devastating effect of relocating. Ask African-Americans in Detroit how many of their jobs, newly won, were lost to corporate decisions of automakers to relocate jobs to the South.” She also has questions for Schumer, questions that could be turned into policy, if we are serious about returning to a time when corporation responsibility received at least lip service: “And ask Schumer what the Democratic Party did to stop this in this time. Instead, the party at the time sought to grow the economy without any intervention in this dynamic, despite the attempt of unions to gain some control over these relocations.”

In the past, corporate responsibility had much to do with Congress building strong guardrails. But it did not involve Congress alone. It was not merely big government legislating big business. It was also citizens and groups like unions that forced the issue. Schumer’s job will likely be easier if his attempt to blunt the harder edges of capitalism and return to stakeholder values also protects these groups of citizens, the actual stakeholders themselves.

Browse and download citations recommended by the historians below from our Zotero library, or try our in-browser library.

 

Jonathan Bean, Professor of Business History, Southern Illinois University

Rating: 1.5

This is the myth of a golden age of "corporate liberalism." While it is true that during that time period (circa 1945-1970s), CEOs were more likely to espouse a belief in "stakeholders" (beyond shareholders), it was mostly public relations. Read more...

Gavin Benke, Boston University, author of Risk and Ruin: Enron and the Culture of American Capitalism

Rating: 3.5

Earlier in the twentieth century, some management scholars such as Peter Drucker argued that corporations had different stakeholders, including the community, employees, and consumers. However, it would be wrong to look back on the mid-twentieth century as a period without discord in American business. Read more...

Rosemary Feurer, History Department, Northern Illinois University. Coauthor, with Chad Pearson, of Against Labor: How US Employers Organized to Defeat Unions

Rating: 1.7

The main explanation Schumer gives for the postwar period is a myth. He seems to suggest that there was less concern for profits in this period, that CEOs cared more about their workers and community. He makes it seem a moral or personal decision, rather than acknowledging the key factor—unionization tamed some of the rapaciousness of capitalism in this period and created a middle class. Read more...

David Hochfelder, Associate Professor, University at Albany, SUNY, author of The Telegraph in America, 1832-1920

Rating: 3.9

Schumer’s statement is more or less true. Corporations often felt some obligation to their workforces and communities from the late 19th to late 20th centuries. Electric utilities had employee and customer stock ownership plans. Industrial firms provided health clinics, built parks and schools, financed mortgages for workers, etc. Read more...

Jason Russell, PhD, Empire State College—SUNY, author of Making Managers in Canada, 1945–1995: Companies, Community Colleges, and Universities (Routledge, 2018)

Rating: 4.8

One important point is that corporations thought in wider terms about stakeholders because regulations compelled them to do so. For example, laws like the Glass-Steagall Act, the Wagner Act, and the Fair Labor Standards Act established certain parameters for corporations. Read more...

David B. Sicilia, Henry Kaufman Chair of Financial History and Associate Professor, University of Maryland, College Park. Coauthor or coeditor of six books on business and economic history, including Constructing Corporate America: History, Politics, Culture

Rating: 3.6

Sen. Schumer’s comment captures the spirit of an important transformation in the second half of the 20th century but should not be taken too literally. His statement centers on a claim about motive (“a shared belief that they had a duty”) that is difficult to prove. But corporate behavior, especially toward workers and communities, certainly changed when and how Sen. Schumer suggests. Read more...

Benjamin C. Waterhouse, Associate Professor of History, University of North Carolina at Chapel Hill, author of Lobbying America: The Politics of Business from Nixon to NAFTA (2014) and Land of Enterprise: A Business History of the United States (2017)

Rating: 4.6

Broadly speaking, Schumer’s claim reflects the way business historians summarize changes in attitude among corporate managers and leaders. Naturally, it is impossible to say precisely what “corporate leaders” believed at any point, because that group is large and reflects many different opinions. Read more...

James Madison Responds to Sean Wilentz

 

Sean Wilentz, Sidney and Ruth Lapidus Professor of the American Revolutionary Era at Princeton University, just announced in a New York Times op-ed that he retracted his earlier opinion on the origin of the Electoral College.  In NO PROPERTY IN MAN: Slavery and Antislavery at the Nation’s Founding, published by Harvard University Press in September 2018, Wilentz concluded “the evidence clearly showed the Electoral College arose from a calculated power play by the slaveholders.” Now Professor Wilentz asserts he was mistaken. “There is a lot wrong with how we choose the president. But the framers did not put it into the Constitution to protect the South.”

 

If I understand Sean Wilentz's new position on the origin of the Electoral College, it, like slavery, was an undemocratic element of the new Constitution endorsed by writers from the North and South who feared slave insurrection, democratic insurgencies like Shays' Rebellion, and popular government, who represented slave states (there was still slavery in most of the North) or commercial interests tied into the slave trade, and probably got a slaveholder elected President in 1800, but historians shouldn't conclude that they considered that the Electoral College, like the 3/5 clause, the fugitive slave clause, and the ban on banning the slave trade for 20 years, might protect slavery.

 

I don’t consider myself equipped to debate either the earlier or later positions taken by Professor Wilentz, but I thought James Madison might be, so I decided to consult his Notes of the Constitutional Convention.

 

Hugh Williamson, representing North Carolina, seems to have first introduced the idea of an Electoral College in discussion of an Executive on June 2, 1787. On Wednesday July 25, 1787, the Constitutional Convention debated a series of proposals for selecting a national “Executive.”

 

Oliver Ellsworth of Connecticut moved “that the Executive be appointed by the Legislature." Elbridge Gerry of Massachusetts, who later refused to sign the Constitution, argued that “an election at all by the Natl. Legislature was radically and incurably wrong; and moved that the Executive be appointed by the Governours & Presidents of the States.” James Madison of Virginia noted that “There are objections agst. every mode that has been, or perhaps can be proposed. The election must be made either by some existing authority under the Natil. or State Constitutions — or by some special authority derived from the people — or by the people themselves. — The two Existing authorities under the Natl. Constitution wd be the Legislative & Judiciary.” Madison opposed the judiciary and legislative options as  “liable to insuperable objections.” According to Madison, “The Option before us then lay between an appointment by Electors chosen by the people — and an immediate appointment by the people. He thought the former mode free from many of the objections which had been urged agst. it, and greatly preferable to an appointment by the Natl. Legislature. As the electors would be chosen for the occasion, would meet at once, & proceed immediately to an appointment, there would be very little opportunity for cabal, or corruption.” Ellsworth’s motion that the Executive be chosen by the national legislature was then defeated by 4 to 7 with only New Hampshire, Connecticut, Pennsylvania, and Maryland voting in the affirmative.

 

Charles Pinckney (South Carolina), George Mason (Virginia) and Elbridge Gerry supported a motion to have the Executive selected by the Legislature as long as “no person be eligible for more than 6 years in any twelve years.” Gouverneur Morris of Pennsylvania spoke in opposition and insisted on “election by the people as the best, by the Legislature as the worst.”

 

The idea of an Electoral College was reintroduced by Pierce Butler, a South Carolina rice planter, one of the largest slaveholders in the United States, and one of slavery’s strongest defenders. Butler also introduced the Fugitive Slave Clause into the Constitution, supported the constitutional provision prohibiting regulation of the slave trade for twenty years, and demanded that the entire slave population of a state be counted for Congressional apportionment.

 

According to Butler, “The two great evils to be avoided are cabal at home, & influence from abroad. It will be difficult to avoid either if the Election be made by the Natl Legislature. On the other hand, the Govt. should not be made so complex & unwieldy as to disgust the States. This would be the case, if the election shd. be referred to the people. He liked best an election by Electors chosen by the Legislatures of the States.”

 

The issue of selecting an Executive was then referred to a special Committee of Eleven, also known as the Brearly Committee. On September 4, the Brearly Committee reported its recommendation that “Each State shall appoint in such manner as its Legislature may direct, a number of electors equal to the whole number of Senators and members of the House of Representatives, to which the State may be entitled in the Legislature.” Gouverneur Morris explained the committee’s reasoning: “No body had appeared to be satisfied with an appointment by the Legislature,” “Many were anxious even for an immediate choice by the people,” and “the indispensable necessity of making the Executive independent of the Legislature.” Pierce Butler defended the recommendation, judging “the mode not free from objections, but much more so than an election by the Legislature, where as in elective monarchies, cabal faction & violence would be sure to prevail.” The motion was then put on hold while the committee considered objections, not to the selection of the Executive, but to the process for removal.

 

On September 6, the edited Brearly Committee report was brought to the convention again. Alexander Hamilton, who had a strong “dislike of the Scheme of Govt. in General,” announced that “he meant to support the plan to be recommended, as better than nothing.” After continued debate and some amendments, the Convention accepted Hamilton’s recommendation that it approve the Brearly Committee’s plan for the organization of the Executive branch, including the Electoral College, because the final document was “better than nothing,” and the Constitution was submitted to the states for approval.

 

Madison’s notes do not definitively prove either Wilentz’s earlier or his later position on the relationship between support for the Electoral College and defense of slavery. What I find most suggestive in the debate is the role played by Pierce Butler, one of the Convention’s greatest champions of slavery. The Electoral College may not have been expressly designed only to protect African slavery, but based on Madison’s notes, it was the mode most preferred by pro-slavery forces.

Venezuela and the Birth of the American Empire

 

John Bolton, President Donald Trump’s national security advisor, took to Twitter recently to disparage the regime of Nicolás Maduro. In particular, Mr. Bolton vowed that the detention of opposition leader Juan Guaido’s chief of staff would “not go unanswered.” Roberto Marrero’s arrest was deemed “illegitimate,” and Bolton echoed the president, who has repeatedly warned Venezuela that “all options are on the table” concerning the conflict over that nation’s recent elections. This, the president confirmed, included the possibility of a military intervention. 

 

Maduro’s two biggest backers, China and Russia, have invoked the Monroe Doctrine in order to disparage the US’s efforts to oust Maduro and his United Socialist Party. Ted Galen Carpenter, writing in National Interest, also invoked the Monroe Doctrine but in a positive light, arguing that the US needs to invoke the old policy in order to curb Moscow’s foothold in Latin America.

 

With so many politicians and analysts invoking the Monroe Doctrine, you would think at least one would correctly understand the history of this controversial piece of American foreign policy. The inaccuracies permeating the recent analysis suggest Americans need a refresher on the nearly 200-year-old document.

 

First announced in 1823 during the administration of President James Monroe, the so-called Monroe Doctrine declared that the United States intended to protect its “sister republics” in Latin America from further European imperialism, specifically Spanish imperialism. Monroe and Secretary of State John Quincy Adams (the policy’s true mastermind) let Madrid, Lisbon, and Paris know that any return of European rule in Latin America would be viewed by Washington as “the manifestation of an unfriendly disposition towards the United States.” 

 

The policy sounded tough, but was essentially toothless. The American Navy in 1823 had just sixteen vessels of war, five of which were deployed in the West Indies. This force could not deter any serious naval armada, so the British Royal Navy, which enjoyed trading relations with several Latin American nations, became the enforcer of the policy. The Monroe Doctrine was as much a British policy as it was an American one, and for the majority of the nineteenth century, it benefited the British Empire more than it benefited the American Republic.

 

That changed in 1895. The reason? A border conflict between Venezuela and British Guiana. By 1895, the US fleet included fifty-five warships in total, with three brand-new battleships commissioned that same year. While nowhere near the strength of a first-class European fleet, the US Navy was by that point a force to be reckoned with. The British found that out when London refused to accept international arbitration on the long-simmering dispute over the Schomburgk Line. The Venezuelan government demanded territory as far east as the Essequibo River. The colonial officials in British Guiana, recognizing that this would strip them of about two-thirds of their territory, countered these claims by demanding 33,000 square miles of Venezuelan territory west of the Schomburgk Line (so named after the German-born explorer Robert Schomburgk).

 

For nineteen years, between 1876 and 1895, Caracas petitioned the United States to intervene on its behalf against the British. It took a new US Secretary of State, Richard Olney, to finally grant Venezuela’s wishes. Olney sent a letter to Thomas Bayard, the American ambassador to Britain, demanding that London settle the dispute by arbitration. Olney invoked the Monroe Doctrine to declare that the United States, which had “greatly increased in power and resources” since 1823, had an interest in protecting the status quo in the Western Hemisphere. British Prime Minister Lord Salisbury responded by telling Olney that international law did not recognize the Monroe Doctrine.

 

Rather than turn tail and save America from the possibility of fighting the world’s premier navy, President Grover Cleveland, a Democrat and a firm believer in limited government and even more limited US involvement abroad, sent the issue to Congress. Congress met specifically to talk about the formation of a boundary commission. However, behind the scenes, a few American wives of British statesmen (including Mary Chamberlain, wife of Joseph Chamberlain) told their husbands that Congress could declare war. Their urging, along with a growing crisis with the Boer republics in South Africa, convinced London to back down. By October 1899, the issue had been resolved, with the United States declaring that the border should follow the Schomburgk Line.

 

By that point the United States had a full-fledged empire in the Caribbean and Asia. Following the impressive victory against the Spanish in 1898, American troops occupied Cuba, Puerto Rico, Guam, and the Philippines. In the span of four years, America had gone from saber rattling on behalf of arbitration to overseeing what amounted to the British Empire in miniature. In 1904, the Roosevelt Corollary gave serious muscle to the Monroe Doctrine, adding that the US military now had a responsibility to police the Western Hemisphere. President Woodrow Wilson, a hated enemy of Theodore Roosevelt, would nevertheless echo his predecessor’s policy by adding to the Roosevelt Corollary an interest in promoting “good governance” in Latin America.  

 

Under this idea, small contingents of US Marines and sailors would occupy Cuba (1906-1909, 1912), Haiti (1915-1934), the Dominican Republic (1916-1924), and Nicaragua (1909, 1912, 1927-1932). American military governments established order, balanced the books, and tried to depoliticize Latin America’s ever-restive militaries. Under President William Howard Taft, the US also promoted dollar diplomacy, whereby US loans were leveraged in order to promote the ascension of pro-American leaders to power in Latin America.

 

None of these developments would have happened had the American government not managed to get the British to deescalate matters in Venezuela in 1895. This legacy is controversial to say the least. American occupation tended to promote improved public hygiene and free and open elections. Veracruz, Haiti, and Nicaragua all benefited from US intervention. However, Richard Olney’s descendants also left behind dictators like Rafael Trujillo and Anastasio Somoza Garcia, and by the time of the Great Depression, the United States had become the “Colossus of the North,” an imperial power which gobbled up Latin American resources and raw materials with a voracious appetite. Detractors of Washington’s current approach to the crisis in Venezuela reference this imperial legacy when they invoke the Monroe Doctrine.

 

It remains to be seen what President Trump’s administration will do in Venezuela. Could it all be bluster, or is there a possibility of another Banana War-like invasion in the same vein as Grenada in 1983 or Panama in 1989? Either way, it is clear that the Monroe Doctrine needs to be seriously studied rather than bandied about by both anti-American interests and war hawks. Simply put, the Monroe Doctrine was a complicated piece of foreign policy that genuinely attempted to protect the Western Hemisphere from Spanish revanchism. The policy changed once the United States became an economic and military power.

 

Today, it would appear sound not only to study the history of the Monroe Doctrine, but to study the traditions of realpolitik in general. For the first time in a long time, a non-American power (Russia) has sent troops and materiel into a Latin American country in order to prop up an anti-American government. This does directly challenge American hegemony in the region, and now the question to the US is this: do you follow the traditions set forth by the Roosevelt Corollary to the Monroe Doctrine, or do you follow a new approach not constrained by historic precedent?

The Propaganda Posters That Won The U.S. Home Front

 

In 1917, James Montgomery Flagg created his iconic Uncle Sam poster encouraging American men to join the war cause with the clear message, “I want you for the U.S. Army!” as the U.S. ramped up preparations to enter World War I. Even though this was not the first instance of propaganda posters being employed on behalf of a war cause, the visual medium proved to be effective in the military’s recruitment drives, and posters were routinely used to boost morale, encourage camaraderie, and raise esprit de corps. Posters were cheap and easily distributed, and they fostered a sense of patriotism and duty. In World War II, the U.S. turned to artists once again in an attempt to influence the public on the home front. Today, these posters offer a glimpse into American society and the efforts to mold public opinion in the country.

 

Rolled out on a massive scale in World War I, posters only grew in popularity as propaganda in World War II. With the surprise attack on Pearl Harbor in 1941, the U.S. began mobilizing once again, but not just militarily. The U.S. government leveraged hundreds of artists across the country to deliver important messages through visual means. This included some relatively famous artists such as the creator of Aquaman, Paul Norris, whose sketches were noticed by his superiors during his time in the military. The artists’ designs were not just focused on the rank and file of the military either. The Office of War Information (OWI) believed that the ‘home front’ was just as sensitive to enemy misinformation, and went to work creating a series of posters specifically focused on the population back home as the engine of the war effort in Europe and the Pacific.

 

The posters ranged widely in messaging and design. Even though quite a number of posters in the U.S. carried xenophobic or downright racist messaging and visuals, the majority centered on themes of tradition, patriotism, duty, and honor. This was further expanded on the home front with themes such as conservation, production, work ethic, buying war bonds, tending to “victory gardens,” encouraging women in the labor force, and cementing a common enemy in the eyes of the American public.

 

A Common Enemy Emerges

 

 

Several U.S. propaganda posters employed a tactic known as demonization. This involved portraying the enemy as barbaric, aggressive, conniving, or simply evil. Demonization included derogatory name-calling, with terms such as “Japs,” “Huns,” and “Nips,” among others. Several posters in the U.S. tapped into demonization by showcasing the Japanese with overly exaggerated features and by recycling racist and xenophobic personifications.

 

 

This was often paired with messaging such as one anti-Japanese poster which portrayed Emperor Hirohito rubbing his hands and saying, “Go ahead, please take day off!” The tactic was clear: motivate the working population at home to avoid sick days through fear of an inhuman enemy planning an attack on the homeland at any moment.

 

Fear was a popular theme employed by artists, even with differing messages. In one poster, a giant Nazi boot is depicted crushing a small church with the language, “We’re fighting to prevent this.” Often, fear was utilized as a way to encourage the purchase of war bonds. Numerous posters portray children wearing gas masks or under the shadow of giant swastikas with clear messaging, “Buy war bonds to prevent this possible future.”  

 

 

Conservation and Production 

 

 

Some posters employed comedy as a way to break through, while at the same time tapping into the overarching fear of the enemy. For example, one poster, seemingly in an attempt to encourage carpooling, depicts an outline of Adolf Hitler riding shotgun with a commuter alongside the message, “When you ride alone, you ride with Hitler.” Others encouraged high production outputs by likening slacking off to aiding and abetting America’s foreign enemies. At the same time, others were more positive in nature, such as the famous Rosie the Riveter “We can do it!” poster, which encouraged women in the workforce.

 

 

Interestingly, many posters encouraged conservation and “victory gardens.” In an attempt to counterbalance rationing, the Department of Agriculture encouraged personal home gardens and small farms as a way to raise the production of fresh vegetables during the war. Some scholars, such as Stuart Kallen, believe that victory gardens contributed up to a third of all domestic vegetable production in the country over the course of the war. Posters espoused popular sentiments such as “our food is fighting,” “food is ammunition,” and “dig for victory.” Coinciding with this, posters also touted the benefits of canning with messages such as “of course I CAN” and “can all you can.”

 

 

Loose Lips Sink Ships

 

Perhaps one of the most fascinating themes propagated on the home front concerned misinformation and “loose talk.” Some scholars have speculated that this theme emerged out of fear of domestic spies and foreign intelligence operations within the U.S. Others, however, maintain that the U.S. intelligence services had shut down any foreign intelligence networks even prior to America’s involvement in WWII. Their claim is that these types of messages merely aimed to dispel rumors and prevent a loss of morale at home and abroad. Whatever the case may be, the government asked illustrators to discourage the population at home from casual chatter about troop deployments, movements, and any other sensitive information which could be “picked up” by enemy agents or propagated on a large scale. The phrase “loose lips sink ships” emerged thanks in part to the work of Seymour Goff. Goff’s poster depicts a U.S. boat on fire and sinking, with the words, “Loose lips might sink ships.” Similar messaging was prevalent in Great Britain and Germany as well.

 

 

Just as World War II was fought with bullets, boats, tanks and planes, the war at home was fought with information stemming from sources such as movies, radio, leaflets, and posters. Artists suddenly became soldiers in the fight to win the hearts and minds of the American public. Propaganda posters offer us an interesting insight into the objectives of the U.S. government and its wartime mission to create consensus at home.

 

The Secret Life of CIA Spymaster James Jesus Angleton

 

If you asked the average American to name a CIA agent, he or she would probably go blank. One might list one of the Watergate burglars: E. Howard Hunt, James McCord or Gordon Liddy. A few news junkies might be able to name the current CIA director, Gina Haspel, or her predecessor, Mike Pompeo (now secretary of state).

Almost no one, however, would be able to identify James Jesus Angleton, despite his key role as head of the agency’s counter-intelligence (CI) operations during the height of the Cold War. Angleton had a profound impact on the agency’s operating procedures during its formative years. A fervent anti-Communist, he was obsessed by the KGB and (he falsely believed) its constant attempts to plant agents in the CIA. Operating without hard evidence, he wrongly accused many CIA colleagues of disloyalty and ruined dozens of careers.

The Ghost, a new biography of Angleton by Jefferson Morley, a Washington journalist, provides an intriguing look at this powerful, enigmatic Cold Warrior. He earned the nickname “the Ghost” because he was rarely seen outside his high-security office, yet had a major impact on the agency’s strategy and tactics. 

James Jesus Angleton was born in Idaho in 1917. He grew up in a secure, middle-class family. His father was a successful businessman who owned the National Cash Register franchise in Italy. Angleton spent several years in Europe and became fluent in Italian and German. 

As a student at Yale, he founded a literary magazine, Furioso, which published a number of avant-garde poets including William Carlos Williams, E.E. Cummings, and Ezra Pound. He joined the U.S. Army in 1943 and was quickly assigned to the CIA’s forerunner, the Office of Strategic Services (OSS). (His father, then living in Italy, was already in the secret organization.) Angleton rose quickly; by the end of the war he was head of the OSS’s X-2 division (counter-intelligence) for Italy.

During the war, Angleton organized a number of secret missions and helped to round up hundreds of enemy agents. He built a reputation as a genius who could interpret enemy strategies and discern the true loyalties of individuals.

After the war, Angleton stayed in the army until he joined the newly formed Central Intelligence Agency in 1948. He was soon appointed head of the organization’s CI division, a job he held until 1974. In this position he supervised hundreds of agents around the world. As a fellow CIA officer recalled, he enjoyed the role of “Delphic Oracle.” He was “seldom seen, but frequently consulted.”

Angleton came to believe that the KGB had mounted a massive disinformation campaign designed to mislead the Western allies. As Angleton saw it, the split between Khrushchev and Mao, which culminated in the Soviet Union suspending all aid to China in 1961, was a carefully orchestrated deception – a ploy to persuade the West to lower its guard. 

He relentlessly pursued nonexistent KGB “moles” whom he believed operated at high levels in the governments of the U.S. and its allies. At various times, he falsely labeled as KGB operatives important figures such as Averell Harriman, U.S. ambassador to Russia and former New York governor, and two prime ministers, Harold Wilson of Great Britain and Lester Pearson of Canada.

He also ruined the careers of more than a dozen loyal CIA executives by accusing them, without any firm evidence, of working for the Soviets.  Because many of these men spoke Russian or had worked in the U.S. embassy in Moscow, their forced retirement critically weakened the CIA’s ability to collect intelligence on the Soviet Union.

Angleton’s obsession with uncovering spies in America led him to authorize one of the CIA’s first domestic spying operations, Operation Lingual. Beginning in 1955, all U.S. mail sent to and received from the Soviet Union was opened and copied at a secret facility just outside JFK Airport. This operation was never disclosed to Congress, and the FBI only found out about it by accident in 1957. FBI Director J. Edgar Hoover, angered at the invasion of his turf (domestic spying), said nothing publicly but demanded the agency share its findings.

A decade later, as student protests against the Vietnam War grew larger and larger, Angleton worked with the FBI to mount Operation CHAOS, which infiltrated the peace movement and surveilled many of its leaders. Angleton was convinced the KGB was inciting the protests with men and money. Still, it was a violation of the CIA’s charter, which prevented the agency from engaging in domestic operations against U.S. citizens.

In the wake of the Watergate trials and Nixon’s resignation, Congress began investigating FBI and CIA operations against American dissidents. On December 22, 1974, the New York Times published a story by Seymour Hersh that revealed the CIA’s extensive, illegal domestic spying operations. Angleton was mentioned by name and labeled an “unrelenting cold warrior” who had directed the operations. The CIA was deeply embarrassed by the revelations, and ordered Angleton’s immediate retirement. 

Morley’s account depicts a deeply troubled man. A chain-smoking workaholic and heavy drinker, Angleton would often arrive at the office at 10 and then retreat for a three-martini lunch. He would often work until two or three a.m. He rarely saw his three young daughters; his wife came close to divorcing him several times.

Unlike other senior CIA officers, Angleton never published a memoir. In retirement, he clung to his paranoid suspicions, defending them to former colleagues and to journalists in off-the-record interviews.

He died of lung cancer in May 1987, taking most of his secrets with him to the grave.

Angleton may have been ghostlike, but the agency he haunted was a highly structured, powerful organization that reported to the president. Morley’s biography falls short in providing context to Angleton’s rapid rise and sudden fall. Why was he allowed to condemn fellow CIA officers without evidence? Did the presidents he served under know what he was doing?

Angleton worked under six different administrations, those of Truman, Eisenhower, Kennedy, Johnson, Nixon and Ford. But Morley’s book barely mentions these leaders and their different uses of the CIA. While spy agency aficionados will have read recent CIA histories (e.g. Tim Weiner’s Legacy of Ashes), those who are new to the subject may wonder how this eccentric, paranoid man could wield so much power.

The Original Border Wall

Ramón Murillo, Soldado de Cuero, 1804. (Archivo General de Indias, Seville: Mapas y Planos, Uniformes, 81; Wikipedia Commons).

 

Spaniards responded to the unfolding story of the American Revolution with a mixture of trepidation and schadenfreude. Britain was Spain’s dangerous imperial rival. Britain had humiliated France and Spain in the French and Indian War. So Spaniards much enjoyed England’s crisis. But in 1775, the Count of Aranda, the Spanish Ambassador to Versailles, presciently warned the government in Madrid that whether the Thirteen Colonies secured their independence or not, “we must view them as a rising power born to subjugate us.” In due course, he believed, whatever the outcome of the war, this terrifyingly bellicose population of mostly English-speaking, largely Protestant, northern Europeans would march across America, their Anglo-Saxon sights firmly fixed on the silver mines of Mexico.

 

Aranda anticipated Manifest Destiny by three-score years and ten. He even identified the Red River as the likely invasion route, which would have led his imaginary army of alien heretics to the relatively prosperous community of 5,000 souls at El Paso, then famous for its aguardiente, literally “firewater.” Somewhere in New Mexico or Texas this phantasmagoric force would have been confronted by the first line of Spanish imperial defense, the garrisons of soldados de cuero, tough mounted border guards named for their leather armor.

 

In 1772, Charles III of Spain had ordered a major reorganization of the forts known as presidios and their garrisons of “leather-jackets” which were supposed to control and protect the northern reaches of the viceroyalty of New Spain (modern Mexico). The plan required a chain of presidios, each with its own well-equipped garrison, set one hundred miles apart. This “rampart” was to run from California to Texas following more-or-less the same line as the modern international border, with the exceptions of the outlying settlements in northern New Mexico and at San Antonio de Béxar, Texas. This was to be the frontier of Spanish occupied territory, beyond which lay Indian country which was nonetheless claimed by Spain.

 

Thus, King Charles’s Regulation of 1772 sought to establish an eighteenth-century Spanish predecessor to the contentious modern border wall. That ancestry is not without obvious irony, for the express purpose of this “Spanish wall” was, as Charles III stated, “to defend those borders and the lives and livelihoods of my vassals” in the modern Mexican border states of Sonora, Chihuahua, Coahuila, and Tamaulipas “from the barbarian nations” invading from the north. These “barbarians” were not hordes of British-Americans from the East, however, but Native Americans whose descendants are now US citizens. Northern New Spain had been devastated by highly mobile Apache and, to some extent, Comanche raiding parties from southern New Mexico and Texas. Their violence had caused large numbers of Hispanic settlers, mostly indigenous Mexicans or mixed-race castas who had emigrated from elsewhere in New Spain, to abandon their isolated farms and communities, causing massive depopulation and dislocation across the region.

 

Behind this immediate threat from Native Americans, the distant but ever-present menace posed by Britain and the Thirteen Colonies did indeed lurk. The expanding population of British America created westward pressure on the Indian population that expressed itself as Comanche and Apache aggression when it came up against the Spanish world. Moreover, these Indian raiders acquired weapons and ammunition from trading networks that originated in the Thirteen Colonies, although they bought some from traders in Spanish Louisiana, to the disgust of the viceroy in Mexico City. So, while the Spanish border “wall” was not conceived of as a defense against British or American invaders, it was a response to the consequences of American demographic growth and fast-developing commercial relations with Native Americans.

 

Moreover, as Aranda’s warning exemplifies, Spaniards believed that they would soon have to confront a direct British or American trespass on their territory. Before the Revolutionary War, that threat was most tangibly present in Louisiana (ceded to Spain by France in 1763), where British West Florida was just across the Mississippi. The government in Madrid sought to create a bulwark against British-American expansion by building a series of alliances with the different Native American tribes who controlled most of the territory on both sides of the Mississippi. In 1769, the acting military-governor of Louisiana, an Irish-born army officer called Alejandro O’Reilly, convened a great Indian Council at which he met with chiefs from almost all the tribes living within two-hundred miles of New Orleans. He smoked the peace pipe with them and listened to their professions of friendship. He extolled the benign but awesome power of Charles III and then hung gold medals bearing an image of the king around the necks of the nine most important chiefs.

 

 

Fears of a British or American invasion of Spanish North America convinced Spain to maintain an ambivalent approach throughout the Revolutionary War. Madrid gave the rebels just enough support to prolong the conflict in order to weaken both sides, until the Spaniards finally entered the war, not as allies of the rebels, but in order to secure control of the Mississippi and the Gulf Coast. That policy succeeded. In 1783, at the Peace of Paris, George III ceded both East and West Florida to Charles III, giving Spain control over the Mississippi and the Gulf of Mexico. 

 

The Peace of Paris also created a troubled and porous frontier with the newly independent United States along the Saint Marys River. In the Floridas, the disembodied specter of American invasion that had been invoked by Aranda manifested itself as real humanity on the ground. These were not the grand armies he had envisaged descending the Red River. In 1790, the new Spanish governor at Saint Augustine, Manuel de Zéspedes y Velasco, railed against “a species of white renegade, known as Crackers,” in a report sent to Spain. Their “wish to escape the authority of the law is so strong that they prefer to live in Indian... or Spanish territory rather than under the yoke of civilization.” They “are nomadic like Arabs and can be distinguished from savage [Indians] only by their complexions, their language, and the depraved superiority of their cunning and bad faith. As skilled as Indians when hunting, they will risk crossing great rivers on flimsy rafts, and can track man or beast through the thickest woods.” These “Crackers” trespassed across Spanish territory and occupied Indian lands, yet “far from opposing these land grabs,” Zéspedes complained, “the southern states of America encourage them, motivated by the desire to expand their frontiers and gain control over foreign lands.”

 

“America,” he might have said, “was not sending Spanish Florida its best.”

 

In 1785, the American general Nathanael Greene made a surprise visit to Saint Augustine, where he much enjoyed the liberal generosity of Manuel de Zéspedes’s table. Greene wrote his wife that perhaps “two hundred dishes of different kinds [were] served up in seven courses,” all washed down with a “variety of Spanish and... French wines.” After “five hours,” he confessed, “I was not unlike a stuffed pig.”

 

Greene did not visit Florida to have lunch, however enjoyable. Ironically enough, the ostensible purpose of his mission was to ask Zéspedes to help prevent Loyalist refugees from squatting and logging on Cumberland Island, Georgia, where Greene had recently acquired the property that gained renown as Dungeness. More ironic still, Zéspedes reported to Madrid that Greene had in reality come to Saint Augustine not to complain about Tory vagrants, but to tempt formerly British Floridians newly subject to the Spanish Crown to settle and work his new estate. Skilled tradesmen like the carpenter Thomas Stafford, later elected to the State Convention of Georgia, and his brother Robert Stafford did indeed answer Greene’s call.

            

Not only was America not sending Florida its best, it was stealing Florida’s best to boot. 

What I’m Reading: An Interview With Historian of Mexico Pablo Piccato

 

Pablo Piccato got his B.A. in History at the Universidad Nacional Autónoma de México in 1989, and his Ph.D. from the University of Texas at Austin in 1997. He is a professor in the Department of History at Columbia University, where he teaches on Latin America, Mexico, and the history of crime. His research focuses on modern Mexico, particularly on crime, politics, and culture. He has taught as visiting faculty in universities in Mexico, Argentina, Brazil, Italy and France, and has been director of Columbia’s Institute of Latin American Studies. His books include Congreso y Revolución (1991); City of Suspects: Crime in Mexico City, 1900-1931 (2001); The Tyranny of Opinion: Honor in the Construction of the Mexican Public Sphere (2010); and most recently A History of Infamy: Crime, Truth, and Justice in Mexico (2017), which won the María Elena Martínez Prize for the best book in Mexican History from the Conference on Latin American History.

 

 

What books are you reading now?

 

Reading fiction is a basic necessity for me, even when I am in the middle of research or teaching seasons. Right now I am reading The Star Diaries by Stanislaw Lem, in the Spanish version. Science fiction creates worlds that are possible. They can be removed in time and space but they have a connection with our present. In a way, all science fiction is about colonialism. Lem is a great critic of science and politics: he plays with the encyclopedic knowledge of those possible worlds, and makes fun of our ridiculous anthropocentrism. His scientists of the future try to intervene in history to bring the world closer to their idea of perfection, but fail miserably because of bureaucratic intrigues. I’m also going slowly through the second volume of Robert A. Caro’s biography of Lyndon B. Johnson, Means of Ascent. I admire Caro’s narrative drive, but also his lack of concern about how long it will take to get to fully understand his subject.

 

What is your favorite history book?

 

I have to think about several books that were important for me at different times: at the beginning of college, Charles Gibson’s The Aztecs under Spanish Rule showed me the power of old and neglected sources to reveal a social structure that survived conquest. I read it in Spanish translation, in a battered copy at my university’s library, and it made me want to become a historian of Mexico in the sixteenth century. Just before graduate school, William B. Taylor’s Drinking, Homicide and Rebellion in Colonial Mexican Villages showed me the value of judicial sources and the way in which transgression could be woven into the fabric of an apparently stable system of domination. When I was in Austin I discovered the richness of modern urban spaces and sociabilities through Judith Walkowitz’s City of Dreadful Delight and Margareth Rago’s Os prazeres da noite: two wonderful books about transgression and desire set in urban spaces where women challenged the privileges of the male gaze.

 

Why did you choose history as your career?

 

I’m not sure. I was finishing high school and I had to decide what career to follow at the Universidad Nacional Autónoma de México. I guess I applied to History instead of Philosophy because I felt I had to understand the reasons why I was there: my father had to leave Argentina because of the political repression of the mid-seventies and we all moved to Mexico. Both countries still represent the questions of history for me, one in short and recent episodes of crises, and the other as long-term processes of stability and transformation. In those years, Argentina went through a process of internal political fragmentation that led to a bloody military dictatorship. I discovered that Mexico had a rich pre-Hispanic and colonial history, a massive social revolution, and a regime that still welcomed exiles. Studying history may have been my way of honoring that complex combination in space and time that was Mexico City in 1982. I’m pretty sure I did not go into English Literature because I was afraid of spoiling the pleasure of reading fiction.

 

What qualities do you need to be a historian?

 

You have to be able to tell a story, but you also have to explain the past. Both require being attentive to the present. The reasons why people approach history are always changing, so the historian has to have one foot firmly planted on her circumstances, and to read other historians with that in mind. But she also has to be willing to go into the archive or the library or the interview and let the sources take her into unexpected places. You have to be meticulous in preparation but also be ready for surprises, and keep a good database with your notes because you never know how you are going to use the source you read today. Without a particular combination of this sense of wonder with a maniacal concern about detail, one cannot go very far as a historian. And you also need patience: to spend many hours reading sources that do not seem to be productive, to wait until your files are delivered by the archivist, to look for a book that seems to have disappeared from libraries.

 

Who was your favorite history teacher?

 

Again: different people come to mind at different times. A teacher of Mexican history in middle school whose name I’ve forgotten was the first to show me that the past can be explained with clarity, and told me to visit the National Library, at the time still in an old church building in downtown Mexico City. Arturo Sotomayor, in high school, spoke with passion about modern Mexican history, in a way that gave it a relevance I did not imagine until then, and that I am still trying to fully apprehend. In college, Eduardo Blanquel taught me how to read and discuss primary sources, and I’ve been following his model and trying to do the same in my classes ever since. My doctoral adviser in Austin, Jonathan C. Brown, showed me how to think and then write clearly while I was immersed in a project that could have easily gone out of control. He is my model of a graduate mentor because he knows when to be critical and when to let you be. 

 

What is your most memorable or rewarding teaching experience?

 

It came after a particularly difficult class on Descartes’s Discourse on the Method, in my Contemporary Civilization section at Columbia. We were tired and not sure where the discussion was going. A student smiled and laughed, I laughed, and the entire group started laughing out loud, so much so that I had to end the class right there. I guess we all agreed that we would never finish dissecting the book into its smaller parts, but also that the enterprise was foolish anyway. The lesson, I guess, is that you have to stop reading at some point, and move on.

 

What are your hopes for history as a discipline?

 

That historians claim a stronger voice in the public sphere to talk about the present in the light of the past. This does not mean we have to become antiquarians who claim there is nothing new under the sun, or determinists who try to establish the laws of evolution. We can help shape historical discussions that will make sense of the present with a proper historical perspective. We can compare places and times, and remind the public that their horizon should not be an op-ed about the last five years, or a rigid school narrative about the last two hundred years. Today that operation is more important than ever: we need to understand the history of fascism and racism if we are going to appreciate both the radical threat and the unavoidable roots of Trump’s government. Yet none of this will be possible if we stop training historians who can do serious and deep research, organize large amounts of data, write coherently, and have a lasting impact as teachers and mentors.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

Not really, unless you count a small collection of 1940s Mexican comic books about Chucho Cárdenas, the reporter-detective. I enjoy looking at old objects in museums and libraries. They help me imagine how they were used, circulated, touched and valued in times past. Yet I have never had the impulse to own them. They should be in a public place where others can come close to them and imagine those uses for themselves.

 

What have you found most rewarding and most frustrating about your career? 

 

The most rewarding aspect has been my work with students and colleagues. It is tempting to think about the work of the historian as solitary and individualistic, as if we were authors inspired by rare epiphanies that only occur after long years of painful research. The reality is that our conversations in seminars, workshops, conferences, the cafés close to archives and libraries, and at the occasional bar, play a decisive role in understanding the possible contribution of our work. Often when I write I imagine the text as a conversation with other historians who can criticize my arguments, be skeptical about my sources, but also, eventually, appreciate what I am trying to say. Co-writing has always been a good experience for me and sometimes I wonder why we historians are so reluctant to write in teams compared to other scholars. Another reward of the job is to come across readers who understand, sometimes more cogently than I do, what my books and articles do.

 

I guess all institutions can be frustrating, even as they make possible the material conditions and the collaborations that are essential for our work. I have experienced the combination of mismanagement and petty authoritarianism of large institutions since college, but I have also seen the advantage of being patient and trying to change them from the inside. I guess I’ve been fortunate to have the option. But I am aware that biases permeate academic life, even if we refuse to recognize them. I am still learning how seemingly small interactions can have large consequences for people’s careers. Along with a wonderful group of colleagues from different disciplines at Columbia, I participated last year in the production of a report on harassment and discrimination in the social sciences (https://www.fas.columbia.edu/home/diversity-arts-and-sciences/ppc-equity-reports). The experience helped me understand moments in my own career that I had tried to forget, perhaps because they undermined my confidence as a young scholar. It also helped me appreciate how effective serious research and collective work can be when we try to confront the problems derived from bias and inequality in academic life. The committee work that I have to do, as do all of my colleagues, reminds me that institutions are not brands or buildings but people who come together with a purpose.

 

How has the study of history changed in the course of your career?

 

The fundamental change in recent decades has been a new ability of the discipline to synthesize methodologies and approaches that twenty years ago seemed to be isolated from each other. During graduate school, in the nineties, I saw the tension between cultural history and other subfields that defined themselves as “harder” in terms of their use of evidence and interpretive models. It was as if two fundamentally opposed paradigms of historical work were on a collision course. But if you look at the best programs in my field today, in Mexico and the United States, you can see that they have avoided the temptations of specialization and have encouraged historians to cross disciplinary divides, training students in a generous way. So, instead of a field divided between “postmodernists” and “positivists”, as many predicted twenty years ago, we have an explosion of work that engages social, cultural, economic, political, intellectual, environmental, migration and legal history, to name a few. We still have some colleagues who concern themselves with patrolling the boundaries of their area of expertise, but they do not have the influence they think they have.

 

What is your favorite history-related saying? Have you come up with your own?

 

I love “May you live in interesting times”: it might sound like an ironic curse but now I hear it as a blessing of sorts. We all live in interesting times, whether we like it or not.

 

What are you doing next?

 

I am starting a project on poetry and politics in nineteenth-century Mexico. I am still far from producing anything of value but I am enjoying the process of learning how to read poetry. Mexico and other Latin American countries had a rich literary life in the nineteenth century. Some authors have survived, particularly from the second half of the century, like Sarmiento, Martí or Machado de Assis, but I think few historians yet appreciate the creativity and intensity of that world of fiction and poetry, a world that was shared by many people, across classes, in oral and written form. It was a realm of cultural production where Latin American authors and readers could be as productive and free from the legacies of colonialism as they wished to be. Poetry in particular was a central medium for political speech during the era that we can roughly classify as romantic.

 

I started this project with some trepidation, because I knew I would enjoy the research. I now understand Terry Eagleton when he writes that “Literary critics live in a permanent state of dread--a fear that one day some minor clerk, in a government office, idly turning over a document, will stumble upon the embarrassing truth that we are actually paid for reading poems and novels.” I already had a sense of dread when I was reading crime fiction for my previous book but I managed to overcome it when I confirmed that narrative was a privileged source to understand social ideas about crime and justice. Poetry is similarly promising if we read it as a medium that expands the communicative possibilities of words and images.

Roundup Top 10!  

 

What if Churchill Had Been Prime Minister in 1919?

by Andrew Roberts

More than most, he understood the grave challenges facing the West at the end of World War I.

 

How Reconstruction Still Shapes American Racism

by Henry Louis Gates, Jr.

Regardless of its brevity, Reconstruction remains one of the most pivotal eras in the history of American race relations, and probably the most misunderstood.

 

 

The Electoral College Was Not a Pro-Slavery Ploy

by Sean Wilentz

Mr. Wilentz is the author, most recently, of “No Property in Man: Slavery and Antislavery at the Nation’s Founding.”

 

 

Are the Humanities History?

by Michael Massing

In the brave new world that is emerging, the humanities will have a critical part to play—provided that they themselves can adapt to it.

 

 

The Story We've Been Told About America's National Parks Is Incomplete

by Dina Gilio-Whitaker

The national park system has long been lauded as “America’s greatest idea,” but only relatively recently has it begun to be more deeply questioned.

 

 

Want to unify the country? A community organizer and a Klan leader showed us how.

by Jonathan Wilson-Hartgrove

In the midst of the identity crisis we face as a nation, the organizing tradition that Ann Atwater embodied is the strong medicine we need.

 

 

Waking Up to History

by Margaret Renkl

At new museums, the past is finally becoming more than the story of men and wars.

 

 

Can Bernie Sanders Exemplify the American Dream?

by Walter G. Moss

How can a socialist provide a unifying vision, one that will unite U.S. citizens?


 

The truth about the "campus free speech" crusade and its myths that won't die

by Jim Sleeper

While critics of college "snowflakes" prop up a fake crisis, even "good liberals" misunderstand student outbursts.

The Red Scare: From the Palmer Raids to Joseph McCarthy to Donald Trump

 

In the immediate post-World War I era, Attorney General A. Mitchell Palmer was considering a bid to become the Democratic Presidential nominee in 1920 to succeed President Woodrow Wilson. To raise his profile, he claimed there was a massive wave of Socialists and Communists in America working to undermine the nation in the aftermath of the Russian Revolution of 1917. Palmer arrested and detained thousands of suspected radicals as part of the Red Scare. Many people were detained for months without trial or protection of their basic constitutional rights, until some were deported and others were released without charges.

 

Palmer had a very eager and zealous chief assistant, J. Edgar Hoover. Hoover performed so well under Palmer that Palmer recommended he become head of the newly reorganized Bureau of Investigation (renamed the Federal Bureau of Investigation in 1935). President Calvin Coolidge indeed selected Hoover to lead the Bureau in 1924, beginning Hoover’s 48-year career as its head, which lasted until his death in 1972.

 

Like his predecessor, Hoover often used tactics that violated Americans’ civil liberties. Under his leadership, the FBI engaged in unconstitutional behavior, particularly in the post-World War II era as the U.S. fought Communism at home and abroad during the Cold War. Many in academia, in Hollywood, and in government were purged based on accusations that they were Socialists or Communists and were spying for the Soviet Union.

 

One beneficiary of this Second Red Scare in the late 1940s and early 1950s was a Republican United States Senator, Joseph Raymond McCarthy, who had been rated by one periodical as the least influential and accomplished of all US Senators. Thinking ahead to his 1952 reelection campaign in Wisconsin, McCarthy decided to raise his profile by accusing people in government and all walks of life of being Socialists and Communists. McCarthy became extremely popular with about a third of the American public, and with the exception of a few journalists and US Senators who spoke and wrote against him, he was able to run rampant through the last years of Harry Truman’s Presidency and the first two years of Dwight D. Eisenhower’s. He finally went too far in his accusations and was censured by the US Senate in 1954, a rare action in the history of the upper chamber.

 

During his nearly five years of power from February 1950 to December 1954, McCarthy was aided by a zealous young man not all that different in character or motivation from J. Edgar Hoover three decades earlier. McCarthy’s chief aide was attorney Roy Cohn, who zealously attacked innocent people who were accused of being Communists (Reds), or soft on Communism (Pinkos). Many believed he lacked any sense of ethics or honor and he was much feared. Even after McCarthy fell from favor and then died in 1957, Cohn’s prominence continued, and he spent his remaining career as an attorney who often chose to represent reprehensible elements of society, including organized crime. He was also known for his wild social life.

 

Then Roy Cohn met a young real estate entrepreneur named Donald Trump. The two men became close friends, and Cohn taught Trump how to exploit others and play “hardball” to gain ever more wealth and public influence. As others have argued, Cohn was one of the most influential people in the development of Trump’s public persona and political views.

 

Trump learned to attack his critics, focusing on their weaknesses or shortcomings to harm their reputations. He stoked fear to fuel his rise to power and his march to the Presidency.

 

Once in the White House, Trump worked to undermine civil liberties and civil rights, as A. Mitchell Palmer, J. Edgar Hoover, Joseph McCarthy, and Roy Cohn had done in earlier generations. Utilizing racism, nativism, and Islamophobia, Trump also exploited the issue of gay and transgender rights, following the lead of his Vice President, Mike Pence, in promoting the exclusion of transgender people from the US military, despite their major contributions. He also undermined the equal treatment of gay Americans as a protected group, which Barack Obama had pursued during his Presidency.

 

Trump has also labeled his Democratic opposition, especially Democratic Presidential candidates Elizabeth Warren and Bernie Sanders, as Socialists who threaten American capitalism. As he stirs up fear, he works to undermine Social Security, Medicare, Medicaid, the Affordable Care Act, and environmental protections.

 

Trump is clearly using tactics similar to those of Palmer, Hoover, McCarthy and Cohn to promote his agenda and undermine civil liberties and civil rights. But he is more dangerous than his predecessors because only he has reached the pinnacle of power in the Oval Office. Trump has learned very well from their examples, and it requires vigilance and activism to cope with the threat he represents every day.

ROOAARRRR: 66 Million Years of Tales about the Big, Bad T-Rex Dinosaur

 

In a stunning scientific discovery, Canadian scientists announced last week that they had unearthed a 66-million-year-old Tyrannosaurus Rex that weighed an estimated 9.5 tons and whose skeleton is 65% intact. The startling discovery of the world’s largest known T. Rex was hailed throughout the scientific world.

And, speaking of good timing…

* * * *

As you walk through the fascinating new exhibit on the history of the legendary Tyrannosaurus Rex dinosaur (T. Rex to his loyal fans) at New York’s American Museum of Natural History, you hear this endless THUMP, THUMP, THUMP sound. It is the sound of T. Rex charging through the jungle in pursuit of yet another animal as his prey. The big, broody T. Rex, in all of his 66 million years of glory, and such a star on the silver screen for decades, is the subject of a new, brash and bold exhibit at the museum.

The exhibit, T. Rex: The Ultimate Predator, opened last week at the museum, at Central Park West and W. 81st Street. It tells the 66-million-year history of the T. Rex and explains how the huge, vicious two-legged dinosaur with the short arms evolved from a smaller and far less dangerous jungle creature into the much-feared King of the Jungle.

The new T. Rex exhibit on the third floor of the museum is the crowning touch to the museum’s large dinosaur exhibits and helps show visitors, in an exciting way, how all those movies and songs about T. Rex and his fellow dinosaurs came to be over so many years.

Other dinosaur skeletons, fossils and mummies are housed in the Hall of Dinosaurs on the fourth floor of the museum. The dinosaur hall has a towering T. Rex, a Ceratosaurus, a Stegosaurus, a Triceratops, a Mammoth and a collection of fellow dinosaurs whose history stretches back 128 million years. The centerpiece of the hall is the 122-foot-long Titanosaur, found on a farm in Argentina a few years ago, a beast whose skeleton is so long that it couldn’t be contained in one hall; its neck and head stick out into a second hall, like a cartoon character, and it is a daily tourist attraction for wide-eyed adults and giggling children.

The museum has done a nice job of setting up the T. Rex exhibit. They time the exhibit so the crowds are not large and you select your time when you obtain your tickets. The first few display boards tell you how the T. Rex descended from 24 different T. Rex species over millions of years. He was so big and heavy (around 9 tons) because all he did was eat and grow (he gained 140 pounds a month, and that’s without cookies). From there, you follow the story of how the dinosaur changed over the years. In the early days, the T. Rex was a much smaller animal. The meanest animal that ever lived, a really bad dude, was not mean in the beginning. In fact, don’t tell anybody, he was completely covered in feathers. Yes, feathers.

T. Rex now stands at the heart of the dinosaur exhibits at the museum. The reason is that the museum’s curator, Mark Norell, and his staff have done a painstaking job of making him easy to understand without taking the menace out of him. They have also turned this into a very family-friendly exhibit, one that adults and kids alike can enjoy. And yet, T. Rex is still as scary as ever.

Museum President Ellen V. Futter was so excited about the T. Rex idea that she made it the first major exhibit in the museum’s 150th anniversary celebration.  “Dinosaurs, and tyrannosaurus rex in particular, are such an important and iconic part of the Museum and have been throughout our history,” she said. “So, it seems fitting to launch the…anniversary with a major new exhibition on the ever-intriguing King of Dinosaurs.”

T. Rex was one of the most intelligent of the dinosaurs and yet, at the same time, mean and nasty.

He was such a belligerent predator because his razor-sharp teeth could not only tear an animal to pieces, but crunch so hard that they made the prey’s bones explode. The dinosaur even ate the bones of his prey in a few big gulps. He also had a heightened sense of smell and hearing that let him easily hunt down animals, even those in hiding. No one could escape him.

The rage that made him so famous stemmed from injuries suffered throughout his lifetime (most T. Rexes lived until about 34). “He had to go into battle every time he wanted to eat something, and the wounds he suffered piled up and he was always in some kind of pain,” said a tour guide.

The best way to see the sprawling exhibit is with a tour guide that you can find somewhere on the floor. The guides gather together a dozen or so visitors and, at no charge, give comprehensive tours discussing the T. Rex. They also answer any and all questions. They are also funny. Mine told a little girl that if a T. Rex gobbled her up she was so small that she would serve only as his “potato chips.” She laughed. 

It is appropriate that the museum is presenting the T. Rex exhibit because the world’s first T. Rex find was made by the museum’s famous paleontologist and fossil hunter, Barnum Brown. He uncovered the first-ever T. Rex remains in 1902 in Montana.

The T. Rex is everywhere in American culture – dolls, stuffed animals, coffee mugs, posters and even in rock and roll songs. T. Rex and other dinosaurs have been a staple of Hollywood movies for years. There have been more than 150 movies centered around dinosaurs, with the T. Rex, the Brontosaurus and others galloping through thick jungles and across meadows. They started in the silent movie era and picked up steam with the first King Kong movie in 1933. They gained wide popularity with all the recent Jurassic Park films (I saw Jurassic World on television just last Tuesday). Along the way there were all the Godzilla dinosaurs, The Land Before Time movies, the Lost Continent and Lost World movies, the Tarzan films, Mysterious Island, and of course, The Flintstones (yabba-dabba-doo).

The exhibit also includes the skeleton of a four-year-old T. Rex, a not so cuddly tyke, dinosaur teeth (pretty big and sharp) and a dozen or so re-created T. Rexes at different ages.

There is plenty to do at the exhibit besides stare at old bones. There is a “Roar Machine” on which you can listen to how the T. Rex sounded when he was angry. There is an “investigation station” that shows you various dinosaur fossils. There is a five-minute-long virtual reality show viewed with a mask in which you see the world as the dinosaurs saw it. There is a really neat, wall-sized animated movie of a T. Rex rumbling through the jungle. You walk or run in front of it and the dinosaur follows your motion and chases you, roaring like mad and thumping away with his huge paws.

If you love dinosaurs and/or being chased by one, this is the exhibit for you. Bring the kids.

RRRRRROOOOAAARRRR….

The American Museum of Natural History is at Central Park West and W. 81st Street. The museum is open daily, 10 a.m. to 5:45 p.m. It is closed on Thanksgiving Day and Christmas Day. The exhibit runs through August 9.

The Temptations: Born in the Turmoil of the 1960s and Still Conquering

 

There have been a number of so-called “jukebox musicals” in theater over the years, but only a few, such as Jersey Boys and Beautiful: The Carole King Story, have succeeded. That’s because the ones that failed had lots of memorable music but no story. Now, following in the footsteps of the ones that worked, comes Ain’t Too Proud: The Life and Times of the Temptations, an out-and-out hit and a great window on entertainment history. The play opened last week at New York’s Imperial Theater on W. 45th Street.

Everybody remembers the Temptations, voted the greatest Rhythm and Blues group of all time. They had huge hits with My Girl, Get Ready, If You Don’t Know Me by Now, Ain’t Too Proud, Papa Was A Rolling Stone and other tunes in a career that has stretched more than 50 years. They charmed you not only with their fine tunes and silky voices, but with that smooth, gorgeous choreography which served as a show in itself.

But there was also a lot of drama in the lives of the Temptations, all of them, and that is the heart of this terrific new play and the reason that it works so well. The play, written with great style by Dominique Morisseau, starts in 1967, the time of the Detroit riots, as Detroit’s Otis Williams is trying to sign up singers for his new music group.  He gets singers from his high school and neighborhood, bass Melvin Franklin, Paul Williams, Eddie Kendricks and flamboyant lead singer David Ruffin. They went through several names before settling on The Temptations, grabbed their song sheets and headed for the stage.

The story of the Temptations mirrored the story of America in the 1960s and ‘70s. The black group could not stay in white hotels after appearances in the South, suffered through the assassination of Martin Luther King Jr. and debated, as all African American music groups did, about what they should do, as high-profile people, to help the Civil Rights Movement. They put up with rickety old tour buses, argued with their early manager and thirsted for fame.

There are very funny scenes in the play. In one, their new, aggressive manager, madder than hell at them, flies off the stage in a gorgeous, gleaming convertible. In another, Otis follows Motown head Berry Gordy into a theater’s men’s room in order to meet him. A minute later, all four Temptations singers run into the men’s room, too.

Later in the show, during the Vietnam War era, producer Berry Gordy tells the Temptations that the Vietnam protest song, War, will not go anywhere and that they should not record it. Someone else did and the song became a monster hit (“So I’m not always right…” bemoaned Gordy on stage to laughter from the audience).

Gordy took them into his stable of talent that included Diana Ross and the Supremes. The Supremes, who performed a number of songs, were a delight, led by Candice Marie Woods as Diana Ross.

Writer Morisseau was wise to let Otis Williams be the narrator of the show. The musical (and a previous TV movie) was based on his autobiography. He was with the group all of its life, so he can tell the story as an eyewitness, and actor Derrick Baskin, as Otis, does a fine job holding the different segments of the tale together. Director Des McAnuff lets his actors flesh out the characters.

The performers in the show are all wonderful. Some of the standouts are James Harkness as the alcoholic Paul Williams, Jawan M. Jackson as the deep-voiced and flip Melvin Franklin, Jeremy Pope as the charming Eddie Kendricks, Ephraim Sykes as the bouncy, devilish David Ruffin, who brings the house down with his leg splits, body whirls and microphone flips, Jahi Kearse as producer Berry Gordy, Christian Thompson as Smokey Robinson, Shawn Bowers as Otis’ son Lamont and Nasia Thomas as singer Tammi Terrell, who died at 24.

All of the Temptations’ great hits are performed on stage, to the out-of-this-world, fabulous choreography of Sergio Trujillo. Ain’t Too Proud… is the story of the changes a group has to make, even when it is successful. The best example is the arrival and departure of sensational singer David Ruffin, later a star in his own right. He was a brilliant singer but an emotional mess who dragged the whole group down with him. Another singer, Eddie Kendricks, had the same problem, and he was dismissed. There were singers who left to bask in retirement, and one, Paul Williams, who was forced out after he became a severe alcoholic, later killed himself.

There are several themes in the musical that were common to all music groups in the 1960s and ‘70s. One – how do you keep a group together when members are in turmoil? A number of groups collapsed because of inner friction in that historical era, but not the Temptations. Leader Otis Williams worked hard to find talented people to replace the talented people who were let go or quit, and the group survives today (24 different singers over the years).

Second, how does a group that is very successful in the ‘60s with one style of music (reflected by My Girl) shift gears and come up with new music in the ‘70s and ‘80s, such as Papa Was a Rollin’ Stone? Third, how much does a group and its members have to give in to the wishes of the producer, Berry Gordy? He was a genius and they knew it, but still fought for themselves.

Fourth, the Temptations were always competing against some other group. At Motown, they had the unenviable job of competing against the Supremes, not only because of the Supremes’ enormous popularity, but because Berry Gordy was in love with Diana Ross.

And at the same time, the Temptations, a black group, had to compete against white groups and constantly had to avoid the tag of a “black music” group and still perform crossover music for mostly white audiences.

They were a reasonable success despite all of that, led by the hard working Otis Williams, who, because of tours, did not see his wife and son very much and missed them.

The group’s story was at times glorious and at times difficult, as it was for all the ‘60s music groups who had to compete against each other and the changing times. The play is a great behind-the-scenes look at the history of the Temptations and American music. The Temptations did battle with the Supremes, the Beatles and other groups, and with their own personal demons, too.

If you see the show, remember, they’re gonna make you love them…..

PRODUCTION: The show is produced by Ira Pittelman, Tom Hulce, the Berkeley Repertory Theatre and others. Scenic Design: Robert Brill. Costumes: Paul Tazewell. Lighting: Howell Binkley. Sound: Steve Canyon Kennedy. Projection Design: Peter Nigrini. Fight Director: Steve Rankin. Choreography: Sergio Trujillo. The play is directed by Des McAnuff. It has an open-ended run.

MMT and Why Historians Need to Reclaim Studying Money

Pictured above: Campaign buttons from the 1896 election. Some monetary references are still obvious (“the money we want”); others, now more obscure (Bryan’s campaign advocated a “16 to 1” ratio between silver and gold). Supporters of a gold standard had been known as “goldbugs” since the 1870s; the silver beetle-shaped pins were made in response. 

 

 

MMT (Modern Monetary Theory—a form of post-Keynesian economics) is everywhere these days. Alexandria Ocasio-Cortez and Bernie Sanders embrace it; Paul Krugman and George Will write about it; the Financial Times, Forbes, and The Economist have all run columns about it. Even the men’s parenting website Fatherly had an article on it. Do historians have anything to add?

 

Historians know this is not the first time that American politicians, scholars, and ordinary people alike have asked fundamental questions about what money is, how it works, and who it benefits. The 1896 presidential election is famous for William Jennings Bryan’s “Cross of Gold” speech, but he was only one of many that decade to be talking and writing about the comparative merits of gold, silver, and paper. Americans got in barroom fights about it (at least one man died), sang songs about it, and composed poems on the subject. The economist President of Brown University, E. Benjamin Andrews, nearly lost his job because of his silverite views. Newspapers across the country reported when a Stanford professor asserted that faculty were forced to teach in favor of the gold standard; “Coercion in the colleges” ran the headline in the Morning World-Herald (Omaha).  

 

Today, as in the 1890s, the fundamental question is whether prosperity can be increased and inequality reduced by injecting more money into the economy. Orthodox economists—the vocabulary of “orthodoxy” has been part of economics since the first professorships were created—say it cannot: that growth (whether it be the manufacture of more stuff, or the greener production of better stuff) has to happen in the “real” economy and that money simply facilitates buying, selling, saving, and investing. As J. Laurence Laughlin (first chair of Economics at the University of Chicago) wrote in 1895: “Money… no matter how valuable, is not wanted for itself. It is only a means to an end, like a bridge over a river.” No one, Laughlin continued, could really believe that adding silver to the money in circulation would produce “bushels of wheat and bushels of corn and barrels of mess pork”—only mine owners and their investors would gain by its being minted. 

 

MMTers and silverites, in contrast, emphasize the work left undone—factories shut, children and the elderly not cared for, solar panels not made and installed, etc.—because there is too little money in circulation. MMT’s proposed mechanism for adding money to the economy is hardly that of the “Free Silver” movement, but the two fundamentally agree that money is a political phenomenon (a “creature of the state” in the words of Abba Lerner’s 1947 paper). Populists in the 1890s campaigned against the 1873 law that demonetized silver; MMTers today, against the rhetoric of “deficits” and mandates for pay-as-you-go budgeting that have been central to American politics since the Reagan Revolution. MMT crucially claims that a monetary sovereign cannot go broke in its own money—it can always issue more. We should therefore think of public deficits not as bills to be paid, but as indicators of how much we as a nation care about particular issues. Since money exists for wars and walls, they say, it can just as readily be found for high-speed trains and clean-power energy. 

 

If the sovereign uses its money-issuing power unwisely—if more exists in the system than there is work to be done or goods to be bought—then prices for everything could rise. Should there be high inflation, the government should spend less and tax the excess money back into its coffers. MMT, in other words, does recognize that deficit-spending could become problematic, but not for the reasons usually given. A country like the United States—a public entity that is sovereign, does not age or plan to retire, and is imagined as existing indefinitely into the future—is not a household that needs to balance its budget. Using examples from personal finance to explain public spending may give a homey touch to political campaigns, but such examples are fundamentally misleading.

 

In the way it links monetary policy, fiscal policy, and social policy—the Jobs Guarantee and something like a Green New Deal are not things to be “paid for” via MMT, but are part of it—MMT rejoins the Enlightenment tradition of political/social economy. Adam Smith, remember, was not an economist (the word was barely used in the eighteenth century) but Professor of Moral Philosophy and an opponent of many of the developments—growth of corporations, laissez-faire capitalism, the exploitation of workers—for which he is now imagined to stand. As Gareth Stedman Jones and others have shown, the selective reading of Smith as “father of capitalism” was an interpretation formed in reaction to the social radicalism of the French Revolution. So, too, did political context play a significant role when economics became a distinct, and then increasingly model-based, social science some 120 years ago. With the strikes and labor unrest of the 1880s and the Populist Movement of the 1890s, economists who spoke in favor of unions or about the plight of workers under monopoly capitalism either found themselves out of a job or re-appointed to Social or Political Science departments. There is a long institutional history, then, to MMTers’ self-positioning as underdogs and voices in the wilderness. 

 

While MMT economists (Stephanie Kelton, Pavlina Tcherneva, Randall Wray, Warren Mosler, and Bill Mitchell are five big names to know) quarrel with their fellow post-Keynesians over models and implications, historians need to reclaim money as something to be studied in specific social and political contexts. Historians know what all financial advisors profess to recognize: “past returns are no guarantee of future results.” In fact, however, the entire field of economics—with its assumptions about trend lines, models, and transhistorical facts (such as Milton Friedman’s assertion that “inflation is always and everywhere a monetary phenomenon”)—has largely failed to internalize this salient and important truth. 

 

The historian Andrew Dickson White, first president both of Cornell University and of the American Historical Association, made himself part of the 1890s debate with his Fiat Money Inflation in France: How it came, what it brought, and how it went (1896). An earlier version, entitled Paper Money Inflation in France, had appeared in 1876 when the Greenback Party (named for the paper money issued by the United States to fund and win the Civil War) was at its peak. In both pamphlets, White used the example of the French Revolution’s paper money, the assignats, to argue against increases to the money supply and for “fighting a financial crisis in an honest and manly way.” By changing the first word in his title and adding material borrowed from Macaulay’s History of England about seventeenth-century coinage debasements, White expanded his target to include all “fiat” currencies—all money created by government order. Re-issued in 1914, 1933, and 1945 by various publishers and in 1980 by the Cato Institute, White’s pamphlet remains widely available today. This is a record of impact and influence few historians can match, but I do not suggest it is a model we should follow. Couched in a vocabulary of natural laws—at one point, White describes issuing paper money as equivalent to opening dikes in the Netherlands; elsewhere, he compares it to corrosive poison and cheap alcohol—Fiat Money Inflation in France appealed to partisans of the Gold Standard because it seemed to show fiat money’s inevitable outcomes. But nothing in history is inevitable (even if some things are far more likely than others) and the eventual failure of the assignats owed as much to the specific politics of the Revolution as to any timeless laws of economics.

 

MMT, along with the euro crisis and awareness of austerity’s social effects, has done much to open monetary and fiscal debates to wider audiences. Simply recognizing that money is political and historical (central, as Harvard Law Professor Christine Desan likes to say, to how a polity constitutes itself) is a difficult breakthrough for most people. On the other hand, seeing money in this way doesn’t—in a fractured polity characterized by demagoguery and high levels of inequality—make policy any easier to write or implement.

Elizabeth Blackwell, M.D., Hero, Humanitarian, and Teacher

 

To celebrate Women’s History Month, the television quiz show Jeopardy recently featured a category related to female historical figures. The contestants, sharp, enthusiastic, and knowledgeable, answered all the questions in that category except for one. When host Alex Trebek asked, “Who was the first female doctor in the United States?,” all three contestants failed to press their buzzers. Trebek looked at them skeptically and simply said, “Elizabeth Blackwell.” While the contestants surely would have immediately recognized the names of Sojourner Truth, Susan B. Anthony, Harriet Tubman, Harriet Beecher Stowe, Florence Nightingale, Jane Addams, Coretta Scott King, Amelia Earhart, and Marie Curie, for example, I wondered why none of the contestants even thought to take a guess.

 

The fact that Elizabeth Blackwell was the U.S.’s first female doctor is certainly worthy of recognition. What is far more important is what she did. She was a pioneer in fostering the role of women in medicine in both the United States and Great Britain. In the United States, she founded an infirmary in New York City for poor women and children; during the American Civil War she provided invaluable assistance combatting infectious diseases and treating the sick and wounded for the Union cause under the jurisdiction of the United States Sanitary Commission; and prior to returning to England she established a medical college for the training of female physicians. In Great Britain she duplicated these efforts, leading the way in the formation of the National Health Society as well as the London School of Medicine for Women, where she served as professor of gynecology from 1875 to 1907. Why she did not receive the recognition she deserved during her lifetime and afterwards, until the second half of the twentieth century, is due in large measure to the profession she chose, which until recent times believed that women were better suited to be nurses, not physicians. But perhaps more importantly, when rejected for hospital positions, she used her skills as a teacher to become not only the nation’s first female physician but also its first female professor of medicine. Very, very few know about the latter.

 

Blackwell was born on February 3, 1821 in Bristol, England, the third of nine children. Her father, Samuel, was a devout Quaker and one of the founders of the Bristol Abolition Society. Her social activism as an adult, especially her anti-slavery views, was greatly influenced by her father’s beliefs.

 

During the Bristol Riots of 1831, Samuel’s small sugar business was destroyed by fire. Disillusioned and nearly destitute, he relocated his family to the United States, where he established a new refinery in New York City. However, the Panic of 1837 hit the business hard, and in 1838 he moved his family to Cincinnati to start over. Three months after arriving in the Queen City, Samuel died, leaving his family destitute. Determined to survive, Elizabeth, along with her mother and two older sisters, started a small private school. Later Elizabeth also taught in Kentucky and North Carolina.

 

While she was working as a school teacher, she was drawn to the field of medicine. In 1845, she began reading medical books under the direction of Dr. John Dickson of Asheville, North Carolina and his brother, Dr. Henry Dickson of Charleston, South Carolina. What drew her into the study of medicine was her friendship with another woman who was suffering from a terminal illness; her friend expressed to her how embarrassed she felt going to male doctors, and it was her wish that someday there would be female physicians better able to relate to her personal feelings as a woman. Determined to learn more about medical treatment for women in particular, in 1846 she applied to medical schools in New York City and Philadelphia, only to be rejected because of her gender. Finally, in 1847, Geneva Medical College, a small medical school in upstate New York, gave her a chance. She did not disappoint, finishing at the top of her otherwise all-male graduating class. While attending classes she was largely ostracized and made to feel unwanted. She received her medical degree in January 1849.

 

Perhaps because of her own family struggles, she chose to work briefly with patients at a Philadelphia almshouse, an experience that provided her with a considerable amount of knowledge in the study of epidemiology. Curious to learn more about this field, she moved back to England in April of that year, where she worked under Dr. James Paget in London. There, she developed a close relationship with Florence Nightingale and Elizabeth Garrett Anderson, pioneers in professional nursing and women’s health care in Great Britain. Paget became a leader in the study of women’s breast cancer (a form of the disease is named after him); Nightingale and Anderson were attracted to Blackwell because of her work with Paget and her interest in larger medical issues such as childbirth (she briefly went to Paris and studied at La Maternité) and infectious or communicable diseases.

 

Returning to America in the summer of 1851, she was denied positions in New York City’s hospitals. In part this was because, while studying midwifery in Europe, she had contracted a disease during a procedure on an infant that left her blind in one eye. Her career as a surgeon was over, but why she was not hired to teach at one of these hospitals is troubling. Nonetheless, by this time her sister, Emily, also had a medical degree, and the Blackwell sisters, together with Dr. Marie Zakrzewska, established the New York Infirmary for Indigent Women and Children. This infirmary took the lead in presenting important lectures on hygiene and preventive medicine, including the training and placement of sanitary workers in the city’s poor areas. As a former schoolteacher, Blackwell was well suited for the job. 

 

Attempting to cast a wider net regarding health care for women, she also published her own account of such matters, one aimed specifically at young ladies: The Laws of Life, with Special Reference to the Physical Education of Girls (1859). This book called attention to the importance of healthy living and proper exercise for girls, who were now confronted by the growing complexities of a developing industrialized society. It was important for women to be both strong and healthy as contributors to this new way of life. “In practical life, in the education of children, in the construction of cities, and the arrangements of society,” she wrote in her introduction to the book, “we neglect the body, we treat it as an inferior dependent, subject to our caprices and depraved appetites, and quite ignore the fact, that it is a complex living being, full of our humanity[.]” 

With the outbreak of the American Civil War in 1861, Blackwell rallied other female reformers to establish the Women’s Central Relief Association in New York City to train nurses for the Union Army. Her motivation and commitment to the Union cause grew out of her own anti-slavery beliefs. Providing medical aid and comfort was her way of upholding her Quaker beliefs while sustaining her support for the Union. The association quickly became part of the United States Sanitary Commission (USSC), a private relief agency to assist the sick and wounded. With Blackwell in the forefront, many women were trained and began serving on hospital ships and as army nurses and sanitary relief workers. Working hand-in-hand with the USSC, Blackwell orchestrated the building and running of hospitals and soldiers’ lodging houses and devised a communication system that delivered letters and telegrams to men in the field.

In 1868, she and her sister Emily established the Women’s Medical College of the New York Infirmary, where she served as a professor of hygiene. The next year she decided to return to England, where she would reside permanently. In large measure this was due to previous conversations with Nightingale, who had expressed to her the need to establish a medical college for women there as Blackwell had done in the United States. Given that England was now a mature urban-industrialized society, whereas the United States was just beginning the transition from agrarian to industrial, England offered Blackwell more opportunities to explore national health issues on a grander scale. Upon her return she helped form the National Health Society, designed to educate citizens on the importance of health and hygiene issues, and founded the London School of Medicine for Women.

During her remaining years (she died at her home in Hastings in Sussex on May 31, 1910), Blackwell extended her outreach to promoting municipal reform, co-op communities, prisoner rehabilitation, and the Garden City movement, a method of urban planning begun by Sir Ebenezer Howard that envisioned planned, self-contained communities surrounded by lush “greenbelts” providing areas for residences, industry, workplaces, and agriculture. Her humanitarian reform efforts went beyond medical treatment and education, although it is fair to state that she considered these attempts part of her professional obligation.

Although Jeopardy may have acquainted millions of viewers with Blackwell’s occupation, the show falls far short of calling attention to her many achievements as a leading female figure. Simply remembering her as the country’s first female doctor shortchanges her numerous contributions to women’s history in the United States and Britain. She remains a heroine for her pioneering research into female health issues, as a teacher, for establishing medical schools for women in the United States and Great Britain, and for risking her own health and welfare when volunteering her services to assist sick and wounded Union soldiers. As a humanitarian she also deployed her medical expertise to help indigent women and children by building infirmaries and developing local and national health agencies to meet the growing complexities confronting nineteenth-century urban-industrialized societies. Her humanitarian contributions, moreover, led to her association with urban reform efforts in the twilight of her career.  

Yes, there are reminders of her place in history. There is a statue of her on the lawn at Hobart and William Smith Colleges (formerly Geneva Medical College), an 18-cent postage stamp dedicated in her honor in 1974, a 2003 historical marker established by the Ohio Historical Society, scholarly works about her life as a physician, inclusion in the National Women’s Hall of Fame, and the Elizabeth Blackwell Medal presented by the American Medical Women’s Association, an award officially established in 1958. Furthermore, although she became a naturalized U.S. citizen, she was also the first woman admitted to the British Medical Register, permitting her to practice medicine in the United Kingdom. Yet it is still very puzzling why she is not generally known among the populace at large. Just witness the Jeopardy contestants.  

What needs to be done is to give her greater exposure in our secondary social studies textbooks and teaching. Include a description of her many accomplishments along with a photo caption that reads: “Elizabeth Blackwell, Physician, Heroine, Humanitarian, and Teacher.” What may very well capture students’ attention is the “teacher” description. That is the one aspect of her life which has received the least attention, yet it could be her most valuable contribution to the study of women’s history. After all, she used that skill to inspire women, against difficult odds, to follow in her footsteps. Now it is up to textbook publishers and schoolteachers to give Blackwell her just due.

 

The Perils of Criminal Justice Reform

Living facilities in California State Prison (July 19, 2006)

 

 

I started working on Beyond These Walls: Rethinking Crime and Punishment in the United States during the Obama presidency. I wanted to understand and hopefully explain why no substantial reforms of the carceral state occurred in the second decade of the 21st century, despite militant street protests against police killings and widespread consensus among liberals and libertarians that something needed to be done about the country’s unprecedented rate of imprisonment. 

 

Reform is one of the most overused, misused, and Orwellian terms in the English language. “I am well convinced that it is kind, humane, and meant for reformation,” wrote Charles Dickens in 1842 after he witnessed a Pennsylvania prison’s system of silent solitary confinement. But its outcome, he observed, was to subject prisoners to “torturing anxieties and horrible despair” that left them “dead to everything.”

 

This combination of benevolent rhetoric and punitive measures is a persistent theme in American criminal justice history. During World War I, for example, the federal Commission on Training Camp Activities claimed to be acting in the interest of “delinquent women and girls” by rounding up and detaining without trial some 30,000 of them suspected of spreading venereal diseases and perversion, while the men received health care and wholesome entertainment.

 

When government officials and their allies call themselves reformers, it’s time to look out, and to look deeply and carefully at what is being proposed. Most government-sponsored reforms of criminal justice operations manage and rearrange existing institutions of power. 

 

Not all reforms are manipulative and repressive. There is also a tradition of progressive grassroots reforms that try to make a difference in and empower people’s everyday lives. But these efforts to accomplish structural reforms typically are undermined in practice. Why have there been more failures than successes, and what is needed to reverse this sorry record? 

 

 

Historically, the overwhelming majority of reforms are top-down, state-engineering initiatives that are never intended or designed to expand the rights or improve the well-being of their recipients. One of the earliest examples was the Progressive Era’s child-saving movement, which formally did away with due process for juvenile delinquents. It recruited social workers, public health personnel, police, and urban reformers to send thousands of European immigrant youth to punitive reformatories, and Native American youth to boarding schools where they were punished for “speaking Indian.” In the 1940s, the Preston School of Industry in California was “organized like the military,” a former prisoner recalled. “We marched everywhere, and were always on ‘Silence’.” 

 

The child-saving movement was a model for many other government reforms that, in the words of historian Lisa McGirr, came loaded with “strong doses of coercive moral absolutes,” such as forcing the children of Jehovah’s Witnesses to salute the flag during World War I in the name of spreading patriotism, and then criminalizing their parents when they refused. In the 1920s, the federal Prohibition Bureau, with five times more staff than the FBI, saved the drinking poor from the scourge of alcohol by arresting them, while the wealthy drank in private clubs or bribed their way out of arrest. Between the world wars, government agencies compelled the sterilization of some 60,000 working class women in the name of purifying motherhood. Similarly, in the 1950s, “protecting the family” supposedly justified purging gay men from government jobs and subjecting them to the kind of systematic harassment by police that young African American men routinely experience. 

 

We see the same kind of coercive benevolence at work today when local governments and professional functionaries invoke civility codes to tear down homeless encampments and, in cities such as Irvine, California, run beggars out of town in order to “keep our streets safe.” 

 

The second type of reform has a democratic impetus and is intended to expand the rights of the disenfranchised and improve people’s everyday lives. Pursuing this kind of grassroots initiative requires the stamina of a marathon runner, for there is a long history of trying to substantially reform criminal injustice operations that typically does not end well. 

 

Take, for example, the 1963 U.S. Supreme Court decision in Gideon v. Wainwright, which required states to provide attorneys to defendants in criminal cases if they cannot afford counsel, and the bail reform movement that achieved passage of the Federal Bail Reform Act of 1966, which granted release on one’s own recognizance (OR) to federal defendants in noncapital cases. 

 

The Gideon case represented a victory for activists who had struggled for decades to bring some balance to an adversary system of criminal justice that is heavily weighted in favor of the prosecution. “Thousands of innocent victims,” wrote W. E. B. Du Bois in 1951, “are in jail today because they had neither money, experience nor friends to help them.” The provision of government-funded defense lawyers was supposed to rectify this wrong.  

 

However, the underfunding and understaffing of public defenders, and pressure from criminal court bureaucracies to process cases expediently, resulted not in more trials and more pleas of innocence, but in a decline in trials and an increase in guilty pleas. How can clients get a “reasonably effective defense” in Louisiana, for example, if a single public defender is expected to carry a caseload of 194 felony cases? “No majesty here, no wood paneling, no carpeting or cushioned seats,” writes James Forman, Jr. about his experience as a public defender in Washington, D.C. It wasn’t unusual for him to want to cry in frustration at the railroading of his clients. “Sometimes the only thing that stopped the tears,” he says, “was another case or client who needed me right then.” 

 

The Federal Bail Reform Act met a similar fate. Many of the legislation’s provisions were destroyed by the Nixon and Reagan administrations as new legislation eliminated OR for dangerous defendants, a proviso that ultimately included people arrested for drug-related and non-violent behavior, meaning just about everybody. Today, more than sixty percent of people confined in the misery of local jails are there because they are unable to make bail and do dead time, a travesty of “presumed innocent.” 

 

Too often when progressive reforms are passed, they stand alone as single issues and are generally ineffective because they lack sustained and wide support, or they are whittled away to the point of ineffectiveness. A similar process is at work with the recent First Step Act, Congress’ tame effort at federal prison reform. This legislation originated in the efforts of reformers during the Obama presidency to dramatically reduce mass incarceration nationwide. By the time of the Trump presidency, the libertarian Right dominated the politics of reform and put their stamp on the Act: no relief for people doing time for immigration or abortion or violence-related crimes; the privileging of religious over secular programs; and a boost for the electronic shackling industry. 

 

Too often, substantial reform proposals end up politically compromised and require us to make a Sophie’s choice: release some “non-violent offenders” and abandon the rest, including tens of thousands of men who used a gun during a robbery when they were in their 20s. Or give public welfare relief only to carefully screened “worthy recipients,” while subjecting millions of women and children to malign neglect. Or, potentially, provide the immigrant Dreamers with a path to citizenship while making their parents and relatives fair game for ICE. 

 

 

It’s not for lack of trying that substantial reforms are so difficult to achieve. There are structural, multifaceted factors that undermine our effectiveness. “America is famously ahistorical,” a sardonic Barack Obama observed in 2015. “That’s one of our strengths – we forget things.” In the case of efforts to reform prisons and police, we remember the experiences of Malcolm X, George Jackson, Attica, and the Black Panther Party, but then amnesia sets in. 

 

We need to reconnect with the writers, poets, artists, activists, and visionaries who generations earlier took on the carceral state and forged deep connections between the free and un-free. Let’s remember Austin Reed, a young African-American incarcerated in ante-bellum New York, who told us what it was like to “pass through the iron gates of sorrow.” And the Socialist and labor leader Gene Debs, imprisoned many times for his activism, who made sure his comrades in the 1920s knew that his fellow nonpolitical prisoners were “not the irretrievably vicious and depraved element they are commonly believed to be, but upon average they are like ourselves.” And the young Native American women and men, forcibly removed to boarding schools, who reminded us of their resistance, as in the words of a Navaho boy: “Maybe you think I believe you/ But always my thoughts stay with me/ My own way.” 

 

Revisiting this long historical tradition is important, not out of nostalgia for what might have been or to search for a lost blueprint of radical change, but rather to learn from past reform efforts and help us to understand the immense challenges we face – to “bring this thing out into the light,” as the civil rights leader Fannie Lou Hamer used to say. 

 

In addition to a deep history, we also need a wide vision in order to see that state prisons and urban policing are components of a much larger and more complex private and public social control apparatus that plays a critical role in preserving and reproducing inequality, and in enforcing injustices. No wonder that structural reforms are so difficult to achieve and sustain when carceral institutions are sustained by private police, public housing and education, the political system, immigration enforcement, and a vast corporate security industry that stokes what Étienne Balibar calls the “insecurity syndrome.” 

 

Struggles for equality in the United States have usually been uneven and precarious, with improvements in rights and quality of life for one group often coming at the expense of others – not consciously, but in effect. Our challenge is to rebuild a social and political movement that bridges the divide between a panoply of activists in the same way that post-World War II civil rights and black liberation organizations incorporated prisoners and victims of brutal policing into the Movement. Important single-issue campaigns – to eliminate cash bail, to restore voting rights to millions of former prisoners, and to make American prisons comply with global human rights standards – will have a better chance of success if backed by a multi-issue, grassroots campaign. 

 

We should not give up on big ideas and structural reforms. We never know when a spark will light a fire and energize a movement. Let's remember that it was protests against a police killing in a place like Ferguson that led to the Black Lives Matter movement and compelled a meeting with the president, and it was high school students’ protests for gun reform in Florida that prompted a former Supreme Court Justice to call for the abolition of the 2nd Amendment. 

 

The Right has been extraordinarily effective in promoting a dystopia that anchors and propels its law-and-order policies. We need a comparably compelling progressive vision. In this moment of resistance and defense, articulating an ideal of social justice might seem like pie-in-the-sky and a waste of energy. But progressive policies will require widespread endorsement, and this will only happen if we speak to people’s deeply held anxieties and aspirations. Without a movement and a long-term vision that engages people, good policies wither. 

 

It will take nothing short of a broad-based movement, a revitalized imagination, and reckoning with a historical legacy that bleeds into the present to make the criminalized human again and end the tragedy of the carceral state. 

E-Carceration: Are Digital Prisons The Future?

 

We’ve all heard the unsettling stats regarding the U.S. mass incarceration crisis:

 

The United States holds 5 percent of the world’s total population, yet 25 percent of the world’s prison population.

 

On any given day, there are more than 2.3 million people locked up in jails and prisons across America—more than an estimated 540,000 of them incarcerated without ever being convicted or sentenced—and more than 4.5 million folks on parole or probation. 

 

Although millions are imprisoned behind the drab concrete walls and cold steel bars of your stereotypical detention facility, an ever-growing number—estimated as high as 200,000—are instead being clamped with the trendiest weapon of the U.S. prison industrial complex: electronic monitoring.

 

Commonly referred to as “ankle bracelets,” these GPS-equipped devices are praised as cheaper and more humane alternatives—a legitimate remedy, even—to locking up so many within the overpopulated local jails and monolithic fortress-like prisons that have come to visually define mass incarceration within the United States.

 

Though these digital “shackles,” as one detractor calls them, have been utilized by the U.S. criminal justice system for more than 30 years, their use has increased a whopping 140 percent over the past decade, with cautious criminal justice reform advocates sounding the alarm about the technology’s deceptions and ramifications. 

 

We investigated this issue in “E-Carceration: Are Digital Prisons The Future?,” the latest episode of our social justice podcast, News Beat. Criminal justice reform advocates view the ballooning prevalence of electronic monitoring as just another way for profit-hungry corporations and billionaires to camouflage the true agenda: maintaining mass incarceration’s sinister legacy of punishing the poor and devastating black and brown communities, while those pulling the strings get even richer. E-carceration’s rise, critics point out, comes at a time when officials nationwide are making conscientious efforts to decarcerate and seeking legitimate reforms—the reassessment of money bail, legalization of marijuana, and proliferation of diversion programs among them. 

 

“I felt like I was still under carceral control, which I was,” author, educator, social justice activist, and former fugitive and prisoner, James Kilgore, tells us. 

 

A research scholar at the University of Illinois at Urbana-Champaign, Kilgore spent six and a half years in prison for crimes during the 1970s, and more than two decades as a fugitive. He wore a monitor as a condition of his parole.

 

“And the image that always comes to me, is the fact that when I went to sleep at night, I felt as if my parole officer was laying across the bed looking up at me from under the covers,” he says, bristling at the suggestion that being harnessed with such tech is a more compassionate form of punishment.

 

“Most people just said, ‘Well, it’s better than jail,’” continues Kilgore. “And my response was always, ‘Well that’s true, but a minimum-security prison is better than a supermax, but it’s still a prison.’ And by the same token, at an individual level, I would never tell someone, ‘Well, you’re better off staying in prison or in jail than going out on an electronic monitor.

 

“Just like I wouldn’t tell somebody, ‘Well, stay in that supermax where you’re in solitary 24 hours a day, and don’t go to this camp where you can be out free, you know, [for] 16 hours a day, moving around the yard and so forth,’” he adds.

 

Myaisha Hayes, national organizer on criminal justice and technology at the nonprofit Center for Media Justice, and also a News Beat guest, explains that now—as mass incarceration is an increasingly prominent issue—is the time to question these new extensions of prison and assess such strategies before they’re universally accepted. 

 

“As we’re in this moment of bail reform, parole justice, all of these different issues within the criminal justice space, we have a real opportunity to take a pause and think about, okay, if we’re going to end monetary bail, how do we actually address harm and maintain public safety in our communities?” she asks. “Or do we want to just find a technological solution to this issue? 

 

“I think that’s the issue that we’re having to deal with, and I do think…we have an opportunity now to course-correct,” adds Hayes.

 

Among her and other prison reform advocates’ list of concerns is the fact that the four largest providers of electronic monitors are private prison companies, including the most profitable, GEO Group, which has secured lucrative government contracts to operate federal prisons and monitor people in immigration proceedings.

 

The Center for Media Justice reports that BI Incorporated, a subsidiary of GEO Group, has government contracts with at least 11 state departments of correction and earned nearly $84 million in revenue in 2017.

 

Boulder, Colorado-based BI has also reportedly earned more than half a billion dollars in U.S. Immigration and Customs Enforcement (ICE) contracts since 2004, according to local newspaper the Daily Camera.

 

Privacy and the possibility of even ensnaring other individuals within close proximity to those being surveilled should also be concerns, contends Stephanie Lacambra, criminal defense staff attorney at nonprofit Electronic Frontier Foundation, another guest on our News Beat “E-Carceration” episode.

 

“Locational privacy has been a concern in the deployment of a number of different law enforcement surveillance technologies, from automated license plate readers to the use of facial recognition,” she explains.

 

“I think we should be concerned about location tracking,” continues Lacambra. “Not just in the context of electronic monitoring, but in all of these other contexts as well, because I think they give law enforcement the potential to really aggregate very detailed profiles about all of us, regardless of whether you’re on probation or parole, or awaiting trial.”

 

Kilgore, the former inmate-turned-prison reform advocate, is also the project director for the initiative Challenging E-Carceration, and collaborated with Hayes and others at the Center for Media Justice on an analysis released last year titled “No More Shackles.”

 

It categorically rejects electronic monitoring (EM) as a viable alternative to incarceration; rather, it deems such tech an “extension” of the very system it’s purportedly helping to rectify.

 

“We view EM as an alternative form of incarceration, an example of what we call “e-carceration”—the deprivation of liberty by technological means,” it reads. “Hence, as part of the greater movement for transforming the criminal legal system, we call for the elimination of the use of monitoring for individuals on parole. When people have done their time, they should be cut loose, not made to jump through more hoops and be shackled with more devices, punitive rules and threats of reincarceration.”

 

News Beat podcast melds investigative journalism with independent music (primarily hip-hop) to shine a light on the most pressing social justice, civil liberties and human rights issues of our day. “E-Carceration: Are Digital Prisons The Future?” and all News Beat episodes can be listened to, downloaded, and subscribed to via Apple Podcasts, Stitcher, Spotify, and wherever else you get your favorite podcasts.   

 

 

Immigration Restriction by Remote Control

 

On March 12, the Trump administration announced it would close all international offices of US Citizenship and Immigration Services, an action that will choke off the largest channel for legal migration. While much of the coverage of immigration has focused on the Border Wall, we have forgotten that most immigration restriction happens beyond the borders of the United States through what political scientist Aristide Zolberg calls “remote control.”  

 

US Citizenship and Immigration Services and neighboring countries like Mexico now prevent far more prospective migrants from entering the US than border control. It seems like common sense that most immigrants would be stopped from entering the US at the border, but this has not been true for almost a century.

 

When immigration restrictions were first established in the late nineteenth century to keep out Chinese laborers, convicts, people with diseases, and prostitutes, the system of passports and visas that we now take for granted did not exist. Immigration inspectors determined immigrants’ eligibility at ports of entry, screening passengers for admissibility, including medical and psychological exams. Long lines at Ellis Island illustrated the screening process at ports of entry.

 

However, even in the early twentieth century, consular officials conducted medical inspections abroad. At some US consulates, such as the one in Hong Kong, the rejection rate for Chinese migrants was more than fifty percent. Historian Amy Fairchild shows that by the 1920s, consular officials conducted rigorous medical exams and rejected about 5 percent of all applicants, which was 4 times the rate of rejections at US ports. 

 

In 1921, the Quota Act established national origins quotas for each country, but initially the slots were filled on a first-come, first-served basis. Ship captains raced to reach port before quotas were filled, creating chaos. In response, the 1924 Johnson-Reed Act mandated quotas would be filled not by counting immigrants but by counting immigration certificates issued at consular offices abroad. From this point forward, the biggest barrier to entering the US was obtaining a visa, not getting past a border patrol agent, a fence, or a wall.

 

The professionalization of the Foreign Service, the establishment of a universal requirement for passports, and the institutionalization of visas meant that would-be immigrants had to pass through the gauntlet of US restrictions in their home countries, long before they arrived on US soil. This system allowed prospective immigrants to know whether they were eligible to enter before they got aboard a ship. 

 

After 1924, the job of the immigrant inspector was mainly to inspect documents to make sure papers were in order and not fraudulent. The long lines of medical inspectors ceased to exist at Ellis Island. This more efficient system meant that most exclusion of immigrants was not happening at ports of entry but in far-flung consulates around the globe. 

 

The importance of this extra-territorial inspection was not just that it was more efficient; it also denied would-be immigrants any protections from the US Constitution. While prospective immigrants have few constitutional protections before being admitted, over time the US courts recognized that even undocumented immigrants within the US have rights to due process. By keeping migrants far from US soil, or claiming they have not technically entered even though they are on US soil, the government denied them the possibility of using the US court system to apply for asylum or challenge decisions by consular officials.  

 

Keeping potential asylum seekers off US soil is why Trump demands that Central American migrants remain in Mexico while they await their asylum hearings. It’s also why the US has been paying Mexico hundreds of millions of dollars to detain and deport Central Americans since the mid-2000s. Since 2015, Mexico has deported more Central Americans than the United States, reaching almost 100,000 in 2018.

 

The dramatic rise in visa denials in recent years prevents hundreds of thousands of immigrants and non-immigrant visitors from entering the US.  Consular officials have drastically cut the number of non-immigrant visas issued from almost 11 million in 2015 to just over 9 million in 2018. At the same time, new international students in the US dropped by almost 7 percent in the 2017-18 academic year, and the latest data show signs of a continuing decline. Shuttering overseas immigration services will make it even harder for immigrants to apply for legal entry to the US.

 

Comparing data from 2016 and 2018, analysis by the Cato Institute shows that denials of visas to potential immigrants have increased since Trump took office by more than 37 percent.  In 2018, 150,000 more immigrants were refused visas than in 2016. 

 

While the president wants to focus our attention on the dramatic rise in border apprehensions, reaching 467,000 people last year, more than 620,000 were denied immigrant visas. 

 

Today, the backlog in visa applications stands at 4.1 million worldwide, and for Mexicans it is 1.3 million. The wait time for most Mexicans is thus well over 20 years. When people talk about immigrants getting in line and waiting their turn, they need to recognize that the line has become absurdly long.

 

Consular offices around the world are ground zero for immigration restriction.  No matter what your perspective, it’s time we focused on where most immigration restriction really happens.  

 

The Psychotherapy of Marcus Aurelius

 

Did one of Rome’s wisest and most revered emperors benefit from an ancient precursor of cognitive psychotherapy?

 

The Roman emperor Marcus Aurelius mentions undertaking Stoic “therapy” (therapeia) at the start of The Meditations, his famous journal of personal reflections on philosophy.  He writes, “From Rusticus, I gained the idea that I was in need of correction and therapy for my character.” Junius Rusticus was one of Marcus’ closest and most beloved advisors, a mentor to him in Stoic philosophy, perhaps even serving as a sort of therapist or life coach to the emperor.

 

Marcus mentions that he struggled at first to manage his own feelings of anger with certain individuals, including Rusticus.  There are numerous references to psychological strategies for anger management scattered throughout The Meditations.  It’s a topic to which he keeps returning, at one point listing ten different techniques for overcoming anger.  He describes these Stoic therapy strategies as gifts from Apollo, the god of healing, and his Muses.  For instance, he advises himself to pause when becoming angry with another person and first investigate whether he’s guilty, or at least capable, of similar wrongdoing himself.  Of course, bearing in mind our own imperfections can prevent us from flying into a rage with others and help us move closer toward empathy, understanding, or even forgiveness in some cases. 

 

Marcus also frequently recounts the use of psychological techniques for coping with pain and illness.  Research shows that’s another problem with which cognitive-behavioural therapy (CBT) can help.  Marcus had a reputation for physical frailty and poor health in adulthood.  He particularly suffered from chest and stomach pains, poor appetite, and problems sleeping.   His Stoic mentors taught him to cope, though, by using mental strategies such as contemplating the temporary nature of painful sensations or their limited intensity and location in the body.  Rather than allowing himself to think “I can’t bear it,” he’d focus on his ability to endure pain that was more intense or lasted longer.  He learned to accept painful feelings and other unpleasant symptoms of illness, to adopt a philosophical attitude toward them, and to find more constructive ways of coping. 

 

The concept of philosophy as a medicine for the soul, a talking cure, or a psychological therapy goes back at least as far as Socrates.  However, the Stoics, who were greatly influenced by the practical nature of Socratic ethics, developed this therapeutic aspect of his philosophy even further.  For example, the Roman Stoic teacher Epictetus, whom Marcus greatly admired, taught that, “It is more necessary for the soul to be cured than the body, for it is better to die than to live badly.” He therefore states bluntly, “the philosopher’s school is a doctor’s clinic.” 

 

The Stoics wrote books specifically dedicated to the subject of psychological therapy, such as the Therapeutics of Chrysippus. Although these are now sadly lost, we can perhaps infer something about them from a surviving text by Marcus Aurelius’ famous court physician, Galen, titled On the Diagnosis and Cure of the Soul’s Passions, which outlines an eclectic approach to philosophical psychotherapy but cites earlier Stoic texts as its inspiration.  What we learn is that an aspiring philosopher should seek out an older and wiser mentor, someone he trusts to examine his character and actions, exposing flaws in his thinking through observation and questioning. 

 

It’s no coincidence, therefore, that the pioneers of CBT originally drew on the Stoics for their philosophical inspiration.  Modern cognitive approaches to psychotherapy are based on the premise that our emotions are largely (if not exclusively) determined by our underlying beliefs.  This “cognitive model of emotion” was ultimately derived from the ancient Stoics. Albert Ellis, who created Rational-Emotive Behaviour Therapy (REBT), the first form of CBT, in the 1950s, wrote:

 

This principle, which I have inducted from many psychotherapeutic sessions with scores of patients during the last several years, was originally discovered and stated by the ancient Stoic philosophers, especially Zeno of Citium (the founder of the school), Chrysippus, Panaetius of Rhodes (who introduced Stoicism into Rome), Cicero [sic., actually an Academic philosopher albeit greatly influenced by Stoicism], Seneca, Epictetus, and Marcus Aurelius.  The truths of Stoicism were perhaps best set forth by Epictetus, who in the first century A.D. wrote in the Enchiridion: “Men are disturbed not by things, but by the views which they take of them.” (Ellis, 1962, p. 54)

 

Aaron T. Beck, the founder of cognitive therapy, another form of CBT, repeated this claim with regard to his own approach.  In his first book on the subject he also quoted Marcus Aurelius’ version of the saying above in a more antiquated translation: “If thou art pained by any external thing, it is not the thing that disturbs thee, but thine own judgement about it.” This simple concept has been so influential, both in ancient philosophy and modern psychotherapy, because people find it of practical value.  From the 1950s onward, psychological research increasingly lent support to the techniques of cognitive therapy, and in so doing we might say it indirectly validated the practices of ancient Stoicism.

 

There’s an important difference, though.  CBT is a therapy; Stoicism is a philosophy of life, albeit one containing many therapeutic concepts and techniques. CBT is normally remedial, outcome-oriented, and time-limited.  It treats problems that already exist.  The holy grail of mental health, however, is prevention because, as we all know, prevention is better than cure. Stoicism provided not only a psychological therapy, a remedy for existing problems like anger and depression, but also a set of prophylactic or preventative psychological skills designed to build what psychologists today refer to as long-term emotional resilience.  

 

The historian Cassius Dio, for instance, praises Marcus Aurelius for the remarkable physical and psychological endurance that he showed in the face of great adversity as the result of his lifelong training in Stoicism.

 

[Marcus Aurelius] did not meet with the good fortune that he deserved, for he was not strong in body and was involved in a multitude of troubles throughout practically his entire reign. But for my part, I admire him all the more for this very reason, that amid unusual and extraordinary difficulties he both survived himself and preserved the empire.  (Cassius Dio)

 

Stoic philosophy therefore holds promise today as a means of expanding the findings of CBT beyond the consulting room and the limited duration of a course of psychotherapy.  It can provide a model for applying evidence-based psychological strategies to our daily lives on a permanent and ongoing basis in order to build emotional resilience.  For its modern-day followers, Stoic philosophy has once again become a way of life.

The Holocaust and the Christian World

 

Holocaust scholars were stunned last year by the results of the April 2018 survey of Americans and the Holocaust, according to which 31% of all Americans believe that two million or fewer Jews were killed during the Holocaust, while 41% of Americans cannot say what Auschwitz was. Additionally, 22% of millennials (ages 18-34) “haven’t heard” or “are not sure if they have heard of the Holocaust.” Other survey questions concerning the names of countries where the Holocaust took place, the names of ghettos and concentration camps, and the persistence of antisemitism also yielded low awareness rates. 

 

Simultaneously, a recent FBI report shows that hate crimes in the U.S. spiked 17% in 2017 alone—the third straight rise in as many years. The worst anti-Semitic attack in U.S. history, the murder of eleven people at the Tree of Life Synagogue in Pittsburgh, occurred in October 2018.

 

The arrival of the second edition of The Holocaust and the Christian World: Reflections on the Past, Challenge for the Future, couldn’t be more timely. Its contributors are among the leading Holocaust scholars of their generation. The editors, emblematic of the ecumenical nature of this ethical undertaking, include Carol Rittner, a Catholic nun and distinguished professor emerita of Holocaust and Genocide Studies at Richard Stockton University; Stephen D. Smith, the Protestant co-founder of Beth Shalom, Britain’s first Holocaust Memorial, and current executive director of the USC Shoah Foundation; and Irena Steinfeldt, a Jewish educator and former director of The Righteous Among the Nations Department at Yad Vashem in Jerusalem.

 

The Holocaust and the Christian World is divided into nine sections: Confronting the Holocaust; Chronology, 1932-1998; Anti-Semitism; The Churches and Nazi Persecution; The Reaction of the Churches in Nazi-Occupied Europe; The Vatican, the Pope, and the Persecution of the Jews; The Challenge of the Exception (the Rescuers of Jews); After the Holocaust: How Have Christians Responded? (Activities and Issues); and, finally, the Afterword. Each section is made up of several short, stimulating articles that go directly to the issue at hand and offer suggestions for further reading as well as questions for reflection.  For example: 

 

--“What would have happened if the Churches—Protestant and Catholic alike—had defied Hitler during the Third Reich and stood in solidarity with the Jews?”

--“When does silence become an active form of collaboration?”

--“What should it mean—and not mean—to be a post-Holocaust Christian?”

--“How can we help people to develop faith without prejudice?”

--“What obligation do we have to stand up for people whose beliefs we do not share?” 

--“Who is part of your universe of obligation today?”

 

The Afterword  contains documents related to church matters during the Holocaust from Norway, Greece, France, and Denmark; post-Holocaust statements from the churches in Switzerland, Rome, United States, Hungary, Germany, Poland, France, and Austria; the text of the March 1998 Vatican Statement, “We Remember: A Reflection on the Shoah;” an updated videography and detailed list of on-line sources; and, finally, a select bibliography not contained in the first edition with appropriate entries through 2017.

 

Two impulses drive the text from beginning to end: the frank admission of the role played by Christianity in the Holocaust and the current project of completely ridding Christianity of all anti-Judaism. Carol Rittner and John Roth elucidate the history and Christian roots of anti-Semitism (“the longest hatred of human history”) found in the New Testament and in such early Church Fathers as Saint Augustine and Saint John Chrysostom, as well as in later Christian preachers and theologians such as Bernard of Clairvaux and Martin Luther. The authors describe the institutional anti-Judaism of Christian churches, the negative depiction of the Jewish people in Christian preaching and liturgy, and the process by which the Jew became “the other”—“marginalized, persecuted, blamed for every woe, from unemployment and slums, to military defeats and unsolved murders.” In addition, they present a chilling chart that lists Nazi measures on the one hand and prior canonical laws on the other. For example: Nazi Measure: Law for the Protection of German Blood and Honor, September 15, 1935 (Canonical Law: Prohibition of intermarriage and of sexual intercourse between Christians and Jews, Synod of Elvira, A.D. 306); Nazi Measure: Book Burnings in Nazi Germany (Canonical Law: Burning of the Talmud and other books, Twelfth Synod of Toledo, 681); Nazi Measure: Decree of September 1, 1941—the Yellow Star (Canonical Law: The marking of Jewish clothes with a badge, Fourth Lateran Council, 1215).

 

After centuries of being “cast outside the universe of moral obligation,” it is not surprising that most churches and most Christians were indifferent to the fate of Jews during the Nazi plague. In the words of Pope John Paul II, “their history had ‘lulled’ their consciences.” Although this volume clearly states that Christianity cannot be seen as the cause of the Holocaust, it does convince the reader that Christianity prepared the way and then allowed it to happen. As a result, the authors accept the Shoah as part of Christian history. The enormity of Christian responsibility means that the Holocaust can no longer be conceived of as solely what happened to the Jewish people but rather what also happened to Christians who claimed to be disciples of a Jew named Jesus. 

 

Having clearly established the anti-Jewish bias of traditional Christianity, the text then moves to the contemporary task of ridding Christianity of its anti-Judaism. It explains what has been done since 1945 and what still needs to be done. The book’s authors offer several strategies to strengthen the dialogue between Christians and Jews. For example, Stephen Smith wants Christians to take an active part in the remembrance of the Shoah, given that the perpetrators, collaborators, and bystanders were not Jewish. To foster Tikkun (healing), Marcia Sachs Littell stresses the importance of developing a Christian liturgy on the Holocaust and the faithful observance of Yom HaShoah (the Day of Holocaust Remembrance) by the joint Christian-Jewish community.  Michael Phayer, Carol Rittner, and Isabel Wollaston call for the development of Holocaust education and courses in Hebrew Scripture and post-World War II Jewish-Christian relations in Christian seminaries, colleges, and universities. They also suggest a moratorium on terms such as “Old Testament,” which implicitly connote Jewish displacement. 

 

Although the contributors to this book do not hesitate to analyze factors that led to specific historical events, even identifying particular individuals in positions of responsibility, The Holocaust and the Christian World is never excessively accusatory. To learn retrospectively what should have been done by churches and individual Christians during the Holocaust is not tantamount to knowing what we would have done had we been in their place. This wisdom permeates the text whose authors recognize that their responsibility and ours lies in the present, in the creation of a world where another Auschwitz would be unthinkable. 

How Hollywood has been fooled by Robert F. Kennedy assassination conspiracy theorists

 

Renewed interest in the Robert Kennedy assassination flourished on its 50th anniversary in 2018 and the following year, with the publication of two books about the case, A Lie Too Big to Fail by Lisa Pease and The Assassination of Robert F. Kennedy by Tim Tate and Brad Johnson. Both books allege that RFK’s assassin was a hypnotised man manipulated by the CIA who had no real motive and was thus innocent of the crime. (1)

 

The falsehoods promoted by Pease, Tate and Johnson – a ‘girl in a polka dot dress’ controlling Sirhan, misinterpretations of the ballistics evidence, the creation of suspicion around LAPD crime scene mistakes, multiple teams of CIA-controlled assassins skulking around the Los Angeles Ambassador Hotel, allegations that Sirhan was never close enough to RFK to fire the fatal shot, allegations of Sirhan firing ‘blanks’, accusations against an innocent security guard, Thane Eugene Cesar – have all been addressed and debunked over the years (most recently here: https://historynewsnetwork.org/article/169208 and here: https://www.moldea.com/RFKcase.html).

 

The authors of the books were lately given moral support by a group of Hollywood celebrities including Oliver Stone, Alec Baldwin, Martin Sheen, Rob Reiner, David Crosby, Mort Sahl, and two Kennedy family members, Robert F. Kennedy Jr. and Kathleen Kennedy Townsend. Following publicity about the books, the group called for a new investigation of the assassination. (The group also alleged that the other political assassinations of the 1960s, those of JFK, MLK, and Malcolm X, involved government malfeasance and cover-up, and wanted the government to re-investigate those crimes as well.) (2)

 

It is no mystery why some Hollywood celebrities support the notion of an innocent Palestinian refugee railroaded in a notorious murder case. Many in Hollywood have endorsed and embraced the Palestinian cause, mimicking the American left’s decades-old support. Unable to gauge Sirhan’s true character by reading the recent conspiracy books, they would naturally assume Sirhan mysteriously acted without a motive.

 

The recent conspiracy authors adopt the modus operandi of previous RFK conspiracy authors in the way they attempt to portray Sirhan as a young man who had no real political agenda or any fanaticism. Tate and Johnson inform their readers that “Sirhan’s closest friend, Ivan Garcia, had explicitly told them that ‘Sirhan did not appear to be particularly aware of any political party, was not interested in groups or being a leader and was not openly fanatical about politics.’” (3) Lisa Pease cites acquaintances of Sirhan who described him as polite and non-violent; “…. nearly everyone,” she writes, “described Sirhan as polite, respectful and friendly…Sirhan did not appear to be particularly aware of any political party, was not interested in groups or being a leader and was not openly fanatical about politics.” (4)

 

Although many falsehoods about the case have been debunked over the years, this crucial examination of Sirhan’s motives has been largely ignored or overlooked by the mainstream media – motives which convincingly and conclusively show that Sirhan was not only a political fanatic but also embraced the concept of violent solutions to political problems. 

 

Although some acquaintances of Sirhan said he was ‘pleasant and well-mannered’ and ‘non-political’, these are not the lasting impressions of those who knew him best. His brother Munir said Sirhan was ‘stubborn’ and had ‘tantrums’. (5) William A. Spaniard, a twenty-four-year-old Pasadena friend of Sirhan’s, said the young Palestinian was “a taciturn individual.” (6) Fellow students characterized Sirhan as not only ‘taciturn’ but also ‘surly’, ‘hard to get to know’, and ‘withdrawn and alone’. (7) One of his professors saw Sirhan and another student have an argument that “almost became a fist fight”. He said Sirhan had “an almost uncontrollable temper”. (8) 

 

Sirhan also revealed the violent side of his character when he was employed as an exercise boy/trainee jockey. According to two exercise girls who worked at the Grande Vista Ranch, Sirhan treated the horses ‘cruelly’. Del Mar Race Track foreman Larry Peters saw Sirhan kick a horse in the belly, and after he remonstrated with him he was taken aback at the vitriol which emanated from the young employee. Peters said Sirhan's temper had been unusually violent when he was told he would never become a jockey. (9)

 

Additionally, a horse trainer at the Grande Vista Ranch saw Sirhan mistreat a horse, “…kicking and hitting it with his fists”. Sirhan, he said, “…was in a rage of temper”. By way of explanation, Sirhan told him the horse “provoked him”. (10) In fact, Sirhan used the same excuse at his trial when he testified that Robert Kennedy, by his support of Israel, had ‘provoked’ him, which led to his decision to assassinate the senator. (11)

 

As a young adult, Sirhan sought meaning in his increasingly hopeless life by embracing anti-Semitism, anti-Americanism, and Palestinian nationalism. Sirhan’s parents taught him the Jews were ‘evil’ and ‘stole their home’. They also taught him to hate, despise, and fear Jews. As a part-time gardener, Sirhan came to hate the Jews whose gardens he tended. (12)

 

Among those who knew Sirhan well and described him were his friends Walter Crowe, Lou Shelby, and John and Patricia Strathmann, as well as his former boss John Weidner. They all agreed that Sirhan hated Jews and became intense and emotional whenever he discussed the Arab-Israeli conflict. They all agreed he was vehemently critical of American foreign policy regarding Israel. 

 

Walter Crowe had known Sirhan from the time they were young adults and also during a short period of time when Sirhan was a Pasadena College student. Crowe said Sirhan was virulently anti-Semitic and professed hatred for the Jews and the state of Israel. He believed Sirhan’s mother Mary propagated these views to Sirhan. (13)

 

Lou Shelby, the Lebanese-American owner of the Fez Supper Club in Hollywood, knew the Sirhan family intimately. He described Sirhan as, “intensely nationalistic with regard to his Arab identity”. According to Shelby, “We had a really big argument on Middle East politics...we switched back and forth between Arabic and English. Sirhan’s outlook was completely Arab nationalist - the Arabs were in the right and had made no mistakes”. (14)

 

John and Patricia Strathmann had been ‘good friends’ with Sirhan since High School. According to John, Sirhan was an admirer of Hitler, especially his treatment of the Jews, and was impressed with Hitler’s Mein Kampf. John also said Sirhan became ‘intense’ and ‘mad’ about the Arab/Israeli Six Day War. Patricia said Sirhan became, “burning mad . . . furious” about the war. (15) 

 

Sirhan discussed politics, religion, and philosophy with his boss, John Weidner, a committed Christian. Weidner was honoured by Israel for his heroism in saving more than 1,000 people from the Nazis. Sirhan worked for Weidner from September 1967 to March 1968. According to Weidner, Sirhan ‘hated Jews’. (16)

 

Sirhan was not only anti-Semitic in his political views but believed in violent action as a political tool. He admired the Black Panthers and even wanted to join their organisation. According to his brother Munir, Sirhan also became enamoured with the Black Muslims, who “were like him culturally”. Sirhan attended the Black Muslim Temple in Central Los Angeles until he was told he could not join the organization because he was not black. (17)

 

The notion that Sirhan never held any animus towards Robert Kennedy is also entirely without foundation as friends and Sirhan himself have revealed. Sirhan said he believed Robert Kennedy listened to the Jews, and he saw the senator as having sold out to them. (18) 

 

Sirhan also expressed hatred for Robert Kennedy to John Shear, an assistant to trainer Gordon Bowsher at the Santa Anita Racetrack. Shear recalled that the newly hired Sirhan heard a co-worker read aloud a newspaper account of Robert Kennedy recommending the allocation of arms to Israel. “Sol (Sirhan) just went crazy,” Shear said. “He was normally very quiet, but he just went into a rage when he heard the story.” (19)

 

Sirhan thought RFK would be, “like his brother,” the president, and help the Arabs but, “Hell, he f….. up. That’s all he did. . .. He asked for it. He should have been smarter than that. You know, the Arabs had emotions. He knew how they felt about it. But, hell, he didn’t have to come out right at the f…… time when the Arab-Israeli war erupted. Oh! I couldn’t take it! I couldn’t take it!” (20)

 

Despite protestations to the contrary Sirhan had clear and defined motives in wanting to murder Robert F. Kennedy and the hatred that spewed forth from his gun can ultimately be traced back to one cause - Palestinian nationalism. 

 

 

Notes

 

1. The Assassination of Robert F. Kennedy by Tim Tate and Brad Johnson, Thistle Publishing, 2018

 

A Lie Too Big To Fail by Lisa Pease, Feral House, 2018

 

2. “Kennedy, King, Malcolm X relatives and scholars seek new assassination probes,” by Tom Jackman, 25 January 2019, https://www.washingtonpost.com/history/2019/01/25/kennedy-king-malcolm-x-relatives-scholars-seek-new-assassination-probes/?utm_term=.3ff4a5a06d2a

 

3. Tate and Johnson, Kindle edition, 2018, location 1678

 

4. Lisa Pease, A Lie Too Big To Fail, Feral House, 2018, 126–127

 

5. Robert A. Houghton, Special Unit Senator: The Investigation of the Assassination of Senator Robert F. Kennedy, Random House, New York, 1970, 181

 

6. Francine Klagsbrun and David C. Whitney, eds., Assassination: Robert F. Kennedy, 1925–1968, New York: Cowles, 1968, 109

 

7. Godfrey H. Jansen, Why Robert Kennedy Was Killed: The Story of Two Victims, New York, Third Press, 1970, 121–123

 

8. FBI Airtel to LA from San Francisco, Kensalt, Interview with Assistant Professor Lowell J. Bean, 21 June 1968

 

9.  Houghton, 191

 

10. FBI Kensalt Files, Interviews, 7 June 1968, Inglewood, CA, LA 56-156, and 8 June 1968, Corona, CA, LA 56-156

 

11. John Seigenthaler, Search For Justice, Aurora, 1971, 256

 

12. See The Forgotten Terrorist, Chapter 3, “Sirhan and Palestine”; Klagsbrun and Whitney, Assassination, 110

 

13. Houghton, 165

 

14. Jansen, 138–139

 

15. Houghton, 231–232

 

16. Jansen, 135

 

17. Kaiser, 214

 

18. Seigenthaler, 295

 

19. Larry Bortstein, “Guard Has a Leg Up on Opening Day,” OC Register, December 24, 2006, http://www.ocregister.com/ocregister/sports/other/article_1397207.php

 

20. Robert Kaiser, RFK Must Die, E. P. Dutton, 1970, 270

 

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171610 https://historynewsnetwork.org/article/171610 0
The Weakness of Democracy

 

Donald Trump is the most dishonest and most ignorant president in living memory, perhaps in American history. With his disdain for fundamental elements of democratic practice, such as freedom of the press and separation of powers, he is a danger to our American democracy.

 

But his election and the continued support he receives from a significant minority of voters are themselves symptoms of weaknesses which seem to be inherent in modern democracy itself. When we extend our gaze beyond the US, we can more easily perceive that democracy often works badly. I am not talking about fake democracies, where there is voting but no choice, as in the Soviet Union and the states it controlled. Even in countries where there is real opposition and secret ballots, voting can produce terrible results.

 

Venezuela, currently suffering a constitutional and humanitarian crisis, appears to have a functioning democracy, but the system has been rigged in favor of Nicolás Maduro, the successor of Hugo Chavez. Physical attacks on and arrests of opposition leaders, banning of opposition parties, sudden changes in the date of the election, and vote buying helped produce a victory for Maduro in 2018.

 

Algeria is currently experiencing a popular revolt against the elected president Abdelaziz Bouteflika, who was first elected in 1999, when the five other candidates withdrew just before the vote. He was re-elected in 2004, 2009, and 2014, and had announced he would run again this year, until massive protests forced him to withdraw as a candidate. He is very ill and has not said a word in public since 2013. His power has been based on military control, corruption, voting manipulation, and extensive use of bribery to create supporters and discourage opposition. The rebels are calling for an overthrow of the whole system.

 

These two cases are exceptional: the illusion of democracy hid authoritarian reality where democracy had never achieved a foothold. Much more common over the past two decades has been a gradual decline of existing democracies across the world, a process which could be called autocratization. A recent study shows that gradual autocratization has weakened democracies, in places as diverse as Hungary, Turkey and India. By extending government control of media, restricting free association, and weakening official bodies which oversee elections, modern autocrats can undermine democracy without a sudden coup. The authors argue with extensive data that the world has been undergoing a third wave of autocratization covering 47 countries over the last 25 years, after the first two waves in the 1930s and in the 1960s and 1970s.

 

The efforts of would-be autocrats to maintain their power by restricting democracy discourage trust in democracy itself. Nearly three-quarters of voters in Latin America are dissatisfied with democracy, according to a survey in 18 countries by Latinobarómetro, the highest number since 1995.

 

This is the context for the current failures of democracy in the United States (Trump) and Great Britain (Brexit). What can explain these failures? Physical coercion of political opponents is nearly non-existent. Corruption and voter suppression certainly play a role, at least in the US, but probably not a decisive one. Voters were overwhelmingly free to choose. Why did so many make such bad choices? I believe that conservative politicians in both countries used carefully chosen political tactics to appeal to widespread voter dissatisfaction. Those tactics are fundamentally dishonest, in that they promised outcomes that were impossible (Brexit) or were not actually going to be pursued (better health care than Obamacare). White voters made uncomfortable by the increasingly equal treatment of women and minorities were persuaded that it was possible and desirable to return to white male supremacy.

 

Voters made poor choices, even by their own professed desires. There is a dangerous disconnect between the voting preferences of many Americans and their evaluations of American political realities. A survey by the Pew Research Center at the end of 2018 offers some insight into the fundamental weakness of American democracy. A wide bipartisan majority of 73% think the gap between rich and poor will grow over the next 30 years. Two-thirds think the partisan political divide will get wider and 59% believe the environment will be worse. Only 16% believe that Social Security will continue to provide benefits at current levels when they retire, and 42% think there will be no benefits at all. Nearly half say that the average family’s standard of living will decline, and only 20% believe it will improve. These are not just the views of liberals. 68% of Republicans say that no cuts should be made to Social Security in the future. 40% say that the government should be mostly responsible for paying for long-term health care for older Americans in the future.

 

Yet when asked about their top political priorities, Republicans offer ideas which don’t match their worries about the future. Their three top priorities for improving the quality of life for future generations are reducing the number of undocumented immigrants; reducing the national debt; and avoiding tax increases. The richer a Republican voter is, the less likely they are to want to spend any money to deal with America’s problems. Republicans with family incomes under $30,000 have a top priority of more spending on affordable health care for all (62%) and on Social Security, Medicare and Medicaid (50%), while those with family incomes over $75,000 give these a much lower priority. 39% of poorer Republicans say a top priority is reducing the income gap, but that is true for only 13% of richer Republicans. Republican politicians follow the preferences of the richest Republican voters, but that doesn’t seem to affect the voting patterns of the rest.

 

Nostalgia for the “whites only” society of the past also pushes Americans into the Republican Party. About three-quarters of those who think that having a non-white majority in 2050 will be “bad for the country” are Republicans.

 

A significant problem appears to be ignorance, not just of Trump, but also of his voters. Many are ignorant about the news which swirls around us every day. A poll taken last week by USA Today and Suffolk University shows that 8% of Americans don’t know who Robert Mueller is.

 

But much of the ignorance on the right is self-willed. Only 19% of self-identified Republicans say the news media will have a positive impact in solving America’s problems. Only 15% are “very worried” about climate change and 22% are not worried at all. Despite the multiple decisions that juries have made about the guilt of Trump’s closest advisors, one-third of Americans have little or no trust in Mueller’s investigation and half agree that the investigation is a “witch hunt”. Despite the avalanche of news about Trump’s lies, frauds, tax evasions, and more lies, 27% “strongly approve” of the job he is doing as President, and another 21% “approve”. 39% would vote for him again in 2020.

 

Peter Baker of the NY Times reports that “the sheer volume of allegations lodged against Mr. Trump and his circle defies historical parallel.” Yet the percentage of Americans who approve of Trump is nearly exactly the same as it was two years ago.

 

Ignorance and illogic afflict more than just conservatives. The patriotic halo around the military leads Americans of both parties to political illusions. 72% of adults think the military will have a positive impact on solving our biggest problems, and that rises to 80% of those over 50.

 

The British writer Sam Byers bemoans his fellow citizens’ retreat into national pride as their political system gives ample demonstration that pride is unwarranted. His words apply to our situation as well. He sees around him a “whitewash of poisonous nostalgia”, “a haunted dreamscape of collective dementia”. He believes that “nostalgia, exceptionalism and a xenophobic failure of the collective imagination have undone us”, leading to “a moment of deep and lasting national shame”.

 

One well-known definition of democracy involves a set of basic characteristics: universal suffrage, officials elected in free and fair elections, freedom of speech, access to sources of information outside of the government, and freedom of association.

 

We have seen some of these attributes be violated recently in the United States. Republican state governments have tried to reverse electoral losses by reducing the powers of newly elected Democratic governors. Trump, following the lead of many others, has urged Americans to ignore the free press and to substitute information that comes from him. Many states have tried to restrict the suffrage through a variety of tactics.

 

Across the world, democracy is under attack from within. Winston Churchill wrote, “it has been said that democracy is the worst form of Government except for all those other forms that have been tried”. Unless we want to try one of those other forms, we need to fight against autocratization, at home and abroad.

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/blog/154196 https://historynewsnetwork.org/blog/154196 0
Roundup Top 10!  

How New York’s new monument whitewashes the women’s rights movement

by Martha S. Jones

It offers a narrow vision of the activists who fought for equality.

 

Why Trump’s recognition of the Golan Heights as Israeli territory is significant

by Dina Badie

Given the dimensions of America’s global influence, U.S. recognition could lend some legitimacy to Israel’s controversial annexation policy.

 

 

What Mueller's probe has already revealed about Trump

by Julian Zelizer

The price of this entire process has already been high.

 

 

Why Historians Are Like Tax Collectors

by Matthew Gabriele

Studying the past - and then, importantly, talking about it with an audience - is about revealing the mess behind the myth, the story behind what we think we know.

 

 

An Economist with a Heart

by Hedrick Smith

Alan Krueger, the Princeton professor and economic adviser to two presidents who died last weekend, was one of those rare economists who break out of the ivory tower and plunge restlessly into the real world.

 

 

The New Zealand Shooting and the eternal fear of “race suicide”

by Jonathon Zimmerman

Put simply, the fear of being flooded by foreign hordes is baked into our national DNA. And it all starts with the question of fertility.

 

 

The danger of denying black Americans political rights

by Kellie Carter Jackson

Without access to political rights, violence becomes a crucial tool in the fight for freedom.

 

 

Turning Our Backs on Nuremberg

by Rebecca Gordon

John Bolton and Mike Pompeo Defy the International Criminal Court

 

 

How Experts and Their Facts Created Immigration Restriction

by Katherine Benton-Cohen

Facts have a history, and we ought to admit it.

 

 

Democrats’ Voting-Rights Push Could Begin a Third Reconstruction

by Ed Kilgore

As in the Reconstruction era after the Civil War, one party is committed to the use of federal power to vindicate voting rights, and the other is opposed.

 

 

Democrats are holding their convention in Milwaukee. The city’s socialist past is an asset.

by Tula Connell

With the announcement of Milwaukee as the site of the 2020 Democratic National Convention, political opponents wasted no time in raising the specter of the city’s socialist past.

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171608 https://historynewsnetwork.org/article/171608 0
A Heartwarming Lost Chapter on Immigrants Emerges

 

From 1907 to 1911, the U.S. Government sponsored the Galveston Movement, a massive effort to direct arriving Jewish immigrants away from New York City and other East Coast ports, all overcrowded with immigrants and their families, and bring them to cities on the Gulf of Mexico, primarily Galveston, Texas. The government wanted to populate the Southern and Western states with immigrants as well as the Atlantic Seaboard. At first, it worked. Nearly 1,000 immigrants, mostly Jewish people fleeing from the pogroms of Russia at that time, moved to the Galveston area and were gradually assimilated into the city and its suburbs. The movement had problems, though. Jews refused to work on Saturday, annoying their employers. There was some anti-Semitism. Low paid Texas workers complained that the Jews took their jobs. There was not a big Jewish community in Galveston to embrace newly arrived Jews, as there was in cities like New York. The new arrivals encountered many of the same problems that confront Jewish, and other, immigrants today. The movement shut down in 1912.

Among those Jews who did move to Galveston was the Russian-born Haskell Harelik, who spent his life there. His story is now being told by his grandson, playwright Mark Harelik, in The Immigrant. It is a heartwarming, engaging and thoroughly lovable story not just about the Jews, but about the Texans who befriended them and, like so many Americans, helped them to become Americans themselves. The play just opened at the George Street Playhouse in New Brunswick, N.J.

The play starts with the impressive display of dozens of huge black and white photos of the Galveston immigrants and what their lives were like in those years. Black and white pictures appear from time to time in the play, at just the right times, to help tell the story. As the play actually starts, we meet Haskell Harelik. Harelik, a charming, affable young man, arrived in Galveston by ship in 1909, leaving his parents behind in Russia. He had nothing. Milton and Ima Perry, a Galveston couple, take him in, renting him a room in their house, and help him start a banana sales business, which he runs out of an old wooden cart.

Young Harelik, a hard worker, soon builds the banana trade into a produce store and then a larger dry goods store. His wife arrives from Russia to join him and they have three children. They become patriotic Americans and his three sons all fight for the U.S. in World War II.

Harelik has his struggles, though. Angry residents of a nearby town shoot at him when he visits there  with his banana cart. Others scorn him. Many ignore him. Eventually, though, he succeeds.

Playwright Harelik does not just tell his grandfather’s personal story in The Immigrant; he tells the story, in one way or another, of all immigrants. They all faced the same difficulties upon arrival in America and, in some way, overcame their problems and were assimilated. This is a story of triumph, not just for Harelik, but for all the immigrants who came to America over all the years. It is a reminder, too, to those on both sides of the immigration wars today, that the entry of foreigners into America, however they got here, was always controversial.

There are wonderful scenes in the play, such as those when a thrilled Harelik carries his newborn babies out of his house and lays them on the ground so that they become part of America. Then, years later, he names his baby after his friend Milt, and Milt happily carries him out of the house and lays him on the ground.

There is the story of the first Shabbat, a Jewish Holy Day, when Haskell and his wife invite Milt and his adorable wife Ima to their home.

It is the story of Milt and Ima Perry, too. One of their two children died quite young and the other ran away from home and was rarely heard from. They battle each other, and the Hareliks, and townspeople, from time to time, like we all do. Their story is the story of Texans, and Americans, embracing, with problems, these new immigrants.

The play succeeds, mostly, because of the mesmerizing acting of Benjamin Pelteson as Haskell. He is funny, he is sad, he is exuberant. You cheer for him and cry for him.  Director Jim Jack, who did superb work on the drama, also gets outstanding performances from R. Ward Duffy as Milt, Gretchen Hall as Milt’s wife Ima, and Lauriel Friedman as Haskell’s wife Leah.

There are some gaps in the play. We don’t know if Harelik spoke English when he arrived in Galveston or whether he learned it here. We know very little about the story of his wife Leah or the troubles his kids might have had in school. All of that, of course, would require a 450-hour play. The drama in this one is good enough.

Haskell and his family were assimilated into Galveston life, his business did succeed and they made friends. It was an American dream for them.

PRODUCTION: The play is produced by the George Street Playhouse. Scenic Design: Jason Simms, Costumes: Asta Bennie Hostetter, Lighting: Christopher J. Bailey, Sound: Christopher Peifer, Projection Design: Kate Hevner. The play is directed by Jim Jack. It runs through April 7. 

    

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171594 https://historynewsnetwork.org/article/171594 0
A Scorching Look at a World War II Jewish Ghetto in Poland

 

Ira Fuchs’ powerful new play about a Jewish Ghetto in World War II, Vilna, starts off with a small crime, the bribing of a Nazi official with a bottle of liquor by a Jewish woman doctor, and ends in one of the greatest crimes in human history, the extermination of 60,000 Jews in the Polish city of Vilna, part of the mass murder of six million Jews throughout Europe.

Fuchs’ stellar play opened last week at the Theatre at St. Clements on W. 46th Street, New York. It is a deep, rich and thoroughly frightening story of the Jews’ battle for survival in the city of Vilna, their faith in God and in each other, as truckload after truckload of them are taken to the Ponary forest outside the city, where they are lined up and shot to death by firing squads.

The story of the ghetto in this Polish city resembles many other stories in plays, films and novels. There is nothing terribly new about it. The heroes, resistance leaders, and the villains, the Nazis and collaborating local government officials, are the same. The difference, and what makes Vilna so outstanding, is the graphic violence, cruel and heart-stopping, plus just tremendous acting by an all-star cast, whose work you will remember for quite a long time.

The play tells a true story, although some scenes have been invented; they resemble similar scenes at other ghettos and execution sites. Vilna was home to more than 80,000 Jews within a larger population. The Jews were the backbone of a large cultural community that included theaters, symphonies and operas. It was an Emerald City for Jews in Europe. Starting in 1941, the Nazis, who occupied the city, began forcing the Jews into a ghetto, much as they did in Warsaw and other cities, denying them health care, food and sanitation. The Nazis established a Judenrat, or central committee, of Jewish community leaders to run the ghetto. Then, systematically, they began to remove hundreds of Jews each day and murder them at a forest concentration camp (a total of 100,000 people, Jews and others, were executed there).

Vilna is the story of the Judenrat and its members, and the members of their families. It is also the story of Motko Zeidel and Yuri Farber from their high school days in 1926 to the end of the war. Through their story, playwright Fuchs tells the tale of all the Jews in Vilna and tells it well.

Motko and Yuri wind up working for Jacob Gens, a Jewish hospital director and father of a teenage girl. They do fine work running the Ghetto – curbing diseases and preventing starvation and crime – until the Nazis tell them that it is their job to decide who shall be murdered and who shall survive (in the end, nobody survived). They make heartbreaking decisions and suffer for them.

The play, deftly directed by Joseph Discher, is carried by its actors.

Sean Hudock, who plays Motko, and Seamus Mulcahy, who plays Yuri, are marvelous in their roles, especially towards the end of the play. There is a scene in which Motko tells his father about his job of deciding who lives; his dad, really shaken, rubs the side of his face with his hand as Motko starts crying and his body stiffens. It is a striking moment.

There is another moment when the Nazis discover that Yuri changed architectural plans for the execution grounds to save Jews. He shakes with fear because he thinks they are going to hang him for his transgressions. It is one of the best moments in theater I have ever seen. Nathan Kaufman is impressive as Judenrat leader Jacob Gens. There is a scene where he is having dinner with the Nazi chief and time after time quivers, his eyes frozen wide and his jaw trembling, as the German tells him of the awful things he has to do to his own people.

The violence is disturbing but helps make the play as powerful as it is. At one point, a man on a platform is shot and there is a huge BANG in the air. Everybody in the theater was shocked. That, though, is the least violent act in the play. You sit there and shudder when you watch the violence, and know that it happened in Vilna, and in other Jewish ghettos, again and again and again.

One of the wonders of the play is the superlative acting by minor characters. Zeidel’s mom, the Jewish doctor, played wonderfully by Cary Van Driest, is brilliant as Jew, Mom and physician. His father, the elderly Josef Zeidel, is played with both anger and tenderness by Mark Jacoby. The Nazi leader, Bruno Kittel, is one of the most vile, vicious stage characters I have ever seen. You can almost feel the floor of the theater wobble when he struts across the stage and you can feel the air in the theater move when he waves his arms hysterically.

Their performances make the play. Others in this fine cast include James Michael Reilly as the director of an engineering firm, Brian Cade as Martin Weiss, the chief Nazi and Paul Cooper as Kittel.

A few summers ago, I visited a concentration camp outside of Hamburg, Germany. It was set, like so many were, in a beautiful forest with nearby lakes, streams, clusters of swaying green trees and sweet-smelling flowers. You stand there and visualize the horror of what happened and ask yourself how human beings could do such things to other human beings. You think the same thing as you watch Vilna. How could this possibly have taken place?

…never again… 

 

Production: The play is produced by the Theatre at St. Clements. Scenic Design: Brittany Vasta, Lighting: Harry Feiner, Costumes: Devon Painter, Sound: Jane Shaw.

 The play is directed by Joseph Discher. It runs through April 11.

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171595 https://historynewsnetwork.org/article/171595 0
America, Palestine, and UNRWA: A History of Self-interest

 

In the fall of 2018, Donald Trump announced he would withdraw from the American commitment to provide some $300 million to the United Nations Relief and Works Agency. UNRWA was set up in 1949 as the only UN institution dealing solely with the Palestinian refugee crisis. It has since become one of the UN’s largest organizations, employing some 30,000 people and operating more than 700 schools for Palestinian refugees as well as a wide variety of other social services in camps and settlements across the Arab world. By its own accounting, its services reach more than 3.5 million people, a majority of the 5.1 million registered “Palestine refugees” on the UN’s books. Largely as a consequence of Trump’s decision to withhold funding, UNRWA is currently facing a major financial crisis that has brought into question whether many of the schools, clinics, and supply centers it runs will reopen in the fall of 2019.

 

Trump claimed he curtailed UNRWA funding because the organization was a “failed mechanism” (according to national security adviser John Bolton) that was contributing to the ongoing political stalemate between Israel and the Palestinians. Prodded by his Middle East negotiator Jared Kushner, Trump appears to have hoped that the withdrawal of funds from UNRWA would press the Palestinians to drop their claim to the right of return that they have consistently made in negotiations with Israel since the 1948 war. When asked about the change in policy, then-ambassador Nikki Haley defended the stance as part of Trump’s “America first” approach: “First of all, you’re looking at the fact that there’s an endless number of refugees that continue to get assistance. But more importantly, the Palestinians continue to bash America.”

 

But UNRWA began as essentially an American project, designed to ensure that the Palestinian refugee crisis would not disrupt the development and operation of American interests (including, crucially, oil investments) across the Middle East. From its founding in 1949, American thinkers and bureaucrats saw in it an opportunity both to physically confine the refugees and to spread American ideas in the Middle East. In 1949 the former U.S. Tennessee Valley Authority director Gordon Clapp authored a UN report that claimed Palestinian refugees could serve as an entry point for Western, and especially American, political and economic influence across the region: “The administration of the relief and public works programme for refugees… can, in the considered judgment of the Economic Survey Mission, become a contributing factor for peace and economic stability in the Near East.” 

 

Clapp proposed a series of land development schemes, backed by American private investment, that would simultaneously create livable space for the refugees in Jordan and Gaza (thus preempting their uncontrolled movement into Israel or around the region) and offer venues for American money, advice, and interests to infiltrate the Middle East. For the Eisenhower administration, it seemed that UNRWA represented an opportunity to press for a new US-led political and economic order across the region. 

 

American personnel and interests dominated UNRWA in its first decades; the US represented UNRWA’s single most important financial backer and in some years provided as much as seventy percent of its funding. In the 1950s and 1960s, successive American administrations believed UNRWA’s schemes of refugee relief could simultaneously prevent Palestinian support for Communism and serve as  economic and political leverage with their Arab host states. Assistant Secretary of State George McGhee under President Truman put the matter clearly in a statement to the House Committee on Foreign Affairs in 1950: “[The Palestinian refugees] will continue to serve as a natural focal point for exploitation by Communist and disruptive elements which neither we nor the Near Eastern governments can afford to ignore. … The presence of three-quarters of a million idle, destitute people – a number greater than the combined strength of all the standing armies of the Near East – whose discontent increases with the passage of time, is the greatest threat to the security of the area which now exists.” By 1961, the State Department was defending its policy of supporting UNRWA by pointing out that at a cost of nine cents a day it had been “remarkably successful in keeping the potentially explosive refugee problem under control.” 

 

John Davis, the American head of UNRWA in the early 1960s, summed up what the organization was doing even more succinctly: “UNRWA was one of the prices – and perhaps the cheapest – that the international community was paying for not having been able to solve with equity the political problems of the refugee.” He added, “It was surely well worth the cost.” In the decades since, this political calculation continued to form the backbone of American support for UNRWA. American leaders were not interested in it as a crucial humanitarian obligation, but as a security mechanism. In the absence of an institutional frame keeping Palestinian refugees in confined spaces, American leaders believed, Palestinians could rapidly cause political chaos, threatening not only the security of Israel but the stability of the region in general – including American access to Gulf oil piped through the refugee-heavy regions of Jordan, Lebanon, and Syria. 

 

For decades, support for UNRWA was conceived of as an act of American self-interest, not of Palestinian relief. The current administration’s reversal of this longstanding approach suggests less a changed set of American political alignments in Palestine/Israel than an unwillingness – or incapacity – to see the longstanding realpolitik calculations behind an apparent commitment to humanitarian aid. And perhaps it won’t matter much in the end, because there are other actors who worry that the dissolution of UNRWA would threaten their political stability and economic interests, and are willing to shell out to avoid such a risk. At last accounting, UNRWA commissioner general Pierre Krahenbuhl reported that the $300 million shortfall caused by the US withdrawal of funds had been reduced to about $21 million, thanks to a series of emergency donations. He wished especially, he noted in a press conference, to thank the four donors who had collectively nearly made up the difference: Kuwait, Qatar, the United Arab Emirates, and Saudi Arabia.

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171539 https://historynewsnetwork.org/article/171539 0
What Does William Barr Have to Do With Iran Contra?

Donald Trump’s nomination of William Barr to become attorney general has recast the spotlight on the presidency of George H.W. Bush. Barr served as attorney general in the Bush administration from late 1991 to early 1993. Most notably, Barr railed publicly against a long-running independent counsel investigation of the Reagan-Bush administration, and he fully supported President Bush’s last-minute pardon of Caspar Weinberger, Reagan’s former defense secretary. Weinberger had been indicted on five felony charges, including accusations that he obstructed federal investigations and lied to Congress about the Iran-Contra affair.

In the wake of Bush’s recent death, innumerable editorials have heaped praise on the late president for his prudent and polite leadership. Far too little attention has been paid to his role in the Iran-Contra scandal.

No writer has been more generous to Bush than journalist Jon Meacham, the author of The American Odyssey of George Herbert Walker Bush. In a New York Times editorial assessing Bush’s legacy, Meacham lauded the nation’s forty-third vice president and forty-first president for being especially principled and pragmatic; a leader whose “life offers an object lesson in the best that politics…can be.” Bush, Meacham noted admiringly, saw politics as a noble pursuit, a means to faithfully serve the public, “not a vehicle for self-aggrandizement or self-enrichment.”     

But the history of Bush’s involvement in the Iran-Contra scandal is not one of nobility and virtue. The object lesson, in fact, is that even our most revered leaders are fallible human beings subject to making unethical decisions out of misdirected loyalties or self-preservation. 

There is no doubt that Bush, as a loyal vice president, was aware of and endorsed the Reagan administration’s covert policies in the Middle East and Central America. Specifically, he knew of the illicit program of selling arms to Iran, a U.S.-designated terrorist state, in hopes of recovering American hostages in Lebanon. And he knew of the illegal program of supplying aid to the Contra rebels in Nicaragua. Years later, when running for reelection as president, Bush admitted to his diary that “I’m one of the few people that know fully the details [of Iran-Contra]….It is not a subject we can talk about.”

It is also clear that Reagan and his senior staff, Bush included, understood that the Iran and Contra programs were illegal. At one point, in regard to the arms-for-hostages initiative, Reagan informed his advisers that he would risk going to prison because the American people would want him to break the law if it meant saving the lives of hostages. “They can impeach me if they want,” Reagan said, and then he quipped “visiting days are Wednesday.”

Shortly after the Iranian weapons deals became public, Bush tried to distance himself from the Iran-Contra scandal by telling reporters that it was “ridiculous to even consider selling arms to Iran.” Knowledge of Bush’s involvement could jeopardize his plans to succeed Reagan. Such deceptive maneuvering was galling to Reagan’s secretary of state, George Shultz, who knew all too well that Bush had supported the Iran project. Shultz told a friend: “What concerns me is Bush on TV,” because he risks “getting drawn into a web of lies….He should be very careful how he plays the loyal lieutenant.”

Bush did become president and his eventual pardon of Weinberger, just weeks before leaving office, was not an act of virtuous public service; even Reagan had refused to grant pardons to those involved with Iran-Contra. Bush’s decision was a self-serving one as a trial examining Weinberger’s role in Iran-Contra, including the administration’s orchestrated cover-up, risked exposing the outgoing president’s complicity.

Hearing of Weinberger being pardoned, Judge Lawrence Walsh, the independent counsel investigating Iran-Contra, issued a statement of condemnation: “President Bush’s pardon…undermines the principle that no man is above the law. It demonstrates that powerful people with powerful allies can commit serious crimes in high office—deliberately abusing the public trust without consequence."

Among the lessons of Iran-Contra is that a healthy democracy must have robust checks on executive authority in order to minimize abuses of power. A quarter century ago, the president’s attorney general, William Barr, staunchly opposed the independent counsel’s investigation of wrongdoing in the White House, and he also firmly supported Bush’s use of pardons as a means of self-protection. Are we to believe that Barr’s relationship with President Trump will be any different? 

 

If you enjoyed this piece, be sure to check out Dr. Matthews’s forthcoming book: 

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171025 https://historynewsnetwork.org/article/171025 0
Is Shoeless Joe Jackson Innocent? The Black Sox Scandal 100 Years Later

 

What do Pete Rose, Rob Manfred, Barry Bonds, and Ted Williams have in common? Why, Shoeless Joe Jackson, of course. 

Major League Baseball has had its share of controversies and scandals, but perhaps none has had a more lasting impact than the Black Sox Scandal of 1919. At the center of that legacy is Shoeless Joe Jackson, the legendary outfielder for the Chicago White Sox. He was arguably the best player in baseball at the time and remains one of the game’s greatest hitters with the records to prove it. He also remains permanently banned from professional baseball and therefore ineligible for the Hall of Fame. The whys and wherefores of his banishment have stirred the passions of countless fans for the last one hundred years.

Gambling is at the heart of the Black Sox story. Eight White Sox players conspired with gamblers to throw the World Series, which Cincinnati won in game 8 (the Series was 9 games that year). That is not in doubt. What continues to be questioned is Jackson’s role in the conspiracy. When did he know about it? Was he in on it? What did he do about it? Did he take money for it? Did he field and hit poorly in order to lose or did he play his heart out? The answers to those questions are not the subject of this article. Instead, I investigate why Joe was banned and how his legacy still shapes baseball today. 

The rules against gambling sprang from the Black Sox Scandal and are clearly posted in every professional clubhouse in the land: “Any player, umpire, or club or league official or employee who shall bet any sum whatsoever upon any baseball game in connection with which the bettor has a duty to perform, shall be declared permanently ineligible.” Pete Rose, who has the most hits in baseball history, clearly broke that rule and has also been declared ineligible for the Baseball Hall of Fame. Yet many still argue that Rose should be allowed into the Hall of Fame. Many often debate who has the better case for reinstatement: Joe Jackson or Pete Rose?

Joe Jackson broke no rule. In fact, it might be argued that gambling was the national pastime in 1919 (it might still be argued that gambling is our national pastime). Gamblers often greased a player’s palm in exchange for inside dope on who was hurt, who was drinking too much, anything that would help solidify the bet. The owners knew it, which is why the White Sox owner, Charles Comiskey, wasn’t that concerned when he heard rumors that the fix was in. After the owners elected Kenesaw Mountain Landis baseball’s first commissioner in 1921, gambling was declared illegal, but that was two years after the 1919 scandal. Shoeless broke no rule. Pete Rose broke the rules, plain and simple. That alone gives Jackson the better case for reinstatement.

More importantly, Joe was only alleged to have broken the law and was never convicted. In the 1920 jury trial, “The Eight” were found not guilty. And when Joe sued Comiskey for back pay, a 1924 jury awarded it to him, finding him not guilty of the gambling conspiracy. How, then, did he come to be banned from baseball?

The answer goes to another part of Joe’s legacy: the autocratic power of baseball’s commissioner. Landis, a former judge, would not take the job unless he had absolute power when making decisions. The owners gave it to him. One need look no further than his ruling: “Regardless of the verdict of juries, no player who throws a game...will ever play professional baseball.” Imagine being able to act “regardless of the verdict of juries!” Still today, Rob Manfred, the current Commissioner of Baseball has almost unlimited power to investigate and issue punishment for any practice or transaction he believes is “detrimental to the best interests of baseball.” He owes that power to the legacy of Joe Jackson and the “eight men out.” 

“Cheating” has become a modern-day equivalent of “gambling.” And the question of Jackson’s banishment has also impacted the conversation about whether Barry Bonds, Roger Clemens, Mark McGwire, or others who used steroids should be voted into the Hall of Fame. Unlike Shoeless, they are not banned from baseball; the sports writers could vote them into the Hall. It has become a question of character. Should the writers—and by extension, the fans—consider only the baseball statistics, or should the morality of what the players did be considered? Bonds, to take one example, had his obstruction of justice conviction overturned. Like Shoeless, he has never been convicted of anything. Nevertheless, the writers have refused to vote him in, with the highest percentage of votes for admitting him, 56%, falling well short of the necessary 75%.

Judge Landis certainly considered the morality of Joe Jackson when he banned him from professional baseball. Is the shadow of Joe’s banishment lingering in the minds of today’s sports writers when they refuse to vote into the Hall any otherwise eligible player credibly accused of using steroids to enhance his performance on the field?

Among die-hard baseball fans, no one question elicits more “discussion”—i.e. argument—than that of “Who was the greatest hitter of all time?” Was Teddy Ballgame better than the Babe? How about Ty Cobb vs. Tony Gwynn? Aaron or Rodriguez or Bonds or Joltin’ Joe DiMaggio or Stan Musial? Or perhaps some “ancients” like Ed Delahanty, Dan Brouthers, Cap Anson? Joe Jackson could outhit them all, some say. In almost any discussion of hitting, in fact, the name Shoeless Joe Jackson usually arises. Many consider him to be the best “natural” hitter of all time, with a swing so perfect that no one could match it. Both Babe Ruth, who patterned his swing after Jackson’s, and Ty Cobb expressly said just that.

Jackson played in the “dead ball” era of baseball, where one baseball was used for an entire game, if possible, and his lifetime batting average of .356 stands third of all time. Had he played in the “live ball” era, where new balls were frequently inserted into the game and scuffed balls disallowed, there is no telling what average he could have hit for. In any event, he is on almost everyone’s list of top hitters and to this day is one of the gold standards of hitting when fans discuss the “best of all time.”

But perhaps the greatest legacy of Shoeless Joe and the Black Sox Scandal of 1919 is simply this: we’ll never know exactly what happened one hundred years ago and that gives baseball lovers the chance to do what they love best: argue. 

Shoeless hit .375 in the series, had 12 base hits, a record not broken until 1964, committed no errors, threw out a runner at the plate. 

Oh yeah? His average in the games they lost was only .286. And what about the $5,000? Joe said he tried to give it back to Comiskey. 

Oh yeah? How come he made no mention of that in his grand jury testimony? He knew about the fix, he should have done more to stop it. 

He tried to, he asked his manager to bench him. 

Oh, yeah? Prove it!

My novel takes place in 1951 and uses flashbacks to describe the Scandal. Ultimately, I had to decide for myself whether Joe was innocent or not. My answer turns on the question of character. Do I believe Jackson deserves to be reinstated and then voted into the Hall of Fame? You’ll have to read the novel. Although he probably wouldn’t have wanted it this way, the wonderful legacy of Shoeless Joe is that he’ll never have a last at-bat.

To read the author's latest book on Shoeless Joe, click below!

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171454 https://historynewsnetwork.org/article/171454 0
An American Socialite and the Biggest British Constitutional Crisis Since Henry VIII's Divorce

 

It was the biggest constitutional crisis since Henry VIII divorced Catherine of Aragon – except this time everyone agreed who the villain was. On December 10, 1936, Edward VIII renounced what Winston Churchill called “the greatest throne in history,” giving up an empire of 500 million people, to marry the twice-divorced American socialite, Wallis Simpson. Ever since, this Baltimore native has been blamed as the wicked witch who almost derailed the British monarchy. Throughout the intervening decades, we have been overfed a diet of such fantastical slander about Mrs. Simpson that it has become impossible to discern the real woman. Wallis has been written off as a seductress, a gold-digger, a Nazi sympathizer and seen as a cold, ambitious bitch who schemed from the outset in the hopes of becoming Queen of England. She has become a caricature of villainous womanhood.

History is mostly perceived from the perspective of his-story. But what about her story? This Women’s History Month is a fitting time to revisit Wallis Simpson and uncover her real history.

It was the unholy Trinity of the Church, the Palace, and Parliament, who did not want Edward VIII on the throne. They considered him weak and ill-disciplined and saw Wallis as the perfect excuse to rid England of a man they deemed unfit to rule. Far from the villain of the history books, Wallis was the victim of the abdication. She was undermined by a cunning powerful British establishment who sought to destroy and diminish her. Bright and perceptive, she soon realized that machinations to use her were underway. At the time of the abdication crisis, she wrote, “I became obsessed with the notion that a calculated and organised effort to discredit and destroy me had been set afoot.” She was right. She became the perfect pawn for the wily palace courtiers. 

That Edward did not conform to court life, preferring a vigorous and flamboyant social life to the grey strictures of monarchical duty, was tantamount to treachery in the eyes of his advisors. In 1927, courtier Tommy Lascelles told Prime Minister Stanley Baldwin of his violent disdain for the Prince of Wales: “You know, sometimes when I am waiting to get the result of some point-to-point in which he is riding, I can’t help thinking that the best thing that could happen to him and the country, would be for him to break his neck.” “God forgive me,” Baldwin replied. “I have often thought the same thing.” This conversation occurred seven years before Wallis Simpson met Edward Prince of Wales at a weekend house party in the British countryside.

When Edward fell in love with Wallis Simpson, no one could have predicted the strength of his obsession. At the time, she was happily married to her second husband, Ernest Simpson.  Many ask, why didn’t she break off her relationship with Edward, especially when he became King? Why did she divorce Ernest Simpson? Her detractors fail to acknowledge that she never wanted to divorce Ernest or to marry Edward. Initially, she was flattered by his attention. What woman would not have been beguiled by the prince’s “unmistakeable aura of power and authority?” Yet she never expected the infatuation to last. In 1935, she wrote to her beloved aunt, Bessie Merryman, “What a bump I’ll get when a young beauty appears and plucks the Prince from me. Anyway, I’m prepared.” 

It was Edward, then King, who forced her into an untenable position, refusing ever to give her up. At the time of the abdication, Edward slept with a loaded gun under his pillow and threatened to kill himself if Wallis forsook him. Aides described him as “exalté to the point of madness.” Wallis knew that her fate would be far worse if a beloved and popular King took his life because of her. In the name of Edward’s needy, obsessive love, Wallis paid the ultimate price: entrapment by a childish narcissist who threw the biggest tantrum in history when he could not have the two things he wanted most in the world — her and the throne. When he chose Wallis, the couple was devastated as the royal family closed ranks against them, forcing them into exile from Britain for the rest of their lives. Wise to this, Wallis wrote to Edward post-abdication, “It is the politicians whose game it is to build up the puppet they have placed on the throne. I was the convenient tool in their hands to get rid of you and how they used it!”

During my research, in which I gained entrée into Wallis’s coterie of living friends, I listened with mounting incredulity and fury as they told me repeatedly of her kindliness, sense of fun and depth of friendship, which contradicted the public image of a hard-nosed, shallow woman. It was a revelation for me to discover what a warm, witty, loyal friend Wallis was. Her friends adored her. The Conservative MP Sir Henry “Chips” Channon said, “She has always shown me friendship, understanding, and even affection. I have known her to do a hundred kindnesses and never a mean act.” 

She was no saint – but she was far from a sinister manipulator. Her detractors continue to argue that she was a Nazi sympathizer and traitor, yet her friends and eminent historians, such as Hugo Vickers, Philip Ziegler, and the late Lord Norwich, are adamant that there is no concrete evidence of Nazi conspiracy. She did go with Edward to Germany to meet Hitler in 1937, but it was before the atrocities of the Second World War and only because Edward wanted his wife to experience the pomp and ceremony of a royal tour that was denied to Wallis in England. Edward was keen for this trip when the Germans agreed to his request that Wallis would be curtsied to and addressed as Your Royal Highness. This was blisteringly important to him, as he felt so aggrieved that the British royal family refused to give Wallis the crucial HRH title, even though she was legally entitled to this as the wife of the former King. 

When the world learned of the abdication, it recoiled in shock. How could this strange, angular-looking woman take a beloved King from his people, many wondered. Wallis received vicious hate mail. “It’s no exaggeration to say that my world went to pieces every morning on my breakfast tray,” she later wrote. Admirably, she schooled herself to survive what would have felled the hardiest of souls: “To be accused of things that one has never done; to be judged and condemned on many sides by the controlling circumstances; to have one’s supposed character day after day laid bare, dissected and flayed.” She succeeded with “a kind of private arrangement with oneself.” She knew who she was and her friends knew too. She learned to uphold what matters in life and to endure being a woman misunderstood and excluded on an international scale.

From the moment she was locked in the Faustian pact of marriage to the Duke of Windsor, she determined to make their marriage a success and to ensure that her husband was as happy as he could be, ousted by his family and exiled from his country. For thirty-five years, she triumphed in this endeavour, even as she endured psychological assassination from the entire world. Wallis Simpson was no ordinary woman. An inscrutable dignity gave her strength. She has been misunderstood and misinterpreted for far too long. This woman deserves our admiration for the situation she became embroiled in and what she subsequently had to endure. Most of all, she deserves for her reputation to be rehabilitated in the annals of history.

Thu, 18 Apr 2019 16:48:42 +0000 https://historynewsnetwork.org/article/171541 https://historynewsnetwork.org/article/171541 0
The Elon Musk of Global Crime and European Colonialism in Africa

 

Paul LeRoux, a renegade tech titan from southern Africa, introduced Silicon Valley-style entrepreneurship to transnational organized crime.  A pioneer in the field of cyber security, LeRoux broke bad and used his exceptional gifts to become the international criminal underground’s premier innovator. He dealt in arms, weapons systems, drugs, gold, natural resources, phony documents, bribery and contract murder, trading illicit commodities and favors with Iran, North Korea, the Chinese Triads, the Serb mafia, the Somali pirates, warlords, militias, terrorists and mercenaries of various nations. 

 

Two agents of the U.S. Drug Enforcement Administration’s elite, secretive 960 Group, part of the agency’s Special Operations Division, tracked him down, penetrated his inner circle, lured him to Liberia, and arranged to have him arrested and expelled to their custody on September 26, 2012. LeRoux promptly flipped and helped the DEA agents round up his hit men and his North Korean methamphetamine trafficking team. All have been convicted. The last to face justice was Joseph “Rambo” Hunter, a former U.S. Army sniper trainer and drill sergeant, who was sentenced in New York on March 7 to three life sentences plus 10 years for setting up the brutal murder of a Filipino woman targeted by LeRoux. LeRoux himself has pleaded guilty to arms and drugs trafficking and other crimes and is incarcerated in New York, awaiting sentencing.

 

Paul LeRoux is the fruit of a poisonous tree.  I could never think about him without thinking of the horrific history of European colonialism in Africa. I read deeply into histories of colonial Rhodesia and South Africa, where LeRoux was born and grew up.  I thought of Kurtz, the corrupted, blood-soaked, self-exiled anti-hero of Joseph Conrad’s masterpiece, The Heart of Darkness.  “His soul was mad,” Conrad writes.  “Being alone in the wilderness, it had looked within itself and, by heavens I tell you, it had gone mad.”  It became clear to me that, for all his technological prowess, LeRoux has to be seen in the context of the history of southern Africa, a rapacious, swash-buckling profiteer trading in guns, drugs, gold, timber, false documents and human life.

 

He was born on Christmas Eve 1972 in Bulawayo, Rhodesia’s gritty, vibrant second city. He was the illegitimate son of a young white woman and her lover, both of British descent. A married white Rhodesian couple, Paul and Judith LeRoux, adopted him.  

 

Today, only 17,000 people of European extraction live in Zimbabwe, a landlocked nation of 150,000 square miles wedged between South Africa, Mozambique, Botswana, and Zambia. At the time of LeRoux’s birth, some 260,000 whites resided there, ruthlessly dominating, exploiting, and fearing the country’s 4.8 million blacks. Bulawayo, a precolonial tribal capital whose name meant “place of slaughter,” had blossomed into an industrial powerhouse and processing center for the region’s abundant metal ores, cattle, cotton, tobacco, and maize. The colony’s wealth cushioned it from international sanctions meant to force Prime Minister Ian Smith, an unyielding champion of white rule, to agree to a transition to majority—black—rule.

 

Three days before LeRoux was born, the colony’s long-simmering racial and economic disparities ignited into what whites called the Bush War and blacks called the War of Liberation. The civil war escalated as both sides engaged in hideous atrocities. The British historian Piers Brendon, in his 2008 book, The Decline and Fall of the British Empire, wrote:

 

…Guerrillas, some backed by China and others by Russia, crossed the frontier from Mozambique and Zambia to attack remote farmsteads, railways and roads….The guerrillas tried to enlist the native population, using terror tactics against anyone who resisted. Chiefs were regularly tortured and murdered. Schoolteachers were raped. Villages were looted and burned. Counter-insurgency measures were no less savage.… African cattle were seized or deliberately infected with anthrax. Captured combatants were given electric shocks, dragged through the bush by Land Rovers or hung upside down from a tree and beaten. 

 

Under pressure from the United Nations, Great Britain, and the United States, Smith reluctantly held elections. On March 4, 1980, guerrilla leader Robert Mugabe’s party won in a landslide.  The colony of Rhodesia disappeared from the map, replaced by the independent nation of Zimbabwe.

 

Liberation brought no peace. Mugabe launched a dirty war against tribal and political rivals. He created a 5,000-man Fifth Brigade, had it trained and equipped by North Korea, and dispatched it into the countryside to pillage, rape, torture, and slaughter. Between 20,000 and 80,000 people, mostly civilians, died.

 

The LeRoux family reportedly lived in the mining town of Mashaba, where Paul LeRoux the elder was a supervisor of underground asbestos mining in the enormous Shabanie and Mashaba asbestos mining complex, one of the largest and most hazardous mining operations in the world at the time. Mashaba meted out misery and early death to black asbestos miners, but it would have afforded an uneventful childhood to the son of a white overseer.

 

White privilege couldn’t survive Mugabe’s financial mismanagement, which launched the Zimbabwean economy into a tailspin and sent white professionals fleeing. The LeRoux family joined the white exodus in 1984 and landed in the grimy South African mining town of Krugersdorp, 540 miles to the south of Bulawayo. LeRoux’s father parlayed his knowledge of mining into work as a consultant to South African coal mines. LeRoux later claimed that his father developed an off-the-books sideline as a diamond smuggler and introduced his son to figures in the South African underworld.

 

Whites were at the top of the economic and social heap in South Africa. LeRoux, chunky and socially awkward, buried himself in his computer. He studied programming at a South African technical school, soon outstripped his classmates and his teacher, and developed a specialty in cyber security programming. In 1992, when he was twenty, he snagged his first job, working at a London-based information technology consultancy. He became a digital nomad, traveling between Europe, Hong Kong, Australia and the United States, setting up secure data systems for government ministries, corporations, law firms and banks.

 

 In 2002, when he was 30, he felt a vocation to entrepreneurship, in the mold of his South African contemporary Elon Musk. LeRoux’s innovations were always on the dark side – drugs, arms, smuggled gold, illegal timber, false documents, murder. As he accumulated wealth, he fell back on a mind-set he absorbed in Rhodesia—dig in hard, don’t spare the bullets, and be ready to move.

 

He bought bolt-holes (safe houses) throughout Africa and Asia, but he dreamed of returning to the place he was born. He paid a broker $12 million to pass to Mugabe so he could acquire a plantation confiscated from white farmers. He was cheated: Mugabe never delivered. In 2009, he tried another tack, sending “Jack,” a European aide, to travel around the Zimbabwean countryside in search of a colonial-era villa with white, plantation-style columns, some acreage, and a “big, curvy driveway.” He evidently harbored a fantasy of the bygone colonial era, when white gentlemen planters, called “verandah farmers,” enjoyed idle days and debauched nights, sipping cool drinks on their broad front porches, observing from a distance the toils of the black farmhands, then toddling off at dusk for dinner and an orgy with other planters’ bored wives.

 

As a child in a whites-only school, he would have been taught the so-called Pioneers Myth, about intrepid English settlers taming the verdant, empty plain and carving out a civilization. It is highly doubtful a white schoolboy would have been told the truth: that Rhodesia was founded on and sustained by blood and lies. When the indigenous people rebelled in 1896, British troops and Rhodes’s militiamen exterminated them. Historian Brendon described scenes of horrific cruelty: British soldiers and settlers putting villages, grain stores, and crops to the torch; slaughtering men, women, and children; collecting trophy ears; and making their victims’ skin into tobacco pouches. In the famine that resulted, people were reduced to eating roots, monkeys, and plague-ridden cattle corpses. The streets of Bulawayo filled with emaciated refugees trying to escape to South Africa.

 

LeRoux wasn’t interested in his homeland’s shameful imperial history, or Mugabe’s misrule.  

 

 “It was about getting what he wanted,” Jack said, “and if he had to do business with an evil person like Mugabe, then so be it, as long as he got his part off the deal. He didn’t care about the people. They were all monkeys to him.”  

 

To read more about Paul LeRoux, check out Hunting LeRoux: The Inside Story of the DEA Takedown of a Criminal Genius and His Empire by the author.

 

 

 

A House Once More Divided

 

Beginning with the Three-Fifths Compromise in the U.S. Constitution, United States history is filled with “compromises” intended to preserve a rough balance of power between slave-holding and free states. The Three-Fifths Compromise was followed by the Missouri Compromise of 1820 and the Compromise of 1850. These negotiations helped America delay war, but after the Kansas-Nebraska Act of 1854, further concessions meant not only preserving but expanding slavery.

 

The election of Abraham Lincoln outraged many in the South. South Carolina Governor Francis Pickens declared prior to the Civil War that he “would be willing to cover the state with ruin, conflagration and blood rather than submit” to abolition.(1) After decades of compromise on the issue of slavery, South Carolina became the first state to secede. Ultimately Governor Pickens reached his goal, but before peace was restored, conflagration and blood truly covered South Carolina.

 

Lincoln’s campaign and election prompted a different response from activists like Joseph Medill, co-owner and editor of the Chicago Tribune. Medill was motivated by a desire to preserve the Union and emancipate slaves, and he felt a good newspaper must report stories in ways that advanced society. To him that meant abolishing slavery. Joseph became a key player in a new generation of abolitionist leadership. 

 

Public advocacy in the Tribune made Joseph a target. In 1860, while in Washington, D.C., he criticized the concessionists, whose position Illinois Congressman William Kellogg shared. At the National Hotel, Congressman Kellogg attacked Joseph, landing blows to Joseph’s head and face. Kellogg had been appointed to the Committee of Thirty-Three of the U.S. House of Representatives, tasked with averting a civil war. Joseph described the assault in a letter to his wife, Katherine: “Wm. Kellogg started home in a hurry to Springfield to help beat Judd (2) for a place in the Cabinet. He is talking compromise. He [Kellogg] is a cowardly Republican and wants to back down. I quarreled with him." (3)

 

Joseph Medill and his partner, Dr. Charles Ray, used the pages of the Tribune to support the Lincoln administration and rally the public to the cause of emancipation. Joseph urged the swift organization of black regiments and broadcast the goals of the Union League of America (U.L.A.), a group established to promote loyalty to the Union. Joseph played a prominent role in Union League programs.(4) The U.L.A. supported organizations such as the United States Sanitary Commission and provided funding and organizational support to the Republican Party.

 

Joseph’s early public calls for war turned to personal anxiety and grief when two of his younger brothers became casualties of war. Yet, he continued to support a war of liberation and pursue principles of freedom and self-government. Joseph provides a poignant example of moral imperative informing political activism.

 

Abraham Lincoln and supporters like Joseph Medill taught that politics must not violate human rights. Immoral behavior must never be subject to a majority vote. Robert Todd Lincoln explained his father’s views on democracy eloquently in 1896. “In our country there are no ruling classes. The right to direct public affairs according to his might and influence and conscience belongs to the humblest as well as to the greatest…But it is time of danger, critical moments, which bring into action the high moral quality of the citizenship of America.”(5)

 

 

People didn’t grasp the danger of a house divided then, and many fail to grasp it now, but history repeats itself in elusive, yet profound, ways. Today, the ugly specter of divided parties returns. No matter which party we align with, President Trump’s ability to divide us and willingness to condone violence should alarm us all. 

 

From the beginning of his campaign, Donald Trump used rhetoric to incite supporters, using baseless slurs to disparage immigrants (6) and political opponents. During the presidential campaign in March 2016, it seemed unlikely that Trump had enough votes at the Republican National Convention to secure his nomination. If he were denied the nomination, Trump warned during an interview with CNN, “I think you would have riots.” (7) When President Trump wages verbal war with the intelligence community and independent sources of investigation, he provokes divisions that threaten to become an “irrepressible conflict,” echoing the pre-Civil-War rancor. If Americans don’t reject politicians who divide us, condone violence, and label one group of people criminals and another enemies of the people, we do so at our own peril.

 

Once again we face dilemmas that require as much of us as any time in the nation’s past. Modern Americans tend to take our stable democracy for granted, but Mr. Lincoln realized the freedoms gained in the Revolution could be lost. He enlisted newsmen like Joseph Medill to champion justice and liberty. Lincoln understood that involved citizens preserve the union, and he taught a vital lesson that only when human rights are respected is democracy worth preserving.

 

(1) Orville Vernon Burton, Age of Lincoln (New York: Hill and Wang, 2007), 118.

(2) Longtime Lincoln friend and supporter Norman Judd did not receive a Cabinet post but was named Minister to Prussia.

(3) Georgiann Baldino, ed., A Family and Nation Under Fire (Kent: Kent State University Press, 2018), 25.

(4) Robert McCormick’s papers in the McCormick Research Center at the First Division Museum, Medill Family Correspondence.

(5) Speech of the Hon. Robert T. Lincoln made at the Celebration of the Thirty-eighth Anniversary of the Lincoln-Douglas Debate, Galesburg, Ill., October 7, 1858 (Hancock, NY: Herald Print, 1921), 2.

(6) Jennifer Rubin, “Most Americans agree: President Trump is divisive,” Washington Post, January 17, 2018, accessed March 11, 2019, https://www.washingtonpost.com/blogs/right-turn/wp/2018/01/17/most-americans-agree-president-trump-is-divisive/?noredirect=on&utm_term=.e89c9aaeb8b0

(7) Jonathan Cohn, HuffPost, June 9, 2016, updated June 16, 2016, accessed March 11, 2019, https://www.huffingtonpost.com/entry/worst-trump-quotes_us_5756e8e6e4b07823f9514fb1

What I’m Reading: An Interview With Environmental Historian Eleonora Rohland

 

Eleonora Rohland is Assistant Professor for Entangled History in the Americas (16th-19th centuries) at Bielefeld University, Germany. Rohland was trained as an environmental historian at the University of Bern, Switzerland. She received her PhD from the Ruhr-University of Bochum, Germany, in 2014 and was a doctoral fellow at the Institute for Advanced Study in the Humanities Essen (KWI) from 2008-2014. Her research was supported by the Swiss Study Foundation, as well as by the German Historical Institutes in Washington and Paris. Both her MA and PhD theses were awarded prizes. Rohland is the author of two books, Sharing the Risk. Fire, Climate and Disaster. Swiss Re 1864-1906 (Lancaster: 2011) and Changes in the Air: Hurricanes in New Orleans, 1718 to the Present, which appeared in 2018 in the Rachel Carson Center (RCC) series Environments in History: International Perspectives (Berghahn Books). With her third book project, tentatively entitled Encountering the Tropics and Transforming Unfamiliar Environments in the Caribbean, 1494 to 1804, her research focus moves geographically from the U.S. Gulf Coast into the Caribbean (Hispaniola and Jamaica).

 

What books are you reading now?

 

Related to the courses I am going to teach next semester (starting in April in Germany), I am reading Origins: How the Earth Made Us by Lewis Dartnell and Energy and Civilization: A History by Vaclav Smil. I am also very interested in the neurology of the creative process, partly for my own writing, but also in relation to teaching, so in this context I am reading Imagine: How Creativity Works by Jonah Lehrer. And I just love the humor and voice in Ursula Le Guin’s A Wave in the Mind: Essays on the Writer, the Reader, and the Imagination. Her texts are beautiful, profound and inspiring.

 

What is your favorite history book?

 

It’s hard to mention just one… Les Paysans du Languedoc (The Peasants of the Languedoc) by Emmanuel Le Roy Ladurie; Mosquito Empires by John McNeill; Round About the Earth by Joyce Chaplin.

 

Why did you choose history as your career?

 

I think there’s a difference between the career aspect and my personal relationship to the field of history. At the present state of academia (and I think that is true for the U.S. as well as for Germany and Switzerland), it’s not so much a question of my choice, but of whether you get lucky and the system chooses you. But even beyond that, it was not a straightforward choice, though on my father’s side, I come from a family of historians. That doesn’t mean my career was in any way predetermined, it’s rather that, I guess, historical thinking has been part of my upbringing, and I was lucky to have a great history teacher in school who managed to foster that already existing interest. Also, I was lucky to have come to the field of environmental and climate history early during my undergraduate studies with Christian Pfister at the University of Bern (Switzerland), one of the pioneers of climate history. The way he taught Annales School-style history at his department of economic-, social- and environmental history was very comprehensive, geared towards understanding macro-scale historical connections while not losing sight of the micro-developments, and it was always related to present-day questions. 

 

During research for my MA thesis I realized that I really loved working with archival materials, the materiality and aesthetics of it intrigued me. The detective work that is involved in the research process intrigued me. And that connection only deepened during research for my doctoral thesis (though of course neither of these research projects was always pleasurable and easy). History is a fascinating subject that allows you to see so many levels and facets of human existence – the very light and the very dark – at different times and in different places. And I would say I am studying and teaching history most of all, to understand how we’ve got where we are today in our current, troubled era of unprecedented global change.

 

What qualities do you need to be a historian?

 

An open mind. Curiosity. Imagination. Inquisitiveness. A very healthy dose of skepticism towards anything already written. Persistence. Meticulousness. Frustration tolerance. Self-criticism. Patience.

 

Who was your favorite history teacher?

 

My history teacher at high school, Jürg Düblin. He had a wonderful sense of humor and always spiced up our history lessons with jokes and references about current events. I went to high school during the Clinton era, so of course there was ample opportunity to crack jokes about the Lewinsky affair. He also taught us to see the interconnectedness and entanglement of historical processes, and to accept and welcome complexity. 

 

And Christian Pfister, now Professor Emeritus at the University of Bern. Early in my undergrad studies I took his seminar on the history of disasters, which became my starting point into environmental and climate history.

 

What is your most memorable or rewarding teaching experience?

 

Again, it’s difficult to point out a single moment or course. I would rather say more generally that it’s most rewarding when the student group and I manage to create an atmosphere in which the students feel comfortable enough to really ask fundamental questions and in which they start discussing among themselves, without me having to guide much. This is not a situation I can create at will; it’s a co-creation between students and teacher that depends on the make-up of the student group and on how they interact with the subject of the course.

 

What are your hopes for history as a discipline?

 

That it fully absorbs the profound implications that the Anthropocene has for the discipline. Dipesh Chakrabarty in his 2009 “The Climate of History: Four Theses” clearly and brilliantly laid out how the fact that humans were now shaping the earth’s climate and other realms of the ecosystem spelt the end of the separation between human and natural history. Chakrabarty himself and others (Julia Adeney Thomas, John McNeill, Amitav Ghosh, and Franz Mauelshagen, to mention just a few) have since elaborated on this new perspective, and I am basing my hopes and opinion on their work. 

 

History needs to get comfortable with deep time scales that reach beyond the written record; that is, it needs to become more interdisciplinary, to connect more, and more naturally, with related disciplines such as anthropology and archaeology to include artefactual evidence alongside the written record; and with the natural sciences in order to understand environmental and climatic aspects concerning past societies. But even beyond the immediate theoretical and methodological implications of the Anthropocene, and more focused on the current, worrying changes in political arenas around the globe, I think history as a discipline needs to renew itself, needs to reassert or renegotiate its place in society, needs to be vocal on political abuses of terminology or populist and racist reinterpretations of historical events.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I own a facsimile of the Nuremberg Chronicle (the Schedelsche Weltchronik) and the first edition of Jacob Burckhardt’s The Greeks and Greek Civilization (Griechische Kulturgeschichte) (1898), which was edited by my great-great grandfather, Jacob Oeri.

 

What have you found most rewarding and most frustrating about your career? 

 

I see it as a great privilege to be able to teach what I research, to work with students in general, and to think about ways to help them find what sparks their interest and passion. The historical profession unites many of the activities and themes I am passionate about (researching, teaching, writing, books, discussing, conceptualizing projects), and I consider myself very fortunate to be working as a professional historian.

 

One of the more frustrating experiences on this career path – and this probably applies more to Germany, where environmental and climate history are completely under-institutionalized, and where, at the same time, one has to acquire project funding from national or EU funding bodies to do research – is that funders have not yet realized that the discipline of history has substantial contributions to make to questions of sustainability, climate change, and societal change. Consequently, funding formats that deal with these topics are never made with historians (or the humanities) in mind. On the other hand, environmental history is still seen as somehow narrow, restricted to the history of political movements since the 1950s, even among historians. So, as an environmental historian in the German-speaking countries of Europe, one is kind of wedged in between those two (mis-)perceptions, and it will take some work to get out of this situation…

 

How has the study of history changed in the course of your career?

 

Well, my career has not been very long yet, but I would say that when I started studying history in the early 2000s, the cultural turn was in full swing and its methodologies were becoming mainstream in history. However, by now it has reached a sort of dead end. As far as I can see, no renewal or alternative is yet in sight, but is urgently needed. Be it by taking into account the Anthropocene’s implications or by other important aspects related to current events. So, I guess this observation connects to what I said above about my hopes for the discipline.

 

What is your favorite history-related saying? Have you come up with your own?

 

“The past is a foreign country, people do things differently there.” By L.P. Hartley, from The Go-Between. 

 

To me this saying perfectly expresses my fascination with history. In our everyday relationships we tend to instinctively look for similarity. In history, and in particular in early modern history, the challenge is to understand as far as possible the foreignness of those worlds, to embrace the feeling of being estranged.

 

What are you doing next?

 

I am writing a short publication on “Entangled History and the Environment” which is an offshoot from my third book project on the socio-environmental transformation of the island of Hispaniola, from Columbus to the Haitian Revolution.

 

And I am preparing two courses on resource and energy history in the Americas.

Political Leadership and the Need for a New American Story

Franklin Delano Roosevelt delivering a "Fireside Chat" to the American people.

 

I recently came across Political Humor (1977) by Charles Schutz, and it detailed what a great storyteller Abraham Lincoln was. That got me thinking, “What other notable presidents were also good storytellers?” “How are stories and storytelling related to effective political leadership?” “Do we need a new American Story?”   

I checked out Franklin Roosevelt and found that historian Cynthia M. Koch wrote that FDR used heroic individuals like Washington and Lincoln “to tell stories that would unite people and provide comfort, courage, reassurance, and inspiration to Americans facing fear, hardship, uncertainty, and war.” The FDR Foundation added that “to heal a wounded nation,” FDR relied “in no small part” on “storytelling . . . to tap into humankind’s primeval need to understand issues not only in intellectual terms, but on an emotional level as well.”

This sounded much like Lincoln. At a cabinet meeting he once said, “I don’t propose to argue this matter because arguments have no effect upon men whose opinions are fixed and whose minds are made up.” Instead, he told a story to illustrate his point. 

Both great presidents possessed an acute understanding of the common people. Lincoln’s came partly from his origins among common frontier folk, but the aristocratic Roosevelt, who campaigned extensively throughout the nation, “principally relied on his feel for public mood to guide him in leading the country.” Yet both presidents understood that to reach people, to motivate them, to win them over, a president had to appeal to their emotions, and storytelling was one way of doing so. Carl Sandburg, who wrote a six-volume biography of Lincoln and was a strong FDR supporter, saw numerous parallels between the two presidents, especially their attunement to the will of the American people.

In recent years much has been made of the great U.S. political divide. In a previous article, I mentioned that Donald Trump appeals to the anti-intellectual strain in American life highlighted earlier by historian Richard Hofstadter in Anti-Intellectualism in American Life (1963). Liberals, progressives, and even some thoughtful conservatives bemoan the anti-intellectualism of Trump supporters. “Why do they believe the steady spew of lies told by Trump and all his talk of ‘fake news’?” “How can they deny the scientific consensus on climate change?”

The answer is simple. Our problem is that we keep forgetting it: Most people’s politics are not based on reason or rationality. This was a good part of the message of former HNN editor Rick Shenkman’s book Political Animals, as well as earlier ones like Predictably Irrational and The Righteous Mind.

Supporters of former President Obama are especially prone to forgetting this message, for he spoke on an intellectual level more than most politicians. In 2009, former presidential adviser and television journalist Bill Moyers stated that in the medical care debate Obama “didn't speak in simple, powerful, moral, language. He was speaking like a policy wonk.” In 2011, a similar criticism came from historian John Summers, who wrote that Obama paid insufficient attention to the irrational and emotions, and that conservatism was better at recognizing that “successful politicians tapped into the collective unconscious of voters, controlling their perceptions.” Liberalism, however, as Lionel Trilling wrote in Liberal Imagination (1950), “drifts toward a denial of the emotions and the imagination.”

Not coincidentally, Junot Diaz wrote in 2010 that one of the main responsibilities of a president is to be a good storyteller, and that President Obama had failed miserably in this regard. “If a President is to have any success, if his policies are going to gain any kind of traction among the electorate, he first has to tell us a story.” [In his pre-presidential days Obama had told a good story in his Dreams from My Father.] Republicans, Diaz believed, were “much better storytellers.”

In contrast to Obama, President Reagan did pay more attention to the irrational and to storytelling, as Jan Hanska emphasized in Reagan’s Mythical America: Storytelling as Political Leadership (2012). Historian Koch agrees: “Like FDR, he [Reagan] was a great storyteller.” He used stories “to look backward to an earlier time to promote ideas of self-reliance and free enterprise.” President Trump, though not much of a storyteller, also appeals to myths about America’s past—note all the “Make America Great” hats worn by his supporters.

But how does storytelling help political leaders? Lincoln provides some interesting insights.  He believed it often helped “avoid a long and useless discussion . . . or a laborious explanation.” Moreover, it could soften a rebuke, refusal, or wounded feelings. Influenced by Aesop’s fables, as well as the parables of Jesus, Lincoln intuitively understood that to sway the American public simple stories were often more effective than reasoned arguments. As he once stated, “They say I tell a great many stories . . . but I have found in the course of a long experience that common people . . . are more easily informed through the medium of a broad illustration than in any other way.” In his emphasis on Lincoln’s storytelling ability and the humor that often accompanied it, Schutz notes that they reflected his ability to identify with the common people, his appreciation of their practical bent, and his good-natured acceptance of the flawed human condition. 

More modern thinkers also have recognized the usefulness of storytelling for political purposes. In a book on violence John Sifton quoted the philosopher Richard Rorty on the usefulness of “sad stories,” rather than reasoned appeals to change people’s minds about using violence. Similarly, British climate-change activist Alex Evans came to realize that bombarding people “with pie-charts, acronyms and statistics” was not persuasive enough and that activists “could only touch people's hearts by telling stories.” He believes “that all successful movements, including those that overturned slavery and racial discrimination, consisted of a network of small and large communities held together not by common calculations or common acceptance of certain technical facts, but by commonly-proclaimed narratives about the past and the future. In his view the political shockwaves of 2016, including Brexit and Donald Trump's victory, reflected the winning camps’ ability to tell better stories, not their superior command of facts.” 

Futurist Tom Lombardo believes that the personal narratives we tell ourselves “give order, meaning, and purpose to our lives.” He also writes that “the most powerful way to generate change is to change the personal narrative. . . . Similarly, to change a society, its grand narrative needs to be changed” to one that will help provide “society a sense of integrity, distinctiveness, and overall purpose.”

In a 2016 HNN essay, historian Harvey J. Kaye argued that “the time has come for progressive historians and intellectuals to join with their fellow citizens in the making of a new American narrative,” one that would “encourage renewed struggles to extend and deepen American democratic life.” We now have at least one such narrative, Jill Lepore’s These Truths: A History of the United States (2018). 

In a recent Foreign Affairs essay, “A New Americanism: Why a Nation Needs a National Story,” she asks, “What would a new Americanism and a new American history look like?” Her essay and book answer that they would accept and celebrate our ethnic, religious, and gender-identity diversity. In both works, she quotes from an 1869 speech of Frederick Douglass where he refers to a “‘composite nation,’ a strikingly original and generative idea, about a citizenry made better, and stronger, not in spite of its many elements, but because of them.” She ends her essay by criticizing those with a false view of our nation. “They’ll call immigrants ‘animals’ and other states ‘shithole countries.’ They’ll adopt the slogan ‘America first.’ They’ll say they can ‘make America great again.’ They’ll call themselves ‘nationalists.’ Their history will be a fiction. They will say that they alone love this country. They will be wrong.”

In two earlier HNN essays (here and here), I wrote of the need for a new “compelling, unifying vision that a majority of Americans would embrace.” It would build upon the visions suggested in the 1960s by Martin Luther King, Jr. and Robert Kennedy, which foreshadow Lepore’s “new Americanism.” In addition, it might sprinkle in the spirit of Carl Sandburg (1878-1967), “the one living man,” according to Adlai Stevenson, “whose work and whose life epitomize the American dream.” It would also recall other qualities that our country has demonstrated in its finest moments such as tolerance, compromise, pragmatism, generosity, and a willingness to undertake great tasks. 

Ben Franklin biographer Walter Isaacson wrote that our forefathers who wrote the Constitution demonstrated that they were great compromisers. Also, for Franklin, “compromise was not only a practical approach but a moral one. Tolerance, humility and a respect for others required it. . . . Compromisers may not make great heroes, but they do make great democracies.”

Today we again need to demonstrate both the idealism and compromising ability of the Constitution makers. Our present climate-change crisis provides such an opportunity. The Democrats’ Green New Deal reflects our idealism and willingness to once again take up a great task—as FDR did in fighting the Depression and mobilizing U.S. power in World War II. But transforming this idealistic resolution into effective legislation also requires political compromises.

As various Democratic presidential contenders vie for the 2020 nomination, we need at least one of them to provide a unifying vision. Being a storyteller able to relate to and inspire most Americans, like Lincoln and FDR did, would also help create a “new Americanism.” So too would a Republican change of heart about compromise, a word most of them have rejected for far too long.

Vernon Johns: An Often Forgotten Controversial Civil Rights Activist

 

Atop a hill on the southwest part of Lynchburg, “Hill City,” there sits on Garfield Avenue a small educational institution with a rich, storied history: Virginia University of Lynchburg (VUL). It comprises three old and weathered buildings, each much in need of repairs—Graham Hall, Humbles Hall, and Mary Jane Cachelin Memorial Science and Library Building. It is there that part of the dispute between black separatists and black accommodationists played out—a dispute centered on the disparate views of Booker T. Washington and W.E.B. Du Bois concerning the disfranchisement of Blacks.

 

Washington, an erstwhile slave, argued for accommodationism—for a sort of bootstrapping. He resisted the temptation to challenge directly the injustices of Jim Crow laws and black disfranchisement. In his famous “Atlanta Compromise,” he maintained that Blacks ought to challenge racial injustices by tardigrade progress through “industrial education, and accumulation of wealth.” His approach was conciliatory, not aggressive. He said in his Atlanta address: “To those of my race who depend on bettering their condition in a foreign land or who underestimate the importance of cultivating friendly relations with the Southern white man, who is their next-door neighbor, I would say: ‘Cast down your bucket where you are’—cast it down in making friends in every manly way of the people of all races by whom we are surrounded.” Blacks could in time eliminate racial injustices by gradual integration in white society through learning industrial skills, vital in the Southern economic climate.

 

W.E.B. Du Bois began as a proponent of Washington’s conciliatory approach to racial injustices. Yet he soon strayed from that approach, which was too tardigrade, conceded too much to the interests of Whites, and did too little to address disfranchisement, lynching, and Jim Crow laws. In his watershed book, The Souls of Black Folk, Du Bois maintained that the separate-but-equal policy was itself proof of Blacks’ inequality. In an obvious poke at Washington, he stated, “To make men, we must have ideals, broad, pure, and inspiring ends of living, not sordid money-getting, not apples of gold.” Education was his vehicle for racial reform. “The function of the Negro college, then, is clear: it must maintain the standards of popular education, it must seek the social regeneration of the Negro, and it must help in the solution of problems of race contact and co-operation. And finally, beyond all this, it must develop men.” He argued that the “Talented Tenth,” a group of highly educated Blacks, would in time change the racist landscape more rapidly.

 

The disparate views of Washington and Du Bois shaped accommodationist and separatist strategies thereafter.

 

Enter Vernon Johns, born on April 22, 1892 in Darlington Heights, Virginia. Johns disagreed with both Washington and Du Bois—viz., he disagreed with Washington’s slow conciliatory approach and Du Bois’ separatism and elitism.

 

Johns, we know, attended VUL when it was Virginia Theological Seminary and College (VTSC, est. 1886), an institution that trained young black men in theology. The year of matriculation was 1912. The institution was separatist in ideology, which probably appealed to Johns, who was no stranger to antagonism. Says Ralph E. Luker in “Johns the Baptist”: “The transfer [to VTSC] was crucial to his development and would shape his career for another two decades, for Virginia Seminary challenged Virginia Union’s cooperation with Northern white Baptists with coeducation of men and women, an emphasis on the liberal arts, and unceasing devotion to African American autonomy.” Johns entered the seminary but was expelled permanently from VTSC for rebelliousness in 1915. The school at the time focused heavily on study through Greek and Latin.

 

Having left the seminary, he matriculated at Oberlin College in 1915, where he received an education that, Luker says, “no one of color might have found anywhere in Virginia and won honors among his classmates.” Johns graduated in 1918 and thereafter studied theology for a year at the University of Chicago’s graduate school.

 

Johns revisited VTSC in 1919 and taught homiletics and the New Testament. He became pastor of Court Street Baptist Church in 1920 and kept that position till 1926. In a 1920 letter to Professor G.W. Fiske of Oberlin College, he writes of his appointment, “My sailing at present is smooth with no clouds in sight, and my prayer is that I may do some good on the voyage and at last be granted a safe harbor.” In 1923, he was removed from the faculty of VTSC on account of harsh criticism of its curriculum. He removed to West Virginia, New York, and then to North Carolina, where he married. He returned inauspiciously to VTSC in 1929—the year of the Great Depression. For five years, he functioned as president of VTSC. In 1933, he was forced to resign, due to students’ protests—they struck until Johns was dismissed—and the financial impoverishment of the institution. Formal complaints were these: “We want a president whose presence on the campus will be a source of joy and not intense displeasure. We want a president who will not regard students and trustees as enemies of the school simply because they oppose or object to certain of his policies or actions. We want a president who will not convert classrooms into coal bins and chicken houses. … We want a president whose remarks to students in chapel services will be advisory and not adverse criticism or lambaste. We want a president who will not stoop to the use of profanity and vulgarity in addressing the students in chapel services and in the presence of young women on the campus.” Johns, it is obvious, was not a popular president.

 

In 1937, Johns began a second appointment at First Baptist Church in West Virginia, where he took up fishmongery, catching and selling fish for additional income. In 1941, he again returned to Lynchburg as pastor of Court Street Baptist Church. He was removed from that position in 1943, after disputes with laymen.

When his wife Altona Trent Johns joined the faculty at Alabama State University’s Department of Music in 1947, Johns became pastor of the esteemed Dexter Avenue Baptist Church. There he continued his rebelliousness with incendiary speeches (e.g., “Segregation after Death” and “When the Rapist Is White”) and loud actions in protest of racial discrimination. Johns resigned in 1952, due once again to unrest from his congregation, and never held a pastorship again. He was eventually succeeded by Dr. Martin Luther King, Jr. Johns died from a heart attack on June 10, 1965, in Washington, D.C.

 

What of Johns the man?

 

The pattern of his life, as the short biographical sketch shows, reveals that Johns could never stay in one place too long. Why? His message of racial equality was direct and unbending, and his vision was broad and far-seeing. As biographers Patrick Louis Cooney and Henry Powell note, “He had a natural ability to be socially insensitive to people individually, yet, at the same time, caring mightily for them as a group.”

 

Yet that seems understated. Johns was so lost in the end he pursued, eminently just, that he could not see a suitable means to achieve that end. He alienated both Whites and Blacks who were impassioned advocates of civil rights. At a convention of white and black preachers in Baltimore in 1960, Johns objected to a talk by a white minister. “The thing that disappoints me about the Southern white church is that it spends all of its time dealing with Jesus after the cross, instead of dealing with Jesus before the cross. … I don’t give a damn what happened to [Jesus] after the cross.” Though he might have been right, the comment offended almost all present. Again, aiming to democratize the elitism of Du Bois, he asked in effect all Blacks in their own way to be part of the Talented Tenth, and he was asking too much of them. At Dexter Church, he earned the reputation, as one biographer notes, of a “‘militant guy,’ who exhorted the congregation like a ‘whirlwind’ to get involved in social issues.” Martin Luther King, Jr., wrote of Johns: “A fearless man, … he often chided the congregation for sitting up so proudly with their many academic degrees, and yet lacking the very thing the degrees should confer, that is, self- respect. One of his basic theses was that any individual who submitted willingly to injustice did not really deserve more justice.”

 

Johns was a critical figure in the push for racial equality, chiefly because he was not merely a pusher, but also a shover. Never afraid of offending others, chiefly the Whites with whom he sought equality, he challenged Jim Crow laws by bringing to trial Whites accused of raping Blacks, sitting with Whites in the front of a bus, and entering Whites-only restaurants. He also consistently offended black students and parishioners by demanding of them the sort of deeds that only a person of bulky courage, vision, and abilities, like himself, could do.

 

Though a critical figure, he was never politically mainstream in the Civil Rights movement. He was kept at the margins of political action against racial injustice because, while he praised many Blacks’ literary and intellectual achievements, he railed against general black indifference to or noninvolvement in civic issues.

 

Johns was also hampered by his large intellectual capacity. Self-educated as a boy, he came to learn Greek, Latin, Hebrew, and German and committed to memory lengthy biblical passages and numerous quotes from philosophers of antiquity, Shakespeare and other poets, sermons, and other literature. Yet his prodigious intellect generally made him largely inaccessible as professor, school president, and pastor, and he seems to have made little effort to make himself accessible—a problem shared by numerous others of large intellect.

 

Johns was also off-putting because he was a strange admixture of husbandman and preacher. He loved the land. Brought up on a farm, he was always a farmer and fisherman at heart. He would unabashedly sell his farmed goods and fish at his churches and at the various schools he attended. Many thought that that was pompous and untoward.

 

Moreover, Johns was restive—always aiming to be a harbinger of outsized change—and he had to instigate that change in his own, inimitable manner. Prodigiously intelligent, unsubtle, and uncomfortable with compromise, he irritated and angered Whites and Blacks alike, and so he was less effective as a harbinger of change than he would have been had he been less intelligent, more subtle, and open to compromise. His successor at Dexter Church, Martin Luther King, Jr., fortunately possessed those qualities that Johns lacked.

 

Finally, Johns was an enigma: he stood solidly for civil rights, but it was never quite clear what his modus operandi for Blacks was. He was no accommodationist, but he also did not fit squarely into the separatist mold. He might best be categorized as an antagonist or a revolutionist, who championed swift and decisive counteractions to unjust actions. In a sermon delivered after the shooting of a black man by a white officer in 1948, Johns reminded his congregation that one of the Ten Commandments was “Thou shalt not kill.” He added that God did not qualify that commandment with “unless you are a police officer” or with “unless you’re White.” Johns then added: “I’ll tell you why it’s safe to murder Negroes. Because Negroes stand by and let it happen.” Johns was arrested for instigation. He was seen as a threat to the white status quo, and in the process he also managed to offend his black congregation with his statement that Blacks would not act.

 

Was Johns a failure because he could have been a greater harbinger of change had he aimed for subtlety and conciliation?

 

Johns once said, “You should be ashamed to die until you’ve made some contribution to mankind.” Thus, Johns throughout his life strived to improve not just Blacks’ condition, but the human condition. Johns lived up to and greatly exceeded that mark. In doing so, he has set a high mark for us, irritated today by injustices of any sort, to match.

What Democratic Socialism Is and Is Not

In recent weeks, Donald Trump and other Republicans have begun to tar their Democratic opponents with the “socialist” brush, contending that the adoption of socialist policies will transform the United States into a land of dictatorship and poverty.  “Democrat lawmakers are now embracing socialism,” Trump warned the annual Conservative Political Action Conference in early March.  “They want to replace individual rights with total government domination.” In fact, though, like many of Trump’s other claims, there’s no reason to believe it.

The ideal of socialism goes back deep into human history and, at its core, is based on the notion that wealth should be shared more equitably between the rich and the poor. Numerous major religions have emphasized this point, criticizing greed and, like the revolutionary peasants of 16th century Germany and the rebellious Diggers of 17th century England, preaching the necessity for “all God’s children” to share in the world’s abundance. The goal of increased economic equality has also mobilized numerous social movements and rebellions, including America’s Populist movement and the French Revolution.                                                                               

But how was this sharing of wealth to be achieved?  Religious leaders often emphasized charity.  Social movements developed communitarian living experiments. Revolutions seized the property of the rich and redistributed it.  And governments began to set aside portions of the economy to enhance the welfare of the public, rather than the profits of the wealthy few.

In the United States, governments at the local, state, and federal level created a public sector alongside private enterprise.  The American Constitution, drafted by the Founding Fathers, provided for the establishment of a U.S. postal service, which quickly took root in American life.  Other public enterprises followed, including publicly-owned and operated lands, roads, bridges, canals, ports, schools, police forces, water departments, fire departments, mass transit systems, sewers, sanitation services, dams, libraries, parks, hospitals, food and nutrition services, and colleges and universities.  Although many of these operated on a local level, others were nationwide in scope and became very substantial operations, including Social Security, Medicare, National Public Radio, the National Institutes of Health, and the U.S. armed forces.  In short, over the centuries the United States has developed what is often termed “a mixed economy,” as have many other countries.

Nations also found additional ways to socialize (or share) the wealth.  These included facilitating the organization of unions and cooperatives, as well as establishing a minimum wage, unemployment insurance, and a progressive tax policy―one with the highest levies on the wealthy and their corporations.

Over the course of U.S. history, these policies, sometimes termed “social democracy,” have enriched the lives of most Americans and have certainly not led to dictatorship and economic collapse. They are also the kind championed by Bernie Sanders and other democratic socialists.

Why, then, does a significant portion of the American population view socialism as a dirty word?  One reason is that many (though not all) of the wealthy fiercely object to sharing their wealth and possess the vast financial resources that enable them to manipulate public opinion and pull American politics rightward.  After all, they own the corporate television and radio networks, control most of the major newspapers, dominate the governing boards of major institutions, and can easily afford to launch vast public relations campaigns to support their economic interests.  In addition, as the largest source of campaign funding in the United States, the wealthy have disproportionate power in politics.  So it’s only natural that their values are over-represented in public opinion and in election results.

But there’s another major reason that socialism has acquired a bad name:  the policies of Communist governments.  In the late 19th and early 20th centuries, socialist parties were making major gains in economically advanced nations.  This included the United States, where the Socialist Party of America, between 1904 and 1920, elected socialists to office in 353 towns and cities, and governed major urban centers such as Milwaukee and Minneapolis. But, in Czarist Russia, an economically backward country with a harsh dictatorship, one wing of the small, underground socialist movement, the Bolsheviks, used the chaos and demoralization caused by Russia’s disastrous participation in World War I to seize power. Given their utter lack of democratic experience, the Bolsheviks (who soon called themselves Communists) repressed their rivals (including democratic socialists) and established a one-party dictatorship.  They also created a worldwide body, the Communist International, to compete with the established socialist movement, which they denounced fiercely for its insistence on democratic norms and civil liberties.

In the following decades, the Communists, championing their model of authoritarian socialism, made a terrible mess of it in the new Soviet Union, as well as in most other lands where they seized power or, in Eastern Europe, took command thanks to post-World War II occupation by the Red Army.  Establishing brutal dictatorships with stagnating economies, these Communist regimes alienated their populations and drew worldwide opprobrium.  In China, to be sure, the economy has boomed in recent decades, but at the cost of supplementing political dictatorship with the heightened economic inequality accompanying corporate-style capitalism.

By contrast, the democratic socialists―those denounced and spurned by the Communists―did a remarkably good job of governing their countries. In the advanced industrial democracies, where they were elected to office on numerous occasions and defeated on others, they fostered greater economic and social equality, substantial economic growth, and political freedom.

Their impact was particularly impressive in the Scandinavian nations.  For example, about a quarter of Sweden’s vibrant economy is publicly-owned. In addition, Sweden has free undergraduate college/university tuition, monthly stipends to undergraduate students, free postgraduate education (e.g. medical and law school), free medical care until age 20 and nearly free medical care thereafter, paid sick leave, 480 days of paid leave when a child is born or adopted, and nearly free day-care and preschool programs.  Furthermore, Sweden has 70 percent union membership, high wages, four to seven weeks of vacation a year, and an 82-year life expectancy.  It can also boast the ninth most competitive economy in the world. Democratic socialism has produced similar results in Norway and Denmark.

Of course, democratic socialism might not be what you want.  But let’s not pretend that it’s something that it’s not.

Trump's Executive Order Censors Free Speech on College Campuses

 

In 1961, a historian at the University of Pittsburgh named Robert G. Colodny was called before the House Un-American Activities Committee. Colodny was just one of HUAC’s many targets, a list which included screenwriters like Dalton Trumbo and playwrights such as Arthur Miller. HUAC remained a fearsome and fundamentally anti-democratic means of intimidation and often professional ruin even after the height of the McCarthy era’s Red baiting. The professor drew suspicion after he innocuously referred to Cuban “agrarian reforms” in the Pittsburgh Press, which was enough for a local state representative to label Colodny a communist sympathizer. Shortly after, Congress and then the university itself launched investigations. This, it should be said, is what an attack on academic freedom looks like. 

Part of what contributed to the professor’s new-found notoriety was that Colodny had been among those idealists and visionaries, including writers like Ernest Hemingway and George Orwell, who enlisted themselves in the army of the democratically elected government of Republican Spain, which in the late 1930’s was threatened and ultimately defeated by the fascist forces of the future dictator Francisco Franco. They were often tarred as “prematurely anti-fascist,” with historian Adam Hochschild explaining in Spain in Our Hearts: Americans in the Spanish Civil War 1936-1939 that for those fighters the conflict was “seen as a moral and political touchstone, a world war in embryo, in a Europe shadowed by the rapid ascent of fascism.” Franco received aid and assistance from Mussolini and Hitler, with the Luftwaffe’s brutal destruction of the Basque city of Guernica indeed a prelude to the coming horror of the bloodiest war in human history. Women and men like Colodny, who served in the international brigades, correctly believed that right-wing nationalism and international fascism should be countered on the battlefields of Spain. As Orwell would write in his 1938 account Homage to Catalonia, “I recognized it immediately as a state of affairs worth fighting for.” From 1937 until the following year, Colodny would fight in a battalion of volunteers known as the Abraham Lincoln Brigade, the first integrated squadron of American soldiers, and one of over fifty international brigades composed of leftists who fought against the Spanish fascists. The future professor sustained a gunshot wound above his right eye which left Colodny partially paralyzed and blind. Despite his injuries, he’d later serve in the American armed forces, going on to receive a doctorate in history at the University of California at Berkeley, where he specialized in the philosophy of science.

Such were the vagaries of a fascinating, if unassuming, professional career until Colodny would be called to account for his anti-fascist record. After his congressional testimony, the University of Pittsburgh was under pressure to terminate Colodny’s appointment, but after six months of investigation they would conclude that the professor’s political opinions and service didn’t constitute a reason for dismissal. Pitt’s Chancellor Edward H. Litchfield wrote in his conclusion to the investigation, in a statement that deserves to be the canonical statement on academic freedom, that a university “embraces and supports the society in which it operates, but it knows no established doctrines, accepts no ordained patterns of behavior, acknowledges no truth as given. Were it otherwise, the university would be unworthy of the role which our society has assigned it.”   

It is as moving and apt an encapsulation of the free inquiry that lies at the heart of American higher education as any that’s ever been written, and one that is today under serious threat from the machinations of the Trump administration. On March 21st Trump signed an executive order with the anodyne designation of “Improving Free Inquiry, Transparency, and Accountability at Colleges and Universities,” a declaration that, by name alone, one might easily assume is congruent with Litchfield’s idealistic argument of half a century ago. But the order’s language, which claims that we must “encourage institutions to appropriately account” for free inquiry in their “administration of student life and to avoid creating environments that stifle competing perspectives,” lacks not just Litchfield’s poetry but indeed means the exact opposite of that earlier defense. Trump’s order, fulfilling a promise to his right-wing supporters and their long-standing obsession with a perceived liberal bias in the academy, exists not to promote inquiry, but to stifle it; not to expand perspectives, but rather to limit them; not to encourage free speech, but to censor it.

Trump’s order germinated out of the debate surrounding the scheduling of fascist speakers at universities. It can arguably be traced back to an incident at Colodny’s alma mater of Berkeley, which incidentally was also the birthplace of the Free Speech Movement of the 1960s. In 2017, violent confrontations between political groups, none of them affiliated with the university, led to the cancelling of one event due to security concerns. Importantly, the university had approved the speaker’s visit, and indeed the speaker was paid with student activity funds. At no point was the speaker censored or oppressed, despite his abhorrent views.

With his characteristic grammar, punctuation, orthography and enthusiasm for capitalization, the president tweeted on February 2, 2017 that “If U.C. Berkeley does not allow free speech and practices violence on innocent people with a different point of view – NO FEDERAL FUNDS?” Tallying the inaccuracies in a Donald J. Trump statement is a bit like searching for sand at the beach, but it should go without saying that neither Berkeley faculty nor its administration had enacted  “violence on innocent people.” Rather a hitherto invited right-wing speaker arrived with his own retinue of supporters that were countered by community groups not affiliated with the university itself, and unsurprisingly hate speech generated hate. 

The language of the March 21st executive order is nebulous, but seems to imply that colleges and universities will lose federal funds if they choose not to host certain speakers. This is, as should be obvious, the opposite of free speech. A university has every right to decide who will speak on its campus, and the community certainly has the right to object to certain speakers, who are normally paid from the budget generated by student activity fees. It’s unclear if such a federal order will be consistently applied, so that an evangelical college would be required to invite pro-choice speakers, or a Christian university would have to pay visiting atheist lecturers, but I’ll let you guess what the intent of the proclamation most likely is.

Colodny’s brother-in-arms George Orwell would probably have something astute to say about the manner in which the Trump administration has commandeered the language of free speech in order to subvert free speech. At the very least you have to appreciate the smug arrogance of it. Such an executive order, which is red meat to Trump’s base, is the culmination of two generations of neurotic, anxious, right-wing fretting about perceived liberal infiltration of colleges and universities. While it’s true that faculty, depending on discipline, tend to vote liberal, you’re as likely to find a genuine Marxist among university professors as you are to find an ethical member of the Trump administration itself. Furthermore, this concern over “political diversity” is raised only when conservatives feel threatened, and academe is simply the one small corner of society not completely dominated by the right. Ask yourself what insecurity compels those who dominate the executive branch, dozens of state governments, business, and increasingly the judiciary to continually fulminate about academe, Hollywood, and the media.

You’ll note that the concern over the perceived lack of political diversity among faculty normally begins and ends with the social sciences and humanities, though more recently the natural sciences have also been attacked for daring to challenge conservative ideological orthodoxy on issues such as climate change. Conservatives aren’t concerned about a lack of diversity among business faculty, or, even more importantly, among the trustees of colleges and universities, where every higher education worker knows the real power is concentrated. For that matter, there is no equivalent hand-wringing about political diversity on corporate boards, though perhaps a socialist sitting in on a board meeting at Bank of America could have done us all some good in 2008. Nobody in the Republican Party seems terribly concerned that other professions which hew to the right, be they law enforcement officers or investment bankers, lack a “diversity” of political opinions in their ranks.

That’s because today’s order obviously has nothing actually to do with free inquiry and diversity; rather, it intends to put a stranglehold on them. Terry Hartle, the senior vice president for government and public affairs at the American Council on Education, said in a speech that “As always in the current environment, irony does come into play. This is an administration that stifles the views of its own research scientists if they are counter to the political views of the administration… And the president vigorously attacks people like Colin Kaepernick.”

It’s impossible to interpret much of what the administration does without an awareness of its finely honed sense of sadistic irony and mocking sarcasm. In such a context, Thursday’s executive order, whose full ramifications remain unclear, is far from a defense of free inquiry; it is rather a sop to those like right-wing activist David Horowitz, director of his own self-named and so-called “Freedom Center,” or the administrators of the website Professor Watchlist, maintained by the conservative group Turning Point USA. Trump’s executive order is an attempt to return us to the era in which Colodny could be fired for his progressive views, an age of blacklists and loyalty oaths.

Anyone who attends a college, has children enrolled in one, or works in higher education is amply aware that the state of the American university is troubled. The recent enrollment scandal, in which wealthy parents simply paid their children’s way into elite institutions (as cynically unsurprising as this may be), only underscores the malignancies that define too much of post-secondary education in America today. College is too expensive, too exclusionary, and its resources are misallocated. The academic job market is punishing, and it does not serve the graduate students who aspire to professorial jobs. Undergraduates take on obscene amounts of debt, and the often-inflated reputation of the Ivy League and a handful of other institutions still sets too much of the tenor of American social, political, and cultural life. But none of these problems exist because the university is too “liberal.” To the contrary, American higher education could stand to move a lot further to the left in terms of admissions and employment. If anything, the current crisis in higher education is most closely related to the imposition of a business mentality upon institutions whose goal was never the accumulation of profit for its own sake.

For despite its contradictions, American higher education has historically remained the envy of the world. There is a reason that international students clamor for a spot at an American college. Since the emergence of the American research university in the 19th century, higher education has been at the forefront of research and innovation. Even more importantly, democratizing legislation such as the GI Bill and affirmative action transformed American universities into the greatest engine of upward class mobility in human history. It’s not a coincidence that conservative attacks on higher education began right at the moment when it became available to the largest number of people, but the nature of these most recent attacks, making federal funding contingent on which right-wing agitator receives a hefty speaker’s fee, could have a chilling effect on education.

Sociologist Jonathan R. Cole writes in The Great American University that our system of higher education has been “able to produce a very high proportion of the most important fundamental knowledge and practical research discoveries in the world.” By intervening in the details of who is invited to speak on a college campus (which is of course separate from censorship), the federal government threatens the independence and innovation of higher education, imposing an ideologically approved straitjacket upon what has historically been our great laboratory of democracy. Colodny wrote that the goal of higher education was to ensure that “some traditional holder of power feels the tempest of new and renewing ideas.” The man who currently occupies the Oval Office can’t abide either of those things, and so he would rather burn it all down than spend a moment being threatened by institutions that actually enshrine free inquiry. The gross obscenity is that he is self-righteously claiming the mantle of that same free inquiry to do it.

 

 

What is Antisemitism? Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

 

Antisemitism is alive and well these days. In Europe and America, the number of antisemitic incidents is increasing every year, according to those who try to keep track.

 

News about antisemitism has recently wandered from the streets and the internet into the halls of Congress. The presence of two newly elected young Muslim women in the House, who openly advocate for Palestinians against Israel, has upset the strongly pro-Israel consensus that has dominated American politics for decades. Accusations of antisemitism are especially directed at Ilhan Omar from Minneapolis, who has used language that is reminiscent of traditional antisemitic themes in her criticism of Israeli policies. Her case demonstrates that it can be difficult to distinguish between unacceptable antisemitism and political criticism of the Jewish government of Israel and its supporters.

 

Some incidents seem easy to label as antisemitic. Take, for example, the case in which a large group of young people physically attacked Jewish women while they were praying. Many women were injured, including the female rabbi leading the prayers. The attackers carried signs assailing the women’s religious beliefs, and the press reported that the women “were shoved, scratched, spit on and verbally abused.”

 

An obvious case of antisemitism? No, because the attackers were ultra-Orthodox Jewish girls and boys, bussed to the Western Wall in Jerusalem in order to attack the non-Orthodox Women of the Wall, who were violating misogynist Orthodox traditions about who can pray at the Wall. This incident fulfills every possible definition of antisemitism. For example, the International Holocaust Remembrance Alliance offers the following description of public acts that are antisemitic: “Calling for, aiding, or justifying the killing or harming of Jews in the name of a radical ideology or an extremist view of religion.” The ultra-Orthodox leaders who encouraged the assault would argue that they were protecting, not attacking Judaism, and that the Women of the Wall were not really Jewish anyway.

 

Acts of antisemitism are political acts. Accusations of antisemitism are likewise political acts, deployed in the service of the political interests of the accusers. Many, perhaps most, accusations of antisemitism are made in good faith for the purpose of calling attention to real religious prejudice. But such accusations are often made for less honest political purposes.

 

The Republicans in Congress who demand that Democrats denounce Ilhan Omar are cynically using the accusation of antisemitism for political gain. Many Republicans have themselves made statements or run political advertisements that are clearly antisemitic. The rest have stood by in silence while their colleagues and their President made antisemitic statements. But they saw political advantage in attacking a Democrat as antisemitic.

 

Supporters of the Israeli government’s policies against Palestinians routinely accuse their critics of antisemitism as a means of drawing attention away from Israeli policies and toward the motives of Israel’s critics. Sometimes critics of Israel are at least partially motivated by antisemitism. But the use of this rhetorical tactic also often leads to absurdity: Jews who do not approve of the continued occupation of land in the West Bank or the discrimination against Palestinians in Israel are accused of being “self-hating Jews”.

 

This linking of antisemitism and criticism of Israeli policy has worked well to shield the Israeli government from reasonable scrutiny of its policies. In fact, there is no necessary connection between the two. Criticism of current Israeli policy is voiced by many Jews and Jewish organizations, both religious and secular.

 

Supporters of the idea of boycotting Israeli businesses as a protest against Israeli treatment of Palestinians, the so-called BDS movement, are sometimes assumed to be antisemitic and thus worthy of attack by extremists. But J Street, the pro-Israel, pro-peace Jewish organization in Washington, argues that “Efforts to exclude BDS Movement supporters from public forums and to ban them from conversations are misguided and doomed to fail.” I don’t remember any of the supporters of boycotting and divesting from South Africa because of its racial policies being called anti-white.

 

Those who advocate a “one-state solution” to the conflict between Israel and the Palestinians are sometimes accused by conservatives of being antisemitic, with the argument that this one state will inevitably come to have a Muslim majority. The Washington Examiner calls this equivalent to the “gradual genocide of the Jewish people”.

 

The absurdity of equating anti-Zionism with antisemitism is personified by the denunciations of Zionism and the existence of Israel by the Orthodox Satmar, one of the largest Hasidic groups in the world.

 

On the other side, the most vociferous American supporters of Prime Minister Netanyahu’s government have been evangelical Christians. Although they claim to be the best friends of Israel, the religious basis of right-wing evangelical Christianity is the antisemitic assertion that Jews will burn in hell forever, if we do not give up our religion. Robert Jeffress, the pastor of First Baptist Church in Dallas, who spoke at President Trump’s private inaugural prayer service, has frequently said that Jews, and all other non-Christians, will go to hell. The San Antonio televangelist John C. Hagee, who was invited by Trump to give the closing benediction at the opening of the new American Embassy in Jerusalem, has preached that the Holocaust was divine providence, because God sent Hitler to help Jews get to the promised land. Eastern European nationalists, who often employ antisemitic tropes to appeal to voters, are also among the most vociferous supporters of Netanyahu and Israel.

 

Political calculations have muddied our understanding of antisemitism. Supporters of the most right-wing Israeli policies include many people who don’t like Jews. Hatreds which belonged together in the days of the KKK may now be separated among right-wing white supremacists.

 

But no matter what they say, purveyors of racial prejudice and defenders of white privilege are in fact enemies of the long-term interests of Jews all over the world, who can only find a safe haven in democratic equality.

Mike Pence Says the US Has Been "A Force For Good in the Middle East" for "nearly 200 years"; Here's How Historians Responded Allen Mikaelian is a DC-based editor and writer. He received his history PhD from American University and served as editor of the American Historical Association’s magazine, Perspectives on History. The Political Uses of the Past Project collects and checks statements by elected and appointed officials. This is the first installment of what will hopefully become a regular feature of the project. Read more about the project here. Contact the editor of the project here.

Vice President Pence: "For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, the United States has been a force for good in the Middle East"

For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, the United States has been a force for good in the Middle East. Previous administrations in my country too often underestimated the danger that radical Islamic terrorism posed to the American people, our homeland, our allies, and our partners. Their inaction saw the terrorist attacks from the U.S.S. Cole; to September 11th; to the expansion of ISIS across Syria and Iraq — reaching all the way to the suburbs of Baghdad. But as the world has witnessed over the past two years, under President Trump, those days are over. —Vice President Michael Pence, Remarks, Warsaw Ministerial Working Luncheon, February 14, 2019

Historians say...

Eight historians responded to our request for comment; their full statements and recommended sources are on the Political Uses of the Past page.

The vice president starts with the 1833 treaty with Oman, and so shall we, even though it’s an odd place to start. As Will Hanley of Florida State University noted in his reaction to Pence’s claim, the treaty itself is a piece of routine boilerplate, not so different “from dozens of other 1830s agreements between Middle East authorities and representatives of American and European states.” But there was at least one innovation, as Hanley explains: “The Sultan of Muscat inserted a clause saying that he, rather than the US, would cover the costs of lodging distressed American sailors. A more accurate statement [by Pence] on this evidence would be ‘For nearly 200 years, stretching back to our Treaty of Amity and Commerce with Oman, representatives of the United States have pursued standardized agreements in the Middle East and enjoyed meals that we haven't paid for.’”

Vice President Pence made this broad statement at a ministerial meeting on terrorism, but his mind was primarily on Iran. His intent was to draw a contrast between the United States and Iran, with the former being a “force for good” in the region and the latter being a perpetrator of continual violence. But by going back to 1833 to reference a routine and fairly boring trade agreement with a minor kingdom, he appears to be grasping at straws.

If Pence was looking for good done by the United States in the Middle East, he could have asked some of the historians who reacted to his statement. He may have learned from Joel Beinin how “American missionaries established some of the leading universities in the Middle East: The American University of Beirut, The American University in Cairo and Robert College in Istanbul. The Medical School of AUB is among the best in the region.” He may have been interested to hear from Indira Falk Gesink that "after World War I, most of those polled in the regions surrounding Syria wanted the US as their mandatory power (if they wanted any)." He may have learned from Lior Sternfeld how the United States has sponsored “schools, universities, and orphanages” and took a stand against its European allies and Israel during the Suez Crisis of 1956.

But if he had asked and had learned about these efforts, he would also have learned from Professor Beinin that many of the missionaries who established these schools went to work for the CIA in the postwar period, “so even the very best thing that Americans have done in the Middle East since the early 19th century was corrupted by government efforts to exert power over the region in order to control its oil.” And Pence would have also had to hear Professor Sternfeld tell about the 1953 coup in Iran that cemented a brutal regime in place for the next quarter-century and how, as described by Professor Gesink, "from that point on, US actions in the Middle East were guided by demand for oil and anti-Communist containment." Finally, he would have had to hear about how much that 1953 coup has to do with our relations with Iran now.

Historians who replied to our request for comment could not find much “force for good” in the historical record. Instead, they find “death, displacement, and destruction” (Ziad Abu-Rish), support for “the most ruthless and brutal dictators at every turn” and the “most fanatical and chauvinistic nationalist and religious forces at every turn” (Mark Le Vine), “intense and destructive interventions … characterized by public deception, confusion, and mixed motives” (Michael Provence), "a moral compromise with authoritarianism"  (Indira Falk Gesink), and actions that have “contributed to breakdowns in security, widespread violence, and humanitarian disaster” (Dale Stahl).

Homage to the Shah after the coup d'état, 5 September 1953 (The Guardian, “Unseen images of the 1953 Iran coup”).

Three historians below recommend The Coup: 1953, The CIA, and The Roots of Modern U.S.-Iranian Relations by Ervand Abrahamian, and this book is incredibly pertinent today. Previous historical accounts and justifications by 1950s policymakers made the coup all about Mosaddegh’s unwillingness to compromise or said it was all about winning the Cold War. Abrahamian instead shows that it was about oil, or, more specifically, “the repercussions that oil nationalization could have on such faraway places as Indonesia and South America, not to mention the rest of the Persian Gulf.” And for this, Iran and the Middle East got, courtesy of the United States, the brutal Mohammad Reza Shah. The shah crushed the democratic opposition, filling his jails with thousands of political prisoners, and left “a gaping political vacuum—one filled eventually by the Islamic movement.” And so here we are.

Mike Pence’s incredibly blinkered statement can be viewed as an extreme counterpoint to the right-wing view of Obama’s Cairo speech, in which the president mildly acknowledged that the US had not always been on the side of right in the Middle East, and that its history of actions has come back to haunt us all. Such things, it seems, must not be spoken in the muscular Trump administration, even if it means abandoning an understanding that might actually be useful. “For me as an historian,” Mark Le Vine notes below, “perhaps the worst part of the history of US foreign policy in the region is precisely that scholars have for so long done everything possible to inform politicians, the media and the public about the realities there. Largely to no avail.” Indeed, Mike Pence here appears intent on utterly blocking out history and historical thinking, even as he dreams of a long and glorious past.

Browse and download sources recommended by the historians below from our Zotero library, or try our in-browser library.

 

Ziad Abu-Rish, Assistant Professor of History at Ohio University

I'm only going to tackle the "force for good" claim, without getting into the claims about Trump compared to his predecessors or the notion of "radical Islamic terrorism." Let's give Vice President Pence a chance at being correct... Read more

Joel Beinin, Donald J. McLachlan Professor of History and Professor of Middle East History, Emeritus, Stanford University

American missionaries established some of the leading universities in the Middle East: The American University of Beirut, The American University in Cairo and Robert College in Istanbul. The Medical School of AUB is among the best in the region... Read more

Indira Falk Gesink, Baldwin Wallace University

I think this is a much more complicated question than is generally acknowledged. On the one hand, some American private citizens have had long-lasting positive impact, for example the founding of educational institutions such as Robert College, the American University in Beirut (originally the Syrian Protestant College), and the American University in Cairo. At that time, the US generally was viewed positively in the region. ... Read more

Will Hanley, Florida State University

It's not possible to use historical evidence to support a black-and-white statement like "The United States has been a force for good in the Middle East." Even if it were possible, the slim 1833 treaty between the US and the Sultan of Muscat is meager evidence. ... Read more

Mark Andrew Le Vine, Professor of Modern Middle Eastern History, UC Irvine

This statement is ridiculous even by the standards of the Trump administration. The US has been among the most damaging forces in the Middle East for the last three quarters of a century. ... Read more

Michael Provence, Professor of Modern Middle Eastern History, University of California, San Diego

The United States had no role in the Middle East before 1945, apart from private business and educational initiatives. Within a couple years of 1945, the US tilted toward Israel in its first war, began overthrowing democratic Middle Eastern governments, and propping up pliant dictators. ... Read more

Dale Stahl, Assistant Professor of History, University of Colorado Denver

I see this statement as "more or less false" because there are clear examples where the United States has not had a positive influence in the Middle East. One needn't reflect very far back into that "nearly 200 years" of history to know that this is so. ... Read more

Lior Sternfeld, Penn State University

While the US had some moments where it was a force for good, with projects like schools, universities, and orphanages, it was also a source for instability in cases like the 1953 coup against Mosaddegh that overturned the course not just of Iran but of the region in its entirety. Read more

 

Roundup Top 10!  

The New Zealand Shooting and the Great-Man Theory of Misery

by Jelani Cobb

Most of the men who committed these recent acts of terror composed manifestos. A sense of history turning on the fulcrum of a single man’s actions is a theme within them.

 

Nazis Have Always Been Trolls

by Adam Serwer

Historically, they rely on murderous insincerity and the unwillingness of liberal societies to see them for what they are.

 

 

The first time the U.S. considered drafting women — 75 years ago

by Pamela D. Toler

As legislative debate about drafting women in 1945 shows, if the military need is great enough, women will be drafted no matter how uncomfortable lawmakers are with the prospect.

 

 

Poor criminal defendants need better legal counsel to achieve a just society

by Connie Hassett-Walker

Why we must fulfill the promise of a famous Supreme Court decision to truly achieve criminal justice reform.

 

 

Native children benefit from knowing their heritage. Why attack a system that helps them?

by Bob Ferguson and Fawn Sharp

For 40 years, the Indian Child Welfare Act has protected the best interests of Native children and helped preserve the integrity of tribal nations across the United States.

 

 

The Story of the Dionne Quintuplets Is a Cautionary Tale for the Age of ‘Kidfluencers’

by Shelley Wood

The pitfalls and payoffs of advertising directly to children have consumed psychologists, pediatricians, marketers and anxious parents for the better part of a century.

 

 

Citizenship in the Age of Trump

by Karen J. Greenberg

Death By a Thousand Cuts

 

 

When bad actors twist history, historians take to Twitter. That’s a good thing.

by Waitman Wade Beorn

Engaging with the public isn’t pedantry; it’s direct engagement.

 

 

Americans don’t believe in meritocracy — they believe in fake-it-ocracy

by Niall Ferguson

This illegal “side door” into college came into existence because the back door of a fat donation — like the $2.5 million paid by Jared Kushner’s father to Harvard — isn’t 100 percent reliable.


 

Who’s the snowflake? We tenured professors, that’s who

by Anita Bernstein

Our freedom to say what we want is not only tolerated but celebrated.

Andy Warhol: A Lot More than Soup Cans A month ago, I watched a television program that covered, briefly, the art of pop icon Andy Warhol, he of all the Campbell’s Soup cans. The narrator said that Warhol had passed into history and that young people today probably had no idea who he was.

I was startled. Young people did not know who the thin man with the white hair was, the man who hung out with Liz Taylor, Liza Minnelli, dress designer Halston and the Jaggers? The man who painted the famous Mao portrait? Truman Capote’s buddy?

I’m a professor, so the next day I asked my classes, 25 students in each, if they knew who Andy Warhol was. I didn’t say artist or painter Andy Warhol, just Andy Warhol.

The hands shot into the air. About 95% of them knew who he was.

Andy Warhol will never pass from the scene. That is proven, conclusively, in the largest exhibit of his work in generations at the Whitney Museum, in New York, Andy Warhol – From A to B and Back Again. It is a marvelous and exciting tribute to his work and is attracting huge crowds.

The crowds are not art aficionados from the 1960s, either, but young women with baby carriages, high school student groups, young couples and foreign tourists. Warhol was an international celebrity superstar in addition to being a memorable artist, and, these crowds indicate, he always will be remembered.

“Modern art history is full of trailblazers whose impact dims over time,” said Curator Scott Rothkopf. “But Warhol is that extremely rare case of an artist whose legacy grows only more potent and lasting. His inescapable example continues to inspire, awe and even vex new generations of artists and audiences with each passing year.”

Another curator, Donna De Salvo, said the originally avant-garde Warhol has become part of mainstream art. “Warhol produced images that are now so familiar that it’s easy to forget just how unsettling and even shocking they were when they debuted,” she said.

Warhol really became famous not so much because of his new age art, but because of his celebrity. He was friends with many of the biggest entertainment stars in the world, was a fixture at the legendary New York nightclub Studio 54 in the late 1970s and early 1980s, palled around with fashion designer Halston, drank wine with Liza Minnelli and lunched with Liz Taylor. He was almost murdered in 1968 when an irate actress from his film studio, the Factory, shot him several times. The shooting made front page news all over the world. He was a central character in the movie Factory Girl, about Edie Sedgwick, one of his Factory actresses.

Everybody recognized him instantly since he wore those thick glasses and had that mop top of dyed white hair. That fame was why people paid so much attention to his often-bizarre work. Some said that the quiet boy from Pittsburgh, who fell in love with Shirley Temple as a kid, created a unique persona for himself that worked well.

The Warhol exhibit, a real achievement in cultural history, occupies all of the fifth floor at the Whitney plus additional galleries on the first and third floors. The best way to start is on the first floor and the gallery of his oversized portraits. They are mounted in long rows across the walls of the room and they introduce you to Andy the celebrity and Andy the artist at the same time. The portraits also tell you a lot about show business and art history in the 1960s and ‘70s. There are lots of famous people on the walls here, like Liza Minnelli, Dennis Hopper, soccer star Pele, socialite Jane Holzer and Halston, but lots of people you never heard of, too.

The third floor houses wall after wall of his famous “Cow Wallpaper,” adorned with hundreds of similar heads of a brown cow. It is eye-opening and hilarious.

Another room has a stack of his popular blue and white Brillo pad boxes and a wall full of S & H Green Stamps (remember them?).

There are his paintings of magazine covers and lots of newspaper front pages (an eerie one about a 1962 Air France plane crash).

You learn a lot about his personal life. As an example, as a young man he became a fan of Truman Capote, author of Breakfast at Tiffany’s, and called him every single day.

There are drawings of celebrities’ shoes to show how they represented their personalities. Christine Jorgensen was one of the first modern openly transgender women, so she has shoes that don’t match each other.

Unknown to most, he loved to do paintings of comic strip characters. Two in the exhibit, of Superman and Dick Tracy, in blazing bright colors, were displayed in a New York City department store window.

What makes the exhibit so enjoyable at the Whitney Museum, recently opened on Gansevoort Street near the Hudson River, is the way the curators use its space. Unlike most museum exhibits, where everything is scrunched together, the curators used the large, high-ceilinged rooms wisely, putting the 350 Warhol pieces, especially the very large ones (some are thirty feet wide), alone on the pristine white walls so they jump off the wall at you. You go around one corner and there is Elvis Presley as a gunslinger in four separate portraits, firing one of his six-guns. Next to him is Marlon Brando in a leather jacket and on his motorcycle.

There are weird walls of photos, such as the most wanted criminals he drew from photos in a New York State booklet, “13 Most Wanted Men.” There is a series of his copies of Leonardo da Vinci’s Mona Lisa and then his copy of his copy.

He did inspired photos and silkscreens. A woman named Ethel Scull arrived at his studio one day for what she thought would be a traditional portrait. Instead, Warhol took her to Times Square and had her sit for dozens of photos in the cheapie photo booths there, where all the going-steady high school kids went. The result – a sensational wall of photos of her in different giddy and seductive poses. Brilliant.

There are photos of Jackie Kennedy Onassis. One set shows her smiling on the morning of the day her husband was murdered and then, in the next strip, somber at the President’s funeral. There is a wall full of his famous Marilyn Monroe images. There is a world famous, mammoth, and I mean mammoth, portrait of China’s Chairman Mao. One wall is filled with his fabled Campbell’s soup can paintings and another with his Coca-Cola works.

Sprinkled among all of these paintings are real life photos and videos of Warhol at work.

There is a large television set on the third floor on which you see a truly bizarre video of Warhol simply eating a cheeseburger for lunch (he’s going to get sick eating so fast!).

Warhol was also a well-known avant-garde filmmaker, and the museum is presenting dozens of his 16mm movies in a film festival in its third-floor theater. Some of these star the famous Edie Sedgwick, who appeared in many of his films and died tragically of a drug overdose.

Andy Warhol, who died at the age of 58 during a minor operation, led a simple middle-class existence until he arrived in New York. He was born in 1928 in Pittsburgh, graduated from Carnegie Mellon University there and then went to New York, where he became well known. He began his career as a commercial artist, earning money for drawings for magazine ads (dozens of them are in the show).

He became famous for his portraits of Campbell’s Soup cans. He painted them because as a kid his family was so poor that he and his brothers had Campbell’s Soup for lunch every day. Warhol said he had Campbell’s soup for lunch every day for 20 years. He also saw the soup can as a window into America. He was right.

The exhibit is a large open window on American history and culture in the 1960s and ‘70s and how the outlandish Warhol starred in it and, with his genius, changed it.

Andy Warhol not remembered? Hardly.

The exhibit runs through March 31.

 
