Spain issues deadline to Catalans.
Elderly couple among those killed in California wildfire.
NAFTA without Mexico? It’s possible.
Boy Scouts to admit girls; allow them to reach Eagle rank.
See who won the Genius grants.
How The Draft Reshaped America — Amy J. Rutenberg in the New York Times.
“Greeting: You are hereby ordered for induction into the Armed Forces of the United States.” In 1967, more than 300,000 American men opened envelopes with this statement inside. Few pieces of mail ever incited the same combination of panic, anticipation and resignation as a draft notice. The words struck terror in the hearts of many recipients. Others found them comforting after years of waiting for the Selective Service System to come calling.
The Vietnam generation came of age with the threat of military service hovering in the background. Although the Selective Service called relatively few men between the end of the Korean War in 1953 and American escalation in Southeast Asia in 1965, the draft had been in almost continuous operation since before the United States joined World War II. During that time Selective Service, under the leadership of Gen. Lewis B. Hershey, faced little public criticism. In fact, Hershey had shaped it into a venerated institution. Although most men may not have wanted to dedicate two years of their lives to active military service, draftees generally acquiesced to Uncle Sam’s wishes.
After President Lyndon Johnson mobilized ground troops in 1965, draft calls tripled. With each passing year, more men faced conscription to fight a war with whose goals and methods a significant number disagreed. Stories of privileged men finding ways to beat the draft began to circulate. Newspaper articles with headlines like “Young Men Dream Up Some Ingenious Ways to Avoid the Draft” and “Avoiding the Draft Is Becoming the Favorite Sport Among Youth” horrified Americans who believed military service should be an equal obligation of male citizenship. At least as portrayed by reporters, these men were almost always middle class, with seemingly All-American families.
Critics at the time and since have identified the Selective Service’s system of deferments as the main cause of military inequity during the Vietnam War. Although the Department of Defense did not keep records on the socio-economic status or racial identification of service personnel beyond whether they were African-American or not, there’s no doubt that men with fewer resources were less likely to obtain deferments than those with more. As a result, they were more likely to be drafted, serve in combat and die in Vietnam. Long Island’s war dead, for example, hailed overwhelmingly from working-class backgrounds.
But why? How is it that the Selective Service, which had used deferments during both World Wars and the Korean War, allowed the situation to become so bad that by 1967 fewer than half of Americans polled believed that the draft operated fairly? For this answer, one must look to the goals of Cold War liberals, both Republican and Democrat. The deferment policies that created such havoc during the Vietnam War were the direct outgrowth of Washington’s desire to fight Communism at home as well as abroad.
Deferments are a necessary element of any system of selective military service. If a nation does not require all of its citizens to participate in the armed forces, then someone must decide who goes and who stays. Deferments allow those with skills needed on the home front to exempt themselves from their military obligations because, especially during the upheaval of war, they ensure a viable domestic economy and stable society. Factories, hospitals and schools, for example, can operate only when fully staffed with skilled employees. Farmers and agricultural workers maintain necessary food supplies. In theory, deferments should be limited only to those considered more valuable to war aims as civilians than as soldiers.
But the nature of the Cold War, especially early on, complicated things. Defeating Communism was more than a military endeavor; the home front became a crucial site of defense operations. Americans believed that triumph over the Soviet Union required a prolonged ideological, technological and economic struggle. The circumstances of the Cold War, therefore, granted the Selective Service System license to use deferments as a tool of social engineering.
Hershey believed that all able-bodied American men had the obligation to serve the nation, but he began to advocate a definition of service that included civilian pursuits, particularly in science, mathematics and engineering. Throughout the 1950s, the perception that the United States was in danger of falling behind the Soviets caused national panic, especially after the U.S.S.R. successfully launched its Sputnik satellite in 1957. According to politicians and intellectuals, American superiority rested on outpacing Soviet technological development, both in the domestic realm and in the military sector. The Army’s strategic plans for countering atomic attack depended on the invention of new weapons, while consumer capitalism required new products to buy and sell. The United States needed a steady supply of men in STEM fields to develop the state-of-the-art appliances and futuristic weapons systems that it so desperately wanted.
In Hershey’s view, the Selective Service was the “storekeeper” of America’s manpower supply. He believed that the promise of deferments could be used as a tool to coerce — or bribe — men to go to college and enter occupations defined as in the national interest. In the words of one planning memo, the Selective Service could use the “club of induction” to “drive” individuals into “areas of greater importance.” This policy, known as manpower channeling, specifically defined these pursuits as service to the state on a par with military service.
The availability of deferments for men attending college and in professional fields ballooned. Occupational deferments increased by 650 percent between 1955 and 1963. But men had to qualify for higher education and be able to pay for it. Since part-time students did not receive deferments, men could not take semesters off to earn tuition money or recover from academic probation. Eligible occupations skewed toward those with college degrees. Unlike during World War II, most factory and agricultural workers could not gain occupational deferments by the late 1950s. Such dispensations were reserved for scientists, engineers, doctors and teachers.
Even those deferments theoretically available to anyone really were not. Medical deferments, for example, were harder for poorer men to obtain. The doctors performing the cursory exams at pre-induction physicals often failed to detect health defects that would have guaranteed exemptions from military service. And if men did not have a record of private medical care, they had little recourse when declared available for service.
By 1965, many middle-class men had come to expect deferments. Military service, to them, was for “suckers” who had made poor choices. Working-class men, of course, were not “suckers.” Rather, Great Society policies meant to strengthen the economy by alleviating poverty ended up targeting them for military service.
Policy makers in the Kennedy and Johnson administrations began to focus on America’s poor as the weak link between national strength and the promise of democracy. Secretary of Labor W. Willard Wirtz identified the Selective Service as an “incomparable asset” in locating men who could benefit from government aid. Virtually all American men underwent a pre-induction exam. Approximately one-third failed. Such “rejectees” were overwhelmingly from poor and minority backgrounds. In early January 1964, less than two months after taking office, Johnson ordered the Selective Service, the Department of the Army, the Department of Labor and the Department of Health, Education and Welfare to address the problem.
Secretary of Defense Robert McNamara actively wanted the armed forces to be part of the solution. He firmly believed that military service could be used to “rehabilitate” men caught in the cycle of poverty. He, along with Assistant Secretary of Labor Daniel Patrick Moynihan, argued that military training freed poor men from the “squalid ghettos of their external environment” and the “internal and more destructive ghetto of personal disillusionment and despair.” McNamara wanted a program that would bolster national security by eliminating a source of social unrest and benefit American combat readiness by boosting the number of men in uniform.
In August 1966, he announced the Defense Department’s intention to bring up to 100,000 previously ineligible men into the military each year to “salvage” them. Project 100,000, as it came to be known, would “rescue” poor and especially minority men from the “poverty-encrusted environments” in which they had been raised. These so-called New Standards men — who were otherwise ineligible for military service — were to be admitted into all branches of the armed forces, both voluntarily through enlistment and involuntarily through the draft.
In all, the branches of service added a combined total of 354,000 New Standards men to their active-duty rosters between 1966 and 1971, when the program ended. Forty percent of these men were black, at a time when the entire military averaged only 9 percent African-American. McNamara hoped that a stint in the military would make New Standards men better husbands, better fathers and better breadwinners, and thus better citizens. Most ended up as infantrymen in Vietnam.
It was no coincidence that those men who already fit the middle-class mold of domestic masculinity — those men who were college students or teachers or scientists — received deferments. Midcentury liberals believed such men did not need the military to lift them up. Meanwhile, every slot filled by a New Standards man was one a middle-class man avoided.
Ultimately, what made sense during the militarized peace of the Cold War did not during a hot war. Many middle-class men did not consider it their responsibility to serve in the military, especially in a war they often categorized as somewhere on the continuum between unnecessary and immoral. Instead, they learned to work a system designed to encourage them to see military service as a personal choice rather than an obligation. Working-class men simply were not offered the same option.
Diagnosing Trump — Masha Gessen in The New Yorker.
The question is not whether the President is crazy but whether he is crazy like a fox or crazy like crazy. And, if there is someone who can know the difference, should this person, or this group of people, say something—or would that be crazy (or unethical, or undemocratic)?
Jay Rosen, a media scholar at New York University, has been arguing for months that “many things Trump does are best explained by Narcissistic Personality Disorder,” and that journalists should start saying so. In March, the Times published a letter by the psychiatrists Robert Jay Lifton and Judith L. Herman, who stated that Trump’s “repeated failure to distinguish between reality and fantasy, and his outbursts of rage when his fantasies are contradicted” suggest that, “faced with crisis, President Trump will lack the judgment to respond rationally.” Herman, who is a professor at Harvard Medical School, also co-authored an earlier letter to President Obama, in November, urging him to find a way to subject President-elect Trump to a neuropsychiatric evaluation.
Lifton and Herman are possibly the greatest living American thinkers in the field of mental health. Lifton, who trained both as a psychiatrist and a psychoanalyst, is also a psychohistorian; he has written on survivors of the atomic bombs dropped on Japan, on Nazi doctors, and on other expressions of what he calls “an extreme century” (the one before this one). Herman, who has done pioneering research on trauma, has written most eloquently on the near-impossibility of speaking about the unimaginable—and now that Donald Trump is, unimaginably, President, she has been speaking out in favor of speaking up. Herman and Lifton have now written introductory articles to a collection called “The Dangerous Case of Donald Trump: 27 Psychiatrists and Mental Health Experts Assess a President.” It is edited by Bandy X. Lee, a psychiatrist at the Yale School of Medicine who, earlier this year, convened a conference called Duty to Warn.
Contributors to the book entertain the possibility of applying a variety of diagnoses and descriptions to the President. Philip Zimbardo, who is best known for his Stanford Prison Experiment, and his co-author, Rosemary Sword, propose that Trump is an “extreme present hedonist.” He may also be a sociopath, a malignant narcissist, borderline, on the bipolar spectrum, a hypomanic, suffering from delusional disorder, or cognitively impaired. None of these conditions is a novelty in the Oval Office. Lyndon Johnson was bipolar, and John F. Kennedy and Bill Clinton might have been characterized as “extreme present hedonists,” narcissists, and hypomanics. Richard Nixon was, in addition to his narcissism, a sociopath who suffered from delusions, and Ronald Reagan’s noticeable cognitive decline began no later than his second term. Different authors suggest that America “dodged the bullet” with Reagan, that Nixon’s malignant insanity was exposed in time, and that Clinton’s afflictions might have propelled him to Presidential success, just as similar traits can aid the success of entrepreneurs. (Steve Jobs comes up.)
Behind the obvious political leanings of the authors lurks a conceptual problem. Definitions of mental illness are mutable; they vary from culture to culture and change with time. The Diagnostic and Statistical Manual of Mental Disorders is edited every few years to reflect changes in norms: some conditions stop being viewed as pathologies, while others are elevated from mere idiosyncrasies to the status of illness. In a footnote to her introduction, Herman acknowledges the psychiatric profession’s “ignominious history” of misogyny and homophobia, but this is misleading: the problem wasn’t so much that psychiatrists were homophobic but that homosexuality fell so far outside the social norm as to virtually preclude the possibility of a happy, healthy life.
Political leadership is not the norm. I once saw Alexander Esenin-Volpin, one of the founders of the Soviet dissident movement, receive his medical documents, dating back to his hospitalizations decades earlier. His diagnosis of mental illness was based explicitly on his expressed belief that protest could overturn the Soviet regime. Esenin-Volpin laughed with delight when he read the document. It was funny. It was also accurate: the idea that the protest of a few intellectuals could bring down the Soviet regime was insane. Esenin-Volpin, in fact, struggled with mental-health issues throughout his life. He was also a visionary.
No one of sound mind would suspect Trump of being a visionary. But is there an objective, value-free way to draw the very subjective and generally value-laden distinction between vision and insanity? More to the point, is there a way to avert the danger posed by Trump’s craziness that won’t set us on the path of policing the thinking of democratically elected leaders? Zimbardo suggests that there should be a vetting process for Presidential candidates, akin to psychological tests used for “positions ranging from department store sales clerk to high-level executive.” Craig Malkin, a lecturer at Harvard Medical School and the author of “Rethinking Narcissism,” suggests relying on “people already trained to provide functional and risk assessment based entirely on observation—forensic psychiatrists and psychologists as well as ‘profilers’ groomed by the CIA, the FBI, and various law enforcement agencies.” This is a positively terrifying idea. As Mark Joseph Stern wrote in Slate in response to last December’s calls for the Electoral College to un-elect Trump, it “only made sense if you assumed as a starting point that America would never hold another presidential election.”
Psychiatrists who contributed to “The Dangerous Case of Donald Trump” are moved by the sense that they have a special knowledge they need to communicate to the public. But Trump is not their patient. The phrase “duty to warn,” which refers to a psychiatrist’s obligation to break patient confidentiality in case of danger to a third party, cannot apply to them literally. As professionals, these psychiatrists have a kind of optics that may allow them to pick out signs of danger in Trump’s behavior or statements, but, at the same time, they are analyzing what we all see: the President’s persistent, blatant lies (there is some disagreement among contributors on whether he knows he is lying or is, in fact, delusional); his contradictory statements; his inability to hold a thought; his aggression; his lack of empathy. None of this is secret, special knowledge—it is all known to the people who voted for him. We might ask what’s wrong with them rather than what’s wrong with him.
Thomas Singer, a psychiatrist and Jungian psychoanalyst from San Francisco, suggests that the election reflects “a woundedness at the core of the American group Self,” with Trump offering protection from further injury and even a cure for the wound. The conversation turns, as it must, from diagnosing the President to diagnosing the people who voted for him. That has the effect of making Trump appear normal—in the sense that, psychologically, he is offering his voters what they want and need.
Knowing what we know about Trump and what psychiatrists know about aggression, impulse control, and predictive behavior, we are all in mortal danger. He is the man with his finger on the nuclear button. Contributors to “The Dangerous Case of Donald Trump” ask whether this creates a “duty to warn.” But the real question is, Should democracy allow a plurality of citizens to place the lives of an entire country in the hands of a madman? Crazy as this idea is, it’s not a question psychiatrists can answer.
Democrats Can Win — Charles P. Pierce on contesting every race.
I’m reluctant to point this out, lest I blow the covert aspects of some good news, but it seems that, almost without anyone’s noticing, very progressive African-American candidates have been getting elected to be mayors in cities in the very deepest parts of the deep South. First, it was Chokwe Antar Lumumba, an actual Socialist, who was elected mayor in Jackson in Mississippi Goddamn. From the Oxford American:
In Lumumba’s successful campaigns for city council in 2009 and for mayor in 2013, “Free the land” had been a common refrain of his supporters. His platform, too, echoed the vision he and his fellow New Afrikans had harbored for their new society on Land Celebration Day. He pledged that his office would support the establishment of a large network of cooperatively owned businesses in Jackson, often describing Mondragon, a Spanish town where an ecosystem of cooperatives sprouted half a century ago. In debates and interviews, he promised that Jackson, under the leadership of a Lumumba administration, would flourish as the “Mondragon of the South”—the “City of the Future.”
If I may repeat, this is Jackson. The one in Mississippi. Goddamn.
Then, on Tuesday, a man named Randall Woodfin challenged and beat the incumbent mayor of Birmingham, Alabama, William Bell. Woodfin is 36, which will make him the youngest mayor of that city in over a century. More significantly, Woodfin had the active support of Bernie Sanders and the people allied with Sanders’ late campaign for president. Sanders recorded a robo-call on Woodfin’s behalf late in the race and Nina Turner, the head of Our Revolution, the Sanders-affiliated political operation, made two trips to Birmingham on Woodfin’s behalf.
(It should be noted that the Sanders folks also scored victories on Tuesday night in preliminary contests for mayor of Albuquerque and for an open seat in the California Assembly.)
If the Democratic Party weren’t so terminally bumfuzzled, and if many of its activists could get over the wounds their delicate fee-fees suffered during the 2016 presidential primaries, the party could see a great advantage in coordinating efforts between the formal party apparatus and what could be described as the progressive shock troops that carried Woodfin to victory in Birmingham.
Right now, for example, if you can believe it, the Democratic National Committee seems to be slightly baffled about what to do as regards the race for the open U.S. Senate seat in Alabama. The Democratic candidate is Douglas Jones, the former U.S. Attorney who sent to prison the last of the terrorists who bombed the 16th Street Baptist Church in 1963. The Republican candidate is a lawless theocratic nutball named Roy Moore, who lost his job as chief justice of the Alabama Supreme Court twice because of flagrant judicial misconduct.
It would seem to the casual observer that people generally should realize it to be their patriotic duty to keep Moore out of the Senate for the good of the country. However, as reported by The Daily Beast, the Democratic Party apparatus can’t even decide if it should go all in for Jones.
A spokesman for the Democratic Senatorial Campaign Committee said only that the group is closely monitoring the race and providing support if necessary to the Democratic candidate, Doug Jones. The spokesman also said that Sen. Chris Van Hollen (D-MD), the chairman of the DSCC, had made a personal contribution to the Jones campaign. Democratic super PACs, meanwhile, are evaluating their options when it comes to the Alabama general election, which isn’t until December. Before making any investments in the race, they first want to assess how vulnerable Moore is in the state. The former chief justice has emerged from a primary during which virtually every establishment Republican institution was against him. Democratic operatives said on Wednesday that they’re looking to see if some GOP voters keep their distance from Moore before deciding to come to Jones’ aid.
Good god, how is this even a question? Roy Moore is a howling extremist, if that word has any meaning at all anymore. Why would the Democratic Party worry about whether or not Republicans in Alabama are going to “keep their distance” from their party’s lunatic candidate? (Pro Tip: They almost never do.) Get in there with both feet immediately and don’t get out until the job’s done.
Or, if you insist on overthinking yourselves into paralysis, turn Nina Turner and the people allied with her loose and then come in at the end—cooperatively, mind you—and drown the race with money and ads. And if the Our Revolution people hold back because they don’t want somebody on the Internet to get mad at them for “selling out,” they should tell that person to shut up and dance. This is too important. There are now two mayors who’ve proven that progressive candidates can win just about anywhere. Learn that lesson or you deserve to lose forever.
Doonesbury — Hits keep coming.
Conscientious Objector — Charles M. Blow in the New York Times.
Donald Trump schlepped across town on Tuesday to meet with the publisher of The New York Times and some editors, columnists and reporters at the paper.
As The Times reported, Trump actually seemed to soften some of his positions:
He seemed to indicate that he wouldn’t seek to prosecute Hillary Clinton. But he should never have said that he was going to do that in the first place.
He seemed to indicate that he wouldn’t encourage the military to use torture. But he should never have said that he would do that in the first place.
He said that he would have an “open mind” on climate change. But that should always have been his position.
You don’t get a pat on the back for ratcheting down from rabid after exploiting that very radicalism to your advantage. Unrepentant opportunism belies a staggering lack of character and caring that can’t simply be vanquished from memory. You did real harm to this country and many of its citizens, and I will never — never — forget that.
As I read the transcript and then listened to the audio, the slime factor was overwhelming.
After a campaign of bashing The Times relentlessly, in the face of the actual journalists, he tempered his whining with flattery.
At one point he said:
“I just appreciate the meeting and I have great respect for The New York Times. Tremendous respect. It’s very special. Always has been very special.”
He ended the meeting by saying:
“I will say, The Times is, it’s a great, great American jewel. A world jewel. And I hope we can all get along well.”
I will say proudly and happily that I was not present at this meeting. The very idea of sitting across the table from a demagogue who preyed on racial, ethnic and religious hostilities and treating him with decorum and social grace fills me with disgust, to the point of overflowing. Let me tell you here where I stand on your “I hope we can all get along” plea: Never.
You are an aberration and abomination who is willing to do and say anything — no matter whom it aligns you with and whom it hurts — to satisfy your ambitions.
I don’t believe you care much at all about this country or your party or the American people. I believe that the only thing you care about is self-aggrandizement and self-enrichment. Your strongest allegiance is to your own cupidity.
I also believe that much of your campaign was an act of psychological projection, as we are now learning that many of the things you slammed Clinton for are things of which you may actually be guilty.
You slammed Clinton for destroying emails, then Newsweek reported last month that your companies “destroyed emails in defiance of court orders.” You slammed Clinton and the Clinton Foundation for paid speeches and conflicts of interest, then it turned out that, as BuzzFeed reported, the Trump Foundation received a $150,000 donation in exchange for your giving a 2015 speech made by video to a conference in Ukraine. You slammed Clinton about conflicts of interest while she was secretary of state, and now your possible conflicts of interest are popping up like mushrooms in a marsh.
You are a fraud and a charlatan. Yes, you will be president, but you will not get any breaks just because one branch of your forked tongue is silver.
I am not easily duped by dopes.
I have not only an ethical and professional duty to call out how obscene your very existence is at the top of American government; I have a moral obligation to do so.
I’m not trying to convince anyone of anything, but rather to speak up for truth and honor and inclusion. This isn’t just about you, but also about the moral compass of those who see you for who and what you are, and know the darkness you herald is only held at bay by the lights of truth.
It’s not that I don’t believe that people can change and grow. They can. But real growth comes from the accepting of responsibility and repenting of culpability. Expedient reversal isn’t growth; it’s gross.
So let me say this on Thanksgiving: I’m thankful to have this platform because as long as there are ink and pixels, you will be the focus of my withering gaze.
I’m thankful that I have the endurance and can assume a posture that will never allow what you represent to ever be seen as everyday and ordinary.
No, Mr. Trump, we will not all just get along. For as long as a threat to the state is the head of state, all citizens of good faith and national fidelity — and certainly this columnist — have an absolute obligation to meet you and your agenda with resistance at every turn.
I know this in my bones, and for that I am thankful.
“Tu día llegó” — Jennine Capó Crucet on Miami’s reaction to the death of Fidel Castro.
The first time Fidel Castro died was around my birthday in 2006. I was in Miami when the announcement went out that Castro had had an operation and was temporarily ceding power to his brother. This being the first time Castro had voluntarily stepped away from his dictatorship, speculation ran wild. Miami Cubans took to the streets to celebrate the death of a tyrant, a symbol of death and loss for Cubans of all races and faiths.
This morning, my sister texted, “Fidel is dead… again,” one of 26 messages from friends and relatives sharing the news.
I’d already heard: around midnight, Cubans of every age again poured into the streets of Miami to celebrate the death of a dictator who’d had a profound effect on our lives — who was, in many ways, the reason we were all here in the first place. I was in Westchester, a south Miami neighborhood that’s arguably the heart of Miami’s Cuban community (and as a Hialeah native, I’d be the first one to argue).
On Bird Road, where the lane closest to the sidewalk had been blocked off to allow for overflowing crowds, police lights bathed people in swirls of blue and red light. A father had his arm around his adolescent daughter, who was draped in a Cuban flag, the two of them watching the celebration around them. A woman about my age, there with her girlfriend, wore a T-shirt she seemed to be saving for this day: it read, Tu dia llego (meaning, “your day has come,” though the accents were missing from both día and llegó). A crew of fraternity brothers, none of them Cuban, said they’d “come down from Broward to see this.” “Hialeah must be on fire right now,” one of them said.
I am always somehow back in Miami when something monumental happens in our community. Celia Cruz’s death. Obama’s 2015 visit to Cuba. Even the Elian Gonzalez chaos in 1999 and 2000 coincided with my college breaks. I turned that saga into a novel in order to write through the media’s inaccurate and incomplete portrayal of frenzied Cubans throwing themselves at the feet of a young boy-turned-symbol.
The news out of Miami today will show you loud Cubans parading through the streets. It will show us hitting pots and pans and making much noise and yelling and crying and honking horns. It will give you familiar, rehashed images of old men sipping café out of tiny cups outside Versailles, the famous Cuban restaurant in Miami. That’s all part of it, yes.
But what is more important, yet difficult to show, are other prevalent scenes: People just outside the camera frame, leaning against a restaurant wall, silent and stunned and worried about those still on the island; the tearful conversations happening this morning between generations, families sitting around café con leche and remembering those whom Castro’s regime executed.
At a dinner with Miami-based Latino writers a couple nights after the Miami Book Fair last week, we joked that Castro would never die because he is protected by powerful santería — the joke being that the news would take such a statement from us as fact because of our heritage. We are already anticipating the inevitable question: Now that Castro is dead, will we visit Cuba? As if those visits would legitimize something about our identities as American-born Cubans, as if the choice to visit the island would be worth bragging about — as if our answer to that question is anyone’s business but our own.
Those conversations are more nuanced and don’t have the same dramatic effect as banging on pots and pans. They are complex and harder to fit into whatever you write within hours of learning that the dictator who has literally and symbolically represented oppression your entire life is finally gone: Tu día llegó – your day has come – and yes, the shirt fits, but each of us knows there is so much more behind those words that is impossible to distill.
Many of us out on the streets last night and this morning are here as witnesses, as bearers of memory, as symbols ourselves. Many of us are out because we have family that can’t be here — mothers, abuelos, cousins who died at the hands of the Castro regime. We are here to comfort each other and to honor the sacrifices these family members made. This morning in Miami, in the house in Westchester, we were calling each other around the city and the country and saying, “I am thinking of you.”
In one call, ten minutes into the play-by-play of where we all were when we heard the news, my partner’s grandfather, who was born in Cuba but now lives in Puerto Rico, asked us over speakerphone, “Now are you gonna get married?” I lifted a mug to my mouth and began chugging coffee with sudden intensity, and in the laughter around the moment, someone chimed in that we’d stick to the day’s plan of getting a Christmas tree. But his response is proof that there is hope and optimism and excitement at the base of many of these new conversations.
Today I awoke to stories we’d heard a thousand times, stories about the family left behind in Cuba, about survival and exile, about first weeks in the United States, stories honoring those who did not live to see this moment — all being told with more verve and energy than they’ve been told for a long time. I cannot speak for every Cuban and have never embraced the chance to do so. This was my immediate reaction to hearing about Fidel Castro’s death: That’s impossible, he will never die. Turns out even I’d fallen for the hype.
Broadway Recommendations for Mike Pence — Michael Schulman at The New Yorker has his picks.
Dear Vice-President-elect Pence,
Congratulations on scoring tickets for “Hamilton”! Not an easy task. Hopefully you enjoyed the title performance by Javier Muñoz, a gay, H.I.V.-positive Puerto Rican.
Here are some suggestions for other Broadway shows to check out—or avoid, for your own safety. As you know, the theatre is a “safe place,” except if you’re a virulent homophobe or texting in the presence of Patti LuPone.
So get on that TKTS line and remember: if you’re molested by a Times Square Elmo, you have Rudy Giuliani to thank.
“Aladdin”
A stage version of the Disney classic about an Arab street criminal who infiltrates the government under a false identity and employs black magic to bring down the wise Royal Vizier. Skip.
“The Book of Mormon”
An inspirational drama about two white Christians spreading God’s word to deepest, darkest Africa. The showstopper is about young men using religion to repress their homosexual thoughts. No wonder audiences are smiling!
“The Phantom of the Opera”
A psychopathic troll terrorizes the cosmopolitan élite. Donald Trump called it “great”!
“Wicked”
A well-intentioned and intelligent woman is smeared with false accusations until the public is convinced that she’s a malevolent witch. A+
“Falsettos”
A musical about gay Jewish New Yorkers who have lesbian neighbors and sing songs like “Four Jews in a Room Bitching.” At the end, one of them gets AIDS and dies. AVOID.
“Chicago”
An eye-opening portrait of crime and corruption in Barack Obama’s home city. The hero is the brilliant defense attorney Billy Flynn, who bamboozles the public with sensationalist lies and sings, “How can they hear the truth above the roar?” Bonus: jailed women.
“Holiday Inn”
A throwback to when America was truly great, 1942. Men were men, women were women, and barns were red. Includes the greatest song ever written by a Jew, “White Christmas.”
“Fiddler on the Roof”
A musical about members of a despised minority who are forced to leave their homes after being targeted by violent hate groups under a repressive czar. A heart-warmer!
“The Color Purple”
A wistful portrait of being a poor black woman in the Jim Crow South, a.k.a. the good old days.
“The Front Page”
An exposé of the corrupt mainstream media as it distorts the truth and undermines law and order. Needless to say, Nathan Lane is a hoot!
“Waitress”
This portrait of working-class women in America’s heartland starts off O.K., when the title character chooses to carry an unwanted pregnancy to term. But she winds up committing adultery and taking control of her own life choices. Recommendation: leave at intermission.
“Kinky Boots”
A black drag queen helps the white working class bring back manufacturing jobs by producing bedazzled red footwear. This musical must be stopped.
“Cats”
The Donald Trump of musicals: it’s tacky, it’s nonsensical, and it’s from the eighties. The cats live in the streets without a social safety net. And, since they’re competing for a chance at reincarnation, all the characters are potentially unborn. Go!
Doonesbury — It’s an honor.
100 Days — Margaret Doris on the road ahead for Hillary Clinton.
PHILADELPHIA—Ain’t nobody gonna rain on her parade.
Hillary Rodham Clinton planned to celebrate the launch of her fall campaign outdoors on Friday afternoon, with Independence Mall providing a historic backdrop to a massive rally. Instead, when the forecast called for thunderstorms, organizers scaled back and moved the rally indoors, to an old gymnasium best remembered as the home of the inaugural 1938 NIT champion Temple University Owls.
It didn’t make no nevermind to the candidate.
“I don’t know about you, but I stayed up really late last night. It was just hard to go to sleep,” an ebullient HRC told the crowd of several thousand supporters gathered just hours after she formally accepted the Democratic Party’s nomination for President. “When I woke up this morning, and Bill and I started drinking our coffee—or asking that it be administered with an IV—we suddenly looked at each other and we realized as of tomorrow, we have 100 days to make our case to America.”
The kick-off event, the prelude to a three-day bus trip reprising the Bill Clinton/Al Gore 1992 post-convention swing, served as a formal introduction to the themes and images that will define the campaign in the weeks to come.
The Democratic Party has now taken back the flag. Red, white, and blue bunting festooned the balconies and railings in McGonigle Hall, and the campaign handed out American flags to the celebratory crowd. Unfortunately, the convention did not inspire a new campaign slogan. The Clinton/Kaine ticket is apparently sticking with “Stronger Together.”
Donald Trump has travelled far on “Make America Great Again.” Bernie Sanders’ “A Future to Believe In” inspired over 13 million voters. Rolled out in late May, “Stronger Together” is by some counts the seventh slogan HRC has employed in the course of her campaign and sounds sadly like something the second string at Sterling Cooper Draper Pryce came up with to promote a new compound laundry detergent.
On Friday, massive Bernie Blue “Stronger Together” banners and signs flanked the left side of the podium (on the right, large stenciled lettering on the walls suggested campaign tactics: GYMNASTICS. FENCING.) The candidate herself is on week two of her wedding dance song, entering and exiting with Tim Kaine to the strains of “Ain’t No Mountain High Enough” (two points for going with Marvin Gaye and Tammi Terrell over Diana Ross).
“Donald Trump painted a picture, a negative, dark, divisive picture of a country in decline,” she said Friday. “He insisted that America is weak, and he told us all, after laying out this very dark picture, that ‘I alone can fix it.’
“Now, as I watched and heard that, it set off alarm bells, because just think about what happened here 240 years ago,” she continued. “Think about our founders, coming together. A Declaration of Independence, writing a Constitution. They set up our form of government, the longest-lasting democracy in the history of the world. And you know they did it because they knew they didn’t want one person, one man, to have all the power, like a king,” she said. “I don’t know any founder, no matter how strong they were, no matter how smart they were, that believed only one person could solve our problems.”
As if on cue, a protester starting yelling “Hillary is a war criminal!” As he was escorted out, HRC seamlessly ad-libbed, “And I’ll tell you something else—they also expected a kind of raucous debate in America. But at the end of the debate we have to come together and get things done.”
She can expect to encounter protesters almost every day from here on in. Her ability to keep her cool, to handle protesters with grace and wit, will say much about the condition of the campaign.
Jody Sturgill, 43, travelled to Philadelphia from east Kentucky to volunteer with the Philadelphia Host Committee. Back home, he juggles the challenges of promoting tourism in Kentucky’s impoverished coal region, advocating for LGBTQ causes, and supporting Hillary Clinton.
“I’ve been working for her since 2007,” Sturgill explained at the conclusion of the rally, watching from a balcony as Bill Clinton worked to leave no hand unshaken. “I’ve met her in person like four times. She’s a genuine person.”
He continued, “What you see on TV seems more fake or projected. [In person] she seems more like an aunt or a grandmother.” That’s why he hopes the campaign puts Kentucky in play. “Everybody…thinks they’re forgotten. She needs to come, let her voice be heard.”
Drew Wicas, a rising senior at Franklin & Marshall, and her sister-in-law, Erica Wong Wicas, a workers’ comp litigator, got in line at 9 a.m. to secure a spot at the rally. Drew Wicas, a Sanders supporter, found the whole event “magical.”
“Talk about someone that doggedly goes after something,” she said, impressed.
“She just had this big convention, and she’s ready to get going.”
“I’m going to donate a buck or two” to the Clinton campaign, she said, taking a page from the Bernie Sanders playbook. “Everybody’s got a buck or two. You’re a college student, donate a buck or two.”
The thunder held off, and the rains never came. The “bus” outside was really two “Stronger Together” buses, several charters, a couple of black SUVs, and a fleet of police escorts.
Finally, after a long and grueling primary season, the campaign was on the road again.
A year’s worth of rain fell in 70 minutes.
Clouds piled 12 miles into the mountain sky unleashed a deluge on July 31, 1976, setting off the most powerful flood since glaciers retreated 10,000 years ago.
The chaos along an otherwise trickling Big Thompson River killed 144 people, five of whom were never found, and carved out a chapter in the history books as Colorado’s deadliest natural disaster.
It was the eve of the state’s 100th birthday, part of a three-day shebang that drew weekend warriors and outdoor enthusiasts to the mountains of Larimer County. An estimated 3,500 people were camping, fishing and relaxing in the canyon that night.
A thunderstorm parked near Estes Park and turned the sky a daunting black late that afternoon.
Some residents recall fishing in Loveland and looking to the west, curious about the strange storm pattern that didn’t jibe with late-summer monsoon flows. Others remember the peculiarity of water filling wheel barrows in a matter of seconds or nature’s brilliant light show after the sun set.
Even the 2013 disaster in the same spot paled in comparison both in body count and sheer brutality, largely because people were caught flat-footed some 40 years ago. A foot of rain fell during a few hours in a stretch of land between the tourist hub of Estes Park and the quaint mountain communities of Drake and Glen Haven.
With nowhere to go, that deluge sped down the rocky hillsides.
It took everything in its path.
“I’m stuck. I’m right in the middle of it. I can’t get out…” said Colorado State Patrol Sgt. Willis Hugh Purdy in his last radio transmission before being swept away, killed by the water. He’s credited with saving hundreds of lives by issuing evacuations lower in the canyon.
Propane tanks burst. Water buoyed homes. Babies were snatched from their families.
The river even moved a 275-ton boulder the size of a small house.
All told, the pressure washer of water that tore through the Big Thompson Canyon caused more than $35 million in damage to 418 homes and businesses — nearly $150 million by 2016 standards. More than 400 vehicles, many loaded with tourists or residents trying to outrun the water, were swept off roads and sent crashing down the steep and craggy mountain canyon.
Bodies were pulled from debris piles and muck from high in the canyon to areas near Interstate 25. It wasn’t until the death toll surpassed 100 that many realized just how bad the storm had been.
There were at least 250 reported injuries, and more than 800 people were helicoptered out when day broke and the sun shone the following Sunday morning, Aug. 1. The stories of survival, near death and loss made national headlines. Flood waters were replaced by a flood of people — rescuers, family members and journalists, their own stories making headlines about covering the mayhem in a time before cellphones, the internet and camera ubiquity.
“For days, it was a race from one stop to the next, then to the nearest phone or back to Fort Collins to make the deadline for the afternoon paper,” wrote Jake Henshaw, the lead Coloradoan reporter who covered the disaster, in a column marking the 10th anniversary. “…[W]hat strikes me most is not how quickly the flood and the rescue were over but how long the clean-up took and how deeply the scars cut.”
Families gathered at the old Loveland Memorial Hospital, anxious for the latest identifications of the figures tucked into body bags, which were laid out in refrigerated trucks in the parking lot — there were too many for the morgue to handle. The bodies of five flood victims were never located.
Signs now dot U.S. Highway 34 — and canyons across Colorado — warning people to climb to safety in the event of flooding. That was a lesson from 1976. Flood plains were re-drawn. Some homes were rebuilt. Many weren’t.
Each year, residents, friends, family, and survivors gather at the Big Thompson Canyon Association and Memorial Site, about one mile below Drake, 13 miles west of the Kmart on U.S. Highway 34 in Loveland. Sometimes there’s singing. Other times just speeches. Scholarships to children have become part of the ceremony.
But there’s always a somber note that hangs in the air, one that remembers the deadliest natural disaster in Colorado history.
¿Qué está cocinando? — Maddie Oatman at Mother Jones tells us that you have never actually eaten Mexican food.
When white people think of Mexican food, visions of nachos coated in orange melted cheese and jalapeños, or burritos bursting with grilled chicken come to mind. Even in US cities where “authentic” Mexican taco trucks line the streets, fried meat and sour cream feature prominently. Sure, these dishes might make you salivate, but they’re just one layer of the country’s complex cuisine—and a pretty unhealthy layer at that.
Hiding behind these modern dishes is a legacy of foods from the indigenous people who inhabited Mexico before the Spanish arrived. For their new cookbook Decolonize Your Diet, authors Luz Calvo and Catrióna Rueda Esquibel dug up that history and displayed it in all its glory. Their task: To “decolonize” their diets and show readers how eating foods native to North America led them to healthier lives.
As the authors informed us on our latest episode of Bite, indigenous Mexicans feasted on corn, beans, potatoes, wild greens, cactus, squashes, other plant-based dishes, and meat prepared in a wide variety of sauces. This diet kept them relatively healthy: Historians have found that at the time of the Spanish Conquest, the Aztecs in Mexico lived, on average, 10 years longer than Spaniards.
But, as Esquibel told us, “the Spaniards really tried to change the way indigenous people grew food and prepared food. They wanted to replace their foods with European foods, particularly wheat.” Indigenous grains were thought to be inferior, and some of them, like amaranth and chia, were even outlawed because they were used in religious ceremonies and associated with paganism.
In other words, the very foods that have come to characterize contemporary Mexican-American fare—cheese, flour tortillas, beef, cane sugar—didn’t exist in America before the Europeans. And unfortunately those foods are linked to the obesity, diabetes, and cancer epidemics plaguing Mexican-American communities today.
As Calvo and Esquibel found, revisiting pre-Hispanic cuisine meant unearthing ancient ingredients and recipes that can help counter those diet-related maladies. But for the couple, it’s about more than physical health: “We’re trying to push people towards a radical rethinking of the way food is both grown and distributed and consumed,” Calvo said.
They left us with a recipe ripe for mid-summer produce: A rich vegetarian soup showcasing creamy corn and delicate blossoms from a squash plant. “You can really put whatever you happen to have growing in the garden into the soup as well,” Calvo noted.
Sopa de Milpa
*Milpa is a sustainable crop-growing system used throughout Mesoamerica.
Adapted from Decolonize Your Diet: Plant-Based Mexican-American Recipes for Health and Healing, by Luz Calvo and Catriona Rueda Esquibel
15 squash blossoms
2 fresh poblano chiles, roasted, peeled, and seeded
½ medium white onion, finely chopped
1 tablespoon olive oil
2 garlic cloves, peeled and finely diced
6 cups corn stock or vegetable broth (to make corn stock: bring 8 cups water with 6 corn cobs, 1 quartered onion, 4 peppercorns, 1 bay leaf, and any fresh herb sprigs to a boil, then simmer for 1-2 hours; strain out the solids and use the broth in the soup)
2 medium zucchinis, sliced into bite-sized quarter-rounds
2-3 ears of corn, to make 2 cups kernels
2 tablespoons chopped epazote or cilantro
½ teaspoon sea salt
⅛ teaspoon white pepper
2 avocados, peeled, seeded, and cubed
6 ounces queso fresco, cubed (optional)
Prepare squash blossoms: If there is a long pistil in center of blossom, remove and discard. Rinse flowers gently under cool water. Gently tear squash blossoms in half.
Roast the poblanos: Rub with oil and place under the broiler until they turn black and blister. Place in a bag or under a glass container and steam for 30 minutes. Carefully remove the charred skin from the chiles. Tear chiles into strips about ¼ inch wide and cut each strip 3-4 inches long.
In a large saucepan on medium heat, sauté onions in oil about 10 minutes, until golden brown. Add garlic and stir until fragrance is released, about 30 seconds. Add corn stock, chiles, zucchini, corn, and epazote/cilantro and bring to a light boil. Simmer 20 minutes. Add squash blossom pieces and cook 5-10 minutes, or until zucchini is crisp-tender. Add salt and pepper. Taste and adjust seasonings. Ladle soup into bowls and serve topped with avocado cubes and queso fresco.
Doonesbury — Teachable moment.
Summer is a great time to read a good book. There’s more daylight for sitting on the back porch with a cool libation forming little dark water rings on the coaster; at the beach you can relax in the shade of an umbrella while the waves lap at your ankles and the kids build sand castles or try out their surfer moves. Or you can sit on a bench in the park in the middle of a busy city and tune out the world on your lunch break. Books are great ways to take a summer vacation while staying right where you are.
Summer is not the time, however, to read something heavy. Save Dr. Zhivago or Arrowsmith for sitting next to the fireplace next winter. What you need is a good page-turner that defies you to put it down and makes you curse when the LOW BATTERY symbol flashes on your Kindle.
I’ve come across two such fun and intriguing reads by Stephen Anable. First up is The Fisher Boy, a detective story told by a most unlikely detective: Mark Winslow, a gay stand-up comic trying to make it during the high season in Provincetown on Cape Cod. It’s a deftly woven story written in fine detail that surrounds you with the feeling of being there. The story moves at a quick but not hurried pace; it’s as if he wants you to enjoy the view as you solve the mystery. There are enough twists and turns to keep you guessing to the very end. Oh, and you’ll learn a lot about the very interesting artists and denizens of the fabled P-town.
The next entry in the Mark Winslow series is A Pinchbeck Bride. This time we’re in Boston exploring the historic sites and learning about some rather interesting if not repellent Back Bay blue bloods with a family history that sometimes seems more like the Charles Addams family than John or Samuel Adams. Mark is volunteering as a docent and member of the board of the historic Mingo House when a grad student assistant is found delicately murdered in full Victorian regalia in the house. There are lots of suspects with lots of alibis and insight into the rarefied air of the musty attics of family secrets. Even if you don’t know Boston from downtown Longmont or a historical museum from the fun house at Coney Island, you will quickly feel at home and get to know these characters.
There’s a certain craft to writing a detective story that I’ve always envied, and Stephen Anable has it down to perfection. I hope you take the time to take a look and take them along to wherever you go to enjoy a good read this summer.
I could have posted this yesterday, but here it is. WWE star John Cena delivers a message on what patriotism is.
Master of the Universe, let us make up. It is time. How long can we go on being angry?
More than 50 years have passed since the nightmare was lifted. Many things, good and less good, have since happened to those who survived it. They learned to build on ruins. Family life was re-created. Children were born, friendships struck. They learned to have faith in their surroundings, even in their fellow men and women. Gratitude has replaced bitterness in their hearts. No one is as capable of thankfulness as they are. Thankful to anyone willing to hear their tales and become their ally in the battle against apathy and forgetfulness. For them every moment is grace.
Oh, they do not forgive the killers and their accomplices, nor should they. Nor should you, Master of the Universe. But they no longer look at every passer-by with suspicion. Nor do they see a dagger in every hand.
Does this mean that the wounds in their soul have healed? They will never heal. As long as a spark of the flames of Auschwitz and Treblinka glows in their memory, so long will my joy be incomplete.
What about my faith in you, Master of the Universe?
I now realize I never lost it, not even over there, during the darkest hours of my life. I don’t know why I kept on whispering my daily prayers, and those one reserves for the Sabbath, and for the holidays, but I did recite them, often with my father and, on Rosh ha-Shanah eve, with hundreds of inmates at Auschwitz. Was it because the prayers remained a link to the vanished world of my childhood?
But my faith was no longer pure. How could it be? It was filled with anguish rather than fervor, with perplexity more than piety. In the kingdom of eternal night, on the Days of Awe, which are the Days of Judgment, my traditional prayers were directed to you as well as against you, Master of the Universe. What hurt me more: your absence or your silence?
In my testimony I have written harsh words, burning words about your role in our tragedy. I would not repeat them today. But I felt them then. I felt them in every cell of my being. Why did you allow if not enable the killer day after day, night after night to torment, kill and annihilate tens of thousands of Jewish children? Why were they abandoned by your Creation? These thoughts were in no way destined to diminish the guilt of the guilty. Their established culpability is irrelevant to my “problem” with you, Master of the Universe. In my childhood I did not expect much from human beings. But I expected everything from you.
Where were you, God of kindness, in Auschwitz? What was going on in heaven, at the celestial tribunal, while your children were marked for humiliation, isolation and death only because they were Jewish?
These questions have been haunting me for more than five decades. You have vocal defenders, you know. Many theological answers were given me, such as: “God is God. He alone knows what He is doing. One has no right to question Him or His ways.” Or: “Auschwitz was a punishment for European Jewry’s sins of assimilation and/or Zionism.” And: “Isn’t Israel the solution? Without Auschwitz, there would have been no Israel.”
I reject all these answers. Auschwitz must and will forever remain a question mark only: it can be conceived neither with God nor without God. At one point, I began wondering whether I was not unfair with you. After all, Auschwitz was not something that came down ready-made from heaven. It was conceived by men, implemented by men, staffed by men. And their aim was to destroy not only us but you as well. Ought we not to think of your pain, too? Watching your children suffer at the hands of your other children, haven’t you also suffered?
As we Jews now enter the High Holidays again, preparing ourselves to pray for a year of peace and happiness for our people and all people, let us make up, Master of the Universe. In spite of everything that happened? Yes, in spite. Let us make up: for the child in me, it is unbearable to be divorced from you so long.
Liberals Need White Men — Eric Levitz in New York on why Democrats ignore the white working class at their peril.
A specter haunts the left’s last bastions of white working-class support — the specter of right-wing populism. As the New York Times’ Nate Cohn notes, outside of London, Labour’s working-class districts bucked their party’s leadership by voting for a Brexit campaign led by right-wing nationalists. Recent elections in Austria, Denmark, and Germany have produced a similar pattern; in all three countries, working-class areas that once voted with the Social Democrats or the center-left embraced far-right populists who promised to stem the tide of globalization.
Donald Trump has brought his own idiosyncratic brand of reactionary populism to our shores. And it’s playing well in the Democrats’ white working-class strongholds. According to Cohn, Trump’s most reliable voters in the GOP primary were “self-identified Republicans who nonetheless remain registered as Democrats.” On Tuesday, the presumptive GOP nominee made it clear that his general-election campaign will be aimed squarely at these voters. Contradicting decades of conservative free-market doctrine, Trump debuted a seven-point plan for reviving domestic manufacturing through trade protection.
Even if this message resonates with its target audience, current polling suggests Trump will have a tough time winning in November. But if issues of globalization continue to gain political salience, they could drive a wedge between the Democratic Party’s white working-class voters — who disproportionately favor restricting immigration — and the rest of the party’s base, which has been moving steadily toward an embrace of open borders. This is no small threat to Team Blue: White voters without college degrees made up a full 34 percent of the Obama coalition in 2012.
Liberals can’t give these voters what they want (in the aggregate) on immigration. To retain the party’s current share of the demographic, Democrats will need to make their economic pitch more salient than the right wing’s nationalist appeals. There are many ways to go about this task. But a good first step would be to stop insinuating that non–college educated workers are destined to live miserable lives because their skills are obsolete.
If that strikes you as something liberals never do, you should listen to last week’s edition of Slate’s Political Gabfest podcast. During a discussion on the links between Brexit-backers and the Trumpian proletariat, NPR’s economics reporter Adam Davidson offered the following explanation for right-wing populism’s current appeal:
I know Hillary Clinton’s economic team fairly well, and I’m very impressed by them. They really are top-notch economists and economic policy thinkers. They don’t have anything for a 55-year-old laid-off factory worker in Michigan or northeastern Pennsylvania. Or whatever. They don’t have anything to offer them. And so I think it’s intuitively understandable that a screaming, loud, wrong answer is more compelling than a calm, reasonable, accurate, right answer: Your life is going to be worse for the rest of your life — but don’t worry, these hipsters in Brooklyn are doing much better.
[…] The threshold for wages has gone up. There was a long period in the 20th century where, simply being willing to go to a building reliably every day for eight hours or 12 hours and do what you’re told was worth a lot. […] And you didn’t need to read, you didn’t need to write, you didn’t need to have any kind of education. Those jobs are all but fully gone. […] So in this country, we don’t have demand for the high-school-only graduates and the high-school dropouts we have, and that’s a big population. Something like 80 million people.
The “accurate, right answer” is that your life is going to get worse because you’ve fallen beneath the threshold for wages. This is how a well-sourced reporter summarizes the consensus of the Democratic nominee’s policy team. And we wonder why so many voters disdain elite expertise.
The Origins of Mordor — Joseph Loconte on what inspired J.R.R. Tolkien to create his mythological Hell.
In the summer of 1916, a young Oxford academic embarked for France as a second lieutenant in the British Expeditionary Force. The Great War, as World War I was known, was only half-done, but already its industrial carnage had no parallel in European history.
“Junior officers were being killed off, a dozen a minute,” recalled J. R. R. Tolkien. “Parting from my wife,” he wrote, doubting that he would survive the trenches, “was like a death.”
The 24-year-old Tolkien arrived in time to take part in the Battle of the Somme, a campaign intended to break the stalemate between the Allies and Central Powers. It did not.
The first day of the battle, July 1, produced a frenzy of bloodletting. Unaware that its artillery had failed to obliterate the German dugouts, the British Army rushed to slaughter.
Before nightfall, 19,240 British soldiers — Prime Minister David Lloyd George called them “the choicest and best of our young manhood” — lay dead. That day, 100 years ago, remains the most lethal in Britain’s military history.
Though the debt is largely overlooked, Tolkien’s supreme literary achievement, “The Lord of the Rings,” owes a great deal to his experience at the Somme. Reaching the front shortly after the offensive began, Tolkien served for four months as a battalion signals officer with the 11th Lancashire Fusiliers in the Picardy region of France.
Using telephones, flares, signal lights, pigeons and runners, he maintained communications between the army staff directing the battles from the rear and the officers in the field. According to the British historian Martin Gilbert, who interviewed Tolkien decades later about his combat experience, he came under intense enemy fire. He had heard “the fearful cries of men who had been hit,” Gilbert wrote. “Tolkien and his signalers were always vulnerable.”
Tolkien’s creative mind found an outlet. He began writing the first drafts of his mythology about Middle-earth, as he recalled, “by candle light in bell-tents, even some down in dugouts under shell fire.” In 1917, recuperating from trench fever, Tolkien composed a series of tales involving “gnomes,” dwarves and orcs engaged in a great struggle for his imaginary realm.
In the rent earth of the Somme Valley, he laid the foundation of his epic trilogy.
The descriptions of battle scenes in “The Lord of the Rings” seem lifted from the grim memories of the trenches: the relentless artillery bombardment, the whiff of mustard gas, the bodies of dead soldiers discovered in craters of mud. In the Siege of Gondor, hateful orcs are “digging, digging lines of deep trenches in a huge ring,” while others maneuver “great engines for the casting of missiles.”
On the path to Mordor, stronghold of Sauron, the Dark Lord, the air is “filled with a bitter reek that caught their breath and parched their mouths.” Tolkien later acknowledged that the Dead Marshes, with their pools of muck and floating corpses, “owe something to Northern France after the Battle of the Somme.”
In a lecture delivered in 1939, “On Fairy-Stories,” Tolkien explained that his youthful love of mythology had been “quickened to full life by war.” Yet he chose not to write a war memoir, and in this he departed from contemporaries like Robert Graves and Vera Brittain.
In the postwar years, the Somme exemplified the waste and futility of battle, symbolizing disillusionment not only with war, but with the very idea of heroism. As a professor of Anglo-Saxon back at Oxford, Tolkien preferred the moral landscape of Arthur and Beowulf. His aim was to produce a modern version of the medieval quest: an account of both the terrors and virtues of war, clothed in the language of myth.
In “The Lord of the Rings,” we meet Frodo Baggins and Samwise Gamgee, Hobbits of the Shire, on a fateful mission to destroy the last Ring of Power and save Middle-earth from enslavement and destruction. The heroism of Tolkien’s characters depends on their capacity to resist evil and their tenacity in the face of defeat. It was this quality that Tolkien witnessed among his comrades on the Western Front.
“I have always been impressed that we are here, surviving, because of the indomitable courage of quite small people against impossible odds,” he explained. The Hobbits were “a reflection of the English soldier,” made small of stature to emphasize “the amazing and unexpected heroism of ordinary men ‘at a pinch.’ ”
When the Somme offensive was finally called off in November 1916, a total of about 1.5 million soldiers were dead or wounded. Winston Churchill, who served on the front lines as a lieutenant colonel, criticized the campaign as “a welter of slaughter.” Two of Tolkien’s closest friends, Robert Gilson and Ralph Payton, perished in the battle, and another, Geoffrey Smith, was killed shortly afterward.
Beside the courage of ordinary men, the carnage of war seems also to have opened Tolkien’s eyes to a primal fact about the human condition: the will to power. This is the force animating Sauron, the sorcerer-warlord and great enemy of Middle-earth. “But the only measure that he knows is desire,” explains the wizard Gandalf, “desire for power.” Not even Frodo, the Ring-bearer and chief protagonist, escapes the temptation.
When Tolkien’s trilogy was published, shortly after World War II, many readers assumed that the story of the Ring was a warning about the nuclear age. Tolkien set them straight: “Of course my story is not an allegory of atomic power, but of power (exerted for domination).”
Even this was not the whole story. For Tolkien, there was a spiritual dimension: In the human soul’s struggle against evil, there was a force of grace and goodness stronger than the will to power. Even in a forsaken land, at the threshold of Mordor, Samwise Gamgee apprehends this: “For like a shaft, clear and cold, the thought pierced him that in the end the Shadow was only a small and passing thing: There was light and high beauty forever beyond its reach.”
Good triumphs, yet Tolkien’s epic does not lapse into escapism. His protagonists are nearly overwhelmed by fear and anguish, even their own lust for power. When Frodo returns to the Shire, his quest at an end, he resembles not so much the conquering hero as a shellshocked veteran. Here is a war story, wrapped in fantasy, that delivers painful truths about the human predicament.
Tolkien used the language of myth not to escape the world, but to reveal a mythic and heroic quality in the world as we find it. Perhaps this was the greatest tribute he could pay to the fallen of the Somme.
Doonesbury — Sameness.
A Long Line — Bill Moyers and Michael Winship on the history of American demagogues.
There’s a virus infecting our politics and right now it’s flourishing with a scarlet heat. It feeds on fear, paranoia and bigotry. All that was required for it to spread was a timely opportunity — and an opportunist with no scruples.
There have been stretches of history when this virus lay dormant. Sometimes it would flare up here and there, then fade away after a brief but fierce burst of fever. At other moments, it has spread with the speed of a firestorm, a pandemic consuming everything in its path, sucking away the oxygen of democracy and freedom.
Today its carrier is Donald Trump, but others came before him: narcissistic demagogues who lie and distort in pursuit of power and self-promotion. Bullies all, swaggering across the landscape with fistfuls of false promises, smears, innuendo and hatred for others, spite and spittle for anyone of a different race, faith, gender or nationality.
In America, the virus has taken many forms: “Pitchfork Ben” Tillman, the South Carolina governor and senator who led vigilante terror attacks with a gang called the Red Shirts and praised the efficiency of lynch mobs; radio’s charismatic Father Charles Coughlin, the anti-Semitic, pro-Fascist Catholic priest who reached an audience of up to 30 million with his attacks on Franklin Delano Roosevelt and the New Deal; Mississippi’s Theodore Bilbo, a member of the Ku Klux Klan who vilified ethnic minorities and deplored the “mongrelization” of the white race; Louisiana’s corrupt and dictatorial Huey Long, who promised to make “Every Man a King.” And of course, George Wallace, the governor of Alabama and four-time presidential candidate who vowed, “Segregation now, segregation tomorrow, segregation forever.”
Note that many of these men leavened their gospel of hate and their lust for power with populism — giving the people hospitals, schools and highways. Father Coughlin spoke up for organized labor. Both he and Huey Long campaigned for the redistribution of wealth. Tillman even sponsored the first national campaign-finance reform law, the Tillman Act, in 1907, banning corporate contributions to federal candidates.
But their populism was tinged with poison — a pernicious nativism that called for building walls to keep out people and ideas they didn’t like.
Which brings us back to Trump and the hotheaded, ego-swollen provocateur he most resembles: Joseph McCarthy, US senator from Wisconsin — until now perhaps our most destructive demagogue. In the 1950s, this madman terrorized and divided the nation with false or grossly exaggerated tales of treason and subversion — stirring the witches’ brew of anti-Communist hysteria with lies and manufactured accusations that ruined innocent people and their families. “I have here in my hand a list,” he would claim — a list of supposed Reds in the State Department or the military. No one knew whose names were there, nor would he say, but it was enough to shatter lives and careers.
In the end, McCarthy was brought down. A brave journalist called him out on the same television airwaves that helped the senator become a powerful, national sensation. It was Edward R. Murrow, and at the end of an episode exposing McCarthy on his CBS series See It Now, Murrow said:
“It is necessary to investigate before legislating, but the line between investigating and persecuting is a very fine one, and the junior senator from Wisconsin has stepped over it repeatedly. His primary achievement has been in confusing the public mind, as between the internal and the external threats of Communism. We must not confuse dissent with disloyalty. We must remember always that accusation is not proof and that conviction depends upon evidence and due process of law. We will not walk in fear, one of another. We will not be driven by fear into an age of unreason, if we dig deep in our history and our doctrine, and remember that we are not descended from fearful men — not from men who feared to write, to speak, to associate and to defend causes that were, for the moment, unpopular.”
There also was the brave and moral lawyer Joseph Welch, acting as chief counsel to the US Army after it was targeted for one of McCarthy’s inquisitions. When McCarthy smeared one of his young associates, Welch responded in full view of the TV and newsreel cameras during hearings in the Senate. “You’ve done enough,” Welch said. “Have you no sense of decency, sir, at long last? Have you left no sense of decency?… If there is a God in heaven, it will do neither you nor your cause any good. I will not discuss it further.”
It was a devastating moment. Finally, McCarthy’s fellow senators — including a handful of brave Republicans — turned on him, putting an end to the reign of terror. It was 1954. A motion to censure McCarthy passed 67-22, and the junior senator from Wisconsin was finished. He soon disappeared from the front pages, and three years later was dead.
Here’s something McCarthy said that could have come straight out of the Trump playbook: “McCarthyism is Americanism with its sleeves rolled.” Sounds just like The Donald, right? Interestingly, you can draw a direct line from McCarthy to Trump — two degrees of separation. In a Venn diagram of this pair, the place where the two circles overlap, the person they share in common is a fellow named Roy Cohn.
Cohn was chief counsel to McCarthy’s Senate Permanent Subcommittee on Investigations, the same one Welch went up against. Cohn was McCarthy’s henchman, a master of dark deeds and dirty tricks. When McCarthy fell, Cohn bounced back to his hometown of New York and became a prominent Manhattan wheeler-dealer, a fixer representing real estate moguls and mob bosses — anyone with the bankroll to afford him. He worked for Trump’s father, Fred, beating back federal prosecution of the property developer, and several years later would do the same for Donald. “If you need someone to get vicious toward an opponent,” Trump told a magazine reporter in 1979, “you get Roy.” To another writer he said, “Roy was brutal but he was a very loyal guy.”
Cohn introduced Trump to his McCarthy-like methods of strong-arm manipulation and to the political sleazemeister Roger Stone, another dirty trickster and unofficial adviser to Trump who just this week suggested that Hillary Clinton aide Huma Abedin was a disloyal American who may be a spy for Saudi Arabia, a “terrorist agent.”
Cohn also introduced Trump to the man who is now his campaign chair, Paul Manafort, the political consultant and lobbyist who without a moral qualm in the world has made a fortune representing dictators — even when their interests flew in the face of human rights or official US policy.
So the ghost of Joseph McCarthy lives on in Donald Trump as he accuses President Obama of treason, slanders women, mocks people with disabilities and impugns every politician or journalist who dares call him out for the liar and bamboozler he is. The ghosts of all the past American demagogues live on in him as well, although none of them have ever been so dangerous — none have come as close to the grand prize of the White House.
Because even a pathological liar occasionally speaks the truth, Trump has given voice to many who feel they’ve gotten a raw deal from establishment politics, who see both parties as corporate pawns, who believe they have been cheated by a system that produces enormous profits from the labor of working men and women that are gobbled up by the 1 percent at the top. But again, Trump’s brand of populism comes with venomous race-baiting that spews forth the red-hot lies of a forked and wicked tongue.
We can hope for journalists with the courage and integrity of an Edward R. Murrow to challenge this would-be tyrant, to put the truth to every lie and publicly shame the devil for his outrages. We can hope for the likes of Joseph Welch, who demanded to know whether McCarthy had any sense of decency. Think of Gonzalo Curiel, the jurist Trump accused of persecuting him because of the judge’s Mexican heritage. Curiel has revealed the soulless little man behind the curtain of Trump’s alleged empire, the avaricious money-grubber who conned hard-working Americans out of their hard-won cash to attend his so-called “university.”
And we can hope there still remain in the Republican Party at least a few brave politicians who will stand up to Trump, as some did McCarthy. This might be a little harder. For every Mitt Romney and Lindsey Graham who have announced their opposition to Trump, there is a weaselly Paul Ryan, a cynical Mitch McConnell and a passel of fellow travelers up and down the ballot who claim not to like Trump and who may not wholeheartedly endorse him but will vote for him in the name of party unity.
As this headline in The Huffington Post aptly put it, “Republicans Are Twisting Themselves Into Pretzels To Defend Donald Trump.” Ten GOP senators were interviewed about Trump and his attack on Judge Curiel’s Mexican heritage. Most hemmed and hawed about their presumptive nominee. As Trump “gets to reality on things he’ll change his point of view and be, you know, more responsible.” That was Sen. Orrin Hatch of Utah. Trump’s comments were “racially toxic” but “don’t give me any pause.” That was Tim Scott of South Carolina, the only Republican African-American in the Senate. And Sen. Pat Roberts of Kansas? He said Trump’s words were “unfortunate.” Asked if he was offended, Jennifer Bendery writes, the senator “put his fingers to his lips, gestured that he was buttoning them shut, and shuffled away.”
No profiles in courage there. But why should we expect otherwise? Their acquiescence, their years of kowtowing to extremism in the appeasement of their base, have allowed Trump and his nightmarish sideshow to steal into the tent and take over the circus. Alexander Pope once said that party spirit is at best the madness of the many for the gain of a few. A kind of infection, if you will — a virus that spreads through the body politic, contaminating all. Trump and his ilk would sweep the promise of America into the dustbin of history unless they are exposed now to the disinfectant of sunlight, the cleansing torch of truth. Nothing else can save us from the dark age of unreason that would arrive with the triumph of Donald Trump.
Buy Out — Alexia Fernandez Campbell in The Atlantic on those of us who refuse to retire.
The term “gray-haired professor” may seem like a cliché, but there’s some truth to it. Academia has long had a disproportionate number of employees older than 65, and the average American professor is getting even older. The share of people older than 65 teaching full time at American colleges and universities nearly doubled between 2000 and 2010. College professors are now among the oldest Americans in the workforce. Job satisfaction, job protection due to tenure, and concern about their retirement nest eggs are all reasons they cite for sticking around longer. And while their experience is valuable in its own way, the cost of paying senior professors in an era of rising expenses and shrinking endowments has led universities to borrow a budget-cutting strategy from the corporate world: buyouts.

A growing number of private and public universities are resorting to offering large sums of money to faculty and staff in exchange for early retirement (or, if they prefer, heading back to the job market). In the past year alone, Oberlin College here in Oberlin, Ohio; the University of Wisconsin-Eau Claire; and the University of North Dakota all offered some sort of voluntary separation-incentive deal to faculty members. John Barnshaw, a senior researcher at the American Association of University Professors, says the financial crisis of 2008 dealt a big blow to universities, which had invested much of their endowments in stocks and other financial products. “They started paying very close attention to their portfolios in a way they never have done,” says Barnshaw. “One of the ways they saw to save money was to offer retirement packages.”

Oberlin College, an exclusive liberal arts college about 45 minutes from Cleveland, is testing out this cost-cutting strategy. I recently spoke to the president, Marvin Krislov, about the unexpected, end-of-semester buyout, which was offered to about a third of the faculty.
Krislov says the college needs to offset expensive health-care costs and employee salaries. Additionally, he says that Oberlin’s commitment to offering grants and financial help to students from all socioeconomic backgrounds is a source of financial stress. Nearly half of Oberlin’s students receive some sort of financial aid. Tuition and fees, without aid, are about $50,000 a year.

This is the first time the college has offered early retirement packages, says Krislov. Since about 90 percent of the faculty are tenured, many end up working well past the traditional retirement age of 65. “[The buyouts] allow us to have more predictability in knowing who is going to be working and until when,” he says.
To take the buyout, employees must be at least 52 years old and must have worked at Oberlin for at least 10 years. The college will then pay their salaries for a year after they leave and waive health insurance premiums during that time.
One reason academia has seen so much aging has to do with federal law. In 1986, Congress barred employers from enforcing mandatory retirement ages, but colleges and universities were exempt for a while. They were able to impose a retirement age of 70 until the exemption expired in 1993. A recent survey of college professors now shows that 60 percent plan to work past the age of 70.
The buyout programs seem like a direct path to reducing the number of the most highly paid employees. But they also pose a risk: When those professors leave, their tenure-track positions may be replaced with non-tenure-track ones, meaning that over time, the number of tenured positions on campus could plummet. Though tenure has its detractors, it also serves a valuable purpose: Tenured faculty can’t be fired without just cause, which is meant to foster academic freedom and innovation. The rise of tenured positions in the United States was a response to McCarthyism, when university professors were fired for real or imagined ties to the Communist Party.

Over the years, the share of tenured teaching positions has been shrinking, while the percentage of part-time positions has increased. A report from the American Association of University Professors shows that, in the past 40 years, the percentage of professors in full-time, tenured positions dropped by 26 percent and tenure-track positions dropped by 50 percent. Meanwhile, academia has seen a 62 percent jump in full-time, non-tenure-track positions and a 70 percent jump in part-time teaching positions. Today, the majority of academic positions are part-time jobs.

“Our concern is that those tenure-track jobs are not being replaced. That they are just hiring a bunch of part-time professors,” says Barnshaw.
At Oberlin, Krislov says he will not replace full-time, tenured positions with part-time jobs. But he might move positions to departments with more in-demand fields, though he wouldn’t say which ones.
One tenured professor taking the buyout at Oberlin is Roger Copeland, who has been teaching dance and theater there for 41 years. The 66-year-old professor (whose former students include Girls creator Lena Dunham) said he was surprised to get the offer as the semester came to an end.
“I was completely dumbfounded,” said Copeland, a few hours before signing the separation agreement. “I don’t think anybody suspected that the [financial] situation could be so bad.”
Copeland hadn’t planned to retire for at least another four years, but said he couldn’t pass up the deal. He says he understands why the college is doing it, and thinks it will inject the faculty with fresh blood and new ideas. “For what they pay me, they can get two people out of grad school,” he says.
About 85 people so far have accepted the buyout (16 are professors, all of them tenured; the rest are administrative and professional staff), representing about 25 percent of all eligible employees, Krislov says. He expects this to save the college about $3 million per year, depending on how many positions are replaced. According to him, the goal isn’t to replace tenured professors with non-tenure-track faculty. “Our commitment to tenure and tenured professors is ironclad,” he says.
ST. PAUL, MINN. — Garrison Keillor was riding shotgun in a rented Chevy, motoring east through the steamy Midwestern heat.
His linen suit was appropriately rumpled — everything about this public radio legend suggests disregard for crisp lines — and his gangly legs were jacked up against the glove box, as he resisted suggestions to slide his seat back. Hitching a ride with a reporter from Minneapolis to his home here, he filled the yawning silences with a weird little singsong, “bomp, bomp, bomp, bomp.”
He had just spent hours rehearsing for the following night, May 21, when he hosted “A Prairie Home Companion,” at the State Theater in Minneapolis, before a packed, adoring crowd for the last time.
After more than four decades of hosting this homespun Americana musical variety program, which he created and which, in turn, created him, Mr. Keillor is retiring. He has done this before, in 1987, though that retirement ended up being a sabbatical. In 2011, there were rumors — baseless, Mr. Keillor’s people said — that he was thinking of abandoning ship then, too.
But this time, Mr. Keillor, 73, said he means it. He has named a successor and lined up meaty post-“Prairie” projects, among them columns for The Washington Post, a screenplay and a book. While he has a solo tour planned through the year, along with a “Prairie”-esque Labor Day weekend show at the Minnesota State Fair, he will host his final official “Prairie Home Companion” on July 1 at, of all places, the Hollywood Bowl.
“It’s very much real, and it’s simply a matter of wanting to rearrange one’s life,” Mr. Keillor said after we had arrived at his large, handsome Georgian house, and he had eased his stooping 6-foot-4 frame into a porch chair. “In order to do these things, I’ve got to clear out the big buffalo in the room, which is the show.”
At his home, Mr. Keillor looms, a melancholy presence, and doesn’t make much eye contact, keeping his bespectacled eyes averted under scraggly eyebrows. Rather than savor the conversation, he seems to cordially endure it. His mellifluous voice, likened to a down comforter or “a slow drip of Midwestern molasses,” feels warmly familiar to any public radio listener who has heard him sing “Tishomingo Blues,” which opens his show each Saturday evening.
Yet as familiar and cherished as “Prairie” has become to millions, it was always about Mr. Keillor’s fascinations, rather than the inner tickings of its host.
“It was never about self-expression, never,” Mr. Keillor said.
Everything about “Prairie Home” — the Guy Noir and Lives of the Cowboys sketches, the spots for Powdermilk Biscuits and the Ketchup Advisory Board, the monologues about the fictional Lake Wobegon — sprang from Mr. Keillor’s imagination. But the man spinning the plates at the center of it all managed to stay a mystery, even to people who know him well.
“Garrison in person is quite different,” said his longtime friend, the writer Mark Singer. “Garrison does not express emotion in interpersonal conversations the way the rest of us do.”
Performers often cultivate alternate personas, but with Mr. Keillor the difference is startling. That night, onstage in Minneapolis, he was garrulous and affable, and afterward ventured out onto the sidewalk to meet his hundreds-strong admirers, many of whom feel they know him intimately.
As fans flocked around him, Mr. Keillor graciously deflected questions, directing queries back to the scrum. This helps him gather story ideas but also serves as a bridge from his onstage personality to his default setting, the introverted, removed man who seems miles away, even when you’re sitting two feet from him on his porch, eating the jelly beans he has set out.
“His gaze is often floating and takes you in from a strange distance,” said the writer and editor Roger Angell, who in 1970 edited Mr. Keillor’s first piece for The New Yorker. “He is certainly the strangest person I know.”
There is debate about whether Mr. Keillor should have exited a while ago. His weekly radio audience peaked 10 years ago, at 4.1 million, and has since dropped to 3.2 million. While that does not include listeners on Sirius XM, or the show’s three million monthly digital requests, many stations have dropped their Sunday repeat broadcast of his show.
“Prairie Home” captured a time, before tweets and Facebook posts, when people talked more over fence posts and pots of coffee, but nowadays it feels increasingly removed from many listeners’ lives.
“A lot of the conversation has been: ‘Did Garrison wait too long? Should Garrison have done this years ago?’” said Eric Nuzum, former vice president for programming at NPR. “The problem of ‘Prairie Home Companion’ is it’s part of public radio’s past, not their future,” Mr. Nuzum said. (American Public Media distributes “Prairie Home”; NPR member stations air programs from APM as well as from other distributors.)
Still, Mr. Keillor played an outsize role in shaping what public radio has become.
He was a pioneering force and taught public radio valuable lessons, Mr. Nuzum said. The live performances and touring built audiences and kept them connected and deeply loyal. That proved lucrative, as did sales of “Prairie Home Companion” recordings, books, clothes and tchotchkes. Mr. Keillor also became one of public radio’s earliest celebrities, appearing on the cover of Time in 1985.
“‘Prairie Home Companion’ came on the scene just as public radio was trying to figure out what its identity was,” said Ira Glass, the host of “This American Life.” “The fact that here was such a visibly weird, funny, idiosyncratic show opened up the space of other weird, idiosyncratic shows, like ‘Car Talk,’ and our show.”
Adored as he has been by millions, Mr. Keillor drove a few critics around the bend.
Detractors view “Prairie Home” as excruciatingly hokey, syrupy and dull. In a 1993 episode of “The Simpsons,” Homer bangs on the television — the Disney Channel broadcast the show in the late ’80s — hollering, “Be more funny!” In a withering review of Robert Altman’s 2006 film, “A Prairie Home Companion,” Rex Reed called Mr. Keillor “a myopic doughboy” and his program “a lumbering, affected and pointless audio curiosity.”
Yet Mr. Glass believes that many people mistake “Prairie Home” for quaint, homespun nostalgia, even though the tales from Lake Wobegon are, as often as not, richly emotional, contemporary and quite dark.
In recent monologues, Mr. Keillor has lambasted the gun lobby, told of people’s relatives being buried alive and mentioned a would-be suicidal woman left bald after she accidentally set her hair on fire in her gas oven, a presumably fictitious anecdote that is trademark Keillor: equal parts alarming, heartbreaking and funny.
“Like Howard Stern, Garrison Keillor created a packaging that nonlisteners took as real,” Mr. Glass said. “And the actual show is so much more complex, and human and complicated than nonlisteners think it is.”
Mr. Keillor has had health concerns, suffering a stroke in 2009, and, less than a week after the Minneapolis show, a seizure. But he insists it’s his other projects that compelled him to step away. After July, he will continue to have a small radio foothold, hosting “The Writer’s Almanac,” a stand-alone five-minute radio program he started in the early ’90s. And “Prairie Home” reruns will continue to air. Jon McTaggart, chief executive of American Public Media Group, the parent of American Public Media, said that as much as “Prairie Home” contributed financially, he has faith in the allure of the new version of the show and that “this transition has been planned for a while.”
Still, the future of “Prairie Home Companion,” and public radio, without Mr. Keillor remains something of an open question.
Mr. Keillor’s handpicked successor, the folk musician Chris Thile, 35, who first performed on the show as a teenager, cheerfully admitted in an interview that it could all go down the drain if audiences reject him after he begins hosting on Oct. 15. Details are still being hammered out, but Mr. Thile plans to do musical numbers and comedy bits. There will be no Lake Wobegon.
I sometimes think the Bill of Rights is a test of character for the country. It’s as if the Founding Fathers said, “Okay, America, we’re going to give you all of these rights; let’s see how you handle them.”
There have been times when we have risen to them and proven ourselves worthy: when equal rights for all truly does mean real equality, not separate but equal or equal only for white Christians. And there have been times when we have failed them: internment for citizens who immigrated from a place we’re currently hating, or the idea that, because some see one amendment as thoroughly inviolable, we have to accede to their fetishism as the way things must be.
It is serendipitous that as we recover from the shock and horror of the massacre in Orlando, we saw a celebration of a musical that honored Alexander Hamilton, one of the Founding Fathers. It’s a civics lesson through hip-hop, and while some may find it incongruous to see 18th-century characters rapping about starting a country, it reminds us that we are forever being challenged on how we answer to those who set us on our way.
Transcendent — Charlie Pierce on Muhammad Ali.
I play it cool/I dig all jive/That is how I stay alive
There is no real place to begin with him and no ending fit enough for the life he led. Muhammad Ali died on Friday, true enough. They will take him to his final rest on Wednesday in Louisville, which was only his first hometown in a world that he made his true hometown. So he was not immortal, the way we all thought he might be, but he lived a life beyond the bounds of mortality anyway, a life that has no real beginning and that still has a vital spirit for which no ending is adequate.
He was an iconic human being in an era that produced icons with every turn of the television dial, every front page of every morning newspaper and, my god, most of them died young. John and Robert Kennedy. Martin Luther King and Malcolm X. None of them ever made 50. None of them ever made old bones. Only Ali lived to see how he truly changed the world around him, how it had come to understand that some lives are lived beyond the mortal limits.
He was a transcendent athlete, first and foremost, every bit as skilled at what he did for a living as Michael Jordan or Pele. The greatest change in athletes over the span of his physical life is that big athletes got fast. LeBron James plays basketball and he is just about the same size as Antonio Gates, who is a tight end. When he first arrived at Wimbledon, Boris Becker looked like a college linebacker. Ali was tall for a heavyweight, bigger than anyone who was faster than he was and faster than anyone who was bigger.
You have to have seen him before he was stripped of his livelihood to appreciate fully his gifts as an athlete. Foot speed. Hand speed. Before it all hit the fan in 1968, Sports Illustrated put him in a lab with strobe lights and everything, to time the speed of his punches. The results looked like something out of a special-effects lab. In one of his routines, the late Richard Pryor used to talk about sparring with Ali in a charity exhibition. A Golden Gloves fighter in his youth, Pryor later put it this way: “you don’t see his punches until they comin’ back. And your mind be sayin’, ‘Wait a minute now. There was some shit in my face a minute ago. I know that.'” He was an accelerated man in an accelerated age. Saying he was “ahead of his time” was only the half of it. His time was all time.
That was what led to the rest of it—the opposition to the criminal stupidity that was being practiced by this country in Southeast Asia, stated in terms as fundamentally American as the First Amendment to the Constitution. “Congress shall make no law…” His stubborn insistence that his life was his own, that it did not belong to the sclerotic old gangsters who still ran boxing, nor to the sclerotic old men who still ran the government, with their wiretaps and their phony indictments and their lawbooks. He was too fast for them all to catch, ultimately, and too pretty for a country that was vandalizing its most beautiful elements. That stubbornness also likely led to his physical downfall. All gifts have their dark side. All debts come due.
He was a prophet, in every way that America makes its prophets, in the same way as William Lloyd Garrison, who told his country “I am in earnest—I will not equivocate—I will not excuse—I will not retreat a single inch—AND I WILL BE HEARD,” and in the same way as Dr. King, who told that same country that:
“In a sense we’ve come to our nation’s capital to cash a check. When the architects of our republic wrote the magnificent words of the Constitution and the Declaration of Independence, they were signing a promissory note to which every American was to fall heir. This note was a promise that all men, yes, black men as well as white men, would be guaranteed the ‘unalienable Rights’ of ‘Life, Liberty and the pursuit of Happiness.’ It is obvious today that America has defaulted on this promissory note.”
He embodied the country, in all its historic, inherent contradictions, in all its promises, broken and unbroken, and in all of its lost promises and hard-won glories. He insisted on the rights that the country said were his from birth and, in demanding them, freed himself to enjoy them, and freed the country, if only for a moment, to be something more than even the Founders thought it would be. And now, he’s passed from the earth. It was a great, golden trumpet of a life he led, and it is calling, calling still, and will still be calling, as the old hymn puts it, when time shall be no more.
The Real Scandal — Julianne Hing in The Nation on Donald Trump’s scam “university.”
This is where we are at this point of the collective national nightmare of the Republican Party’s 2016 campaign: On Thursday, Donald Trump told The Wall Street Journal that because of US District Judge Gonzalo Curiel’s “Mexican heritage,” the federal judge has an “absolute conflict” in presiding over a lawsuit brought by former students of Trump’s self-named real-estate courses. Curiel’s ethnic background is of importance because, Trump said, “I’m building a wall. It’s an inherent conflict of interest.” Trump clearly misunderstands the concept; a defendant’s own prejudices have no bearing on whether a judge is unfit for the job.
When Trump first mentioned the Indiana-born judge’s ethnicity at a San Diego rally last Friday, it was to do his usual jabbing and dancing to avoid ethical punches. At that event, Trump raised Curiel and mentioned his ethnic background in the same breath: “The judge, who happens to be, we believe, Mexican, which is great, I think that’s fine,” adding that he was sure that Mexican Americans would come around to support him “when I give all these jobs, OK?” Then he circled back around. “I’m getting railroaded by the legal system,” he said, “Frankly, they should be ashamed.” Trump labeled Curiel “a hater of Donald Trump,” and also called him “a total disgrace.”
It was a classic Trump move: create bogeymen out of thin air in order to prop up his self-imagined victimhood; home in on a person’s race or sex as the basis for his attacks; and then antagonize as a form of diversion from the matter at hand. That matter would be Trump University, the mogul’s real-estate courses that purportedly taught customers how to become like Trump, for as much as $35,000, or starting at the low, low price of $1,495. The lawsuit alleges that far from teaching students actual real-estate expertise, Trump ran a fraudulent business scheme.
The marketing schemes for Trump’s real-estate seminars at times sound ripped straight from the recruitment playbooks of the scandal-plagued for-profit school industry, which has preyed on single moms, people of color, veterans, and those who’ve been locked out of more prestigious avenues for higher education.
Take, for instance, the 2010 Senate testimony of Joshua Pruyn, a former admissions representative for Westwood College, a for-profit chain of, at the time, 17 campuses. Pruyn was technically an admissions advisor, but in reality his position was that of a glorified sales rep. “During the interview, we were taught to portray ourselves as advisors looking out for the students’ best interests and ensuring they were a good fit for the school. This fake interview would allow the representative to ask students questions to uncover a student’s motivators and pain points—their hopes, fears, and insecurities—all of which would later be used to pressure a student to enroll,” Pruyn testified.
The for-profit schools industry targeted people of color, poor people, and veterans because they were more likely to be eligible for public financial aid like Pell Grants. This much-parodied Everest College commercial should be very familiar to anyone who watches daytime television.
Students of color ended up forming the backbone of the industry’s explosive growth in the early and mid-2000s. In the 2010–11 school year, just as the Obama administration’s regulatory hammer started to fall on the industry, the for-profit system University of Phoenix was the nation’s top producer of new black undergraduate degrees. The nation’s second-highest producer of new black baccalaureates that year was Ashford University, also a for-profit college.
When the industry’s comeuppance came, it was devastating. In lawsuit after lawsuit, universities were accused of fleecing students of their federal student-aid money, saddling them with debt they couldn’t repay, and leaving them with an education and credits that other institutions wouldn’t recognize or accept as transferable. In December 2015, after multiple settlements in various lawsuits, Westwood College—where former admissions recruiter Pruyn worked—agreed not to enroll any more students.
After suffering a barrage of these kinds of lawsuits and increased regulation from the Obama administration, the for-profit schools industry is now in the tank. Enrollment is down, and many schools have been exposed as the shady businesses they were, Trump University among them.
Hours after Trump brought Judge Curiel into his campaign theater last week, Curiel unsealed documents related to his case, at the request of The Washington Post. Those documents detail the aggressive marketing and recruitment playbook that Trump University sales staffers worked from. The playbook urged sales members to not let prospective customers be deterred by their own lack of money (“If they believe in you and your product, they will find the money”), and to guide consumers through “the roller coaster of emotions,” so as to encourage students to cough up cash. The guides urged sales members to home in on people’s vulnerabilities for maximum effect “during closing time.” (“For example: are they a single parent of three children that may need money for food?”)
These tactics, Trump would rather not discuss. Always easier, after all, to pivot to the most base of appeals—racial and ethnic antagonism—and the cheapest of tactics—bullying others and calling it self-defense.
Spectacles Spectacle — Peter Schjeldahl in The New Yorker on the latest art craze.
A recent little sensation at the San Francisco Museum of Modern Art delights and bemuses. Two teen-age boys from San Jose were perusing, with perplexity, the museum’s exhibits of contemporary art when they had a notion. One of them, Kevin Nguyen, sixteen, set his opened eyeglasses on the floor, neatly perpendicular to a nearby wall. He and T. J. Khayatan, seventeen, then stood back to watch what ensued: viewers observing the glasses with the curiosity and respect befitting a work of art—which, under the circumstances, they were.
Not that the glasses were good art, necessarily—an issue made moot, in any case, when Nguyen picked them up and put them back on.
But consider: an object manufactured to enhance seeing, presented as something to see. By being underfoot, the glasses were divorced from their function and protected only by the don’t-touch protocol of museums. They might have seemed, to a suggestible audience, to be about being-in-a-museum—and that audience could have included me. Suggestibility, undaunted by fear of proving foolish, is essential to art love.
Invoked, of course, was the evergreen aesthetic of the readymade, demonstrated by Marcel Duchamp with a urinal, in 1917. But that trope is hardly surefire. During their visit, Nguyen and Khayatan ventured two other placements, of a jacket and a baseball cap, which, at least visibly, intrigued no one. Some conceptual poetry or satirical bite is needed to bring a readymade off. The glasses managed both the former, at first, and then the latter, when their backstory emerged.
Many sane citizens will deem the spectacle of the spectacles ridiculous. They won’t be wrong. A risk of absurdity always attends the willingness to surrender oneself to the spell of any mere object: the dirtied swatch of cloth that is a painting, for example. It’s a game, whatever else it is, which makes sense only with knowledge of the rules and customs that are in play.
Museums edit, for our convenience, the universe of existing things. What they let in and what they keep out shape culture. How far in the way of inclusion is too far? How much in the way of discrimination is just crabby?
Have we witnessed the entire art career, now, of the San Jose Two?
You go, boys.
Doonesbury — Heir Apparent.
Liar, Liar — Jonathan Chait on the serial mendacity of Donald J. Trump.
Donald Trump is a wildly promiscuous liar. He also has disturbing authoritarian tendencies. Trump’s many critics have seized upon both traits as his two major disqualifications for the presidency, yet both of them frustratingly defy easy quantification. All politicians lie some, and many of them lie a lot, and most presidents also push the limits of their authority in ways that can frighten their opponents. So what is so uniquely dangerous about Trump? Perhaps the answer is that both of these qualities are, in a sense, the same thing. His contempt for objective truth is the rejection of democratic accountability, an implicit demand that his supporters place undying faith in him. Because the only measure of truth he accepts is what he claims at any given moment, the power his supporters vest in him is unlimited.
Trump lies routinely, about everything. Various journalists have tried to tally up his lies, inevitably giving up and settling for incomplete summaries. Some of these lies are merely standard, or perhaps somewhat exaggerated, versions of the way members of his party talk about policy. (The “real” unemployment rate is as high as 42 percent, or his gargantuan tax-cut plan “will be revenue-neutral.”) At times he engages in especially brazen rewriting of his own positions, such as insisting he opposed the Iraq War when he did not, or denying his past support for universal health insurance. Some of his lies are conspiracy theories that run toward the edges of respectable Republican thought (Barack Obama was actually born abroad) or even well beyond it (Ted Cruz’s father may have conspired to kill John F. Kennedy). In all these areas, Trump has merely improved upon the methods used by the professionals in his field.
Where he has broken truly unique ground is in his lies about relatively small, routine matters. As I’ve pointed out before — it’s become a small personal fixation — after Mitt Romney mocked the failure of Trump Steaks, Trump held a press conference in which he insisted Trump Steaks remained a going concern, despite the undeniable fact that the business no longer exists. (His campaign displayed store-bought steaks for the media, not even bothering to fully remove the labels of the store at which they purchased them.) The New York Times actually reported this week that Trump had displayed his steaks, without mentioning the blatant deception. Another such example is Trump’s prior habit of impersonating an imaginary p.r. representative while speaking to reporters. Obviously, the practice itself is strange enough, but the truly Trumpian touch is that he admitted to the ruse publicly, and then subsequently went back to denying it.
The normal rules of political lying hold that when the lie has been exposed, or certainly when it has been confessed, the jig is up. You have to stop lying about it and tell the truth, or at least retreat to a different lie. Trump bends the rules of the universe to his own will, at no apparent cost. His brazenness is another utterly unique characteristic. His confidence that he can make the truth whatever he wishes at any moment, and toggle back and forth between incompatible realities at will, without any cost to himself, is a display of dominance. Possibly Trump’s most important statement of the campaign was his idle boast that he could shoot somebody on Fifth Avenue without losing any votes.
Finally, there is Trump’s habit of settling all disputes with his own peculiar form of ad hominem. He dismisses all criticisms of his statements and his record with an array of put-downs, and likewise confirms all endorsements with praise. Anybody who disagrees with Trump is ugly, short, corrupt, a loser, a habitual liar, a total joke, and so forth. People who support him are smart, beautiful, fair, esteemed, etc. But politics being as it is — and, especially, Trump’s positions being as fluid as they are — the composition of the two categories is in constant flux. One day, you are a failing, ridiculous, deranged liar, and the next day a citizen of the highest regard. Trump literally called Ben Carson a “violent criminal” and a “pathological liar,” akin to a “child molester.” When later accepting Carson’s endorsement, Trump praised his “dignity.” Once Trump mocked Rick Perry as a moron who wore glasses to look smart and who should be required to take an IQ test to participate in presidential debates. Now he is a “good guy, good governor.” This is the pattern Trump uses to dismiss all media criticism, or to amplify friendly coverage. Every reporter or publication is either pathetic and failing or fair and wonderful, and the same reporters and publications can be reclassified as one or the other as Trump sees fit.
1984 is a cliché for invoking totalitarianism, and in any case, Trump is merely an authoritarian and a bully, not a totalitarian. (A totalitarian government, like North Korea, exerts control over every aspect of its citizens’ lives; an authoritarian one, like Putin’s Russia, merely uses enough fear and violence to maintain control.) Nonetheless, the novel does capture the relationship between dictatorial authority and the power to manipulate any fact into a binary but permeable scheme:
The past was alterable. The past never had been altered. Oceania was at war with Eastasia. Oceania had always been at war with Eastasia. Jones, Aaronson, and Rutherford were guilty of the crimes they were charged with. He had never seen the photograph that disproved their guilt. It had never existed, he had invented it. He remembered remembering contrary things, but those were false memories, products of self-deception.
Truth and reason are weapons of the powerless against the powerful. There is no external doctrine Trump can be measured against, not even conservative dogma, which he embraces or discards at will and with no recognition of having done so. Trump’s version of truth is multiple truths, the only consistent element of which is that Trump himself is always, by definition, correct. Trump’s mind is so difficult to grapple with because it operates on an authoritarian epistemology that lies outside the democratic norms that have shaped all of our collective experiences.
Magnetic Personalities — M.R. O’Connor in The New Yorker on nature’s GPS system.
Every three years, the Royal Institute of Navigation organizes a conference focussed solely on animals. This April, the event was held southwest of London, at Royal Holloway College, whose ornate Victorian-era campus has appeared in “Downton Abbey.” For several days, the world’s foremost animal-navigation researchers presented their data and findings in a small amphitheatre. Most of the talks dealt with magnetoreception—the ability to sense Earth’s weak but ever-present magnetic field—in organisms as varied as mice, salmon, pigeons, frogs, and cockroaches. This marked a change from previous years, Richard Nissen, a member of the Institute, told me, when a range of other navigation aids were part of the discussion: landmarks, olfactory cues, memory, genetics, polarized light, celestial objects. “Everyone now seems completely sold on the idea that animal navigation is based on magnetism,” Nissen said. Human-centric as it sounds, most of the conference’s attendees believe that animals possess a kind of compass.
Scientists have sought for centuries to explain how animals, particularly migratory species, find their way with awesome precision across the globe. Examples of these powers abound. Bar-tailed godwits depart from the coastal mudflats of northern Alaska in autumn and set out across the Pacific Ocean, flying for eight days and nights over featureless water before arriving in New Zealand, seven thousand miles away. If the birds misjudge their direction by even a few degrees, they can miss their target. Arctic terns travel about forty thousand miles each year, from the Arctic to the Antarctic and back again. And odysseys of this sort are not limited to the feathered tribes. Some leatherback turtles leave the coast of Indonesia and swim to California, more than eight thousand miles away, then return to the very beaches where they hatched. Dragonflies and monarch butterflies follow routes so long that they die along the way; their great-grandchildren complete the journey.
Although the notion of a biocompass was widely disparaged in the first half of the twentieth century, the evidence in favor of it has since become quite strong. In the early nineteen-sixties, a German graduate student named Wolfgang Wiltschko began conducting experiments with European robins, which he thought might find their way by picking up radio waves that emanated from the stars. Instead, Wiltschko discovered that if he put the robins in cages equipped with a Helmholtz coil—a device for creating a uniform magnetic field—the birds would change their orientation when he switched the direction of north. By the start of this century, seventeen other species of migratory bird, as well as honeybees, sharks, skates, rays, snails, and cave salamanders, had been shown to possess a magnetic sense. In fact, practically every animal studied by scientists today demonstrates some capacity to read the geomagnetic field. Red foxes almost always pounce on mice from the northeast. Carp floating in tubs at fish markets in Prague spontaneously align themselves along a north-south axis. So do dogs when they crouch to relieve themselves, and horses, cattle, and deer when they graze—except if they are under high-voltage power lines, which have a disruptive influence.
The only problem is that no one can seem to locate the compass. “We are still crying out for how do they do this,” Joseph Kirschvink, a geobiologist at the California Institute of Technology, said. “It’s a needle in the haystack.” Kirschvink meant this almost literally. In 1981, as a Ph.D. student at Princeton University, he proposed that magnetite, a naturally occurring oxide of iron that he had found in honeybees and homing pigeons, was the basis of the biocompass. Even a handful of magnetite crystals, he wrote at the time, could do the trick. “One equivalent of a magnetic bacteria can give a whale a compass—one cell,” he told me. “Good luck finding it.” Even in animals smaller than a whale, this is no easy task. Throughout the two-thousands, researchers pointed to the presence of iron particles in the olfactory cells of rainbow trout, the brains of mole rats, and the upper beaks of homing pigeons. But when scientists at the Research Institute of Molecular Pathology, in Vienna, took a closer look, slicing and examining the beaks of hundreds of pigeons, they found that the iron-rich cells were likely the product of an immune response—nothing to do with the biocompass. The study’s lead researcher, David Keays, has since turned his focus to iron-containing neurons inside the pigeons’ ears.
The search for the biocompass has extended to even smaller scales, too. In 1978, the German biophysicist Klaus Schulten proposed that birds’ innate sense of direction was chemical in nature. According to his theory, incoming light would hit some sort of sensory mechanism, which Schulten hadn’t yet pinpointed, and induce a transfer of electrons, triggering the creation of a radical pair—two molecules, each with an extra electron. The electrons, though slightly separated, would spin in synchrony. As the bird moved through the magnetic field, the orientation of the spinning electrons would be modulated, providing the animal with feedback about its direction. For the next twenty years, it remained unclear which molecules could be responsible for such a reaction. Then, in 2000, Schulten suggested an answer—cryptochromes, a newly discovered class of proteins that respond to blue light. Cryptochromes have since been found in the retinas of monarch butterflies, fruit flies, frogs, birds, and even humans. They are the only candidate so far with the right properties to satisfy Schulten’s theory. But the weakest magnetic field that affects cryptochromes in the laboratory is still twenty times stronger than Earth’s magnetic field. Peter Hore, a chemist at Oxford University, told me that establishing cryptochromes as the biocompass will require at least another five years of research.
At the conference, the magnetite and cryptochrome researchers made up distinct camps, each one quick to point out the opposing theory’s deficiencies. One person stood alone: Xie Can, a biophysicist at Peking University. Xie spent six years developing a kind of unified model of magnetic animal navigation. Last year, he published a paper in Nature Materials describing a protein complex that he dubbed MagR, which consists of iron crystals enveloped in a double helix of cryptochrome—the two main theories rolled into one. Xie has yet to win over other researchers, some of whom believe that his findings are the result of iron oxide contaminating his lab experiments. (Keays has said that he will eat his hat if MagR is proved to be the real magnetoreceptor.) But at the end of the conference, with one mystery of animal navigation after another left unanswered, Xie told me that he felt more confident than ever of his and his colleagues’ model. “If we are right, we can explain everything,” he said. Michael Walker, a biologist at the University of Auckland, was more circumspect. If history is any indication, he said, many of the current hypotheses about how the biocompass works will turn out to be wrong.
“We Will Not Be Ignored” — Amanda Hess in the New York Times on the rising visibility of Asian-American actors.
When Constance Wu landed the part of Jessica Huang, the Chinese-American matriarch on the ABC sitcom “Fresh Off the Boat,” she didn’t realize just how significant the role would turn out to be. As she developed her part, Ms. Wu heard the same dismal fact repeated over and over again: It had been 20 years since a show featuring a predominantly Asian-American cast had aired on television. ABC’s previous offering, the 1994 Margaret Cho vehicle “All-American Girl,” was canceled after one season.
“I wasn’t really conscious of it until I booked the role,” Ms. Wu said. “I was focused on the task at hand, which was paying my rent.”
The show, which was just renewed for a third season, has granted Ms. Wu a steady job and a new perspective. “It changed me,” Ms. Wu said. After doing a lot of research, she shifted her focus “from self-interest to Asian-American interests.”
In the past year, Ms. Wu and a number of other Asian-American actors have emerged as fierce advocates for their own visibility — and frank critics of their industry. The issue has crystallized in a word — “whitewashing” — that calls out Hollywood for taking Asian roles and stories and filling them with white actors.
On Facebook, Ms. Wu ticked off a list of recent films guilty of the practice and said, “I could go on, and that’s a crying shame, y’all.” On Twitter, she bit back against Hollywood producers who believe their “lead must be white” and advised the creators of lily-white content to “CARE MORE.” Another tip: “An easy way to avoid tokenism? Have more than one” character of color, she tweeted in March. “Not so hard.”
It’s never been easy for an Asian-American actor to get work in Hollywood, let alone take a stand against the people who run the place. But the recent expansion of Asian-American roles on television has paradoxically ushered in a new generation of actors with just enough star power and job security to speak more freely about Hollywood’s larger failures.
And their heightened profile, along with an imaginative, on-the-ground social media army, has managed to push the issue of Asian-American representation — long relegated to the back burner — into the current heated debate about Hollywood’s monotone vision of the world.
“The harsh reality of being an actor is that it’s hard to make a living, and that puts actors of color in a very difficult position,” said Daniel Dae Kim, who stars in “Hawaii Five-0” on CBS and is currently appearing in “The King and I” on Broadway.
Mr. Kim has wielded his Twitter account to point to dire statistics and boost Asian-American creators. Last year, he posted a cheeky tribute to “the only Asian face” he could find in the entire “Lord of the Rings” series, a woman who “appears for a glorious three seconds.”
Other actors lending their voices include Kumail Nanjiani of “Silicon Valley,” Ming-Na Wen of “Agents of S.H.I.E.L.D.” and Aziz Ansari, who in his show, “Master of None,” plays an Indian-American actor trying to make his mark.
They join longtime actors and activists like BD Wong of “Gotham”; Margaret Cho, who has taken her tart comedic commentary to Twitter; and George Takei, who has leveraged his “Star Trek” fame into a social media juggernaut.
“There’s an age-old stereotypical notion that Asian-American people don’t speak up,” Mr. Wong said. But “we’re really getting into people’s faces about it.”
This past year has proved to be a particularly fraught period for Asian-American representation in movies. Last May, Sony released “Aloha,” a film set in Hawaii that was packed with white actors, including the green-eyed, blond-haired Emma Stone as a quarter-Chinese, quarter-Native Hawaiian fighter pilot named Allison Ng.
In September, it was revealed that in the planned adaptation of the Japanese manga series Death Note, the hero, a boy with dark powers named Light Yagami, would be renamed simply Light and played by the white actor Nat Wolff. In “The Martian,” released in October, the white actress Mackenzie Davis stepped into the role of the NASA employee Mindy Park, who was conceived in the novel as Korean-American.
The list goes on. In December, set photographs from the coming “Absolutely Fabulous” film showed the Scottish actress Janette Tough dressed as an over-the-top Asian character. Last month, Marvel Studios released a trailer for “Doctor Strange,” in which a character that had originated in comic books as a Tibetan monk was reimagined as a Celtic mystic played by Tilda Swinton.
And in the live-action American film adaptation of the manga series Ghost in the Shell, scheduled for next year, the lead character, Major Motoko Kusanagi, will be called Major and played by Scarlett Johansson in a black bob.
Studios say that their films are diverse. “Like other Marvel films, several characters in ‘Doctor Strange’ are significant departures from the source material, not limited by race, gender or ethnicity,” the Marvel Studios president Kevin Feige said in a statement. Ms. Swinton will play a character that was originally male, and Chiwetel Ejiofor a character that was originally white. Paramount and DreamWorks, the studios behind “Ghost in the Shell,” said that the film reflects “a diverse array of cultures and countries.”
But many Asian-American actors aren’t convinced. “It’s all so plainly outlandish,” Mr. Takei said. “It’s getting to the point where it’s almost laughable.”
Doonesbury — Play safe.
Miami Vice — Martin Longman on Marco Rubio’s connections with drug dealers.
When you see a headline like this [How Rubio helped his ex-con brother-in-law acquire a real estate license] in the Washington Post, you figure that you’re about to read a very long and sordid exposé. That’s not really what Post reporters Manuel Roig-Franzia and Scott Higham delivered in this case, though. Their piece has enough substantiation to justify the headline, but it doesn’t delve too deeply into the greater meaning and it leaves the most important question unanswered.
Let’s start with the fact that “ex-con” doesn’t really do justice to Marco Rubio’s brother-in-law. Orlando Cicilia was a major drug trafficker at a time and in a place that have gone down in history, in movies like Scarface and television programs like Miami Vice, as notoriously violent and destructive.
According to public records, Cicilia was arrested after federal law enforcement seized the Miami home where he lived with Barbara Rubio, Senator Rubio’s sister. Barbara Rubio was not arrested or indicted. Cicilia was sentenced to 25 years in prison for conspiracy to distribute cocaine and marijuana.
The arrest was part of “Operation Cobra,” a federal crackdown on a Florida drug smuggling ring that killed a federal informer and chopped up his body, according to a NYT story published at the time. The story reports that the ring, led by Cuban American Mario Tabraue, paid $150,000 in bribes to the Key West police chief and Miami-Dade county officials, and used Miami police officers to collect, count, and disburse drug profits.
About that part where they killed a federal informer and chopped up his body, the New York Times reported on December 17th, 1987:
The authorities said that in July 1980, members of [Cicilia’s drug ring] apparently became aware that Larry Nash was an informer for the Bureau of Alcohol, Tobacco and Firearms.
“Mr. Nash was murdered and mutilated,” Mr. Dean said. “His body was cut up with a chain saw and then burned.”
This drug ring reportedly did $75 million of business trading in marijuana and cocaine, of which Cicilia was personally responsible for $15 million. That’s a lot of cocaine and a lot of ruined lives, and, given the way the ring operated, a lot of violence and intimidation, and a shameful amount of public corruption.
To call this man merely an “ex-con” doesn’t capture the scope of his crimes.
When Cicilia was arrested, Marco Rubio was sixteen years old, and he can’t be held accountable for what his sister’s boyfriend and eventual husband did for a living. That his sister and the family stayed loyal to this man throughout his incarceration and welcomed him back into their lives and homes when he was released is admirable in its own way. When you look at the totality of the circumstances with this case, the Rubio family deserves a degree of credit for loyalty and a willingness to forgive. Orlando Cicilia served his time and he ought to be afforded the opportunity to demonstrate that he’s been rehabilitated.
Still, this was a choice. It was a choice to essentially overlook the immense damage done by Cicilia and his gang to countless individuals and to the integrity of the local government and law enforcement institutions.
We have to balance the good and the bad here, and that’s the context with which we should judge the following:
When Marco Rubio was majority whip of the Florida House of Representatives, he used his official position to urge state regulators to grant a real estate license to his brother-in-law, a convicted cocaine trafficker who had been released from prison 20 months earlier, according to records obtained by The Washington Post.
In July 2002, Rubio sent a letter on his official statehouse stationery to the Florida Division of Real Estate, recommending Orlando Cicilia “for licensure without reservation.” The letter, obtained by The Washington Post under the Florida Public Records Act, offers a glimpse of Rubio using his growing political power to assist his troubled brother-in-law and provides new insight into how the young lawmaker intertwined his personal and political lives.
Rubio did not disclose in the letter that Cicilia was married to his sister, Barbara, or that the former cocaine dealer was living at the time in the same West Miami home as Rubio’s parents. He wrote that he had known Cicilia “for over 25 years,” without elaborating.
The Rubio campaign responds that it would have been worse if he had revealed his conflict of interest because revealing that Cicilia was his brother-in-law and was living with his parents would have put undue pressure on the members of the Florida Division of Real Estate. This is because, as majority whip of the Florida House of Representatives, he had “significant influence” over the Division’s budget.
That’s a defense, certainly, but a poor one. Rubio had two truly defensible options. He could have refused to write the letter because of the obvious conflict or he could have fully disclosed it and let the chips fall where they may. He chose to hide the conflict, and that was the wrong decision.
Tell Me a Story — John Yorke in The Atlantic looks at the roots of all tales.
A ship lands on an alien shore and a young man, desperate to prove himself, is tasked with befriending the inhabitants and extracting their secrets. Enchanted by their way of life, he falls in love with a local girl and starts to distrust his masters. Discovering their man has gone native, they in turn resolve to destroy both him and the native population once and for all.

Avatar or Pocahontas? As stories they’re almost identical. Some have even accused James Cameron of stealing the Native American myth. But it’s both simpler and more complex than that, for the underlying structure is common not only to these two tales, but to all of them.
Take three different stories:
A dangerous monster threatens a community. One man takes it on himself to kill the beast and restore happiness to the kingdom …
It’s the story of Jaws, released in 1975. But it’s also the story of Beowulf, the Anglo-Saxon epic poem composed some time between the eighth and 11th centuries.
And it’s more familiar than that: It’s The Thing, it’s Jurassic Park, it’s Godzilla, it’s The Blob—all films with real tangible monsters. If you recast the monsters in human form, it’s also every James Bond film, every episode of MI5, House, or CSI. You can see the same shape in The Exorcist, The Shining, Fatal Attraction, Scream, Psycho, and Saw. The monster may change from a literal one in Nightmare on Elm Street to a corporation in Erin Brockovich, but the underlying architecture—in which a foe is vanquished and order restored to a community—stays the same. The monster can be fire in The Towering Inferno, an upturned boat in The Poseidon Adventure, or a boy’s mother in Ordinary People. Though superficially dissimilar, the skeletons of each are identical.
Our hero stumbles into a brave new world. At first he is transfixed by its splendor and glamour, but slowly things become more sinister . . .
It’s Alice in Wonderland, but it’s also The Wizard of Oz, Life on Mars, and Gulliver’s Travels. And if you replace fantastical worlds with worlds that appear fantastical merely to the protagonists, then quickly you see how Brideshead Revisited, Rebecca, The Line of Beauty, and The Third Man all fit the pattern too.
When a community finds itself in peril and learns the solution lies in finding and retrieving an elixir far, far away, a member of the tribe takes it on themselves to undergo the perilous journey into the unknown …
It’s Raiders of the Lost Ark, Morte D’Arthur, Lord of the Rings, and Watership Down. And if you transplant it from fantasy into something a little more earthbound, it’s Master and Commander, Saving Private Ryan, Guns of Navarone, and Apocalypse Now. If you then change the object of the characters’ quest, you find Rififi, The Usual Suspects, Ocean’s Eleven, Easy Rider, and Thelma & Louise.
So three different tales turn out to have multiple derivatives. Does that mean that when you boil it down there are only three different types of story? No. Beowulf, Alien, and Jaws are “monster” stories—but they’re also about individuals plunged into a new and terrifying world. In classic “quest” stories like Apocalypse Now or Finding Nemo the protagonists encounter both monsters and strange new worlds. Even “brave new world” stories such as Gulliver’s Travels, Witness, and Legally Blonde fit all three definitions: The characters all have some kind of quest, and all have their own monsters to vanquish too. Though they are superficially different, they all share the same framework and the same story engine: All plunge their characters into a strange new world; all involve a quest to find a way out of it; and in whatever form they choose to take, in every story “monsters” are vanquished. All, at some level, too, have as their goal safety, security, completion, and the importance of home….
My favorite film of 1977 was not “Star Wars” but “Close Encounters of the Third Kind,” Steven Spielberg’s U.F.O. fantasia. Notwithstanding the fact that I was nine years old, I considered “Star Wars” a little childish. Also, the trash-compactor scene scared me. “Close Encounters,” on the other hand, drew me back to the theatre—the late, great K-B Cinema, in Washington, D.C.—five or six times. I irritated friends by insisting that it was better than “Star Wars,” and followed the box-office grosses in the forlorn hope that my favorite would surpass its rival.
“Close Encounters” still strikes me as an amazing creation—a one-off fusion of blockbuster spectacle with the disheveled realism of nineteen-seventies filmmaking. It has a wildness, a madness that is missing from Spielberg’s subsequent movies. The Disneyesque fireworks of the finale can’t hide the fact that the hero of the tale is abandoning his family in the grip of a monomaniacal obsession. Looking back, though, I’m sure that what really held me spellbound was the score, which, like that of “Star Wars,” was written by John Williams. I was a full-on classical-music nerd, playing the piano and trying to write my own compositions. I’d dabbled in Wagner, Bruckner, and Mahler, but knew nothing of twentieth-century music. “Close Encounters” offered, at the start, a seething mass of dissonant clusters, which abruptly coalesce into a bright, clipped C-major chord, somehow just as spooky as what came before. The “Star Wars” music had a familiar ring, but this kind of free, frenzied painting with sound was new to me, and has fascinated me ever since.
Now eighty-three years old, Williams remains a vital presence. “Star Wars: The Force Awakens,” his latest effort, is doing fairly good business, and he is at work on Spielberg’s next picture. He has scored all of the “Star Wars” movies, all of the Indiana Jones movies, several Harry Potters, “Jaws,” “E.T.,” “Superman,” “Jurassic Park,” and almost a hundred others. BoxOfficeMojo.com calculates that since 1975 Williams’s films have grossed around twenty billion dollars worldwide—and that leaves out the first seventeen years of his career. He has received forty-nine Oscar nominations, with a fiftieth almost certain for 2016. Perhaps his most crucial contribution is the role he has played in preserving the art of orchestral film music, which, in the early seventies, was losing ground to pop-song soundtracks. “Star Wars,” exuberantly blasted out by the London Symphony, made the orchestra seem essential again.
Williams’s wider influence on musical culture can’t be quantified, but it’s surely vast. The brilliant young composer Andrew Norman took up writing music after watching “Star Wars” on video, as William Robin notes in a Times profile. The conductor David Robertson, a disciple of Pierre Boulez and an unabashed Williams fan, told me that some current London Symphony players first became interested in their instruments after encountering “Star Wars.” Robertson, who regularly stages all-Williams concerts with the St. Louis Symphony, observed that professional musicians enjoy playing the scores because they are full of the kinds of intricacies and motivic connections that enliven the classic repertory. “He’s a man singularly fluent in the language of music,” Robertson said. “He’s very unassuming, very humble, but when he talks about music he can be the most interesting professor you’ve ever heard. He’s a deep listener, and that explains his ability to respond to film so acutely.”
It has long been fashionable to dismiss Williams as a mere pasticheur, who assembles scores from classical spare parts. Some have gone as far as to call him a plagiarist. A widely viewed YouTube video pairs the “Star Wars” main title with Erich Wolfgang Korngold’s music for “Kings Row,” a 1942 picture starring Ronald Reagan. Indeed, both share a fundamental pattern: a triplet figure, a rising fifth, a stepwise three-note descent. Also Korngoldesque are the glinting dissonances that affirm rather than undermine the diatonic harmony, as if putting floodlights on the chords.
To accuse Williams of plagiarism, however, brings to mind the famous retort made by Brahms when it was pointed out that the big tune in the finale of his First Symphony resembled Beethoven’s Ode to Joy: “Any ass can hear that.” Williams takes material from Korngold and uses it to forge something new. After the initial rising statement, the melodies go in quite different directions: Korngold’s winds downward to the tonic note, while Williams’s insists on the triplet rhythm and leaps up a minor seventh. I used to think that the latter gesture was taken from a passage in Bruckner’s Fourth Symphony, but the theme can’t have been stolen from two places simultaneously.
Although it’s fun to play tune detective, what makes these ideas indelible is the way they’re fleshed out, in harmony, rhythm, and orchestration. (To save time, Williams uses orchestrators, but his manuscripts arrive with almost all of the instrumentation spelled out.) We can all hum the trumpet line of the “Star Wars” main title, but the piece is more complicated than it seems. There’s a rhythmic quirk in the basic pattern of a triplet followed by two held notes: the first triplet falls on the fourth beat of the bar, while later ones fall on the first beat, with the second held note foreshortened. There are harmonic quirks, too. The opening fanfare is based on chains of fourths, adorning the initial B-flat-major triad with E-flats and A-flats. Those notes recur in the orchestral swirl around the trumpet theme. In the reprise, a bass line moves in contrary motion, further tweaking the chords above. All this interior activity creates dynamism. The march lunges forward with an irregular gait, rugged and ragged, like the Rebellion we see onscreen.
This is not to deny that Williams has a history of drawing heavily on established models. The Tatooine desert in “Star Wars” is a dead ringer for the steppes of Stravinsky’s “The Rite of Spring.” The “Mars” movement of Holst’s “Planets” frequently lurks behind menacing situations. Jeremy Orosz, in a recent academic paper, describes these gestures as “paraphrases”: rather than quoting outright, Williams “uses pre-existing material as a creative template to compose new music at a remarkable pace.” There’s another reason that “Star Wars” contains so many near-citations. At first, George Lucas had planned to fill the soundtrack with classical recordings, as Stanley Kubrick had done in “2001.” The temp track included Holst and Korngold. Williams, whom Lucas hired at Spielberg’s suggestion, acknowledged the director’s favorites while demonstrating the power of a freshly composed score. He seems to be saying: I can mimic anything you want, but you need a living voice.
In that delicate balancing act, Williams may have succeeded all too well. After “Star Wars,” he became a sound, a brand. The diversity and occasional daring of the composer’s earlier work—I’m thinking not only of “Close Encounters” but also of Robert Altman’s “Images” and “The Long Goodbye” and of Brian De Palma’s “The Fury”—subsided over time. Williams invariably achieves a level of craftsmanship that no other living Hollywood composer can match; his fundamental skill is equally evident in his sizable catalogue of concert-hall scores. Yet he’s been boxed in by the billions that his music has helped to earn. He has become integral to a populist economy on which thousands of careers depend.
Doonesbury — No harm no foul.
It’s time for my annual recap of the past year and prognostication for the year ahead. Let’s see how I did a year ago.
– Now that we have a Republican House and Senate and a president who isn’t running for re-election, get out the popcorn, and I mean the good stuff. The GOP will try to do everything they can to destroy the legacy of Barack Obama, but they will end up looking even more foolish, petulant, infantile, and borderline nuts than they have for the last two years, and that’s saying something. Repeals of Obamacare, Dodd-Frank, and recharged attempts to investigate Benghazi!, the IRS, and the VA will be like the three rings of Barnum & Bailey, all of which President Obama will gleefully veto. As Zandar noted at Balloon Juice, “Over/under on when a Republican declares on FOX that Obama’s veto is ‘illegal’: Feb 8.”
They did all that except actually pass the bills for President Obama to veto. Instead they putsched John Boehner and replaced him with Paul Ryan who will more than likely face the same nutsery in 2016.
– Hillary Clinton will announce that she is running for president by March 2015 at the latest. Elizabeth Warren will not run, but Bernie Sanders, the Gene McCarthy of this generation, will announce as an independent and become a frequent guest on MSNBC. Jeb Bush, after “actively exploring” a run in 2016, will announce that he is running and quickly fade to the single digits when the GOP base gets a taste of his views on immigration and Common Core. He may be popular in Republican polls, but those people don’t vote in primaries. The frontrunners for the Iowa caucuses a year from now will be Rand Paul and Chris Christie.
Nailed that one except for the last sentence. But to be fair I don’t think anyone had Donald Trump on their betting sheets a year ago, and if they did, it was more for the entertainment value than serious consideration as a Republican candidate.
– The war in Afghanistan is officially over as of December 2014, but there will be U.S. troops actively engaged in combat in what is left of Syria and Iraq in 2015.
More’s the pity.
– The U.S. economy will continue to improve at a galloping pace. The Dow will hit 19,000 at some point in 2015 and oil will continue to flood the market, keeping the price below $60 a barrel and gasoline will sell for under $2 a gallon, and finally wages will start to catch up with the improving economy. I blame Obama.
Except for my overly optimistic prediction on the Dow, this pretty much came true, even down to the price of gasoline: I paid $1.99 last night in Miami, which is not the lowest-priced city in the country. President Obama is not getting any credit whatsoever for helping the economy improve, which he should, but then the Republicans never blamed Bush for crashing it in the first place.
– The Supreme Court will rule that bans on same-sex marriage violate the Constitution. They will also narrowly uphold Obamacare again.
Happy dance, happy dance.
– The embargo against Cuba will end on a narrow vote in the Senate thanks to the overwhelming influence of Republican donors who see 11 million Cubans starving for Dunkin’ Donuts and car parts and don’t care what a bunch of domino-playing dreamers on Calle Ocho think.
The embargo is still in place as a matter of law, but for all intents and purposes, it is crumbling. U.S. airlines and cruise ships are setting schedules, direct mail service is resuming, and travel there has become routine.
– The Tigers will win their division again.
Oh, shut up.
– We will lose the requisite number of celebrities and friends as life goes on. As I always say, it’s important to cherish them while they are with us.
I hold them in the Light.
– I technically retired on September 1, 2014, but my last day at work will be August 30, 2019. (It’s complicated.) I’m planning a return trip to Stratford this summer — more on that later — and I’ll get more plays produced. I will finish at least one novel in 2015.
This was a productive year for me on the writing front: several plays of mine were done either in full stage productions or readings, and more are on the way. No, I did not finish a novel yet.
Now for the predictions for 2016:
Okay, it’s your turn. What do you see for 2016?
Earlier this week Playboy magazine announced that it would no longer print pictures of nude women. This is based on the theory that if you want to see them, you have a lot of choices on the internet. And you won’t have to smuggle them into the garage attic to look at them with a flashlight.
For boys of a certain age, Playboy was a rite of passage. Fifty years ago it was how thirteen-year-old boys got their first glimpse of undressed women. I remember a friend of mine showing me a rather rumpled copy of the magazine with all the sophisticated ads for liquor and rich-guy toys, and then there was the centerfold. Zowie.
I tried to show enthusiasm, but when I finally saw it my reaction was “enh.” It did nothing for me, and I couldn’t help but wonder what all the fuss was about. I didn’t really process it then, but I think that’s about the time that I was beginning to be aware of the fact that, at least in terms of responding to sexual stimuli, I am gay.
So, thanks, Playmate of the Month for November 1965. It would be another eleven years before I actually came out, but you helped get the journey going.
“A Dream Undone” — From the New York Times magazine, Jim Rutenberg reports on the efforts to bring back Jim Crow.
On the morning of his wedding, in 1956, Henry Frye realized that he had a few hours to spare before the afternoon ceremony. He was staying at his parents’ house in Ellerbe, N.C.; the ceremony would take place 75 miles away, in Greensboro, the hometown of his fiancée; and the drive wouldn’t take long. Frye, who had always been practical, had a practical thought: Now might be a good time to finally register to vote. He was 24 and had just returned from Korea, where he served as an Air Force officer, but he was also a black man in the American South, so he wasn’t entirely surprised when his efforts at the registrar’s office were blocked.
Adopting a tactic common in the Jim Crow South, the registrar subjected Frye to what election officials called a literacy test. In 1900, North Carolina voters amended the state’s Constitution to require that all new voters “be able to read and write any section of the Constitution in the English language,” but for decades some registrars had been applying that already broad mandate even more aggressively, targeting perfectly literate black registrants with arbitrary and obscure queries, like which president served when or who had the ultimate power to adjourn Congress. “I said, ‘Well, I don’t know why are you asking me all of these questions,’ ” Frye, now 83, recalled. “We went around and around, and he said, ‘Are you going to answer these questions?’ and I said, ‘No, I’m not going to try.’ And he said, ‘Well, then, you’re not going to register today.’ ”
Sitting with me on the enclosed porch of his red-brick ranch house in Greensboro, drinking his wife’s sweet tea, Frye could joke about the exchange now, but at the time it left him upset and determined. When he met Shirley at the altar, the first thing he said was: “You know they wouldn’t let me register?”
“Can we talk about this later?” she replied.
After a few weeks, Frye drove over to the Board of Elections in Rockingham, the county seat, to complain. An official told him to go back and try again. This time a different registrar, after asking if he was the fellow who had gone over to the election board, handed him a paragraph to copy from the Constitution. He copied it, and with that, he became a voter.
But in the American South in 1956, not every would-be black voter was an Air Force officer with the wherewithal to call on the local election board; for decades, most had found it effectively impossible to attain the most elemental rights of citizenship. Only about one-quarter of eligible black voters in the South were registered that year, according to the limited records available. By 1959, when Frye went on to become one of the first black graduates of the University of North Carolina law school, that number had changed little. When Frye became a legal adviser to the students running the antisegregation sit-ins at the Greensboro Woolworth’s in 1960, the number remained roughly the same. And when Frye became a deputy United States attorney in the Kennedy administration, it had grown only slightly. By law, the franchise extended to black voters; in practice, it often did not.
What changed this state of affairs was the passage, 50 years ago this month, of the Voting Rights Act. Signed on Aug. 6, 1965, it was meant to correct “a clear and simple wrong,” as Lyndon Johnson said. “Millions of Americans are denied the right to vote because of their color. This law will ensure them the right to vote.” It eliminated literacy tests and other Jim Crow tactics, and — in a key provision called Section 5 — required North Carolina and six other states with histories of black disenfranchisement to submit any future change in statewide voting law, no matter how small, for approval by federal authorities in Washington. No longer would the states be able to invent clever new ways to suppress the vote. Johnson called the legislation “one of the most monumental laws in the entire history of American freedom,” and not without justification. By 1968, just three years after the Voting Rights Act became law, black registration had increased substantially across the South, to 62 percent. Frye himself became a beneficiary of the act that same year when, after a close election, he became the first black state representative to serve in the North Carolina General Assembly since Reconstruction.
In the decades that followed, Frye and hundreds of other new black legislators built on the promise of the Voting Rights Act, not just easing access to the ballot but finding ways to actively encourage voting, with new state laws allowing people to register at the Department of Motor Vehicles and public-assistance offices; to register and vote on the same day; to have ballots count even when filed in the wrong precinct; to vote by mail; and, perhaps most significant, to vote weeks before Election Day. All of those advances were protected by the Voting Rights Act, and they helped black registration increase steadily. In 2008, for the first time, black turnout was nearly equal to white turnout, and Barack Obama was elected the nation’s first black president.
Since then, however, the legal trend has abruptly reversed. In 2010, Republicans flipped control of 11 state legislatures and, raising the specter of voter fraud, began undoing much of the work of Frye and subsequent generations of state legislators. They rolled back early voting, eliminated same-day registration, disqualified ballots filed outside home precincts and created new demands for photo ID at polling places. In 2013, the Supreme Court, in the case of Shelby County v. Holder, directly countermanded the Section 5 authority of the Justice Department to dispute any of these changes in the states Section 5 covered. Chief Justice John Roberts Jr., writing for the majority, declared that the Voting Rights Act had done its job, and it was time to move on. Republican state legislators proceeded with a new round of even more restrictive voting laws.
All of these seemingly sudden changes were a result of a little-known part of the American civil rights story. It involves a largely Republican countermovement of ideologues and partisan operatives who, from the moment the Voting Rights Act became law, methodically set out to undercut or dismantle its most important requirements. The story of that decades-long battle over the iconic law’s tenets and effects has rarely been told, but in July many of its veteran warriors met in a North Carolina courthouse to argue the legality of a new state voting law that the Brennan Center for Justice at the New York University Law School has called one of the “most restrictive since the Jim Crow era.” The decision, which is expected later this year, could determine whether the civil rights movement’s signature achievement is still justified 50 years after its signing, or if the movement itself is finished.
Upping the Outrage — James Hamblin in The Atlantic on how the internet fuels the response to something and then moves on.
Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.

But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just like usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.

Palmer didn’t just kill a lion. He killed an especially good-looking and “beloved” lion in an ostentatious and gruesome fashion that culminated in decapitation. To make things worse, that lion had a human name. To make things worse still, that name was Cecil.
The Internet has served to facilitate outrage, as the Internet does: the hotter the better. And because the case is so visceral and bipartisan in its opposition to Palmer’s act, few people stepped in to suggest that the fury, the people tweeting his home address, might be too much. That argument wins no outrage points.

Instead, the people who hadn’t jumped on the Cecil-outrage bandwagon jumped on the superiority-outrage bandwagon. It’s a bandwagon of outrage one-upmanship, and it’s just as rewarding as the original outrage bandwagon. Anyone can play, like this:
It’s fine to be outraged about one lion, but what about all of the other lions who are hunted and killed every year? There are 250 Cecils killed annually across Africa as trophies, and that’s what you should really be outraged by. But good job caring now.
Actually, what about all of the animals? All of the cattle and fish and brilliant pigs who are systematically slaughtered for human consumption every day? Were you eating a hot dog when you posted that thing about Cecil on Facebook? Anyone who is not vegan is no better than the dentist Walter Palmer. That is what you really should be outraged by.
Actually, you only care about Zimbabwe when a lion is killed? Great of you. Killing animals is part of the circle of life, but you know what’s not? Human trafficking. People are bought and sold as slaves today all over the world. Why are you talking about one aged jungle cat in a place where the relationship between impoverished pastoralist communities and wealthy foreign tourists is more complicated than you actually understand?
And I’m glad you’re so concerned about human trafficking, but there will be no humans at all if we don’t do something about climate change. Reliance on fossil fuels and industrialized farming is the real problem, and that’s what you should be outraged by. You don’t know what to care about. I know what to care about.
The Internet launders outrage and returns it to us as validation, in the form of likes and stars and hearts. The greatest return comes from a strong and superior point of view, on high moral ground. And there is, fortunately and unfortunately, always higher moral ground. Even when a dentist kills an adorable lion, and everyone is upset about it, there’s better outrage ground to be won. The most widely accepted hierarchy of outrage seems to be: Single animal injured < single animal killed < multiple animals killed < systematic killing of animals < systematic oppression/torture of people < systematic killing of humans < end of all life due to uninhabitable planet.
To say that there’s a more important issue in the world is always true, except in the case of climate change ending all life, both human and animal. So it’s meaningless, even if it’s fun, to go around one-upping people’s outrage. Try it. Someone will express legitimate concern over something, and all you have to do is say there are more important things to be concerned about. All you have to do is use the phrase “spare me” and then say something about global warming. You can literally write, “My outrage is more legit than your outrage! Ahhh!”
Jon Stewart, Patriot — An appreciation in The New Yorker by David Remnick.
Political life in America never ceases to astonish. Take last week’s pronouncements from the Republican Presidential field. Please. Mike Huckabee predicted that President Obama’s seven-nation agreement limiting Iran’s nuclear capabilities “will take the Israelis and march them to the door of the oven.” Ted Cruz anointed the American President “the world’s leading financier of radical Islamic terrorism.” Marco Rubio tweeted, “Look at all this outrage over a dead lion, but where is all the outrage over the planned parenthood dead babies.” And the (face it) current front-runner, the halfway hirsute hotelier Donald Trump, having insulted the bulk of his (count ’em) sixteen major rivals plus (countless) millions of citizens of the (according to him) not-so-hot nation he proposes to lead, announced via social media that in this week’s Fox News debate he plans “to be very nice & highly respectful of the other candidates.” Really, now. Who’s writing this stuff? Jon Stewart?
Over the decades, our country has been lucky in many things, not least in the subversive comic spirits who, in varying ways, employ a joy buzzer, a whoopee cushion, and a fun-house mirror to knock the self-regard out of an endless parade of fatuous pols. Thomas Nast drew caricatures so devastating that they roiled the ample guts of our town’s Boss, William Marcy Tweed. Will Rogers’s homespun barbs humbled the devious of the early twentieth century. Mort Sahl, the Eisenhower-era comic whose prop was a rolled-up newspaper, used conventional one-liners to wage radical battle: “I’ve arranged with my executor to be buried in Chicago, because when I die I want to still remain politically active.” Later, Dick Gregory, Richard Pryor, and Joan Rivers continued to draw comic sustenance from what Philip Roth called “the indigenous American berserk.”
Four nights a week for sixteen years, Jon Stewart, the host and impresario of Comedy Central’s “The Daily Show,” has taken to the air to expose our civic bizarreries. He has been heroic and persistent. Blasted into orbit by a trumped-up (if you will) impeachment and a stolen Presidential election, and then rocketing through the war in Iraq and right up to the current electoral circus, with its commodious clown car teeming with would-be Commanders-in-Chief, Stewart has lasered away the layers of hypocrisy in politics and in the media. On any given night, a quick montage of absurdist video clips culled from cable or network news followed by Stewart’s vaudeville reactions can be ten times as deflating to the self-regard of the powerful as any solemn editorial—and twice as illuminating as the purportedly non-fake news that provides his fuel.
Stewart set out to be a working comedian, and he ended up an invaluable patriot. But the berserk never stops. His successor, Trevor Noah, will not lack for material. As Stewart put it wryly on one of his last nights on the air, “As I wind down my time here, I leave this show knowing that most of the world’s problems have been solved by us, ‘The Daily Show.’ But sadly there are still some dark corners that our broom of justice has not reached yet.”
Doonesbury — Amateur Night.
Obama and History — Josh Marshall on what a legacy means to President Obama.
We all remember that week last month when the country seemed to be marching with history. The Court upheld the Affordable Care Act against what is likely its last serious legal challenge, effectively embedding it deeply into the structure of American social policy. The Court then (in what was unfortunately a weakly argued majority decision) made marriage equality the law of the land nationwide. Then on the heels of these events came the President’s speech (transcript here) in Charleston, South Carolina – nominally a eulogy for Clementa Pinckney, one of the victims of the Emanuel Church massacre on June 17, but in fact a commemoration and meditation on the meaning of the whole event. (James Fallows’s is one of the best appreciations and treatments of it.)
When I look at Obama I don’t see a President desperately trying to cram legacy achievements into the declining months of his presidency. I see achievements coming to fruition that were usually years in the making but often seemed errant or quixotic and uncertain in their outcome. This is what for many was so bracing about the end of June. This has been a long, long seven years. What seemed like an uncertain list of achievements, long on promise but hacked apart by mid-term election reverses and Obama’s sometimes excessive eagerness to accommodate, suddenly appeared closer to profound, like a novel or a play which seems scattered or unresolved until all the pieces fall into place, clearly planned all along, at the end.
Whatever you think of this Iran agreement, it is not only the product of years of work but is core to the foreign policy vision Obama brought with him to the presidency. It’s as core to the goals he entered the presidency with as anything that has happened in recent weeks. He has it in view; his political opponents will be very hard pressed to block him. And he is pushing ahead to get it done.
None of this is to say that there isn’t a clear and palpable change in the President’s affect and demeanor. His presidency is coming to an end and his range of action will diminish further as the presidential election moves to center stage next year. As the budget deficit has receded from public view, Obama’s fucks deficit has come to the forefront. After six and a half years in office, he may have a small stockpile of fucks left. But he has none left to give. He is increasingly indifferent to the complaints and anger of his political foes and focused on what he can do on his own or with reliable political supporters. You can see it too in the more frequent lean-in-on-the-lectern moments during press conferences and speeches. He’s truly out of fucks to give. But it’s more a product of focus on finishing aspects of his presidency in motion for years than of cramming at the end. For most of his supporters, this was the Obama they always wanted. And he’s giving it to them. What comes off to reporters as testiness is more like the indifference of someone who’s got work to do and is intent on doing it.
Scout’s Honor — Dale Russakoff in The New Yorker reconnects with the woman who played Scout in the film of To Kill a Mockingbird as she looks back at her role on and off the screen.
After playing Scout in the movie of “To Kill a Mockingbird,” in 1962, Mary Badham endured a rude homecoming when she returned to Birmingham, Alabama. Having just spent six months in California with her mother, living in a racially integrated apartment complex, she found herself suddenly an outsider back home. “The attitude was ‘Lord knows what she might’ve learned out there!’ ” Badham recalled the other day. “Some families, I’d been welcome in their homes, and after the film, I was no longer welcome.”
Like the adult Scout in Harper Lee’s newly published “Go Set a Watchman,” Badham left the South during the era of segregation, and returned to find that people she once considered unequivocally good in fact bore the markings of that evil system. In the case of Scout, as revealed with alarm by reviewers of “Watchman,” it’s her sainted father, Atticus, who emerges as an overt racist, inveighing against threats to segregation from the U.S. Supreme Court and local lawyers for the N.A.A.C.P. Badham similarly discovered a mean streak in family friends who didn’t tolerate her breaking of Southern white taboos. “I was ostracized and it was painful,” said the adult Badham.
This past Tuesday night, nine hundred people, a sellout crowd, came to hear Badham read from “Go Set a Watchman” and “To Kill a Mockingbird” at the 92nd Street Y. Harper Lee herself made New York City—specifically the Upper East Side neighborhood around the Y—her second home for more than fifty years. These were her fans, and they clearly had come looking for something to celebrate. When Badham was introduced, they whooped and cheered.
Badham, who was nine when she played the iconic six-year-old and is now sixty-two, was completely overcome. Today a furniture-restorer in rural Virginia, she clasped her hands, raised them in celebration, then took a bow, and finally laughed until she almost cried. She read a brief excerpt from “Mockingbird,” and the first chapter of “Watchman.” Her voice is slow and lilting, quintessentially Southern. Alternately funny and poignant, Badham’s channelling of Jean Louise Finch—in “Watchman,” she has mostly shed her famous nickname—elicited frequent laughter.
In the Q. & A. that followed, moderator Mary Murphy, the director of the documentary “Harper Lee: From Mockingbird to Watchman,” asked Badham if she was surprised by the evolution of Atticus. She was not. In the Alabama she knew, it was not unheard of for a white man like him to righteously defend a black man like Tom Robinson against an unjustified charge of rape, and at the same time believe, as Atticus says in “Watchman,” that black people were “backward,” not “ready” to exercise their full civil rights. She heard all that and much more growing up in Birmingham. We all did.
Could Florida Democrats Blow It Again? — David A. Graham in The Atlantic on the fight brewing for the Senate seat.
The road to a Democratic majority in the Senate is a narrow one, and it runs through Florida. Marco Rubio is running for president, so he can’t run for reelection, freeing up his seat—and in a swing state like Florida, with the more Democratic-friendly electorate of a presidential cycle, there’s a good chance Democrats can win.
If they have the right candidate, of course.
That’s where Alan Grayson comes in. Democrats have had a rough run in Florida recently. In 2010, their candidate was walloped in a three-way Senate race that Rubio won—Governor Charlie Crist ran as an independent after losing the Republican primary; Democrat Kendrick Meek finished a distant third. That same year, Alex Sink lost a close race for governor to Rick Scott. In early 2014, Sink lost a special election for the seat of deceased Representative C. W. “Bill” Young. In fall 2014, Crist—by now a Democrat—lost the governor’s race to Scott, even though the incumbent was strongly disliked.
The remedy, state and national Democrats believe, is Patrick Murphy, a young two-term representative who reached office after defeating Representative Allen West—as fiery and controversial a Republican as Grayson is a Democrat—in 2012. Murphy is a notably moderate Democrat (he was previously a Republican), but he’s a polished candidate who showed he could win in a closely divided district. Party leaders marked him for great things. Early polls show him leading the top Republican candidates.
Then Grayson announced his decision to run. He’s the famously (or infamously) loudmouthed U.S. representative from Orlando—the guy who, during the healthcare-reform debate said the Republican health plan was “Don’t get sick, and if you do get sick, die quickly.” Grayson has a long history of similarly inflammatory or hotheaded comments, which he says is evidence that he’s willing to fight for his principles. The wealthy liberal is serving his third term, but it’s nonconsecutive—elected in 2008, he was defeated in 2010 and then returned to Congress in the 2012 election.
How big a threat to Murphy is Grayson? That’s a tough call. There’s not a great deal of good polling in the race. Several earlier polls showed a close race. A poll in early July from Gravis Marketing showed Grayson leading Murphy by an astonishing 63-19 margin. It’s probably best not to put too much stock in that result—it’s early, it’s an outlier, and Gravis’s track record is, um, not sterling.
But Grayson has one big advantage: He’s willing to say anything. In particular, he’ll deliver inflammatory quotes left and right about anything and anyone, allowing him to effectively tap into the Democratic id. Or in his own, typically modest words, “Voters will crawl naked over hot coals to vote for me. And that’s something that no other candidate in either party can say.” That also means he has strong fundraising potential from the grassroots, though he’s also independently wealthy. His act might play well in a Democratic primary, but could he win a general election in a purple state? The Democratic Senatorial Campaign Committee seems unconvinced. The DSCC praised Murphy in a statement when Grayson officially entered the race last week, but didn’t even mention Grayson’s name.
Doonesbury — Hotter than ever.
What Do You Know? — Eric Liu in The Atlantic on the knowledge gap in America.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
Yet from another perspective, much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.
Imagine that this is true; that this decades-long war is about to give way to something else. The question then arises: What? What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been. And that awareness demands a new kind of mirror.
It helps first to consider some recent history. In 1987, a well-regarded professor of English at the University of Virginia named E.D. Hirsch Jr. published a slim volume called Cultural Literacy. Most of the book was an argument—textured and subtle, not overtly polemical—about why nations need a common cultural vocabulary and why public schools should teach it and, indeed, think of their very reason for being as the teaching of that vocabulary.
At the end of the book Hirsch and two colleagues tacked on an appendix: an unannotated list of about 5,000 names, phrases, dates, and concepts that, in their view, “every American needs to know.” The rest (to use a phrase that probably should’ve been on the list) was history.
The appendix became a sensation and propelled the book to the top of the best-seller list. Hirsch became that rare phenomenon: a celebrity intellectual. His list was debated in every serious publication and in elite circles. But he also was profiled in People magazine and cited by pundits who would never read the book.
Hirsch’s list had arrived at a ripe moment of national anxiety, when critics like Allan Bloom and Arthur Schlesinger Jr. were bemoaning the “closing of the American mind” and “the disuniting of America”; when multicultural curricula had arrived in schools, prompting challenges to the Western canon and leading Saul Bellow to ask mockingly who the Tolstoy of the Zulus was, or the Proust of the Papuans; a time when Bill Bennett first rang alarms about the “dumbing-down of America.”
The culture wars were on. Into them ambled Hirsch, with his high credentials, tweedy profile, reasoned arguments, and addictively debatable list. The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion. (From a page drawn at random: Cotton Mather, Andrew Mellon, Herman Melville).
Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist. His list was a last gasp (or was it a fierce counterattack?) by a fading (or was it resurgent?) white establishment.
Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.
A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols…
Doonesbury — Final Curtain.
Assassination in Moscow — Matt Schiavenza in The Atlantic on the murder of a Putin opponent.
Hours after Boris Nemtsov was slain on Friday night near the Kremlin, Russian president Vladimir Putin vowed to seek justice: “Everything will be done so that the organizers and perpetrators of a vile and cynical murder get the punishment they deserve,” he said in a condolence message to the 55-year-old Nemtsov’s mother. Whether Putin is being sincere is something only he and his closest advisors know. But Russia’s recent history inspires little confidence that Nemtsov’s killers, whoever they are, will be brought to justice.
Nemtsov was a high-profile politician, having served as a deputy prime minister and, more recently, as a regional legislator. He was such an outspoken critic of Putin in those roles that he openly feared for his life. Along with his colleague Leonid Martynyuk, Nemtsov published a report detailing the immense corruption surrounding the 2014 Winter Olympics, which were hosted in the Russian resort town of Sochi. Nemtsov also spoke out about Russia’s seizure of Crimea last February and subsequent support for pro-Kremlin rebels in eastern Ukraine. But Nemtsov is hardly the first critic of Putin to lose his life to premeditated murder. Dozens of journalists have been killed since the Russian president first assumed office in 2000. Few of those responsible have been brought to justice—a point Nemtsov himself was well aware of. “The murderers understand that killing journalists is not a problem,” he told Foreign Policy‘s Christian Caryl in a 2010 interview.
The assassination of a well-known politician, however, is somewhat more unusual. In an attempt to preempt public outrage, the Kremlin has already formed a committee to investigate the causes of Nemtsov’s death. One possibility they cited was that Nemtsov’s commentary about the satirical publication Charlie Hebdo, whose offices suffered a murderous assault in January, made him a target of Islamists. The committee also mentioned Nemtsov’s controversial position on Ukraine, and, most spectacularly, suggested that he was killed by fellow opponents of Putin in an attempt to rally opposition to the Russian president.
Putin’s critics have not had it easy in Russia. A major economic slowdown triggered by falling oil prices has not diminished the president’s popularity. The country’s liberal opposition—epitomized by Nemtsov and the jailed politician Alexei Navalny—is weak and marginalized, and their positions on Ukraine, Putin, and the Sochi Olympics are not widely shared among ordinary Russians.
Nevertheless, the Kremlin appears wary of turning Nemtsov into a martyr. He had been scheduled to appear at an anti-Putin rally in Moscow on Sunday. But when the organizers asked to turn the rally into a memorial for Nemtsov, Russian authorities denied the request. Even so, protests have done little to challenge Putin’s grip on power—something that Nemtsov himself acknowledged in a recent interview published in Newsweek‘s Polish edition:
[The liberals’] idea is the one of a democratic and open Russia. A country which is not applying bandit methods to its own citizens and neighbors. But, as I mentioned, Russian fascism is a hybrid. And hybrids are extremely resistant.
As the world mourns his death, Nemtsov’s vision seems very far from being realized.
Early Bird vs. Night Owl — Maria Konnikova in The New Yorker on the morals dictated by our sleep pattern.
The idea of the virtuous early bird goes back at least to Aristotle, who wrote, in his Economics, that “Rising before daylight is … to be commended; it is a healthy habit.” Benjamin Franklin, of course, framed the same sentiment in catchier terms: “Early to Bed, and early to rise, makes a Man healthy, wealthy and wise.” More recently, there has been a push for ever earlier work starts, conference calls, and breakfast meetings, and a steady stream of advice to leave Twitter and Facebook to the afternoon and spend the morning getting real things done. And there may be some truth to the idea: a 1998 study in the Journal of Personality and Social Psychology suggests that we become more passive as the day wears on. You should do the most important thing first, the theory goes, because, well, you won’t be able to do it quite as well later on.
In last January’s issue of Psychological Science, Maryam Kouchaki and Isaac Smith took that theory even further, proposing what they called the morning morality effect, which posits that people behave better earlier in the day. Their research caught the attention of Sunita Sah, a behavioral scientist at Georgetown University and a professed night owl. For the previous five years, Sah had been studying how different situations influence ethical behavior. “You always hear these sweeping statements: morning is saintly, evening is bad; early to bed, early to rise,” she told me recently. A former physician, she found it plausible that something with such profound health consequences as time of day might also have a moral dimension. But she wondered how strong the effect really was. Were people like her—principled late risers—the exception to the rule? To test the limits of Kouchaki and Smith’s findings, Sah and her colleagues began by looking at the underlying biology.
Our sleep patterns are governed by circadian rhythms, our bodies’ response to changes in light and dark in a typical day. The rhythms are slightly different for every person, which is why our energy levels ebb and flow in ways that are unique to us. This internal clock determines what is called our chronotype—whether we are morning people, night people, or somewhere in between. Chronotypes are relatively stable, though they have been known to shift with age. Children and older adults generally prefer mornings; adolescents and young adults prefer evenings. Figuring out where you fall is simple: spend a few weeks going to bed when you feel tired and waking up without an alarm clock. A quicker alternative is the Horne-Ostberg questionnaire, which presents various scenarios—a difficult exam, twice-weekly exercise with a friend—and determines your chronotype on the basis of what time of day you’d feel most up to confronting them.
Chronotype, of course, doesn’t control wakefulness all on its own. There is also what is known as homeostatic sleep drive. The longer we are awake, irrespective of where we are in our established circadian rhythms, the more fatigue exerts its pressure on us. In morning people, sleep drive and chronotype tend to be aligned. Their internal clocks are pretty well synchronized with their over-all energy levels. For night owls, however, things get complicated. When the sun comes up, the light resets their circadian clocks, telling them to wake up. But, because of their chronotypes, they don’t have much energy and they want to go back to sleep. At night, the reverse happens: one system is telling them to sleep and another is telling them to remain awake. About forty per cent of people fall into this latter category.
The Right to Get Weird — Marin Cogan reports in New York magazine on the sideshows at CPAC.
“It’s hard to punch through here,” Travis Brown, a writer for the anti-tax website How Money Walks, is saying. Standing in front of us, a towering silver robot with glowing red LED lights in his eyes and chest plate takes a clunky step forward. “We need to be creative. There’s so much going on.” The robot takes another step forward. A college-aged girl walks by asking who he is.
“Govtron is a robot built and fueled by government inefficiency,” one of the robot’s handlers says. “So he’s armored with pages of the Obamacare bill, he’s got a red tape cannon, he’s stomping on some Gadsden snakes as we speak, stomping on your freedom. We’re pitting this super villain against the How Money Walks Reformers, which includes Captain America and Iron Man, as well as Iron Patriot.” Behind him, a man in a Captain America costume gives a halfhearted wave. “That is so funny!” the girl says. “And what is his name? Goovtron?”
Govtron is the subject of a short comic book Brown authored specially for CPAC, the annual confab hosted by the American Conservative Union. He’s there to direct attention to their website, and right now even he’s struggling to stand out. A few booths away, a limited-government youth group called Turning Point USA is blasting Sia while students mill about, tagging their “Big Government Sucks” signing wall. “We’re working around this theme, big government sucks,” says Marko Sukovic, the group’s Midwest field director. “It’s probably the most relevant phrase any young person can relate to on college campuses.”
Behind the wall, on which someone has scrawled, “I love our freedom and dislike big politics,” a man in a “Muhammad is a homo” T-shirt is giving an interview in front of an audience of empty chairs. Three aisles down, at the end of the Gaylord Hotel’s massive expo center, the American Atheists are posted up at a booth with the banner “Conservative Atheists Matter!”
“There are millions and millions of atheists who would be voting Republican if the Republicans would just let them!” David Silverman, the group’s president, says, eyes as big as saucers. “Just ask for our vote! Tell us that we count! Tell us that we matter, once! It’s never happened. Not in my lifetime.” The fresh-faced youth manning the World Congress of Families booth beside them does not know how to deal with the atheists next door. “It’s urrrgh … ” he mumbles until an adult steps in to cut him off.
In a big ballroom upstairs, Ted Cruz, Rand Paul, and Marco Rubio will practice their nascent stump speeches to adoring crowds, and Jeb Bush and Chris Christie — the more moderate and less favored potential candidates in the CPAC straw poll — will get grilled by conservative luminaries like Laura Ingraham and Sean Hannity. With the exception of some awkward jokes, and Scott Walker’s awkward reply to a question about how he would take on ISIS (“We need a leader with that kind of confidence. If I can take on 100,000 protesters, I can do the same across the world,” he tells a questioner), most of the conference’s events are too scripted to be memorable.
Doonesbury — Planned disruption. (You may have to scroll down the page to actually see the comic.)