Monday, June 13, 2016

A Test of Character

I sometimes think the Bill of Rights is a test of character for the country.  It’s as if the Founding Fathers said, “Okay, America, we’re going to give you all of these rights; let’s see how you handle them.”

There have been times when we have risen to them and proven ourselves worthy: when equal rights for all truly do mean real equality, not separate but equal or equal only for white Christians.  And there have been times when we have failed them: internment for citizens who immigrated from a place we’re currently hating, or the idea that because some see one amendment as thoroughly inviolable, the rest of us must accede to their fetishism as the way things must be.

It is serendipitous that even as we recovered from the shock and horror of the massacre in Orlando, we saw a celebration of a musical that honors Alexander Hamilton, one of the Founding Fathers.  It’s a civics lesson through hip-hop, and while some may find it incongruous to see 18th-century characters rapping about starting a country, it reminds us that we are forever being challenged on how we answer to those who set us on our way.

Sunday, June 5, 2016

Sunday Reading

Transcendent — Charlie Pierce on Muhammad Ali.

I play it cool/I dig all jive/That is how I stay alive

—Langston Hughes

There is no real place to begin with him and no ending fit enough for the life he led. Muhammad Ali died on Friday, true enough. They will take him to his final rest on Wednesday in Louisville, which was only his first hometown in a world that he made his true hometown. So he was not immortal, the way we all thought he might be, but he lived a life beyond the bounds of mortality anyway, a life that has no real beginning and that still has a vital spirit for which no ending is adequate.

He was an iconic human being in an era that produced icons with every turn of the television dial, every front page of every morning newspaper and, my god, most of them died young. John and Robert Kennedy. Martin Luther King and Malcolm X. None of them ever made 50. None of them ever made old bones. Only Ali lived to see how he truly changed the world around him, how it had come to understand that some lives are lived beyond the mortal limits.

He was a transcendent athlete, first and foremost, every bit as skilled at what he did for a living as Michael Jordan or Pele. The greatest change in athletes over the span of his physical life is that big athletes got fast. LeBron James plays basketball and he is just about the same size as Antonio Gates, who is a tight end. When he first arrived at Wimbledon, Boris Becker looked like a college linebacker. Ali was tall for a heavyweight, bigger than anyone who was faster than he was and faster than anyone who was bigger.

You have to have seen him before he was stripped of his livelihood to appreciate fully his gifts as an athlete. Foot speed. Hand speed. Before it all hit the fan in 1968, Sports Illustrated put him in a lab with strobe lights and everything, to time the speed of his punches. The results looked like something out of a special-effects lab. In one of his routines, the late Richard Pryor used to talk about sparring with Ali in a charity exhibition. Pryor, a Golden Gloves fighter in his youth, later put it this way: “you don’t see his punches until they comin’ back. And your mind be sayin’, ‘Wait a minute now. There was some shit in my face a minute ago. I know that.'” He was an accelerated man in an accelerated age. Saying he was “ahead of his time” was only the half of it. His time was all time.

That was what led to the rest of it—the opposition to the criminal stupidity that was being practiced by this country in Southeast Asia, stated in terms as fundamentally American as the First Amendment to the Constitution. “Congress shall make no law…” His stubborn insistence that his life was his own, that it did not belong to the sclerotic old gangsters who still ran boxing, nor to the sclerotic old men who still ran the government, with their wiretaps and their phony indictments and their lawbooks. He was too fast for them all to catch, ultimately, and too pretty for a country that was vandalizing its most beautiful elements. That stubbornness also likely led to his physical downfall. All gifts have their dark side. All debts come due.

He was a prophet, in every way that America makes its prophets, in the same way that was William Lloyd Garrison, who told his country “I am in earnest—I will not equivocate—I will not excuse—I will not retreat a single inch—AND I WILL BE HEARD,” and in the same way that was Dr. King, who told that same country that:

“In a sense we’ve come to our nation’s capital to cash a check. When the architects of our republic wrote the magnificent words of the Constitution and the Declaration of Independence, they were signing a promissory note to which every American was to fall heir. This note was a promise that all men, yes, black men as well as white men, would be guaranteed the “unalienable Rights” of ‘Life, Liberty and the pursuit of Happiness.’ It is obvious today that America has defaulted on this promissory note.”

He embodied the country, in all its historic, inherent contradictions, in all its promises, broken and unbroken, and in all of its lost promises and hard-won glories. He insisted on the rights that the country said were his from birth and, in demanding them, freed himself to enjoy them, and freed the country, if only for a moment, to be something more than even the Founders thought it would be. And now, he’s passed from the earth. It was a great, golden trumpet of a life he led, and it is calling, calling still, and will still be calling, as the old hymn puts it, when time shall be no more.

The Real Scandal — Julianne Hing in The Nation on Donald Trump’s scam “university.”

This is where we are at this point of the collective national nightmare of the Republican Party’s 2016 campaign: On Thursday, Donald Trump told The Wall Street Journal that because of US District Judge Gonzalo Curiel’s “Mexican heritage,” the federal judge has an “absolute conflict” in presiding over a lawsuit brought by former students of Trump’s self-named real-estate courses. Curiel’s ethnic background is of importance because, Trump said, “I’m building a wall. It’s an inherent conflict of interest.” Trump clearly misunderstands the concept; a defendant’s own prejudices have no bearing on whether a judge is unfit for the job.

When Trump first mentioned the Indiana-born judge’s ethnicity at a San Diego rally last Friday, it was to do his usual jabbing and dancing to avoid ethical punches. At that event, Trump raised Curiel and mentioned his ethnic background in the same breath: “The judge, who happens to be, we believe, Mexican, which is great, I think that’s fine,” adding that he was sure that Mexican Americans would come around to support him “when I give all these jobs, OK?” Then he circled back around. “I’m getting railroaded by the legal system,” he said, “Frankly, they should be ashamed.” Trump labeled Curiel “a hater of Donald Trump,” and also called him “a total disgrace.”

It was a classic Trump move: create bogeymen out of thin air in order to prop up his self-imagined victimhood; home in on a person’s race or sex as the basis for his attacks; and then antagonize as a form of diversion from the matter at hand. That matter would be Trump University, the mogul’s real-estate courses that purportedly taught customers how to become like Trump, for as much as $35,000, or starting at the low, low price of $1,495. The lawsuit alleges that far from teaching students actual real-estate expertise, Trump ran a fraudulent business scheme.

The marketing schemes for Trump’s real-estate seminars at times sound ripped straight from the recruitment playbooks of the scandal-plagued for-profit school industry, which has preyed on single moms, people of color, veterans, and those who’ve been locked out of more prestigious avenues for higher education.

Take, for instance, the 2010 Senate testimony of Joshua Pruyn, a former admissions representative for Westwood College, a for-profit chain of, at the time, 17 campuses. Pruyn was technically an admissions advisor, but in reality his position was that of a glorified sales rep. “During the interview, we were taught to portray ourselves as advisors looking out for the students’ best interests and ensuring they were a good fit for the school. This fake interview would allow the representative to ask students questions to uncover a student’s motivators and pain points—their hopes, fears, and insecurities—all of which would later be used to pressure a student to enroll,” Pruyn testified.

The for-profit schools industry targeted people of color, poor people, and veterans because they were more likely to be eligible for public financial aid like Pell Grants. This much-parodied Everest College commercial should be very familiar to anyone who watches daytime television.

Students of color ended up forming the backbone of the industry’s explosive growth in the early and mid-2000s. In the 2010–11 school year, just as the Obama administration’s regulatory hammer started to fall on the industry, the for-profit system University of Phoenix was the nation’s top producer of new black undergrad graduates. The nation’s second-highest producer of new black baccalaureates that year was Ashford University, also a for-profit college.

When the industry’s comeuppance came, it was devastating. In lawsuit after lawsuit, universities were accused of fleecing students of their federal student-aid money, saddling them with debt they couldn’t repay, and leaving them with an education and credits that weren’t transferable or recognized as valid by other educational institutions. In December 2015, after multiple settlements in various lawsuits, Westwood College—where former admissions recruiter Pruyn worked—agreed not to enroll any more students.

After suffering a barrage of these kinds of lawsuits and increased regulation from the Obama administration, the for-profit schools industry is now in the tank. Enrollment is down, and many schools have been exposed as the shady businesses they were, including Trump University.

Hours after Trump brought Judge Curiel into his campaign theater last week, Curiel unsealed documents related to his case, at the request of The Washington Post. Those documents detail the aggressive marketing and recruitment playbook that Trump University sales staffers worked from. The playbook urged sales members to not let prospective customers be deterred by their own lack of money (“If they believe in you and your product, they will find the money”), and to guide consumers through “the roller coaster of emotions,” so as to encourage students to cough up cash. The guides urged sales members to home in on people’s vulnerabilities for maximum effect “during closing time.” (“For example: are they a single parent of three children that may need money for food?”)

These tactics, Trump would rather not discuss. Always easier, after all, to pivot to the most base of appeals—racial and ethnic antagonism—and the cheapest of tactics—bullying others and calling it self-defense.

Spectacles Spectacle — Peter Schjeldahl in The New Yorker on the latest art craze.

A recent little sensation at the San Francisco Museum of Modern Art delights and bemuses. Two teen-age boys from San Jose were perusing, with perplexity, the museum’s exhibits of contemporary art when they had a notion. One of them, Kevin Nguyen, sixteen, set his opened eyeglasses on the floor, neatly perpendicular to a nearby wall. He and T. J. Khayatan, seventeen, then stood back to watch what ensued: viewers observing the glasses with the curiosity and respect befitting a work of art—which, under the circumstances, they were.

Not that the glasses were good art, necessarily—an issue made moot, in any case, when Nguyen picked them up and put them back on.

But consider: an object manufactured to enhance seeing, presented as something to see. By being underfoot, the glasses were divorced from their function and protected only by the don’t-touch protocol of museums. They might have seemed, to a suggestible audience, to be about being-in-a-museum—and that audience could have included me. Suggestibility, undaunted by fear of proving foolish, is essential to art love.

Invoked, of course, was the evergreen aesthetic of the readymade, demonstrated by Marcel Duchamp with a urinal, in 1917. But that trope is hardly surefire. During their visit, Nguyen and Khayatan ventured two other placements, of a jacket and a baseball cap, which, at least visibly, intrigued no one. Some conceptual poetry or satirical bite is needed to bring a readymade off. The glasses managed both the former, at first, and then the latter, when their backstory emerged.

Many sane citizens will deem the spectacle of the spectacles ridiculous. They won’t be wrong. A risk of absurdity always attends the willingness to surrender oneself to the spell of any mere object: the dirtied swatch of cloth that is a painting, for example. It’s a game, whatever else it is, which makes sense only with knowledge of the rules and customs that are in play.

Museums edit, for our convenience, the universe of existing things. What they let in and what they keep out shape culture. How far in the way of inclusion is too far? How much in the way of discrimination is just crabby?

Have we witnessed the entire art career, now, of the San Jose Two?

You go, boys.

Doonesbury — Heir Apparent.

Sunday, May 29, 2016

Sunday Reading

Liar, Liar — Jonathan Chait on the serial mendacity of Donald J. Trump.

Donald Trump is a wildly promiscuous liar. He also has disturbing authoritarian tendencies. Trump’s many critics have seized upon both traits as his two major disqualifications for the presidency, yet both of them frustratingly defy easy quantification. All politicians lie some, and many of them lie a lot, and most presidents also push the limits of their authority in ways that can frighten their opponents. So what is so uniquely dangerous about Trump? Perhaps the answer is that both of these qualities are, in a sense, the same thing. His contempt for objective truth is the rejection of democratic accountability, an implicit demand that his supporters place undying faith in him. Because the only measure of truth he accepts is what he claims at any given moment, the power his supporters vest in him is unlimited.

Trump lies routinely, about everything. Various journalists have tried to tally up his lies, inevitably giving up and settling for incomplete summaries. Some of these lies are merely standard, or perhaps somewhat exaggerated, versions of the way members of his party talk about policy. (The “real” unemployment rate is as high as 42 percent, or his gargantuan tax-cut plan “will be revenue-neutral.”) At times he engages in especially brazen rewriting of his own positions, such as insisting he opposed the Iraq War when he did not, or denying his past support for universal health insurance. Some of his lies are conspiracy theories that run toward the edges of respectable Republican thought (Barack Obama was actually born abroad) or even well beyond it (Ted Cruz’s father may have conspired to kill John F. Kennedy). In all these areas, Trump has merely improved upon the methods used by the professionals in his field.

Where he has broken truly unique ground is in his lies about relatively small, routine matters. As I’ve pointed out before — it’s become a small personal fixation — after Mitt Romney mocked the failure of Trump Steaks, Trump held a press conference in which he insisted Trump Steaks remained a going concern, despite the undeniable fact that the business no longer exists. (His campaign displayed store-bought steaks for the media, not even bothering to fully remove the labels of the store at which they purchased them.) The New York Times actually reported this week that Trump had displayed his steaks, without mentioning the blatant deception. Another such example is Trump’s prior habit of impersonating an imaginary p.r. representative while speaking to reporters. Obviously, the practice itself is strange enough, but the truly Trumpian touch is that he admitted to the ruse publicly, and then subsequently went back to denying it.

The normal rules of political lying hold that when the lie has been exposed, or certainly when it has been confessed, the jig is up. You have to stop lying about it and tell the truth, or at least retreat to a different lie. Trump bends the rules of the universe to his own will, at no apparent cost. His brazenness is another utterly unique characteristic. His confidence that he can make the truth whatever he wishes at any moment, and toggle back and forth between incompatible realities at will, without any cost to himself, is a display of dominance. Possibly Trump’s most important statement of the campaign was his idle boast that he could shoot somebody on Fifth Avenue without losing any votes.

Finally, there is Trump’s habit of settling all disputes with his own peculiar form of ad hominem. He dismisses all criticisms of his statements and his record with an array of put-downs, and likewise confirms all endorsements with praise. Anybody who disagrees with Trump is ugly, short, corrupt, a loser, a habitual liar, a total joke, and so forth. People who support him are smart, beautiful, fair, esteemed, etc. But politics being as it is — and, especially, Trump’s positions being as fluid as they are — the composition of the two categories is in constant flux. One day, you are a failing, ridiculous, deranged liar, and the next day a citizen of the highest regard. Trump literally called Ben Carson a “violent criminal” and a “pathological liar,” akin to a “child molester.” When later accepting Carson’s endorsement, Trump praised his “dignity.” Once Trump mocked Rick Perry as a moron who wore glasses to look smart and who should be required to take an IQ test to participate in presidential debates. Now he is a “good guy, good governor.” This is the pattern Trump uses to dismiss all media criticism, or to amplify friendly coverage. Every reporter or publication is either pathetic and failing or fair and wonderful, and the same reporters and publications can be reclassified as one or the other as Trump sees fit.

1984 is a cliché for invoking totalitarianism, and in any case, Trump is merely an authoritarian and a bully, not a totalitarian. (A totalitarian government, like North Korea, exerts control over every aspect of its citizens’ lives; an authoritarian one, like Putin’s Russia, merely uses enough fear and violence to maintain control.) Nonetheless, the novel does capture the relationship between dictatorial authority and the power to manipulate any fact into a binary but permeable scheme:

The past was alterable. The past never had been altered. Oceania was at war with Eastasia. Oceania had always been at war with Eastasia. Jones, Aaronson, and Rutherford were guilty of the crimes they were charged with. He had never seen the photograph that disproved their guilt. It had never existed, he had invented it. He remembered remembering contrary things, but those were false memories, products of self-deception.

Truth and reason are weapons of the powerless against the powerful. There is no external doctrine he can be measured against, not even conservative dogma, which he embraces or discards at will and with no recognition of having done so. Trump’s version of truth is multiple truths, the only consistent element of which is that Trump himself is always, by definition, correct. Trump’s mind is so difficult to grapple with because it is an authoritarian epistemology that lies outside the democratic norms that have shaped all of our collective experiences.

Magnetic Personalities — M.R. O’Connor in The New Yorker on nature’s GPS system.

Every three years, the Royal Institute of Navigation organizes a conference focussed solely on animals. This April, the event was held southwest of London, at Royal Holloway College, whose ornate Victorian-era campus has appeared in “Downton Abbey.” For several days, the world’s foremost animal-navigation researchers presented their data and findings in a small amphitheatre. Most of the talks dealt with magnetoreception—the ability to sense Earth’s weak but ever-present magnetic field—in organisms as varied as mice, salmon, pigeons, frogs, and cockroaches. This marked a change from previous years, Richard Nissen, a member of the Institute, told me, when a range of other navigation aids were part of the discussion: landmarks, olfactory cues, memory, genetics, polarized light, celestial objects. “Everyone now seems completely sold on the idea that animal navigation is based on magnetism,” Nissen said. Human-centric as it sounds, most of the conference’s attendees believe that animals possess a kind of compass.

Scientists have sought for centuries to explain how animals, particularly migratory species, find their way with awesome precision across the globe. Examples of these powers abound. Bar-tailed godwits depart from the coastal mudflats of northern Alaska in autumn and set out across the Pacific Ocean, flying for eight days and nights over featureless water before arriving in New Zealand, seven thousand miles away. If the birds misjudge their direction by even a few degrees, they can miss their target. Arctic terns travel about forty thousand miles each year, from the Arctic to the Antarctic and back again. And odysseys of this sort are not limited to the feathered tribes. Some leatherback turtles leave the coast of Indonesia and swim to California, more than eight thousand miles away, then return to the very beaches where they hatched. Dragonflies and monarch butterflies follow routes so long that they die along the way; their great-grandchildren complete the journey.

Although the notion of a biocompass was widely disparaged in the first half of the twentieth century, the evidence in favor of it has since become quite strong. In the early nineteen-sixties, a German graduate student named Wolfgang Wiltschko began conducting experiments with European robins, which he thought might find their way by picking up radio waves that emanated from the stars. Instead, Wiltschko discovered that if he put the robins in cages equipped with a Helmholtz coil—a device for creating a uniform magnetic field—the birds would change their orientation when he switched the direction of north. By the start of this century, seventeen other species of migratory bird, as well as honeybees, sharks, skates, rays, snails, and cave salamanders, had been shown to possess a magnetic sense. In fact, practically every animal studied by scientists today demonstrates some capacity to read the geomagnetic field. Red foxes almost always pounce on mice from the northeast. Carp floating in tubs at fish markets in Prague spontaneously align themselves in a north-south axis. So do dogs when they crouch to relieve themselves, and horses, cattle, and deer when they graze—except if they are under high-voltage power lines, which have a disruptive influence.

The only problem is that no one can seem to locate the compass. “We are still crying out for how do they do this,” Joseph Kirschvink, a geobiologist at the California Institute of Technology, said. “It’s a needle in the haystack.” Kirschvink meant this almost literally. In 1981, as a Ph.D. student at Princeton University, he proposed that magnetite, a naturally occurring oxide of iron that he had found in honeybees and homing pigeons, was the basis of the biocompass. Even a handful of magnetite crystals, he wrote at the time, could do the trick. “One equivalent of a magnetic bacteria can give a whale a compass—one cell,” he told me. “Good luck finding it.” Even in animals smaller than a whale, this is no easy task. Throughout the two-thousands, researchers pointed to the presence of iron particles in the olfactory cells of rainbow trout, the brains of mole rats, and the upper beaks of homing pigeons. But when scientists at the Research Institute of Molecular Pathology, in Vienna, took a closer look, slicing and examining the beaks of hundreds of pigeons, they found that the iron-rich cells were likely the product of an immune response—nothing to do with the biocompass. The study’s lead researcher, David Keays, has since turned his focus to iron-containing neurons inside the pigeons’ ears.

The search for the biocompass has extended to even smaller scales, too. In 1978, the German biophysicist Klaus Schulten proposed that birds’ innate sense of direction was chemical in nature. According to his theory, incoming light would hit some sort of sensory mechanism, which Schulten hadn’t yet pinpointed, and induce a transfer of electrons, triggering the creation of a radical pair—two molecules, each with an extra electron. The electrons, though slightly separated, would spin in synchrony. As the bird moved through the magnetic field, the orientation of the spinning electrons would be modulated, providing the animal with feedback about its direction. For the next twenty years, it remained unclear which molecules could be responsible for such a reaction. Then, in 2000, Schulten suggested an answer—cryptochromes, a newly discovered class of proteins that respond to blue light. Cryptochromes have since been found in the retinas of monarch butterflies, fruit flies, frogs, birds, and even humans. They are the only candidate so far with the right properties to satisfy Schulten’s theory. But the weakest magnetic field that affects cryptochromes in the laboratory is still twenty times stronger than Earth’s magnetic field. Peter Hore, a chemist at Oxford University, told me that establishing cryptochromes as the biocompass will require at least another five years of research.

At the conference, the magnetite and cryptochrome researchers made up distinct camps, each one quick to point out the opposing theory’s deficiencies. One person stood alone: Xie Can, a biophysicist at Peking University. Xie spent six years developing a kind of unified model of magnetic animal navigation. Last year, he published a paper in Nature Materials describing a protein complex that he dubbed MagR, which consists of iron crystals enveloped in a double helix of cryptochrome—the two main theories rolled into one. Xie has yet to win over other researchers, some of whom believe that his findings are the result of iron oxide contaminating his lab experiments. (Keays has said that he will eat his hat if MagR is proved to be the real magnetoreceptor.) But at the end of the conference, with one mystery of animal navigation after another left unanswered, Xie told me that he felt more confident than ever of his and his colleagues’ model. “If we are right, we can explain everything,” he said. Michael Walker, a biologist at the University of Auckland, was more circumspect. If history is any indication, he said, many of the current hypotheses about how the biocompass works will turn out to be wrong.

“We Will Not Be Ignored” — Amanda Hess in the New York Times on the rising visibility of Asian-American actors.

When Constance Wu landed the part of Jessica Huang, the Chinese-American matriarch on the ABC sitcom “Fresh Off the Boat,” she didn’t realize just how significant the role would turn out to be. As she developed her part, Ms. Wu heard the same dismal fact repeated over and over again: It had been 20 years since a show featuring a predominantly Asian-American cast had aired on television. ABC’s previous offering, the 1994 Margaret Cho vehicle “All-American Girl,” was canceled after one season.

“I wasn’t really conscious of it until I booked the role,” Ms. Wu said. “I was focused on the task at hand, which was paying my rent.”

The show, which was just renewed for a third season, has granted Ms. Wu a steady job and a new perspective. “It changed me,” Ms. Wu said. After doing a lot of research, she shifted her focus “from self-interest to Asian-American interests.”

In the past year, Ms. Wu and a number of other Asian-American actors have emerged as fierce advocates for their own visibility — and frank critics of their industry. The issue has crystallized in a word — “whitewashing” — that calls out Hollywood for taking Asian roles and stories and filling them with white actors.

On Facebook, Ms. Wu ticked off a list of recent films guilty of the practice and said, “I could go on, and that’s a crying shame, y’all.” On Twitter, she bit back against Hollywood producers who believe their “lead must be white” and advised the creators of lily-white content to “CARE MORE.” Another tip: “An easy way to avoid tokenism? Have more than one” character of color, she tweeted in March. “Not so hard.”

It’s never been easy for an Asian-American actor to get work in Hollywood, let alone take a stand against the people who run the place. But the recent expansion of Asian-American roles on television has paradoxically ushered in a new generation of actors with just enough star power and job security to speak more freely about Hollywood’s larger failures.

And their heightened profile, along with an imaginative, on-the-ground social media army, has managed to push the issue of Asian-American representation — long relegated to the back burner — into the current heated debate about Hollywood’s monotone vision of the world.

“The harsh reality of being an actor is that it’s hard to make a living, and that puts actors of color in a very difficult position,” said Daniel Dae Kim, who stars in “Hawaii Five-0” on CBS and is currently appearing in “The King and I” on Broadway.

Mr. Kim has wielded his Twitter account to point to dire statistics and boost Asian-American creators. Last year, he posted a cheeky tribute to “the only Asian face” he could find in the entire “Lord of the Rings” series, a woman who “appears for a glorious three seconds.”

Other actors lending their voices include Kumail Nanjiani of “Silicon Valley,” Ming-Na Wen of “Agents of S.H.I.E.L.D.” and Aziz Ansari, who in his show, “Master of None,” plays an Indian-American actor trying to make his mark.

They join longtime actors and activists like BD Wong of “Gotham”; Margaret Cho, who has taken her tart comedic commentary to Twitter; and George Takei, who has leveraged his “Star Trek” fame into a social media juggernaut.

“There’s an age-old stereotypical notion that Asian-American people don’t speak up,” Mr. Wong said. But “we’re really getting into people’s faces about it.”

This past year has proved to be a particularly fraught period for Asian-American representation in movies. Last May, Sony released “Aloha,” a film set in Hawaii that was packed with white actors, including the green-eyed, blond-haired Emma Stone as a quarter-Chinese, quarter-Native Hawaiian fighter pilot named Allison Ng.

In September, it was revealed that in the planned adaptation of the Japanese manga series Death Note, the hero, a boy with dark powers named Light Yagami, would be renamed simply Light and played by the white actor Nat Wolff. In “The Martian,” released in October, the white actress Mackenzie Davis stepped into the role of the NASA employee Mindy Park, who was conceived in the novel as Korean-American.

The list goes on. In December, set photographs from the coming “Absolutely Fabulous” film showed the Scottish actress Janette Tough dressed as an over-the-top Asian character. Last month, Marvel Studios released a trailer for “Doctor Strange,” in which a character that had originated in comic books as a Tibetan monk was reimagined as a Celtic mystic played by Tilda Swinton.

And in the live-action American film adaptation of the manga series Ghost in the Shell, scheduled for next year, the lead character, Major Motoko Kusanagi, will be called Major and played by Scarlett Johansson in a black bob.

Studios say that their films are diverse. “Like other Marvel films, several characters in ‘Doctor Strange’ are significant departures from the source material, not limited by race, gender or ethnicity,” the Marvel Studios president Kevin Feige said in a statement. Ms. Swinton will play a character that was originally male, and Chiwetel Ejiofor a character that was originally white. Paramount and DreamWorks, the studios behind “Ghost in the Shell,” said that the film reflects “a diverse array of cultures and countries.”

But many Asian-American actors aren’t convinced. “It’s all so plainly outlandish,” Mr. Takei said. “It’s getting to the point where it’s almost laughable.”

Doonesbury — Play safe.

Sunday, January 3, 2016

Sunday Reading

Miami Vice — Martin Longman on Marco Rubio’s connections with drug dealers.

When you see a headline like this [How Rubio helped his ex-con brother-in-law acquire a real estate license] in the Washington Post, you figure that you’re about to read a very long and sordid exposé. That’s not really what Post reporters Manuel Roig-Franzia and Scott Higham delivered in this case, though. Their piece has enough substantiation to justify the headline, but it doesn’t delve too deeply into the greater meaning and it leaves the most important question unanswered.

Let’s start with the fact that “ex-con” doesn’t really do justice to Marco Rubio’s brother-in-law. Orlando Cicilia was a major drug trafficker at a time and in a place that has gone down in history in movies like Scarface and television programs like Miami Vice for being notoriously violent and destructive.

According to public records, Cicilia was arrested after federal law enforcement seized the Miami home where he lived with Barbara Rubio, Senator Rubio’s sister. Barbara Rubio was not arrested or indicted. Cicilia was sentenced to 25 years in prison for conspiracy to distribute cocaine and marijuana.

The arrest was part of “Operation Cobra,” a federal crackdown on a Florida drug smuggling ring that killed a federal informer and chopped up his body, according to a NYT story published at the time. The story reports that the ring, led by Cuban American Mario Tabraue, paid $150,000 in bribes to the Key West police chief and Miami-Dade county officials, and used Miami police officers to collect, count, and disburse drug profits.

About that part where they killed a federal informer and chopped up his body, the New York Times reported on December 17th, 1987:

The authorities said that in July 1980, members of [Cicilia’s drug ring] apparently became aware that Larry Nash was an informer for the Bureau of Alcohol, Tobacco and Firearms.

“Mr. Nash was murdered and mutilated,” Mr. Dean said. “His body was cut up with a chain saw and then burned.”

This drug ring reportedly did $75 million of business trading in marijuana and cocaine, of which Cicilia was personally responsible for $15 million. That's a lot of cocaine and a lot of ruined lives, and the way the ring operated meant a lot of violence and intimidation and a shameful amount of public corruption.

To call this man merely an “ex-con” doesn’t capture the scope of his crimes.

When Cicilia was arrested, Marco Rubio was sixteen years old, and he can’t be held accountable for what his sister’s boyfriend and eventual husband did for a living. That his sister and the family stayed loyal to this man throughout his incarceration and welcomed him back into their lives and homes when he was released is admirable in its own way. When you look at the totality of the circumstances with this case, the Rubio family deserves a degree of credit for loyalty and a willingness to forgive. Orlando Cicilia served his time and he ought to be afforded the opportunity to demonstrate that he’s been rehabilitated.

Still, this was a choice. It was a choice to essentially overlook the immense damage done by Cicilia and his gang to countless individuals and to the integrity of the local government and law enforcement institutions.

We have to balance the good and the bad here, and that’s the context with which we should judge the following:

When Marco Rubio was majority whip of the Florida House of Representatives, he used his official position to urge state regulators to grant a real estate license to his brother-in-law, a convicted cocaine trafficker who had been released from prison 20 months earlier, according to records obtained by The Washington Post.

In July 2002, Rubio sent a letter on his official statehouse stationery to the Florida Division of Real Estate, recommending Orlando Cicilia “for licensure without reservation.” The letter, obtained by The Washington Post under the Florida Public Records Act, offers a glimpse of Rubio using his growing political power to assist his troubled brother-in-law and provides new insight into how the young lawmaker intertwined his personal and political lives.

Rubio did not disclose in the letter that Cicilia was married to his sister, Barbara, or that the former cocaine dealer was living at the time in the same West Miami home as Rubio’s parents. He wrote that he had known Cicilia “for over 25 years,” without elaborating.

The Rubio campaign responds that it would have been worse if he had revealed his conflict of interest because revealing that Cicilia was his brother-in-law and was living with his parents would have put undue pressure on the members of the Florida Division of Real Estate. This is because, as majority whip of the Florida House of Representatives, he had “significant influence” over the Division’s budget.

That’s a defense, certainly, but a poor one. Rubio had two truly defensible options. He could have refused to write the letter because of the obvious conflict or he could have fully disclosed it and let the chips fall where they may. He chose to hide the conflict, and that was the wrong decision.

Tell Me a Story — John Yorke in The Atlantic looks at the roots of all tales.

A ship lands on an alien shore and a young man, desperate to prove himself, is tasked with befriending the inhabitants and extracting their secrets. Enchanted by their way of life, he falls in love with a local girl and starts to distrust his masters. Discovering their man has gone native, they in turn resolve to destroy both him and the native population once and for all.

Avatar or Pocahontas? As stories they’re almost identical. Some have even accused James Cameron of stealing the Native American myth. But it’s both simpler and more complex than that, for the underlying structure is common not only to these two tales, but to all of them.

Take three different stories:

A dangerous monster threatens a community. One man takes it on himself to kill the beast and restore happiness to the kingdom …

It’s the story of Jaws, released in 1975. But it’s also the story of Beowulf, the Anglo-Saxon epic poem composed some time between the eighth and 11th centuries.

And it’s more familiar than that: It’s The Thing, it’s Jurassic Park, it’s Godzilla, it’s The Blob—all films with real tangible monsters. If you recast the monsters in human form, it’s also every James Bond film, every episode of MI5, House, or CSI. You can see the same shape in The Exorcist, The Shining, Fatal Attraction, Scream, Psycho, and Saw. The monster may change from a literal one in Nightmare on Elm Street to a corporation in Erin Brockovich, but the underlying architecture—in which a foe is vanquished and order restored to a community—stays the same. The monster can be fire in The Towering Inferno, an upturned boat in The Poseidon Adventure, or a boy’s mother in Ordinary People. Though superficially dissimilar, the skeletons of each are identical.

Our hero stumbles into a brave new world. At first he is transfixed by its splendor and glamour, but slowly things become more sinister . . .

It’s Alice in Wonderland, but it’s also The Wizard of Oz, Life on Mars, and Gulliver’s Travels. And if you replace fantastical worlds with worlds that appear fantastical merely to the protagonists, then quickly you see how Brideshead Revisited, Rebecca, The Line of Beauty, and The Third Man all fit the pattern too.

When a community finds itself in peril and learns the solution lies in finding and retrieving an elixir far, far away, a member of the tribe takes it on themselves to undergo the perilous journey into the unknown …

It’s Raiders of the Lost Ark, Morte D’Arthur, Lord of the Rings, and Watership Down. And if you transplant it from fantasy into something a little more earthbound, it’s Master and Commander, Saving Private Ryan, Guns of Navarone, and Apocalypse Now. If you then change the object of the characters’ quest, you find Rififi, The Usual Suspects, Ocean’s Eleven, Easy Rider, and Thelma & Louise.

So three different tales turn out to have multiple derivatives. Does that mean that when you boil it down there are only three different types of story? No. Beowulf, Alien, and Jaws are “monster” stories—but they’re also about individuals plunged into a new and terrifying world. In classic “quest” stories like Apocalypse Now or Finding Nemo the protagonists encounter both monsters and strange new worlds. Even “Brave New World” stories such as Gulliver’s Travels, Witness, and Legally Blonde fit all three definitions: The characters all have some kind of quest, and all have their own monsters to vanquish too. Though they are superficially different, they all share the same framework and the same story engine: All plunge their characters into a strange new world; all involve a quest to find a way out of it; and in whatever form they choose to take, in every story “monsters” are vanquished. All, at some level, too, have as their goal safety, security, completion, and the importance of home….

The Music of John Williams — Alex Ross in The New Yorker on the influence of the composer of Star Wars and many other film scores.

My favorite film of 1977 was not “Star Wars” but “Close Encounters of the Third Kind,” Steven Spielberg’s U.F.O. fantasia. Notwithstanding the fact that I was nine years old, I considered “Star Wars” a little childish. Also, the trash-compactor scene scared me. “Close Encounters,” on the other hand, drew me back to the theatre—the late, great K-B Cinema, in Washington, D.C.—five or six times. I irritated friends by insisting that it was better than “Star Wars,” and followed the box-office grosses in the forlorn hope that my favorite would surpass its rival.

“Close Encounters” still strikes me as an amazing creation—a one-off fusion of blockbuster spectacle with the disheveled realism of nineteen-seventies filmmaking. It has a wildness, a madness that is missing from Spielberg’s subsequent movies. The Disneyesque fireworks of the finale can’t hide the fact that the hero of the tale is abandoning his family in the grip of a monomaniacal obsession. Looking back, though, I’m sure that what really held me spellbound was the score, which, like that of “Star Wars,” was written by John Williams. I was a full-on classical-music nerd, playing the piano and trying to write my own compositions. I’d dabbled in Wagner, Bruckner, and Mahler, but knew nothing of twentieth-century music. “Close Encounters” offered, at the start, a seething mass of dissonant clusters, which abruptly coalesce into a bright, clipped C-major chord, somehow just as spooky as what came before. The “Star Wars” music had a familiar ring, but this kind of free, frenzied painting with sound was new to me, and has fascinated me ever since.

Now eighty-three years old, Williams remains a vital presence. “Star Wars: The Force Awakens,” his latest effort, is doing fairly good business, and he is at work on Spielberg’s next picture. He has scored all of the “Star Wars” movies, all of the Indiana Jones movies, several Harry Potters, “Jaws,” “E.T.,” “Superman,” “Jurassic Park,” and almost a hundred others. BoxOfficeMojo.com calculates that since 1975 Williams’s films have grossed around twenty billion dollars worldwide—and that leaves out the first seventeen years of his career. He has received forty-nine Oscar nominations, with a fiftieth almost certain for 2016. Perhaps his most crucial contribution is the role he has played in preserving the art of orchestral film music, which, in the early seventies, was losing ground to pop-song soundtracks. “Star Wars,” exuberantly blasted out by the London Symphony, made the orchestra seem essential again.

Williams’s wider influence on musical culture can’t be quantified, but it’s surely vast. The brilliant young composer Andrew Norman took up writing music after watching “Star Wars” on video, as William Robin notes in a Times profile. The conductor David Robertson, a disciple of Pierre Boulez and an unabashed Williams fan, told me that some current London Symphony players first became interested in their instruments after encountering “Star Wars.” Robertson, who regularly stages all-Williams concerts with the St. Louis Symphony, observed that professional musicians enjoy playing the scores because they are full of the kinds of intricacies and motivic connections that enliven the classic repertory. “He’s a man singularly fluent in the language of music,” Robertson said. “He’s very unassuming, very humble, but when he talks about music he can be the most interesting professor you’ve ever heard. He’s a deep listener, and that explains his ability to respond to film so acutely.”

It has long been fashionable to dismiss Williams as a mere pasticheur, who assembles scores from classical spare parts. Some have gone as far as to call him a plagiarist. A widely viewed YouTube video pairs the “Star Wars” main title with Erich Wolfgang Korngold’s music for “Kings Row,” a 1942 picture starring Ronald Reagan. Indeed, both share a fundamental pattern: a triplet figure, a rising fifth, a stepwise three-note descent. Also Korngoldesque are the glinting dissonances that affirm rather than undermine the diatonic harmony, as if putting floodlights on the chords.

To accuse Williams of plagiarism, however, brings to mind the famous retort made by Brahms when it was pointed out that the big tune in the finale of his First Symphony resembled Beethoven’s Ode to Joy: “Any ass can hear that.” Williams takes material from Korngold and uses it to forge something new. After the initial rising statement, the melodies go in quite different directions: Korngold’s winds downward to the tonic note, while Williams’s insists on the triplet rhythm and leaps up a minor seventh. I used to think that the latter gesture was taken from a passage in Bruckner’s Fourth Symphony, but the theme can’t have been stolen from two places simultaneously.

Although it’s fun to play tune detective, what makes these ideas indelible is the way they’re fleshed out, in harmony, rhythm, and orchestration. (To save time, Williams uses orchestrators, but his manuscripts arrive with almost all of the instrumentation spelled out.) We can all hum the trumpet line of the “Star Wars” main title, but the piece is more complicated than it seems. There’s a rhythmic quirk in the basic pattern of a triplet followed by two held notes: the first triplet falls on the fourth beat of the bar, while later ones fall on the first beat, with the second held note foreshortened. There are harmonic quirks, too. The opening fanfare is based on chains of fourths, adorning the initial B-flat-major triad with E-flats and A-flats. Those notes recur in the orchestral swirl around the trumpet theme. In the reprise, a bass line moves in contrary motion, further tweaking the chords above. All this interior activity creates dynamism. The march lunges forward with an irregular gait, rugged and ragged, like the Rebellion we see onscreen.

This is not to deny that Williams has a history of drawing heavily on established models. The Tatooine desert in “Star Wars” is a dead ringer for the steppes of Stravinsky’s “The Rite of Spring.” The “Mars” movement of Holst’s “Planets” frequently lurks behind menacing situations. Jeremy Orosz, in a recent academic paper, describes these gestures as “paraphrases”: rather than quoting outright, Williams “uses pre-existing material as a creative template to compose new music at a remarkable pace.” There’s another reason that “Star Wars” contains so many near-citations. At first, George Lucas had planned to fill the soundtrack with classical recordings, as Stanley Kubrick had done in “2001.” The temp track included Holst and Korngold. Williams, whom Lucas hired at Spielberg’s suggestion, acknowledged the director’s favorites while demonstrating the power of a freshly composed score. He seems to be saying: I can mimic anything you want, but you need a living voice.

In that delicate balancing act, Williams may have succeeded all too well. After “Star Wars,” he became a sound, a brand. The diversity and occasional daring of the composer’s earlier work—I’m thinking not only of “Close Encounters” but also of Robert Altman’s “Images” and “The Long Goodbye” and of Brian De Palma’s “The Fury”—subsided over time. Williams invariably achieves a level of craftsmanship that no other living Hollywood composer can match; his fundamental skill is equally evident in his sizable catalogue of concert-hall scores. Yet he’s been boxed in by the billions that his music has helped to earn. He has become integral to a populist economy on which thousands of careers depend.

Doonesbury — No harm no foul.

Thursday, December 31, 2015

Looking Back/Looking Forward

It’s time for my annual recap and prognostication for the past year and the year coming up.  Let’s see how I did a year ago.

– Now that we have a Republican House and Senate and a president who isn’t running for re-election, get out the popcorn, and I mean the good stuff.  The GOP will try to do everything they can to destroy the legacy of Barack Obama, but they will end up looking even more foolish, petulant, infantile, and borderline nuts than they have for the last two years, and that’s saying something.  Repeals of Obamacare, Dodd-Frank, and recharged attempts to investigate Benghazi!, the IRS, and the VA will be like the three rings of Barnum & Bailey, all of which President Obama will gleefully veto.  As Zandar noted at Balloon Juice, “Over/under on when a Republican declares on FOX that Obama’s veto is ‘illegal’: Feb 8.”

They did all that except actually pass the bills for President Obama to veto.  Instead they putsched John Boehner and replaced him with Paul Ryan who will more than likely face the same nutsery in 2016.

– Hillary Clinton will announce that she is running for president by March 2015 at the latest.  Elizabeth Warren will not run, but Bernie Sanders, the Gene McCarthy of this generation, will announce as an independent and become a frequent guest on MSNBC.  Jeb Bush, after “actively exploring” a run in 2016, will announce that he is running and quickly fade to the single digits when the GOP base gets a taste of his views on immigration and Common Core.  He may be popular in Republican polls, but those people don’t vote in primaries.  The frontrunners for the Iowa caucuses a year from now will be Rand Paul and Chris Christie.

Nailed that one except for the last sentence.  But to be fair I don’t think anyone had Donald Trump on their betting sheets a year ago, and if they did, it was more for the entertainment value than serious consideration as a Republican candidate.

– The war in Afghanistan is officially over as of December 2014, but there will be U.S. troops actively engaged in combat in what is left of Syria and Iraq in 2015.

More’s the pity.

– The U.S. economy will continue to improve at a galloping pace.  The Dow will hit 19,000 at some point in 2015 and oil will continue to flood the market, keeping the price below $60 a barrel and gasoline will sell for under $2 a gallon, and finally wages will start to catch up with the improving economy.  I blame Obama.

Except for my overly optimistic prediction on the Dow, this pretty much came true, even down to the price for gasoline: I paid $1.99 last night in Miami, which is not the lowest-priced city in the country.  President Obama is not getting any credit whatsoever for helping the economy improve, which he should, but then the Republicans never blamed Bush for crashing it in the first place.

– The Supreme Court will rule that bans on same-sex marriage violate the Constitution.  They will also narrowly uphold Obamacare again.

Happy dance, happy dance.

– The embargo against Cuba will end on a narrow vote in the Senate thanks to the overwhelming influence of Republican donors who see 11 million Cubans starving for Dunkin Donuts and car parts and don’t care what a bunch of domino-playing dreamers on Calle Ocho think.

The embargo is still in place as a matter of law, but for all intents and purposes, it is crumbling.  U.S. airlines and cruise ships are setting schedules, direct mail service is resuming, and travel there has become routine.

– The Tigers will win their division again.

Oh, shut up.

– We will lose the requisite number of celebrities and friends as life goes on. As I always say, it’s important to cherish them while they are with us.

I hold them in the Light.

– I technically retired on September 1, 2014, but my last day at work will be August 30, 2019.  (It’s complicated.)  I’m planning a return trip to Stratford this summer — more on that later — and I’ll get more plays produced.  I will finish at least one novel in 2015.

This was a productive year for me on the writing front: several plays of mine were done either in full stage productions or readings, and more are on the way.  No, I did not finish a novel yet.

Now for the predictions for 2016:

  • Hillary Clinton will be the next President of the United States.  I have no idea who she will beat; I don’t think the Republicans know, either, but she will win, and I’m going to go out on a limb here and say that it will be a decisive win.  The GOP will blame everybody else and become even more cranky, self-injuring, and irresponsible.
  • The Democrats down-ticket will do better than expected by taking back the Senate and narrowing their gap in the House.  This will be achieved by the number of voters who will turn out to vote for them in order to hold off the GOP’s attempt to turn the country back over to the control of white Christian males.
  • The economy will continue to improve; maybe this is the year the Dow will hit 19,000.  The limiting factor will be how the rest of the world, mainly China, deals with their economic bubble.  I think a lot of the economic news will be based on the outcome of the U.S. election and the reaction to it.  If by some horrifying chance Donald Trump wins, all bets are off.  Economists and world markets like stability and sanity, and turning the U.S. over to a guy who acts like a used car hustler crossed with a casino pit boss will not instill confidence.
  • ISIS, which barely registered on the radar as an existential threat to the U.S. and the west a year ago, will be contained.  There will not be a large American troop presence in Syria and Iraq thanks in part to the response by the countries that themselves are being invaded by ISIS.  Finally.
  • Refugees will still be pouring out of the Middle East, putting the strain on countries that have taken them in.  It will be a test of both infrastructure and moral obligation, and some, such as Canada, will set the example of how to be humane.
  • Maybe this will be the year that Fidel Castro finally takes a dirt nap.
  • The Supreme Court will narrowly uphold affirmative action but leave room for gutting it later on.  They will also narrowly rule against further restrictions on reproductive rights.  And I am going out on a limb by predicting that President Obama will get to choose at least one more new justice for the Court, an appointment that will languish in the Senate until after the election.
  • Violence against our fellow citizens such as mass shootings will continue.  The difference now is that we have become numb to them, and in an election year, expecting any meaningful change to the gun laws or the mindset is right up there with flying pigs over downtown Miami.
  • Marriage equality will gain acceptance as it fades from the headlines, but the LGBTQ community’s next front will be anti-discrimination battles for jobs and housing.  It’s not over yet, honey.
  • We’re going to see more wild weather patterns but none of it will convince the hard-core deniers that it’s either really happening or that there’s anything we can do about it.
  • The Tigers will not win the division in 2016.  (Caution: reverse psychology at play.)
  • On a personal level, this could be a break-out year for my writing and play production.  I don’t say that every year.
  • A year from today I will write this same post and review what I got right and what I didn’t.  But stick around and see how I do on a daily basis.

Okay, it’s your turn.  What do you see for 2016?

Thursday, October 15, 2015

My First Clue

Earlier this week Playboy magazine announced that it would no longer print pictures of nude women.  This is based on the theory that if you want to see them, you have a lot of choices on the internet.  And you won’t have to smuggle them into the garage attic to look at them with a flashlight.

For boys of a certain age, Playboy was a rite of passage.  Fifty years ago it was how thirteen-year-old boys got their first glimpse of undressed women.  I remember a friend of mine showing me a rather rumpled copy of the magazine with all the sophisticated ads for liquor and rich-guy toys, and then there was the centerfold.  Zowie.

I tried to show enthusiasm, but when I finally saw it my reaction was “enh.”  It did nothing for me, and I couldn’t help but wonder what all the fuss was about.  I didn’t really process it then, but I think that’s about the time that I was beginning to be aware of the fact that, at least in terms of responding to sexual stimuli, I am gay.

So, thanks, Playmate of the Month for November 1965.  It would be another eleven years before I actually came out, but you helped get the journey going.

Sunday, August 2, 2015

Sunday Reading

“A Dream Undone” — From the New York Times magazine, Jim Rutenberg reports on the efforts to bring back Jim Crow.

On the morning of his wedding, in 1956, Henry Frye realized that he had a few hours to spare before the afternoon ceremony. He was staying at his parents’ house in Ellerbe, N.C.; the ceremony would take place 75 miles away, in Greensboro, the hometown of his fiancée; and the drive wouldn’t take long. Frye, who had always been practical, had a practical thought: Now might be a good time to finally register to vote. He was 24 and had just returned from Korea, where he served as an Air Force officer, but he was also a black man in the American South, so he wasn’t entirely surprised when his efforts at the registrar’s office were blocked.

Adopting a tactic common in the Jim Crow South, the registrar subjected Frye to what election officials called a literacy test. In 1900, North Carolina voters amended the state’s Constitution to require that all new voters “be able to read and write any section of the Constitution in the English language,” but for decades some registrars had been applying that already broad mandate even more aggressively, targeting perfectly literate black registrants with arbitrary and obscure queries, like which president served when or who had the ultimate power to adjourn Congress. “I said, ‘Well, I don’t know why are you asking me all of these questions,’ ” Frye, now 83, recalled. “We went around and around, and he said, ‘Are you going to answer these questions?’ and I said, ‘No, I’m not going to try.’ And he said, ‘Well, then, you’re not going to register today.’ ”

Sitting with me on the enclosed porch of his red-brick ranch house in Greensboro, drinking his wife’s sweet tea, Frye could joke about the exchange now, but at the time it left him upset and determined. When he met Shirley at the altar, the first thing he said was: “You know they wouldn’t let me register?”

“Can we talk about this later?” she replied.

After a few weeks, Frye drove over to the Board of Elections in Rockingham, the county seat, to complain. An official told him to go back and try again. This time a different registrar, after asking if he was the fellow who had gone over to the election board, handed him a paragraph to copy from the Constitution. He copied it, and with that, he became a voter.

But in the American South in 1956, not every would-be black voter was an Air Force officer with the wherewithal to call on the local election board; for decades, most had found it effectively impossible to attain the most elemental rights of citizenship. Only about one-quarter of eligible black voters in the South were registered that year, according to the limited records available. By 1959, when Frye went on to become one of the first black graduates of the University of North Carolina law school, that number had changed little. When Frye became a legal adviser to the students running the antisegregation sit-ins at the Greensboro Woolworth’s in 1960, the number remained roughly the same. And when Frye became a deputy United States attorney in the Kennedy administration, it had grown only slightly. By law, the franchise extended to black voters; in practice, it often did not.

What changed this state of affairs was the passage, 50 years ago this month, of the Voting Rights Act. Signed on Aug. 6, 1965, it was meant to correct “a clear and simple wrong,” as Lyndon Johnson said. “Millions of Americans are denied the right to vote because of their color. This law will ensure them the right to vote.” It eliminated literacy tests and other Jim Crow tactics, and — in a key provision called Section 5 — required North Carolina and six other states with histories of black disenfranchisement to submit any future change in statewide voting law, no matter how small, for approval by federal authorities in Washington. No longer would the states be able to invent clever new ways to suppress the vote. Johnson called the legislation “one of the most monumental laws in the entire history of American freedom,” and not without justification. By 1968, just three years after the Voting Rights Act became law, black registration had increased substantially across the South, to 62 percent. Frye himself became a beneficiary of the act that same year when, after a close election, he became the first black state representative to serve in the North Carolina General Assembly since Reconstruction.

In the decades that followed, Frye and hundreds of other new black legislators built on the promise of the Voting Rights Act, not just easing access to the ballot but finding ways to actively encourage voting, with new state laws allowing people to register at the Department of Motor Vehicles and public-assistance offices; to register and vote on the same day; to have ballots count even when filed in the wrong precinct; to vote by mail; and, perhaps most significant, to vote weeks before Election Day. All of those advances were protected by the Voting Rights Act, and they helped black registration increase steadily. In 2008, for the first time, black turnout was nearly equal to white turnout, and Barack Obama was elected the nation’s first black president.

Since then, however, the legal trend has abruptly reversed. In 2010, Republicans flipped control of 11 state legislatures and, raising the specter of voter fraud, began undoing much of the work of Frye and subsequent generations of state legislators. They rolled back early voting, eliminated same-day registration, disqualified ballots filed outside home precincts and created new demands for photo ID at polling places. In 2013, the Supreme Court, in the case of Shelby County v. Holder, directly countermanded the Section 5 authority of the Justice Department to dispute any of these changes in the states Section 5 covered. Chief Justice John Roberts Jr., writing for the majority, declared that the Voting Rights Act had done its job, and it was time to move on. Republican state legislators proceeded with a new round of even more restrictive voting laws.

All of these seemingly sudden changes were a result of a little-known part of the American civil rights story. It involves a largely Republican countermovement of ideologues and partisan operatives who, from the moment the Voting Rights Act became law, methodically set out to undercut or dismantle its most important requirements. The story of that decades-long battle over the iconic law’s tenets and effects has rarely been told, but in July many of its veteran warriors met in a North Carolina courthouse to argue the legality of a new state voting law that the Brennan Center for Justice at the New York University Law School has called one of the “most restrictive since the Jim Crow era.” The decision, which is expected later this year, could determine whether the civil rights movement’s signature achievement is still justified 50 years after its signing, or if the movement itself is finished.

Upping the Outrage — James Hamblin in The Atlantic on how the internet fuels outrage over a story and then moves on.

Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.

But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just like usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.

Palmer didn’t just kill a lion. He killed an especially good-looking and “beloved” lion in an ostentatious and gruesome fashion that culminated in decapitation. To make things worse, that lion had a human name. To make things worse still, that name was Cecil.

[…]

The Internet has served to facilitate outrage, as the Internet does: the hotter the better. And because the case is so visceral and bipartisan in its opposition to Palmer’s act, few people stepped in to suggest that the fury, the people tweeting his home address, might be too much. That argument wins no outrage points.

Instead, the people who hadn’t jumped on the Cecil-outrage bandwagon jumped on the superiority-outrage bandwagon. It’s a bandwagon of outrage one-upmanship, and it’s just as rewarding as the original outrage bandwagon. Anyone can play, like this:

It’s fine to be outraged about one lion, but what about all of the other lions who are hunted and killed every year?  There are 250 Cecils killed annually across Africa as trophies, and that’s what you should really be outraged by. But good job caring now.

Actually, what about all of the animals? All of the cattle and fish and brilliant pigs who are systematically slaughtered for human consumption every day? Were you eating a hot dog when you posted that thing about Cecil on Facebook? Anyone who is not vegan is no better than the dentist Walter Palmer. That is what you really should be outraged by.

Actually, you only care about Zimbabwe when a lion is killed? Great of you. Killing animals is part of the circle of life, but you know what’s not? Human trafficking. People are bought and sold as slaves today all over the world. Why are you talking about one aged jungle cat in a place where the relationship between impoverished pastoralist communities and wealthy foreign tourists is more complicated than you actually understand?

And I’m glad you’re so concerned about human trafficking, but there will be no humans at all if we don’t do something about climate change. Reliance on fossil fuels and industrialized farming is the real problem, and that’s what you should be outraged by. You don’t know what to care about. I know what to care about.

The Internet launders outrage and returns it to us as validation, in the form of likes and stars and hearts. The greatest return comes from a strong and superior point of view, on high moral ground. And there is, fortunately and unfortunately, always higher moral ground. Even when a dentist kills an adorable lion, and everyone is upset about it, there’s better outrage ground to be won. The most widely accepted hierarchy of outrage seems to be: Single animal injured < single animal killed < multiple animals killed < systematic killing of animals < systematic oppression/torture of people < systematic killing of humans < end of all life due to uninhabitable planet.

To say that there’s a more important issue in the world is always true, except in the case of climate change ending all life, both human and animal. So it’s meaningless, even if it’s fun, to go around one-upping people’s outrage. Try it. Someone will express legitimate concern over something, and all you have to do is say there are more important things to be concerned about. All you have to do is use the phrase “spare me” and then say something about global warming. You can literally write, “My outrage is more legit than your outrage! Ahhh!”

Jon Stewart, Patriot — An appreciation in The New Yorker by David Remnick.

Political life in America never ceases to astonish. Take last week’s pronouncements from the Republican Presidential field. Please. Mike Huckabee predicted that President Obama’s seven-nation agreement limiting Iran’s nuclear capabilities “will take the Israelis and march them to the door of the oven.” Ted Cruz anointed the American President “the world’s leading financier of radical Islamic terrorism.” Marco Rubio tweeted, “Look at all this outrage over a dead lion, but where is all the outrage over the planned parenthood dead babies.” And the (face it) current front-runner, the halfway hirsute hotelier Donald Trump, having insulted the bulk of his (count ’em) sixteen major rivals plus (countless) millions of citizens of the (according to him) not-so-hot nation he proposes to lead, announced via social media that in this week’s Fox News debate he plans “to be very nice & highly respectful of the other candidates.” Really, now. Who’s writing this stuff? Jon Stewart?

Over the decades, our country has been lucky in many things, not least in the subversive comic spirits who, in varying ways, employ a joy buzzer, a whoopee cushion, and a fun-house mirror to knock the self-regard out of an endless parade of fatuous pols. Thomas Nast drew caricatures so devastating that they roiled the ample guts of our town’s Boss, William Marcy Tweed. Will Rogers’s homespun barbs humbled the devious of the early twentieth century. Mort Sahl, the Eisenhower-era comic whose prop was a rolled-up newspaper, used conventional one-liners to wage radical battle: “I’ve arranged with my executor to be buried in Chicago, because when I die I want to still remain politically active.” Later, Dick Gregory, Richard Pryor, and Joan Rivers continued to draw comic sustenance from what Philip Roth called “the indigenous American berserk.”

Four nights a week for sixteen years, Jon Stewart, the host and impresario of Comedy Central’s “The Daily Show,” has taken to the air to expose our civic bizarreries. He has been heroic and persistent. Blasted into orbit by a trumped-up (if you will) impeachment and a stolen Presidential election, and then rocketing through the war in Iraq and right up to the current electoral circus, with its commodious clown car teeming with would-be Commanders-in-Chief, Stewart has lasered away the layers of hypocrisy in politics and in the media. On any given night, a quick montage of absurdist video clips culled from cable or network news followed by Stewart’s vaudeville reactions can be ten times as deflating to the self-regard of the powerful as any solemn editorial—and twice as illuminating as the purportedly non-fake news that provides his fuel.

[…]

Stewart set out to be a working comedian, and he ended up an invaluable patriot. But the berserk never stops. His successor, Trevor Noah, will not lack for material. As Stewart put it wryly on one of his last nights on the air, “As I wind down my time here, I leave this show knowing that most of the world’s problems have been solved by us, ‘The Daily Show.’ But sadly there are still some dark corners that our broom of justice has not reached yet.”

Doonesbury — Amateur Night.

Sunday, July 19, 2015

Sunday Reading

Obama and History — Josh Marshall on what a legacy means to President Obama.

We all remember that week last month when the country seemed to be marching with history. The Court upheld the Affordable Care Act against what is likely its last serious legal challenge, effectively embedding it deeply into the structure of American social policy. The Court then (in what was unfortunately a weakly argued majority decision) made marriage equality the law of the land nationwide. Then on the heels of these events came the President’s speech (transcript here) in Charleston, South Carolina – actually a eulogy for Clementa Pinckney, one of the victims of the Emmanuel Church massacre on June 17 but in fact a commemoration and meditation on the meaning of the whole event. (James Fallows’ is one of the best appreciations and treatments of it.)

[…]

When I look at Obama I don’t see a President desperately trying to cram legacy achievements into the declining months of his presidency. I see achievements coming to fruition that were usually years in the making but often seemed errant or quixotic and uncertain in their outcome. This is what for many was so bracing about the end of June. This has been a long long seven years. What seemed like an uncertain list of achievements, long on promise but hacked apart by mid-term election reverses and Obama’s sometimes over-desire for accommodation, suddenly appeared closer to profound, like a novel or a play which seems scattered or unresolved until all the pieces fall into place, clearly planned all along, at the end.

Whatever you think of this Iran agreement, it is not only the product of years of work but is core to the foreign policy vision Obama brought with him to the presidency. It’s as core to the goals he entered the presidency with as anything that has happened in recent weeks. He has it in view; his political opponents will be very hard pressed to block him. And he is pushing ahead to get it done.

None of this is to say that there isn’t a clear and palpable change in the President’s affect and demeanor. His presidency is coming to an end and his range of action will diminish further as the presidential election moves to center stage next year. As the budget deficit has receded from public view, Obama’s fucks deficit has come to the forefront. After six and a half years in office, he may have a small stockpile of fucks left. But he has none left to give. He is increasingly indifferent to the complaints and anger of his political foes and focused on what he can do on his own or with reliable political supporters. You can see it too in the more frequent lean-in-on-the-lectern moments during press conferences and speeches. He’s truly out of fucks to give. But it’s more a product of focus on finishing aspects of his presidency in motion for years than of cramming at the end. For most of his supporters, this was the Obama they always wanted. And he’s giving it to them. What comes off to reporters as testiness is more like the indifference of someone who’s got work to do and is intent on doing it.

Scout’s Honor — Dale Russakoff in The New Yorker reconnects with the woman who played Scout in the film of To Kill a Mockingbird as she looks back at her role on and off the screen.

After playing Scout in the movie of “To Kill a Mockingbird,” in 1962, Mary Badham endured a rude homecoming when she returned to Birmingham, Alabama. Having just spent six months in California with her mother, living in a racially integrated apartment complex, she found herself suddenly an outsider back home. “The attitude was ‘Lord knows what she might’ve learned out there!’ ” Badham recalled the other day. “Some families, I’d been welcome in their homes, and after the film, I was no longer welcome.”

Like the adult Scout in Harper Lee’s newly published “Go Set a Watchman,” Badham left the South during the era of segregation, and returned to find that people she once considered unequivocally good in fact bore the markings of that evil system. In the case of Scout, as revealed with alarm by reviewers of “Watchman,” it’s her sainted father, Atticus, who emerges as an overt racist, inveighing against threats to segregation from the U.S. Supreme Court and local lawyers for the N.A.A.C.P. Badham similarly discovered a mean streak in family friends who didn’t tolerate her breaking of Southern white taboos. “I was ostracized and it was painful,” said the adult Badham.

This past Tuesday night, nine hundred people, a sellout crowd, came to hear Badham read from “Go Set a Watchman” and “To Kill a Mockingbird” at the 92nd Street Y. Harper Lee herself made New York City—specifically the Upper East Side neighborhood around the Y—her second home for more than fifty years. These were her fans, and they clearly had come looking for something to celebrate. When Badham was introduced, they whooped and cheered.

Badham, who was nine when she played the iconic six-year-old and is now sixty-two, was completely overcome. Today a furniture-restorer in rural Virginia, she clasped her hands, raised them in celebration, then took a bow, and finally laughed until she almost cried. She read a brief excerpt from “Mockingbird,” and the first chapter of “Watchman.” Her voice is slow and lilting, quintessentially Southern. Alternately funny and poignant, Badham’s channelling of Jean Louise Finch—in “Watchman,” she has mostly shed her famous nickname—elicited frequent laughter.

In the Q. & A. that followed, moderator Mary Murphy, the director of the documentary “Harper Lee: From Mockingbird to Watchman,” asked Badham if she was surprised by the evolution of Atticus. She was not. In the Alabama she knew, it was not unheard of for a white man like him to righteously defend a black man like Tom Robinson against an unjustified charge of rape, and at the same time believe, as Atticus says in “Watchman,” that black people were “backward,” not “ready” to exercise their full civil rights. She heard all that and much more growing up in Birmingham. We all did.

Could Florida Democrats Blow It Again? — David A. Graham in The Atlantic on the fight brewing for the Senate seat.

The road to a Democratic majority in the Senate is a narrow one, and it runs through Florida. Marco Rubio is running for president, so he can’t run for reelection, freeing up his seat—and in a swing state like Florida, with the more Democratic-friendly electorate of a presidential cycle, there’s a good chance Democrats can win.

If they have the right candidate, of course.

That’s where Alan Grayson comes in. Democrats have had a rough run in Florida recently. In 2010, their candidate was walloped in a three-way Senate race that Rubio won—Governor Charlie Crist ran as an independent after losing the Republican primary; Democrat Kendrick Meek finished a distant third. That same year, Alex Sink lost a close race for governor to Rick Scott. In early 2014, Sink lost a special election for the seat of deceased Representative C. W. “Bill” Young. In fall 2014, Crist—by now a Democrat—lost the governor’s race to Scott, even though the incumbent was strongly disliked.

The remedy, state and national Democrats believe, is Patrick Murphy, a young two-term representative who reached office after defeating Representative Allen West—as fiery and controversial a Republican as Grayson is a Democrat—in 2012. Murphy is a notably moderate Democrat (he was previously a Republican), but he’s a polished candidate who showed he could win in a closely divided district. Party leaders marked him for great things. Early polls show him leading the top Republican candidates.

Then Grayson announced his decision to run. He’s the famously (or infamously) loudmouthed U.S. representative from Orlando—the guy who, during the healthcare-reform debate said the Republican health plan was “Don’t get sick, and if you do get sick, die quickly.” Grayson has a long history of similarly inflammatory or hotheaded comments, which he says is evidence that he’s willing to fight for his principles. The wealthy liberal is serving his third term, but it’s nonconsecutive—elected in 2008, he was defeated in 2010 and then returned to Congress in the 2012 election.

How big a threat to Murphy is Grayson? That’s a tough call. There’s not a great deal of good polling in the race. Several earlier polls showed a close race. A poll in early July from Gravis Marketing showed Grayson leading Murphy by an astonishing 63-19 margin. It’s probably best not to put too much stock in that result—it’s early, it’s an outlier, and Gravis’s track record is, um, not sterling.

But Grayson has one big advantage: He’s willing to say anything. In particular, he’ll deliver inflammatory quotes left and right about anything and anyone, allowing him to effectively tap into the Democratic id. Or in his own, typically modest words, “Voters will crawl naked over hot coals to vote for me. And that’s something that no other candidate in either party can say.” That also means he has strong fundraising potential from the grassroots, though he’s also independently wealthy. His act might play well in a Democratic primary, but could he win a general election in a purple state? The Democratic Senatorial Campaign Committee seems unconvinced. The DSCC praised Murphy in a statement when Grayson officially entered the race last week, but didn’t even mention Grayson’s name.

Doonesbury — Hotter than ever.

Sunday, July 5, 2015

Sunday Reading

What Do You Know? — Eric Liu in The Atlantic on the knowledge gap in America.

Is the culture war over?

That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.

And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.

Yet from another perspective, much of this angst can be interpreted as part of a noisy but inexorable endgame: the end of white supremacy. From this vantage point, Americanness and whiteness are fitfully, achingly, but finally becoming delinked—and like it or not, over the course of this generation, Americans are all going to have to learn a new way to be American.

Imagine that this is true; that this decades-long war is about to give way to something else. The question then arises: What? What is the story of “us” when “us” is no longer by default “white”? The answer, of course, will depend on how aware Americans are of what they are, of what their culture already (and always) has been. And that awareness demands a new kind of mirror.

It helps first to consider some recent history. In 1987, a well-regarded professor of English at the University of Virginia named E.D. Hirsch Jr. published a slim volume called Cultural Literacy. Most of the book was an argument—textured and subtle, not overtly polemical—about why nations need a common cultural vocabulary and why public schools should teach it and, indeed, think of their very reason for being as the teaching of that vocabulary.

At the end of the book Hirsch and two colleagues tacked on an appendix: an unannotated list of about 5,000 names, phrases, dates, and concepts that, in their view, “every American needs to know.” The rest (to use a phrase that probably should’ve been on the list) was history.

The appendix became a sensation and propelled the book to the top of the best-seller list. Hirsch became that rare phenomenon: a celebrity intellectual. His list was debated in every serious publication and in elite circles. But he also was profiled in People magazine and cited by pundits who would never read the book.

Hirsch’s list had arrived at a ripe moment of national anxiety, when critics like Allan Bloom and Arthur Schlesinger Jr. were bemoaning the “closing of the American mind” and “the disuniting of America”; when multicultural curricula had arrived in schools, prompting challenges to the Western canon and leading Saul Bellow to ask mockingly who the Tolstoy of the Zulus was, or the Proust of the Papuans; a time when Bill Bennett first rang alarms about the “dumbing-down of America.”

The culture wars were on. Into them ambled Hirsch, with his high credentials, tweedy profile, reasoned arguments, and addictively debatable list. The thing about the list, though, was that it was—by design—heavy on the deeds and words of the “dead white males” who had formed the foundations of American culture but who had by then begun to fall out of academic fashion. (From a page drawn at random: Cotton Mather, Andrew Mellon, Herman Melville).

Conservatives thus embraced Hirsch eagerly and breathlessly. He was a stout defender of the patrimony. Liberals attacked him with equal vigor. He was retrograde, Eurocentric, racist, sexist. His list was a last gasp (or was it a fierce counterattack?) by a fading (or was it resurgent?) white establishment.

Lost in all the crossfire, however, were two facts: First, Hirsch, a lifelong Democrat who considered himself progressive, believed his enterprise to be in service of social justice and equality. Cultural illiteracy, he argued, is most common among the poor and power-illiterate, and compounds both their poverty and powerlessness. Second: He was right.

A generation of hindsight now enables Americans to see that it is indeed necessary for a nation as far-flung and entropic as the United States, one where rising economic inequality begets worsening civic inequality, to cultivate continuously a shared cultural core. A vocabulary. A set of shared referents and symbols…

Doonesbury — Final Curtain.

Sunday, March 1, 2015

Sunday Reading

Assassination in Moscow — Matt Schiavenza in The Atlantic on the murder of a Putin opponent.

Hours after Boris Nemtsov was slain on Friday night near the Kremlin, Russian president Vladimir Putin vowed to seek justice: “Everything will be done so that the organizers and perpetrators of a vile and cynical murder get the punishment they deserve,” he said in a condolence message to the 55-year-old Nemtsov’s mother. Whether Putin is being sincere is something only he and his closest advisors know. But Russia’s recent history inspires little confidence that Nemtsov’s killers, whoever they are, will be brought to justice.

Nemtsov was a high-profile politician, having served as a deputy prime minister and, more recently, as a regional legislator. He was such an outspoken critic of Putin in those roles that he openly feared for his life. Along with his colleague Leonid Martynyuk, Nemtsov published a report detailing the immense corruption surrounding the 2014 Winter Olympics, which were hosted in the Russian resort town of Sochi. Nemtsov also spoke out about Russia’s seizure of Crimea last February and subsequent support for pro-Kremlin rebels in eastern Ukraine. But Nemtsov is hardly the first critic of Putin to lose his life to premeditated murder. Dozens of journalists have been killed since the Russian president first assumed office in 2000. Few of those responsible have been brought to justice—a point Nemtsov himself was well aware of. “The murderers understand that killing journalists is not a problem,” he told Foreign Policy‘s Christian Caryl in a 2010 interview.

The assassination of a well-known politician, however, is somewhat more unusual. In an attempt to preempt public outrage, the Kremlin has already formed a committee to investigate the causes of Nemtsov’s death. One possibility they cited was that Nemtsov’s commentary about the satirical publication Charlie Hebdo, whose offices suffered a murderous assault in January, made him a target of Islamists. The committee also mentioned Nemtsov’s controversial position on Ukraine, and, most spectacularly, suggested that he was killed by fellow opponents of Putin in an attempt to rally opposition to the Russian president.

Putin’s critics have not had it easy in Russia. A major economic slowdown triggered by falling oil prices has not diminished the president’s popularity. The country’s liberal opposition—epitomized by Nemtsov and the jailed politician Alexei Navalny—is weak and marginalized, and their positions on Ukraine, Putin, and the Sochi Olympics are not widely shared among ordinary Russians.

Nevertheless, the Kremlin appears wary of turning Nemtsov into a martyr. On Sunday, he was scheduled to appear at an anti-Putin rally in Moscow. But when the organizers asked to turn the rally into a memorial for Nemtsov, Russian authorities denied the request. Even still, protests have done little to challenge Putin’s grip on power—something that Nemtsov himself acknowledged in a recent interview published in Newsweek‘s Polish edition:

[The liberals’] idea is the one of a democratic and open Russia. A country which is not applying bandit methods to its own citizens and neighbors. But, as I mentioned, Russian fascism is a hybrid. And hybrids are extremely resistant.

As the world mourns his death, Nemtsov’s vision seems very far from being realized.

Early Bird vs. Night Owl — Maria Konnikova in The New Yorker on the morals dictated by our sleep pattern.

The idea of the virtuous early bird goes back at least to Aristotle, who wrote, in his Economics, that “Rising before daylight is … to be commended; it is a healthy habit.” Benjamin Franklin, of course, framed the same sentiment in catchier terms: “Early to Bed, and early to rise, makes a Man healthy, wealthy and wise.” More recently, there has been a push for ever earlier work starts, conference calls, and breakfast meetings, and a steady stream of advice to leave Twitter and Facebook to the afternoon and spend the morning getting real things done. And there may be some truth to the idea: a 1998 study in the Journal of Personality and Social Psychology suggests that we become more passive as the day wears on. You should do the most important thing first, the theory goes, because, well, you won’t be able to do it quite as well later on.

In last January’s issue of Psychological Science, Maryam Kouchaki and Isaac Smith took that theory even further, proposing what they called the morning morality effect, which posits that people behave better earlier in the day. Their research caught the attention of Sunita Sah, a behavioral scientist at Georgetown University and a professed night owl. For the previous five years, Sah had been studying how different situations influence ethical behavior. “You always hear these sweeping statements: morning is saintly, evening is bad; early to bed, early to rise,” she told me recently. A former physician, she found it plausible that something with such profound health consequences as time of day might also have a moral dimension. But she wondered how strong the effect really was. Were people like her—principled late risers—the exception to the rule? To test the limits of Kouchaki and Smith’s findings, Sah and her colleagues began by looking at the underlying biology.

Our sleep patterns are governed by circadian rhythms, our bodies’ response to changes in light and dark in a typical day. The rhythms are slightly different for every person, which is why our energy levels ebb and flow in ways that are unique to us. This internal clock determines what is called our chronotype—whether we are morning people, night people, or somewhere in between. Chronotypes are relatively stable, though they have been known to shift with age. Children and older adults generally prefer mornings; adolescents and young adults prefer evenings. Figuring out where you fall is simple: spend a few weeks going to bed when you feel tired and waking up without an alarm clock. A quicker alternative is the Horne-Ostberg questionnaire, which presents various scenarios—a difficult exam, twice-weekly exercise with a friend—and determines your chronotype on the basis of what time of day you’d feel most up to confronting them.

Chronotype, of course, doesn’t control wakefulness all on its own. There is also what is known as homeostatic sleep drive. The longer we are awake, irrespective of where we are in our established circadian rhythms, the more fatigue exerts its pressure on us. In morning people, sleep drive and chronotype tend to be aligned. Their internal clocks are pretty well synchronized with their over-all energy levels. For night owls, however, things get complicated. When the sun comes up, the light resets their circadian clocks, telling them to wake up. But, because of their chronotypes, they don’t have much energy and they want to go back to sleep. At night, the reverse happens: one system is telling them to sleep and another is telling them to remain awake. About forty per cent of people fall into this latter category.

The Right to Get Weird — Marin Cogan reports in New York magazine on the sideshows at CPAC.

“It’s hard to punch through here,” Travis Brown, a writer for the anti-tax website How Money Walks, is saying. Standing in front of us, a towering silver robot with glowing red LED lights in his eyes and chest plate takes a clunky step forward. “We need to be creative. There’s so much going on.” The robot takes another step forward. A college-aged girl walks by asking who he is.

“Govtron is a robot built and fueled by government inefficiency,” one of the robot’s handlers says. “So he’s armored with pages of the Obamacare bill, he’s got a red tape cannon, he’s stomping on some Gadsden snakes as we speak, stomping on your freedom. We’re pitting this super villain against the How Money Walks Reformers, which includes Captain America and Iron Man, as well as Iron Patriot.” Behind him, a man in a Captain America costume gives a halfhearted wave. “That is so funny!” the girl says. “And what is his name? Goovtron?”

Govtron is the subject of a short comic book Brown authored specially for CPAC, the annual confab hosted by the American Conservative Union. He’s there to direct attention to their website, and right now even he’s struggling to stand out. A few booths away, a limited government youth group called Turning Point USA is blasting Sia while students mill about, tagging their “Big Government Sucks” signing wall. “We’re working around this theme, big government sucks,” says Marko Sukovic, the group’s Midwest field director. “It’s probably the most relevant phrase any young person can relate to on college campuses.”

Behind the wall, on which someone has scrawled, “I love our freedom and dislike big politics,” a man in a “Muhammad is a homo” T-shirt is giving an interview in front of an audience of empty chairs. Three aisles down, at the end of the Gaylord Hotel’s massive expo center, the American Atheists are posted up at a booth with the banner “Conservative Atheists Matter!”

“There are millions and millions of atheists who would be voting Republican if the Republicans would just let them!” David Silverman, the group’s president, says, eyes as big as saucers. “Just ask for our vote! Tell us that we count! Tell us that we matter, once! It’s never happened. Not in my lifetime.” The fresh-faced youth manning the World Congress of Families booth beside them does not know how to deal with the atheists next door. “It’s urrrgh … ” he mumbles until an adult steps in to cut him off.

In a big ballroom upstairs, Ted Cruz, Rand Paul, and Marco Rubio will practice their nascent stump speeches to adoring crowds, and Jeb Bush and Chris Christie — the more moderate and less favored potential candidates in the CPAC straw poll — will get grilled by conservative luminaries like Laura Ingraham and Sean Hannity. With the exception of some awkward jokes, and Scott Walker’s awkward reply to a question about how he would take on ISIS (“We need a leader with that kind of confidence. If I can take on 100,000 protesters, I can do the same across the world,” he tells a questioner), most of the conference’s events are too scripted to be memorable.

Doonesbury — Planned disruption.  (You may have to scroll down the page to actually see the comic.)

Sunday, January 4, 2015

Sunday Reading

Re-reading Huckleberry Finn — Andrew Levy in Salon discusses how the Mark Twain novel speaks more to our time than it does to 19th Century America.

For anyone who wants to try to unravel the tangled knot that ties modern Americans to their past, Mark Twain’s Adventures of Huckleberry Finn (1885) remains essential. According to the most recent studies, Twain’s novel about a white boy and a runaway slave escaping down the Mississippi River is the most frequently read classic American book in American schools. Few critics’ lists of the “greatest American novels” fail to cite it; few reporters describing its influence fail to quote Hemingway’s famous claim that “all modern American literature comes from one book by Mark Twain called Huckleberry Finn.”

At the same time, it also remains one of the most controversial books in American history, and in many schools has been removed from reading lists or shifted into elective courses. One hundred years after his death, Mark Twain can still put a book on top of the best-seller list—as his Autobiography did in October 2010. And Huck Finn, 125 years after its publication, can trend high on Twitter, as it did in January 2011 when NewSouth Books announced it would publish a version that excised the racial epithet “n***r,” which appears more than 200 times in the original, and replace it with “slave”—an editorial gesture both praised and derided with an intensity rarely reserved for the classics anymore. Huck Finn was, and remains, “an amazing, troubling book,” as novelist Toni Morrison tells us; an “idol and target,” as critic Jonathan Arac writes.

Predictably, our regard for the book is even more two-sided than that summary suggests. For over a century, Twain’s oft-beloved novel has been taught both as a serious opportunity to reflect on matters of race and as a lighthearted adventure for children. Authors, historians, teachers, and politicians have sung its praises as a model of interracial empathy, or debated the wisdom and limits of that claim; studio motion pictures, big-budget musicals, cartoons, comic books, and children’s editions have all focused on it as a story of boyish escapade, an “adventure” with, at best, modest political ambitions.

[…]

The best way to read Huck Finn, in fact, might be to see that Twain found the borders that divide parents and children as false as the borders that divide black and white—and that he even saw the way those borders overlapped. In turn, he attacked both with the same rough play, a tricksterish mix of comedy and political seriousness that meshed with the stereotypes of the time but fought them, too. And now we are indulging in more rough play—myths of nostalgia and myths of progress, and the instinct to classify, classify, classify—that inspires modern politicians, critics, teachers, filmmakers, and readers to divide the book into two books, one funny and “harmless” and one not. Huck Finn can show us more about how we keep the discussion of childhood stalled, and the engine of racial difference humming, than any other book in our canon. To benefit from that insight, however, we would have to admit that it is not a book (flawed or otherwise) about children and adventure, or about racial progress. It is a book about what Junot Díaz calls “dedicated amnesia” on a national scale. It is a plea—as is this book—to remember, and a fatalistic comedy about how we don’t.

This work is a cultural biography of Twain in his era, one that shows how Huck Finn is the great book about American forgetfulness, and how our misjudgments of the book’s messages about race and children reveal the architecture of our forgetting. I started it twenty years ago with a dim idea that there was something about the child in Huck that was misunderstood and something in the argument about the book’s treatment of race that had reached an impasse. I spent months in the late 1990s reading ancient newspapers, tracking Twain as he toured America in 1884 and 1885 alongside Louisiana writer George Washington Cable in a show he called the “Twins of Genius,” which was intended to help Twain promote the publication of Huck Finn. I explored the debate about children and schools that raged at the time to see if Huck Finn entered into it. And I explored what black readers of the day said about Twain’s book, scouring through the frayed remains of black newspapers from the 1880s. Yet what stayed with me was the milieu, not the thesis: the whispers of a lost, dying America, and an America uncannily like our own. A lot had changed. And nothing had.

The Last Bastion — Amy Davidson in The New Yorker on why marriage equality may take hold in the South.

It’s been a year and a half since the Supreme Court declared, in United States v. Windsor, that the Defense of Marriage Act—which prevented the federal government from recognizing same-sex marriages, even if individual states did—violated the Constitution. The decision did not assert a larger constitutional right to marriage, but that didn’t stop lower-court judges from finding one in its reasoning. In October, the Court declined to hear challenges to such rulings from three circuits, thus bringing the number of marriage-equality states to thirty-five—including, remarkably, South Carolina. In November, however, the Sixth Circuit upheld bans in four states, and appeals to that decision may force the Court to finally rule in 2015 on whether same-sex couples in all fifty states have a constitutional right to marry.

At this point, the marriage-equality map looks essentially like a CNN projection for a Democratic electoral landslide, with New England and the mid-Atlantic states, plus a good part of the Midwest, the Southwest, the lower Rockies, and the West Coast. But gays and lesbians can also wed in states that the Democrats can only dream of carrying: Utah (after a lawsuit brought by three couples, one of whom runs a hummus business in Salt Lake City, which sells “hummusexual” T-shirts) and Oklahoma (where two Tulsa women filed a suit a decade ago). The final fortress, with the exception of South Carolina, is the Deep South. That is where the last legal battles are likely to be fought, and it is precisely the sort of place that gay-marriage opponents say shouldn’t be rushed by the courts, because it’s “not ready.”

Judge Jeffrey Sutton, who wrote the opinion for the Sixth Circuit, took up the not-ready argument, asking, “Who decides?” He meant the courts or the states, acting through their legislatures or ballot initiatives, which he called, echoing old states-rights arguments, “less expedient, but usually reliable.” He suggested that gays and lesbians, rather than fighting in a courtroom, would find it more rewarding to gradually win over “heads and hearts” in their communities and enjoy “earned victories” at the polls. The plaintiffs in the 1967 Supreme Court decision Loving v. Virginia would likely have disagreed. That decision struck down laws banning interracial marriage in sixteen states—many of them the states that currently ban gay marriage.

[…]

Judge Reeves, who heard the Mississippi case, graduated from Jackson State, a historically black college. When the lawyers for the state talked about the benefits of “orderly” change, not rushed by the courts, Reeves interrupted them. Brown v. Board of Education was decided in 1954 and, he said, “in Mississippi, it was 1970 before my first-grade class was integrated.” He then asked the lawyers to explain the “rational basis” for denying couples the right to marry—and their children the right to married parents—adding, “All a child wants is to be loved. They don’t care by whom or what.”

The courts are not simply a check on the democratic process but a part of it. Across the country, men and women have filed declarations, testified, gone to trial, and appealed. If voting is an act of participatory democracy, so are those actions. Southerners with cases pending include a widow in Georgia, who doesn’t want her wife’s death certificate to bear a box checked “never married,” and two female Atlanta police officers, who want to be sure that each is recognized as a spouse and a parent in case one is killed in the line of duty.

The great achievement of Windsor has been to force states to explain why same-sex couples should be treated differently. For lack of any logical argument, some opponents make the “irresponsible procreation” case, which holds, perplexingly, that marriage should be reserved for a man and a woman because only they can have sex that results in accidental pregnancy. As Judge Richard Posner has written, “Heterosexuals get drunk and pregnant, producing unwanted children; their reward is to be allowed to marry. Homosexual couples do not produce unwanted children; their reward is to be denied the right to marry. Go figure.”

The lawyers in Judge Reeves’s courtroom tried that argument, too. It didn’t work. Two days before Thanksgiving, Reeves ruled for the plaintiffs, writing, “‘Tradition’ will not suffice to uphold Mississippi’s marriage ban.” He cited the “overlapping” record of discrimination in America. (Bayard Rustin’s name appears in the decision twenty times.) “Gay and lesbian citizens cannot be subjected to such second-class citizenship,” he wrote. Reeves granted a stay, pending an appeal to the Fifth Circuit, to be argued on January 9th, when the Mississippi case will be joined with others from Texas and Louisiana. Otherwise, he saw no reason to wait.

Oh Brother — Andy Borowitz on clearing the way for Jeb Bush to run for president.

WASHINGTON (The Borowitz Report)—In the strongest sign to date that he intends to seek the 2016 Republican Presidential nomination, former Florida Governor Jeb Bush has officially resigned his position as George W. Bush’s brother.

“No longer being related to his brother is a key step to clearing Jeb’s path to the nomination,” an aide said on New Year’s Day. “We expect his poll numbers to soar on this.”

According to the aide, the former Florida governor resigned his post as brother in a ten-minute phone call with George W. Bush, after which he blocked the former President’s phone number and e-mail address.

In an official statement, George W. Bush said that he “understands and supports” his former brother’s decision.

“If I were him, I would no longer be related to me either,” he said.

Doonesbury — War story.

Thursday, December 11, 2014

Flunking the Rabbi Test

You probably can’t find a more goyisher governor in America than Scott Walker of Wisconsin, so you have to give him props for at least trying to reach out to the Jewish community.  But he needs to update his spell-check.

In an undated letter unearthed by the liberal group One Wisconsin Now during the August release of documents from the first of two John Doe investigations related to the governor, Walker responded to a letter from Milwaukee attorney and chairman of the Wisconsin Center District Franklyn Gimbel.

Walker told Gimbel his office would be happy to display a menorah celebrating “The Eight Days of Chanukah” at the Milwaukee County Courthouse, and asked Gimbel to have a representative from Lubavitch of Wisconsin contact Walker’s secretary, Dorothy Moore, to set it up.

The letter is signed, “Thank you again and Molotov.” [Emphasis added.]

I’m pretty sure he meant “mazel tov,” which is Hebrew for “good luck!” as in “congratulations.”  He just thought he’d toss that in to sound ethnically correct, but it blew up on him.

Friday, December 5, 2014

Live TV

I watched the first hour of Peter Pan Live last night, then switched over to Rachel Maddow where they had a whole different live TV show going on: feeds of demonstrations from Chicago, New York, and other places on behalf of Eric Garner and justice.

As for the attempt at theatre on TV on NBC, it was inoffensive.  Allison Williams has a very nice singing voice and she was able to carry off the illusion of being a boy on the verge of puberty, carrying on the tradition of having a woman play the role that goes back to Maude Adams.  She had the tough task of rising to the bar set by Mary Martin, but then the target audience for this performance had no idea who Mary Martin was.  I’m pretty sure even their parents weren’t around when she flew in the window.  From what I saw, Ms. Williams did a good job.

Casting Christopher Walken as Captain Hook was, as they say in the business, a bold move.  It’s harking back to his early days as a hoofer on Broadway (he was in the chorus of the 1964 Noel Coward musical High Spirits), and I’m sure he approached it with his trademark intensity.  But again he had to fill the pumps of the legendary Cyril Ritchard (who also played Mr. Darling in a bit of Freudian double-casting), and while Mr. Walken’s performance in the pirate production number was interesting to say the least, he came across as more menacing than flamboyantly vicious.  Even Dustin Hoffman in Hook had more fun.  Besides, what’s the point of playing Captain Hook if you can’t camp it up?

I guess I’m just a nostalgic curmudgeon, but I liked it better seeing it in grainy black and white on our old Magnavox TV-radio-phono console in the living room when I was eight.  It was more theatrical.  You knew you were watching theatre, and seeing the cables that made the kids fly added to the fun.  Last night it was more a distraction knowing that they were staging it for TV.

Switching over to watch the marches on the streets of America had its own theatrical quality.  This was real street theatre.  There’s something karmic about changing channels from one show about fighting the forces of evil set to music to another show set to chants of “I can’t breathe.”

Sunday, November 30, 2014

Sunday Reading

Predicting the Inevitable — Jelani Cobb in The New Yorker on the reaction in Ferguson to the grand jury finding.

What transpired in Ferguson last night was entirely predictable, widely anticipated, and, yet, seemingly inevitable. Late last week, Michael Brown, Sr., released a video pleading for calm, his forlorn eyes conveying exhaustion born of not only shouldering grief but also of insisting on civic calm in the wake of his son’s death. One of the Brown family’s attorneys, Anthony Gray, held a press conference making the same request, and announced that a team of citizen peacekeepers would be present at any subsequent protests. Ninety minutes later, the St. Louis mayor, Francis Slay, held a press conference in which he pledged that the police would show restraint in the event of protests following the grand-jury decision. He promised that tear gas and armored vehicles would not be deployed to manage protests. The two conferences bore a disturbing symmetry, an inversion of pre-fight hype in which each side deprecated possible violence but expressed skepticism that the other side was capable of doing the same. It’s possible that, recognizing that violence was all but certain, both sides were seeking to deflect the charge that they had encouraged it. Others offered no such pretense. Days ahead of the announcement, local businesses began boarding up their doors and windows like a coastal town anticipating a hurricane. Missouri Governor Jay Nixon declared a preëmptive state of emergency a week before the grand jury concluded its work. His announcement was roughly akin to declaring it daytime at 3 A.M. because the sun will rise eventually.

From the outset, the great difficulty has been discerning whether the authorities are driven by malevolence or incompetence. The Ferguson police let Brown’s body lie in the street for four and a half hours, an act that either reflected callous disregard for him as a human being or an inability to manage the situation. The release of Darren Wilson’s name was paired with the release of a video purportedly showing Brown stealing a box of cigarillos from a convenience store, although Ferguson police chief Tom Jackson later admitted that Wilson was unaware of the incident when he confronted the young man. (McCulloch contradicted this in his statement on the non-indictment.) Last night, McCulloch made the inscrutable choice to announce the grand jury’s decision after darkness had fallen and the crowds had amassed in the streets, factors that many felt could only increase the risk of violence. Despite the sizable police presence, few officers were positioned on the stretch of West Florissant Avenue where Brown was killed. The result was that damage to the area around the police station was sporadic and short-lived, but Brown’s neighborhood burned. This was either bad strategy or further confirmation of the unimportance of that community in the eyes of Ferguson’s authorities.

The pleas of Michael Brown’s father and Brown’s mother, Lesley McSpadden, were ultimately incapable of containing the violence that erupted last night, because in so many ways what happened here extended beyond their son. His death was a punctuation to a long, profane sentence, one which has insulted a great many, and with damning frequency of late. In his statement after the decision was announced, President Barack Obama took pains to point out that “there is never an excuse for violence.” The man who once told us that there was no black America or white America but only the United States of America has become a President whose statements on unpunished racial injustices are a genre unto themselves. Perhaps it only seems contradictory that the deaths of Oscar Grant and Trayvon Martin, Ezell Ford and John Crawford and Michael Brown—all unarmed black men shot by men who faced no official sanction for their actions—came during the first black Presidency. Or perhaps the message here is that American democracy has reached the limits of its elasticity—that the symbolic empowerment of individuals, while the great many remain citizen-outsiders, is the best that we can hope for. The air last night, thick with smoke and gunfire, suggested something damning of the President.

Artless Miami — Brett Sokol in the New York Times reports on why Art Basel hasn’t made Miami the art mecca it once dreamed of becoming.

MIAMI BEACH — “It was a really devastating message,” the Miami art dealer Fredric Snitzer said, recalling the personal impact when Emmanuel Perrotin’s 13,000-square-foot outpost closed in 2010. “If he couldn’t make a go of it, what am I doing here?”

The opening of the Perrotin gallery on the eve of the Art Basel Miami Beach fair in 2005 was a high-water mark for the city’s cultural scene, anticipating its imminent status as an art mecca second only to New York and Los Angeles. Art Basel itself was billed as the economic tide that would lift all artistic boats, not just for a week every December, but year-round, too. Why else would a top-tier contemporary-art player from Paris like Mr. Perrotin expand to Miami?

“This is Paris in the ’20s and that guy down the block is Picasso,” Mr. Snitzer said at the time.

Yet by 2009, Perrotin had ceased regular exhibitions in Miami, turning off the lights completely the following year. Several other leading galleries that opened in the wake of Art Basel’s 2002 arrival have also shut down, while many of the city’s most promising younger artists have decamped to New York and Los Angeles in search of greener career pastures.

More than a decade after Art Basel’s debut, the city’s cultural milieu has been undeniably transformed. But beyond the splashy galas surrounding the fair’s kickoff on Wednesday, and the expensive new centers for art like the waterfront Pérez Art Museum Miami and the planned home for the Institute of Contemporary Art, Miami, many local artists and art dealers remain deeply dissatisfied.

Some blame rising rents that have scattered a once-cohesive art community, while others point to a dearth of local collectors and visiting Basel-ites interested in owning their work. Without that bigger pool of buyers, they say, there’s no way to sustain artists amid the continued expansion of the art scene.

“I couldn’t support myself,” said Bert Rodriguez, a conceptual artist, in a phone call from his new home in Los Angeles. After appearing in the 2008 Whitney Biennial, Mr. Rodriguez became one of Miami’s hometown heroes.

Yet despite awards and commissions, he felt stuck. “All the collectors there who were going to support me had already bought my work,” said Mr. Rodriguez, known for prankish projects that include burying himself up to his neck on a museum’s front lawn. “I had tapped into every well I could, and it just wasn’t enough.”

But now that he’s in Los Angeles, he said, advertising agencies and Silicon Valley clients who once ignored him are lining up. This winter, he will get $50,000 from a company behind a new travel app to drive cross-country and “virtually” write his name across America. “I’ve made more money in the last three years in Los Angeles than in the previous 10 in Miami,” he said.

[…]

“Too many people are obsessed with chasing the next hippest, newest thing,” said Kristen Thiele, an ArtCenter board member as well as a former resident artist there. Ms. Thiele cited the core ideas first laid out by Mrs. Schneiderman: Artists need cheap studio space, the ability to sell their work — out of those same studios, if necessary — and, not least, “the genuine sense of community that comes from being surrounded by your fellow artists with trained eyes.”

There’s nothing especially revolutionary about Mrs. Schneiderman’s thinking. Still, for the Miami painter John Sanchez, it’s been more than he could have ever hoped for. Originally represented by Emerson Dorsch, he felt his rain-slicked urban landscapes were falling out of step with that gallery’s turn toward an art-theory laden program.

“I’m a realist painter,” he said. “I’m trying to paint everyday moments as beautifully as I can. It’s not rocket science.” By contrast, at the ArtCenter, just by dint of being on a heavily trafficked street, he said, “I got a vast amount of exposure to people from everywhere, not just those in the know.”

He’s since picked up both sales and fresh brushwork techniques. Having found a formula for survival as an artist, he’s hoping to move into the ArtCenter’s remaining Lincoln Road building.

“I want to be like mold,” he said, laughing. “I want to stay.”

Doonesbury — No deposit, no return.

Thursday, November 13, 2014

It’s A Tradition

If you’re in South Florida and looking for some family entertainment this weekend, check out the Miami Acting Company’s production of Fiddler on the Roof at the Banyan Bowl in Pinecrest Gardens.  This legendary musical that opened fifty years ago once held the record as the longest-running show on Broadway.  This production runs tonight through Sunday, so go already.

I attended the final dress rehearsal last night (which explains the dearth of posts this morning) and it looks good to go with a strong cast, a good-sized orchestra, and a very nice set that was assembled by a dedicated crew of skilled (if uncredited) carpenters last Sunday just out of range of the pouring rain.

This is not my first trip to the shtetl.  In 1972 the University of Miami Ring Theatre did Fiddler on the Roof.  Tevye was played by Ernie Sabella and the cast also included Gail Edwards and yours truly as the Russian priest.  I had one scene behind a scrim.  But to quote the immortal Avery Schreiber, there are no small parts, just short pay.

Friday, September 19, 2014

Rated Arr

I feel that it is my obligation to warn you that today is Talk Like A Pirate Day.

Actor Robert Newton, who specialized in portraying pirates, especially Long John Silver in the 1950 Disney film Treasure Island, the 1954 Australian film Long John Silver, and as the title character in the 1952 film Blackbeard, the Pirate, is described as the “patron saint” of Talk Like A Pirate Day. Newton was born in Dorset and educated in Cornwall, and it was his native West Country dialect, which he used in his portrayal of Long John Silver and Blackbeard, that some contend is the origin of the standard “pirate accent”.

The archetypal pirate grunt “Arrr!” (alternatively “Rrrr!” or “Yarrr!”) first appeared in fiction as early as 1934 in the film Treasure Island starring Lionel Barrymore, and was used by a character in the 1940 novel Adam Penfeather, Buccaneer by Jeffrey Farnol. However it was popularized and widely remembered with Robert Newton’s usage in the classic 1950 Disney film Treasure Island. It has been speculated that the rolling “rrr” has been associated with pirates because of the location of major ports in the West Country of England, drawing workers from the surrounding countryside. West Country speech in general, and Cornish speech in particular, may have been a major influence on a generalized British nautical speech. This can be seen in the Gilbert and Sullivan operetta The Pirates of Penzance, which is set in Cornwall; although the play did not (originally) use the phrase “arrr”, the pirates used words with a lot of rrr’s such as “Hurrah” and “pour the pirate sherry”.

Sorry, Bob.

Sunday, February 2, 2014

Sunday Reading

The Champion — Ta-Nehisi Coates in The Atlantic on the legacy of African-American politics.

Last week The New Yorker ran a lengthy profile of Barack Obama, by David Remnick, in which you can hear the president’s opinions on everything from marijuana legalization to war to racism. Obama is as thoughtful as ever, and I expect that admiration for his thoughtfulness will grow as the ages pile upon us. I have tried to get my head around what he represents. Two years ago, I would have said that whatever America’s roots in white supremacy, the election of a black president is a real thing, worthy of celebration, a sign of actual progress. I would have pointed out that you should not expect a black head of state in any other Western country any time soon, and that this stands as singular accolade in the long American democratic tradition. Today, I’m less certain about national accolades. I’m not really sure that a writer—whose whole task is the attempt to see clearly—can afford such attachments.

More interesting to me is why this happened. If you begin from the proposition that African-Americans are fundamentally American, in a way that the Afro-French are not; and that America is, itself, a black country in a way that the other European countries are not, Barack Obama’s election strikes you somewhat differently. African-American politics is literally as old as American politics, as old as Crispus Attucks shot down for his nascent country. One of the earliest and bloodiest proving grounds for “Western” democratic ideals was Gettysburg. The line that saved the Union, that ensured that “government of the people, by the people, for the people, shall not perish from the earth” was marked by the house of the black farmer Abraham Brian. On that Brian property lived the great Mag Palm, currently lost to our memory, who fought off man-catchers determined to reduce her to peonage.

The first African-American to be nominated for president was Frederick Douglass, a biracial black man of exceptional gifts who dreamed of his estranged father as surely as the present occupant of the White House, perhaps even in this day, dreams of his. The last black Southerner to serve in Congress, before this country assented to the desecration of its own Constitution, was George Henry White, who did not leave in despair but in awesome prophecy:

This is perhaps the Negroes’ temporary farewell to the American Congress, but let me say, Phoenix-like he will rise up some day and come again. These parting words are in behalf of an outraged, heart-broken, bruised and bleeding, but God-fearing people; faithful, industrious, loyal, rising people—full of potential force.

And come again, we have.

All Together with Pete Seeger — Emily Greenhouse remembers the impact he had on her family.

After the Second World War, my grandparents married and moved to Long Island, and my grandfather opened a dry-cleaning shop. On his delivery route, he would look for customers who received the right kind of magazines and then slip fliers underneath their doors: Committee for a Sane Nuclear Policy, the March on Washington, Ban the Bomb, Stop the War in Vietnam. That’s how my grandparents made new friends. They were meetings people—Grace Paley people, union people. They brought a baby in a stroller to the Rosenbergs’ funeral. Some winters before my grandfather died, he joined in a protest against the Iraq War, in Washington, D.C. After standing for four hours in fifteen-degree weather, he came down with pneumonia. I thought this was heroic, but for him it was normal. He was a Seeger man: he would not be moved or deterred.

The sad morning we learned that Seeger was gone, I spoke to Rob Rosenthal, a professor of sociology at Wesleyan University, and his son, Sam, who recently edited the book “Pete Seeger: In His Own Words.” They met Seeger when he replied to an ad that the elder Rosenthal had placed in the Nation. “He was never pessimistic,” Rosenthal said. “He always thought that humans would get it together.” He added: “When you look at the grand movements of the twentieth century, he was involved in them all (the women’s movement most peripherally). We may think now, ‘Wow, we’re so messed up.’ But he travelled through the South in the thirties, he saw the Hudson cleaned up—a huge, huge thing. He was realistic about how difficult all this was.”

Seeger got in at the ground level—on the union movement, the civil rights movement, the anti-Vietnam War movement, the environmentalist movement—and spoke directly to those there. Gabriel Winant, a scholar of labor history, described how Seeger, with a song like “Miner’s Lifeguard,” showed coal miners that they were like sailors—widely perceived as the original modern workers—even if their work was out of the boss’s view. And that, like the sailors, the miners were stronger together.

Sam Rosenthal told me that it was hard to imagine Seeger’s perspective. “He didn’t feel the weight of history the way we did,” he said. “It was staggering to hear him talk about certain things—going to this huge historic march, hanging out with Guthrie or Lead Belly. In the next breath, he would start talking about his neighbor down the road who grew tomatoes.”

It’s Debatable — Sean McElwee and Abigail Salvatore in Salon argue that scientists shouldn’t debate Creationists.

Bill Nye and Ken Ham will be debating creationism on Feb. 4, and it’s a bad idea for both scientists and Christians. Ham’s young-earth creationism represents the distinct tendency of American Christian fundamentalists to reject science and use their religion to defend economic ideas, environmental degradation and anti-science extremism. But these views aren’t actually inherent in Christianity — they’ve been imposed on the biblical text by politically motivated and theologically inept readers. The solution is not anti-theism but better theological and scientific awareness.

The vast majority of right-wing Christian fundamentalists in the U.S. are evangelicals, followers of an offshoot of Protestantism. Protestantism is based on the premise that truth about God and his relationship with the world can be discovered by individuals, regardless of their level of education or social status. Because of its roots in a schism motivated by a distrust of religious experts (priests, bishops, the pope), Protestantism today is still highly individualistic. In the United States, Protestantism has been mixed with the similarly individualistic American frontier mythos, fomenting broad anti-intellectualism.

Richard Hofstadter’s classic, “Anti-Intellectualism in American Life,” perfectly summarizes the American distaste for intellectualism and how egalitarian sentiments became intertwined with religion. He and Walter Lippmann point to the first wave of opposition to Darwinian evolution theory, led by William Jennings Bryan, as the quintessential example of the convergence of anti-intellectualism, the egalitarian spirit and religion. Bryan worried about the conflation of Darwinian evolution theory and capitalist economics that allowed elites to declare themselves superior to lower classes. He felt that the teaching of evolution challenged popular democracy: “What right have the evolutionists — a relatively small percentage of the population — to teach at public expense a so-called scientific interpretation of the Bible when orthodox Christians are not permitted to teach an orthodox interpretation of the Bible?” He notes further, “The one beauty of the word of God, is that it does not take an expert to understand it.”

This American distrust of experts isn’t confined to religion. It explains the popularity of books like “Wrong” by David Freedman (a book that purports to show “why experts are wrong”) that take those snobbish “experts” down a peg. The delightfully cynical H.L. Mencken writes,

The agents of such quackeries gain their converts by the simple process of reducing the inordinately complex to the absurdly simple. Unless a man is already equipped with a considerable knowledge of chemistry, bacteriology and physiology, no one can ever hope to make him understand what is meant by the term anaphylaxis, but any man, if only he be idiot enough, can grasp the whole theory of chiropractic in twenty minutes.

Thus, an American need not understand economics to challenge Keynes, nor possess a PhD to question climate change, nor have read Darwin to declare his entire book a fraud. One need not read journals, for Gladwell suffices, and Jenny McCarthy’s personal anecdotes trump the Institute of Medicine and National Academy of Sciences.

Doonesbury — Keeping it real.

Monday, January 27, 2014