One Man’s Riot

Chip Somodevilla/Getty Images

Consider the following scenarios. In Ferguson, Missouri, a young black man fights with a cop and ends up dead. In Cleveland, Ohio, cops approach a black 12-year-old with an Airsoft gun tucked in his waistband. The boy reaches for the pistol and ends up dead. In Staten Island, New York, a black man selling loose cigarettes on the street resists arrest and ends up dead. In North Charleston, South Carolina, a 50-year-old black man ends up dead after bolting from his car and allegedly attempting to take an officer’s Taser. In Baltimore, Maryland, a young black man with a switchblade in his pocket runs from the cops and ends up dead.

Each of these stories has a vital common element. Can you identify it?

If you said, “Cops all around the country are killing black men,” then you probably watched last night’s violence in Baltimore with a fair amount of sympathy for the rioters. To you, burning down your own city is a legitimate expression of rage against institutionalized racism and unrestrained police brutality. You understand why Mayor Stephanie Rawlings-Blake said she wanted to give the rioters “space” to destroy property. To you, the indignities suffered by black men at the hands of the police have gone on too long. You nod your head in something like agreement when you hear Morehouse College professor Marc Lamont Hill tell CNN host Don Lemon: “We’ve been dying in the streets for months, years, decades, centuries. I think there can be resistance to oppression.” To you, the time for resistance has finally arrived.

If, however, you looked at these scenarios and said, “These deaths were the tragic result of poor choices on the part of the deceased,” then you probably watched last night’s violence in Baltimore with horror and revulsion. To you, the looting and stone-throwing is inexcusable lawlessness on the part of bullies and thugs. You think Mayor Rawlings-Blake is probably in over her head. You think Ivory Tower provocateurs like Hill are playing a dangerous and self-serving game. To you, most cops are honest and unbiased. You think the obvious best way to avoid a potentially deadly confrontation with the police is not to run and not to fight.

Who owns Baltimore’s rage? Some facts: Every member of the Baltimore City Council is a Democrat. Every mayor since 1967 has been a Democrat. Such political homogeneity invites corruption. If some element of the city’s police department is brutal and corrupt, it’s because no one in Baltimore has lifted a finger to stop it. Maybe that should be next on Marc Lamont Hill’s to-do list.

However you see things, you may share a sense that our cities are spiraling toward a destructive period of racial conflagration. Whether Democrat or Republican, you may wish to avoid this fate. What can be done? On whom does the burden of de-escalation fall? Stephanie Rawlings-Blake? Marc Lamont Hill? Bill Bratton? Rush Limbaugh? Barack Obama? Baltimore street gangs like the Black Guerilla Family?

How you answer says a lot about how you see the world. From where I stand, it’s hard to credit claims that a police force that is nearly 50 percent African-American is somehow hardwired to abuse residents of a majority black city. But if police corruption is a problem in Baltimore, and news reports suggest that it is, the solution to it cannot be running battles between cops and citizens. Rather, the clear solution is better governance. You say you want a revolution. How about electing a Republican or two to the Baltimore City Council? That would be truly radical.

Insofar as anyone in Baltimore is angry at the “system,” they should direct their rage where it belongs—an unbroken, five-decade string of one-party rule in the city and a national War on Poverty that has systematically dismantled the black family.

City Journal


Before the Bell

Photo by Stephen Dunn/Editorial/Getty Images

Tickets finally went on sale Thursday for the May 2 Floyd Mayweather-Manny Pacquiao bout at the MGM Grand in Las Vegas—500 tickets, anyway, at prices ranging from $1,500 to $7,500 per seat. The boxers’ promoters and the MGM control the rest, and prices on the “secondary market” will be much higher for what has been touted as “the biggest event in the history of boxing.” It’s certainly the richest: the fight should shatter records for total revenue, live gate, and pay-per-view receipts, to say nothing of the more than $100 million both boxers will haul in (in Mayweather’s case, it’s perhaps closer to $200 million). Boxing remains the only sport whose participants can top the career earnings of other athletes in one night. In fact, Mayweather and Pacquiao will make more than the total payrolls of some Major League Baseball teams.

Fabulous payouts for this most controversial of sports go back a long way. Even in the 1880s, John L. Sullivan pulled down more money than anyone in America below robber-baron status. In the 1920s, Jack Dempsey made almost as much in one fight as Babe Ruth earned in his whole career. Muhammad Ali and Joe Frazier earned more in their 1971 fight—$2.5 million each—than baseball or football players could earn in a career in those pre-free agency days. And Mike Tyson made over $20 million for knocking out Michael Spinks in 91 seconds in 1988, when the highest-paid major league baseball player, Ozzie Smith, made $2.3 million, and the average major league salary was $438,000.

Boxing has faded dramatically in popularity since then, and it remains to be seen whether Mayweather-Pacquiao can resuscitate general interest or whether it serves more as an echo of earlier times, when “fights of the century” happened often. Still, there is the money, which has grown so absurdly large because of pay-per-view, which bestows casino-like riches on the best fighters. Pay-per-view also hastened boxing’s decline, though, by taking it off free television for a generation (it recently returned), removing boxers from the popular conversation. That’s why Pacquiao’s promoter, Bob Arum, was right to say that this event can’t compare with big fights of the past, for which “the world stopped,” as he put it, to learn the outcome. Our 24/7 media and entertainment machine stops for little anymore, and for many people, boxing won’t even be the most important sport on the menu on May 2. It’s Kentucky Derby Day and more besides.

Things were different in March 1971, when Ali and Frazier fought for the first time, in Madison Square Garden. The two rivals made the cover of Life, then an important cultural barometer. About 300 million people around the world saw the closed-circuit television broadcast, and legend has it that the warring sides in Vietnam took a break from hostilities to watch. In 1938, when Joe Louis fought Germany’s Max Schmeling, 60 million Americans—nearly half the nation’s population then—tuned in to the NBC radio broadcast. Franklin Roosevelt and Adolf Hitler listened, too. Closer to our own time, Mike Tyson’s fights enjoyed similar mass attention. Tyson was the last in a line of American boxers—Ali, Louis, Dempsey, Jack Johnson—who attained fame comparable only with ubiquitous cultural figures like presidents or movie stars.

Floyd Mayweather may be the richest athlete in history, but he isn’t famous like that. Nor, at least in America, is Manny Pacquiao, who hails from the Philippines. What separates the May 2 fight from its predecessors is that millions in the United States will have no prior knowledge of or opinion about its principals. That’s ironic considering what strong personalities they are.

Pacquiao is so easy to like that more Americans may wind up rooting for him than Mayweather, who grew up in Grand Rapids, Michigan. An explosive fighter in the ring but soft-spoken and humble outside it, Pacquiao devotes considerable resources to fighting poverty in his country. “The social welfare system in the Philippines is called Manny Pacquiao,” says Arum, and the crime rate is said to plunge there when he fights. Pacquiao’s rise from impoverishment to become one of boxing’s legendary champions—he’s won titles at eight different weights, a record—has given him a stature resembling that of a deity among Filipinos. He might become president of the country someday; he’s already a sitting congressman from Sarangani province. His interests extend beyond politics. He announced recently that he was giving up on his singing career—no great loss, given his taste for mawkish seventies love ballads—but then turned around and recorded a song for the Mayweather fight, dedicated to the Filipino people. He’s also a professional basketball player back home, where criticism of him is not welcome. One former NBA player was bounced out of Filipino basketball (and fined) when he questioned Pacquiao’s hardcourt talents; a team official likened it to insulting Martin Luther King Jr. Still, for all his popularity, Pacquiao’s penchant for drink and non-spousal affection nearly derailed his career and his marriage before he committed himself to Christianity. And allegations continue to follow him—though with no proof—that, somewhere along the line, he made use of performance-enhancing drugs.

One of those dropping such hints was Mayweather himself, who, in this and most other areas, is no courter of public affection. In his persona as “Money” Mayweather, Floyd revels in his wealth and in his own general wonderfulness. “All roads lead to Floyd Mayweather,” he likes to say, whatever that means. He has filmed himself surrounded by piles of greenbacks, and he likes the feel of, say, $10,000 in cash in his pocket as walking-around money: “You never know when you might need a Brioni shirt.” He bristles at even gentle criticism and sees adversaries and conspirators everywhere, not surprising instincts for a kid whose home life was capsized by the drug problems (and prison sentences) of those dearest to him and who learned early on to look out for Number One. So shrewd has he been in steering his own career that it seems likely that the antipathy he arouses is, at least to some degree, part of the manipulation. But it’s not all a game: he served a brief sentence in 2012 for domestic battery. His evident intelligence is no brake on his consistently boorish behavior, as when he branded Pacquiao a “little yellow chump” in a 2010 video rant. HBO boxing commentator Jim Lampley called Mayweather “an often aggressively distasteful human being whose behaviors are a blight on the boxing landscape.” In the ring, though, Floyd is a wizard, one of the great defensive specialists in the sport’s history, eager to remind everyone—including Pacquiao, at the press conference announcing the bout—that he has never lost. His 47-0 record proves, at least to his mind, that he is The Best Ever (the Greatest label is taken).

Who wins? The smart money is on Mayweather, as he would be first to tell you. (Haven’t you heard? He’s unbeaten.) But at 38 and 36, respectively, Mayweather and Pacquiao have both lost a step, and maybe more, from their career peaks. Each fights in a style that causes the other trouble. It may come down to who has the faster hands. Decades ago, boxing archivist Jim Jacobs ran film of Ali and other greats through a simulator to measure, in frames, how long it took their punches to arrive. Jacobs determined that Ali’s jab took fewer frames to reach its target than anyone else’s.

Now, if someone would replicate that experiment with Pacquiao and Mayweather, I might know which way to bet. Except I never bet.

City Journal.


The Heretic We Need

Photo by Elisabetta Villa/Getty Images

Heretic: Why Islam Needs a Reformation Now, by Ayaan Hirsi Ali (HarperCollins, 288 pp., $27.99)

Remember Mahmoud Ahmadinejad, former president of the Islamic Republic of Iran? I always had a soft spot for his vainglorious lack of cognitive dissonance: the way he would chase after Western missile and nuclear technology while wearing his quasi-Members Only jacket—because donning the classic suit and tie was too indicative of the decadent infidel West. Turns out Mahmoud was more progressive than I gave him credit for: in 2010, he offered support to those Iranian men who preferred a clean shave and a tie. That prompted an internecine battle with Ayatollah Ahmad Khatami, who eventually proclaimed: “The supreme guide [Ayatollah Ali Khamenei] himself has said in a fatwa that the wearing of ties or bow ties is not permitted.”

I was reminded of this episode while reading Ayaan Hirsi Ali’s Heretic, which, among other theses, points out the intellectual bankruptcy of a political/religious ideology that preaches a return to seventh-century law and disorder while utilizing the tools of twenty-first-century Western progress. This ideology practically screams “Topple Me With Satire and Post-Enlightenment Ideas,” and yet few have dared try, for doing so is (even in a free society) to risk death, whether your name is Salman Rushdie, Lars Vilks, Theo van Gogh—or Ayaan Hirsi Ali.

If you’re not familiar with Ali and her story, buy a copy of her 2007 memoir, Infidel. I’m confident you’ll eagerly purchase the follow-up, Nomad, and maybe even The Caged Virgin, an earlier collection of essays. Born into a devout though not extremist Islamic household in Somalia, Ali bounced from there to Saudi Arabia, then Ethiopia, and finally Kenya—all before the age of 12. In 1992, she left Africa for the Netherlands to escape an arranged marriage, but not before suffering genital mutilation, beatings (including a skull-bashing that nearly took her life), and radicalization in the name of Islam and tribal tradition. A little over a decade later, she was a Leiden University graduate and “Westernized” member of the Dutch parliament, gaining notoriety for her impassioned critique of Islamist violence, particularly against women. She’s now an American citizen, having fled the Netherlands after the murder by Islamists of Theo van Gogh—her collaborator on the film Submission—and a tempest over her Dutch citizenship.

Heretic, as its subtitle suggests, is not a third bio but rather Ali’s most academic and aspirational book to date. It clearly and cleverly offers five “amendments” to Islamic orthodoxy in order to alleviate its violent and oppressive tendencies. But before getting into those, let’s tackle some clichés Ali battles on the press circuit. Most interviews with her go something like this: “You’ve lived a tough life and are a fiery critic of Islam—but aren’t you painting with a broad brush? You know that the vast majority of the world’s 1.6 billion Muslims are peaceful, right? You know that the Bible has violent parts, right? You know that some people today take the Bible literally, right? Also, the Crusades. Oh look, we’re outta time.”

An elongated version of this lazy script was enacted during Ali’s recent appearance on “The Daily Show.” Host Jon Stewart talked in circles to Ali, at one point musing, “But did the Bible change, or did people’s interpretation of it change?” He seemed unaware that the freedom to interpret and debate Islam’s holy texts is Ali’s first proposed reform. Another deep Stewart-ism: “[I]t feels like Muslims are being asked to answer to something that has not much to do with them, that a group of radicals has stolen a text [from them].” Someone didn’t do his homework.

Heretic carefully separates Muslims into three categories. “Medina Muslims,” as Ali calls them, are the jihadists and their supporters, who intertwine the faith with seventh-century political and martial order, as Muhammad did during his time in that city. A low-ball estimate puts this population at 48 million, a tiny fraction of the world’s Muslims—but, considering that it took only 19 men armed with box cutters and the “Medina” ideology to bring us 9/11, it’s a number to be concerned about. In the second group are the apostates and heretics, like Ali herself, who have left the faith altogether or are so critical of it that they can no longer be considered “true” Muslims. This population is tiny but growing. Finally, what Ali terms the “Mecca Muslims” comprise the majority of Islam’s adherents. These are the hearts and minds to whom she preaches reform: devout Muslims who desire access to Western thought, education, technology, and civil law, but who find that “pure” Islamic scripture and discipline (or their government’s adoption of sharia law) render accommodation all but impossible. Torn by this conflict, many of these Muslims find themselves ripe for jihadist plucking: better to side with the devil you kinda know (Medina Islam) than the one you don’t (hell-bound apostasy).

“Avoidance was my main strategy to deal with the terrible dissonance” while in Holland, writes Ali. For others, it’s silent disapproval (or approval) of the Medina crew, a stifled life, or a bullet through the brain (see: Nobel Prize winner Malala Yousafzai). That Islam itself might be incongruent with basic human rights is an argument met with much derision, but as Ali drolly replies: “To me, however, when a murderer quotes the Qur’an in justification of his crime, we should at least discuss the possibility that he means what he says.” She dedicates whole chapters to diagnosing the five root problems in Islam that manifest themselves via violence and oppression: 1) The Qur’an’s status as the immutable word of God and the infallibility of Muhammad as the last divine messenger; 2) an emphasis on the afterlife over the here and now; 3) sharia’s claims to be a comprehensive system of law governing the spiritual and temporal realms; 4) the obligation of ordinary Muslims to command right and forbid wrong; and 5) the concept of jihad, or holy war.

Millions of Muslims currently live under systems of sharia law that Westerners would be hard-pressed to distinguish from the fictional, medieval-inspired fantasy of Game of Thrones: the men employed full-time as sword-wielding head-n’-limb choppers, the strategic marriages for honor and power, and the heartless tunnel vision of hoisting one flag above another, rivers of blood and bodies be damned. Western academics and liberals, Ali notes, should confront this fact with the same outrage they brought to the anti-apartheid movement in the 1970s and 1980s. Instead, they mostly respond with silence, or with some face-palm-inducing counterpoint, à la Stewart.

I have two quibbles with the book. The first arises, ironically, out of Ali’s being a victim of her previous success. Heretic’s introduction states that it was written for “not only Muslims but also Western apologists for Islam.” I couldn’t help but think that every “amendment” chapter would have benefited from the in-depth personal anecdotes that packed Infidel and Nomad. Granted, Ali can’t write her biography three times (and Heretic does include another truncated telling of her life story), but the beauty of those earlier books was how they immersed the Western reader in the Muslim Ummah: the omnipresence of clan mentality, violence (both real and threatened), and malevolent personal and spiritual guilt. The intended audience might do better to read Ali’s earlier books first.

The second quibble is with Ali’s case for optimism. Page after page is filled with disconcerting statistics. Scientific polls—nearly all taken within the last decade—show staggering support for “Medina” opinions in the most populous and fecund Islamic nations, including those supplying the bulk of Western immigration. Seventy-five percent of Pakistanis “favor the death penalty for leaving Islam.” According to Pew, “91 percent of Iraqi Muslims and 99 percent of Afghan Muslims supported making sharia their country’s official law.” And, Ali writes, “65 percent [of European Muslims] say that religious rules are more important to them than the laws of the country in which they live.” And so on. Ali even writes off any hope for a modern-day Reagan or Thatcher regarding the shadow of Medina: “I do not expect our political leadership to take the lead in directly challenging the inequities of political Islam. The ideological self-confidence that characterized Western leaders during the Cold War has given way to a feeble relativism.”

Given all this, it’s not entirely convincing that the scales tip toward optimism, but how clearly Ali can read the crystal ball shouldn’t detract from Heretic’s message of “reform now.” And Ali is blunt about what the world is up against in that effort: “Indeed, the term ‘ijtihad,’ the nearest thing to reform in Arabic, means trying to determine God’s will on some new issue . . . Islam even has its own pejorative term for theological trouble-makers: those who indulge in innovations and follow their passions.” She then recalls the statement that earned her heretic status in Amsterdam: “[J]ust allow us Muslims one [Voltaire], please.”

Voltaire, Locke, Luther, Spinoza . . . it’s tempting to call Ali the modern incarnation of one or another of these. Yet just being the Ayaan Hirsi Ali of our own time is more than enough: she’s the heretic who risks her life with rich intellectual treatises and memoirs to hasten an ideological reformation that could liberate millions. We ignore her quill to our shame and peril.


Running With the Predators

Q. SAKAMAKI/REDUX
A December 2014 protest against the police in New York City

Starting in late summer 2014, a protest movement known as Black Lives Matter convulsed the country. Triggered by the fatal police shooting of a black teenager in Ferguson, Missouri, the movement claimed that blacks are still oppressed by widespread racism, especially within law enforcement. The police subject black communities to a gratuitous regime of stops and arrests, resulting in the frequent use of lethal force against black men, according to the activists and their media and academic allies. Indeed, America’s police are the greatest threat facing young black men today, the protesters charged. New York’s mayor Bill de Blasio announced in December that he worries “every night” about the “dangers” his biracial son may face from “officers who are paid to protect him.” Less than three weeks later, a thug from Brooklyn, inspired by the nationwide anti-cop agitation, assassinated two New York police officers.

The protest movement’s indictment of law enforcement took place without any notice of the actual facts regarding policing and crime. One could easily have concluded from the agitation that black and white crime rates are identical. Why the police focus on certain neighborhoods and what the conditions are on the ground were questions left unasked.

The year 2014 also saw the publication of a book that addressed precisely the questions that the Black Lives Matter movement ignored. Alice Goffman, daughter of the influential sociologist Erving Goffman, lived in an inner-city Philadelphia neighborhood from 2002 to 2008, integrating herself into the lives of a group of young crack dealers. Her resulting book, On the Run, offers a detailed and startling ethnography of a world usually kept far from public awareness and discourse. It has been widely acclaimed; a film or TV adaptation may be on the way. But On the Run is an equally startling—if unintentional—portrait of the liberal elite mind-set. Goffman draws a devastating picture of cultural breakdown within the black underclass, but she is incapable of acknowledging the truth in front of her eyes, instead deeming her subjects the helpless pawns of a criminal-justice system run amok.

At the center of On the Run are three half-brothers and their slightly older friend Mike, all of whom live in a five-block area of Philadelphia that Goffman names Sixth Street. Sixth Street, we are told, isn’t viewed as a particularly high-crime area, which can only leave the reader wondering what an actual high-crime area would look like. In her six years living there, Goffman attended nine funerals of her young associates and mentions several others, including one for “three kids” paid for by local drug dealers, eager to cement their support in the community.

Goffman contends that it is the legal system itself that is creating crime and dysfunction in poor black communities. Young men get saddled with a host of allegedly petty warrants for having missed court dates, violated their parole and probation conditions, and ducked the administrative fees levied on their criminal cases. Fearful of being rounded up under these senseless procedural warrants, they adopt a lifestyle of subterfuge and evasion, constantly in flight from an increasingly efficient and technology-enhanced police force. “Once a man fears that he will be taken by the police, it is precisely a stable and public daily routine of work and family life . . . that allows the police to locate him,” Goffman writes. “A man in legal jeopardy finds that his efforts to stay out of prison are aligned not with upstanding, respectable action but with being a shady and distrustful character.”

Goffman’s own material demolishes this thesis. On the Run documents a world of predation and law-of-the-jungle mores, riven with violence and betrayal. Far from being the hapless victims of random “legal entanglements”—Goffman’s euphemism for the foreseeable consequences of lawless behavior—her subjects create their own predicaments through deliberate involvement in crime.

In 2002, when Goffman began her acquaintance with Sixth Street, the half-brothers Chuck, Reggie, and Tim were 18, 15, and nine, respectively. All had different fathers by the same crack-addict mother, Miss Linda. Their Section 8–subsidized house reeked of vomit, alcohol, and urine; roaches and ants crawled over the inhabitants as well as the furniture; cat feces covered a kitchen corner. Chuck’s and Reggie’s arrest records had begun in their early teens; Tim would graduate from middle school to the juvenile courts when he turned 12. Fatherlessness is a virtually universal condition among the young men in Goffman’s tale, but gradations exist within it. Chuck’s father came around during his early years, which helps explain, says Chuck, “why [Chuck] knew right from wrong and his young brothers did not”—a poignant acknowledgment of the role of fathers in raising sons, even if its premise (that Chuck knows right from wrong) is questionable.

On Sixth Street, drug dealing is tantamount to a bourgeois occupation. Chuck complains that his middle brother, Reggie, lacks the patience for “making slow money selling drugs hand to hand.” Instead, Reggie favors armed robberies, to the admiration of his mother, Miss Linda. “He fearless,” she says. “A stone-cold gangster.” It would be a mistake, however, to think of drug dealing as a peaceful activity. Early on, a disgruntled supplier firebombs Chuck’s car. Chuck responds by shooting at the supplier’s home. In 2007, at the end of Goffman’s chronicle, Chuck is fatally shot in the head while standing outside a Chinese restaurant, one of three shootings that night in Philadelphia. The killer, Goffman writes, was “trying to make it at the bottom rung of a shrinking drug trade.”

Accompanying this drug-related violence is a more random violence that springs from dog-eat-dog exploitation and lack of impulse control. In an earlier incident, Goffman’s fourth main character, Mike, another crack dealer, is walking home one night with a large wad of cash from a dice game. An armed robber accosts him—presumably tipped off to Mike’s stash by the other players. Mike tries to pull his own gun but gets shot in the hip first. Several days later, Mike sees the gunman in a Buick and opens fire. Two days after that, Mike and his attacker drive past each other, guns blazing. Mike’s car takes seven bullets, and he starts wearing a bulletproof vest. During another dice game, a young thug from Sixth Street named Tino puts a gun to a fellow player’s head and demands his money. His target, Jay Jay, refuses, so Tino, who is high on PCP, kills him. Jay Jay’s fellow crew members take to driving up and down Sixth Street firing at residents. Chuck gets shot in the neck—this time, not fatally—and his friend Steve is hit in the thigh.

Ned, 43, supports himself in part by stealing credit cards and intercepting checks in the mail. When he and his girlfriend Jean, a crack addict, need money for property taxes, they lure a cousin of Reggie’s (Miss Linda’s second son) to their house with the promise of gossip about a former girlfriend. Waiting there is a man in a hoodie, who robs the cousin at gunpoint. The unintended punch line of the story: Ned and Jean also get income from working as foster-care parents, a fact that does not apparently give Goffman pause but that speaks volumes, sadly, about the quality of parenting in the area.

Theft is constant among Sixth Street residents. Mike invites a man he met in prison to play video games at his mother’s house. The guest steals the stereo, DVD player, and two TVs. Anthony, another Sixth Street resident, was thrown out by his mother for stealing from her purse; neighbors later turned him in to the police on a warrant after he stole their shoes. When he stays at Miss Linda’s, he grumbles that he cannot save money because she steals from him while he sleeps. Mike gives Anthony crack to sell, but Anthony cannot shoot fellow dealers who steal from him, since his usual whereabouts at night are widely known, making him an easy target. As a result, he is not a very effective drug dealer.

The characters’ mishaps often resemble farce. Reggie, on the run for a drug crime, takes refuge in his mother’s house. Miss Linda instructs him to leave before midnight, but he falls asleep. When a SWAT team arrives, Miss Linda persuades them not to go upstairs, and Reggie jumps out the bedroom window and flees into the alley, like Cherubino leaping from the Countess’s window in The Marriage of Figaro. Mike gives himself a birthday party, and the guests start stealing liquor bottles. He sets up sentry on the windowsill, gun on his lap, threatening to pistol-whip the next guy who takes a bottle. But he, too, falls asleep, and a guest lifts a wad of cash from his pocket.

After the police find Reggie cowering in a shed one day, he is sent to the county jail. He wallows in self-pity because his Sixth Street male friends are not visiting him or putting money into his commissary account. “Niggas ain’t riding right! Niggas ain’t got no respect,” he complains to Goffman. “When I come home, man, I’m not fucking with none of these niggas. Where the fuck they at? They think it’s going to be all love when I come home, like, what’s up, Reggie, welcome back and shit . . . but fuck those niggas, man, they ain’t riding for me. I got no rap for them when I touch.”

The residents’ chaotic sex lives generate further farcical situations—if one can overlook for a moment the consequences for their children. Virtually every male has a baby mom and a simultaneous collection of girlfriends; the females have children and their own series of boyfriends. After a prison term, Mike is sentenced to a halfway house in North Philly. He starts sleeping with a caseworker there named Tamara. Mike violates curfew and winds up back in prison. He tries to ensure that Tamara’s visits are on different days from those of Marie, the baby mom of his two children. One day, however, Tamara shows up unexpectedly, “ostensibly,” Goffman qualifies, to visit her inmate brother. Tamara sees Marie and Mike sitting across from each other and says hello. Marie sizes up the situation and announces loudly: “I ain’t drive five fucking hours for this shit.” Mike tries to quiet Marie down—like Don Giovanni trying to hush up Donna Elvira—but she retorts: “You fucked her, didn’t you.” Tamara announces loudly to her brother that she really likes Mike and hopes that he is not still messing with his baby mom, while Marie conspicuously plays with Mike’s hair. Mike starts talking loudly to cover up Tamara’s monologue to her brother while looking desperately at Goffman to rescue him. Marie stands up and leans in for a kiss, which Mike, cornered, supplies. Tamara ends up in tears.

But the sexual complications usually take on a more depressing aspect. At the hospital where Chuck has died after his head wound, his “on-again-off-again girlfriend,” Tanesha, shows up, but everyone wonders “where the hell Chuck’s baby-mom Brianna was.” Miss Linda asks Goffman to give the Pampers money, which the author had promised her, to Tanesha, who is looking after Chuck’s two daughters until Brianna can be located. This is not an arrangement likely to end well.

False incriminations are pervasive. When Mike was 24 and his children were three and six, he started dating a woman from North Philly named Michelle. He had high hopes for her, he tells Goffman, since, as a Puerto Rican, she should be more loyal than the “black chicks” who “love the cops” and turn in their boyfriends. Moreover, Michelle’s father and brothers sold drugs, so she was well accustomed to criminal proceedings. Michelle said that she loved Mike more than any man she had ever met, including her three-year-old’s father, then serving a ten-year federal prison sentence for an undisclosed crime. But Mike misses a court appointment, and a warrant issues for his arrest. The police find drugs and a gun in his apartment, which he tries to pin on Michelle and her father. The police show Michelle Mike’s statement against her, as well as his texts and phone calls to Marie that indicate that he is still involved sexually with his baby mom. Indignant, Michelle tells the police everything she knows about his drug dealing. Mike writes her from jail: “Don’t come up here, don’t write, don’t send no more money [this last mandate entailing heroic self-sacrifice, no doubt]. . . . You thought I wasn’t going to find out that you a rat? . . . Fuck it. I never gave a fuck about you anyway. You was just some pussy to me and your pussy not even that good!”

But Mike is the victim of double-crossing as well. He acts as godfather to a young, hoodie-wearing tough named Ronny, a close competitor to Miss Linda’s son Reggie for the status of Sixth Street’s most loathsome figure. Ronny started carrying a gun at 13 and shot himself in the leg while boarding a bus at 15. He periodically gets kicked out of school for such offenses as hitting his teacher and trying to steal his principal’s car. He brags to Goffman that he has slept with women older than she (she was then 21). Most of his days are spent running from truant officers and serving suspensions. One night, when Ronny was 16, he and some Sixth Street associates try to break into a motorcycle store on the outskirts of Philadelphia to steal motorbikes. They fail to get into the store and, when their Pontiac doesn’t start, are unable to make their getaway. Ronny calls Goffman and Mike at 2 AM to pick him up. (Mike is, at that point, living in Goffman’s apartment, along with Chuck.) The silent alarm in the motorcycle dealership has already alerted the police. They arrest Ronny and Mike, and in the stationhouse, Ronny falsely incriminates Mike as the mastermind behind the break-in. The police let Ronny go and charge Mike with attempted breaking and entering. Mike spreads the word that Ronny is a snitch. Eager to redeem his reputation, Ronny burgles a house in Southwest Philly with Mike’s gun and pays Mike’s bail with the proceeds from the stolen TV, stereo, and jewelry.

This lawlessness cascades into the legal economy as well. Health-care workers steal antibiotics and medical supplies from their employers to provide to their fugitive friends who are fearful of being apprehended at a hospital. University of Pennsylvania law professor Regina Austin has approvingly referred to such “pilfering employees [who] spread their contraband around the neighborhood” as occupying the “good middle ground between straightness and more extreme forms of law-breaking.”

Goffman looks at this unending stream of lawless behavior and sees only the helpless pawns of a mindlessly draconian criminal-justice system: “Since the 1980s, the War on Crime and War on Drugs have taken millions of Black young men out of school, work, and family life, sent them to jails and prisons, and returned them to society with felony convictions.” Actually, it is these men’s own consistently bad decisions that remove them from lawful society. “Felony convictions” do not simply fall from the sky; they result from serious criminal activity—and persistence at criminal activity, at that—required to induce a district attorney actually to seek a felony charge and possibly a trial. If any of Goffman’s subjects made a disciplined effort at “school, work, and family life,” she forgot to include that detail.

Revealingly, Goffman explains how she arrived at her incongruous interpretation of Sixth Street’s malaise. As a graduate student at Princeton, she had been casting about for a theme for her still-growing ethnographic material. Princeton was a “hotbed” of mass-incarceration theory, she says, which holds that American prison practices have “cease[d] to be the incarceration of individual offenders and [have become] the systematic imprisonment of whole groups,” in the words of sociologist David Garland. Eureka! Under the tutelage of Bruce Western and other criminal-justice critics (and with obvious influence from the writings of Michel Foucault), Goffman comes to see that her “project could be framed as an on-the-ground look at mass incarceration and its accompanying systems of policing and surveillance. I was documenting the massive expansion of criminal justice intervention into the lives of poor Black families in the United States.”

Yet Goffman’s material refuses to conform to this template. To her credit, she devotes a chapter to “clean people”—individuals who have no dealings with the criminal-justice system. A group of young men on Sixth Street try to steer as clear as possible from the “dirty people.” They remain at home at night, playing video games together. They drink beer, rather than smoke marijuana, because there are drug tests at their jobs, which include security guard, maintenance man, and convenience-store clerk. If they lose their jobs, they don’t start dealing drugs; they rely on friends and family until they find another position. When they break traffic laws, they pay off their fines and recover their driving licenses before they start driving again. Their unassuming rejection of criminality comes as an enormous relief after the squalid behavior of Goffman’s closest associates. Their respect for the law should be celebrated and studied, as Robert Woodson has long advocated.

Remarkably, however, Goffman tries to shoehorn even these law-abiding individuals into her mass-incarceration framework, resulting in the most incoherent passage in the book: “In a community where only a few young men end up in prison, we might speak of bad apples or of people who have fallen through the cracks,” she writes. “Given the unprecedented levels of policing and imprisonment in poor Black communities today, these individual explanations make less sense. We begin to see a more deliberate social policy at work. In that context simply bearing witness to the people who are avoiding the authorities and the penal system seems worth a few pages. The people featured here are all, in a variety of ways, leading clean lives in a dirty world. In so doing, they demonstrate that the criminal justice system has not entirely taken over poor and segregated Black neighborhoods like Sixth Street, only parts of them.”

It would be more accurate to say that the clean people demonstrate that criminals have “not entirely taken over poor and segregated Black neighborhoods like Sixth Street.” The fact that the criminal-justice system distinguishes people who break the law from those who do not shows precisely that “individual explanations” for who gets incarcerated are accurate, not mystifying. The clean people do not run from the police because they are not wanted by the police. Even more absurd is Goffman’s ascription of a “deliberate social policy” of oppression to the prosecution of crime. If such a policy existed, there would be no reason to make exceptions for anyone.

Goffman’s thesis that the supervision of offenders creates more crime also lacks support in her reportage. She claims that the enforcement of warrants for missed court dates, probation violations, and unpaid court fees drives the Sixth Street drug dealers and thieves underground, preventing them from joining the “clean” world. But she never reveals why her subjects miss their court dates. Do those court obligations inflexibly interfere with job schedules in the legal economy? She would have said so. Instead, these drifting drug dealers most likely simply lack the organization and will to make their court appointments. Goffman herself notes that many a Sixth Street resident who blamed his joblessness on his fugitive status made no effort to find work when he had no outstanding warrants. As for testing dirty for drugs in violation of parole or probation conditions, no one forces a parolee to take drugs. Goffman gives us no reason to think that these thugs would behave better with less supervision; nor does she suggest what a court’s response should be when they go AWOL. (UCLA professor Mark Kleiman advocates the use of “flash incarceration” for parole and probation violations—short stays in a local jail, rather than prison, swiftly meted out. It is not clear that such an option was available to Philadelphia prosecutors and judges. In any case, flash incarceration possesses little deterrent value for seasoned criminals.)

Goffman’s most persuasive critique of the justice system is that court fees are imposed on defendants who lack the means to pay them, resulting in a vicious cycle of judgments for nonpayment and further warrant enforcement and incarceration. (A U.S. Justice Department report, written in the aftermath of the police shooting of Michael Brown last August, lodged this complaint against Ferguson, as well, and it is a growing focus of academic attention.) Here, too, though, Goffman shows no instance of someone making a good-faith effort to pay his fees. While her young men are not prosperous, she mentions Mike’s sizable collection of worldly possessions, which include cars, motorbikes, sneakers, speakers, jewelry, and CDs. Some men may indeed lack the resources to pay their court fines, in which case the system is self-defeating; but it is also quite possible that they choose to spend their money on other things, such as drugs and sneakers.

On the Run unwittingly demonstrates why police presence is heavy in black inner-city neighborhoods. Goffman mentions just one fatal police shooting: Anthony had shot at undercover officers in an alley, thinking that they were gang rivals; they returned fire and killed him. Otherwise, and contrary to the claims of the Black Lives Matter movement, her young black men overwhelmingly die at one another’s hands, such as a friend of Chuck’s, shot while exiting Goffman’s car outside a bar. The clean people of Sixth Street do not complain about the police; indeed, Miss Linda’s father, a retired postal clerk, regularly calls the cops on his grandsons and welcomes the heavy police activity in the neighborhood. Even the Sixth Street criminals try to get themselves arrested when the local gang violence becomes too hot; prisons and jails are the only place they feel safe.

Goffman claims to have witnessed officers beating up suspects 14 times in 18 months of daily observation and asserts that the Philadelphia Police Department has an official, if sub rosa, policy of pummeling suspects who so much as put a finger on an officer. She also claims, without a source, that the cops routinely steal cash during drug raids. (She doesn’t mention the alleged deficiencies in the department’s deadly force training, for which it is criticized in another recent Justice Department report, which also noted that black and Hispanic officers were far more likely than white officers to shoot black civilians based on a mistaken perception of threat.) Such brutality and corruption, if true, must be punished and eradicated. (One should note, though, in assessing Goffman’s credibility in such matters, that her loathing of the police is such that she develops a fear of white men in particular, and white people more generally.) But such police misconduct, if it exists—as it did in North Charleston, South Carolina, where Walter Scott was shot to death in wholly unjustified circumstances—does not mean that lawful police activity is any less needed in neighborhoods still plagued by violence and other forms of disorder. Philadelphia’s high crime rate has been a perennial drag on its economy. Data-driven policing and the incarceration buildup that Goffman and her mentors so decry resulted nationally in the steepest crime drop in modern history (especially in New York), saving countless inner-city lives, both clean and dirty. At the end of the book, Reggie and Tim are serving long prison sentences. We have no reason to believe that those punishments were not deserved.

It is remarkable enough that Goffman, seeing the lawless behavior of Sixth Street’s “dirty people,” still views them as helpless victims of a racist criminal-justice system. She has clearly been captured by her subjects. After Chuck is killed, she chauffeurs Mike around the neighborhood, Glock in his lap, as he seeks to find and gun down the murderer. She feels “ashamed and sorry” about being white, when Miss Linda’s extended family complains about there being a white girl in their midst. (Such pervasive antiwhite antagonism is perhaps the best-kept secret about black inner-city culture.) Goffman refuses to give the police information about the crimes she has witnessed.

But it is even more remarkable that so many influential readers have bought Goffman’s thesis that law enforcement is the predominant source of trouble in her subjects’ lives. Journalist Malcolm Gladwell, lauding the book in The New Yorker, draws the conclusion that the criminal-justice system blocks black criminals and their progeny from entering the middle class, unlike its earlier treatment of the Mafia. Harvard’s Christopher Jencks, writing in The New York Review of Books, rues the “terrible collateral damage inflicted on the young black men of Sixth Street by their interminable struggle with the police”—echoing Goffman’s contention that such struggles simply happen, rather than being the result of voluntary behavior. Like Goffman, her well-placed readers focus on the consequences of crime for the criminal and ignore the crime itself. On the Run could have been a needed corrective to the post-Ferguson conceit of a racist justice apparatus arbitrarily descending on helpless black communities. But it is not being received that way. Instead, the book’s reception has demonstrated how ineradicably committed liberal elites are to the belief in black victimhood. And that belief, continuously fed to the street by the advocates and the media, means that police-community relations in New York and other American cities will continue to be fraught with tension and danger.

City Journal


A Monument to Tastelessness

Photo by Steven Severinghaus

On a recent visit to New York City, I had the opportunity to walk around the exterior of the new Whitney Museum, built at a cost of $442 million. It is a monument of a kind: to the vanity, egotism, and aesthetic incompetence of celebrity architects such as Renzo Piano, and to the complete loss of judgment and taste of modern patrons.

If it were not a tragic lost opportunity (how often do architects have the chance to build an art gallery at such cost?), it would be comic. I asked the person with whom I was walking what he would think the building was for if he didn’t know. The façade—practically without windows—looked as if it could be the central torture chambers of the secret police, from which one half expects the screams of the tortured to emerge. Certainly, it was a façade for those with something to hide: perhaps appropriately so, given the state of so much modern art.

The building was a perfect place from which to commit suicide, with what looked like large diving boards emerging from the top of the building, leading straight to the ground far below. Looking up at them, one could almost hear in one’s mind’s ear the terrible sound of the bodies as they landed on the ground below. There were also some (for now) silvery industrial chimneys, leading presumably from the incinerators so necessary for the disposal of rubbishy art. The whole building lacked harmony, as if struck already by an earthquake and in a half-collapsed state; it’s a tribute to the imagination of the architect that something so expensive should be made to look so cheap. It is certain to be shabby within a decade.

Almost as interesting to me as the building itself was Michael Kimmelman's “criticism” of it in the New York Times. I have seldom read a piece of criticism in which the fundamental question was avoided in so pusillanimous a fashion, and in which the writer so delicately refrained from passing aesthetic judgment, presumably from fear of disagreement or appearing reactionary.

At no point did Kimmelman offer a clear indication of whether he considered the building good or bad, beautiful or ugly. Instead, he used locutions such as the following, compatible with any value judgment whatever: “It ratifies Chelsea;” “The museum becomes . . . an outdoor perch to see and be seen;” “Mr. Piano’s galleries borrow from the old downtown loft aesthetic;” “They’re nonprescriptive places . . . that may prove to be the ticket.”

Or, of course, “they may end up a headache.” “But it is a deft, serious achievement, a signal contribution to downtown and the city’s changing cultural landscape;” though, on the other hand, “The new museum isn’t a masterpiece.” But it’s an “eager neighbor;” and “it also exudes a genteel eccentricity that plays off the rationalism of Mr. Piano, and of Manhattan’s street grid.”

All this makes Buridan’s ass seem positively decisive. Kimmelman continues: “I’m reminded of the Pompidou Center in Paris, which Mr. Piano designed with Richard Rogers. The breakthrough there was not just the inside-out factory aesthetic but the development of a populist hangout . . . .” Not only does Kimmelman make the building sound like new, but unpleasant, cancer therapy, he also forgets that public executions were also “a populist [or is it popular?] hangout,” and probably would be still if carried out.

With architectural critics like this, no wonder celebrity architects get away with it.



Playing at Protest

Spencer Platt/Editorial/Getty Images

Civil disobedience historically involves a demonstration by an aggrieved people against their government. In an unusual twist, New York City is now governed by aggrieved politicians who demonstrate against the people. Standing in traffic, picketing small businesses, and interrupting their own meetings, Gotham’s elected representatives enjoy the privileges of power while simultaneously donning the mantle of the oppressed.

Last November, 15 New York city council members interrupted a meeting of the full council to protest the non-indictment of Ferguson, Missouri, police officer Darren Wilson for the shooting of unarmed teenager Michael Brown. As the presiding officer banged a gavel and lawmakers stomped out of the chamber, Council Member Andrew King read a prepared statement proclaiming that “black lives matter.” The protesting members stood in the lobby of City Hall for five minutes chanting “hands up, don’t shoot,” before heading back inside to take their seats and resume regular business. Two weeks later, following the non-indictment of NYPD officer Daniel Pantaleo for the death of Eric Garner in Staten Island, 25 council members and a contingent of supporters stood in the middle of Broadway chanting “I can’t breathe,” before staging a “die-in” in front of City Hall. Then the assemblage trooped inside, again begging “hands up, don’t shoot” to an imaginary phalanx of hostile troops, before attending their scheduled meeting.

These spectacles were widely reported as “civil disobedience,” and Mayor Bill de Blasio spoke approvingly of them as such, noting that “it’s part of our values as Americans” to protest perceived injustices. That nobody was arrested or even admonished, or that the police assigned to City Hall facilitated the disruption of a public roadway by two dozen elected officials, failed to temper the lawmakers’ sense that their actions constituted a bold strike against the powers that be. Even when elected officials in New York City get arrested for supposed acts of civil disobedience, they are almost immediately released and face only the slightest inconvenience. Council Member and Deputy Majority Leader Jumaane Williams, a protest habitué, acknowledged as much on Twitter, noting that “pre-planned civil disobedience participant arrests don’t get processed the same” as those of ordinary civilians.

No recent example exists of a city politician spending the night in jail or otherwise suffering any of the discomforts one might expect of the truly politically committed. Council Member Brad Lander, ideological mastermind of the council’s progressive wing, was arrested in March for protesting at a car wash in Brooklyn. Lander and Council Member Carlos Menchaca, who was also arrested, were participating in a labor action by eight employees of the Vegas Auto Spa, who are suing the car wash to gain union recognition and back pay. The two council members were led away in handcuffs, smiling broadly, and were apparently released immediately. Writing on his blog, Lander noted that “a few minutes ago—in an act of civil disobedience—I was arrested . . . for standing up against the injustices happening at our neighborhood car wash.” One imagines that a typical arrestee in New York is not able within “minutes” to broadcast on social media his heroic stance against injustice.

Properly understood, civil disobedience is resistance against an authoritarian state. A powerful elected official involving himself in a labor dispute against a tiny local business scarcely rises to Gandhian levels of moral courage. In New York’s caricature of civil disobedience, politicians pretend to protest, and the police pretend to arrest them.


The Food Stamp Pirouette

Photo by U.S. Department of Agriculture

You’re not alone if you’ve ever wondered how the government at all levels can spend about $1 trillion per year on “anti-poverty” programs, while the poverty rate never goes down. By the federal government’s official standard, the poverty rate stood at about 15 percent when the War on Poverty began more than 50 years ago—and it remains about 15 percent today, despite more than $20 trillion of anti-poverty spending since then. Could all that money really have had no effect? The question puts government officials in a tricky position. On the one hand, they feel duty-bound to defend the effectiveness of the spending; on the other, any major reduction in the official measure of poverty risks undermining political support for continued and increased spending.

Congressional budget trimmers are currently eyeing the federal Supplemental Nutrition Assistance Program, a.k.a. SNAP, a.k.a. food stamps, which is administered by the Department of Agriculture. The recent budget resolution envisions converting the program into block grants to the states, a move that would put about three-quarters of the DOA out of business. In late March, Secretary of Agriculture Tom Vilsack took to the op-ed pages to defend the program against such a move.

Among the many federal anti-poverty programs, food stamps play a role that seems almost magical. The program currently spends approximately $80 billion per year, supposedly to combat poverty and hunger. Yet in official Census Bureau poverty statistics, food stamps are defined as “in-kind” benefits that don’t count as part of the “cash-income” poverty measure. And the food stamp program also somehow manages not to make a dent in the government’s most-cited proxy for “hunger”—the annual “food insecurity” survey, also administered by Vilsack’s department, which asks people whether they felt “food insecure” at any time during the past year. Since food stamps require recipients to make a budget last for a month, many beneficiaries understandably answer that question in the affirmative. And thus, somehow, the number of people declaring food insecurity has barely declined at all during the Obama presidency, even as the number of food stamp recipients has mushroomed, from about 32 million to more than 46 million.

Some might say spending $80 billion to expand food stamp enrollment and leave the official poverty rate untouched was the whole idea: if the program dramatically reduced measured poverty and food insecurity, people might think that these problems are getting solved, and the support for continued spending increases could erode. Still, you would think that the Department of Agriculture would feel embarrassed about the program’s apparent failure. How is it conceivable that the United States could spend so much every year on a food-for-the-poor program that has exactly zero effect?

So Secretary Vilsack has a difficult position to defend. In his op-ed, he took his best shot: “SNAP continues to reduce poverty. . . . More than 4.8 million Americans, including 2.1 million children, are lifted out of poverty when SNAP benefits are counted as income. . . . Rather than arbitrarily taking a budget axe to a program with a proven record of effectiveness and declining costs, common sense tells us that we should instead be working together to put those SNAP recipients who can work back to work.”

See how he snuck in that SNAP reduces poverty “when SNAP benefits are counted as income”? Except SNAP benefits are not counted as income when the Census Bureau measures poverty—and no one has heard Vilsack demand that Census change its methodology. Vilsack and his cohorts want to have it both ways: when they want to show lots of people in poverty in order to sell the public on more anti-poverty funding, then food stamps don’t count; but when they want to tout the program’s success, then suddenly food stamps do count.

You have to admire Vilsack’s brazenness in describing SNAP as “a program with a proven record of effectiveness.” By the metrics that Washington provides—namely the official poverty and “food insecurity” rates—SNAP may be the most ineffective use of money in the entire federal budget. Because of the way the methodology is designed, the two leading metrics never budge, and they never will, even if SNAP spending is doubled or tripled or quadrupled—or, for that matter, cut in half.


Freedom From Choice?

Photo by Pogonici

Free: Why Science Hasn’t Disproved Free Will, by Alfred R. Mele (Oxford, 99 pp., $14.95)

In his Philosophical Investigations, Ludwig Wittgenstein complained that “in psychology there are experimental methods and conceptual confusion.” What he meant is that academic psychologists too often interpret empirical evidence in light of unexamined and dubious metaphysical assumptions. What is presented as good science is really just bad philosophy.

The recent spate of neuroscientific and psychological literature claiming to show that free will is an illusion provides a case in point. Philosopher Alfred Mele’s new book, Free, is a brief, lucid, and decisive refutation of these arguments. Mele demonstrates that scientific evidence comes nowhere close to undermining free will, and that the reasoning leading some scientists to claim otherwise is amazingly sloppy.

Perhaps the best known alleged evidence against free will comes from the work of neurobiologist Benjamin Libet. In Libet’s experiments, subjects were asked to flex a wrist whenever they felt like doing so, and then to report on when they had become consciously aware of the urge to flex it. Their brains were wired so that the activity in the motor cortex responsible for causing their wrists to flex could be detected. While an average of 200 milliseconds passed between the conscious sense of willing and the flexing of the wrist, the activity in the motor cortex would begin an average of several hundred milliseconds before the flexing. Hence the conscious urge to flex seems to follow the neural activity which initiates the flexing, rather than causing that neural activity. If free will requires that consciously willing to do something is the cause of doing it, then it follows (so the argument goes) that we don’t really act freely.

As Mele shows, the significance of Libet’s results has been vastly oversold. One problem is that Libet did not demonstrate that the specific kind of neural activity he measured is invariably followed by a flexing of the wrist. Given his experimental setup, only cases where the neural activity was actually followed by flexing were detected; Libet did not check for cases where the neural activity occurred but was not followed by flexing. Hence we have no evidence that that specific kind of neural activity really is sufficient for the flexing. For all Libet has shown, it may be that the neural activity leads to flexing (or doesn’t) depending on whether it is conjoined with a conscious free choice to flex.

There’s a second problem. The sorts of actions Libet studied are highly idiosyncratic. The experimental setup required subjects to wait passively until they were struck by an urge to flex their wrists. But many of our actions don’t work like that—especially those we attribute to free choice. Instead, they involve active deliberation, the weighing of considerations for and against different possible courses of action. It’s hardly surprising that conscious deliberation has little influence on what we do in an experimental situation in which deliberation has been explicitly excluded. And it’s wrong to extend conclusions derived from these artificial situations to all human action, including cases which involve active deliberation.

Even if the neural activity Libet identifies (contrary to what he actually shows) invariably preceded a flexing of the wrist, it still wouldn’t follow that the flexing wasn’t the product of free choice. Why should we assume that a choice is not free if it registers in consciousness a few hundred milliseconds after it is made? Think of making a cup of coffee. You don’t explicitly think, “Now I will pick up the kettle; now I will pour hot water through the coffee grounds; now I will put the kettle down; now I will pick up a spoon.” You simply do it. You may, after the fact, bring to consciousness the various steps you just carried out; or you may not. We take the action to be free either way. The notion that a free action essentially involves a series of conscious acts of willing, each followed by a discrete bodily movement, is a straw man, and doesn’t correspond to what common sense (or, for that matter, philosophers like Wittgenstein or Aquinas) has in mind when it comes to free action.

Other arguments against free will are no better. For example, in psychologist Stanley Milgram’s famous obedience experiments, participants were instructed to administer what they falsely supposed were genuine electric shocks to people who gave incorrect answers to questions put to them. Many participants reluctantly obeyed these commands even when they seemed to be causing severe pain. As with the neuroscientific evidence, some have argued that such data casts doubt on free will. But as Mele says, it’s difficult to see “exactly what the argument is supposed to be.” Is the claim that Milgram’s experimental setup made it inevitable that participants would obey? That can’t be it, because not every participant obeyed the commands. Is the idea merely that situations exist in which people find it difficult to disobey authority figures? If so, what defender of free will ever denied it?

Mele’s book shows that, if anyone has been too quick to follow authority, it’s those who swallow dubious philosophical claims merely because they are peddled by scientists.



Steps on an Upward Ladder

Photo by Sergey Nivens

Last week, a University of California study reported that Washington spends more than $150 billion annually on means-tested benefits—principally food stamps, Medicaid, Temporary Assistance for Needy Families, and the Earned-Income Tax Credit—for low-income workers. The usual suspects on the left wasted no time attacking what they termed “in effect, a huge subsidy for employers of low-wage workers.” Such attacks misrepresent the actual role of the social safety net in the labor market and stigmatize low-wage work, which offers the best opportunity for struggling people to improve their circumstances.

In practice, means-tested government programs generally function more as an enormous tax on low-wage work than as a subsidy. Consider that in 2012, the Congressional Budget Office reported that a single parent with one child and no income would receive approximately $20,000 in such benefits, while a single parent earning $20,000 would receive less than $5,000—creating an effective tax rate of more than 75 percent on the working single parent. Our social programs are thus telling the employer and employee: for every $1 you pay or earn, we’re going to take away almost $1 of the worker’s benefits. That’s some “subsidy.” The result is the same as for any tax: you get less of the taxed thing—in this case, fewer poor Americans entering the workforce.
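The phase-out arithmetic above can be sketched in a few lines of Python; the function name is illustrative, and the dollar figures are the rounded CBO estimates cited in the text:

```python
def effective_marginal_rate(benefits_at_zero, benefits_at_income, income):
    """Share of each earned dollar effectively lost to reduced benefits."""
    benefits_lost = benefits_at_zero - benefits_at_income
    return benefits_lost / income

# Single parent with one child: roughly $20,000 in benefits at zero income,
# under $5,000 in benefits at $20,000 of earnings (2012 CBO figures).
rate = effective_marginal_rate(20_000, 5_000, 20_000)
print(f"{rate:.0%}")  # prints "75%"
```

Losing $15,000 of benefits while earning $20,000 is arithmetically equivalent to a 75 percent marginal tax on those earnings—which is the sense in which the phase-out acts as a tax on work rather than a subsidy to it.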

That’s regrettable, because not only do low-wage jobs replace at least some government spending with productive economic activity; they are also critical to disrupting cycles of poverty and providing economic opportunity. If we want to reinforce self-sufficiency as a societal value, if we account for the myriad effects the presence of a wage-earner can have on a family, and if we recognize that such jobs are, for many people, the first step on an economic ladder that can reach higher, then encouraging the creation of entry-level positions—even at very low wages—should be a priority. There is no reason why we can’t embrace low-wage jobs while also seeking to improve the condition of those working in them.

From this perspective, the idea of dramatically raising the minimum wage—a favorite proposal of today’s progressive Left—makes little sense. It attempts to solve the problems associated with low-wage jobs by pretending that the alternative is a high-wage job, rather than no job. That approach might erase a troublesome statistical category, but it will mean many fewer of the positions that we should be trying to create as employers forgo hiring of workers whose productivity cannot support the higher wage. Nor, despite the lack of direct government spending required, is it free. Someone must pay the additional wages, and that funding often comes from higher prices passed on to the customers of low-wage employers, who are themselves often low-income families.

By contrast, subsidizing low-wage work brings more such jobs into existence. It can leave each worker at least as well off as a higher minimum wage would. And the cost is borne disproportionately by the higher-income Americans whose taxes fund the government spending.

The one true subsidy we have today for low-income Americans is rarely mentioned by critics of low-wage subsidies. The Earned Income Tax Credit is a direct subsidy for low-income families, paid only to those that earn income and designed to increase that income by up to almost 50 percent. It is also the rare anti-poverty program with broad bipartisan support, at least during those seasons of the year when rhetorical assaults on subsidies are not in fashion.

The University of California study sheds light on the struggles low-wage workers face—but its findings should encourage support for genuine low-wage subsidies that don’t penalize work. Instead of vilifying low-wage employers and low-wage subsidies, we should recognize the valuable role they play in our economy. The more we can reorient anti-poverty spending to function as a subsidy for low-wage work, the more effective it will be.


De Blasio’s Iowa Field of Daydreams

Photo by NYC Mayor’s Office

This week, Mayor Bill de Blasio traveled from New York, a state with one of the highest levels of income inequality in America, according to research by University of Washington professor Richard Morrill, to Iowa, a state with one of the lowest levels of inequality, in order to lecture Iowans on . . . how to end inequality! While in Iowa, the second-year mayor repeatedly blasted the rich for not paying enough in taxes. In one speech, he mentioned taxes some 20 times. What de Blasio neglected to explain to Iowans, though, is how New York, which already has some of the highest taxes in America (especially on the wealthy and on businesses), hasn’t managed to moderate or restrain the growth of inequality, while Iowa (which taxes its residents below the national average, according to the Tax Foundation) apparently has.

De Blasio was in Iowa because prospective presidential candidates are visiting the state, and the mayor wants to ensure that income inequality figures prominently in the discussion. But once de Blasio ventured outside of his comfort zone of bashing the wealthy—especially the Wall Street executives and hedge-fund managers whom he loves to treat as villains—his policy proposals were anything but coherent. Two of de Blasio’s national agenda items, for instance, are to hike the minimum wage and require all businesses to grant mandatory sick-leave time to workers. But hedge funds and Wall Street firms already pay workers way above the minimum wage, of course, and offer some of the best employee benefits around. The real burden of de Blasio’s solutions would fall not on them, but on small firms, many of which struggle to make a profit and don’t have extra cash lying around. That’s why decades of research by economists have shown that raising the minimum wage almost always increases unemployment, as hard-pressed small firms hire fewer workers.

De Blasio couldn’t very well start off his Iowa speech bashing small businesses, but he’s no friend to them, as his mayoralty is showing back in New York. When the uber-progressive New York City Council held hearings on a law to expand mandatory sick leave, for instance, the council chambers overflowed as de Blasio administration representatives testified in favor of the idea. After they finished, the de Blasio team as well as most city council members walked out—before small-business groups, including several minority business associations, could share their thoughts on the matter. So a nearly empty chamber didn’t hear the head of a 200-member Hispanic supermarket association testify that the legislation—which not only required mandatory sick leave but also necessitated a significant increase in paperwork for businesses—“could create havoc with small independent supermarkets,” especially as “this burden falls on supermarkets just as they face other burdens, like the Affordable Care Act.”

While in the Hawkeye State, de Blasio also felt compelled to explain how he would spend the proceeds from all the new tax money he covets. One of his big agenda items is universal pre-K across America—a prime example of how New York’s mayor is a “spend first, worry about results later” politician. Pre-K is not a new idea—it’s been around for decades and academics have studied it extensively. The singular result of these studies is that pre-K demonstrates little lasting educational value for most kids, with the sole exception being some small, well-run programs for poor children, who seem to benefit from a head start on schooling. But even that small advantage almost always vanishes by third grade. Universal pre-K’s real appeal is as a tool for teachers’ unions to boost membership, one reason why the issue has such passionate advocates.

The mayor’s other big-spending idea is to pour more money into infrastructure. But the rest of the nation might pause before taking advice from a New Yorker on the subject. As ’s Aaron Renn noted recently, the city wastes billions of dollars on poorly conceived and poorly executed infrastructure projects, thanks to conscious decisions to overspend by ignoring union featherbedding, mandating “buy American” programs for materials, and requiring protracted environmental reviews. De Blasio recently urged Congress to fund more infrastructure spending, but as Renn noted, “how can New York demand Congress do its job if the city and region won’t take care of its own by doing its part to stop this [spending] insanity?”

A closer look at de Blasio’s message in Iowa reveals that his broader theme is about supporting a labor-union-friendly agenda—but that isn’t a cure for inequality, either. It certainly hasn’t worked out that way in New York, where one-quarter of all workers are unionized—more than double the percentage in more-equal, right-to-work Iowa, where only 11 percent of workers belong to unions. One can only hope that Iowans recognize the quality of the advice they’re getting from New York City’s mayor.



Free Speech in Peril

Shut up or die. It’s hard to think of a more frontal assault on the basic values of Western freedom than al-Qaida’s January slaughter of French journalists for publishing cartoons they disliked. I disagree with what you say, and I’ll defend to the death my right to make you stop saying it: the battle cry of neo-medievalism. And it worked. The , in reporting the massacre, flinched from printing the cartoons. The London showed the magazine’s cover but pixelated the image of Muhammad. All honor to the and the for the courage to show, as the latter so often does, the naked truth.

Illustration by Arnold Roth

The Paris atrocity ought to make us rethink the harms we ourselves have been inflicting on the freedom to think our own thoughts and say and write them that is a prime glory of our Bill of Rights—and that its author, James Madison, shocked by Virginia’s jailing of Baptist preachers for publishing unorthodox religious views, entered politics to protect. Our First Amendment allows you to say whatever you like, except, a 1942 Supreme Court decision held, “the lewd and obscene, the profane, the libelous, and the insulting or ‘fighting’ words—those which by their very utterance inflict injury or tend to incite an immediate breach of the peace,” though subsequent decisions have allowed obscene and profane speech. A 1992 judgment further refined the “fighting words” exemption, ruling that the First Amendment forbids government from discriminating among the ideas that the fighting words convey, banning anti-Catholic insults, for example, while permitting slurs against anti-Catholics. In other words, government can’t bar what we would now call “hate speech”—speech that will cause “anger, alarm or resentment in others on the basis of race, color, creed, religion or gender.”

This expansive freedom prevails nowhere else on earth. European countries, and even Canada, have passed hate-speech laws that criminalize casual racial slurs or insults to someone’s sexual habits. An Oxford student spent a night in jail for opining to a policeman that his horse seemed gay. France, which has recently fined citizens for antigay tweets and criminalized calls for jihad as an incitement to violence—a measure that our First Amendment would allow only if the calls presented a “clear and present danger”—also (most improperly) forbids the denial of crimes against humanity, especially the Holocaust. The pope has weighed in as well, with the platitude that no one should insult anyone’s religion—or his mother.

I am as scandalized by Holocaust denial or race baiting as anyone else, but I think Madison right to say that the proper response is not criminalization but argumentation. In a remarkable foreshadowing of John Stuart Mill’s 1859 classic, , Madison wrote in 1800 that it is to free speech and a free press, despite all their abuses, that “the world is indebted for all the triumphs which have been gained by reason and humanity, over error and oppression.” Only out of freewheeling discussion, the unbridled clash of opinion and assertion—including false, disagreeable, and unpopular opinions, Madison believed no less than Mill—can truth ultimately emerge. So it is troubling to see that the camel of repression has gotten his nose under the Constitutional tent by a law allowing the prosecution of bosses for tolerating speech by some employees that allegedly creates a “hostile environment” for others. The Court ought to squelch such an affront to the First Amendment. And it is equally troubling that state and federal laws have created such a thing as a “hate crime.” All that should matter to the law is whether the perpetrator of a crime acted with criminal intent, not whether that intent rested on an outlandish opinion.

As John Stuart Mill observed in , though, it is not law but “stigma which is really effective” in silencing “the profession of opinions which are under the ban of society.” So it ought to be with Holocaust denial or racial slurs. Yet when scorn stifles the free expression of opinion that is unorthodox or unfashionable, but over which reasonable men can differ—and that could prove incontestably true, as has happened often enough—trouble begins.

Let me give you an example from my own experience. Over the course of a year or two as the 1970s turned into the 1980s, I lost all my friends, for saying what I had recently come to believe. I was teaching at Columbia, and my friends were my English department colleagues, along with some of what used to be called the New York Intellectuals. But I was moving rightward politically, pushed by the reality I saw all around me in emphatically ungentrified Morningside Heights.

In those days, the War on Poverty was in full swing. Welfare, as a kind of reparation for racism, was a come-and-get-it proposition, and as newly destigmatized out-of-wedlock childbearing skyrocketed, one in seven New Yorkers went on the dole. Meanwhile, make-work, affirmative-action jobs on the city payroll mushroomed, along with taxes to fund them. Tax-subsidized housing projects loomed like menacing outposts of disorder over down-at-the-heels neighborhoods like mine and threatened to invade such bastions of hard-won, quasi-suburban middle-class respectability as Forest Hills, Queens. Though the era’s national emphasis on school desegregation had turned the focus of urban education from learning to racial equality, here in New York a bitter, racially incited 1968 teachers’ strike had pushed it more toward racial antagonism, as happened in Boston six years later, when court-ordered school busing for racial integration began. Also under the banner of racial equality, New York’s public colleges had opened their doors to all comers, ready or not; and, as graduation rates fell into the teens, standards fell still faster, so that few of the small band of graduates could get real jobs, and few real learners could get real learning.

The streets and parks grew squalid and menacing, as police turned a blind eye to so-called victimless crimes, from loitering, disturbing the peace, and public urination, to retail dope dealing and solicitation by prostitutes of every gender. Deinstitutionalized madmen panhandled with desperate aggression, when they weren’t too far gone merely to babble or hit. On my stretch of Broadway, able-bodied bums, seeing what easy touches my students were, swelled the beggarly ranks, with no interference from the police, wary of accusations of racism—and one bum killed one of my neighbors. We all lived in constant fear of violence, for crime became epidemic, nasty, and brutish.

So all my liberal nostrums had gotten a fair trial, and this was the result. If we do not learn from reasoning upon our observation and experience, what do we learn from?

Maybe the criminal isn’t a victim, I hazarded at one dinner party. Maybe he’s to blame for his actions, not “society.” Maybe the real victim is, well, the victim. Shocked silence, as if I had flatulated. “That’s racist,” one guest muttered to her plate, tacitly admitting the not-to-be-mentioned truth that criminals were disproportionately minority. Then conversation resumed on another topic, as if no noxious disturbance had occurred—certainly not one that polite society would acknowledge. In those days, every right-thinking person knew that crime had its “root causes” in poverty and racism, and to understand that was to excuse the criminal, who might even be a justified, if somewhat heavy-handed, rebel against oppression, for which we around the comfortably plentiful dinner table were ultimately responsible.

Later, I opined to another friend, a music professor, that rent control was an injustice to the landlord, confiscating what was rightfully his—and this in my friend’s rent-controlled apartment. “Do you want me to be homeless?” he spluttered incredulously. “Do you want to evict me from New York?” However tactless—one doesn’t speak about the Fifth Amendment takings clause in the house of the rent-controlled—I really wasn’t being personal. But alas, so ended another long and cherished friendship.

But however gauche, such opinions stood the test of experience. When I said these insensitive things, New York was dying, with 1 million well-educated and prosperous residents, along with two-thirds of its big corporate headquarters, streaming out of town. But the following two decades of activist policing that treated a crime as a crime regardless of the race of the criminal or victim, along with the realization that the victim of “victimless” crimes against public order was the city itself, turned Gotham back into the glittering metropolis to which so many flock today. And the steady erosion of rent controls helped fuel a gentrification boom, which ended in a building boom that included fancy apartment towers for all the rich foreigners who felt safe having their families and investments here.

Later still, at Diana Trilling’s dinner table, I committed yet another of my irrepressible faux pas. Turning to Christopher Lehmann-Haupt, then the august daily book reviewer of the then-august , I asked, in all seriousness, “Don’t you think the whole effort of modernism—in architecture, in literature, in music, in painting—might have been a huge dead end, from which Western culture will painfully have to extricate itself?” Shocked silence again, though all these decades later, the question still seems inexhaustibly interesting to me. But again, conversation resumed as if I hadn’t spoken and wasn’t there. As soon enough I wasn’t, for the invitations stopped.

Thus I learned the truth of Mill’s argument that social stigma can be as powerful as law in silencing heterodox opinion, except for people rich enough to be “independent of the good will of other people.” Everyone else who utters “opinions which are under the ban of society . . . might as well be imprisoned as excluded from the means of earning their bread.” No more academic career for me (fortunately, it turned out).

What prompts me to tell such slight tales is that they mark an early stage of a trend that increasingly threatens American freedom—the closing of the universities to the free and critical examination of ideas. As I didn’t know then, universities have been centers of real inquiry only for brief brilliant moments in centuries of scholastic murk. While eighteenth-century Glasgow and Edinburgh were beacons of the European Enlightenment, for instance, the Oxford and Cambridge of that time had countless dry-as-dust pedants or hard-drinking timeservers against one Isaac Newton. And there were no more ardent Nazi supporters than German university faculties, intellectual dynamos in the nineteenth century, once they became . So the close-mindedness of today’s universities is nothing new.

But it is especially troubling, because it’s not just the elites who go to college in America any more, by contrast with eighteenth-century Oxford’s mostly highborn students, who could ensconce themselves in their own hard-drinking and gambling clubs and snobbishly ignore their hard-drinking dons. In our own day, the remarks that Constitution signer William Livingston made about American colleges more than 250 years ago still hold true: the doctrines kids learn there “pass from the memory and understanding to the heart, and at length become a second nature.” When the students grow up, these doctrines shape the culture and the laws, “appear[ing] on the bench, at the bar, in the pulpit, and in the senate.” So the intellectual intolerance now so strong on the nation’s campuses, the hostility to Mill’s politically incorrect “opinions under the ban of society,” is pregnant with a threat to the freedom of thought, speech, and press that are the foundations of American liberty, if the students bring this intolerance into adulthood.

The examples are so numerous that they become a blur, so it’s worth enumerating a few specifics, starting with the days when junior science instructors couldn’t get tenure without endorsing the theory that an asteroid impact caused a sun-blocking dust cloud that triggered the extinction of the dinosaurs. Denial would undermine the then–politically correct theory that atomic warfare would start a “nuclear winter” fatal to earthly life, save perhaps some worms and microbes—so we had better ban the bomb. Another impermissible scientific hypothesis, raised by Harvard president Larry Summers—that biological differences between men and women might account for the paucity of top female math and science professors—cost him his job, for gender-theory orthodoxy outlawed such still-unsettled questions. The refusal of college students so much as to listen to speakers whose viewpoint they think they dislike has become notorious, ever since Brown seniors shouted down commencement speaker Ray Kelly, then New York’s police commissioner, in 2013, and graduating classes from Azusa on the Pacific to Brandeis on the Atlantic, with Smith and Rutgers in between, refused last year to listen to Charles Murray, Ayaan Hirsi Ali, Christine Lagarde, and Condoleezza Rice.

College speech codes, outlawing whole lexicons of politically incorrect words and encyclopedias of heretical ideas, have become infamous, and courts, when asked, have struck them down, only to see them replaced with “trigger warnings”—cautions that or might cause distress to black or Jewish students, for example, who might therefore not want to read them. Oberlin has supplied teachers with a trigger-warning guide, advising them to consider not assigning works that could spark upset because of their “racism, classism, sexism, heterosexism, cissexism, ableism, and other issues of privilege and oppression,” such as Chinua Achebe’s novel , which could “trigger readers who have experienced racism, colonialism, religious persecution, violence, suicide and more.” What more? one wonders—and at the University of California at Santa Barbara, the answer is rape. And now we have the campus microaggression hysteria, outrage over instances of supposed—and certainly unintended—racism, sexism, and the like too microscopic to be discerned by any but the most exquisitely sensitive moralist, with a hair-trigger sense of grievance. (See “The Microaggression Farce,” Autumn 2014.)

If it sounds as though we are back in the days when ladies fainted at the mention of the legs of pianos, which had to wear skirts for decency, and when one couldn’t utter words “that would bring a blush to the cheek of a young person,” as Dickens jeered, we are. The Columbia Law School Coalition of Concerned Students of Color claimed that its members were “falling apart” over the failure of grand juries to indict cops for the deaths of Michael Brown in Ferguson, Missouri, and Eric Garner in Staten Island—they were so “traumatized,” in fact, “by the devaluation of Black and Brown lives,” that they were now inhibited “from sleeping at night.” After having so long borne “the burden of educating the broader community about issues that have wreaked havoc on our psyches and lives . . . with unfailing grace,” they now needed to demand “that the community care for us too,” by postponing their exams. The equally sensitive dean readily acceded—though high-profile defense lawyer Benjamin Brafman sharply noted: “If law students cannot function with difficult issues like these, maybe they should not try to become lawyers.” But for pure, three-hankie schmaltz over the Ferguson and Staten Island events, the Columbia students are no match for Harvard College dean Rakesh Khurana, who wrote “with great humility” of his “hav[ing] watched and listened in awe of our students, faculty, and staff who have come together to declare with passion, grace, and growing resolve that ‘Black Lives Matter’ and to call for justice, for ally-ship, and for hope.” Transcending sentimentality and reaching the pure empyrean of incoherence, the good dean concludes: “The diversity of our student body at Harvard College should be on the forefront of this paradigm shift.” As to thinking clearly, arguing vigorously, and writing incisively, what is that, compared with feelings?

Illustration by Arnold Roth

There are three grave problems here. First, you can’t learn much if you are unwilling to listen to ideas that challenge your self-righteous orthodoxy, nor can you even understand exactly what your orthodoxy means until you have had to think hard enough about it to defend it vigorously. All that “critical thinking” that college students were supposed to have learned before they arrived on campus and refined once they got there seems to involve nothing more than indoctrination in contempt for the politically incorrect ideas of the supposedly unenlightened. True, the humanities departments, where the race, class, and gender orthodoxy is central to the subject, are losing enrollment, but science, engineering, and business students also marinate in the all-pervasive atmosphere of such ideas, shaping their political and social assumptions, which become badges of enlightenment and superiority.

Second, the constant social pressure of having to monitor everything you say, lest some unguarded politically incorrect utterance lose you friends, dates, status, or even employment makes for (pardon the fifties’ expression) boring conformists, apparatchiks afraid to think for themselves—quite the opposite of the sturdily independent, resourceful, thoughtful, plainspoken, and creative character that used to be the American ideal. Take the case of Smith College president Kathleen McCartney, who joined her students’ “shared fury,” she said, as “we raise our voices in protest” against the grand jury decisions in Ferguson and Staten Island. Trouble is, she raised her voice in the wrong slogan, declaring that “All lives matter,” when the approved chant was “Black lives matter.” How could she be so disgracefully discriminatory in her nondiscrimination? her scandalized undergraduates exploded. A modern college president may be the very definition of an apparatchik, but there is something humiliating to human nature in the cringingly self-abasing apology that McCartney fairly sobbed out, without even having to be carted off in a dunce cap to a reeducation camp, as if she were her own Maoist cultural-revolutionary commissar. What would it take to make characters like this pull the lever at Treblinka?

John Stuart Mill worried that in intellectual matters, “society has now fairly got the better of individuality.” He feared that “everyone now lives as under the eye of a hostile and dreaded censorship. . . . Thus the mind itself is bowed to the yoke: even in what people do for pleasure, conformity is the first thing thought of; they like in crowds; they exercise choice only among things commonly done; peculiarity of taste, eccentricity of conduct are shunned equally with crimes, until by dint of not following their own nature they have no nature to follow: their human capacities are withered and starved.” Such intellectual conformity, he argued, squashes “the qualities which are the distinctive endowments of a human being. The human faculties of perception, judgment, discriminative feeling, mental activity, and even moral preference are exercised only in making a choice.” In 1859, when the brilliantly irascible Thomas Carlyle and the bracingly judgmental John Ruskin were writing their great works, when the inimitable Charles Dickens was peopling the Victorian imagination with “Dickens characters” drawn from the eccentrics he saw all around him, this warning was, Lord Macaulay thought, like crying “Fire!” in Noah’s flood. But it was a prophetic warning. Look at President McCartney and ask yourself how many such bland, interchangeable items of mortality man the bureaucracies that now organize our society.

Third, this educational climate has already ushered in an era of what left-wing academic Herbert Marcuse praised as “liberating tolerance” in 1965. In his day, he claimed, America’s system of “civil rights and liberties” permitted “opposition and dissent”—as long as they didn’t lead to “violent subversion” of “established society,” in which the interests of the laborer, the consumer, and the intellectual must always yield to those of the boss, the producer, and the college administrator. It is, at bottom, a “repressive tolerance.” Real tolerance—“liberating tolerance,” wrote Marcuse (in Orwellian Newspeak)—“would mean intolerance against movements from the Right and toleration of movements from the Left. As to the scope of this tolerance and intolerance: . . . it would extend to the stage of action as well as of discussion and propaganda, of deed as well as of word.” (See “Illiberal Liberalism,” Spring 2001.) Well-funded Stasi-like groups such as UnKochMyCampus, reports the ’s Kimberley Strassel, are already at work, seeking “trusted informants” among faculty and students to target and harass the few remaining conservative professors whose thought runs counter to “progressive values” and might “undermine environmental protection, worker’s rights, health care expansion, and quality education.”

Sure enough, the self-righteous oppression that calls itself tolerance has moved out from the universities into the larger culture. On Election Day 2008, California voters passed Proposition 8, outlawing same-sex marriage. In the pure spirit of liberating tolerance, the campaign against the proposition included boycotts of the businesses of big donors to the pro–Proposition 8 effort, and the boycotts only expanded once the proposition had passed, helped by online interactive maps showing who donors were, where they lived, and where they worked. Some received death threats; others got envelopes of white powder in the mail (harmless, it proved, but scary). One, Mozilla CEO Brendan Eich, got forced out of his job, once two married gay Silicon Valley executives took their business away from his company in disgust, and activists laid siege to his board, which apologized for not firing him quickly enough. “Taking a public stand on Eich means painting a target on yourself,” one tech executive told a columnist. With his usual courage, blogger Andrew Sullivan summed up: “If this is the gay rights movement today—hounding our opponents with a fanaticism more like the religious right than anyone else—then count me out. If we are about intimidating the free speech of others, we are no better than the anti-gay bullies who came before us.” In 2010, a federal judge threw out the state’s gay-marriage ban.

Activists tarred Proposition 8 as antigay bigotry. It wasn’t. A person with no animus against homosexuals can reasonably believe that the only justification for the state to get involved in marriage—formerly a church concern—is that it has an interest in encouraging, by inheritance laws, the reproduction of society by the strong two-parent families that, mountains of research show, raise the happiest and most successful children. For someone who remembers the 1960s push to get the government out of the bedroom, the current urgency of homosexuals to drag the government back to bed seems bizarre. You don’t need Lois Lerner or Barack Obama to tell you that you really love your partner—and a limited-government conservative would insist that it is none of the government’s business. As for those who have a religious objection to homosexuality, it’s hard to see how the First Amendment’s protection of religious freedom can permit a Colorado court to order a devout baker to make wedding cakes for gay couples—or, given the First Amendment’s ban on a government establishment of religion, to order a nearby baker, in a case that is wending its way to court, to make a Bible-shaped cake with an antigay inscription that she believes to be loathsome bigotry. Indiana has just passed a law reinforcing the First Amendment rights of religious bakers, and the corporate Big Brothers are out in force to punish the state for it.

The larger point, of course, is that “liberating tolerance” can create a climate of opinion in which reasonable discussion of the merits of controversial topics such as gay marriage is impossible, and in which Congress will make laws that courts will wrongly say pass First Amendment muster, or in which unaccountable agencies of the administrative state will issue rules abhorrent to the First Amendment—a process that will begin with laws and regulations forbidding hate speech and end who-knows-where. If you believe in free speech, unfortunately you must sometimes hear sentiments that would bring a blush to the cheek of a young person. Since the civil rights and women’s rights revolutions have succeeded (even though the critical-race theorists and women’s studies professors don’t want to hear such job-killing news), there is no rationale whatever for hate-speech laws.

Finally, we should view with special alarm any attempt to muzzle political speech, such as outgoing Attorney General Eric Holder has been threatening. As Madison insisted, with all the earnestness of his character, “the right of freely examining public characters and public measures, and of free communication among the people thereon, . . . has ever been justly deemed, the only effectual guardian of every other right.” So recent rumblings from Democrats on the Federal Election Commission about regulating political commentary on the Internet are disgraceful (the more so because that body shouldn’t exist, since the Constitution allows only Congress to interfere with the states’ organizing of elections). And, of course, the IRS’s interference with Tea Party groups is flat-out tyranny, for all its bureaucratic banality.

Equally wrong are campaign-finance laws, which, happily, the Supreme Court’s Citizens United decision has begun to undo. In the American political system, based on man’s natural right to life, liberty, and property, money talks. The core of Madison’s worry about the “tyranny of the majority” in Federalist 10 was that the unpropertied many might vote themselves the property of the rich few—whether by disproportionate taxation, abolition of debts, inflation to erode savings and investments, “an equal division of property, or . . . any other improper or wicked project”—which the Founders believed would be no less a tyranny than an absolute monarch’s expropriation of property. Madison argued in Federalist 10 that the clash of many competing interests in such a big republic as America would prevent such democratic tyranny from occurring; but he proved wrong. Though the Constitution required direct taxes to be apportioned equally among the states, the Sixteenth Amendment, ratified in 1913, imposed a graduated—that is, unequal—income tax that the Socialist Labor Party had first called for in the 1880s. Once the original Constitution’s shield against wealth redistribution disappeared—and also in 1913, the Seventeenth Amendment substituted popular election of senators for their election by state legislators, whom the Framers had thought would choose protectors of wealth—the redistribution juggernaut inexorably gathered speed. (See “It’s Not Your Founding Fathers’ Republic Any More,” Summer 2014.) By 2010, the richest 40 percent of households were paying 106.2 percent of federal income taxes, with the top 5 percent paying 57 percent, while the bottom 40 percent of tax filers paid minus 9.1 percent, thanks to refundable low-income tax credits, food stamps, and Medicaid, along with Social Security and Medicare, which are also income-transfer programs, in which poorer recipients get back a much higher proportion of what they paid in than do richer households.

So for just over a century, American politics has been a contest between the Founding Fathers’ vision and the Socialist Labor Party’s, most recently expressed by President Obama’s false and supercilious taunt to entrepreneurs that “you didn’t build that.” The chief protection the propertied now have for the Founders’ belief in the inalienable right of Americans to own their own property, secure against the tyranny of the majority, is their ability to speak up to defend it and to support candidates for office who pledge to do so—with as much money as they like, the Founders thought. And this is a fundamental right, for on it turns the question of whether government gets to decide how much to take from each according to his ability and give to each according to his need, or whether individual citizens will make such decisions for themselves, which is what self-government means.

If speaking up in defense of liberty means giving offense, don’t be shy. Liberty is your most precious possession.

City Journal. The Founders at Home.
