Not Your Parents’ Open House


Just when you’d gotten the hang of LGBTQ, they go and triple the number of categories. Wesleyan University is now offering a “safe space” (formerly known as a “dorm”) for students of the LGBTTQQFAGPBDSM persuasions, or, for those who need things spelled out, for Lesbian, Gay, Bisexual, Transgender, Transsexual, Queer, Questioning, Flexual, Asexual, Genderfuck, Polyamorous, Bondage/Discipline, Dominance/Submission, Sadism/Masochism student acolytes. If you are so heteronormative as to see the word “FAG” in the center of that jumble, you will surely not be allowed into the “safe space,” known as Open House.

At this rate of exponential increase in student gender identities, there will soon not be enough paper in college bureaucrats’ offices to provide official recognition and “safety.” Parents concerned that their little darlings may come home with bruises and abrasions from the whips and leather handcuffs need not worry, though. This proliferation of in-your-face sexual identities is all posturing, just part of the dance between students desperate to find one last means of being transgressive and college bureaucrats eager to show their sensitivity and to justify their six-figure salaries. Students who should be studying European history and the roots of the novel—would that such subjects were still taught—are instead combing the farthest reaches of the American Psychiatric Association’s Diagnostic Manual for ways to distinguish themselves. By posing what they hope will be rejected demands on their administrations, they seek only to prove that they are living a life of oppression.

Despite the seemingly all-inclusive aspirations of the LGBTTQQFAGPBDSM acronym, the university recognizes that not every student will feel comfortable in this new “safe space.” To accommodate still further variations in student interest, Wesleyan’s Office of Residential Life offers a variety of unique living options. Farm House provides students “interested in the politics and culture of food production and sustainability a place to cultivate a mutualistic relationship with the earth that provides them with their lunch everyday.” Residents of Earth House can “espouse the values and principles of social ecology, deep ecology, and eco-feminism” while simultaneously “challenging traditional social structures and replacing them with new, creative and egalitarian alternatives.” African-American upperclassmen are welcome to apply to live in Malcolm X House, where they can dedicate themselves to “the exploration and celebration of the cultural heritage of the African Diaspora, both for themselves and for the larger Wesleyan community.” Turath House is for Arab, Middle Eastern, and Muslim students looking “to articulate their views and express and affirm their culture and religion without fear of harassment and discrimination.”

With so many marginalized groups on campus, one wonders who is left to do the discriminating and oppressing.

City Journal


The Earth Is Not a God


The Moral Case for Fossil Fuels, by Alex Epstein (Portfolio, 256 pp., $27.95)

The seventeenth-century philosopher Sir Francis Bacon argued that the human mind had been squandered on superstition: metaphysical speculation, theological disputation, and violent political delusions. Bacon’s greatest American disciple, Benjamin Franklin, agreed. It would be better, both believed, to focus on the conquest of man’s common enemy: nature. Bacon and Franklin were right, but they misjudged superstition’s staying power. Fast-forward to a conversation I had with the late Arne Naess, the Norwegian father of “deep ecology” and guru of the European Green movement. With a straight face, Naess told me that the eradication of smallpox was a technological crime against nature. For Naess’s deep ecology, the smallpox virus “deserved” and needed our protection, despite having maimed, tortured, and killed millions of people.

In his sprightly recent book, The Moral Case for Fossil Fuels, Alex Epstein takes on Naess’s American progeny—people such as Bill McKibben and David M. Graber—who have become influential opinion-makers on the environment, fossil fuels, and technology. Epstein asks us to imagine someone transported to the present from a virtually fossil fuels-free England in 1712, when the Newcomen steam engine was invented. What would that person think of our world, where 87 percent of all energy is produced from fossil fuels? In short, he’d be amazed to find clean drinking water, sanitation, enviable and improving air quality, long life, freedom from much disease, material prosperity, mobility, and leisure.

Epstein makes a compelling “big picture” case that the interaction of technology and fossil fuels provides everything we take for granted today. He also reminds us of earlier hysterical predictions of doom concerning fossil-fuel use. In the late 1960s and early 1970s, environmentalists such as Paul Ehrlich predicted mass starvation by the year 2000 because “world food production could not keep up with the galloping growth of population.” Flat wrong: the world’s population doubled, and the average person today is far better fed than when the starvation apocalypse was announced. That’s because the other apocalypse proclaimed back then—the depletion of oil and natural gas by 1992 and 1993, respectively—also proved wrong. Since 1980, worldwide usage of fossil fuels has increased massively, yet both oil and natural gas supplies have more than doubled, and we have enough coal to last 3,000 years.

Epstein explains what the environmental doomsayers could not or would not see: first, that “fossil fuel energy is the fuel of food”; and second, that the human mind is as powerful as Franklin and Bacon said it was. Humans discovered more fossil fuels, and technology used those fuels to industrialize food production. Moreover, fossil fuels enabled Norman Borlaug’s Green Revolution in food science, which, unlike the political movement of that name, actually did something to improve world nutrition and relieve the suffering of millions. Ehrlich was also wrong about fossil-fuel pollution in the developed world. In the U.S., though the use of fossil fuels has climbed steadily since 1970, emissions of pollutants have decreased dramatically—thanks to technology.

Predictions of starvation, depletion, and pollution didn’t pan out. What about global warming? Epstein’s warming discussion should be required reading. He acknowledges the greenhouse effect of carbon dioxide, which can be demonstrated in a laboratory. But the effect is not linear; if it were, every new molecule of carbon dioxide added to the atmosphere would add a unit of heat equivalent to the one preceding it. Rather, the greenhouse effect is decelerating and logarithmic, which means that every additional molecule of carbon dioxide has a smaller warming effect than the preceding one. Many theories of rapid global warming are based on speculative models of carbon dioxide interacting in positive feedback loops with increases in atmospheric water vapor. Most climate models are based on so-called “hindcasting,” coming up with explanatory schemes that predict what has happened in the past. There’s nothing inherently wrong with this, since the only alternative would be clairvoyance—but predicting the past with a computer model is not the same as accurately predicting the future.

Most climate models, says Epstein, have consistently and dramatically over-predicted mid-tropospheric global warming. We haven’t “burned up,” as McKibben predicted we would in 1989. Some suggest that the warming is occurring in the oceans; but mean sea levels around the world have been stable or declining for the last 100-plus years. Since the beginning of the industrial revolution, atmospheric carbon-dioxide levels have increased from .03 percent to .04 percent, and since 1850, temperatures have risen less than one degree Celsius (an increase that has happened in many earlier time periods). And for the past 15 years—a period of record emissions—there has been little to no warming.

The warming models may prove correct in the long term, of course, so Epstein asks a reasonable question: What if it becomes clear that, in the next 100 years, the seas will rise by two feet and the globe will warm by 2 degrees Celsius, as predicted by many climate scientists? The answer is simple, though often ignored by climate alarmists: we’ll adapt. Since the Industrial Revolution, and especially in the last 30 years, the human race has become progressively better at remediating the harmful effects of storms, heat, cold, floods, and so on. It’s irresponsible, says Epstein, to trivialize the power of technology to solve the problems generated by fossil fuels. Much of that technology could consist of fossil-powered techniques to capture and recycle or sequester carbon dioxide.

Epstein exposes the profound misanthropy motivating much contemporary environmentalism. He quotes Graber: “Human happiness, and certainly human fecundity, are not as important as a wild and healthy planet . . . human beings have become a plague upon ourselves and upon the Earth . . . and until such time as Homo Sapiens should decide to rejoin nature, some of us can only hope for the right virus to come along.” Alexis de Tocqueville noted that democratic peoples have a tendency toward pantheism in religion: given their passion for equality, they come to think that everything is God. To radical Greens like Naess, Graber, and McKibben, everything is God, with one exception: the human being, whose “impact” spoils the “independent and mysterious” divine.

Why do hysterical warnings about sustainability and depletion persist despite the failure of the crackpot 1960s and 1970s predictions? Because the non-impact standard—conceiving of the environment as a loving but finite God—sees the environment as having a limited “carrying capacity” of gifts, such as arable land, water, and crucial minerals, in addition to fossil fuels. The more people on the planet, the closer we are to maxing out that carrying capacity, the thinking goes. Thus the urgent call, made in 2010 by White House Office of Science and Technology director John P. Holdren, to “de-develop the United States.” This notion of a finite carrying capacity discounts the powerful role of human ingenuity in creating natural resources. But the deeper problem is rooted in the divinization of the planet as something that simply is what it is.

Epstein argues brilliantly that the carrying-capacity superstition amounts to a “backward understanding of resources.” The fact is that nature by itself gives us very few directly supplied energy resources: most resources “are not taken from nature, but created from nature,” he maintains. Every raw material in nature is but a “potential resource, with unlimited potential to be rendered valuable by the human mind.” Right now we have enough fossil fuels and nuclear power to last us thousands of years. “The amount of raw matter and energy on this planet,” Epstein writes, “is so incomprehensibly vast that it is nonsensical to speculate about running out of it. Telling us that there is only so much matter and energy to create resources from is like telling us that there is only so much galaxy to visit for the first time. True, but irrelevant.”

Bill McKibben says that the post-Ice Age Holocene period is the climate that humans can live in. Epstein responds that the Holocene is an abstraction that summarizes “an incredible variety of climates that individuals lived in. And in practice, we can live in pretty much any of them if we are industrialized and pretty much none of them if we aren’t.” Until the Industrial Revolution, the climate was dangerous for all human beings. Since then, we have marched steadily toward “climate mastery.” Fewer people die today from the weather than at any time in history. “We don’t take a safe climate and make it dangerous,” according to Epstein. “We take a dangerous climate and make it safe.”

The non-impact standard is a pervasive but irrational prejudice—irrational because it’s a neo-pagan faith that the earth is in effect an uncreated God, and a prejudice because it’s asserted dogmatically by those who profess it and taken for granted by a public unaware of being in its grip. The default position on environmental matters is “respect” for the planet. It tilts opinion to focus only on the harms of fossil fuels and technology, not their benefits. The bottom line is always the same: humans should minimize their impact on nature.

Alex Epstein’s book is a breath of fresh air in this polluted opinion climate. The Moral Case for Fossil Fuels shows why fossil fuels are good for human flourishing in general and good for the world’s poor in particular. Epstein is a true friend of the earth—an earth inhabited and made better by human beings.


When Transparency Isn’t Transparent


Last April, the Centers for Medicare and Medicaid Services (CMS) released records detailing the amounts physicians were paid and the procedures they performed in 2012. CMS has now announced that it will begin publishing these records annually, in what advocates see as a victory for transparency. But the new policy will accomplish little beyond confusing patients and embarrassing physicians.

The problem is that patients cannot intelligently interpret the CMS data. If the records show, for example, that a doctor has received what seem like high payments for a particular procedure—and for performing that procedure an unusual number of times—is the doctor an expert, or a crook? Health researchers have long maintained that high-volume providers have better outcomes. Perhaps the doctor is especially proficient, and her expertise attracts large numbers of patients who need the procedure. The CMS records won’t help patients assess the quality of the services provided or compare one doctor with another. A patient could just as easily believe that the highly paid doctor is over-utilizing the procedure, performing unnecessary and possibly harmful procedures to boost revenue.

The payment records could also mislead patients because they don’t indicate whether a payment was made to a single provider or to multiple providers out of a single office, using one provider’s Unique Physician Identification Number for billing purposes. A pathologist in Minnesota collected $11 million from Medicare in 2012. It wasn’t fraud; he was chairman of the Mayo Clinic’s Department of Laboratory Medicine and Pathology, one of the busiest in the country, and the entire lab billed under his name and Medicare number.

Similarly, the records don’t indicate what portion of a payment represents reimbursements to physicians who make negligible profits from treating patients with expensive drugs or treatments. When a hematologist or oncologist administers chemotherapy, the drug’s cost to the doctor dwarfs what he can collect. The same is true for ophthalmologists who administer macular-degeneration treatments. Current Medicare rules allow physicians to collect the average sale price of the drug plus 6 percent. The actual margin, however, may be less, when overhead costs are factored in. Small practices that purchase drugs in small amounts at higher than “average” costs may actually lose money.

Disclosure advocates also claim that the records will help researchers study geographic variations in health-care expenditures and how to eliminate them. But the data are not risk-adjusted. Some doctors treat a greater number of sick patients than others. Unfortunately, the payment records provide no insight into the severity or complexity of diseases treated by doctors, so they cannot be useful for comparing costs and utilization among providers or different regions.

CMS’s release of the records will have at least one clear benefit: helping to identify fraud and abuse. Outliers in total billings or total procedures performed warrant further investigation. But these records are already available to law enforcement and regulatory authorities. In fact, many physicians with unusual billing patterns in the 2012 record release had already been disciplined by state medical boards and/or law enforcement. Yet, Medicare continued to pay physicians who had been sanctioned, lost their licenses, or had been convicted of fraud and theft and spent time in jail—a failure of the system’s fraud-detection procedures that won’t be solved by the public release of physician payment records.

One is left with the suspicion that the primary purpose of the record release is to shame doctors whom policymakers believe routinely exploit the current fee-for-service payment system for personal gain. True, some physicians are guilty as charged; as in any profession, some bad apples exist. But most physicians take their professional duties seriously. They make a good-faith effort to perform services in the best interests of their patients and not for personal gain. They don’t deserve to be pilloried based on misleading information. Until Medicare payment records can be made useful and readily intelligible, CMS should suspend their release. If CMS insists on going ahead, it should at least help people understand what they’re looking at.


Council of Crackpots

Bill de Blasio built his 2013 campaign for New York City mayor around unapologetically progressive policies: an end to the NYPD’s “stop-and-frisk” practices, loosened restrictions on welfare availability, universal prekindergarten, and various other initiatives designed to reduce the economic inequality that, he claimed, had made Gotham “a tale of two cities.” De Blasio’s victory has quickly given him a national profile as one of liberalism’s standard-bearers, and he appears to have broader ambitions. Realizing any such aspirations will depend, of course, on how he performs as mayor, but de Blasio came into office with an advantage that his recent predecessors lacked: overwhelming support in the city council, whose members, as one lawmaker put it, are like a “cult of true believers,” eager to follow their leader. In New York, the mayor makes most of the news, and it’s easy to ignore the city council. That would be a mistake. Not only did de Blasio himself come from its ranks, where he spent years building alliances; New York’s next mayor may well be a sitting council member today—and, if so, judging by the views of some of the council’s leading figures, the de Blasio years might be ironically remembered for how moderate they were.

Illustrations by Arnold Roth

New York mayors have traditionally allowed the political bosses of Brooklyn, Queens, and the Bronx to determine who would hold the speakership of the 51-member council. But even before taking office in January 2014, de Blasio had orchestrated an unprecedented campaign to secure the leadership post for his close political ally Melissa Mark-Viverito, of East Harlem. Drawing on his extensive political capital, de Blasio leaned on individual council members to back Mark-Viverito. The city’s powerful labor unions joined the push, and the mayor-elect won the support of Kings County Democratic boss Frank Seddio, with the promise of key committee chairmanships for Brooklyn council members. De Blasio thus ensured that he would have a reliable ally running the council.

Insiders had considered Mark-Viverito a long shot for the speakership. She grew up in privilege in Puerto Rico, where her ophthalmologist father, who owned his own dual-engine Cessna, founded a hospital that, after his death, sold for $165 million. She and her family own properties in San Juan and along the coast that provide them with a stream of rental income. Under the auspices of a program aimed at middle-income, first-time home buyers, Mark-Viverito purchased her current home, a three-family structure on East 111th Street, for $350,000, and paid off the mortgage after only ten years. The property is now estimated to be worth $1.2 million. Despite appearing at Zuccotti Park during the Occupy Wall Street sit-in and announcing on camera that she belongs to the “99 percent,” Mark-Viverito has the real-estate portfolio of a 1 percenter.

Curiously, Mark-Viverito, like de Blasio (who grew up as Warren Wilhelm, Jr.), changed her name in adulthood: Melissa Mark added her mother’s maiden name, Viverito, after graduating from Columbia University. It’s no stretch to infer that political considerations drove this decision, as they probably did de Blasio’s. The Teutonic resonances of “Wilhelm” would not likely attract votes or donors in the heavily Jewish or persistently Italian precincts of Brooklyn where de Blasio launched his career. Likewise, “Melissa Mark” lacks the tongue-tripping sequence of vowels that Mark-Viverito exploited for ethnic appeal when she moved to East Harlem from Greenwich Village to run against Philip Reed, her black predecessor, in 2003. She lost that race, but assumed Reed’s seat two years later, when term limits forced him out.

Even by the standards of New York City politics, Mark-Viverito stands on the left-wing fringe. During her first seven years on the council, she stood for, but did not recite, the Pledge of Allegiance. (Her spokesperson claimed that, having grown up in Puerto Rico, Mark-Viverito was “unfamiliar” with the one-sentence pledge.) In 2010, the future speaker circulated a petition calling for the release of Oscar López Rivera, who was convicted of seditious conspiracy for his leadership role in FALN, the Puerto Rican paramilitary group that bombed Fraunces Tavern in lower Manhattan, among other targets, in the 1970s. Mark-Viverito counts among her friends Evo Morales, the socialist president of Bolivia, whom she visited in 2008. She also participated in the protest movement to expel the U.S. Navy from Vieques in Puerto Rico, where she was arrested, along with other prominent progressives such as Robert F. Kennedy, Jr. and Al Sharpton. Mark-Viverito’s family owns a 12-acre estate abutting the former Roosevelt Roads Naval Station, from which the Naval Forces Southern Command oversaw the controversial bombing tests. Perhaps the council speaker enjoyed sunbathing in peace.

New York City Democratic politics are largely shaped by organized labor. Almost every liberal leader in New York City—including the mayor and city council speaker—owes some measure of allegiance to the radical-leftist Working Families Party, which grew out of an alliance between labor unions and activist groups, such as the scandal-plagued Association of Community Organizations for Reform Now (Acorn), which applied the union-organizing model to public housing complexes. WFP-backed candidates (like Mark-Viverito) are usually endorsed by the city’s most powerful unions: 1199 SEIU, which represents health-care and hospital workers; 32BJ, a union for property-service workers; the Transit Workers Union Local 100; and the United Federation of Teachers, which represents New York City’s public school teachers. (See “The Union That Devoured Education Reform,” Autumn 2014.) Mark-Viverito worked as an organizer for 1199, and the powerful union steered her to move to East Harlem to seek political office there.

The WFP runs pro-labor candidates with the goal of forcing mainstream Democrats and incumbents further left. Though it has national aspirations, the party thrives mainly in New York, where “fusion” voting rules allow candidates to seek office on multiple ballot lines. In 2010, a group of WFP-backed council members, led by Mark-Viverito and Brad Lander—who succeeded de Blasio in his Park Slope district after the future mayor became public advocate—formed the Progressive Caucus. Including Mark-Viverito as speaker, the caucus, with 18 seats, now makes up more than one-third of the 51-member council and dominates the council politically, holding all leadership posts and most key committee chairmanships. Its members are unremittingly left-wing.

Consider Margaret Chin, a two-term council member representing lower Manhattan. The first Chinese-American to represent Chinatown, Chin got her political start in the 1970s as a founding member of Asian-Americans for Equality, a front group for the Communist Workers Party. As a council member, Chin lifted the protective-landmark designation on an early-nineteenth-century wood-framed building on the Bowery, one of the few such historic structures remaining in Manhattan. The 1817 building, owned by a Chinese-American bank that contributed generously to Chin’s campaign, was quickly torn down to allow for the development of a generic, eight-story brick-and-glass structure. Chin defended the erasure of a remnant of New York’s Federal-era past, saying that the new building would offer below-market “affordable office space.” “Affordability” in reference to housing is a legal term pertaining to residents’ income. There is no equivalent term for commercial real estate, so Chin concocted the phrase to give cover to her donors.

Chin is a master of infelicity, as she demonstrated this past summer, when the city council voted 43–3 to pass a bill that will make municipal identification cards available to all city residents, regardless of immigration status. The law’s goal is to provide illegal or “undocumented” aliens with a form of legal ID for the purpose of entering city buildings or signing leases, with a view toward regularizing their presence in New York. The law’s many city council proponents worry, however, that citizens and legal residents have no compelling reason to apply for a municipal ID. If the only people who do sign up are illegal aliens, it will become a stigma to have one, and the normalization project will fail. So advocates have been busy coming up with reasons why everyone should carry a card. Council Member Carlos Menchaca, a cosponsor of the bill, argued that the ID card could acquire a “cool factor”; to that end, the administration strong-armed 28 cultural institutions that receive city funding into offering admission or membership discounts to card bearers for the first year of the program, which starts in January.

Advocates insist that the card will come in handy for New Yorkers who lack any legal form of photo identification—the same argument that Democrats have made nationwide in opposition to voter ID laws. Chin took these claims to another level, inexplicably declaring that “there are a lot of people with green cards that don’t have ID.” As an immigrant from Hong Kong, Chin likely possessed a resident alien’s green card herself at some point. Surely, she understands that the card is precisely a federally issued form of identification. City residents, of course, can readily obtain nondriver’s photo-ID cards for $5 from the Department of Motor Vehicles, but Chin complains that to obtain those, “we have to have so many forms and documents. And where do you go to get a nondriver’s ID? It is very difficult to find. I had a very hard time finding a motor vehicle place to do it.” As it happens, Chin’s district is home to a DMV office, about a ten-minute walk from City Hall.

The municipal ID bill’s other cosponsor, Council Member Daniel Dromm of Queens—a close Mark-Viverito ally—has also backed a bill to let noncitizen legal residents of New York vote in municipal elections. The bill contains provisions ensuring that no visible distinction be made between citizens and noncitizens at polling sites, lest noncitizens feel stigmatized. (Since noncitizens still wouldn’t be able to vote in state or federal elections, they would need a different ballot. The logical way to manage that process would be to have two separate voter lines, but that would make a visible distinction, precisely what Dromm’s bill would prohibit.) Proponents of noncitizen voting point to the experience of such enlightened jurisdictions as Barnesville, Maryland (population 176), as models for New York. Barnesville has had no trouble accommodating the votes of its half-dozen or so resident noncitizens, the advocates say—so New York, with its thousands of polling sites, multiple ballot lines, and overlapping municipal, state, and federal elections, should be able to handle the minor technical adjustments that true democracy demands.

As chair of the council’s education committee, Dromm has emerged as the progressive attack dog on education policy, excoriating the mayor’s political enemies as anti-child and racist. In his introductory comments before an eight-hour hearing on the city’s charter schools—where at least 90 percent of students are black or Latino—Dromm likened the policy of school choice to apartheid. Dromm also compared one charter school’s practice of disciplining unruly students by putting them in a time-out room for ten minutes with the solitary confinement of inmates on Rikers Island, the city jail.

Public Advocate and former council member Letitia James is the council’s presiding officer and constitutionally the successor to the mayor—a disturbing prospect, for those who know what a loose cannon she can be. At Dromm’s hearing on charter schools, James invoked the 1954 Brown v. Board of Education desegregation case and referred to charter schools as a form of neo-segregation. In 2009, she expressed outrage about a Tuesday night riot in which three teens were shot outside a restaurant offering 50-cent chicken wings. Rather than condemn the violence, James instead chided the restaurant’s owners for “irresponsible” management in holding their promotion on the eve of a school holiday, when the teenagers would be more likely to go out for the evening. “I want this Tuesday restaurant promotion stopped, or the lease of this business revoked,” she demanded. A 2014 report found that Public Advocate James “is often absent from the office for hours to attend personal appointments and political events,” though she is never shy about touting her purported accomplishments. James started her career as public advocate, after all, by claiming to have championed the plight of 11-year-old Dasani Coates, a homeless girl living in a city shelter and profiled in a poignant New York Times feature; she had helped bring the girl to the paper’s attention, she claimed. When her boasts proved untrue, James backtracked—but she still showed up for de Blasio’s inauguration with the girl as her prop, referring to her as “my new BFF.” Even liberal supporters found it exploitative.

Council Member Jumaane Williams of Brooklyn was the prime mover behind bills to end the NYPD’s stop-and-frisk practices, which enabled cops to stop pedestrians engaged in suspicious behavior, with an eye toward discouraging them from keeping guns on their persons. The policy helped reduce shooting deaths in New York. (See “Courts v. Cops,” Winter 2013.) Now that officers have reduced their street stops, potential criminals could calculate that the risk of being caught with an illegal gun has diminished. Williams is also known for his opposition to the Broken Windows theory of policing, which seeks to forestall serious crime by cracking down on “minor” violations such as subway turnstile-jumping or graffiti. (See “Why We Need Broken Windows Policing,” page 10.) In an effort to reduce the “significant stress” that getting arrested can cause, Williams sponsored a resolution calling on police to “stop arresting people for committing minor infractions in the transit system.” According to the text of the resolution, being arrested for littering, gambling, or urinating in the subway can be “very disruptive” to the arrestee. Moreover, being arrested is “overly punitive and unfair” and can even cause “financial hardship.” Clearly, Williams’s policy priorities don’t include protecting subway riders from disorderly or violent behavior.

The only reason that Council Member Inez Barron isn’t a Progressive Caucus member is that the group’s politics aren’t radical enough. In January 2014, Barron, a former state assembly member, assumed the seat formerly held by her husband, Charles Barron; he, in turn, will succeed her in Albany this year. A former Black Panther, Charles Barron was known during his time on the council for never standing during the Pledge of Allegiance, for inviting Zimbabwean dictator Robert Mugabe to City Hall, and for saying that he wanted to go up to the next white person and “slap him, just for my mental health.” Inez Barron has introduced little legislation so far, but given her other talents, perhaps she doesn’t need to. In February 2014, to mark the anniversary of Trayvon Martin’s death, she asked her colleagues to snap their fingers while she declaimed an ode:

We’re an African people, we’re related you and I,
African by heritage beneath God’s sky
We have a common bloodline, we’re the first to walk the earth:
Black, red, tan and gold, we’re the first of human birth.
We built the pyramids and yes we made the Sphinx;
We sailed the ocean wide and with Mexico we linked.
We used the Nile River to cultivate the land
And then we used technology to pull crops from sand.
The Greeks came to African universities
We taught them how to diagnose and do brain surgery
We taught them math, geometry and then we taught them trig,
Physics and astronomy, oh yes oh yes we did!
. . .
This is just a tidbit of ancient Africa:
She civilized the whole world, we owe it all to her.
We’re an African people, we’re an African people, etc., etc.

The assembled council members smiled and applauded this bizarre performance. The press tweeted about it as though it were an amusing end-of-school prank. No one asked what the reaction would have been if a white council member had recited, say, an Ode to Europe celebrating how the white “bloodline” had “civilized the whole world.”

Few council members have much experience of professional life outside politics. Many worked as staffers to council members before becoming council members themselves. One, Progressive Caucus member Ritchie Torres of the Bronx, said in an interview: “Do I look like a politician to you? I’m a 25-year-old college dropout who grew up in public housing. I’m gay. I’m Afro-Latino. I hardly have the characteristics people associate with a politician, but here I am.” Council Member Torres had previously served on the staff of a council member from an adjoining district. He was essentially selected by the local machine to fill a vacancy. So when he asks, “Do I look like a politician to you?” the answer, more or less, is yes.

De Blasio and his left-wing allies have worked with impressive unity to achieve their goals. For example, de Blasio promised during his campaign to expand paid sick leave in New York City. During the new mayor’s first month in office, Mark-Viverito introduced an amendment to the city’s existing law requiring virtually all employers to provide five days of paid sick leave for workers. The amendment passed overwhelmingly, with few members concerned about the possible effect on small businesses. San Francisco mandated paid sick leave in 2007, and progressive advocates, citing research showing only slight declines in profitability, insist that the impact on businesses there has been minimal. A study by the Employment Policies Institute, however, reveals that most of these surveys include businesses that had already offered paid sick leave to their employees before the introduction of the new regulations. Newly covered businesses wound up laying off workers, raising prices, and cutting back employees’ hours.

Affordable housing is a progressive obsession in New York City, where paying half of one’s income to share an apartment half the size of a racquetball court is considered reasonable. An expanding swath of Manhattan and Brooklyn is dominated by the construction of market-rate or “luxury” housing. Indeed, when the annual going rate for the rental of a one-bedroom apartment is itself higher than the area median income (AMI), it’s hard to say exactly where “luxury” begins. Many New Yorkers are priced out of the neighborhoods in which they want to live.

Mayor de Blasio came to office promising to “build and preserve” 200,000 units of affordable housing over ten years, with an emphasis on providing units to those making less than 50 percent of the AMI. The current system allows developers to build extra floor area in exchange for building or preserving affordable units, either within existing developments or elsewhere in the neighborhood. De Blasio’s proposed Mandatory Inclusionary Zoning program would require developers to make a significant percentage of their new units permanently affordable—that is, restricted to particular income levels, as determined by a percentage of the AMI—without receiving anything in return from the city. Market-rate renters will effectively subsidize those paying “permanently affordable” rents. This coercion may work when developers build in desirable areas, such as lower Manhattan or north-central Brooklyn, but in other parts of the city, where low-income housing is most in demand, no incentive exists to build when 20 percent of the units must be rented below market. Subsidized housing requires that someone do the subsidizing.

A curious wrinkle in de Blasio’s plan arose last summer when it turned out that a new development on Manhattan’s Upper West Side had, with city council permission, built separate entrances for its market-rate and subsidized tenants, effectively segregating the owners of luxury condominium units from the renters of the affordable units. The “poor door” issue intersected with the “two cities” narrative and sparked outrage from a variety of elected officials, many of whom, including then–council member de Blasio, had voted in 2009 for the package of zoning changes that included the “poor door” provision. Public Advocate Letitia James denounced “this kind of segregation which we as a society abhor.” Jumaane Williams, chair of the council’s Housing Committee, circulated a petition decrying the “terrible practice” whereby “taxpayer money is being used to subsidize segregationist housing policies.”

The scandal goes beyond the issue of separate entrances. Luxury buildings in New York City are increasingly equipped with amenities such as rooftop decks, gyms, and pools. Developers attempting to maximize the value of units that sell for upward of $2,000 per square foot have, in some cases, limited the use of these amenities to market-rate occupants. Progressive Caucus council members Mark Levine and Corey Johnson, both of Manhattan, want to introduce legislation forbidding any such separation of tenants or limitations on their use of building amenities. Other proposals insist that affordable units must have access to the same amenities as the market-rate units.

Should such measures be adopted, the effect will likely be the placement of affordable units in separate, stand-alone buildings with few or no amenities—or, worse, developers could opt not to build affordable housing. Presumably, many space-starved New Yorkers would be willing to bear the shame of using a “poor door,” if the upside were paying a submarket rent for an apartment in a desirable—and otherwise unaffordable—location. The poor-door brouhaha illustrates again the misguided and counterproductive thinking of New York’s “affordable housing” advocates.

The Progressive Caucus has lots of ideas for future reforms. Council Member Lander, for instance, has introduced a bill outlawing credit checks as part of the hiring process. Credit checks, he claims, “slow economic and job growth during a time of high unemployment,” though businesses might be better placed to assess the economic effects. The bill has supermajority support, virtually ensuring its passage. The speaker and the mayor have issued statements on the pressing need to make New York City a safe haven for Central American migrants. Mark-Viverito insists that the city should be prepared to take in and house thousands of illegal-immigrant children. In October, Mark-Viverito sponsored and the council passed a series of bills that will substantially erase any cooperation between the city and the federal Immigration and Customs Enforcement agency (ICE). The bills forbid ICE from maintaining an office on Rikers Island and prohibit the city from sharing information regarding illegal-alien criminals, unless they’ve recently been convicted of violent felonies. The Progressive Caucus supports a minimum wage of $15 per hour for employees of large retail chains, including fast-food workers. That would require approval from Albany. Raising the minimum wage is a high priority for organized labor, though, which can use it as a baseline to push for increases in its own workers’ contracts.

The quirky personalities and often zany ideas of New York’s progressive council members can make for amusing reading, but the job of governing a city of 8 million people is no joke. The city’s economy has rallied impressively from the Great Recession, but its high tax burden and cost of living make it hard for middle-class residents to get ahead. (See “The Cost of New York,” Summer 2014.) A looming budget crunch will force the city to make difficult political choices, and it remains to be seen whether two decades of public-safety gains can be preserved under de Blasio’s less assertive policing regime. In short, the city faces serious challenges and needs serious leaders to meet them. Based on the quality of its political bench, New York could be in real trouble soon.


Ranting About Rudy

Photo by PBS NewsHour

Former New York mayor Rudy Giuliani’s statement about President Obama’s lack of love for America has set off a firestorm of denunciation. Giuliani has been accused of racism, and he has even received death threats. Defenders of Obama have invoked everything from his maternal grandfather’s service in World War II to the two years Giuliani’s father served in Sing Sing to prove either that Obama loves America or that Giuliani has no standing to issue such criticism.

The ranting has obscured the reasons why so many Americans take Giuliani’s remarks to heart. Starting with his June 2009 speech in Cairo, when he apologized for American actions in the Middle East, Obama has consistently given credence to Islamic grievances against America while showing reluctance to confront Islamic terrorism. In 2009, after Major Nidal Hasan killed 13 American soldiers and wounded 40 others at Fort Hood while shouting “Allahu Akbar,” the administration labeled the killings workplace violence. In recent months, the pace of evasions has quickened. Obama was the only major Western leader absent from the massive Paris march held in the wake of the Charlie Hebdo killings. Worse yet, Obama referred to the killings in a Jewish supermarket in Paris as “random” acts of violence.

But this was only the beginning of a string of curious comments and loopy locutions made by the president or his spokespeople in the weeks that followed. While ISIS rampaged across the Middle East, the president told a Washington prayer breakfast that Christians shouldn’t get on their “high horse,” because they were guilty of the Crusades, among other crimes. Not only were the Crusades many centuries past, but they were also a complicated matter in which both sides behaved barbarically. But more important, Obama’s comments reinforced the standard Muslim propaganda about how the jihad is merely defensive. Shortly thereafter, ISIS murdered 21 Coptic Christians in Libya (a country in complete chaos, thanks to an Obama-led Western intervention). The White House’s response was to condemn the killing, not of Christians but rather of “Egyptian citizens,” another evasive locution. The casual listener need not have knowledge of the White House’s associations with the Muslim Brotherhood in Egypt or the administration’s hostility toward the anti-jihadist regime in Cairo to find Obama’s words and behavior peculiar.

As one gaffe followed another and as the president and his spokespeople engaged in semantic somersaults to avoid mentioning Islam in regard to terror, public unease mounted. It was exacerbated by a three-day conference on combating “violent extremism,” the White House euphemism for Islamic terrorism. Here again, the public likely didn’t know that some of the invited guests had histories of supporting jihad. The president’s statements gave them enough to worry about.

Bizarrely, Obama presented himself as an expert on legitimate Islam. “This is not true Islam,” Obama said, referring to the ISIS creed, assuming again his role as Defender of the Islamic Faith. “Al Qaeda and ISIL and groups like it . . . try to portray themselves as religious leaders, holy warriors in defense of Islam,” Obama said. “We must never accept the premise that they put forward, because it is a lie.” Operatives of al-Qaida and ISIS “are not religious leaders,” Obama insisted. “They’re terrorists.”

Listening to speakers at the conference—to which the FBI had not been invited—you would think that, if only American Muslims were treated better, ISIS would wither on the vine. “We in the administration and the government should give voice to the plight of Muslims living in this country and the discrimination that they face,” said Secretary of Homeland Security Jeh Johnson. “And so I personally have committed to speak out about the situation that very often people in the Muslim community in this country face.” The real issue, Johnson suggested, was not Islamic terrorism but Islamophobia. The casual listener might surmise that Johnson’s remarks had less to do with ISIS than with winning the 2016 Muslim vote in key states such as Michigan and Ohio.

At the same conference, Obama announced, or more accurately pronounced, that Islam “has been woven into the fabric of our country since its founding.” As one friend asked me: “What is he talking about?” America’s earliest encounter with Islam came in the form of seventeenth-century colonists purchasing slaves from Muslim slave traders in Africa. The next came when President Thomas Jefferson was forced to fight the Barbary Pirates. Surely, these weren’t the examples Obama had in mind.

In the midst of these attempts to achieve what a magician would call misdirection, State Department spokeswoman Marie Harf explained that, if there were only more jobs in the Middle East, ISIS would have a harder time recruiting. But then again, if there were less Islamic extremism, more jobs might be created. Harf referred to the “root causes” of terrorism much as liberal Democrats have long referred to the “root causes” of poverty, with about the same degree of insight.

With all the atrocities that the ISIS fanatics have committed, Obama’s anger has been more often directed not at them, but at Israeli Prime Minister Benjamin Netanyahu. The animosity between the two men has never been a secret, but now, with Obama’s term in office waning, he is determined to cut a deal on the Iranian nuclear program on terms much to Tehran’s liking. Netanyahu is an obstacle to that goal.

All of this, then, was a backdrop to Giuliani’s remarks, in which he called out Obama for the president’s many rhetorical bluffs. If the former mayor’s words have created a firestorm, it’s because for many, they have helped make sense of Barack Obama’s words and actions. Attacking Giuliani, and trying to delegitimize him with the racist label, will do little to allay public anxieties about an administration short on competence but long on ideological evasion—and blessed with media allies.

Fred Siegel is the author of Prince of the City: Rudy Giuliani and the Genius of American Life and The Revolt Against the Masses.


Hip-Hop Hamilton

Photo by Hamilton at the Public

The new off-Broadway musical Hamilton is pure genius. Don’t take my word for it: the show about Alexander Hamilton—George Washington’s aide-de-camp during the American Revolution, primary author of the Federalist Papers, and the nation’s first Treasury secretary—is becoming a cultural phenomenon. The show is still in previews and doesn’t open until February 17, but it has already extended its run three times and will play through May 3 at New York’s Public Theater. Producers are reportedly “dueling” over who gets to bring the show to Broadway, and tickets are going for $650 on StubHub. Andrew Lloyd Webber tweeted that Hamilton “raises & changes the bar for musicals.” The New Yorker weighed in with an 8,000-word piece on the show and its creator, Lin-Manuel Miranda.

Miranda, who plays Hamilton in the show, wrote In the Heights, which won the 2008 Tony for best Broadway musical. He got the idea for a musical about Hamilton several years ago, when he read Ron Chernow’s 2004 biography. Chernow, who served as a consultant on the show and is cited near the top of the credits, should be proud. Miranda tells Hamilton’s story in an intelligent and historically accurate way, while also making the show enormously entertaining. It takes some doing to set Washington’s Farewell Address (which Hamilton helped write) to music and make it work.

New York–born, of Puerto Rican descent, Miranda sees Hamilton—an orphaned immigrant striver battling to prove himself—as a quintessential New Yorker. He went to university, joined the army, and practiced law in New York. His last home stands in Harlem; his grave lies at the foot of Wall Street, across from the Bank of New York, which he founded. The newspaper he started, the New York Post, still makes waves.

As Miranda explains, Hamilton was also contemporary, in his way. His tumultuous life, from his love of the ladies (Martha Washington once named an amorous cat “Hamilton”) to his verbal and nonverbal duels, reminds Miranda of “the stories of Tupac and Biggie.” The show’s first song, “Alexander Hamilton” (Miranda performed an early version at the White House in 2009), summarizes Hamilton’s chaotic childhood, which Miranda says “embodies hip-hop.”

In Miranda’s hands, Hamilton’s story is a crowd pleaser. The audience howls with laughter each time the foppish King George comes on stage, crooning to his loyal subjects. After spending most of the American Revolution in France, Thomas Jefferson struts onto the scene at the start of Act 2, singing “What’d I Miss?” At several pivotal moments in American history, Washington emcees a Hamilton vs. Jefferson rap duel, always coming down on Hamilton’s side.

Most of the cast is nonwhite. Black actors portray Washington, Madison, Jefferson, and Burr. As the director’s notes state, the cast is “as diverse as the city we love, is claiming the American Revolution as their revolution, and the ideals of freedom that founded this country as their own, still-unfinished dreams.”

Aaron Burr plays a major role in the show, as does Hamilton’s wife, Eliza. Both help make Hamilton’s saga a human one. Miranda spotlights how often Hamilton and Burr’s lives intersected before that fateful morning in Weehawken. We see Eliza’s pain when her husband wrote a public pamphlet detailing his affair with another woman—to prove that the hush money that he offered was not from public coffers—and how devastated she and Alexander were when their eldest son died in a duel defending his father’s honor.

Hamilton the musical is largely focused on the story of Hamilton the man. My only wish? I would have liked more about Hamilton’s legacy. If not for Hamilton charting its early course, American history might have been quite different. As Chernow writes, Hamilton was “the messenger from a future that we now inhabit. We have left behind the rosy agrarian rhetoric and slaveholding reality of Jeffersonian democracy and reside in the bustling world of trade, industry, stock markets, and banks that Hamilton envisioned.” Today, groups across the political spectrum claim Hamilton as their own. Tea Partiers like him because he defended the Constitution in the Federalist Papers. Wall Street conservatives like him because he is the architect of American capitalism. Liberals like him because he believed in a strong federal government. The Manhattan Institute holds an annual Hamilton Award Dinner, and the Brookings Institution has a Hamilton Project. And some just like the “hunky” way he looks on the redesigned ten-dollar bill.

There’s much to like about Hamilton. He understood that effective yet limited government, along with a well-functioning financial system, would nurture the entrepreneurialism that is America’s genius. He recognized the importance of innovation, and helped establish Paterson, New Jersey, as a model manufacturing town (with private funds). While many Founders were slaveholders, Hamilton was a staunch abolitionist. An immigrant himself, he understood the economic dynamism that immigration can foster. Education transformed Hamilton’s life, and he helped to found Hamilton College in upstate New York.

Hamilton runs three hours, with an intermission; the producers are looking to shorten it, someone involved with the show told me. They shouldn’t cut a word. With any luck, the show will come to Broadway soon, and every high school student in New York will get a chance to see it. What Lin-Manuel Miranda has created could introduce a new generation of young people to the Founders. Especially the coolest one.


You Work for Me, Mac!

Photo by Aaron van Dorn

Never was there so perfect an emblem of public employees’ public-be-damned attitude as the outrage of New York’s Transport Workers Union over the arrest of veteran bus driver Francisco DeJesus for running down a 15-year-old girl legally crossing the street in a crosswalk. The seriously injured girl, who had the “walk” sign in her favor, was on her way to school. Bus drivers, the union whined, are being treated like “criminals.” Henceforth, the union demanded, cops must exempt its members from arrest for failure to yield to pedestrians in crosswalks. After all, why should they be treated like other motorists under Mayor Bill de Blasio’s Vision Zero program to reduce pedestrian traffic fatalities and injuries? How can anyone expect them to be “perfect”?

You would think that running down a pedestrian legally crossing the street with a ten- to 20-ton vehicle is prima facie evidence not of “imperfection” but of negligence and incompetence. It’s not perfection for someone who drives such a vehicle professionally to leave pedestrians uninjured. It is a basic requirement of the job, the equivalent of the physician’s injunction to “First, do no harm.” The public does not hire its bus drivers to leave a certain proportion of citizens unmangled but rather to leave everybody in one piece. If this bus driver can’t manage not to ruin the life of an innocent teenager, he should lose his job.

During the Great Depression, Mayor Fiorello LaGuardia famously walked into a city welfare office to make an unannounced inspection and found one employee sprawled back in his chair, his feet on his desk, his hat on his head, and a sandwich in his hand. The mayor made an inquiry, and the welfare worker growled through his well-stuffed mouth that he was eating his lunch. LaGuardia strode over to him, knocked his hat off his startled head, and barked, “Stand up and take your hat off when you talk to a citizen!” To the reporter following him, he remarked, “There’s another son of a bitch who has no job.”

That always struck me as the perfect model of public-employee management. The city worker is the servant of the public, hired and paid by the public to do the job it appoints him to do. Citizens don’t work for government, but vice versa—and in an age when public employees have better retirement packages than many of the taxpayers whose money funds them, that should be truer than ever. But in a world in which public employees have not only civil-service protection but also union protection—which even Franklin D. Roosevelt dismissed as a ridiculous idea—the opposite is ever more often the case.

Noticing that crosstown traffic constantly gets slowed by buses stuck in Fifth Avenue crosswalks because they have tried to beat the light when the intersection was already backed up, I once suggested to Mayor Rudolph Giuliani that he could have the police ticket the drivers of those buses. Just a few tickets would do the job, I thought, and the action would serve a larger purpose than speeding up crosstown traffic, for it would dramatically demonstrate that public employees are not above the law, but that they must observe it as scrupulously as anyone else—more scrupulously, in fact, so as to set an example. An unticketed MTA bus stuck in an intersection through driver arrogance is an especially egregious example (because a city official is at fault) of the kind of neglected disorder that Broken Windows policing aims to curb, so as to encourage citizens to obey the law in matters great and small. But even Giuliani, for all his immense courage, never set one public union against another. So kudos to de Blasio for showing some real guts. Perhaps he might even have the police ticket garbage trucks as they race down cars-only West End Avenue, radios blaring, on their way to the West Side garbage terminal.

By such small steps we might approach the larger question of why private-sector retirees must sell the homes they’ve lived in all their lives, because they can no longer afford the property taxes that go to fund public-sector retirees’ pensions and health-care benefits. And voters might start to hold accountable politicians who sign IOUs that come due long after the pols in question have left office for their own gilt-edged retirements.

Myron Magnet is City Journal’s editor-at-large and the author of The Founders at Home.


How Smart Is 50 Shades of Grey?

Photo by Todd Mecklem

In her standup act, comedian Whitney Cummings scoffs at the claim that men like strong women. “Sorry, I’ve watched porn,” she says. “Men like Asian schoolgirls with duct tape on their mouths.” In that vein, consider the popular idea that women want sensitive men who do the laundry without being told. Sorry, I’ve read—and now watched—Fifty Shades of Grey. Women like men who tie them up and flog them in a Red Room of Pain. With duct tape on their mouths.

I’m only half-kidding. The film’s reviews, like the reviews of E.L. James’s 2011 book, are full of well-deserved snark about its inane dialogue, flat characters, and contrived plot. But the story’s wild popularity suggests that James knows something most of us don’t about the mix of lust, romantic longing, and post-feminist morality that swirls inside the brains of young women today.

It’s a remarkable coincidence that this particular pornographic fantasy has seized the global female imagination at the same moment that rape and sexual violence against women has become a leading social justice cause. The coincidence is heightened by the fact that the story’s protagonist, Anastasia Steele, is a coed on the cusp of graduation. She is in many respects an ordinary, modern college girl. She’s independent, a little boozy, cash-strapped, and working her way through school in a hardware store. She drives a battered Volkswagen Beetle. Yes, she is a virgin. But that’s not because she’s a prude—“Holy crap, no!” as the feisty heroine would put it. She just hasn’t found a guy who pushes her buttons. That is, until she sacrifices her virginity and good judgment to the highly practiced sexual power of the brooding and distinctly un-politically-correct billionaire Christian Grey, a man of “singular tastes.”

Ana’s passion for the dominant Mr. Grey has created an uncomfortable dilemma in enlightened circles. The strict line between sex and aggression, and the injustice of unequal gender relations in the boardroom and co-ed dorm, are crucial tenets of modern-day feminism. Fifty Shades ignores them all. In fact, the book and movie’s harshest critics see an attempt to “romanticize violence against women” and to glamorize the rapist. Defenders insist that by depicting (and no doubt arousing) intense female sexual pleasure, the book “empowers” women. Pornography has always been male-centric; now, they say, women have a porn of their own.

More than anything, Fifty Shades represents the mainstreaming and feminization of S&M pornography. Once confined to the shadows of the art-movie house, sadomasochism is having its moment in the bright light of the mall. Both critics and fans of Fifty Shades miss the essential point about pornography: that it speaks to primitive, pre-rational, taboo desires. Its lure is precisely the refusal to bow to social limits. It doesn’t matter who sets those limits: fathers, priests, or gender studies professors can all have the sort of authority that the unconscious is determined to flout. Nor will gender progress stop the rebellious id. Even a Hillary Clinton presidency won’t rid the nation of libidinous fantasies about dangerous Alpha Males wielding duct tape.

Still, James is clever enough to know that this taboo fantasy is hardly the final word on women’s desires. I’m far from the only one to observe that beneath the outré distractions of the Red Room lies the most conventional of love stories: the emotionally stunted man who finally cannot resist the love of a good woman. In a familiar vein, Anastasia—the Disney-sounding name is no accident—wants “more” (i.e., emotional connection, communication, love, and all that stereotypical female stuff) from this distant man who not only doesn’t commit, but “doesn’t do relationships” altogether. Her triumph over his psychological demons is as far from the multi-partner degradations of the 1954 S&M “classic,” Story of O, as a children’s game of hide and seek is from combat. But it’s clearly a major element in the book’s success.

James modernizes the seemingly outmoded fantasy of her tale through an elaborate performance of consent. Grey presents Anastasia with a contract dense with “fundamental terms” and appendices, stipulating rules of location, time, and meticulously described limits. He even insists that she sign off on various bondage accessories. The contract negotiations give Anastasia a degree of power even as she signs it away.

But if James takes care to make the sex between Christian and Ana so consensual it could pass muster at University of California campus tribunals, she perhaps unwittingly points to the limitations of such consent. Though James wrote her novel before the current spate of questionable campus rape accusations, she all but predicted them. Ana’s consent is shaped not by enthusiasm for Christian’s predilections, but by her desire not to lose him. Consent seems a misleading word to describe this state of mind.

The prevalence of pornography—and, now, of Fifty Shades itself—is bound to fuel this sort of youthful confusion. We can and should prize consent, but most 18-year-olds know little about their own motivations. Ply young men and women with images of extreme sexual adventure, barrels of liquor, and empty, unsupervised dorm rooms, and sexual assault is bound to remain in the headlines.

Kay S. Hymowitz is a contributing editor of City Journal and the author of Marriage and Caste in America: Separate and Unequal Families in a Post-Marital Age.


Schoolgirls feared to be joining IS

Police are urgently trying to trace three runaway schoolgirls amid fears they have fled to Syria to join Islamic State, as questions are raised about how they were able to leave the country so easily.

Shamima Begum, 15, Kadiza Sultana, 16, and an un-named 15-year-old, all from east London, are good friends with another 15-year-old girl who fled to Syria in December.

The three girls, who all attend Bethnal Green Academy, are described as “straight-A students”.

They flew to Istanbul, in Turkey, from Gatwick Airport on Tuesday without leaving any messages behind and their families are “devastated” by their disappearance, according to Commander Richard Walton, head of the Metropolitan Police’s counter terror command.

He said there was a “good chance” the girls were still in Turkey but the force has been “increasingly concerned” by a growing trend of young girls showing an interest or intent in joining IS, an organisation now notorious for its barbaric treatment of hostages and oppression of women.

The three girls left their homes before 8am on Tuesday, providing their families with “plausible” reasons as to why they would be out for the day.

They boarded a Turkish Airlines flight, TK1966, which departed at 12.40 for Istanbul, Turkey, and landed at 18.40 local time.

Turkish Airlines did not notify police that the girls were on board the flight.

The girls’ departure, unaccompanied by adults, to a country known to be “a staging post to Syria” should have raised suspicions, an expert has said.

“The fact that this is still happening shows that security needs to be stepped up,” Emily Dyer, a research fellow specialising in Islamism and terrorism at the Henry Jackson Society, told the Daily Mail.

Police have said they may have been able to intervene before the girls departed had they been notified by the airline.

Mr Walton said: “We are concerned about the numbers of girls and young women who have or are intending to travel to the part of Syria that is controlled by the terrorist group calling themselves Islamic State.

“It is an extremely dangerous place and we have seen reports of what life is like for them and how restricted their lives become.

“It is not uncommon for girls or women to be prevented from being allowed out of their houses or if allowed out, only when accompanied by a guardian.

“The choice of returning home from Syria is often taken away from those under the control of Islamic State, leaving their families in the UK devastated and with very few options to secure their safe return.

“If we are able to locate these girls whilst they are still in Turkey we have a good possibility of being able to bring them home to their families.”

Shamima is described as approximately 5ft 7in, wearing black thick-rimmed glasses, a black hijab, a light brown and black leopard-print scarf, a dark red jumper, black trousers and a jacket, and carrying a dark blue cylindrical holdall with white straps.

She is a British national of Bangladeshi heritage and speaks English with a London accent. She also speaks Bengali.

Kadiza is described as 5ft 6in and of slim build. She was wearing black rimmed glasses, a long black jacket with a hood, grey striped scarf, grey jumper, dark red trousers, carrying a black holdall.

She is also a British national of Bangladeshi heritage and speaks English with a London accent and also speaks Bengali.

The third missing girl, who is not being named, is described as 5ft 6in and of slim build, wearing black thick-rimmed glasses, a black head scarf, a long dark green jacket with a fur-lined hood, a light yellow long-sleeved top, black trousers and white trainers, and carrying a black Nike holdall. She speaks English.

Salman Farsi, spokesman for the East London Mosque, said: “They have been misled. I do not know what was promised to them. It is just sad. We have not had anything like this before in our community.

“I do not know what was told to them but if they do go to Syria, it is a war zone and there are serious ramifications for going in to a war zone. Some of the things we have seen happening in Syria are not very nice.

“We just want to see them brought back.

“I think the girls need to know they have done nothing wrong. They have been manipulated.”

Shadow home secretary Yvette Cooper said: “The idea of 15-year-old British schoolgirls setting off to Syria is very disturbing, and shows that more action is urgently needed to stop young people being drawn into extremism and conflict, and to help families and communities who are trying to counteract extremist recruitment messages.”


Fire fighters tackle house fire in Aberdeenshire


Fire fighters were called to a house fire at Kintore this evening.

Two appliances were sent to the property at School Road just before 10.30pm.

One hose reel jet and four breathing apparatuses were used to tackle the flames.

A spokeswoman for the fire service described the blaze as “small”.


Metal thefts down a third after Act

Metal thefts have fallen by one third in England and Wales following the introduction of legislation to crack down on dealers buying stolen materials as scrap, according to new figures.

Thefts of materials ranging from electricity cables to metal from railway lines, war memorials, road signs, children’s playground equipment and church roofs are costing the country as much as £770 million a year, said the Local Government Association (LGA).

But following the October 2013 introduction of the Scrap Metal Dealers Act, which requires dealers to hold a licence to trade and gave councils new powers to deal with rogue operations, numbers of metal thefts fell from almost 60,000 a year to 40,680 in 2013/14.

Some 8,000 licences have been issued since the Act came into force.

The first closure notice on a dealer trading without a licence was served last month in Milton Keynes, while an unlicensed dealer in Lichfield was ordered to pay costs and fines totalling £1,961 after pleading guilty to illegally trading.

The South East region saw the biggest fall in metal thefts (46%), followed by London (44%) and the North West (40%).

Ann Lucas, chairwoman of the LGA’s Safer and Stronger Communities Board, said: “Such a significant drop in metal thefts is excellent news for communities who have suffered from the chaos, disruption and heartache caused by unscrupulous metal thieves.”


Lotto’s 2,000th draw after 20 years

The UK’s 2,000th Lotto draw will take place tonight after a 20-year history that has seen more than 13,000 winners taking a share of the jackpot.

The milestone will see an estimated jackpot of £4.1 million up for grabs this evening, with figures suggesting around 70% of adults in the UK regularly play National Lottery games, including Lotto.

The largest individual Lotto winner since the game launched in 1994 is Iris Jeffrey, from Belfast, who won £20.1 million in July 2004.

But the largest Lotto jackpot ever paid out was £22.5 million to Paul Maddison and Mark Gardiner, from Hastings, in June 1995.

According to Camelot, the best location to buy a chance at the top prize is Dumfries, which has one lottery millionaire for every 8,288 adults.

A poll for Camelot found 99% of millionaires say they are as happy, or happier, than before their win, citing the key reasons as financial security, freedom of choice and having time to spend with their family.

The survey of 101 winners of at least one million pounds on the National Lottery last year found that a fifth (20%) hid their winning ticket in a bag, wallet or purse, 8% put it in their pocket, 6% hid it under their pillow, 5% opted for a drawer or cupboard and 2% secured it in their underwear.

The favourite purchase among winners was a new home (32%) followed by a car (27%) and the perfect holiday (8%).

Of those who were working at the time of their win, half (52%) handed in their notice, 7% set up their own business, 4% returned part time and 1% took up voluntary work.

Almost all winners (95%) gave money away to family, 87% to charity and 70% to friends.

A National Lottery spokeswoman said: “The National Lottery has created more than 3,700 millionaires and each week raises over £33 million for National Lottery projects.

“We’d like to thank all our players who through playing our games have made a life-changing difference to individuals and communities across the UK.”


Spring tides trigger floods warning

Strong winds and high tides are set to bring large waves to parts of the country this weekend as coastal communities brace themselves for floods.

Gusts of 50mph alongside higher than average spring tides could mean stormy weather, and travel disruption, across many of the UK’s seaside towns and villages, the Environment Agency said.

Alerts have been issued for the north west coast right through to Monday, while the north east is likely to see flooding in low-lying land today, it said.

The east coast, and parts of the south coast as well as Devon, Cornwall and the Bristol Channel may see some disruption from today through to Monday, it added.

There will be little let-up for those in the west of the country as strong winds tomorrow are expected to bring a continued risk of large waves into Monday.

Jonathan Day, flood risk manager at the Environment Agency, said: “We are monitoring the situation closely with the Met Office and will issue flood alerts and warnings as required.

“It’s possible we could see some large waves and spray and urge people to take care near coastal paths and promenades and not to drive through flood water.”

He advised people to check the Environment Agency website at gov.uk and their @EnvAgency Twitter feed for the latest updates.

Meanwhile snow and ice warnings have been issued for northern and north-western areas of the UK, with potential blizzard conditions in store over higher ground including the northern Pennines and Cumbria.

Up to 2ins of snow could fall in some parts, which, combined with gusts of more than 50mph, could create localised blizzard conditions, Simon Partridge of the Met Office said.

“The whole of the country will experience strong winds and severe gales over the course of tomorrow, meaning it will be a wet and windy day for many – and a snowy and windy day for northern parts,” said Mr Partridge.

Strong winds and blustery showers are forecast for the early part of next week, with a risk of hail and thunder.

“Winter is not over just yet,” Mr Partridge added.


What Must We Think About When We Think About Politics?

NATIONAL TRUST PHOTO LIBRARY/ART RESOURCE, NY
A headless body in a topless bar would not have surprised political philosopher Thomas Hobbes, right.

The late political scientist James Q. Wilson used to caution, with his elegant precision, that it’s not enough to have political opinions. You also need facts—which, for him and his brilliant colleagues of the 1960s and 1970s, meant data. You think this policy will produce that outcome? Okay, try it—and then measure what happens. Did you reduce poverty? Raise test scores? And you had also better comb the data for consequences you neither expected nor intended, for all policies must stand or fall by the totality of their results. Remember, too, Wilson and his colleagues used to insist, that correlation is not causation: if two things alter more or less in tandem, that doesn’t by itself prove that one of them changes the other. They may be independent of each other, or some as-yet-unnoticed third force may have sparked both of them. Data don’t speak for themselves but require interpretation—which may or may not be correct. It’s art, not science.

This warning proved a powerful corrective to the liberal ideology about social policy that reigned in the 1960s—pious, unproved platitudes about “root causes” that gave birth to the War on Poverty, whose dire consequences, including an ever-more-deeply entrenched underclass, still bedevil America. But Wilson’s rigor tones up only one of the areas where political thought and discourse tend to be flabby. At least two more elements, well known to political philosophers since antiquity but often ignored today, are essential to intelligent political thinking. You have to have some understanding of psychology—of the minds and hearts that motivate the individuals who are the stuff of politics—and you have to know something about culture, the thick web of beliefs and customs that shape individuals and their social world at least as much as public policies do.

It may well be that the great English political philosophers—Thomas Hobbes and John Locke, who so profoundly influenced America’s Founding Fathers—were wrong as a matter of historical fact in positing that states and governments arose when men made a social contract, mutually agreeing to give up some of their natural freedom of aggression in order to protect their lives, liberty, and property from the aggression of others, and arming a magistrate with the power of the community to punish infractions of the contract with force. But if they were wrong as historians—if there never really were a solemn conclave in which men set their hands to an actual parchment or swore a solemn oath—they were right as psychologists. Individuals come into the world endowed by their Creator with an array of instincts, “among which,” argued the most famous psychologist of all, Sigmund Freud, “is to be reckoned a powerful share of aggressiveness. As a result, their neighbor is for them not only a potential helper or sexual object, but also someone who tempts them to satisfy their aggressiveness on him, to exploit his capacity for work without compensation, to use him sexually without his consent, to seize his possessions, to humiliate him, to cause him pain, to torture and to kill him.” Anyone who has brought up children knows that being a parent is doing the work of civilization on a child-by-child basis, initiating your offspring into the social contract and giving Hobbes’s myth a firm basis in individual, if not historical, reality. No fighting, no biting; learn to share and take turns; use your words to work out disputes; those are your brother’s toys, these are yours. The inborn aggressive impulses don’t disappear but come under control, from wanting to please your parents, out of love or fear. And they get redirected, into wanting to excel others—in sports, in school, in career, in prestige.

If they get redirected, that is. Pick up the tabloids any morning, and you’ll read a litany of depravity that emphatically proves Freud’s and Hobbes’s contention about inborn human aggressiveness, from headless bodies in topless bars, as one famous headline put it, to people spraying bullets into crowds and killing bystanders. Some tabloid images stay in your mind forever, such as six-year-old Elisa Izquierdo, whose crack-addled mother burned, beat, and sexually abused her and forced her to eat her own feces, while the neighbors heard her crying, “Mommy, Mommy, please stop! No more! I’m sorry.” It did stop, when she died in 1995. Or Nixzmary Brown, whose very name implies obliteration and who suffered a similar fate at the hands of her mother and stepfather, virtually in front of the eyes of the social worker on her case, before she died at the age of seven in 2006. Or the stories of sex slaves, from the California girl kidnapped at 15 and held for a decade, until she broke loose last year, to the Idaho 25-year-old chained up by an illegal alien until she escaped two years later in 2014, to the two Amish sisters, 12 and seven, kidnapped last August and fortunately soon released by the alleged perverts who lost their nerve. For more fastidious readers, the Iagos, Fagins, Simon Legrees, and Professor Moriartys of literature—or any one of Mickey Spillane’s or Elmore Leonard’s villains—stand out so vividly because we all know just how crooked the timber of humanity is, perhaps even by self-examination.

That’s why any government needs not just magistrates but also cops, as longtime New Yorkers—who have watched good policing cut the number of murders in their town from one every four hours in 1991 to fewer than one a day in 2014—have learned by experience. Public safety, protection against aggressors both domestic and foreign, is the first job of government; and any politician who thinks that government is chiefly about redistribution or providing social services doesn’t know his job. But don’t forget Michael Pena, the drunken off-duty Gotham cop who forced a terrified 24-year-old about to begin her first teaching job into a courtyard at gunpoint one morning in 2011 and raped her while threatening to blow her head off if she made a sound or looked at his face. When a neighbor yelled at him out a window, Pena held up a finger, as if to signal, “Wait a sec, I’m almost done.” Now, three years later, he whines through his lawyer that his 75-year-to-life sentence is “an injustice.” Political thinkers from Plato to James Madison well knew that those charged with administering the laws and keeping the peace are made of the same crooked timber as the rest of all-too-human humanity, and governments need to build safeguards against the abuse of their power. In fact, as Madison saw it, those who seek to wield government power are made of perhaps crookeder timber than the rest of humanity, so the less power vouchsafed to them, the better.

Faced with such a psychological reality, it’s hard to credit the basic libertarian claim that the primary human motivation is rational self-interest, especially in economic matters. The 30-year-old Charles Dickens, already an international celebrity, constantly heard that argument when he toured America in 1842, and he dismissed it in words that are hard to beat. Southern slaveholders claimed that they never mistreated their slaves because doing so would lessen the value of their property—their capital equipment—and obviously, no one would act against his own economic self-interest. Oh, really? the incredulous Dickens responded. “Is it in the interest of any man to steal, to game, to waste his health and mental faculties by drunkenness, . . . indulge hatred, seek desperate revenge, or do murder? No. All these are roads to ruin. And why, then, do men tread them? Because such inclinations are among the vicious qualities of mankind.” Generations earlier, a thoughtful slaveholder like Thomas Jefferson acknowledged how easily slavery sets loose “the most boisterous passions, the most unremitting despotism” in the master. One example out of legions: the slave chained atop a fence for punishment, until the pickets worked their way through his bare feet. Perhaps no framers of any government were ever as clear-eyed about human nature’s darker side as America’s Founding Fathers.

But we needn’t look to such an extreme example as slavery. Will anyone argue that rational self-interest dependably governs even all modern business decisions? Do not revenge, pride, and other boisterous passions play their part here, too? Does rational self-interest require a CEO to arrive at Aspen in the newest corporate jet with the biggest engines? Does it explain why he would spend $1.2 million to redecorate his office, with—among other hard-to-rationalize items—an $87,000 rug? Does anyone believe that rational self-interest guides every single hiring and firing choice? Even in financial markets, fear and greed stand shoulder to shoulder with reason, constantly jostling it, from tulip mania in the seventeenth century to the recent mortgage madness. In politics, who will argue that rational self-interest guides the hands of upper- and upper-middle-class voters in Manhattan, Park Slope, Cambridge, and so on as they push the levers in the voting booth in favor of ever-higher taxes on themselves for “services” they mostly don’t use (and that often help few or none), provided by overpaid and over-pensioned workers? The real point of such political behavior is to appear morally superior in your own and your neighbors’ eyes, even if the superiority is a fantasy based on an illusion.

Though the Right disdains Jean-Jacques Rousseau as a proto-hippie and the Left as a proto-fascist, the Enlightenment philosophe happens to be a thinker and prose stylist of genius, and he offers a profound proto-Darwinian understanding of how evolving into a social creature remade human psychology. When our early ancestors, quasi-solitary semi-animals—perhaps something like orangutans, Rousseau theorized in 1754—came out of the woods and invented societies, agriculture, and private property, they began to compare themselves with one another. The result: self-consciousness, individuality, and envy—the capstone, and the bane, of fully developed humanity. Rousseau focused on the furor for distinction that this psychological development led to (which explains the $87,000 rug and the smug Park Slope political attitudes), but later philosophers stressed the envy and resentment it produced, whose power anyone who wants to understand politics mustn’t underestimate. Whatever we have ourselves, even in ample sufficiency, we seethe with jealousy that others are richer, sexier, more charming, or more honored than we. Our absolute condition matters less than our sense of grievance at the affront we feel that others have done us by having or being more.

So when you hear angry talk of inequality, don’t expect to reason people out of it by asking them to look at what they themselves have. Don’t expect to sway them by explaining that a bigger pie means a bigger slice for all. Yes, men are blessed with rationality, but they are reasoning rather than reasonable creatures—which is why demagoguery, the real lingua franca of politics, works so well, outweighing reason by far. And the demagogue’s recurrent message is that an equal but poorer and autocratic society is better, “fairer,” than a more prosperous but unequal and free one. The power hunger of rulers, coupled with the resentment of the ruled, is too often the dynamo that turns the wheels of politics, especially in democracies.

Man’s cross-grained tendency toward resentment is also why governing by incentives is so undependable: you never know exactly what you will incentivize. The designers of the early housing projects, for instance, took pains to include stretches of what were supposed to be verdant lawns, on which children could play as in suburban backyards, becoming socialized into a community. The residents of the projects viewed the grass as pure condescension on the part of the authorities, self-satisfied by their munificence in providing such an amenity to the poor, and they expressed their contempt for what they saw as a patronizing gesture by trampling the grass into dust bowls in projects nationwide.

Add to the power of psychology in shaping political reality also the power of culture. Irving Kristol, cofounder of The Public Interest, pointed out in his finest essay a curious and crucially important fact about Adam Smith, the premier philosopher of man as a calculator of self-interest, and of the good effects for all of society that each individual’s pursuit of his rational self-interest produces. Smith, for all his towering genius, noted Kristol, was oddly blind to a dazzlingly obvious characteristic of his rationally calculating man: as a student and professor at the University of Glasgow, the thoroughly Scottish Smith best knew Scotsmen—and Scotsmen of the time when Scotland was one of the most brilliant centers of European Enlightenment thought. As a result, he ascribed to all men the Presbyterian rectitude and Enlightenment reason of the people around him. What Smith didn’t see, in other words, was that his rationally calculating man wasn’t any and all men, wasn’t Man in the abstract, but was instead a man formed by a particular culture—by a complex web of customs, assumptions, unexamined beliefs, and loyalties. An Enlightenment Scot is the purest embodiment of the Protestant ethic that sociologist Max Weber saw as the cultural underpinning of capitalism: he works hard, is frugal and entrepreneurial, defers gratification, and believes that his word is his bond and a deal is a deal—and that the fate of his very soul is inseparable from such virtues. These are not attributes of Man in general but of men bred in a particular culture that endows them with particular beliefs and habits, manners and morals.

Theodore Dalrymple wrote in these pages over a decade ago about a striking lesson he learned about the power of culture when he worked as a doctor in Rhodesia, as Zimbabwe was then called. (See “After Empire,” Spring 2003.) He shared a gracious British colonial house, set in manicured gardens, with three other English doctors. The similar houses of their African colleagues in the same compound, by contrast, soon degenerated into slums, and not because the African doctors were any less intelligent, skilled, or well paid than the Europeans. Instead, their high pay obligated them to care for troops of relatives, who turned the grand old houses into overcrowded tenements and let their goats ravage the grounds. Such is the force of the customs, loyalties, and beliefs that make up a culture. They become a part of your identity: if you don’t observe them, you feel shame and guilt; you feel you have failed and are not a good person. It’s not that the African doctors wouldn’t have liked to live in gracious, well-tended villas but rather that other things were more important to them. With such cultural demands on individuals, Dalrymple observes, no one could expect the Rhodesian civil service to be anything other than as corrupt as it was. Family obligation drove officials to demand bribes. An efficient and honest civil service can thrive only in a different kind of culture, like the one that grew out of the samurai ethic in Japan.

That’s why the Bush administration’s “freedom agenda” in Iraq was doomed from the start. It is an error—generous-hearted but nonetheless mistaken—to believe that all people naturally yearn for freedom and that if you merely remove the yoke of despotism from them, they will instinctively seize their chance to become democratic republicans. They may tell you that is their wish, and even believe they mean it; but other, stronger cultural impulses guide their actions—family and tribal loyalties, ancient, inherited hatreds, religious intolerance and fanaticism, traditional dominance and submission, both social and sexual. These are not ideas of the mind but feelings of the heart, intrinsic to selfhood. To create and maintain an American-style democratic republic takes centuries of multifaceted cultural development. As a matter of historical fact, in America, it took Protestant ideas of individual responsibility and freedom; a Puritan tradition of self-governing congregations; British ideas of liberty, limited government, and patriotism; an Enlightenment spirit of rationality, freedom of thought, and tolerance; and the entrepreneurial spirit that created a nation out of a wilderness. It also took the amazing good fortune of having Founding Fathers of world-historical wisdom and magnanimity. For the Western democracies in general, the rule of law, the sanctity of contract, and the relative honesty of civil servants are immense cultural as well as political achievements, unmatched from China to Argentina.

Some readers are old enough to have seen firsthand how momentous changes in American culture in the 1960s dramatically transformed the nation’s political and social reality—and the consequences of those cultural changes haven’t stopped radiating outward even today. In that decade, elite culture gave up on many of the bourgeois virtues and began to beat the drums for shunning the career rat race in favor of a search for self-realization and self-fulfillment, for experimentation in matters sexual and pharmacological, and for dumping unfulfilling spouses in search of your own bliss—with the bizarre idea that your children would be happy if you were, even with one parent and shrunken financial support. Personal responsibility went under the bus, including the responsibility of criminals, whose depredations elite culture now claimed were the ineluctable consequence of vast social and racial inequalities. A lot of wrecked lives resulted, across the social spectrum. But the greatest damage occurred among the inner-city poor, who had no margin for error. With the stigma lifted from single parenthood—increasingly so as feminism also began to reshape the culture—and lawbreaking chalked up to circumstances beyond individual control, illegitimacy and crime exploded, trapping many in intergenerational poverty and creating a permanent underclass, after decades during which the incomes and educational levels of African-Americans had been strongly rising.

For more prosperous Americans, though their adventures with sex and drugs never stopped, much of the old bourgeois ethic returned. Today’s graduate of Scarsdale High, Brown University, and the Harvard Business School works long hours, conceives children in wedlock and stays married, and unremittingly pushes his or her kids to succeed in everything from science to soccer. By contrast, the ghetto has now developed its own subculture, an intensification of the old sixties promiscuity that has resulted in most inner-city kids being born out of wedlock, a readiness to drop out that puts no stigma on welfare dependency, and a contempt for authority that hampers the ability to learn in already-flawed public schools, makes people unemployable, and blocks cooperation with the police to maintain orderly communities. And these social pathologies are now spreading into the white working class, as they did in Britain long ago—around the time that the minority underclass arose in America—with the same deplorable consequences.

In his famed Farewell Address, George Washington exhorted Americans never to let their culture of liberty and self-reliance weaken. The Constitution, over whose framing he presided, was a remarkable achievement, he acknowledged; but in the end, it is just a parchment barrier against tyranny, a dead letter if the spirit that animates it gutters out. The real Constitution—the one that safeguards the written one against the schemes of omnipresent, power-hungry demagogues—lives in the hearts and minds of the citizens; and parents, teachers, and preachers must never forget their duty to nourish it and keep it vibrant. That is what makes Americans Americans.

Vain words. For decades, those who shape our culture, from grammar-school teachers to newspaper editors, from professors to presidents, have striven to inculcate precisely the opposite lesson. Their main points: this is not a free country but rather one that has oppressed its people along race, class, and gender lines from its birth to this very moment. There is nothing exceptional or admirable about it or the tradition of Western civilization it rests upon, but rather it is a force for worldwide exploitation and oppression. Individuals can’t be self-reliant because only government is powerful enough to protect them from the devouring power of corporations. Nor are individuals the engines of progress: Thomas Edison didn’t build that; Jonas Salk didn’t build that; Steve Jobs didn’t build that—it took a village. Nor did their efforts create wealth that wouldn’t exist but for them, for wealth creation is a natural occurrence, like Old Faithful, while only poverty is anomalous and requires an explanation. Government functionaries aren’t power-hungry, self-interested people like everyone else but rather benevolent experts, dedicated to turning the most up-to-date knowledge into programs for the public good. Government exists not to protect our God-given liberty but to make us equal through redistribution—to bring about equality of condition rather than to ensure equality of opportunity. Moreover, it can do the job that families used to do better than the traditional family ever could, from raising children to caring for Grandma, from womb to tomb. Merit is really a disguised by-product of privilege, from career success down to high scores on school tests, which result from expensive tutoring, high-priced private education, and costly test coaching.

Though George Washington was too clear-sighted about the perversity of human nature to cherish any sentimental fantasy about the perfectibility of man, he nevertheless shared the humanist and Enlightenment view that individuals, through reason, ingenuity, creativity, effort, and knowledge (from experience and study), could make themselves into good citizens who could better their own condition and contribute to the welfare of all. He believed, with most of the Founding Fathers, that nature had endowed man with freedom for just this purpose and that using this freedom for self-improvement and for the good of the community gave life its meaning. Today’s official culture is a culture of dependency rather than of freedom. It sees individuals as something like gerbils in the government’s cage, depending on allotments of state-supplied kibble (bought with the tax dollars of the productive) while they copulate, reproduce, and die, occasionally running pointlessly on a wheel with no thought of a higher purpose.

Many people mouth the platitudes of this new culture, more European social-democratic than American in spirit. But only a portion of Americans really live by it. The great task of politics at this moment is to change the American mind back to a full-throated, rather than embarrassed, belief in enterprise, creativity, freedom of thought, and individualism and its concomitant stress on self-reliance, self-control, and self-improvement. Policies are important, to be sure; but ideas and beliefs ultimately drive politics, and they can’t be left to take care of themselves. They have to be articulated and battled for—a job too important to be left to the schoolteachers and professors. It is a job for citizens, and doing it is what citizenship means.

City Journal. The Founders at Home.
