Progressivism and omnipotent government

In line with the “Obama as Messiah” post, here is another example of secularism turning into paganism.  Godless people, trying to fill the void, can also invest the state with divine power and authority.  Drawing on Charles R. Kesler’s I Am the Change: Barack Obama and the Crisis of Liberalism, George Will shows that progressive politics, from the beginning, has had an intrinsic connection to belief in an unlimited government power that can solve all problems:

Progress, as progressives understand it, means advancing away from, up from, something. But from what?

From the Constitution’s constricting anachronisms. In 1912, Wilson said, “The history of liberty is the history of the limitation of governmental power.” But as Kesler notes, Wilson never said the future of liberty consisted of such limitation.

Instead, he said, “every means . . . by which society may be perfected through the instrumentality of government” should be used so that “individual rights can be fitly adjusted and harmonized with public duties.” Rights “adjusted and harmonized” by government necessarily are defined and apportioned by it. Wilson, the first transformative progressive, called this the “New Freedom.” The old kind was the Founders’ kind — government existing to “secure” natural rights (see the Declaration) that preexist government. Wilson thought this had become an impediment to progress. The pedigree of Obama’s thought runs straight to Wilson.

And through the second transformative progressive, Franklin Roosevelt, who counseled against the Founders’ sober practicality and fear of government power: “We are beginning to wipe out the line that divides the practical from the ideal” and are making government “an instrument of unimagined power” for social improvement. The only thing we have to fear is fear of a government of unimagined power:

“Government is a relation of give and take.” The “rulers” — FDR’s word — take power from the people, who in turn are given “certain rights.”

This, says Kesler, is “the First Law of Big Government: the more power we give the government, the more rights it will give us.” It also is the ultimate American radicalism, striking at the roots of the American regime, the doctrine of natural rights. . . . [Read more…]

The era of black-and-white TV

President Obama dismissed the Republican convention in these terms:

“Despite all the challenges that we face … what they offered over those three days was more often than not an agenda that was better suited for the last century. It was a re-run. We’ve seen it before. You might as well have watched it on a black-and-white TV.”

If only it were!  That was the last time anything was consistently good on television.  That was the golden age of TV, the era of Jack Benny, Gracie Allen, Rod Serling, Edward R. Murrow.

The Eisenhower administration!  The early Elvis!  Intact families!  Route 66!

I guess the dividing line would be one’s attitude toward the counterculture that began in the late 1960s.  Liberals would generally favor it, I suppose, with conservatives bemoaning the changes (e.g., the sexual revolution).

Though the era of black-and-white TV was a vibrant, creative, and positive time culturally for America, it was no utopia; it had real problems, such as the institutionalized racism of the Jim Crow laws.  But compare the early Civil Rights protesters (moral, religious, dignified) with today’s Occupy Wall Street protesters (unfocused, hedonistic, squalid).  And, if you want counterculture, surely the Beatniks, reading existentialist philosophy and listening to jazz, were cooler than the Hippies, tripped out on acid and wearing flowers in their hair.

I wonder if we could date our cultural collapse from the advent of color television.  (The first all-color lineup was in 1966, which would be about right.)

via Obama: RNC fare for ‘black-and-white TV’ – POLITICO.com.

From citizens to clients

George Will sums up Spoiled Rotten: How the Politics of Patronage Corrupted the Once Noble Democratic Party and Now Threatens the American Republic by Jay Cost, who argues “that the party has succumbed to ‘clientelism,’ the process of purchasing cohorts of voters with federal favors.”

Before Franklin Roosevelt, “liberal” described policies emphasizing liberty and individual rights. He, however, pioneered the politics of collective rights — of group entitlements. And his liberalism systematically developed policies not just to buy the allegiance of existing groups but to create groups that henceforth would be dependent on government.

Under FDR, liberalism became the politics of creating an electoral majority from a mosaic of client groups. Labor unions got special legal standing, farmers got crop supports, business people got tariff protection and other subsidies, the elderly got pensions, and so on and on.

Government no longer existed to protect natural rights but to confer special rights on favored cohorts. As Irving Kristol said, the New Deal preached not equal rights for all but equal privileges for all — for all, that is, who banded together to become wards of the government.

In the 1960s, public-employee unions were expanded to feast from quantitative liberalism (favors measured in quantities of money). And qualitative liberalism was born as environmentalists, feminists and others got government to regulate behavior in the service of social “diversity,” “meaningful” work, etc. Cost notes that with the 1982 amendments to the Voting Rights Act, a few government-approved minorities were given an entitlement to public offices: About 40 “majority-minority” congressional districts would henceforth be guaranteed to elect minority members.

Walter Mondale, conceding to Ronald Reagan after the 1984 election, listed the groups he thought government should assist: “the poor, the unemployed, the elderly, the handicapped, the helpless and the sad.” Yes, the sad.

Republicans also practice clientelism, but with a (sometimes) uneasy conscience. Both parties have narrowed their appeals as they have broadened their search for clients to cosset.

via George Will: An election to call voters’ bluff – The Washington Post.

America’s culture gap

Democrats often cite a widening economic gap between the affluent and those barely scraping by.  The controversial social scientist Charles Murray, who leans conservative, says that’s only the half of it.  There is a growing cultural gap between the affluent (who still, usually, get educated, get married, and go to church) and the working class (who increasingly raise children outside of marriage and are becoming more and more secular).

Note how this flies in the face of the conventional wisdom that religion is for the poor and uneducated while the upper crust lives a hedonistic, permissive lifestyle.  It’s actually the reverse!  And this isn’t a racial thing:  Murray is looking specifically at the demographics of white people.  (Lower-income blacks, for example, tend to be very religious, unlike lower-income whites.)

Murray, drawing from his new book, Coming Apart: The State of White America, 1960-2010, explains his findings in a Wall Street Journal essay from earlier this year.  He describes two fictional-but-based-in-fact communities, the upper-income suburb of Belmont and the lower-income community of Fishtown (both predominantly white):

In Belmont and Fishtown, here’s what happened to America’s common culture between 1960 and 2010.

Marriage: In 1960, extremely high proportions of whites in both Belmont and Fishtown were married—94% in Belmont and 84% in Fishtown. In the 1970s, those percentages declined about equally in both places. Then came the great divergence. In Belmont, marriage stabilized during the mid-1980s, standing at 83% in 2010. In Fishtown, however, marriage continued to slide; as of 2010, a minority (just 48%) were married. The gap in marriage between Belmont and Fishtown grew to 35 percentage points, from just 10.

Single parenthood: Another aspect of marriage—the percentage of children born to unmarried women—showed just as great a divergence. Though politicians and media eminences are too frightened to say so, nonmarital births are problematic. On just about any measure of development you can think of, children who are born to unmarried women fare worse than the children of divorce and far worse than children raised in intact families. This unwelcome reality persists even after controlling for the income and education of the parents.

In 1960, just 2% of all white births were nonmarital. When we first started recording the education level of mothers in 1970, 6% of births to white women with no more than a high-school education—women, that is, with a Fishtown education—were out of wedlock. By 2008, 44% were nonmarital. Among the college-educated women of Belmont, less than 6% of all births were out of wedlock as of 2008, up from 1% in 1970.

Industriousness: The norms for work and women were revolutionized after 1960, but the norm for men putatively has remained the same: Healthy men are supposed to work. In practice, though, that norm has eroded everywhere. In Fishtown, the change has been drastic. (To avoid conflating this phenomenon with the latest recession, I use data collected in March 2008 as the end point for the trends.)

The primary indicator of the erosion of industriousness in the working class is the increase of prime-age males with no more than a high school education who say they are not available for work—they are “out of the labor force.” That percentage went from a low of 3% in 1968 to 12% in 2008. Twelve percent may not sound like much until you think about the men we’re talking about: in the prime of their working lives, their 30s and 40s, when, according to hallowed American tradition, every American man is working or looking for work. Almost one out of eight now aren’t. Meanwhile, not much has changed among males with college educations. Only 3% were out of the labor force in 2008.

There’s also been a notable change in the rates of less-than-full-time work. Of the men in Fishtown who had jobs, 10% worked fewer than 40 hours a week in 1960, a figure that grew to 20% by 2008. In Belmont, the number rose from 9% in 1960 to 12% in 2008.

Crime: The surge in crime that began in the mid-1960s and continued through the 1980s left Belmont almost untouched and ravaged Fishtown. From 1960 to 1995, the violent crime rate in Fishtown more than sextupled while remaining nearly flat in Belmont. The reductions in crime since the mid-1990s that have benefited the nation as a whole have been smaller in Fishtown, leaving it today with a violent crime rate that is still 4.7 times the 1960 rate.

Religiosity: Whatever your personal religious views, you need to realize that about half of American philanthropy, volunteering and associational memberships is directly church-related, and that religious Americans also account for much more nonreligious social capital than their secular neighbors. In that context, it is worrisome for the culture that the U.S. as a whole has become markedly more secular since 1960, and especially worrisome that Fishtown has become much more secular than Belmont. It runs against the prevailing narrative of secular elites versus a working class still clinging to religion, but the evidence from the General Social Survey, the most widely used database on American attitudes and values, does not leave much room for argument.

For example, suppose we define “de facto secular” as someone who either professes no religion at all or who attends a worship service no more than once a year. For the early GSS surveys conducted from 1972 to 1976, 29% of Belmont and 38% of Fishtown fell into that category. Over the next three decades, secularization did indeed grow in Belmont, from 29% in the 1970s to 40% in the GSS surveys taken from 2006 to 2010. But it grew even more in Fishtown, from 38% to 59%.

It can be said without hyperbole that these divergences put Belmont and Fishtown into different cultures.

via Charles Murray on the New American Divide – WSJ.com.

What are the implications of this cultural divide?  I would think it means, for one thing, that churches should concentrate their evangelistic efforts in working-class areas rather than the affluent suburbs they currently target.  (Working-class folks used to be the backbone of the church.  What would it take to make that happen again?)

HT:  Roberta Bayer

Is the President our national pastor?

No, the president is most emphatically NOT a national pastor; such an understanding betrays a deadly confusion of God’s Two Kingdoms and completely distorts the nature of the pastoral office.  But the normally circumspect Christianity Today has two articles that say, yes, the president kind of is a national pastor.

See Owen Strachan, “Our American President: The ‘Almost Pastor’ of an ‘Almost Chosen’ Land,” and Judd Birdsall, “Is the President America’s Pastor in Chief?”  A sample from the latter:

Ironically, the curious American integration of piety and the presidency largely stems from our separation of church and state. Without an established religion led by an archbishop, ecumenical patriarch, or grand mufti, the President acts, for better or worse, as the nation’s senior religious figure.

Cambridge University professor Andrew Preston makes this point in his massive, 815-page work Sword of the Spirit, Shield of Faith: Religion in American War and Diplomacy: “There is no official hierarchy in the American civil religion, but as the nation’s head of state as well as its chief executive … the president has acted as its de facto pope.”

What exactly are the President’s papal duties? Preston explains: “Since George Washington, the president has been the interpreter of rites, symbols, and meanings of the civil religion, with some—particularly Abraham Lincoln, Franklin Roosevelt, and Harry Truman—significantly recasting it under the pressure of war.”

Obama’s and Romney’s faith-infused interpretations of the Aurora shooting are a case in point, and the most recent chapter in the long history of the presidential pastorate. Both politicians denounced the killing as “evil,” and both turned to the Bible for meaning, solace, and hope.

In his public statement after meeting with victims’ families in Aurora, Obama quoted the famous eschatological promise found in Revelation 21:4: “He will wipe away every tear from their eyes, and death shall be no more. Neither shall there be mourning, nor crying, nor pain anymore, for the former things have passed away.”

Focusing on the here and now, Romney encouraged his audience to “mourn with those who mourn,” a reference to Romans 12:15. In poignant remarks packed with Christian language, Romney expressed his prayer that “the grieving will know the nearness of God” and “the comfort of a living God.” Citing the apostle Paul by name, Romney quoted from 2 Corinthians 1:3–4, “blessed be God, who comforteth us in all our tribulations, that we may be able to comfort them which are in any trouble.”

Many commentators applauded Romney for sounding “presidential.” Especially in times of tribulation, Americans expect their President to be their pastor—not in any formal sense as a leader of a church but in the general sense as a provider of spiritual care and theological perspective for the nation.

The president as our archbishop, since we’re not allowed to have a state church?  Our pope?  A provider of our spiritual care?  These writers, of course, are speaking by way of analogy.  They are describing the workings of the “civil religion,” which is not to be confused (though it often is) with Christianity, though I’m not sure these articles make that distinction clear.  They describe how the presidency functions and how the public responds, not how things should be.  But still. . . .

What is wrong with this picture?

HT:  Paul McCain

Poverty rate is highest in 50 years

New data show that the poverty rate is on track to climb to its highest level since the 1960s:

The ranks of America’s poor are on track to climb to levels unseen in nearly half a century, erasing gains from the war on poverty in the 1960s amid a weak economy and fraying government safety net. . . .

The official poverty rate will rise from 15.1 percent in 2010, climbing as high as 15.7 percent. Several predicted a more modest gain, but even a 0.1 percentage point increase would put poverty at the highest level since 1965. . . .

Even after strong economic growth in the 1990s, poverty never fell below a 1973 low of 11.1 percent. That low point came after President Lyndon Johnson’s war on poverty, launched in 1964, that created Medicaid, Medicare and other social welfare programs. . . .

The analysts’ estimates suggest that some 47 million people in the U.S., or 1 in 6, were poor last year. An increase of one-tenth of a percentage point to 15.2 percent would tie the 1983 rate, the highest since 1965. The highest level on record was 22.4 percent in 1959, when the government began calculating poverty figures. . . .

Analysts also believe that the poorest poor, defined as those at 50 percent or less of the poverty level, will remain near its peak level of 6.7 percent. . . .

The 2010 poverty level was $22,314 for a family of four, and $11,139 for an individual, based on an official government calculation that includes only cash income, before tax deductions. It excludes capital gains or accumulated wealth, such as home ownership, as well as noncash aid such as food stamps and tax credits, which were expanded substantially under President Barack Obama’s stimulus package.

An additional 9 million people in 2010 would have been counted above the poverty line if food stamps and tax credits were taken into account.

So, by these definitions, out of every six Americans you see on the streets, one will be poor.  And about one in fifteen (the 6.7 percent counted as the poorest poor) will be really poor.

