Freedom reconsidered

Now that the truths that were foundational to the American republic – that there is a Creator who is the basis for human equality and for rights that transcend the state – are no longer self-evident, we are starting to see a rethinking of everything America used to stand for.  For example, Princeton professor Philip Pettit, in a book entitled Just Freedom, argues that we need to do away with the “libertarian” notion of individual freedom.  Instead, we should pursue “democratic freedom,” based on the liberty of groups not to be dominated by another group.

Liberal think-tanker Danielle Allen explains.

Individualism vs. collectivism

Here is how George Will answers Elizabeth Warren’s statement that we posted yesterday:

Such an agenda’s premise is that individualism is a chimera, that any individual’s achievements should be considered entirely derivative from society, so the achievements need not be treated as belonging to the individual. Society is entitled to socialize — i.e., conscript — whatever portion it considers its share. It may, as an optional act of political grace, allow the individual the remainder of what is misleadingly called the individual’s possession.

The collectivist agenda is antithetical to America’s premise, which is: Government — including such public goods as roads, schools and police — is instituted to facilitate individual striving, a.k.a. the pursuit of happiness. The fact that collective choices facilitate this striving does not compel the conclusion that the collectivity (Warren’s “the rest of us”) is entitled to take as much as it pleases of the results of the striving.

Warren’s statement is a footnote to modern liberalism’s more comprehensive disparagement of individualism and the reality of individual autonomy. A particular liberalism, partly incubated at Harvard, intimates the impossibility, for most people, of self-government — of the ability to govern one’s self. This liberalism postulates that, in the modern social context, only a special few people can literally make up their own minds. . . .

Many members of the liberal intelligentsia, that herd of independent minds, agree that other Americans comprise a malleable, hence vulnerable, herd whose “false consciousness” is imposed by corporate America. Therefore the herd needs kindly, paternal supervision by a cohort of protective herders. This means subordination of the bovine many to a regulatory government staffed by people drawn from the clever minority not manipulated into false consciousness.

Because such tutelary government must presume the public’s incompetence, it owes minimal deference to people’s preferences. These preferences are not really “theirs,” because the preferences derive from false, meaning imposed, consciousness. This convenient theory licenses the enlightened vanguard, the political class, to exercise maximum discretion in wielding the powers of the regulatory state.

Warren’s emphatic assertion of the unremarkable — that the individual depends on cooperative behaviors by others — misses this point: It is conservatism, not liberalism, that takes society seriously. Liberalism preaches confident social engineering by the regulatory state. Conservatism urges government humility in the face of society’s creative complexity.

Society — hundreds of millions of people making billions of decisions daily — is a marvel of spontaneous order among individuals in voluntary cooperation. Government facilitates this cooperation with roads, schools, police, etc. — and by getting out of its way. This is a sensible, dynamic, prosperous society’s “underlying social contract.”

via Elizabeth Warren and liberalism, twisting the ‘social contract’ – The Washington Post.

The choices are individualism or collectivism.  Or is there something in between?

The internet as collectivist monster

Jaron Lanier was one of the inventors of “virtual reality,” but now he is saying that the internet has turned into a collectivist monster.  From a review of his book entitled You Are Not a Gadget: A Manifesto:

A self-confessed “humanistic softie,” Lanier is fighting to wrest control of technology from the “ascendant tribe” of technologists who believe that wisdom emerges from vast crowds, rather than from distinct, individual human beings. According to Lanier, the Internet designs made by that “winning subculture” degrade the very definition of humanness. The saddest example comes from young people who brag of their thousands of friends on Facebook. To them, Lanier replies that this “can only be true if the idea of friendship is reduced.”

Anyone who has followed technology and for years has resented the adoration heaped upon the ascendant tribe will positively swoon as Lanier throws into one great dustbin such sacred concepts as Web 2.0, singularity, hive mind, wikis, the long tail, the noosphere, the cloud, snippets, crowds, social networking and the Creative Commons — dismissing them all as “cybernetic totalism” and, more fun yet, as potential “fascism.”

The “cybernetic totalists” base their thinking on decades-old ideas known as “chaos” or “complexity” theory, which began with a question about ants: How does something as complex as a colony arise from the interactions of dumb ants? This approach can be useful if one is studying mass phenomena such as traffic jams. The problem comes when we try to apply ant-derived thinking to people who are trying to lead creative, expressive lives.

In the totalist model, algorithms (most of them secret and proprietary, such as Google’s search engine) create knowledge by making links among the system's many human participants. From this possibly infinite set of connections arises intelligence. The creative actor is no longer the human being but the system and its algorithms, out of which emerges a living, nonhuman or trans-human higher being. (Lanier does not hesitate to compare this to religion.) There are some, such as Google co-founder Larry Page, who believe the Internet will soon be alive.

The poor human participants become “peasants” working for the “lords” of technology: those who have deeper access to the workings of the Web (read Google, Yahoo and hedge funds with vast analytic resources) and who profit from our volunteer labor. Our role is simply to keep contributing our code-bits and snippets and Facebook pages. We become what Lanier calls “computer peripherals,” and he is raising a defense against this reduction of our being.

Lanier says there is still time “to promote alternate designs [of the Internet] that resonate with human-kindness.” He is fighting for something “ineffable” in the human imagination and creativity; for us to see personhood as “a quest, a mystery, and a leap of faith.” These are not views normally expressed by computer scientists, and anyone but Lanier would get laughed off the stage. Yet he dares to say the forbidden: that computers as we know them may be incapable of truly representing human thoughts and relationships.

This book is very much like the Jaron Lanier he shows in his public appearances: mind-bending, exuberant, brilliant, thinking in all directions. He describes how computer software locks us into rigid ways of thinking (which brings up the next logical question, though he fails to ask it: How can a computer, with its need for standard interfaces, not lock us into the behavior and thought patterns implicit in our software?). He discusses how pack-like attacks arise on the Web wherever there is an opportunity for “consequence-free, transient anonymity.” The topic hardly matters: “Jihadi chat looks just like poodle chat.”

He describes the sad, stressful lives of young people who “must manage their online reputations constantly.” He makes the point that the free use of everything on the Web leads to endless mashups, except for the one thing legally protected from being mashed-up: ads, making advertising the one thing on the Internet that can be “owned.” In the book's final pages, he tries to imagine an alternative to “totalist” computing: a new sort of virtual-reality software that would allow us to express ourselves through transformations of our virtual bodies, as if we were cephalopods. All of which sounds quite wild, but so did virtual reality in 1980.

via Book review: You Are Not a Gadget, by Jaron Lanier –

Buy the book here.