Tattoo regrets

Despite the current economic doldrums, a new business is booming: the tattoo removal industry.  Emily Wax reports:

She arrives quietly, coming in from the rain after work. She lies down on her stomach atop a sleek, white reclining chair. She lifts her shirt and tugs down her jeans slightly.

It’s enough to unveil a large pink flower tattoo with fat, webby green leaves, which she’s here to have lasered off her lower back. She wants to become a mother someday, and she doesn’t want her children to see this. The process could take up to 10 sessions, she says. She pauses. Then she starts crying.

“I was only 18. It was a homemade tattoo done at a party,” says Lizeth Pleitez, 30, who quickly dries her eyes. Her voice is shaking. “I wasn’t thinking about what it meant, you know? Little did I know it meant something else — like people calling it a ‘tramp stamp.’ I’m a Pentecostal, and the body is a temple. And I felt really ashamed.”

If tattoos are the marks of an era — declarations of love, of loss, of triumph, of youthful exuberance or youthful foolishness — then tattoo removals are about regret, confessions that those landmarks are in the past. They’re about the realization that whatever you believed in with such force that you wanted it eternally branded on your skin is now foreign to you.

According to the Pew Research Center, more than 40 percent of Americans between the ages of 26 and 40 have at least one tattoo. Getting a tattoo, once the province of sailors rather than suburbanites, is so mainstream that tats are inked at the mall and seen on everyone from Middle American mothers to H Street hipsters to Hollywood starlets.

Perhaps not surprisingly, a parallel trend is emerging: tattoo removal, with dozens of businesses and training schools opening across the country. . . .

Tattooing was once considered audacious, powerful and rebellious, precisely because of its permanence.

But for a generation that has come of age during an unprecedented revolution in medical technology, tattoo removal by a super-powered laser seems like a facelift for young people, a chance to start over, erase, rewind. Like deleting a bad photo from a digital camera or defriending a Facebook friend.

“It was such an underserved market,” says Christian Slavin, 54, who has an MBA from Harvard and owns Zapatat in Arlington County, which opened in September. “The difference between the regret rate and the removal rate is huge.”

While older lasers burned off the skin, Slavin’s new model interacts only with the ink and “makes it shake and makes it break,” he says. But it still hurts — it feels like hot rubber bands snapping against your skin, most removers say — and often is more painful than getting a tattoo.

“When it’s all said and done, I’m just not that guy anymore,” says Corey Newman, 29, who is getting married in May and wanted to get three tattoos removed: his left arm’s panther, his right shoulder blade’s bull, and two small Chinese characters on his right leg. He is spending $2,500 to take off tattoos that cost $600 to put on. (Which might explain why tattoo removers tend to be better dressed and better paid than tattoo artists.)

via Rethinking the ink: Laser tattoo removal gains popularity – The Washington Post.

OK, so if the demographics of this blog hold true, 40% of you 26- to 40-year-olds have tattoos.  Who has stories of tattoo regret?

Whose job is it to keep Christ in Christmas–and in sermons?

Issues, Etc. host Todd Wilken has posted a provocative point on his Facebook page about the perennial “keep Christ in Christmas” controversies.  Since I’m one of the ten or eleven Americans not on Facebook, I’m indebted to my friend Michael O’Connor for showing it to me and for asking Todd for permission to post it here:

I don’t expect the culture to keep Christ in Christmas; that’s the church’s responsibility.

Besides, the “Christ” of culture bears no resemblance to the Christ we find in scripture. So it’s probably best that the culture leave Christ out of the holiday.

What does disturb me is that many of the Christians worried about keeping Christ in Christmas have little problem with Christ being left out of the preaching they hear the rest of the year.

The end of the malls?

Yes, the shopping malls are packed this time of year.  But hardly any are being built anymore.  And many of the existing malls are being demolished.  The concept of the vast enclosed shopping space surrounded by a vast parking lot seems to be fading.  In its place is the “town center,” a shopping area that is pedestrian-friendly, open to the sky, and that combines shops, restaurants, movie theaters, and places to live.  Architect Roger K. Lewis gives a good account of what happened:

After World War II, the enclosed regional shopping mall emerged because of two interdependent American phenomena: construction of the interstate highway system and rapid growth of low-density metropolitan suburbs.

Starting in the early 1950s, residents and many businesses fled cities, populating the expanding outer suburbs. Downtown department stores and smaller shops had ever fewer customers, but suburbanites still needed a place to shop, and the regional shopping center satisfied that need perfectly.

With affordable cars, cheap gas, a growing network of arterial roads and a seemingly endless supply of inexpensive land, the regional shopping mall was a logical invention. Equally logical was the real estate and mall design formula: acquire land with access to a major highway; assemble enough acreage to build a very large, weatherproof structure surrounded by parking lots; construct long concourses (often two levels high) lined on each side by scores of shops; and plug the ends of concourses with anchor department stores. To complete and enhance the formulaic picture, provide a food court, pump up concourse light levels, design enticing storefronts, pipe in music and pleasant scents, and install seasonal decorations, including Santa Claus.

This formula proved extremely successful throughout America.

Today, however, middle-class flight from cities has ebbed. Adult children of the generations that inhabited post-war suburbia often choose not to stay in the suburban settings where they grew up. Even their parents, tired of maintaining a house bigger than they now need, are heading back toward or into cities. Others (the young, middle-aged or elderly) are choosing to live in denser, walkable communities, where there is more to do and where shopping does not require driving several miles. This is one reason why town centers are being built, even in suburban locations, and why huge shopping malls are not.

Traditional nuclear families (mother, father, two kids) are now less than half of all American households. Coupled with falling home values, mortgage foreclosures and unemployment, demographic reality is contributing to the depopulation of many suburban and exurban communities. A shopping mall cannot survive without population growth and customers who can afford to shop.

Also, for essentially aesthetic reasons, more people prefer not to shop in fading, older retail facilities that may be poorly maintained and perhaps half-empty. This suggests that Americans’ taste and appreciation of good architecture is improving.

via Visions of lively town centers dancing in more developers’ heads – The Washington Post.

What a concept.  Diverse businesses arranged off sidewalks with people living upstairs.  Sounds like downtowns.

But I do like the new town centers.  There are some good ones around where we live.  I’m curious how prevalent these are.  Do you have some where you live?  Are they an improvement over malls?  Or are they basically the same things only without roofs?

How John Stuart Mill changed the culture

Roger Kimball on the legacy of John Stuart Mill:

In 1859, two revolutionary books were published. One was Charles Darwin’s On the Origin of Species. The other was John Stuart Mill’s pamphlet On Liberty. Darwin’s book revolutionized biology and fundamentally altered the debate between science and religion. Mill’s book revolutionized the way we think about innovation in social and moral life.

What is your opinion of innovation? Do you think it is a good thing? Of course you do. You may or may not have read Mill on the subject, but you have absorbed his lessons. What about established opinion, customary ways of doing things? Do you suspect that they should be challenged and probably changed? Odds are that you do. Mill has taught you that, too, even if you have never read a line of On Liberty.

Mill’s essay was ostensibly about the relation between individual freedom and society. Mill famously argued that the only grounds on which society was justified in exercising control over its members, whether that control be in the form of “legal penalties” or simply “the moral coercion of public opinion,” was to “prevent harm to others. His own good, either physical or moral, is not a sufficient warrant.”

This part of Mill’s argument quickly attracted searching criticism. The British judge James Fitzjames Stephen, for example, went to the heart of the problem when he observed that Mill assumed that “some acts regard the agent only, and that some regard other people. In fact, by far the most important part of our conduct regards both ourselves and others.” As for withholding “the moral coercion of public opinion,” Stephen observed that “the custom of looking upon certain courses of conduct with aversion is the essence of morality.”

Stephen’s criticisms of Mill were published in his book Liberty, Equality, Fraternity, which appeared about a decade after On Liberty. Many of the criticisms are devastating. Intellectually, Stephen made mincemeat of Mill. But that has hardly mattered. Mill’s doctrines have taken the world by storm, while Stephen has receded to become a footnote in intellectual history.

Why? One reason is that Mill said things that people wanted to hear. Mill seemed to be giving people a permanent vacation from the moral dictates of society. How often have you heard the argument “It’s not hurting anyone else” put forward as a justification for self-indulgence?

But it was not simply what he said about the relation between individual freedom and social control that made On Liberty such an influential tract. Much more important was the attitude, the emotional weather, of the book.

On Liberty is only incidentally a defense of individual freedom. Its deeper purpose is to transform the way we regard established morality and conventional behavior as such. In brief, Mill taught us to be suspicious of established morality not because what it says is wrong (maybe it is, maybe it isn’t) but simply because it is established.

Think about that. The tradition that Mill opposed celebrated custom and established morality precisely because they had prevailed and given good service through the vicissitudes of time and change; their longevity was an important token of their worthiness.

Mill overturned this traditional view. Henceforth, the customary, the conventional was suspect not because it had failed but simply because it was customary and conventional. . . .

Granted that every change for the better has depended on someone embarking on a new departure. Well, so too has every change for the worse. And surely, [David] Stove observes, there have been at least as many proposed innovations which “were or would have been for the worse as ones which were or would have been for the better.” Which means that we have at least as much reason to discourage innovators as to encourage them, especially when their innovations bear on things as immensely complex as the organization of society.

The triumph of Mill’s teaching shows that such objections have fallen on deaf ears. But why? Why have “innovation,” “originality,” etc., become mesmerizing charms that neutralize criticism before it even gets started when so much that is produced in the name of innovation is obviously a change for the worse? An inventory of the fearsome social, political, and moral innovations made in this century alone should have made every thinking person wary of unchaperoned innovation.

One reason that innovation has survived with its reputation intact, Stove notes, is that Mill and his heirs have been careful to supply a “one-sided diet of examples.” It is a technique as simple as it is effective:

Mention no past innovators except those who were innovators-for-the-better. Harp away endlessly on the examples of Columbus and Copernicus, Galileo and Bruno, Socrates and (if you think the traffic will bear it) Jesus. Conceal the fact that there must have been at least one innovator-for-the-worse for every one of these (very overworked) good guys. Never mention Lenin or Pol Pot, Marx or Hegel, Robespierre or the Marquis de Sade.

via Roger’s Rules » Liberty, Equality, Fraternity.


The diner as American icon

Foreigners are fascinated by American diners, seeing them as icons of American culture.  So says the BBC:

Sitting in a diner, on the inside looking outside.

This is a quintessential American experience. Add a booth, a Formica counter and a cup of joe – as diner patrons call their coffee.

Themed restaurants and burger chains from Mumbai to Manchester aim to replicate this chrome-flashed experience, and diner fare such as home fries and fluffy pancakes are now global fast food staples.

So why are these kerbside kitchens a landmark of US culture?

The first such establishment opened in 1872 in Providence, Rhode Island – a “night lunch wagon” to serve those who worked and played long after the restaurants had shut at 20:00.

Its mix of open-all-hours eating and cheap, homemade food proved a hit, and the formula has been repeated ever since.

Today the diner occupies a place in the American heartland. The closest British approximation is not a retro-chic replica diner where hip patrons eat gourmet burgers, but the local pub.

Just as dignitaries visiting the UK and Ireland are taken for a pint and a photo call, no US election campaign is complete without a stop at a diner to emphasise the candidate’s everyman or everywoman credentials.

[Photo caption: On the campaign trail in a diner (clockwise from left): George W Bush, Barack Obama, Sarah Palin, Mitt Romney, Al Gore. The diner is now a compulsory stop on the campaign trail.]

“The thing about this democratic counter is that anyone can go in and sit down. It can be a professor, it can be a worker,” says Richard Gutman, author of American Diner Then and Now.

“A friend of mine in Pennsylvania ate in a diner and he’s in the middle of two guys. One is the chief of police and the other is just some character. The policeman looks over and says, ‘Didn’t I arrest you last year?’ and the guy says, ‘Yes you did – pass the ketchup.’”

via BBC News – Why the diner is the ultimate symbol of America.

That diners are democratic is striking in countries with a rigid class system!  The article goes on to survey the figure of the diner in American art (Edward Hopper) and movies (Pulp Fiction).   I would say that other countries would do well to imitate our diners, as opposed to our fast food joints.

The invention of the weekend

Monica Hesse on the invention of the weekend:

Before weekends could be long, they first had to be weekends.

For most of the 19th century and part of the 20th, there were none — there were simply weeks that ended. The working class had Sundays off only. Because of this, many of them would spend the Lord’s day carousing, then call in sick on Mondays. This practice was observed with enough regularity that it was called “Keeping Saint Mondays.” Religious groups hated it, and so did bosses, writes University of Pennsylvania professor Witold Rybczynski in his leisure-time history, “Waiting for the Weekend.” Various special interest groups put their heads together to come up with a solution: Saturdays. Give the people Saturday afternoon off so they have less reason to be plastered Monday morning.

The term “weekend” first shows up in the Oxford English Dictionary in 1879; it wasn’t until the Great Depression that the Saturday-Sunday dynamic duo really became codified in the United States. Shorter hours were seen as a “remedy” for unemployment, Rybczynski writes. “Each person would work less, but more people would have jobs.”

via Giving Thanks — for Long Weekends – The Washington Post.

I think this is a little oversimplified.  Certainly the Biblical sabbath was the source of the practice of a day of rest, a dramatic example of the influence of Christianity on the civilization as a whole.  This account does explain adding Saturday, which, however, was already the Jewish day of rest, not to mention that of Christian Adventist groups.  I wonder if the climate of immigration in the 19th century–all those Jewish immigrants who would not work on Saturday–contributed to the additional day off.

Never mind that the commandment says “six days shall you labor,” as well as underscoring the one day thou shalt not.  I suppose Saturdays became the day people labored for themselves–fixing things around the house, tinkering with this and that, running errands, “getting things done”–as opposed to laboring for someone else for pay.  That doubtless helped carve out space for the individual and the home.

