… It happens all the time in Hollywood. Whenever an actress wants to prove she’s all grown up, the clothes start to come off.
Hollywood, where everyone is a feminist and a sexist at the same time.
Emma Watson, who spoke at the UN demanding equality for all sexes, doesn’t even see the paradox within the industry she works for. I’d like to know her thoughts on why male actors aren’t encouraged to get naked and compromise their morals to prove their acting chops.
Most recently, Reese Witherspoon has decided to shed her girl-next-door image by doing, what else, “racy” sexually explicit scenes in her newest film, Wild.
According to Vogue, for her role in Wild, “… the Mud actress [Reese] had to learn how to look like she was shooting heroin, and employed a hypnotist to get her through the racy sex scenes.”
“I just didn’t want to hear, ‘Oh, we don’t want to see Reese have sex…Oh, can we not have any profanity?’” she explained to the magazine of stepping outside her comfort zone for the movie. “I wanted it to be truthful, I wanted it to be raw, I wanted it to be real.” [Source]
Are there no other roles for an actress to be professionally challenged by other than having graphic sex on the big screen?
Why aren’t more actresses outraged by this? Why are women so willing to accept their sexuality as the height of all they have to offer?
It’s beyond frustrating to see women, presumably intelligent women who have daughters of their own, allow themselves to be valued not for their minds or character, but for how hot they look naked on camera.