Contrary to my usual custom, I really don’t have anything to say about the Academy Awards. I tried to watch some of the show, but I found it insufferable and had to turn away. Which raised another question in my mind: Is it still true that movies influence the culture? I think we are seeing the same dysfunction evident in the rest of the arts, in which the “high culture” of the artsy elite has become culturally irrelevant, while the “pop culture” of the money-makers simply conforms to whatever trends are already out there.