Captain Ed has a post up about Hollywood's failure to get people to see anti-American movies. The big point? Hollywood still doesn't get it.
The virulent anti-Americanism that Hollywood seems to foster and nurture doesn't sell well in America. While movie stars jet around the world and receive praise for denouncing the USA, the rest of America lives in the real world, where bad guys try to kill us, good and evil can be defined, and telling us that we're all evil morons makes us less likely to open our wallets. One of my early statements years ago was that Liberals are people who never have to deal with the consequences of their own actions. Nothing embodies that statement more than Hollywood, where the stars are insulated from the real world by layers of helpers, staff, gates and guards, and any measure that money can buy. They can traipse around calling America all kinds of nasty names, and all they hear is applause. Well, when access to an event is tightly controlled, the only people sure to be let in are those who will applaud such statements. It's one huge echo chamber, and the idiots in Hollywood actually believe that everyone agrees with them.
So they make their anti-America movie, where US Soldiers are dumb brutes who like to rape girls and kill people indiscriminately. And the rest of America, which still lives in the real world, refuses to buy the bullshit.
There's a reason I've only gone to see one movie in the theater in the past two years. And the script for that movie (Harry Potter/Order of the Phoenix) came from a book written by someone outside of Hollywood. I refuse to be preached to by idiots who haven't dealt with real life in years. Damn near every movie coming out of Hollywood for the past decade has had some sort of anti-American or anti-conservative bent to it, and it's been getting worse and worse as of late. I refuse to pay money to watch someone else's propaganda.
As far as I'm concerned, Hollywood can sink and disappear.