Have you watched any of the recent Hollywood awards shows? Why do they turn them into political events? They're supposed to be about handing out awards, not about who can make the best anti-Trump speech or joke. Now Hollywood is taking a beating and looking like hypocrites over everything they've been preaching about.
