There are different interpretations of the word "success." An actor/director can achieve great financial success or critical acclaim without winning an award. However, an award usually leads to greater financial success.
Without a doubt, awards lead to mainstream recognition. When people win Oscars, they will probably be in higher demand, be offered more roles and higher salaries, receive more respect, and be taken more seriously. They will also probably be offered more commercial endorsements and appear on the covers of more magazines. This doesn't necessarily legitimize them, but it does give them a more concrete place in mainstream show business, which occupies a prominent space in our culture.
When African American actors/directors win awards, it seems to say that we are making progress. Success is being enjoyed by all races. Discrimination is on the way out.
I'm not saying those statements are true, but they are the kinds of messages such wins are often used to promote.
In my opinion, awards won't have much of an effect on racial politics in Hollywood until black actors and directors are more present in movies. Hollywood movies are still largely dominated by white actors and directors. The Oscars only happen once a year, but new movies are released almost every week, and it's those week-to-week casting and hiring decisions that determine who actually gets seen.
Thanks for your thoughtful post. Also, really good point about the number of films coming out each week versus the acknowledgement given by the Oscars.