How important are awards to the success and professional legitimatization of Black American actors and/or directors?
I think awards are beneficial to anyone who wins them. That doesn't mean, however, that they can't matter more to a specific demographic. In all fairness, Black Americans have clearly been 'snubbed' at Oscar ceremonies in the past. So in a way, I believe a win is a greater achievement for them, since it happens far less often. And for anyone inclined to look down on people of color, a win lets those artists be seen in a better light and taken more seriously.