The experiences and lives of Black women have often been overlooked in Hollywood, but in recent decades, this has thankfully begun to change. Women of color have made great strides in front of and behind the camera and brought their experiences to bear in the types of stories that the industry is willing to tell. Though there is still a long way to go regarding equality in representation, remarkable films and TV series demonstrate the extent to which Black women’s stories are just as powerful and important as those of their white counterparts.