Serious question - do white people notice this?
"Despite the number of [slavery] films,...there's a relative paucity of thematic range. All of these critically acclaimed films use variations on a single narrative: Black people are oppressed by bad white people. They achieve freedom through the offices of good white people. Happy ending."
And do whites really think this isn't a racist country when that's the tale they chose, without any compulsion, to tell?