This is a year late, but here's my take on it: in the same way that male directors tend to center their movies on men because they know what being a man is like, and women directors tend to center theirs on women, white filmmakers tend to make films about white people because they know what it's like to be white. The film industry isn't very diverse, but I believe this will slowly change as other cultures gain prominence in America. I think it's just a bit of a waiting game for the 'pot' to be stirred.