As a black man (not biracial) I got the same thing growing up in black schools: "you act white." And every time I heard that from a friend I was just like, "man... what does that even mean..." Granted, they were young like I was, so they just didn't understand how the media pushes black people to be portrayed a certain way, but it was pretty fucking annoying. Anyway, I haven't heard it yet in college; I'm hoping that's because people are more mature and aware.