I'm traveling to the US, and I'm curious to see whether my opinions about the US have been negatively shaped by US media. Before coming to China, I had a bad opinion of China because of Western media, but the reality was very different from the reports. Now that I've been outside the US for a long time, my opinions of my home country have changed. The question is: have I been brainwashed by US media?
What do you think? Has the US gotten worse in the last 5 years? Or is it just social media and news sites pushing the most shocking stories for clicks?
Leave a comment below and let me know your opinion.
Join this channel to get access to perks:
/ @tripbitten
Also, a VPN, since it might help:
www.astrill.co...
_________________________
FOLLOW US ON OUR TRAVELS:
/ tripbitten
_________________________
Some of the links are affiliate links, and we will receive a small commission if you buy something through them.
Sep 21, 2024