What Have the Americans Ever Done for Us?
Jul 04, 2019
According to a recent international poll of citizens in 25 developed countries, only 50% have a positive view of the USA. In countries like Greece, Germany, and France, negative views reach or exceed 60%. This is quite puzzling if one considers that many of the achievements that make our lives better, from the gadgets we use to the TV series we enjoy and the medicines that are extending our days on this earth, come from the USA.
What is it in the foundational principles of the country that has made it such a unique phenomenon among nations? Is America loved for its virtues and despised for its vices? What does the world owe to the Founding Fathers, and what did they declare their independence from on the 4th of July 1776? Is their legacy secure today in the ideas that dominate the country they established?