The answer is clearly “Yes”.
Western nations like to pretend that colonialism was all but wiped out in the years following the Second World War. However, those who know better than to trust the accounts of the western ‘victors’ of WWII understand that imperialism is alive and well. Constant military and political interventions in Latin America, the Middle East and Africa serve to uphold American hegemony in the world. While the US maintains that these incursions into the sovereignty of other nations promote ‘universal values’ such as democracy and freedom, the aftermath of such attacks makes clear that the opposite is true. Rather than improving the countries that the US (among other nations) invades, the outcome is often mass bloodshed and a severe degradation of the quality of life of the lower classes.
The economically-driven genocide conducted by the United States in the Philippines (1899-1902) is an early…