Tuesday, March 11, 2014

What effect did the Europeans have on North and South America?

Europeans had both positive and negative effects on North and South America. The Europeans established colonies throughout the Americas and, as those colonies grew, they developed the economies of the region. Trade with Europe brought new products to the colonies, and as the population grew, new cities developed. The Europeans also established governments that introduced the people they colonized to European methods of administration, which helped prepare them for the day when they would become independent, even though that independence was often achieved through revolution. The Europeans also brought Christianity to the Americas, and missionaries worked to convert people to the faith.


There were also negative effects of the European colonization of the Americas. The Europeans brought diseases to which the local people had no immunity, and many people in the Americas died from them. The Europeans also enslaved some of the people with whom they came into contact; when rich deposits of minerals were discovered, enslaved people were forced to mine them, and the minerals were taken back to Europe for the benefit of the colonizing countries, not of the colonies. The entire purpose of establishing colonies was to benefit the Europeans, so policies were developed with the needs of the Europeans in mind, not the needs of the people being colonized.
