Saturday, December 28, 2013

How did the Westward Expansion affect the lives of Americans in the United States?

Westward expansion had a tremendous impact on the lives of people in the United States.  Of course, it affected different groups in very different ways.  Let us examine a few of these effects.


The Americans who were most negatively affected by westward expansion were the Native Americans.  They had their land taken from them and, if they survived the wars, were pushed onto reservations.  They lost their traditional ways of life as well.  This was a devastating blow to a large group of Americans.


Americans who moved west were affected in different ways.  Some lost their lives to the harsh conditions of the frontier.  Others were able to make good lives for themselves as farmers or merchants.  For them, westward expansion was a benefit because it offered opportunities they would not have had in the more crowded eastern part of the country.


Westward expansion also increased economic opportunities for those who stayed in the East.  The “opening” of the West gave Americans access to far more resources than they previously had.  New sources of metal ores, timber, and other raw materials allowed the economy to grow.  This provided more jobs for working people in the East and more money-making opportunities for the wealthier people there.


For Americans as a whole, historians often argue that westward expansion helped to create a national ethos.  It led us to see ourselves as a nation of pioneers, of people who bravely and independently worked hard to improve their lives.  It encouraged us to see ourselves as a nation of individuals who could fend for themselves.  All of this (they argue) made us more democratic and shaped the way we Americans see ourselves.

