Saturday, February 20, 2016

What would the Americas be like without the Europeans?


What the Americas would be like without the Europeans is an interesting question to ponder. I will look at it from a few different angles.


If the Europeans had never come to the Americas, the Native Americans would very likely have been in a better situation. From the time the Europeans arrived, they mistreated the Native Americans: they took their land and brought diseases that killed many of them, disrupting the Native American way of life. Thus, the Native Americans would very likely have been better off had the Europeans never come.


It is possible there would have been greater conservation of resources and greater respect for the land. The Native Americans believed the land belonged to everybody and that it was holy and should be respected. Thus, resources might have been conserved. It is also possible the Americas would not be as developed as they are today. The Europeans used the land and its resources to grow and develop the Americas; without them, there likely would have been less development.


It is also important to consider how likely it is that the Europeans would never have discovered the Americas at all. The Europeans were constantly growing and progressing, and they were looking to expand. It is highly unlikely that they would never have come to the Americas; eventually, they would have arrived.
