Wednesday, August 17, 2016

How did Hawaii become part of the United States?

Hawai’i originally became part of the United States when people of American descent living in Hawai’i overthrew the native Hawai’ian monarchy.  Not long after, they prevailed upon the United States to annex Hawai’i, making it a US territory.  Roughly 60 years later, Hawai’i became the 50th state in the Union.



Hawai’i was originally an independent nation.  It eventually came to be ruled by a monarchy that had united all of the islands in the chain.  Beginning in the 1820s, Americans started to come to the islands, largely as missionaries.  Eventually, Americans became deeply involved in the Hawai’ian economy and were among the largest landowners in the country.  These Americans built a massive sugar industry on the islands.


In 1887, Americans in Hawai’i, wanting more control over the islands for themselves, forced King Kalakaua to sign the “Bayonet Constitution,” which stripped the monarchy of much of its power.  In 1893, when Queen Liliuokalani tried to restore some of these powers, they deposed her with the help of American military forces.  The islands remained nominally independent, but under the rule of this American elite, for a few years.  Then, in 1898, during the Spanish-American War, the US government agreed to annex Hawai’i.  The islands remained a US territory until 1959, when Hawai’i achieved statehood.


It is this history that makes many native Hawai’ians feel that their land was taken from them and that they should be given sovereignty over it once again.

