U.S. History/US Imperialism
Why did the United States largely shy away from imperialism prior to 1890?
I wouldn't say that the United States shied away from imperialism prior to 1890. In 1867, the US acquired Alaska from Russia. In 1846, the US went to war with Mexico for the purpose of acquiring huge portions of Mexican territory in western North America. In 1838, the US forced the Cherokee to leave their ancestral lands in Georgia so that Americans could have them. (Taking over Native Americans' lands had been an American practice since the beginning.) In 1819, the US pressured Spain, both militarily and diplomatically, into selling Florida. In 1812, the US declared war on England not only to defend its rights at sea, but also to try to conquer Canada. In 1810, the US annexed West Florida, then part of Spain, after a rebellion by Americans there who wanted to join the US rather than remain under Spain.

So, Jamar, as you can see, by 1890 the US had been an active imperialist for more than 80 years. What changed in the decade of the 1890s was the acquisition of overseas territories. That change resulted from the desires for coaling stations for ships, expanded trade networks, and enhanced national prestige. Nevertheless, the United States has never been shy about exerting its power to secure national interests.
I hope this helps!