The Rise of American Imperialism

Imperialism Defined
The period at the end of the 19th century when the United States extended its economic, political, and social influence…