United States non-interventionism
United States non-interventionism primarily refers to the foreign policy the United States adopted between the late 18th century and the first half of the 20th century, whereby it sought to avoid alliances with other nations so as not to be drawn into wars unrelated to its direct territorial self-defense.