Interventionist policy has deep roots in U.S. history. From the twentieth-century struggle to contain the spread of communism to the many operations U.S. military forces have conducted on foreign soil, the United States has a long record of intervening in other countries to advance agendas that serve its national interests.