April 7, 2016
After World War II, the U.S. abandoned the concepts of total war and hard war and adopted a more politically correct view of warfare.