Perhaps most importantly, the war’s outcome boosted national self-confidence and encouraged the growing spirit of American expansionism that would shape the better part of the 19th century.
So your answer is the first option. Hope I helped! :)