WWII changed the US identity greatly. For one, it showed the rest of the world that the United States was a dominant force to be reckoned with. For another, it showed that the country would do what needed to be done to win battles and, more importantly, wars. It also disproved the notion that Americans were not a unified nation, demonstrating that they could work cohesively toward the same goal.