The United States and Japan emerged stronger after World War I.
The United States and Japan emerged from World War I stronger than when they entered the conflict. While much of Europe faced costly reconstruction, the United States could concentrate on further growth and development, emerging as the world's leading creditor nation and its strongest economic power. Japan, whose home territory saw little fighting, likewise grew stronger by supplying the Allies and expanding its industry and trade into Asian markets that the European powers had left open.