Talk:Vietnamball/@comment-31370044-20190207154647/@comment-26192486-20190316030755

Well, yes and no.

The Vietnam War (the American War to the Vietnamese) was never originally the U.S.'s war in the first place, but a conflict between North and South Vietnam. The US joined due to fear of the "Domino Effect". After years passed, with the war getting worse, casualties rising, and public opinion back home turning increasingly against it, the US withdrew from the conflict, leaving South Vietnam to fend for itself.

So did America lose the Vietnam War? It depends. Yes, because they failed to protect South Vietnam from the communists and the North won. No, because it was never their war to begin with.

But yeah, Vietnam is stronk. They're no strangers to war.