World War I was as transformational for the United States as the War for Southern Independence. While the 1860s helped produce the imperial executive and then led to Reconstruction, World War I created the American nation. I explain how in this episode of The Brion McClanahan Show.
https://mcclanahanacademy.com
https://brionmcclanahan.com/support
http://learntruehistory.com