The story of the American West, as it is typically told, involves the Spanish, British, and American empires struggling with Indigenous peoples for control of the vast territory's lands and riches from the Mississippi to the Pacific. After the seventeenth century, French colonists and French-speaking Metis are often relegated to...