The West, particularly the mountain West of states like Colorado, Utah, and Idaho, has long had an image as a land of white men. This image dates to the 19th century, yet it is counterintuitive. Before it became a white man’s paradise, the West was the land of Native Americans, immigrants,...