Native Americans long dominated the vastness of the American West. Linked culturally and geographically by trade, travel, and warfare, various indigenous groups controlled most of the continent west of the Mississippi River deep into the nineteenth century. Spanish, French, British, and later American traders had integrated themselves into many regional economies, and American emigrants pushed ever westward, but no imperial power had yet achieved anything approximating political or military control over the great bulk of the continent. Then the Civil War came and went, decoupling the West from the question of slavery just as the United States industrialized, laid down rails, and pushed its expanding population ever farther west.
Indigenous Americans had lived in North America for over ten millennia, and, into the late nineteenth century, perhaps as many as 250,000 Natives still inhabited the American West.1 But unending waves of American settlers, the American military, and the unstoppable onrush of American capital conquered all. The United States removed Native groups to ever-shrinking reservations, incorporated the West first as territories and then as states, and, for the first time in its history, controlled the expanse of land between the two oceans.
The history of the late-nineteenth-century West is many-sided. Tragedy for some, triumph for others, the many intertwined histories of the American West marked a pivotal transformation in the history of the United States.