Bet You Didn't Learn This Fact About The Wild West In History Books

Men looking for wealth weren't the only ones who helped settle the West.