For the last couple of decades, enthusiasts have lamented the demise of Westerns while the rest of the world has gone about its business, unaware that anyone might care about a genre relegated to a few obscure shelves at the local bookstore. Yet Westerns were hugely popular for over a hundred years, and not only in the United States: the whole world devoured them. The Western was a staple of fiction, Hollywood, television, and daydreams. What happened?