Portal:Western films

The American Film Institute defines Western films as those "set in the American West that [embody] the spirit, the struggle and the demise of the new frontier."
