Nearly five decades after his death, John Wayne remains one of the defining figures of American culture, both through his movies and through his instantly recognizable public image as the all-American man.