I am skimming through a few episodes of The West Wing in disbelief that the series is almost twenty years old. In nearly every episode, President Bartlet gives his staff mini-lectures on honor and service to the country, and the main characters appear to be genuinely striving toward those principles in their conduct.
Another show I remember from those years, Dr. Quinn, Medicine Woman, was set on a utopian American frontier where Native Americans, Blacks, and Whites attempted to create a harmonious society in which goodness and truth always win.
President Bartlet’s sermonettes seem comically nostalgic in the context of our contemporary, cynical world. Could a show like this enjoy as much popularity now as it did twenty years ago? Would Dr. Quinn be confined to the Hallmark Channel today?
Although these dramas were highly idealistic and historically fictional, they did represent for me at that time a model of America I believed to be true, or at least possible. Is it still possible? Or have the general crudeness and bloodthirstiness of our entertainment eroded our sensibilities beyond repair? (Compare House of Cards to The West Wing.)
Has American society changed that much in the last twenty years, or has it always been the same and is it only my own perception that has changed? Is art (entertainment) a reflection of the real world, or is the world a creation and reflection of art?