Over the second half of the twentieth century, the field of psychiatry and mental health saw many advances, and these continue today. Among them are the increasing recognition of patient rights and the expanding role of psychiatric nurses. This paper examines how these changes have been reflected in film across fifty years, in both documentary and Hollywood movies. Discussion of advances in psychiatry, as identified in the selected films, is set against the background of social change occurring in the United States and Western Europe during this period.