What do people think are good movies that give a great feel for the 70's as a "stereotypical era"?
Not necessarily movies *from* the 1970's, but about the culture of America in the 1970's.
Certainly there is a lot out there, but I'm looking for a list of the better and more interesting ones that people have run across. *Saturday Night Fever*, for example, is clearly iconic, but I don't know how interesting or fun to watch it actually is.