An 'American' Theory
Is it me, or has there been a sudden surge of the word "American" in the titles of movies and TV shows over the past 20 years? Seriously, take a look...
An American Christmas Carol
All-American Christmas Carol
An American Tail
The American President
American History X
American Ninja Warrior
American Crime Story
American Horror Story...
I ask this question because I have a theory. Hear me out...
I believe the increasing appearance of the word "American" in titles is the result of a growing sentiment in the United States: pride, which is itself the product of rising nationalism. We're living in a time when the rest of the world is portrayed as scary, unwelcoming, and harmful, a time when "globalism" has become a dirty word. And when something is "American," certain people will invest their time in buying it, consuming it, experiencing it.
One could then argue that blatantly slapping on the word as an adjective is a lazy marketing exercise, an attempt to lure in a certain group of people who won't bother consuming content that appears foreign or unfamiliar. Throw in the word "American," and people will know what they're getting. This was made here in the US of A, dagnabbit! This is for the red-blooded citizens of The Greatest Country on Earth!
In other words, this could very well be another example of the dumbing down of consumer culture -- or society -- selling something in the most digestible, accessible, approachable manner.
I can't think of any other country that exhibits its nationalistic pride in the titles of its own films and TV shows to this extent. There's no German Horror Story. No Australian Beauty. Not even a Brazilian Psycho. So what gives, America? Why the need to slap on "American" so much? Do you feel compelled to mark your property, to make it clear that America makes the best stuff? Huh?
*This was written in a drunken stupor last year.