Hollywood's Effect On America


Hollywood has long been the symbolic center of the U.S. motion-picture industry. It dominates the world's motion-picture industry as well, with the highest-grossing movies of all time being overwhelmingly American made. With its stronghold on the movie industry, Hollywood exerts powerful moral, cultural, and political influences on our society. These influences can often perpetuate a narrow-minded way of thinking.

The moral influence exerted by Hollywood tends to be negative. The most popular films are often full of gratuitous sex and violence, and young people who see their role models engaging in promiscuous behavior, blasting their enemies to bits, or taking drugs are likely to be influenced by what they watch. This is not to say that Hollywood is solely responsible for the behavior of our youth, but it would be naive to think that repeatedly watching movies that glorify these behaviors has no adverse effect.

Hollywood's dominance has also allowed it to impose its cultural values on our society. Hollywood films tend to paint the world in black and white, creating the impression that the world itself is black and white when it is not. Movies also tend to exaggerate our differences and make light of many of the problems that plague our society. We are constantly bombarded with stereotypes of age, race, gender, and sexual preference, and these stereotypes are very hard to overcome when people see them portrayed, often positively, on a daily basis. It is not only our view of ourselves that is affected; films often shape how we feel about the rest of the world as well.

Hollywood movies are often guilty of promoting a biased American view of the world. Hollywood tends to portray a simplistic "good versus evil" view of international conflicts and to feature negative, stereotypical images of people from other countries; Muslims, Russians, and Japanese characters in particular have been presented as enemies of freedom and progress. American-made movies also distort history. Numerous World War II films have downplayed the contribution of other nations to the Allied victory, and anyone watching one of the big-budget Vietnam movies of the 1980s would think that only American soldiers were sent there.

It is clear that Hollywood has had, and will continue to have, a tremendous influence on society. Hollywood regularly presents us with a world that is nothing like the one we live in. The more we accept that world as reality, the harder it becomes for society to relate to the real world. Inevitably, we will be driven not only by our ignorance but also by our desire to live in the perfect world portrayed on film.
