Hollywood should be taken for what it is and not for reality. A lot of people confuse the two and don't know where to draw the line that separates them. People look to celebrities and others in the public eye as idols. They watch them, read about them, follow them, and sometimes look to them as inspiration and role models. What many fail to realize is that they're looking at people who are honestly no different from themselves. The people of Hollywood are normal human beings just like you and me, except they make more money than the average person and their lives are often televised or published in some form. They eat, grow, sleep, and breathe just like the rest of us.
There is a common misconception that Hollywood is having a negative impact on society, and I don't understand why. If this topic were being discussed many years ago, I might be able to understand the reasoning behind it, but with the world becoming so diverse and open to change nowadays, I don't. There are many different faces of Hollywood now, and those faces are becoming more realistic, with real-life issues, stories, and drama. It's easier for society to relate to them, and many people are realizing that they are just normal human beings like you and me.
I, for one, am starting to see the positive effects of Hollywood on our society. For years, and even now, people have struggled with real-life issues such as weight gain and loss, drugs, family problems, and health concerns. With the rise of reality shows, blogs, and social networking sites like Twitter, people can now see that celebrities and the people of Hollywood go through the same things, and that they often deal with them in the same ways.
It's more common now for plus-sized women and men to embrace their size and love themselves for who they are, thanks to inspirational plus-sized commercials and celebrities embracing that community. With more celebrities and people of Hollywood participating in awareness campaigns for conditions such as heart disease, breast cancer, and HIV/AIDS, it's more common now for people affected by those illnesses to feel cared for rather than left out of society and forgotten.
Hollywood is also encouraging others to contribute more to those less fortunate. Look at the disaster in Haiti and the support that Hollywood has given. Celebrities are all over TV and in magazines encouraging people to donate, whether money or time, and they're showing us different ways we can come together to help the many people in need, which is a great thing. So you ask me, does Hollywood have a negative impact on society? And I say, "No, absolutely not!"