Movies and Hollywood have taught us a lot. In fact, much of what you know about the world you learned from film. Consider the wealth of pop culture references, music, quotes, and even your understanding and expectations of how relationships should work that are all stored in your brain thanks to the film industry. While some of these lessons can be accurate, they are generally twists on the truth. For example, Hollywood taught us that if two people really hate each other, they are more likely to fall in love. Is that true? Maybe. But in the movies you can almost bet on it.

As you may have deduced by now, all of the items on this list are tongue-in-cheek. While the movies taught us these things, none of them are actually true. Most of you have probably realized that already, but there is always somebody ready to start a debate thinking we were serious. So, with that disclaimer out of the way, let's enjoy what Hollywood has taught us about life. From 10th-century peasants with perfect teeth to spaceships whose internal gravity systems stay intact no matter how much of a beating the ship takes, these are 25 important things that we learned from movies!
Featured Image: opensource.com via Flickr