I am a women's studies major, almost finished with my coursework. In nearly all of my classes we are taught that the media is one of the most oppressive forces in the world today, but we never actually learn about it from the media's perspective. It's always "this is bad," but never really how or why.
The media is also a primary source of education for most people. It tells us what to like, how to look, how to speak, act, feel, and so on. I want to gain a broader understanding of something that regularly shapes my thoughts and ideas.