Feminism is the “belief that women should have economic, political, and social equality with men” (Gustafson). Many inspirational feminists have “challenged the traditional gender roles and demanded more opportunities for women throughout history” (Gustafson). In 1920, women finally gained the right to vote in the United States.