society pushes women,
to have body confidence,
yet photoshops every picture in magazines,
to fight for the same rights as men,
yet refuses to pay them equal wages,
to feel safe where they live,
yet blames women for being raped,
to have children,
yet frowns ...
why is society so hypocritical in its treatment of women?