More and more women are coming forward with accusations of sexual harassment in politics, media, and entertainment following the flood of allegations against Hollywood producer Harvey Weinstein. This may well represent a watershed moment, but will it change a culture that has existed for so long? Do we need new guidelines to codify these changes, and, more importantly, do women need real representation in the workplace in addition to their "formal" equality under the law?