Women born in the West within the last 10 years, or within the next 10 years, will have it easier than men on average. They will grow up in a society more careful about cold approaches, equal opportunity, and other crap that impacts feelings of safety. They will be more educated than white men on average, will integrate into society more easily thanks to sociability, and will have more potential revenue streams if they're unable to find "regular" jobs. Safety nets and support organizations are also more plentiful for women, since reaching out for help doesn't carry the same stigma for their sex. There's still a lot of abuse directed at women, so they have it much worse in terms of harm done to them. The answer to the question thus comes down to whether you place more value on the positives or the negatives.
If this thread is referring to the world in general, it's obvious women have it way worse, since they don't have anything close to equal rights or opportunities in MANY parts of the world.