There are things that happen during pregnancy that we love to talk about: the mother's nurturing glow, how the baby is kicking, how perfectly round your belly is getting. On the negative end, we also love to vent to our friends about the heaviness, the swollen ankles and those surprising bouts of morning sickness that strike at less-than-ideal moments.
After the pregnancy is over and the beautiful bubba is born, the conversation switches to how the baby is sleeping, debates over breastfeeding versus bottle feeding, and quick snaps at anyone who asks you how your weight loss is going post-pregnancy. One thing that isn’t talked about a lot is what actually happens to the body after birth, especially sensitive areas like the breasts.
A recent post on Vogue about what happens to breasts after pregnancy and breastfeeding had me thinking about this even more. I’ve always joked to my partner that after we’ve finished having kids, he’ll have to buy me a breast lift to make up for the soccer team he wants, but for a growing number of women out there, getting a breast lift is no joke; it’s a way to feel like yourself again after giving your body to your children.
While stretch marks have recently started getting some attention after birth, there's still little open conversation about what happens to the most intimate parts of a woman's body, namely her breasts and vagina. Vaginal rejuvenation has become a popular topic, especially in Hollywood, but it seems we are still hiding what our breasts are like after birth.
Every woman's body is completely different, and our bodies react differently to childbirth and breastfeeding, but for those whose breasts sag significantly, it can affect self-esteem and happiness. If you think it's shallow to want a breast lift after pregnancy, you're entitled to that opinion. But we should all have the right to feel great about ourselves, even if that means getting a little help to bring your breasts back to what makes you feel fantastic.
Women are slowly expanding what is socially acceptable to talk about, and what happens to a woman's body after she nurtures a life is an important part of that. This shouldn't be a deterrent from breastfeeding; quite the opposite. Women should be empowered to feel great after having given birth to new life. They shouldn't feel ashamed for being unhappy with their bodies, and they shouldn't feel guilty about wanting a little extra to feel great again.
It's a new era, a new time; women can talk about these things. And we should be talking about them, letting each other know that we are all beautiful and that we all have the right to define beauty the way we want to, without judgement or critique. Let's start being honest about what giving birth is actually like, and more open about the way women feel, and are allowed to feel.
Image via welladjusted.co