Body Positivity is a social movement rooted in the belief that all human beings should have a positive body image and accept their own bodies as well as the bodies of others. 2018 was an interesting year for me as a human. I learnt that I had a serious issue that needed to…