I don't understand modern views on negative body image...
One thing I hear over and over is "every man/woman deserves to feel good about their body". And I agree. But why the hell does that involve having other people spoonfeed them compliments and lie to them? If a person is out of shape, I'm not obligated to tell them they aren't. They don't "deserve" to have me lie to them just to spare their ego. In fact, being out of shape is generally a detriment to their own health, so why shouldn't they feel bad about it?
If they want to feel good about their own body --regardless of how little care they take of it-- that's cool with me. Confidence is self-derived. But I don't understand why other people are expected to stroke their egos or hold back the obvious truth.
With that mentality, everyone "deserves" to feel good about everything. If I were shit at art and never put any work into it, I'd still deserve to be showered with compliments and falsehoods about how amazing my work is. Likewise, if a person is out of shape and has never bothered to do anything about it, clearly they deserve to be regarded the same as someone who has put time and effort into staying in shape.
I don't get it at all.