Nature has many positive effects on health and well-being, and now new studies show that it also helps improve body image.
Body image constitutes all the perceptions, thoughts, and assumptions that we have about our physicality and appearance.
A negative body image can be detrimental to confidence and happiness, especially in extreme cases when it leads to eating disorders and body dysmorphia.
But now, the journal Body Image has published a paper comprising five studies that examined the effect of nature on positive body image. The research was conducted by researchers from Anglia Ruskin University, Perdana University in Malaysia, and University College London.
Several of the studies even showed that a person could experience the positive effects of nature on body image without actually going outside, simply by looking at photographs of scenic natural landscapes.
Three of the studies used photographs of natural and built environments. Student participants who viewed the images of natural environments reported a more positive view of their own bodies afterward.
The final two studies asked volunteers to walk around outdoor park areas, such as London's Hampstead Heath, and built environments, such as nearby busy city streets.
The results were similar to those of the first three studies: spending time in a more natural environment, such as a park, improved participants' body image.
The researchers suggest that nature may have this effect because it distracts from appearance-related thoughts, and being outdoors can help keep things in perspective, allowing for self-acceptance and appreciation.
“There are several reasons why exposure to nature could be having this effect on positive body image,” said Viren Swami, Professor of Social Psychology at Anglia Ruskin University and lead author of the report. “It might be that it distances people, physically and mentally, from appearance-focused situations that are one of the causes of negative body image.”