Na·tur·ism

n.
  1. (Med.) The belief or doctrine that attributes everything to nature as a sanative agent.