Na·tur·ism (?), n. (Med.) The belief or doctrine that attributes everything to nature as a sanative agent.