Organicism

Or·gan·i·cism

n.
  1. The doctrine of the localization of disease; that is, the doctrine that refers disease always to a material lesion of an organ. (Med.)