Or·gan·i·cism, n. (Med.) The doctrine of the localization of disease, which refers it always to a material lesion of an organ.