Dr. Xiaoge Zhang delivered a talk on “Enhancing the Performance of Neural Networks Through Causal Discovery and Integration of Domain Knowledge” at Sichuan University, China
In this talk, I will present a generic methodology for encoding the hierarchical causal structure among observed variables into a neural network to improve its prediction performance. The proposed causality-informed neural network (CINN) follows three coherent steps to systematically map structural causal knowledge into the layer-to-layer design of a neural network while strictly preserving the orientation of every causal relationship. In the first step, CINN discovers causal relationships from observational data via directed acyclic graph (DAG) learning, where causal discovery is recast as a continuous optimization problem to avoid the combinatorial search over graph structures. In the second step, the discovered hierarchical causal structure among the observed variables is encoded into the neural network through a dedicated architecture and a customized loss function. By categorizing variables as root, intermediate, and leaf nodes, the hierarchical causal DAG is translated into a CINN with a one-to-one correspondence between nodes in the DAG and units in the CINN, while maintaining the relative order among these nodes. Regarding the loss function, both intermediate and leaf nodes in the DAG are treated as target outputs during CINN training to drive co-learning of the causal relationships among the different types of nodes. In the final step, since multiple loss components arise in the CINN, we leverage the projection of conflicting gradients to mitigate gradient interference among the multiple learning tasks. Computational experiments across a broad spectrum of UCI datasets demonstrate substantial advantages of CINN in prediction performance over other state-of-the-art methods. In addition, we conduct an ablation study that incrementally injects structural and quantitative causal knowledge into the neural network to demonstrate their respective roles in enhancing the neural network's prediction performance.
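The first step recasts DAG structure learning as continuous optimization over a weighted adjacency matrix, in the spirit of NOTEARS-style causal discovery. The sketch below is a simplified penalty-method version for intuition only; the function names, the fixed penalty weight, and the edge-pruning threshold are illustrative assumptions rather than the exact procedure presented in the talk.

```python
# Illustrative sketch: continuous DAG learning with an acyclicity penalty.
import numpy as np
from scipy.linalg import expm

def acyclicity(W):
    """h(W) = tr(exp(W * W)) - d; equals 0 iff W encodes a DAG."""
    return np.trace(expm(W * W)) - W.shape[0]

def acyclicity_grad(W):
    """Gradient of h(W) with respect to W: (exp(W * W))^T * 2W."""
    return expm(W * W).T * 2 * W

def least_squares_grad(W, X):
    """Gradient of the linear fit 0.5/n * ||X - XW||_F^2."""
    n = X.shape[0]
    return -1.0 / n * X.T @ (X - X @ W)

def learn_dag(X, lam=10.0, lr=1e-2, steps=2000, threshold=0.3):
    """Penalized gradient descent on the weighted adjacency matrix
    (a stand-in for the full augmented-Lagrangian scheme)."""
    d = X.shape[1]
    W = np.zeros((d, d))
    for _ in range(steps):
        g = least_squares_grad(W, X) + lam * acyclicity_grad(W)
        W -= lr * g
        np.fill_diagonal(W, 0.0)      # no self-loops
    W[np.abs(W) < threshold] = 0.0    # prune weak edges
    return W
```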
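In the second step, the learned DAG is mirrored in the network layout: each non-root node gets its own unit fed by its parents, and intermediate nodes are supervised alongside the leaves. The following PyTorch sketch assumes small per-node MLP blocks, a simple topological sweep, and an unweighted sum of per-node losses; these are assumptions for illustration, not the exact CINN architecture.

```python
# Illustrative sketch: mapping a causal DAG onto a neural network with
# per-node units and a loss over both intermediate and leaf nodes.
import torch
import torch.nn as nn

class CausalDAGNet(nn.Module):
    def __init__(self, parents, hidden=16):
        """
        parents: dict mapping each node to the list of its parent nodes
                 (roots have an empty list).  Every non-root node gets a
                 small MLP fed by its parents, preserving edge orientation.
        """
        super().__init__()
        self.parents = parents
        self.blocks = nn.ModuleDict({
            str(v): nn.Sequential(nn.Linear(len(ps), hidden), nn.ReLU(),
                                  nn.Linear(hidden, 1))
            for v, ps in parents.items() if ps
        })

    def forward(self, x_roots):
        # x_roots: dict mapping each root node to a (batch, 1) tensor.
        values = dict(x_roots)
        remaining = [v for v, ps in self.parents.items() if ps]
        while remaining:                          # simple topological sweep
            for v in list(remaining):
                ps = self.parents[v]
                if all(p in values for p in ps):
                    inp = torch.cat([values[p] for p in ps], dim=1)
                    values[v] = self.blocks[str(v)](inp)
                    remaining.remove(v)
        return values                             # predictions for non-root nodes

def cinn_loss(pred, target, supervised_nodes):
    """Sum of MSE terms over intermediate and leaf nodes (all supervised)."""
    return sum(nn.functional.mse_loss(pred[v], target[v])
               for v in supervised_nodes)
```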
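The final step reduces interference among the multiple per-node loss terms by projecting away conflicting gradient components, in the spirit of gradient surgery (PCGrad). The sketch below assumes flattened per-task gradients and a summed combined direction; helper names and details of the update are illustrative rather than the speaker's exact rule.

```python
# Illustrative sketch: projection of conflicting per-task gradients.
import torch

def flat_grad(loss, params):
    """Flatten the gradient of one loss term over all parameters."""
    g = torch.autograd.grad(loss, params, retain_graph=True, allow_unused=True)
    return torch.cat([torch.zeros_like(p).flatten() if gi is None else gi.flatten()
                      for p, gi in zip(params, g)])

def project_conflicting(grads):
    """grads: list of flattened per-task gradients.  For each pair with a
    negative inner product, remove the conflicting component, then sum."""
    projected = [g.clone() for g in grads]
    for i, g_i in enumerate(projected):
        for j, g_j in enumerate(grads):
            if j == i:
                continue
            dot = torch.dot(g_i, g_j)
            if dot < 0:                                   # gradients conflict
                g_i -= dot / (g_j.norm() ** 2 + 1e-12) * g_j
    return torch.stack(projected).sum(dim=0)              # combined update direction
```

In a training loop, one would compute `flat_grad` for each per-node loss, combine them with `project_conflicting`, and write the result back into the parameters before the optimizer step.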