The Rectified Linear Unit (ReLU) activation function is widely employed in deep learning (DL). ReLU shares structural similarities with censored regression and Tobit models common in econometrics and ...
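To make the structural parallel concrete, a minimal sketch of the two definitions (the Tobit notation below is the standard Type I formulation and is an assumption, not drawn from this text):

\[
\mathrm{ReLU}(x) = \max(0,\, x),
\qquad
y_i = \max(0,\, y_i^{*}), \quad y_i^{*} = \mathbf{x}_i^{\top}\beta + \varepsilon_i, \quad \varepsilon_i \sim \mathcal{N}(0, \sigma^{2}).
\]

In both cases the observed output is a latent quantity censored at zero from below, which is the structural similarity the comparison rests on.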