Orthogonal Machine Learning: Power and Limitations


2020-03-16

Abstract

Double machine learning provides $\sqrt{n}$-consistent estimates of parameters of interest even when high-dimensional or nonparametric nuisance parameters are estimated at an $n^{-1/4}$ rate. The key is to employ Neyman-orthogonal moment equations which are first-order insensitive to perturbations in the nuisance parameters. We show that the $n^{-1/4}$ requirement can be improved to $n^{-1/(2k+2)}$ by employing a $k$-th order notion of orthogonality that grants robustness to more complex or higher-dimensional nuisance parameters. In the partially linear regression setting, popular in causal inference, we show that we can construct second-order orthogonal moments if and only if the treatment residual is not normally distributed. Our proof relies on Stein's lemma and may be of independent interest. We conclude by demonstrating the robustness benefits of an explicit doubly-orthogonal estimation procedure for treatment effect.
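The first-order (Neyman-orthogonal) moment the abstract refers to can be illustrated in the partially linear model $Y = \theta T + g(X) + \varepsilon$, $T = m(X) + \eta$: with cross-fitted estimates of $E[Y\mid X]$ and $E[T\mid X]$, the residual-on-residual moment $E[(Y - E[Y\mid X] - \theta(T - E[T\mid X]))(T - E[T\mid X])] = 0$ is insensitive to first-order errors in the nuisance fits. Below is a minimal sketch; the crude binned-mean nuisance estimator and the simulated data-generating process are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
theta = 1.5  # true treatment effect

# Partially linear model: Y = theta*T + g(X) + eps,  T = m(X) + eta
X = rng.uniform(-2, 2, size=n)
g = np.sin(X)   # outcome nuisance g(X)
m = np.cos(X)   # treatment nuisance m(X)
T = m + rng.normal(scale=1.0, size=n)
Y = theta * T + g + rng.normal(scale=1.0, size=n)

def fit_nuisance(x_train, y_train, x_eval, bins=20):
    """Crude binned-mean regression standing in for any ML nuisance fit."""
    edges = np.linspace(-2, 2, bins + 1)
    idx_train = np.clip(np.digitize(x_train, edges) - 1, 0, bins - 1)
    idx_eval = np.clip(np.digitize(x_eval, edges) - 1, 0, bins - 1)
    means = np.array([y_train[idx_train == b].mean()
                      if np.any(idx_train == b) else y_train.mean()
                      for b in range(bins)])
    return means[idx_eval]

# Cross-fitting: fit nuisances on one fold, evaluate the moment on the other
half = n // 2
folds = [(np.arange(half), np.arange(half, n)),
         (np.arange(half, n), np.arange(half))]
num = den = 0.0
for train, held in folds:
    m_hat = fit_nuisance(X[train], T[train], X[held])    # E[T|X]
    ell_hat = fit_nuisance(X[train], Y[train], X[held])  # E[Y|X]
    T_res = T[held] - m_hat
    Y_res = Y[held] - ell_hat
    # Orthogonal moment: E[(Y_res - theta*T_res) * T_res] = 0
    num += np.sum(Y_res * T_res)
    den += np.sum(T_res ** 2)

theta_hat = num / den
print(theta_hat)  # close to the true theta = 1.5
```

Because the moment is first-order orthogonal, the nuisance fits only need to converge at an $n^{-1/4}$ rate for $\hat\theta$ to remain $\sqrt{n}$-consistent; the higher-order orthogonality developed in the paper would relax this accuracy requirement further.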

