Lazy Paired Hyper-Parameter Tuning
Alice X. Zheng and Mikhail Bilenko

Abstract

In virtually all machine learning applications, hyper-parameter tuning is required to maximize predictive accuracy. Such tuning is computationally expensive, and the cost is further exacerbated by the need for multiple evaluations (via cross-validation or bootstrap) at each configuration setting to guarantee statistically significant results. This paper presents a simple, general technique for improving the efficiency of hyper-parameter tuning by minimizing the number of resampled evaluations at each configuration. We exploit the fact that train-test samples can easily be matched across candidate hyper-parameter configurations. This permits the use of paired hypothesis tests and power analysis that allow for statistically sound early elimination of suboptimal candidates to minimize the number of evaluations. Results on synthetic and real-world datasets demonstrate that our method improves over competitors for discrete parameter settings, and enhances state-of-the-art techniques for continuous parameter settings.
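The core idea in the abstract is that all candidate configurations can be evaluated on the same resampled train-test splits, so per-fold results are paired and a paired hypothesis test can eliminate a clearly inferior candidate after only a few evaluations. The sketch below illustrates only that paired-evaluation idea, not the paper's full algorithm (which additionally uses power analysis to decide when more evaluations are warranted); the candidate grid, fold count, and 0.05 significance threshold are illustrative assumptions.

```python
# Minimal sketch: paired evaluation of two hyper-parameter candidates on
# matched cross-validation folds, with early elimination via a paired t-test.
# Illustration of the paired-testing idea only, not the paper's algorithm;
# the candidates, fold count, and threshold are illustrative assumptions.
import numpy as np
from scipy.stats import ttest_rel
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
candidates = {"C=0.01": SVC(C=0.01), "C=10": SVC(C=10.0)}
scores = {name: [] for name in candidates}

# Matched folds: every candidate sees exactly the same train/test splits,
# so per-fold accuracies can be compared with a *paired* test.
folds = KFold(n_splits=10, shuffle=True, random_state=0).split(X)
for fold, (train, test) in enumerate(folds, start=1):
    for name, clf in candidates.items():
        scores[name].append(clf.fit(X[train], y[train]).score(X[test], y[test]))

    if fold >= 3:  # need a few paired observations before testing
        a, b = (np.asarray(scores[n]) for n in candidates)
        stat, p = ttest_rel(a, b)
        if p < 0.05:  # significant gap: eliminate the losing candidate early
            loser = list(candidates)[int(a.mean() > b.mean())]
            print(f"fold {fold}: eliminating {loser} (p={p:.3f})")
            break
else:
    print("no significant difference after all folds")
```

Because the folds are shared, the fold-to-fold variance common to both candidates cancels in the paired differences, which is what lets a decision be reached with far fewer resampled evaluations than comparing independent cross-validation estimates.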

