A Unifying Framework for Mutual Information Methods for Use in Non-linear Optimisation

2020-03-30

Abstract

Many variants of mutual information (MI) exist in the literature. These vary primarily in how the joint histogram is populated. This paper places the four main variants of MI, standard sampling, Partial Volume Estimation (PVE), In-Parzen Windowing and Post-Parzen Windowing, into a single mathematical framework. Jacobians and Hessians are derived in each case. A particular contribution is that the non-linearities implicit to standard sampling and Post-Parzen Windowing are dealt with explicitly; these non-linearities are a barrier to their use in optimisation. A side-by-side comparison of the MI variants is made using eight diverse data-sets, considering computational expense and convergence. In the experiments, PVE was generally the best performer, although standard sampling often performed nearly as well (if a higher sample rate was used). The widely used sum of squared differences metric performed as well as MI unless large occlusions and non-linear intensity relationships occurred. The binaries and scripts used for testing are available online.

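As a point of reference (not the paper's code), a minimal sketch of the plain standard-sampling variant, in which each intensity pair contributes one full count to a single joint-histogram bin, might look like the following; the array names `fixed` and `moving` and the bin count are illustrative assumptions:

```python
import numpy as np

def mutual_information(fixed, moving, bins=32):
    """MI from a jointly binned intensity histogram (standard sampling).

    Each sample pair adds a whole count to one joint-histogram bin,
    i.e. no partial-volume weighting or Parzen smoothing is applied.
    """
    # Joint histogram over corresponding intensity pairs.
    joint, _, _ = np.histogram2d(fixed.ravel(), moving.ravel(), bins=bins)
    p_xy = joint / joint.sum()                 # joint probability
    p_x = p_xy.sum(axis=1, keepdims=True)      # marginal of fixed image
    p_y = p_xy.sum(axis=0, keepdims=True)      # marginal of moving image
    nz = p_xy > 0                              # avoid log(0)
    return np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz]))
```

The PVE and Parzen-windowing variants discussed in the paper differ only in how this joint histogram is populated, spreading each sample's contribution over neighbouring bins rather than incrementing a single bin, which is what removes the non-linearities that otherwise hinder gradient-based optimisation.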