Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings
Abstract
Current meta-learning approaches focus on learning functional representations of relationships between variables, i.e. on estimating conditional expectations in regression. In many applications, however, we are faced with conditional distributions which cannot be meaningfully summarized using expectation only (due to e.g. multimodality). Hence, we consider the problem of conditional density estimation in the meta-learning setting. We introduce a novel technique for meta-learning which combines neural representation and noise-contrastive estimation with the established literature of conditional mean embeddings into reproducing kernel Hilbert spaces. The method is validated on synthetic and real-world problems, demonstrating the utility of sharing learned representations across multiple conditional density estimation tasks.
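For context on the conditional mean embedding machinery the abstract refers to, below is a minimal sketch of the standard empirical estimator from the kernel embedding literature (not the paper's own neural/noise-contrastive method). Given samples (x_i, y_i), the embedding of P(Y|X=x) is estimated as a weighted sum of feature maps of the y_i, with kernel-ridge weights; applying it to the identity recovers a conditional mean. All function names and the toy data here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, bandwidth=0.3):
    # Gaussian RBF kernel matrix between the 1-D sample arrays A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-d2 / (2 * bandwidth ** 2))

def cme_weights(x_train, x_query, lam=1e-2):
    # Kernel-ridge weights alpha(x) = (K + n*lam*I)^{-1} k(x) defining the
    # empirical conditional mean embedding mu_{Y|X=x} = sum_i alpha_i(x) phi(y_i).
    n = len(x_train)
    K = rbf_kernel(x_train, x_train)
    k = rbf_kernel(x_train, x_query)
    return np.linalg.solve(K + n * lam * np.eye(n), k)

# Toy data (illustrative): Y = X^2 plus small noise.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 400)
y = x ** 2 + 0.05 * rng.normal(size=400)

# E[g(Y) | X=x] is approximated by sum_i alpha_i(x) g(y_i); with g the
# identity this recovers the conditional mean, which is ~1.0 at x = 1.
alpha = cme_weights(x, np.array([1.0]))
est = (y @ alpha)[0]
print(est)
```

The weights depend only on the conditioning inputs, so richer functionals of the conditional distribution (any g in the RKHS on Y) reuse the same alpha; this is the property the paper combines with learned neural representations.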
Publication: arXiv e-prints
Pub Date: June 2019
arXiv: arXiv:1906.02236
Bibcode: 2019arXiv190602236T
Keywords: Statistics - Machine Learning; Computer Science - Machine Learning