Conference Agenda

Overview and details of the sessions of this conference.


Oral session 2 (2 talks)
Time: 01/Jul/2019: 3:30pm-4:10pm · Location: Amphi B00

Cluster-based Optimal Transport Alignment

John LEE, Eva DYER, Christopher J. ROZELL

Georgia Institute of Technology, United States of America

The unsupervised alignment (or registration) of point clouds in low-dimensional representations is a fundamental problem at the heart of transfer learning. In particular, we consider the problem of aligning datasets lying in a union of subspaces and apply tools from optimal transport to tackle it. We propose a novel block optimal transport framework that incorporates cluster-based structure as a way to add robustness against poor local minima. We demonstrate superior empirical results over the baseline Procrustes method and also provide theoretical insights on the dataset conditions required for the alignability of such cluster-based methods.
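The cluster-level matching idea can be illustrated with a toy sketch (this is not the authors' algorithm; all data below are synthetic): match the cluster centroids of two point clouds via a linear assignment, which is a degenerate one-point-per-cluster transport problem, then recover the alignment with a Procrustes step.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.linalg import orthogonal_procrustes

# Synthetic cluster centroids and a rotated, shuffled copy (toy data).
src = np.array([[10.0, 0.0], [0.0, 10.0], [-10.0, 0.0], [0.0, -10.0]])
theta = np.pi / 12
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
perm = np.array([2, 0, 3, 1])                # unknown correspondence
tgt = (src @ R_true.T)[perm]

# Step 1: match centroids with a linear assignment on squared distances
# (a one-point-per-cluster transport problem).
cost = ((src[:, None, :] - tgt[None, :, :]) ** 2).sum(axis=-1)
rows, cols = linear_sum_assignment(cost)

# Step 2: recover the rotation from the matched centroids (Procrustes).
R_est, _ = orthogonal_procrustes(src[rows], tgt[cols])

print(np.allclose(src[rows] @ R_est, tgt[cols], atol=1e-8))
```

Matching at the cluster level before solving for the global transform is what provides the robustness to poor local minima mentioned above: the assignment is solved globally rather than by local descent.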


Lunch + Poster session 2
Time: 02/Jul/2019: 12:10pm-2:30pm · Location: C101 - C103

Robust Incorporation of Signal Predictions into the Sparse Bayesian Learning Framework


Georgia Institute of Technology, United States

We propose an algorithm for dynamic filtering of time-varying sparse signals based on the sparse Bayesian learning (SBL) framework. The key idea underlying the algorithm is the incorporation of a signal prediction, generated from a dynamics model and estimates from previous time steps, into the hyperpriors of the SBL probability model. We interpret the effect of this change to the probability model from several perspectives. Numerical simulations demonstrate that the proposed method converges faster and to more accurate solutions than standard SBL and other dynamic filtering algorithms.
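To give a flavor of the approach (a simplification, not the paper's model): the sketch below runs generic SBL EM updates on a toy sparse-recovery problem and lets a hypothetical dynamics-based prediction enter only through the hyperparameter initialization, whereas the paper modifies the hyperpriors themselves.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sparse-recovery setup: y = Phi x + noise, with k-sparse x.
n, m, k = 50, 100, 5
Phi = rng.normal(size=(n, m)) / np.sqrt(n)
x_true = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
x_true[support] = 3.0 * rng.normal(size=k)
sigma2 = 1e-4
y = Phi @ x_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Hypothetical prediction from a dynamics model (here simulated as the
# true signal plus drift, purely for illustration).
x_pred = x_true + 0.1 * rng.normal(size=m)

# Standard SBL EM iterations; the prediction only seeds the per-coefficient
# prior variances gamma (the paper instead alters the hyperpriors).
gamma = x_pred ** 2 + 1e-3
for _ in range(30):
    Sigma = np.linalg.inv(Phi.T @ Phi / sigma2 + np.diag(1.0 / gamma))
    mu = Sigma @ Phi.T @ y / sigma2           # posterior mean estimate
    gamma = np.maximum(mu ** 2 + np.diag(Sigma), 1e-10)

rel_err = np.linalg.norm(mu - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

Seeding `gamma` with the prediction biases the model toward the predicted support, which is the intuition behind the faster convergence claimed above.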



Active embedding search via noisy paired comparisons

Gregory Humberto CANAL, Andrew Kenneth MASSIMINO, Mark Andrew DAVENPORT, Christopher John ROZELL

Georgia Institute of Technology, United States of America

Suppose that we wish to estimate a user's preferences from comparisons between pairs of items (such as in a recommender system), where both the user and items are embedded in a low-dimensional space with distances that reflect user and item similarities. Since such observations can be extremely costly and noisy, we aim to actively choose the pairs that are most informative given the results of previous comparisons. We provide new theoretical insights into greedy information maximization and develop two novel strategies that provably approximate information maximization. We use simulated responses from a real-world dataset to validate our strategies against state-of-the-art methods.
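A crude stand-in for this setting can be sketched as follows (not the paper's information-theoretic strategy: a noiseless bisection heuristic with a particle posterior, on synthetic data). Each pair of items induces a bisecting hyperplane, and a pair is heuristically most informative when it splits the current posterior mass evenly.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n_items, n_particles = 2, 200, 5000
w_true = rng.uniform(-1, 1, size=d)                    # unknown user point
items = rng.uniform(-1, 1, size=(n_items, d))          # item embedding (toy)
particles = rng.uniform(-1, 1, size=(n_particles, d))  # posterior over w

def closer_to_first(w, p, q):
    # Noiseless comparison oracle: is w closer to item p than to item q?
    return np.linalg.norm(w - p) < np.linalg.norm(w - q)

for _ in range(25):
    # Pick, among random candidate pairs, the one whose bisecting
    # hyperplane splits the particle cloud most evenly (a crude proxy
    # for information maximization).
    best_pair, best_score = None, np.inf
    for _ in range(50):
        i, j = rng.choice(n_items, size=2, replace=False)
        side = (np.linalg.norm(particles - items[i], axis=1)
                < np.linalg.norm(particles - items[j], axis=1))
        score = abs(side.mean() - 0.5)
        if score < best_score:
            best_pair, best_score = (i, j), score
    i, j = best_pair
    side = (np.linalg.norm(particles - items[i], axis=1)
            < np.linalg.norm(particles - items[j], axis=1))
    # Keep particles consistent with the response, then resample with
    # jitter so the particle set never collapses.
    particles = particles[side == closer_to_first(w_true, items[i], items[j])]
    particles = (particles[rng.integers(len(particles), size=n_particles)]
                 + 0.01 * rng.normal(size=(n_particles, d)))

w_hat = particles.mean(axis=0)
print(np.linalg.norm(w_hat - w_true))
```

The paper's contribution is in replacing this balance heuristic with strategies that provably approximate information maximization under a noisy response model.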



Fast Numerical Methods for Convex Problems with Optimal Transport Regularization

John LEE, Christopher J. ROZELL

Georgia Institute of Technology, United States of America

Although convex losses (e.g., the L1-norm for sparsity or the nuclear norm for rank) are popular toolbox staples, they do not account for underlying geometry in the support when comparing signals. The optimal transport (OT) distance is a powerful mathematical framework that compares signals by measuring the "effort" required to "push" mass between their configurations. We propose fast numerical approaches that overcome OT's notorious computational complexity and that (i) have time complexity comparable to entropic-regularized counterparts while remaining free of approximation artifacts, (ii) extend variational OT problems to the space of Euclidean signals, and (iii) extend easily to recent unbalanced OT formulations.
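For context, the entropic-regularized baseline the abstract compares against can be sketched in a few lines of NumPy (Sinkhorn iterations on toy 1-D histograms; all values here are illustrative, and the regularization `eps` controls the blur whose artifacts the proposed methods avoid):

```python
import numpy as np

# Two 1-D histograms on a common grid (toy signals, shifted Gaussians).
n = 64
x = np.linspace(0.0, 1.0, n)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()

# Entropic-regularized OT via Sinkhorn iterations -- the baseline whose
# approximation artifacts motivate the abstract's alternatives.
C = (x[:, None] - x[None, :]) ** 2       # squared-distance ground cost
eps = 0.05                               # entropic regularization strength
K = np.exp(-C / eps)                     # Gibbs kernel
u = np.ones(n)
for _ in range(1000):
    v = b / (K.T @ u)                    # alternate marginal scalings
    u = a / (K @ v)
P = u[:, None] * K * v[None, :]          # entropic transport plan
ot_cost = (P * C).sum()                  # transport cost of the plan

print(ot_cost)
```

Shrinking `eps` reduces the entropic bias in `ot_cost` but makes the iterations slower and numerically fragile, which is the trade-off the abstract's methods aim to sidestep.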


Lunch + Poster session 4
Time: 04/Jul/2019: 12:10pm-2:30pm · Location: C101 - C103

Natural Variation Transfer using Learned Manifold Operators

Marissa CONNOR1, Benjamin CULPEPPER2, Huy NGUYEN2, Christopher ROZELL1

1Georgia Institute of Technology, United States of America; 2Yahoo! Research, United States of America

The variations in many types of data are constrained by natural physical laws, and identity-preserving variations are often shared between several classes. In many settings, the within-class variation in a high-dimensional dataset can be modeled as lying on a low-dimensional manifold. If the manifold structure shared between classes can be learned, it can be exploited for transfer learning tasks. In this work, we represent the manifold structure using a learned dictionary of generative operators and develop methods that use those operators for few-shot learning and realistic data generation, demonstrating state-of-the-art manifold learning performance in transfer settings.
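To give a flavor of the operator representation (a hypothetical single-operator example, not the learned dictionary): a path along the manifold is generated by exponentiating an operator, x(t) = expm(t·Psi)·x0. Here Psi is hard-coded as the generator of in-plane rotation, whereas in the work above such operators are learned from data.

```python
import numpy as np
from scipy.linalg import expm

# A single "transport operator" Psi acting on the data space; the paper
# learns a dictionary of such operators, here one is hard-coded as the
# generator of 2-D rotation (assumption, for illustration only).
Psi = np.array([[0.0, -1.0],
                [1.0,  0.0]])

x0 = np.array([1.0, 0.0])
t = np.pi / 2
x_t = expm(t * Psi) @ x0   # transport a quarter turn along the manifold

print(x_t)
```

Composing a few learned operators with different coefficients is what allows the same identity-preserving variations to be transferred between classes.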

Conference: SPARS 2019