Infinia ML Research Accepted at NIPS 2018

September 10, 2018 — Infinia ML is focused on business impact, but our commitment to academic research keeps us on the cutting edge of machine learning.

That’s why we’re proud to announce that the work of our Research Scientist Hongteng Xu, Ph.D., was among the ~22% of papers accepted at the upcoming Conference on Neural Information Processing Systems (NIPS), which Quartz has called “the world’s biggest and most important AI conference.”

In “Distilled Wasserstein Learning for Word Embedding and Topic Modeling,” Dr. Xu and his co-authors, including our Chief Scientist Larry Carin, propose a machine learning method with potential clinical applications in disease network construction, mortality prediction, and procedure recommendation.

Since joining Infinia ML in March 2018, Dr. Xu has been highly productive, co-authoring seven other accepted papers under our company name:

Quaternion Convolutional Neural Networks, The European Conference on Computer Vision (ECCV), 2018. (Acceptance Rate: ~25%)

Predicting Smoking Events with a Time-Varying Semi-Parametric Hawkes Process Model, The Conference on Machine Learning for Healthcare, 2018.

Learning Registered Point Processes from Idiosyncratic Observations, The International Conference on Machine Learning (ICML), 2018. (Acceptance Rate: 25%)

Flexible Network Binarization with Layer-wise Priority, The International Conference on Image Processing (ICIP), 2018.

Online Continuous-Time Tensor Factorization Based on Pairwise Interactive Point Processes, The International Joint Conference on Artificial Intelligence (IJCAI-ECAI), 2018. (Oral, Acceptance Rate: 20.4%)

Learning an Inverse Tone Mapping Network with a Generative Adversarial Regularizer, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018.

Benefits from Superposed Hawkes Processes, The 21st International Conference on Artificial Intelligence and Statistics (AISTATS), 2018. (Acceptance Rate: 33.2%)

Congratulations, Dr. Hongteng Xu, and keep up the good work!