As a result, at OpenAI engineers and researchers work closely together to build these large systems, as opposed to a strict researcher/engineer split. In this talk, we will go over some of the lessons we've learned, and how they come together in the design and internals of our system for learning-based robotics research.

Large-scale Machine Learning: Deep, Distributed and Multi-Dimensional: Modern machine learning involves deep neural network architectures, which yield state-of-the-art performance in multiple domains such as computer vision, natural language processing, and speech recognition. As data and models scale, multiple processing units become necessary for both training and inference. Pushing the current boundaries of deep learning requires using multiple dimensions and modalities. These can be encoded into tensors, which are natural extensions of matrices.

She has been featured in a number of forums such as YourStory, the Quora ML session, O'Reilly Media, and so on. She received her B.Tech in Electrical Engineering from IIT Madras in 2004 and her Ph.D. from Cornell University in 2009.
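The idea that tensors extend matrices to higher orders can be made concrete with a small sketch (the data and shapes below are illustrative, not from the talk):

```python
import numpy as np

# A matrix is a 2-way array; a tensor generalizes this to any number of
# modes. For example, a short RGB video clip is naturally a 4-way tensor
# indexed by (frame, height, width, color channel).
clip = np.zeros((30, 64, 64, 3))  # 30 frames of 64x64 RGB images

print(clip.ndim)   # 4 modes, versus 2 for a matrix
print(clip.shape)  # (30, 64, 64, 3)
```

Flattening such data into a matrix discards which axis each entry came from; keeping it as a tensor preserves that structure.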
She is currently a member of the SIAM Board of Trustees and serves as associate editor for both the SIAM J.

In this talk, we demonstrate the wide-ranging utility of the canonical polyadic (CP) tensor decomposition, with examples in neuroscience and chemical detection.
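The structure of the CP decomposition mentioned above can be sketched in a few lines of NumPy: a rank-R CP model writes a 3-way tensor as a sum of R rank-1 terms, X[i,j,k] = Σ_r A[i,r]·B[j,r]·C[k,r]. The sizes and random factors below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3  # tensor dimensions and CP rank (arbitrary choices)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Assemble the tensor from its factors: sum over the shared rank index r.
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalently, each rank-1 component is the outer product of one column
# from each factor matrix; summing the R components rebuilds X.
X_check = sum(
    np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r])
    for r in range(R)
)
assert np.allclose(X, X_check)
```

In practice the factors A, B, C are fitted to data (e.g. by alternating least squares); libraries such as TensorLy or Tensor Toolbox provide production implementations.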
We present new deep learning architectures that preserve the multi-dimensional information in data end-to-end.
We show that tensor contractions and regression layers are an effective replacement for fully connected layers in deep learning architectures.
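The contrast between a fully connected layer and a low-rank tensor contraction can be sketched as follows. This is a minimal illustration of the general idea, not the architecture from the paper; all shapes, ranks, and random weights are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, H, W, C, n_out = 8, 6, 6, 16, 10  # illustrative sizes

x = rng.standard_normal((batch, H, W, C))  # activation tensor

# Fully connected baseline: flatten the activations, losing the
# (H, W, C) structure. Weight count: H*W*C*n_out = 5760 here.
W_fc = rng.standard_normal((H * W * C, n_out))
y_fc = x.reshape(batch, -1) @ W_fc

# Low-rank tensor contraction: one small factor per mode plus a core,
# contracted against the intact tensor. Weight count here:
# 6*3 + 6*3 + 16*4 + 3*3*4*10 = 460, far fewer than the dense layer.
r1, r2, r3 = 3, 3, 4  # ranks (arbitrary choices)
U1 = rng.standard_normal((H, r1))
U2 = rng.standard_normal((W, r2))
U3 = rng.standard_normal((C, r3))
G = rng.standard_normal((r1, r2, r3, n_out))
y_tr = np.einsum('nhwc,hp,wq,cr,pqro->no', x, U1, U2, U3, G)

print(y_fc.shape, y_tr.shape)  # both (8, 10)
```

Both layers map the activation tensor to n_out outputs, but the tensor version never flattens the input and uses a fraction of the parameters, which is the effect the abstract describes.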