Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series
We develop a family of stable continuous-time neural state-space models.
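A minimal sketch of the general idea, assuming a linear time-invariant latent system (not the paper's architecture): stability can be enforced by construction of the transition matrix, and irregular sampling handled by discretizing exactly over each observed time gap.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
d = 4  # latent dimension (illustrative)

# Stable parameterization: A = (S - S^T) - L L^T - eps*I has eigenvalues with
# strictly negative real part, so the continuous-time dynamics dx/dt = A x
# are stable for any learned parameters L, S.
L = rng.normal(size=(d, d))
S = rng.normal(size=(d, d))
A = (S - S.T) - L @ L.T - 1e-3 * np.eye(d)

def step(x, dt):
    """Exact discretization of dx/dt = A x over an irregular gap dt."""
    return expm(A * dt) @ x

# Irregularly-sampled timestamps: the discretization adapts to each gap.
times = np.cumsum(rng.exponential(scale=0.5, size=10))
x = rng.normal(size=d)
prev_t = 0.0
for t in times:
    x = step(x, t - prev_t)
    prev_t = t
print("latent state after irregular rollout:", x)
```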
We explore the potential of LLMs to act as zero-shot human models for human-robot interaction (HRI). We contribute an empirical study and case studies on a simulated table-clearing task and a new robot utensil-passing experiment.
We extend gradient flow methods to a variety of high-quality image synthesis tasks using a novel density ratio learning method.
We contribute an empirical study into the effectiveness of LLMs, specifically GPT-3.5 variants, for the task of natural language goal translation to PDDL.
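For illustration, a minimal sketch of a goal-to-PDDL translation setup; the prompt format, the predicates, and the `query_llm` stand-in are assumptions here, not the paper's pipeline.

```python
# Illustrative only: ask an LLM to map a natural language goal to a PDDL
# :goal expression. `query_llm` is a hypothetical stand-in for whatever
# GPT-3.5 client is used.
PROMPT_TEMPLATE = """\
Domain predicates: (on ?x ?y) (clear ?x) (holding ?x)
Objects: blockA blockB blockC
Translate the goal into a PDDL goal expression only.

Goal: {goal}
PDDL:"""

def translate_goal(goal: str, query_llm) -> str:
    """Return a PDDL goal string, e.g. '(and (on blockA blockB) (clear blockC))'."""
    return query_llm(PROMPT_TEMPLATE.format(goal=goal)).strip()

# Usage with a canned stand-in for the model:
fake_llm = lambda prompt: "(and (on blockA blockB) (on blockB blockC))"
print(translate_goal("stack A on B, and B on C", fake_llm))
```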
We transfer source policies to a target reinforcement learning task with safety constraints using Successor Features.
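A minimal sketch of successor-feature transfer via generalized policy improvement, with a hypothetical cost-feature screen standing in for the safety constraint; this is an illustration under those assumptions, not the paper's algorithm.

```python
# Each source policy i comes with successor features psi_i(s, a), so its
# value under a new task reward r(s, a) = phi(s, a)^T w is psi_i(s, a)^T w.
# Generalized policy improvement acts greedily across source policies; a
# hypothetical cost weight w_cost masks choices whose predicted cumulative
# cost exceeds a budget.
import numpy as np

def gpi_action(psi, w_task, w_cost, budget):
    """psi: (n_policies, n_actions, d) successor features at the current state."""
    q = psi @ w_task                        # (n_policies, n_actions) task values
    c = psi @ w_cost                        # predicted cumulative safety cost
    q = np.where(c <= budget, q, -np.inf)   # mask unsafe policy/action pairs
    i, a = np.unravel_index(np.argmax(q), q.shape)
    return i, a                             # source policy to follow, and its action

psi = np.random.default_rng(1).normal(size=(3, 4, 5))
i, a = gpi_action(psi, w_task=np.ones(5), w_cost=np.full(5, 0.2), budget=1.0)
print(f"follow source policy {i}, take action {a}")
```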
We introduce a model of multi-population learning with heterogeneous beliefs.