Embedding Symbolic Temporal Knowledge into Deep Sequential Models
We embed symbolic knowledge expressed as linear temporal logic (LTL) and use these embeddings to guide the training of deep sequential models.
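One way to picture this is a recursive encoder over an LTL formula's syntax tree, whose output embedding can then be compared against a sequence model's latent state. The sketch below is illustrative only: the operator matrices, proposition vectors, and cosine-alignment loss are hypothetical stand-ins, not the actual method or architecture used in this work.

```python
import numpy as np

rng = np.random.default_rng(1)
DIM = 16

# Hypothetical parameters: one matrix per unary LTL operator, one per binary
# operator, and a vector per atomic proposition (all random for illustration).
OPS = {op: rng.normal(scale=0.5, size=(DIM, DIM)) for op in ("G", "F", "X", "not")}
BIN_OPS = {op: rng.normal(scale=0.5, size=(DIM, 2 * DIM)) for op in ("U", "and", "or")}
PROPS = {p: rng.normal(size=DIM) for p in ("safe", "goal")}

def embed(formula):
    """Recursively embed an LTL formula given as a nested tuple,
    e.g. ("G", ("F", "goal")) for 'always eventually goal'."""
    if isinstance(formula, str):          # atomic proposition
        return PROPS[formula]
    op, *args = formula
    if len(args) == 1:                    # unary operator (G, F, X, not)
        return np.tanh(OPS[op] @ embed(args[0]))
    # binary operator (U, and, or): concatenate child embeddings
    return np.tanh(BIN_OPS[op] @ np.concatenate([embed(args[0]), embed(args[1])]))

def alignment_loss(traj_emb, formula_emb):
    """Cosine-distance loss encouraging a trajectory embedding to align
    with the formula embedding (in [0, 2])."""
    cos = traj_emb @ formula_emb / (
        np.linalg.norm(traj_emb) * np.linalg.norm(formula_emb)
    )
    return 1.0 - cos

phi = ("U", "safe", "goal")            # "safe holds until goal"
traj_emb = rng.normal(size=DIM)        # stand-in for a sequence model's latent
loss = alignment_loss(traj_emb, embed(phi))
```

Such a loss term could be added to the sequence model's training objective so that latent trajectories are pulled toward satisfying formulas, though the precise guidance mechanism in the actual work may differ.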
We construct a shared latent space from different sensory modalities via contrastive learning.
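A common instantiation of this idea is a CLIP-style symmetric InfoNCE objective: each modality gets its own encoder projecting into a shared space, and matching pairs in a batch are treated as positives against all other pairs. The NumPy sketch below uses random linear projections as placeholder encoders; the batch size, dimensions, and temperature are assumptions, not values from this work.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1):
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def log_softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # numerical stability
    return x - np.log(np.exp(x).sum(axis=axis, keepdims=True))

def info_nce(z_a, z_b, temperature=0.1):
    """Symmetric InfoNCE loss: row i of z_a and row i of z_b are a
    positive pair; all other rows in the batch act as negatives."""
    logits = z_a @ z_b.T / temperature
    labels = np.arange(len(z_a))
    loss_ab = -log_softmax(logits, axis=1)[labels, labels].mean()
    loss_ba = -log_softmax(logits.T, axis=1)[labels, labels].mean()
    return (loss_ab + loss_ba) / 2

# Placeholder "encoders": random linear maps from each modality into a
# shared 16-d space, followed by L2 normalization.
x_vision = rng.normal(size=(8, 32))    # hypothetical vision features
x_audio = rng.normal(size=(8, 64))     # hypothetical audio features
W_vis = rng.normal(size=(32, 16))
W_aud = rng.normal(size=(64, 16))
z_vis = l2_normalize(x_vision @ W_vis)
z_aud = l2_normalize(x_audio @ W_aud)

loss = info_nce(z_vis, z_aud)
```

Minimizing this loss pulls matched cross-modal pairs together and pushes mismatched pairs apart, yielding a shared latent space; in practice the linear maps would be replaced by trained deep encoders.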
Bridging the gap between symbolic and connectionist paradigms via Graph Neural Network embeddings
Using Gradient Flows to Refine Samples from Deep Generative Models
Accurate, Fast, and Low-powered Multi-Sensory Perception via Neuromorphic Sensing and Learning
We examine how recent advances in psychometrics, trustworthy systems, and deep learning can help address challenges of trust in real-world human-robot interactions.