Sitemap
A list of all the posts and pages found on the site. For you robots out there, there is an XML version available for digesting as well.
Pages
Posts
Future Blog Post
Published:
This post will show up by default. To disable scheduling of future posts, edit config.yml and set future: false.
Blog Post number 4
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 3
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 2
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 1
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
lab
portfolio
Portfolio item number 1
Short description of portfolio item number 1
Portfolio item number 2
Short description of portfolio item number 2
publications
CCN+: A neuro-symbolic framework for deep learning with requirements
Published in International Journal of Approximate Reasoning, 2024
In this work, we present a neural layer that makes any network compliant by design with constraints expressed in full propositional logic. The paper also presents an adaptation of the standard binary cross-entropy loss that guarantees the correct behaviour of the gradients through the layer; a sketch of the general idea follows below.
Recommended citation: Eleonora Giunchiglia, Alex Tatomir, Mihaela Cătălina Stoian, and Thomas Lukasiewicz. CCN+: A neuro-symbolic framework for deep learning with requirements. International Journal of Approximate Reasoning, pages 109-124, 2024.
Download Paper | Download Bibtex
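The following is a minimal sketch of the general idea for a single implication constraint; it is illustrative only, not the paper's actual layer construction, and the function name and max-based correction are assumptions:

```python
import torch

def enforce_implication(probs: torch.Tensor, a: int, b: int) -> torch.Tensor:
    """Hypothetical post-processing enforcing the rule 'class a implies class b'.

    Raising probs[..., b] to at least probs[..., a] guarantees that any
    threshold accepting class a also accepts class b, so the implication
    holds by construction for the corrected outputs.
    """
    out = probs.clone()
    out[..., b] = torch.maximum(out[..., b], out[..., a])
    return out

probs = torch.sigmoid(torch.randn(4, 3))        # raw multi-label predictions
safe = enforce_implication(probs, a=0, b=2)     # constraint: label 0 -> label 2
assert torch.all(safe[..., 0] <= safe[..., 2])  # implication holds by design
```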
How Realistic Is Your Synthetic Data? Constraining Deep Generative Models for Tabular Data
Published in ICLR, 2024
This paper is about the number 3. The number 4 is left for future work.
Recommended citation: Mihaela C. Stoian, Salijona Dyrmishi, Maxime Cordy, Thomas Lukasiewicz, and Eleonora Giunchiglia. How Realistic Is Your Synthetic Data? Constraining Deep Generative Models for Tabular Data. In Proceedings of International Conference on Learning Representations, 2024.
Download Paper | Download Slides
PiShield: A PyTorch Package for Learning with Requirements
Published in IJCAI, 2024
In this paper we present PiShield, a PyTorch package that brings together our whole line of work on learning with requirements. PiShield exposes the function build_shield_layer, which takes as input the path to a .txt file where your requirements are written and returns a PyTorch layer that can be added on top of any network and guarantees that the requirements are satisfied; a usage sketch follows below.
Recommended citation: Mihaela Cătălina Stoian, Alex Tatomir, Thomas Lukasiewicz, and Eleonora Giunchiglia. PiShield: A PyTorch Package for Learning with Requirements. In Proceedings of IJCAI, 2024.
Download Paper
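A minimal usage sketch based on the description above. Only the function name build_shield_layer and the fact that it takes the path to a requirements file come from the text; the import path, the file name, and the backbone network are assumptions and may differ from the package's actual API:

```python
import torch
from pishield import build_shield_layer  # assumed import path

# Hypothetical requirements file containing the constraints, one per line.
shield = build_shield_layer("requirements.txt")

# Any backbone network; the shield layer sits on top of its outputs.
backbone = torch.nn.Sequential(
    torch.nn.Linear(16, 32),
    torch.nn.ReLU(),
    torch.nn.Linear(32, 4),
)

x = torch.randn(8, 16)
raw = backbone(x)
safe = shield(raw)  # outputs are guaranteed to satisfy the requirements
```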
Exploiting T-norms for Deep Learning in Autonomous Driving
Published in NeSy, 2024
This paper shows how incorporating logical constraints into the loss function can improve results in the autonomous driving setting. This application domain is particularly challenging because, for every frame, we have to encode hundreds of constraints over hundreds of bounding boxes. To keep the resulting memory demand manageable, we propose an encoding of the constraints that uses sparse matrices. A sketch of the underlying idea follows below.
Recommended citation: Mihaela C. Stoian, Eleonora Giunchiglia, and Thomas Lukasiewicz. Exploiting t-norms for deep learning in autonomous driving. In Proceedings of the International Workshop on Neural-Symbolic Learning and Reasoning, 2023.
Download Paper
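As a hedged illustration of the approach (not the paper's exact formulation), the product t-norm turns a logical rule such as A implies B over predicted probabilities into a differentiable penalty; all names below are hypothetical:

```python
import torch

def implication_penalty(p_a: torch.Tensor, p_b: torch.Tensor) -> torch.Tensor:
    """Differentiable penalty for violating A -> B under the product t-norm.

    Under product t-norm semantics, the truth degree of 'A and not B' is
    p_a * (1 - p_b); driving it towards zero encourages the implication.
    """
    return (p_a * (1.0 - p_b)).mean()

# Hypothetical per-box probabilities: 'pedestrian' should imply 'person'.
p_pedestrian = torch.tensor([0.9, 0.2, 0.7])
p_person = torch.tensor([0.95, 0.10, 0.30])
loss = implication_penalty(p_pedestrian, p_person)
print(loss)  # dominated by the third box, which violates the rule
```

In the paper's setting, many such terms accumulate across constraints and bounding boxes in every frame, which is where the sparse-matrix encoding matters.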
ULLER: A unified language for learning and reasoning
Published in NeSy (Spotlight), 2024
In this paper we present ULLER: a Unified Language for LEarning and Reasoning. ULLER has a first-order logic syntax specialised for NeSy, for which we provide example semantics including classical FOL, fuzzy logic, and probabilistic logic. This paper represents a first step in a longer-term project whose goal is to create a library that makes NeSy methods more accessible and comparable; an illustrative fuzzy semantics is sketched below.
Recommended citation: van Krieken, E., Badreddine, S., Manhaeve, R., and Giunchiglia, E. ULLER: A unified language for learning and reasoning. In International Conference on Neural-Symbolic Learning and Reasoning, pp. 219–239. Springer, 2024.
Download Paper
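As an illustration of what a fuzzy semantics for a first-order formula can look like (a standard textbook rendering, not ULLER's official definition), a universally quantified implication over a dataset \(D\) can be interpreted with min-aggregation and the Goguen residuum of the product t-norm:

\[
\llbracket \forall x\, (P(x) \rightarrow Q(x)) \rrbracket_D = \min_{x \in D} I\big(p(x), q(x)\big),
\qquad
I(a, b) = \begin{cases} 1 & \text{if } a \le b,\\ b/a & \text{otherwise,} \end{cases}
\]

where \(p(x)\) and \(q(x)\) are the truth degrees assigned to \(P(x)\) and \(Q(x)\).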
Beyond the convexity assumption: Realistic tabular data generation under quantifier-free real linear constraints
Published in ICLR, 2025
This paper is about creating a neural layer able to make any neural network compliant by design with constraints expressed as disjunctions over linear inequalities, i.e., each constraint can take the form \(\Phi_1 \vee \Phi_2 \vee \ldots \vee \Phi_n\), where each \(\Phi_i\), for \(i = 1, \ldots, n\), is a linear inequality. This problem is particularly interesting because the constraints can define non-convex and even disconnected spaces; a small worked example follows below.
Recommended citation: Mihaela C. Stoian and Eleonora Giunchiglia. Beyond the convexity assumption: Realistic tabular data generation under quantifier-free real linear constraints. In The Thirteenth International Conference on Learning Representations (ICLR), 2025.
Download Paper
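A small worked example of the constraint class (hypothetical numbers, not the paper's layer): a disjunctive constraint is satisfied as soon as one of its linear inequalities holds, which is how the feasible region can become non-convex or even disconnected:

```python
import numpy as np

# One constraint Phi_1 v Phi_2 over x = (x1, x2):
#   Phi_1:  x1 - x2 <= -1
#   Phi_2: -x1 + x2 <= -1
# The two half-planes are disjoint, so the feasible set is disconnected.
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])
b = np.array([-1.0, -1.0])

def satisfies(x: np.ndarray) -> bool:
    # The disjunction holds if any single inequality A[i] @ x <= b[i] holds.
    return bool(np.any(A @ x <= b))

print(satisfies(np.array([0.0, 2.0])))  # True: Phi_1 holds
print(satisfies(np.array([2.0, 0.0])))  # True: Phi_2 holds
print(satisfies(np.array([0.0, 0.0])))  # False: neither disjunct holds
```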
talks
Talk 1 on Relevant Topic in Your Field
Published:
This is a description of your talk, which is a markdown file that can be all markdown-ified like any other post. Yay markdown!
Conference Proceeding talk 3 on Relevant Topic in Your Field
Published:
This is a description of your conference proceedings talk, note the different field in type. You can put anything in this field.
teaching
Mathematics for Machine Learning
Graduate level course, Imperial College London, I-X, 2025
Machine Learning is at the core of contemporary AI research and applications. This module develops a foundation for the mathematical theory underpinning key ML methods, which is necessary for their understanding and analysis. The module covers six units: Linear Algebra, Geometry, Calculus, Optimisation, Probability, and Statistics. Together, these establish a comprehensive setting that strengthens the student's understanding of widely used ML models and methods.