Portfolio item number 1
Short description of portfolio item number 1
Short description of portfolio item number 2
Published in International Journal of Approximate Reasoning, 2024
In this work we present a neural layer that makes any network compliant by design with constraints expressed in full propositional logic. The paper also presents an adaptation of the standard binary cross-entropy loss that guarantees the correct behaviour of the gradients through the layer.
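As a toy illustration of constraint enforcement at the output layer, the sketch below makes a single implication constraint (e.g., dog → animal) hold by raising the consequent's predicted probability to at least the antecedent's. The actual CCN+ layer handles arbitrary propositional constraints and is a differentiable module, so the function name and dictionary representation here are illustrative only.

```python
def enforce_implication(probs, antecedent, consequent):
    """Return a copy of `probs` in which the implication
    `antecedent -> consequent` holds, by raising the consequent's
    probability to at least the antecedent's."""
    fixed = dict(probs)
    fixed[consequent] = max(fixed[consequent], fixed[antecedent])
    return fixed

preds = {"dog": 0.9, "animal": 0.4}   # violates dog -> animal
fixed = enforce_implication(preds, "dog", "animal")
print(fixed["animal"])  # 0.9: the constraint now holds by construction
```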
Recommended citation: Eleonora Giunchiglia, Alex Tatomir, Mihaela Cǎtǎlina Stoian, and Thomas Lukasiewicz. CCN+: A neuro-symbolic framework for deep learning with requirements. International Journal of Approximate Reasoning, pages 109–124, 2024.
Download Paper | Download Bibtex
Published in ICLR, 2024
In this paper we show how to constrain deep generative models for tabular data so that the synthetic samples they produce comply with background constraints, making them more realistic.
Recommended citation: Mihaela C. Stoian, Salijona Dyrmishi, Maxime Cordy, Thomas Lukasiewicz, and Eleonora Giunchiglia. How Realistic Is Your Synthetic Data? Constraining Deep Generative Models for Tabular Data. In Proceedings of International Conference on Learning Representations, 2024.
Download Paper | Download Slides
Published in IJCAI, 2024
In this paper we present PiShield, a PyTorch package that encompasses our entire line of work. PiShield gives you access to the function build_shield_layer, which takes as input the path to a txt file where your requirements are written and returns a PyTorch layer that can be added on top of any network and guarantees the satisfaction of the requirements.
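A minimal usage sketch: the function name build_shield_layer and the txt-file input come from the description above, but the import path, the exact signature, and the constraint syntax in the requirements file are assumptions here and should be checked against the package documentation.

```python
from pathlib import Path

# 1. Write the requirements to a plain-text file, one constraint per line
#    (the constraint below uses a hypothetical syntax).
Path("requirements.txt").write_text("y_0 + y_1 <= 1\n")

# 2. Build the shield layer and stack it on top of any network, e.g.:
#
#    from pishield import build_shield_layer
#    shield = build_shield_layer("requirements.txt")
#    model = torch.nn.Sequential(backbone, shield)
#
# The shielded model's outputs are then guaranteed to satisfy the
# requirements written in the file.
```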
Recommended citation: Mihaela Cătălina Stoian, Alex Tatomir, Thomas Lukasiewicz, and Eleonora Giunchiglia. PiShield: A PyTorch Package for Learning with Requirements. In Proceedings of IJCAI, 2024.
Download Paper
Published in NeSy, 2024
This paper shows how incorporating logical constraints in the loss function can improve the results in the autonomous driving setting. This application domain is particularly challenging because, for every frame, we have to encode hundreds of constraints over hundreds of bounding boxes. To overcome the resulting high memory demand, we propose an encoding of the constraints that uses sparse matrices.
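The loss side of this approach can be sketched with a product t-norm, which maps a conjunction of fuzzy truth values to a single differentiable satisfaction degree. The sparse-matrix encoding that makes this scale to hundreds of constraints per frame is the paper's contribution and is not shown here.

```python
def product_tnorm(truth_values):
    """Product t-norm: truth degree of a conjunction of fuzzy truth
    values, each in [0, 1]."""
    degree = 1.0
    for v in truth_values:
        degree *= v
    return degree

# Two constraints satisfied to degrees 0.9 and 0.8: the constraint loss
# is one minus the truth degree of their conjunction.
loss = 1.0 - product_tnorm([0.9, 0.8])
print(loss)  # ~0.28
```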
Recommended citation: Mihaela C. Stoian, Eleonora Giunchiglia, and Thomas Lukasiewicz. Exploiting t-norms for deep learning in autonomous driving. In Proceedings of the International Workshop on Neural-Symbolic Learning and Reasoning, 2023.
Download Paper
Published in NeSy (Spotlight), 2024
In this paper we present ULLER: a Unified Language for LEarning and Reasoning. ULLER has a first-order logic syntax specialised for NeSy, for which we provide example semantics including classical FOL, fuzzy logic, and probabilistic logic. This paper represents a first step in a longer-term project whose goal is to create a library to make NeSy methods more accessible and comparable.
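As a toy illustration of the "one syntax, many semantics" idea (ULLER's actual syntax and semantics are far richer), the same conjunction connective can be interpreted classically or in Gödel fuzzy logic:

```python
def conj_classical(a: bool, b: bool) -> bool:
    """Conjunction under classical Boolean semantics."""
    return a and b

def conj_fuzzy(a: float, b: float) -> float:
    """Conjunction under Goedel fuzzy semantics (minimum t-norm)."""
    return min(a, b)

print(conj_classical(True, False))  # False
print(conj_fuzzy(0.9, 0.4))         # 0.4
```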
Recommended citation: van Krieken, E., Badreddine, S., Manhaeve, R., and Giunchiglia, E. Uller: A unified language for learning and reasoning. In International Conference on Neural-Symbolic Learning and Reasoning, pp. 219–239. Springer, 2024.
Download Paper
Published in ICLR, 2025
This paper presents a neural layer able to make any neural network compliant by design with constraints expressed as disjunctions over linear inequalities, i.e., each constraint can have the form \(\Phi_1 \vee \Phi_2 \vee \ldots \vee \Phi_n\), where each \(\Phi_i\), for \(i = 1, \ldots, n\), is a linear inequality. This problem is particularly interesting because the constraints can define non-convex and even disconnected spaces.
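A one-dimensional sketch of why such constraints are hard: the disjunction (x ≤ 0) ∨ (x ≥ 2) defines a disconnected feasible set, and enforcing it means moving a violating value to the nearest satisfying point. The interval representation and projection below are illustrative only, not the paper's actual layer.

```python
def project_to_disjunction(x, intervals):
    """Move x to the nearest point satisfying at least one disjunct,
    each disjunct being a closed interval (lo, hi) on the real line."""
    if any(lo <= x <= hi for lo, hi in intervals):
        return x  # already feasible
    endpoints = [p for lo, hi in intervals for p in (lo, hi)
                 if p not in (float("-inf"), float("inf"))]
    return min(endpoints, key=lambda p: abs(p - x))

# (x <= 0) or (x >= 2): a non-convex, disconnected feasible set.
disjuncts = [(float("-inf"), 0.0), (2.0, float("inf"))]
print(project_to_disjunction(0.7, disjuncts))  # 0.0 (nearest boundary)
print(project_to_disjunction(1.5, disjuncts))  # 2.0
print(project_to_disjunction(3.0, disjuncts))  # 3.0 (already feasible)
```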
Recommended citation: Mihaela C Stoian and Eleonora Giunchiglia. Beyond the convexity assumption: Realistic tabular data generation under quantifier-free real linear constraints. In The Thirteenth International Conference on Learning Representations (ICLR), 2025.
Download Paper
Graduate-level course, Imperial College London, I-X, 2025
Machine Learning is at the core of contemporary AI research and applications. This module develops a foundation for the mathematical theory underpinning key ML methods, which is necessary for their understanding and analysis. The module covers six units: Linear Algebra, Geometry, Calculus, Optimisation, Probability, and Statistics, establishing a comprehensive setting to strengthen the student's understanding of widely used ML models and methods.