Alessandro Oltramari, Ph.D.
Senior Research Scientist in the areas of Knowledge Representation and Reasoning, Cognitive Systems and Neuro-Symbolic AI
I work in the area of Intelligent Assistance, with a focus on decision support systems that combine machine perception and reasoning. My primary interest is investigating how semantic resources, whether structured or unstructured, can be integrated with data-driven algorithms to help machines make sense of the physical and digital worlds. I strive to make progress in the area of Human-Machine Collaboration, which can benefit greatly from designing AI-based systems that infuse powerful neural models with transparent knowledge representations.
Please tell us what fascinates you most about research.
For me, research is a journey of the mind. It’s an exploration of a problem space that enables a map of solutions to be drawn. As with any human endeavor, research is accompanied by a full range of emotions — joy, frustration, disappointment, and surprise are all part of any scientist’s life. I should add that research for me has also been a journey in the literal sense of the word. I left Italy in search of better opportunities to fulfill my career goals. At Bosch Research, I found my ideal “research home.”
What makes research done at Bosch so special?
There are several aspects that make working at Bosch special — from the multi-cultural environment to the open dialog with upper management. But the essential feature for me is the balance between business-driven research problems and scientific investigations born out of research teams. At Bosch, striking the balance between short-term and long-term research is facilitated by a rigorous yet flexible practice that promotes clear goal setting, business model construction and validation, collaborative planning and task execution, retrospective review, and open discussion.
What research topics are you currently working on at Bosch?
I’m investigating how we can build more robust intelligent assistants by integrating heterogeneous knowledge sources (electronic documents, information on the Internet, social networks) with machine learning algorithms. Specifically, I’m looking at this topic from a concrete application angle in three different domains — context-aware chatbots for customer services, decision support systems for (semi-)autonomous cars, and scene understanding in smart environments.
What are the biggest scientific challenges in your field of research?
Explainable artificial intelligence is the number one challenge. In order to progress towards explainable AI, it is necessary to design hybrid systems that integrate human-accessible knowledge representations with neural models. Currently, no general framework exists for integrating deep networks with knowledge graphs.
How do the results of your research become part of solutions “Invented for life”?
Enabling explainable AI means endowing intelligent machines with semantic transparency, in terms of both internal functioning and correlation between input and output. Just as humans learn how to trust each other by sharing knowledge, explainable systems will make human-machine interaction more trustworthy and personalized, and therefore suitable for improving human life in many areas where AI is crucial (healthcare, mobility, etc.).
Curriculum vitae
Since 2024
Senior Research Scientist and President of the Carnegie Bosch Institute
Since 2023
Industry Mentor, Carnegie Bosch Institute (Pittsburgh, USA)
2010 to 2016
Research Associate, Carnegie Mellon University (Pittsburgh, USA)
2005 to 2006
Visiting Research Associate, Princeton University (Princeton, USA)
2001 to 2010
Research Fellow, Laboratory for Applied Ontology (CNR, Trento, Italy)
Selected publications
F. Ilievski et al. (2021)
- Filip Ilievski, Alessandro Oltramari, Kaixin Ma, Bin Zhang, Deborah L. McGuinness, and Pedro Szekely
- Knowledge-Based Systems 229 (2021): 107347.
K. Ma et al. (2021)
- K. Ma, F. Ilievski, J. Francis, Y. Bisk, E. Nyberg, A. Oltramari
- Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21)
- February 2-9, 2021
A. Oltramari et al. (2020)
- A. Oltramari, J. Francis, C. Henson, K. Ma, R. Wickramarachchi
- In “Knowledge Graphs for eXplainable Artificial Intelligence: Foundations, Applications and Challenges”
- Studies on the Semantic Web (47), Eds. I. Tiddi, F. Lecué, P. Hitzler.
S. Somers et al. (2020)
- S. Somers, A. Oltramari, C. Lebiere
- Proceedings of AAAI Spring Symposia 2020 (AAAI-MAKE: Combining Machine Learning and Knowledge Engineering in Practice)
- March 23-25, 2020