Explainable AI Methods – A brief overview (open access)
in HCAI success, Recent Publications, Science News / by Andreas Holzinger – Open access paper available, free to the international research community.
AI Technician Wanted (open position)
in General / by Andreas Holzinger – We are looking for an AI technician for our young research team, with an HTL degree in computer science/software engineering or comparable training, for the technical support of the Human-Centered AI Lab and our young research team (embodied intelligence, human-in-the-loop robotics, IoT, sensors, etc.) at the Institute of Forest Engineering of BOKU Wien, located at Campus Tulln, 30 minutes from Vienna's city center. We offer an extremely exciting research environment with promising development opportunities. If interested, please get in touch directly via andreas.holzinger AT boku.ac.at
Ben Shneiderman fosters Human-Centered AI
in General, HCAI success / by Andreas Holzinger – Ben Shneiderman fosters human-centered AI.
FWF Explainable AI project P 32554 in the News
in HCAI success, Science News / by Andreas Holzinger – This basic research project will contribute novel results, algorithms, and tools to the international AI and machine learning community.
Talk by Dr Isabelle Augenstein, Monday, December 6, 2021, 13:00 CET
in Explainability, Lectures / by Andreas Holzinger – HCAI research seminar.
The Next Frontier – AI we can really Trust
in Conferences, Recent Publications / by Andreas Holzinger – Robustness and explainability are the two ingredients to ensure trustworthy artificial intelligence (talk at ECML 2021).
Research Seminar, Friday, September 10, 2021, 11:00 CEST
in General / by Andreas Holzinger – HCAI research seminar.
Fairness in Artificial Intelligence Survey
in experiments / by Andreas Holzinger – Please take part in our study on fairness in artificial intelligence to help overcome bias in machine learning.
Please take part in our EMPAIA XAI Survey
in experiments, General / by Andreas Holzinger – The Human-Centered AI Lab invites you to take part in a causability measurement study to test the new causabilometer.
Please take part in our “Causabilometer” Survey
in experiments, Explainability / by Andreas Holzinger – The Human-Centered AI Lab invites you to take part in a causability measurement study to test the new causabilometer.