Call for Papers

The Actionable Knowledge Representation and Reasoning for Robots (AKR3) workshop, co-located with the Extended Semantic Web Conference (ESWC), is dedicated to Knowledge Representation and Reasoning (KRR) in the area of cognitive robotics, with the focus on acquiring knowledge from the Web and making it actionable for robotic applications. We aim to bring together the European communities specializing in KRR and robotics to increase collaboration and accelerate advancements in the field.

Household robots are still not able to autonomously prepare meals, set or clean the table, or do other chores besides vacuum cleaning. Much of the knowledge needed to refine vague task instructions and to transfer them to new task variations is contained in instruction websites like WikiHow, encyclopedic websites like Wikipedia, and many other web-based information sources. We argue that such knowledge can be used to teach robots to perform new task variations.

Given the availability of a plethora of sources and datasets of common sense knowledge on the Web (e.g., ConceptNet, OMICS, CSKG), as well as recent advances in language modeling, it is a timely research question to investigate which methods and approaches can enable robots to take advantage of existing common sense and task knowledge to reason about how to perform tasks in the real world. The main issue to be addressed is how to allow robots to perform tasks flexibly and adaptively, gracefully handling contextually determined variance in task execution. We expect this line of research to contribute to better generalizability and robustness of robots operating in everyday environments.

Invited Speaker:

The workshop will feature Prof. Dr. Lars Kunze from the Oxford Robotics Institute as an invited speaker!

Submission Guidelines:

We solicit papers on the following guiding topics, but are open to any related research direction and topic:

  • Knowledge Representation for cognitive robotics: the importance of linking object information to action and environment information
  • Approaches to leverage common sense and task knowledge from existing structured (e.g., common sense knowledge bases) or unstructured (e.g., the Web) sources
  • Linking common sense knowledge to perception and execution
  • Translation of task requests to body movements and parametrisation of such body movements with Web knowledge
  • Novel formalisms and approaches to represent and encode knowledge for robots
  • Novel cognitive architectures and paradigms supporting reasoning with Web knowledge
  • Use of large language models and prompting to infer action-relevant knowledge
  • Natural language processing applied to common sense and task knowledge extraction from unstructured sources

All papers must represent original work that has not been submitted to or published at another workshop or conference, in one of the following formats:

  • Full papers of up to 12 pages excluding references (formatted according to Springer LNCS) describing novel and substantial work, including an evaluation/validation of the proposed approach
  • Short papers of up to 8 pages excluding references (formatted according to Springer LNCS) describing preliminary work or presenting a position

Papers should be submitted via EasyChair: https://easychair.org/conferences/?conf=r3

Important Dates: https://kr3-workshop.net/important-dates/

Committees: https://kr3-workshop.net/organisation/

Venue: The workshop will be co-located with the Extended Semantic Web Conference (ESWC) in Hersonissos, Crete, Greece (https://2024.eswc-conferences.org/)

Publication:

Proceedings of the AKR3 Workshop will be published by CEUR Workshop Proceedings together with the proceedings of other workshops.

The best paper of AKR3 will appear in the companion volume of the conference post-proceedings published by Springer.

Contact: All questions about submissions should be emailed to Philipp Cimiano at cimiano@cit-ec.uni-bielefeld.de

Sponsors: The workshop is sponsored by the Joint Research Center on Cooperative and Cognition-enabled AI (CoAI JRC), https://coai-jrc.de