2nd LLMs4Subjects Shared Task: Energy- and Compute-Efficient LLM Systems
Join the 2nd LLMs4Subjects Shared Task, co-located with KONVENS 2025, and contribute to the development of energy- and compute-efficient Large Language Models (LLMs). This shared task is part of the German Evaluation (GermEval 2025) Shared Task Series and will take place from September 10-12, 2025, in Hildesheim, Germany.
The task focuses on subject tagging of technical records from the TIBKAT collection of the Leibniz Information Centre for Science and Technology (TIB) using LLMs. Participants must develop solutions that can process technical documents in both German and English, leveraging bilingual language modeling.
The shared task comprises two subtasks: (1) Multi-Domain Classification of Library Records and (2) Large-scale Multilabel Subject Indexing of Library Records. Participants are encouraged to explore strategies that improve tagging performance while minimizing energy consumption and inference time.
Key dates: training data will be released on March 8, 2025; system submissions are due June 2, 2025; evaluation ends on June 27, 2025; and the paper submission deadline is July 7, 2025.
For more information, visit the 2nd LLMs4Subjects Shared Task website and the KONVENS 2025 website.
Tags: LLMs4Subjects, GermEval 2025, KONVENS 2025, Large Language Models, Energy-Efficient AI, Compute-Efficient AI, Subject Tagging, Technical Library, Bilingual Language Modeling