Title: "Contextualization Distillation from Large Language Model for Knowledge Graph Completion"
Method: Contextualization Distillation
Related model: CSProm-KG
Repository: https://github.com/Li0406/Contextulization-Distillation
Description: "Contextualization Distillation leverages LLMs to generate high-quality, context-rich descriptions for KG triplets. These LLM-generated contexts are then used in auxiliary tasks (reconstruction or contextualization) to train smaller KGC models, improving their performance on knowledge graph completion. The core idea is that the LLM augments the training data and signal for the KG completion task."
Baselines: GenKGC, KG-BERT, KG-S2S
Generated: 2026-02-26T15:35:31.208Z
Record: LLM-KG assessment for paper 10.48550/arXiv.2402.01729
Signature algorithm: RSA
Public key: MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwNz2QK3SEifno78S7+48zUB0xpTex3mAzW73ZimHqNcdEMU5/apslrGrTHGFAt/Chocgo++r6JQp5ygY7NyJHGWdaIqnt85pjX4PbNfLAvapyUO00qZP34fY61w4eZ9UMtleWEsmZKRtQPyJ8ODl46i/rfPuZlcJGpM9Nmy5mpGWuepqIEvF4a/t7pLVeCEDFSYXT+yaiygt6ynIK5f7TtEDhZpeUf/Q74WhMPJXm4yTU/hqOX4IW+50kWHNArGGZwUaXwzyG6M3Zd6UMModryGkLqS4H/MSE3ZA1Ylnms7BfWLEXhMWlaKi6HRV4nGRDLhxVSi9LSRi3LWKLhNIIQIDAQAB
Signature: E/rA2axSbK1HsAt5dwcPr78KUtWOaaxaeirpsiTBxSNnLFhUfkMdkCKnA3CZf4/5/i9QGKCts1TD/b9FOCSqhNzNj5dSbGctKN801FyaIcbN3Ik8mMu7ivoMwcCwsbwf8GiGYAi8Rify3nRow77UOAwwz4J+K+Fa9gtDgjQ5GiQhxCYI1jaEUnhBgnwCp0X3rkxwKNC+I62ff8k/vptFIbpr9xnDyQoNpLOuTtPnjOyFGG73NFM2ZLZVZYq7whUp6LltNGnMGql0UIw6F6pxoK02mVi4rXcLF7ZRARskZl9J6cdCnC505OtqtbeR7PRscu1lEJxEgfGQnqI0bEpEgw==
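The distillation pipeline summarized in the description can be sketched as follows. This is a minimal illustration under assumptions, not the authors' actual implementation: `verbalize`, `mock_llm`, and `build_distillation_corpus` are hypothetical names, and the LLM call is mocked so the sketch is self-contained.

```python
# Hypothetical sketch of Contextualization Distillation.
# A real system would call an actual LLM; here a stub stands in for it.

def verbalize(triplet):
    """Turn a bare KG triplet into a prompt asking the LLM for context."""
    head, relation, tail = triplet
    return f"Describe the fact: {head} {relation} {tail}."

def mock_llm(prompt):
    """Stand-in for an LLM call that returns a context-rich description."""
    return prompt.replace("Describe the fact:", "Context:")

def build_distillation_corpus(triplets):
    """Pair each triplet with its LLM-generated context; a smaller KGC
    model would then train on this corpus via an auxiliary task such as
    reconstructing the context or the triplet from it."""
    corpus = []
    for t in triplets:
        context = mock_llm(verbalize(t))
        corpus.append({"triplet": t, "context": context})
    return corpus

corpus = build_distillation_corpus([("Paris", "capital_of", "France")])
```

The auxiliary training step itself (reconstruction or contextualization loss) is deliberately omitted; the sketch only shows how LLM-generated contexts augment the training data for the downstream KGC model.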