Results 1 to 10 of about 2,675,724

A Bipartite Graph is All We Need for Enhancing Emotional Reasoning with Commonsense Knowledge [PDF]

open access: yes, 2023
The context-aware emotional reasoning ability of AI systems, especially in conversations, is of vital importance in applications such as online opinion mining from social media and empathetic dialogue systems. Due to the implicit nature of conveying emotions in many scenarios, commonsense knowledge is widely utilized to enrich utterance semantics and ...
arxiv   +1 more source

Logical Reasoning over Natural Language as Knowledge Representation: A Survey [PDF]

open access: yes, arXiv, 2023
Logical reasoning is central to human cognition and intelligence. It includes deductive, inductive, and abductive reasoning. Past research on logical reasoning within AI uses formal language as knowledge representation and symbolic reasoners. However, reasoning with formal language has proved challenging (e.g., brittleness and knowledge-acquisition ...
arxiv  

ExBERT: An External Knowledge Enhanced BERT for Natural Language Inference [PDF]

open access: yes, arXiv, 2021
Neural language representation models such as BERT, pre-trained on large-scale unstructured corpora, lack explicit grounding to real-world commonsense knowledge and are often unable to remember facts required for reasoning and inference. Natural Language Inference (NLI) is a challenging reasoning task that relies on common human understanding of ...
arxiv  

Annotating Implicit Reasoning in Arguments with Causal Links [PDF]

open access: yes, arXiv, 2021
Most of the existing work that focuses on the identification of implicit knowledge in arguments generally represents implicit knowledge in the form of commonsense or factual knowledge. However, such knowledge is not sufficient to understand the implicit reasoning link between individual argumentative components (i.e., claim and premise).
arxiv  

Commonsense Knowledge Reasoning and Generation with Pre-trained Language Models: A Survey [PDF]

open access: yes, arXiv, 2022
While commonsense knowledge acquisition and reasoning have traditionally been core research topics in the knowledge representation and reasoning community, recent years have seen a surge of interest in the natural language processing community in developing pre-trained models and testing their ability to address a variety of newly designed commonsense ...
arxiv  

Neural, Symbolic and Neural-Symbolic Reasoning on Knowledge Graphs [PDF]

open access: yes, arXiv, 2020
Knowledge graph reasoning is a fundamental component supporting machine learning applications such as information extraction, information retrieval, and recommendation. Since knowledge graphs can be viewed as discrete symbolic representations of knowledge, reasoning on knowledge graphs can naturally leverage symbolic techniques.
arxiv  

VQA-GNN: Reasoning with Multimodal Knowledge via Graph Neural Networks for Visual Question Answering [PDF]

open access: yes, arXiv, 2022
Visual question answering (VQA) requires systems to perform concept-level reasoning by unifying unstructured (e.g., the context in question and answer; "QA context") and structured (e.g., knowledge graph for the QA context and scene; "concept graph") multimodal knowledge.
arxiv  

KRISP: Integrating Implicit and Symbolic Knowledge for Open-Domain Knowledge-Based VQA [PDF]

open access: yes, arXiv, 2020
One of the most challenging question types in VQA is when answering the question requires outside knowledge not present in the image. In this work, we study open-domain knowledge: the setting in which the knowledge required to answer a question is not given or annotated, either at training or test time.
arxiv  

GreaseLM: Graph REASoning Enhanced Language Models for Question Answering [PDF]

open access: yes, arXiv, 2022
Answering complex questions about textual narratives requires reasoning over both the stated context and the world knowledge that underlies it. However, pretrained language models (LMs), the foundation of most modern QA systems, do not robustly represent latent relationships between concepts, which is necessary for reasoning. While knowledge graphs (KGs) are
arxiv  

Incorporating Connections Beyond Knowledge Embeddings: A Plug-and-Play Module to Enhance Commonsense Reasoning in Machine Reading Comprehension [PDF]

open access: yes, arXiv, 2021
Conventional Machine Reading Comprehension (MRC) has been well addressed by pattern matching, but commonsense reasoning remains a gap between humans and machines. Previous methods tackle this problem by enriching word representations via pre-trained Knowledge Graph Embeddings (KGE).
arxiv  
