Entailment Graph Learning with Textual Entailment and Soft Transitivity
Published in Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics, 2022
Recommended citation: Zhibin Chen, Yansong Feng, and Dongyan Zhao. 2022. Entailment Graph Learning with Textual Entailment and Soft Transitivity. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5899–5910, Dublin, Ireland. Association for Computational Linguistics. https://aclanthology.org/2022.acl-long.406.pdf
Typed entailment graphs aim to learn entailment relations between predicates from text and model them as edges between predicate nodes. The construction of entailment graphs usually suffers from severe sparsity and unreliability of distributional similarity. We propose a two-stage method, Entailment Graph with Textual Entailment and Transitivity (EGT2). EGT2 learns local entailment relations by recognizing textual entailment between template sentences formed by typed CCG-parsed predicates. Based on the generated local graph, EGT2 then uses three novel soft transitivity constraints to take the logical transitivity of entailment structures into account. Experiments on benchmark datasets show that EGT2 effectively models transitivity in entailment graphs to alleviate sparsity, and leads to significant improvement over current state-of-the-art methods.
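To give an intuition for the soft transitivity idea, the toy sketch below penalizes triples of predicates where a chain A→B→C has high entailment scores but the direct edge A→C does not. This is only an illustrative sketch, not the loss formulation actually used in the paper; the function name, hinge form, and score matrix are assumptions for demonstration.

```python
import torch

def soft_transitivity_penalty(scores: torch.Tensor) -> torch.Tensor:
    """Toy soft-transitivity penalty over a matrix of entailment scores.

    scores[a, b] is the probability that predicate a entails predicate b.
    For every triple (a, b, c): if a entails b and b entails c, then a
    should entail c. The hinge term below penalizes violations where
    score(a, c) falls short of score(a, b) * score(b, c).
    """
    # chained[a, b, c] = s(a, b) * s(b, c), chaining through intermediate b
    chained = scores.unsqueeze(2) * scores.unsqueeze(0)
    # direct[a, b, c] = s(a, c), broadcast over the intermediate dimension
    direct = scores.unsqueeze(1)
    # Only violated triples (chained > direct) contribute to the penalty
    return torch.clamp(chained - direct, min=0).mean()

# Usage: scores would come from a local entailment model (e.g. an NLI scorer)
local_logits = torch.randn(5, 5, requires_grad=True)
loss = soft_transitivity_penalty(torch.sigmoid(local_logits))
loss.backward()
```

In this sketch the penalty is differentiable, so it could be combined with a local entailment objective during graph learning; the paper's actual constraints and weighting may differ.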