Leveraging Natural Supervision: Learning Semantic Knowledge from Wikipedia


Too Long; Didn't Read

In this study, researchers exploit rich, naturally-occurring structures on Wikipedia for various NLP tasks.


Author:

(1) Mingda Chen.

CHAPTER 4 - LEARNING SEMANTIC KNOWLEDGE FROM WIKIPEDIA

In this chapter, we describe our contributions to exploiting rich, naturally occurring structures on Wikipedia for various NLP tasks. In Section 4.1, we use hyperlinks to learn entity representations. Unlike most prior work, the resulting models represent entities with contextualized representations rather than a fixed set of vectors. In Section 4.2, we use article structures (e.g., paragraph positions and section titles) to make sentence representations aware of the broader context in which they are situated, leading to improvements across various discourse-related tasks. In Section 4.3, we use article category hierarchies to learn concept hierarchies that improve model performance on textual entailment tasks.
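To give a rough sense of what "natural supervision" from hyperlinks can look like in practice, the sketch below extracts (context sentence, linked entity) pairs from Wikipedia-style markup, the kind of weakly labeled data that entity-representation models can be trained on. This is an illustrative sketch, not the thesis code; the function name extract_entity_pairs, the regex, and the naive sentence splitter are all assumptions made here for demonstration.

# Sketch: turn Wikipedia hyperlinks into (context, entity) training pairs.
# Each [[Target|anchor]] link pairs the surrounding sentence with the entity
# page it points to; this is illustrative, not the authors' pipeline.
import re

WIKILINK = re.compile(r"\[\[([^\]|#]+)(?:\|([^\]]+))?\]\]")

def extract_entity_pairs(wikitext):
    """Yield (context_sentence, entity_title, anchor_text) tuples."""
    entities = []

    def _sub(match):
        title = match.group(1).strip()
        anchor = (match.group(2) or title).strip()
        entities.append((anchor, title))
        return anchor  # replace the link with its surface form

    plain = WIKILINK.sub(_sub, wikitext)

    # Naive sentence split; a real pipeline would use a proper tokenizer.
    for sentence in re.split(r"(?<=[.!?])\s+", plain):
        for anchor, title in entities:
            if anchor in sentence:
                yield (sentence, title, anchor)

if __name__ == "__main__":
    sample = ("[[Barack Obama|Obama]] served as the 44th president of the "
              "[[United States]]. He was born in [[Honolulu]].")
    for pair in extract_entity_pairs(sample):
        print(pair)

Pairs like these come essentially for free from Wikipedia's own markup, which is what makes hyperlinks an attractive source of supervision for entity representations.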

The material in this chapter is adapted from Chen et al. (2019a), Chen et al. (2019b), and Chen et al. (2020a).

This paper is available on arXiv under a CC 4.0 license.
