Article overview
SenseBERT: Driving Some Sense into BERT

Authors: Yoav Levine; Barak Lenz; Or Dagan; Dan Padnos; Or Sharir; Shai Shalev-Shwartz; Amnon Shashua; Yoav Shoham
Date: 15 Aug 2019

Abstract: Self-supervision techniques have allowed neural language models to advance the frontier in Natural Language Understanding. However, existing self-supervision techniques operate at the word-form level, which serves as a surrogate for the underlying semantic content. This paper proposes a method to employ self-supervision directly at the word-sense level. Our model, named SenseBERT, is pre-trained to predict not only the masked words but also their WordNet supersenses. Accordingly, we attain a lexical-semantic level language model, without the use of human annotation. SenseBERT achieves significantly improved lexical understanding, as we demonstrate by experimenting on SemEval, and by attaining a state-of-the-art result on the Word in Context (WiC) task. Our approach is extendable to other linguistic signals, which can be similarly integrated into the pre-training process, leading to increasingly semantically informed language models.

Source: arXiv, 1908.05646
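The joint objective the abstract describes (predicting both the masked word and its WordNet supersense) can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the authors' implementation: the class and attribute names are hypothetical, `encoder` stands in for any BERT-like network, and the 45-class output assumes WordNet's lexicographer-file supersenses.

```python
# Minimal sketch of a SenseBERT-style joint pre-training loss (hypothetical
# names, not the authors' code). Assumes `encoder` maps token ids to hidden
# states of shape (batch, seq_len, hidden_dim).
import torch
import torch.nn as nn

class MaskedWordAndSenseLM(nn.Module):
    def __init__(self, encoder, hidden_dim, vocab_size, num_supersenses=45):
        super().__init__()
        self.encoder = encoder
        self.word_head = nn.Linear(hidden_dim, vocab_size)        # masked-word prediction
        self.sense_head = nn.Linear(hidden_dim, num_supersenses)  # supersense prediction
        # Positions that are not masked carry label -100 and contribute no loss.
        self.ce = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, input_ids, word_labels, sense_labels):
        hidden = self.encoder(input_ids)  # (batch, seq_len, hidden_dim)
        word_loss = self.ce(self.word_head(hidden).flatten(0, 1),
                            word_labels.flatten())
        sense_loss = self.ce(self.sense_head(hidden).flatten(0, 1),
                             sense_labels.flatten())
        # Joint self-supervised objective: word-form term plus word-sense term.
        return word_loss + sense_loss
```

The supersense labels come from WordNet rather than from annotators, so the scheme stays self-supervised; for a masked word form with several possible supersenses, the target can be spread over all candidates, which NLTK enumerates via `{s.lexname() for s in wordnet.synsets(word)}` (yielding categories such as 'noun.food').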