Neural Relation Extraction with Selective Attention over Instances (Reading Notes)

I have read this paper before 😂.

Paper download link

Problem Statement

Distant supervision is inevitably accompanied by the wrong labelling problem, and these noisy data will substantially hurt the performance of relation extraction.

Contribution

  • Compared with existing neural relation extraction models, our model can make full use of all informative sentences of each entity pair.
  • To address the wrong labelling problem in distant supervision, we propose selective attention to de-emphasize those noisy instances.
  • In the experiments, we show that selective attention is beneficial to two kinds of CNN models in the task of relation extraction.

Methodology

The overall model architecture is shown below:

[Figure: overall model architecture]
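Since the notes only include the architecture figure, here is a minimal NumPy sketch of the selective-attention step the paper describes: each sentence in a bag is encoded into a vector x_i by the CNN/PCNN encoder, scored against a relation query vector r via e_i = x_i A r, the scores are normalized with a softmax into weights alpha_i, and the bag representation is the weighted sum of the sentence vectors, so noisy (wrongly labelled) sentences receive small weights. The function name `selective_attention` and the dimension d = 230 are illustrative assumptions for this sketch, not the authors' released code.

```python
import numpy as np

def selective_attention(sent_reprs, query_r, A):
    """Pool a bag of sentence representations with selective attention.

    sent_reprs: (n_sentences, d) sentence embeddings from the CNN/PCNN encoder
    query_r:    (d,) query vector of the target relation
    A:          (d, d) weighting matrix (diagonal in the paper)
    """
    # e_i = x_i A r : matching score between sentence i and the relation
    scores = sent_reprs @ A @ query_r          # shape (n_sentences,)
    # alpha_i = softmax(e_i) : noisy sentences get small weights
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    # bag representation s = sum_i alpha_i * x_i
    return alpha @ sent_reprs                  # shape (d,)

# Toy usage: a bag of 3 sentences with 230-dim sentence features (hypothetical sizes)
rng = np.random.default_rng(0)
d = 230
bag = rng.normal(size=(3, d))
r = rng.normal(size=d)
A = np.eye(d)                                  # identity as a stand-in for the diagonal A
s = selective_attention(bag, r, A)
print(s.shape)                                 # (230,)
```

The bag representation s is then fed to a linear layer plus softmax over relation labels; compared with the at-least-one (max-instance) strategy, this weighted pooling lets every informative sentence contribute while down-weighting mislabelled ones.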

