NYSDY
I just want to be a bit closer to the best.
Neural Relation Extraction with Selective Attention over Instances (Reading Notes)
I have read this paper before 😂. Paper download link · Problem Statement: Distant supervision inevita…
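The core idea of the paper is bag-level selective attention: within a bag of sentences that mention the same entity pair, each sentence representation is weighted by how well it matches a relation query vector, and the weighted sum becomes the bag representation. A minimal numpy sketch of that pooling step (simplified to a dot-product score, i.e. taking the paper's bilinear matrix as the identity; all shapes here are illustrative assumptions):

```python
import numpy as np

def selective_attention(sentence_reps, relation_query):
    """Selective attention over instances (sketch): weight each
    sentence vector x_i by its match with the relation query r,
    then pool the bag as s = sum_i alpha_i * x_i."""
    # scores e_i = x_i . r  (simplified: bilinear matrix A = I)
    scores = sentence_reps @ relation_query
    # softmax over the sentences in the bag (numerically stabilized)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ sentence_reps

rng = np.random.default_rng(0)
bag = rng.normal(size=(5, 8))   # 5 sentences, 8-dim representations
r = rng.normal(size=8)          # relation query vector (assumed learned)
s = selective_attention(bag, r)  # bag representation, shape (8,)
```

Noisy sentences that poorly match the relation receive small weights, which is how the method softens the wrong labels produced by distant supervision.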
Learning as the Unsupervised Alignment of Conceptual Systems (Reading Notes)
I did not fully understand this paper. The main idea seems to be that different modalities representing the same concept (text, images, speech, etc.) should have similar distributions, which enables unsupervised concept alignment. This line of thinking is quite…
ERNIE: Enhanced Language Representation with Informative Entities (Reading Notes)
Drawing on BERT, this paper tries to fuse entity information (from TransE) into tokens (single words), aligning entities with tokens via an entity-alignment-style method…
Incorporating Literals into Knowledge Graph Embeddings (Reading Notes)
I read the first two chapters and skimmed the proposed model; it does not seem very valuable. It simply adds literal information to the entity input (which can be incorporated via linear or nonlin…
Learning Knowledge Embeddings by Combining Limit-based Scoring Loss (Reading Notes)
The most important part of this paper is the authors' improvement to the margin-based ranking loss, which involves two hyperparameters $\lambda$ and $\gamma$…
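The improvement the note refers to combines the classic margin-based ranking term with a limit-based term that additionally caps the score of positive triples. A minimal numpy sketch of that combined loss, assuming a distance-style score where lower is better (the hyperparameter names and the exact weighting are my assumptions, not the paper's notation):

```python
import numpy as np

def combined_loss(pos_scores, neg_scores, gamma=1.0, gamma_limit=0.5, lam=1.0):
    """Margin-based ranking loss plus a limit-based term (sketch).
    pos_scores / neg_scores: distances f(h, r, t); lower is better."""
    # ranking term: negatives should score at least gamma above positives
    ranking = np.maximum(0.0, gamma + pos_scores - neg_scores).sum()
    # limit-based term: also bound the absolute score of positives
    limit = np.maximum(0.0, pos_scores - gamma_limit).sum()
    return ranking + lam * limit

pos = np.array([0.2, 0.9])
neg = np.array([1.5, 0.4])
loss = combined_loss(pos, neg)  # ranking 1.5 + limit 0.4 = 1.9
```

The point of the extra term is that the plain ranking loss only constrains the *gap* between positive and negative scores; the limit term forces positive triples to be good in absolute terms as well.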
Knowledge Graph Embedding by Translating on Hyperplanes阅读笔记 Knowledge Graph Embedding by Translating on Hyperplanes阅读笔记
作为trans系列经典文献,必读。文章主要精华在于这种超平面想法的由来解决了同一实体的多关系问题。 Authors proposed Tr
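The hyperplane idea can be sketched directly from the paper's formulation: each relation gets a hyperplane with unit normal $w_r$ and a translation vector $d_r$ on that hyperplane; head and tail are first projected onto the hyperplane, then the usual translation check is applied. A small numpy sketch (the toy vectors below are illustrative assumptions):

```python
import numpy as np

def transh_score(h, t, d_r, w_r):
    """TransH scoring sketch: project h and t onto the relation's
    hyperplane (unit normal w_r), then score the translation d_r."""
    w_r = w_r / np.linalg.norm(w_r)   # keep the normal a unit vector
    h_perp = h - (w_r @ h) * w_r      # projection onto the hyperplane
    t_perp = t - (w_r @ t) * w_r
    return np.linalg.norm(h_perp + d_r - t_perp) ** 2

# toy example: the projection discards the component of t along w_r,
# so h_perp + d_r lands exactly on t_perp and the score is 0
h = np.array([1.0, 0.0])
t = np.array([0.0, 1.0])
w = np.array([0.0, 1.0])
d = np.array([-1.0, 0.0])
score = transh_score(h, t, d, w)  # 0.0
```

Because the projection depends on the relation, the same entity can have different projected representations under different relations, which is exactly how the multi-relation problem mentioned above is handled.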
Attention Is All You Need (Reading Notes)
The Transformer is a model built entirely from attention mechanisms. It has low model complexity and supports parallel computation, making it fast, and it achieved strong results on translation…
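The attention mechanism the note refers to is the paper's scaled dot-product attention, $\mathrm{Attention}(Q,K,V)=\mathrm{softmax}(QK^\top/\sqrt{d_k})\,V$. A minimal numpy sketch (single head, no masking; the toy inputs are illustrative assumptions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # query-key similarities
    # row-wise softmax, numerically stabilized
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                # weighted sum of value vectors

Q = np.eye(2)
K = np.eye(2)
V = np.array([[1.0, 2.0], [3.0, 4.0]])
out = scaled_dot_product_attention(Q, K, V)  # shape (2, 2)
```

Since every output row is just a matrix product over the whole sequence, all positions are computed at once, which is the parallelism (and speed) advantage over recurrent models that the note mentions.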
Graph Neural Networks with Generated Parameters for Relation Extraction (Reading Notes)
This paper applies GNNs to (multi-hop) relational reasoning over unstructured text for relation extraction. Entities extracted from the sentence sequence form a fully connected graph, and an encoding (sequenc…
Triple Trustworthiness Measurement for Knowledge Graph (Reading Notes)
This paper proposes assessing the accuracy of a knowledge graph by computing triple trustworthiness. The model uses a neural network to combine signals from entities (borrowing from Res…