Discriminator-based adversarial networks for knowledge graph completion
Article
Tubaishat, A., Zia, T., Faiz, R., Al Obediat, F., Shah, B. and Windridge, D. 2022. Discriminator-based adversarial networks for knowledge graph completion. Neural Computing and Applications. https://doi.org/10.1007/s00521-022-07680-w
Field | Value |
---|---|
Type | Article |
Title | Discriminator-based adversarial networks for knowledge graph completion |
Authors | Tubaishat, A., Zia, T., Faiz, R., Al Obediat, F., Shah, B. and Windridge, D. |
Abstract | Knowledge graphs (KGs) inherently lack reasoning ability, which limits their effectiveness for tasks such as question answering and query expansion. KG embedding (KGE) is a predominant approach in which proximity between relations and entities in the embedding space is used for reasoning over KGs. Most existing KGE approaches use the structural information of triplets and disregard contextual information, which could be crucial to learning long-term relations between entities. Moreover, KGE approaches mostly use discriminative models, which require both positive and negative samples to learn a decision boundary. KGs, by contrast, contain only positive samples, necessitating that negative samples are generated by replacing the head/tail of predicates with randomly chosen entities. Such negatives are thus usually implausible and easily discriminable from positive samples, which can prevent the learning of sufficiently robust classifiers. To address these shortcomings, we propose to learn contextualized KGE using pretrained adversarial networks. We treat multi-hop relational paths (mh-RPs) as textual sequences for competitively learning a discriminator-based KGE against a negative mh-RP generator. We use a pre-trained ELECTRA model and feed it with relational paths. We employ a generator to corrupt randomly chosen entities with plausible alternatives and a discriminator to predict whether an entity is corrupted or not. We perform experiments on multiple benchmark knowledge graphs, and the results show that our proposed KG-ELECTRA model outperforms BERT in link prediction. |
Keywords | Knowledge Graph Completion; Pretrained Language Model; Transformer Model |
Sustainable Development Goals | 9 Industry, innovation and infrastructure |
Middlesex University Theme | Creativity, Culture & Enterprise |
Publisher | Springer |
Journal | Neural Computing and Applications |
ISSN (print) | 0941-0643
ISSN (electronic) | 1433-3058
Online publication date | 10 Aug 2022
Deposited | 23 Aug 2022
Submitted | 23 May 2022
Accepted | 25 Jul 2022
Copyright statement (accepted author manuscript) | This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use (https://www.springernature.com/gp/open-research/policies/accepted-ma...), but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record is available online at: http://dx.doi.org/10.1007/s00521-022-07680-w |
Digital Object Identifier (DOI) | https://doi.org/10.1007/s00521-022-07680-w |
Web of Science identifier | WOS:000838484500004 |
Language | English |
https://repository.mdx.ac.uk/item/89y7v
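The replaced-entity detection objective described in the abstract can be illustrated with a minimal sketch: verbalise a multi-hop relational path as a textual sequence, corrupt one entity, and let an ELECTRA discriminator score each token as original or replaced. The sketch below uses the off-the-shelf `google/electra-small-discriminator` checkpoint from Hugging Face Transformers; the example path, the corrupted entity, and the path format are illustrative assumptions, not the authors' KG-ELECTRA implementation.

```python
# Minimal sketch of ELECTRA-style replaced-entity detection on a verbalised
# multi-hop relational path. The checkpoint, path text, and corruption below
# are illustrative assumptions, not the authors' KG-ELECTRA setup.
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")
model = ElectraForPreTraining.from_pretrained("google/electra-small-discriminator")
model.eval()

# A multi-hop relational path (mh-RP) written out as a textual sequence.
original_path = "barack obama born in honolulu located in hawaii"
# The same path with one entity swapped by a (hypothetical) generator.
corrupted_path = "barack obama born in paris located in hawaii"

inputs = tokenizer(corrupted_path, return_tensors="pt")
with torch.no_grad():
    # ElectraForPreTraining emits one logit per token; higher values mean the
    # discriminator judges that token to have been replaced.
    logits = model(**inputs).logits.squeeze(0)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"].squeeze(0).tolist())
for token, score in zip(tokens, logits.tolist()):
    print(f"{token:>12}  replaced-score = {score:+.2f}")
```

In the full approach described in the abstract, the generator that produces such corruptions and the discriminator are trained jointly on relational paths drawn from the knowledge graph; the snippet only shows the discriminator's per-token scoring step.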