To avoid the problem caused by low-frequency entity-relation pairs, our MBS uses the estimated probabilities from a trained model $\mathbf{\theta}'$ to calculate frequencies for each triplet and query. By using $\mathbf{\theta}'$, the NS loss in KGE with MBS is represented as follows: \begin{align} &\ell_{mbs}(\mathbf{\theta};\mathbf{\theta}') \nonumber \\ =&-\frac{1}{|D|}\sum_{(x,y) \in D} \Bigl[A_{mbs}(\mathbf{\theta}')\log(\sigma(s_{\mathbf{\theta}}(x,y)+\gamma))\nonumber\\ &+\frac{1}{\nu}\sum_{y_{i}\sim p_n(y_{i}|x)}^{\nu}B_{mbs}(\mathbf{\theta}')\log(\sigma(-s_{\mathbf{\theta}}(x,y_i)-\gamma))\Bigr], \end{align}
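The loss above can be sketched numerically. This is a minimal NumPy illustration, not the paper's implementation: the MBS weights $A_{mbs}(\mathbf{\theta}')$ and $B_{mbs}(\mathbf{\theta}')$ are taken as precomputed per-example arrays (how they are derived from the auxiliary model $\mathbf{\theta}'$ is outside this sketch), and the function name `mbs_ns_loss` is our own.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mbs_ns_loss(pos_scores, neg_scores, A, B, gamma=9.0):
    """Sketch of the NS loss with model-based subsampling (MBS) weights.

    pos_scores : (|D|,)    scores s_theta(x, y) for positive triplets
    neg_scores : (|D|, nu) scores s_theta(x, y_i) for nu negatives
                           drawn from the noise distribution p_n(y_i | x)
    A, B       : (|D|,)    subsampling weights A_mbs(theta'), B_mbs(theta'),
                           assumed precomputed from the trained model theta'
    gamma      : margin term added to the scores
    """
    nu = neg_scores.shape[1]
    # Positive term: A_mbs * log sigma(s(x, y) + gamma)
    pos_term = A * np.log(sigmoid(pos_scores + gamma))
    # Negative term: (1 / nu) * sum_i B_mbs * log sigma(-s(x, y_i) - gamma)
    neg_term = (B[:, None] * np.log(sigmoid(-neg_scores - gamma))).sum(axis=1) / nu
    # Average over the training set D, with an overall minus sign
    return -(pos_term + neg_term).mean()
```

With uniform weights ($A = B = 1$) the expression reduces to the standard NS loss, so MBS can be read as a per-example reweighting of both the positive and the negative terms.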
Source: Model-based Subsampling for Knowledge Graph Completion
| Task | Papers | Share |
|---|---|---|
| Management | 1 | 14.29% |
| Speech Enhancement | 1 | 14.29% |
| Super-Resolution | 1 | 14.29% |
| Model Compression | 1 | 14.29% |
| Graph Embedding | 1 | 14.29% |
| Knowledge Graph Completion | 1 | 14.29% |
| Knowledge Graph Embedding | 1 | 14.29% |