
Revealing missing parts of the interactome (1307.3329v1)

Published 12 Jul 2013 in q-bio.MN

Abstract: Protein interaction networks (PINs) are often used to "learn" new biological function from their topology. Since current PINs are noisy, their computational de-noising via link prediction (LP) could improve the learning accuracy. LP uses the existing PIN topology to predict missing and spurious links. Many existing LP methods rely on shared immediate neighborhoods of the nodes to be linked, and as such have limitations. Thus, to comprehensively study which topological properties of nodes in PINs dictate whether the nodes should be linked, we introduce novel sensitive LP measures that overcome the limitations of the existing methods. We systematically evaluate the new and existing LP measures by introducing "synthetic" noise to PINs and measuring how well the different measures reconstruct the original PINs. Our main findings are: 1) LP measures that favor nodes which are both "topologically similar" and have large shared extended neighborhoods are superior; 2) using more network topology often, though not always, improves LP accuracy; and 3) our new LP measures are superior to the existing measures. After evaluating the different methods, we use them to de-noise PINs. Importantly, we manage to improve biological correctness of the PINs by de-noising them, with respect to "enrichment" of the predicted interactions in Gene Ontology terms. Furthermore, we validate a statistically significant portion of the predicted interactions in independent, external PIN data sources. Software executables are freely available upon request.
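The abstract notes that many existing LP methods score candidate links by the shared immediate neighborhoods of the two nodes. A minimal sketch of one such baseline measure, Jaccard similarity of neighborhoods, is shown below; this illustrates the class of methods the paper improves upon, not the paper's own novel measures, and the toy network is a hypothetical example.

```python
from itertools import combinations

def jaccard_scores(adj):
    """Score each non-adjacent node pair by the Jaccard similarity of
    their immediate neighborhoods: |N(u) & N(v)| / |N(u) | N(v)|.
    `adj` maps each node to its set of neighbors."""
    scores = {}
    for u, v in combinations(sorted(adj), 2):
        if v in adj[u]:
            continue  # already linked; we only score candidate links
        union = adj[u] | adj[v]
        if union:
            scores[(u, v)] = len(adj[u] & adj[v]) / len(union)
    return scores

# Hypothetical toy network: c and d share both of their neighbors,
# so the missing link (c, d) receives the top score.
net = {
    "a": {"c", "d"},
    "b": {"c", "d", "e"},
    "c": {"a", "b"},
    "d": {"a", "b"},
    "e": {"b"},
}
ranked = sorted(jaccard_scores(net).items(), key=lambda kv: -kv[1])
```

Measures like this see only one-hop neighborhoods, which is exactly the limitation the paper's "extended neighborhood" measures address.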
