Graph Embeddings and PyTorch-BigGraph
- Published 8 Feb 2025
- This video provides an overview of graph embeddings and how PyTorch-BigGraph enables learning embeddings for very large graphs. The key challenge is that storing an embedding vector for every node of a very large graph requires an enormous amount of memory. To solve this, BigGraph uses novel partitioning, distributed execution, and negative sampling algorithms (a minimal sketch of the core training idea follows the links below). I hope this is a decent introduction to graph embeddings and PyTorch-BigGraph. I'm really excited about the upcoming release of the Wikidata Weaviate web demo!
PyTorch-BigGraph (blog post) - / open-sourcing-pytorch-...
PyTorch-BigGraph (paper) - arxiv.org/abs/...
Papers with Code (graph filter) - paperswithcode...
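For a concrete picture of the idea, here is a minimal, hypothetical sketch of training node embeddings with negative sampling on an edge list. This is not the PyTorch-BigGraph API; the toy graph, dimensions, and hyperparameters are assumptions made purely for illustration.

```python
# Minimal sketch (not the PyTorch-BigGraph API): learn node embeddings from an
# edge list with dot-product scoring and random negative sampling.
# The toy graph and all hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

num_nodes, dim = 5, 16
edges = torch.tensor([[0, 1], [1, 2], [2, 3], [3, 4], [4, 0]])  # (src, dst) pairs

emb = nn.Embedding(num_nodes, dim)
opt = torch.optim.Adam(emb.parameters(), lr=0.01)

for epoch in range(100):
    src, dst = edges[:, 0], edges[:, 1]
    neg = torch.randint(0, num_nodes, (len(edges),))  # corrupted destinations

    pos_score = (emb(src) * emb(dst)).sum(dim=1)      # higher = more plausible edge
    neg_score = (emb(src) * emb(neg)).sum(dim=1)

    # Margin ranking loss: positive edges should outscore negatives by at least 1.0
    loss = torch.clamp(1.0 - pos_score + neg_score, min=0).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()

# emb.weight now holds one learned vector per node. PyTorch-BigGraph scales this
# idea up by partitioning the embedding table and swapping partitions in and out
# of memory across workers.
```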
Great presentation. I'm working on drug-target prediction, and it is nice to know that with graph embeddings I'm halfway through my journey. By the way, which font did you use for your slides? They really look nice.
Thanks for such great content.
I am working on multivariate time-series anomaly detection using GNNs, Transformers, and GANs. Do you know of any resources where I can start?
I searched a lot but couldn't find anything other than papers, which are not that useful.
Thanks again
Hey Muhammad, thank you and thanks for the question! Unfortunately, I just recently started working with time-series data myself, so I can't give you a ton of recommendations. But I think a contrastive learning approach would be a good bet for OOD detection in any data domain - maybe check out "Domain Agnostic Contrastive Learning" for a general framework on how to encode your data into a semantic embedding space for this (a rough sketch of the idea follows this thread). Good luck with your project!
@connor-shorten Thanks a lot.
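For anyone curious what the contrastive approach mentioned above might look like in code, here is a minimal, hypothetical InfoNCE-style sketch. The linear encoder, the jitter augmentation, and all shapes are placeholder assumptions, not anything taken from the referenced paper.

```python
# Minimal sketch of a contrastive (InfoNCE-style) objective: two augmented
# "views" of the same sample are pulled together, while all other samples in
# the batch act as negatives. Encoder and augmentation are placeholders you
# would replace for your own time-series windows.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    # z1, z2: (batch, dim) embeddings of two views of the same samples
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature      # pairwise cosine similarities
    targets = torch.arange(z1.size(0))      # the matching view is the "class"
    return F.cross_entropy(logits, targets)

# Toy usage (assumptions): a linear encoder and Gaussian jitter as augmentation
encoder = torch.nn.Linear(32, 64)
x = torch.randn(8, 32)                      # a batch of flattened windows
view1 = x + 0.1 * torch.randn_like(x)
view2 = x + 0.1 * torch.randn_like(x)
loss = info_nce(encoder(view1), encoder(view2))
loss.backward()
```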
Maybe a practical example?