
Poincaré disk graph theory

Following prior work, our hyperbolic spaces of choice are the Poincaré disk, in which all points lie in the interior of the unit disk in two dimensions, and the Poincaré ball, its higher-dimensional cousin. We also investigated the advantages of embedding structured data in hyperbolic space for certain tasks within natural language processing and relationship prediction using knowledge bases. We built a scalable PyTorch implementation using the insights we gained from our exploration. Hyperbolic space is a beautiful and sometimes weird place. We hope our effort inspires further development of techniques for constructing hyperbolic embeddings and incorporating them into more applications.
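For concreteness, the distance between two points u and v of the Poincaré ball is given by d(u, v) = arcosh(1 + 2‖u − v‖² / ((1 − ‖u‖²)(1 − ‖v‖²))). A minimal NumPy sketch (the function name is ours, not from the paper's codebase):

```python
import numpy as np

def poincare_dist(u, v):
    """Hyperbolic distance between two points strictly inside the unit ball."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_dist / denom)

# Distances blow up near the boundary: the origin is "far" from points
# whose Euclidean norm approaches 1.
print(poincare_dist([0.0, 0.0], [0.5, 0.0]))    # ~1.0986 (= 2 artanh 0.5 = ln 3)
print(poincare_dist([0.0, 0.0], [0.999, 0.0]))  # ~7.60
```

The exploding denominator near the boundary is also where the precision trade-offs discussed in the paper come from: representing large hyperbolic distances requires Euclidean coordinates very close to norm 1.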


Hyperbolic embeddings have captured the attention of the machine learning community through two exciting recent proposals. The motivation is to combine structural information with continuous representations suitable for machine learning methods. One example is embedding taxonomies (such as Wikipedia categories, lexical databases like WordNet, and phylogenetic relations). The big goal when embedding a space into another is to preserve distances and more complex relationships. It turns out that hyperbolic space can better embed graphs (particularly hierarchical graphs like trees) than is possible in Euclidean space. Even better, angles in the hyperbolic world are the same as in Euclidean space, suggesting that hyperbolic embeddings are useful for downstream applications (and not just a quirky theoretical idea). In this post, we describe the exciting properties of hyperbolic space and introduce a combinatorial construction (building on an elegant algorithm by Sarkar) for embedding tree-like graphs. In our work, we extend this construction with tools from coding theory. We also solve what is called the exact recovery problem (given only distance information, recover the underlying hyperbolic points). These ideas reveal trade-offs involving precision, dimensionality, and fidelity that affect all hyperbolic embeddings.
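A quick numerical illustration (our own, not code from the paper) of why trees fit so well: for two points at the same hyperbolic distance from the origin with a fixed angle between them, the direct distance d(u, v) approaches d(u, 0) + d(0, v) minus a constant that depends only on the angle (ln 2 for a right angle). Shortest paths effectively route "through" the origin, exactly as shortest paths in a tree route through a common ancestor:

```python
import math

def poincare_dist(u, v):
    # Poincaré-disk distance: arcosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2)))
    sq_dist = sum((a - b) ** 2 for a, b in zip(u, v))
    denom = (1 - sum(a * a for a in u)) * (1 - sum(b * b for b in v))
    return math.acosh(1 + 2 * sq_dist / denom)

origin = (0.0, 0.0)
for t in (0.9, 0.99, 0.999):
    u, v = (t, 0.0), (0.0, t)  # two points at a right angle, far from the origin
    through_origin = poincare_dist(u, origin) + poincare_dist(origin, v)
    direct = poincare_dist(u, v)
    print(t, through_origin - direct)  # gap shrinks toward ln 2 ~ 0.693
```

In Euclidean space, by contrast, the detour through the origin costs a factor of √2 here no matter how far out the points are, which is one way to see why trees distort badly in flat space.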

Valuable knowledge is encoded in structured data such as carefully curated databases, graphs of disease interactions, and even low-level information like hierarchies of synonyms. Embedding these structured, discrete objects in a way that can be used with modern machine learning methods, including deep learning, is challenging. Fundamentally, the problem is that these objects are discrete and structured, while much of machine learning works on continuous and unstructured data. Recent work proposes using a fancy-sounding technique, hyperbolic geometry, to encode these structures. This post describes the magic of hyperbolic embeddings, and some of our recent progress solving the underlying optimization problems. We cover a lot of the basics, and, for experts, we show:

  • a simple linear-time approach that offers excellent quality (0.989 in the MAP graph reconstruction metric on WordNet synonyms, better than previously published approaches, with just two dimensions!),
  • how to solve the optimal recovery problem and dimensionality reduction (called principal geodesic analysis), and
  • trade-offs and theoretical properties for these strategies.

These give us a new, simple, and scalable PyTorch-based implementation that we hope people can extend!
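For readers unfamiliar with the MAP graph reconstruction metric: for each node, rank every other node by embedding distance and measure how highly the node's true graph neighbors appear in that ranking. A toy sketch of one common formulation (function and variable names are ours; real evaluations run over WordNet-scale graphs):

```python
def mean_average_precision(nodes, neighbors, dist):
    """MAP for graph reconstruction: for each node, rank all other nodes by
    embedding distance and average the precision at each true neighbor's rank."""
    ap_scores = []
    for u in nodes:
        ranked = sorted((w for w in nodes if w != u), key=lambda w: dist(u, w))
        hits, precisions = 0, []
        for rank, w in enumerate(ranked, start=1):
            if w in neighbors[u]:
                hits += 1
                precisions.append(hits / rank)
        ap_scores.append(sum(precisions) / len(neighbors[u]))
    return sum(ap_scores) / len(ap_scores)

# Toy path graph 0-1-2-3 with a perfect one-dimensional "embedding" x(i) = i:
nodes = [0, 1, 2, 3]
neighbors = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(mean_average_precision(nodes, neighbors, lambda u, w: abs(u - w)))  # 1.0
```

A MAP of 1.0 means every node's graph neighbors are its nearest points in the embedding, so 0.989 in two dimensions indicates the embedded distances almost perfectly recover the graph structure.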


Hyperbolic Embeddings with a Hopefully Right Amount of Hyperbole, by Chris De Sa, Albert Gu, Chris Ré, and Fred Sala. Check out our paper on arXiv, and our code on GitHub!









