Hyperbolic Recommender Systems
How exotic geometries can have more practical use cases than you might think
We’re typically used to imagining machine learning parameters acting in Euclidean space. High-dimensional Euclidean space, true, but Euclidean space nonetheless. For thousands of years it was assumed that the world behaving according to the rules of Euclidean geometry was an inalienable truth.
Of course, much later we found that spacetime can deviate from this Platonic ideal, even if for most humans it does so at scales either too small (the subatomic scale) or too large (significant fractions of the size of the observable universe) to be meaningful in everyday life.
With this axiom of the universality of Euclidean geometry challenged, why continue with Euclidean geometry as the default in machine learning? What can we gain from switching to hyperbolic/Lobachevskian geometry?
Put simply, hyperbolic/Lobachevskian geometry is based on the same fundamental premises as Euclidean geometry, except for the axiom of parallelism: through a point not on a given line, there is not one parallel line but infinitely many. One consequence is that the amount of space grows exponentially with distance from any point, which is exactly the growth pattern of trees and hierarchies.
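To make this concrete, here is a minimal sketch (not from any particular library) of the geodesic distance in the Poincaré ball, one standard model of hyperbolic space. Points live inside the unit ball, and distances blow up as points approach the boundary:

```python
import math

def poincare_distance(u, v):
    """Geodesic distance between two points in the Poincare ball model.

    u and v are sequences of floats with Euclidean norm strictly < 1.
    Formula: d(u, v) = arccosh(1 + 2*|u - v|^2 / ((1 - |u|^2) * (1 - |v|^2)))
    """
    sq_norm = lambda x: sum(xi * xi for xi in x)
    sq_diff = sq_norm([ui - vi for ui, vi in zip(u, v)])
    denom = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + 2.0 * sq_diff / denom)

# The same Euclidean step costs far more hyperbolic distance near the boundary:
near_origin = poincare_distance((0.0, 0.0), (0.5, 0.0))    # ~1.10
near_boundary = poincare_distance((0.0, 0.0), (0.9, 0.0))  # ~2.94
```

This boundary stretching is why hierarchies embed so naturally: the root sits near the origin, and each successive generation of children gets exponentially more "room" toward the edge of the ball.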
For a more intuitive feel for what living in hyperbolic space would be like, see the game Hyperbolica and Greg Egan’s novel Dichronauts.
References
The overwhelming majority of my guidance for these 3 sections came from the book
Cited as:
@article{mcateer2021hrs,
title = "Hyperbolic Recommender Systems",
author = "McAteer, Matthew",
journal = "matthewmcateer.me",
year = "2021",
url = "https://matthewmcateer.me/blog/hyperbolic-recommender-systems/"
}
If you notice mistakes and errors in this post, don’t hesitate to contact me at [contact at matthewmcateer dot me] and I will be very happy to correct them right away! Alternatively, you can follow me on Twitter and reach out to me there.
See you in the next post 😄