
John Carmack shares PyTorch squared distance code


John Carmack shared a PyTorch implementation for computing squared Euclidean distances between batch embeddings of shape [batch, latent] and history embeddings of shape [history, latent]. The method uses precomputed vector norms via pow(2).sum and a matrix multiply for cross terms, avoiding large intermediate tensors. Researchers Andreas Kirsch and Lucas Beyer replied, noting its value for k-nearest-neighbor lookups and equivalence to normalized dot-product similarity. Carmack leads AGI research at Keen Technologies.
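Carmack's exact code is not reproduced in the summary, but the description (precomputed squared norms via pow(2).sum plus a matrix multiply for the cross terms) corresponds to the standard expansion ||a − b||² = ||a||² + ||b||² − 2a·b. A minimal sketch under that assumption (the function name is mine, not his):

```python
import torch

def squared_distances(batch, history):
    """Pairwise squared Euclidean distances.

    batch:   [B, D] tensor
    history: [H, D] tensor
    returns: [B, H] tensor with out[i, j] = ||batch[i] - history[j]||^2

    Uses ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, so the [B, H, D]
    broadcasted difference tensor is never materialized.
    """
    b2 = batch.pow(2).sum(dim=1, keepdim=True)   # [B, 1] squared norms
    h2 = history.pow(2).sum(dim=1)               # [H]    squared norms
    cross = batch @ history.T                    # [B, H] dot products
    # Broadcasting: [B, 1] + [H] - 2 * [B, H] -> [B, H]
    return b2 + h2 - 2.0 * cross
```

One caveat worth noting: in low precision the expansion can return tiny negative values for near-identical vectors, so callers sometimes clamp the result at zero before taking a square root.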

Original post

I'm a little disappointed with myself that the high school algebra identity didn't occur to me right away.

7:49 PM · May 14, 2026 · 168.8K Views

Replies

@ID_AA_Carmack yeah and if they are normalized, then dot vs square dist are equivalent, which is a pretty neat party trick.

8:19 PM · May 14, 2026 · 8.2K Views
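The "party trick" in that reply: for unit-norm vectors, ||a − b||² = 2 − 2a·b, so ranking by squared distance and ranking by dot product give identical orderings. A small sketch of that equivalence (variable names are mine):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
q = F.normalize(torch.randn(3, 16), dim=1)   # unit-norm queries
k = F.normalize(torch.randn(10, 16), dim=1)  # unit-norm keys

dots = q @ k.T                # [3, 10] dot products (cosine similarities)
sq_dist = 2.0 - 2.0 * dots    # ||a - b||^2 when both vectors have norm 1

# Nearest-by-distance and nearest-by-dot-product pick the same neighbor.
assert torch.equal(sq_dist.argmin(dim=1), dots.argmax(dim=1))
```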

@ID_AA_Carmack Yeah this is an important optimization for kNN lookups

8:27 PM · May 14, 2026 · 1.3K Views
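The kNN angle: with the expansion trick, the history norms can be computed once and reused across queries, and the distance matrix feeds directly into a top-k selection. A hedged sketch of how that lookup might be wired up (function name and shapes are mine, not from the post):

```python
import torch

def knn_indices(batch, history, k):
    """Indices of the k nearest history rows for each batch row, by
    squared Euclidean distance, using the norm-expansion trick so no
    [B, H, D] intermediate tensor is built."""
    b2 = batch.pow(2).sum(dim=1, keepdim=True)   # [B, 1]
    h2 = history.pow(2).sum(dim=1)               # [H]; cacheable across queries
    d2 = b2 + h2 - 2.0 * (batch @ history.T)     # [B, H] squared distances
    # largest=False selects the k smallest distances.
    return d2.topk(k, dim=1, largest=False).indices
```

Since squared distance is monotonic in distance, there is no need to take a square root before the top-k selection.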
John Carmack shares PyTorch squared distance code · Digg