
Appen independently benchmarks Subquadratic SubQ attention mechanism


Appen independently benchmarked Subquadratic's SubQ attention mechanism and reported state-of-the-art results across four evaluation suites. The evaluation showed a 56.2× wall-clock speedup and a 62.8× FLOP reduction versus FlashAttention-2 at 1M tokens on NVIDIA B200 hardware. Martin Shkreli posted a letter presenting the Appen results on Subquadratic letterhead. Research engineer Will Depue replied that the figures match Subquadratic's own previously published numbers and requested additional independent evaluations.
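For context on the FLOP-reduction claim: the cost of standard attention grows quadratically in sequence length, so the advantage of any subquadratic method widens as the context grows. A back-of-envelope sketch of that scaling (the head dimension and state size below are assumptions for illustration, not values disclosed by Subquadratic or Appen):

```python
# Illustrative only: rough FLOP counts for one attention head, comparing
# standard (quadratic) attention against a generic linear-in-sequence-length
# mechanism. Parameters d and k are assumed, not Subquadratic's actual design.

def quadratic_attention_flops(n: int, d: int) -> int:
    # Q @ K^T and attn @ V: two (n x n x d) matmuls, ~2 FLOPs per multiply-add.
    return 2 * (2 * n * n * d)

def linear_attention_flops(n: int, d: int, k: int) -> int:
    # A generic kernel/state-space-style mechanism whose cost grows as
    # n * d * k for some fixed state size k, instead of n^2.
    return 2 * (2 * n * d * k)

n = 1_000_000   # 1M-token context, as in the reported benchmark
d = 128         # assumed head dimension
k = 4096        # assumed fixed state size for the subquadratic method

ratio = quadratic_attention_flops(n, d) / linear_attention_flops(n, d, k)
print(f"quadratic / linear FLOP ratio at n={n:,}: {ratio:,.0f}x")
```

Under these toy parameters the ratio is simply n / k, so it keeps growing with context length; the point is the scaling behavior, not the specific 62.8× figure, which also depends on kernel efficiency and hardware utilization.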

Original post

@MartinShkreli same numbers they published with, would love to see other evals first

7:58 AM · May 14, 2026


Martin Shkreli @MartinShkreli

real?

2:49 PM · May 14, 2026 · 72.8K Views
Replies

I met with the @subquadratic team yesterday to discuss testing on ARC-AGI

They’ve had a ton of inbound and they’re looking forward to getting verified scores in a few weeks after it calms down

2:58 PM · May 14, 2026 · 4.8K Views

@MartinShkreli There's no LLM judge required for RULER so that's kind of odd?

and yeah, subquadratic attention already exists and it's already faster than flashattention 2 (from 2023).

if they have the goods, they're not showing them

3:20 PM · May 14, 2026 · 20.4K Views
