Unconventional AI sets May 15 deadline for grant pre-proposals
Unconventional AI announced May 15 as the final date to submit pre-proposals for its Unconventional Grant. The program targets AI efficiency ideas that move beyond scaling existing models. Recent submissions explored computation as dynamics, in-memory and in-physics compute, data-movement-minimizing architectures, and abstractions beyond linear algebra. CEO Naveen Rao posted a reminder to applicants. Researcher Michael Carbin highlighted the opportunity for efficiency work. Winners will be named shortly after the cutoff, with organizers emphasizing the need for fundamentally different data representation and computation methods.
If you were considering submitting an application for the Unconventional Grant, please get on it! We're closing submissions tomorrow! We'll announce the winners soon and are excited to see the work that comes from it.
Tomorrow, May 15, is the final day to submit pre-proposals for the Unconventional Grant. Over the past several weeks, we’ve seen proposals spanning:
• computation as dynamics
• in-memory and in-physics compute
• architectures that minimize data movement
• new abstractions beyond linear algebra
Many converge on the same intuition: meaningful efficiency gains in AI will not come from scaling existing approaches alone, but from fundamentally different ways of representing and computing. We are looking for technically grounded ideas that challenge assumptions across hardware, systems, and learning. We’re not looking for taller ladders to the moon. We’re looking for rockets. https://unconv.ai/blog/unconventional-grant-final-call-for-proposals/
Great opportunity to fund work on efficiency that spans the gamut from conventional to unconventional.