NYU professor Andrew Gordon Wilson suggests ICML 2010 nostalgia event
Andrew Gordon Wilson of NYU said he misses the rigorous machine learning debates over methods such as MCMC and variational inference. He proposed a nostalgia ICML 2010 event in which AI-generated submissions would face 2010-era scrutiny. Several researchers responded, noting the field's progress via conjecture and refutation, the unexpected rise of deep learning, and the former appeal of topic models.
Sometimes I miss the days when people were passionately fighting about MCMC versus variational methods, or whether posterior tempering is problematic. We should have a nostalgia ICML 2010. You can submit AI slop, but expect a 2010 era reaction. What happened to our field?
To be honest, it was quite clear at the time that the field was ripe for disruption and that there was enormous untapped potential. People's jaws would drop over, like, topic models. It was so early. I just didn't expect deep learning to be the thing that did the disrupting.
Every paper on RL would run experiments on a 20 state POMDP called, like, "the tiger environment" because there was a door that might have a tiger behind it. Not like a video game, just a discrete state labelled "TIGER" with -100 reward. IT WAS SO EARLY.
It progressed via conjecture and refutation.
@_rockt More via catastrophic forgetting.
@andrewgwils i do not miss those days :p
@andrewgwils there may actually be such a conference again (sans slop, I hope)
AABI is dead, long live AABI!
@pfau You say that but topic models were pretty cool.
@pfau Surprising! I thought it was the thing and I still have a chip on my shoulder about how people in ML reacted to it. In the process gave away their voice to people outside ML. At some point ~2012, the internecine warfare/posturing had public spotlight, and it looked so defensive.