I've been deploying some new general control policies on drones, and I got so pissed off with calibrating one of my sensors I just randomized its orientation in the training sim
And the AI can still use it to fly..
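Randomizing the sensor's mounting orientation each episode is essentially domain randomization. A minimal sketch of what that could look like in a training sim (everything here is hypothetical, not the actual setup):

```python
import numpy as np

def random_rotation():
    # Draw a random 3D rotation via QR decomposition of a Gaussian matrix
    # (a standard uniform-rotation sampler; any other would do).
    q, r = np.linalg.qr(np.random.randn(3, 3))
    m = q * np.sign(np.diag(r))
    if np.linalg.det(m) < 0:
        m[:, 0] *= -1  # ensure a proper rotation (det = +1), not a reflection
    return m

def randomize_imu(reading, R):
    # Apply the episode's fixed, unknown-to-the-policy mounting rotation.
    return R @ reading

# Each episode gets a fresh but constant sensor orientation:
R = random_rotation()
accel = np.array([0.0, 0.0, -9.81])  # gravity in the true body frame
obs = randomize_imu(accel, R)        # what the policy actually sees
```

The policy never sees `R` directly, so it has to infer the mounting from how the readings respond to its own actions.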
It learned this odd little calibration behavior where it oscillates in a circle on the roll and pitch axes while very slowly yawing left. Really fascinating
I think the thing that really excites me about robotics is when state "estimation" is directly improved by the behaviour itself. And then the behaviour can be improved by the improved estimation..
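That estimation-behaviour loop is active sensing in miniature. A toy scalar illustration (entirely made up, not the drone's actual estimator): a sensor gain that is unobservable at rest becomes trivially identifiable once the behaviour excites it:

```python
import numpy as np

rng = np.random.default_rng(0)
g_true = 1.7                      # unknown sensor gain (hypothetical)
t = np.arange(200) * 0.01

# Passive: no excitation, the gain never shows up in the data.
u_idle = np.zeros_like(t)
# Active: a small oscillation (the "wobble") excites the sensor.
u_wobble = 0.3 * np.sin(2 * np.pi * 5 * t)

def estimate_gain(u):
    y = g_true * u + 0.01 * rng.standard_normal(len(u))  # noisy readings
    denom = u @ u
    return y @ u / denom if denom > 0 else None  # least-squares fit

estimate_gain(u_idle)    # None: nothing to fit against
estimate_gain(u_wobble)  # close to g_true once the behaviour excites it
```

Once the gain is pinned down, control gets better, which lets the excitation shrink: the loop the post is describing.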
this drone is being flown by a generalist neural network, with compute on the drone. The compute can tolerate any orientation of the sensors. You'll notice it does a little wobble. It's a learned strategy to calibrate everything in its hidden state. This will work on any drone
I'm building the Canadian DJI with a debit card
The compute is very cheap. Cheaper than you can possibly imagine. A dishwasher has enough compute for this.
See the broken sensors on one of the axes? It doesn't matter. The neural network can handle it in the fusion. Just teach it to handle broken sensors
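"Teach it to handle broken sensors" usually means corrupting sensor channels during training so the fusion learns around them. A minimal hypothetical sketch (the failure model and names are mine, not the actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

def break_sensors(obs, p_fail=0.2):
    # With probability p_fail per channel, replace the reading with what a
    # dead sensor might emit (zeros here, for simplicity; stuck values,
    # noise bursts, etc. would be other reasonable failure modes).
    mask = rng.random(obs.shape[0]) >= p_fail
    return obs * mask, mask

obs = np.array([0.1, -9.8, 0.02, 0.5, 0.0, 1.0])  # toy sensor vector
corrupted, mask = break_sensors(obs)
# Train the policy on `corrupted`; it learns to fuse around dead axes.
```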
This is a test of new infrastructure, which drives down inference cost greatly, and therefore drives up the control Hz
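The cost-to-Hz link is just latency arithmetic: the control loop can only run as fast as one inference step completes. A toy sketch with made-up numbers:

```python
def max_control_hz(inference_ms, overhead_ms=0.5):
    # Control rate is bounded by inference latency plus a (hypothetical)
    # fixed budget for sensor I/O and actuation per loop iteration.
    return 1000.0 / (inference_ms + overhead_ms)

max_control_hz(4.5)  # 200.0 Hz
max_control_hz(1.5)  # 500.0 Hz: cheaper inference -> higher control rate
```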
I have policies that can do backflips on command. Give me a week
