Day 12 - Final presentations and demos, goodbye activities
This final morning started in the lecture room with presentations of the final results of the workshop projects.
Giacomo started things off by reminding people to upload their materials to the CCNW cloud for posterity.
Chiara reminded everyone to credit the workshop and all project partners in anything published later (and it is good to write down their names now so no one gets left out).
Andreas talked about the specially funded neuromorphic hardware group (NIC) at this year's Telluride neuromorphic workshop, and encouraged people to think about applying to participate in 2025.
Jamie Knight showed GeNN running a simulation of the ellipsoid body, developed by Gabriel and XXX and implemented in preliminary form on their robot with its impressive dome sky-polarization sensor.
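The ellipsoid body is commonly modeled as a ring attractor, in which a bump of activity encodes heading. Here is a minimal rate-based sketch of that idea (not their GeNN implementation; the network size, weights, and cue are all made-up illustration values) showing a bump settling at a cued heading and being read out with a population vector:

```python
import numpy as np

N = 32                                   # neurons around the ring (hypothetical size)
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

# Local excitation plus broad inhibition: classic ring-attractor connectivity
J0, J1 = -0.5, 1.0
W = (J0 + J1 * np.cos(theta[:, None] - theta[None, :])) / N

def simulate(heading, steps=200, dt=0.1, tau=1.0):
    """Relax the network with a weak cue at `heading`; return firing rates."""
    r = np.zeros(N)
    cue = 0.5 + 0.5 * np.cos(theta - heading)   # weak directional input
    for _ in range(steps):
        drive = W @ r + cue
        r += dt / tau * (-r + np.maximum(drive, 0.0))   # rectified rate dynamics
    return r

def decode(r):
    """Population-vector readout of the bump position."""
    return np.angle(np.sum(r * np.exp(1j * theta)))

r = simulate(np.pi / 2)
print(round(decode(r), 2))   # bump settles at the cued heading: 1.57
```

In the full circuit, asymmetric connections shift the bump with angular velocity, which is what makes it usable for path integration.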
Next, Gabriel (Gabi) and Thomas talked about SPIXER (spiking event on robot XXXX), implemented on the robot; late last night they ran experiments under starlight on the tennis court (fueled in part by beer). Thomas showed it on his laptop: "in simulation it works really well"... they will show a demo later.
Patrick showed some results on how ants might represent path integration when they are blown off their path. At the workshop he used his previous environment, a grid with four distinguishable walls, and showed how snapshots can be integrated. Inspired by the CML from Wolfgang's group, he worked on a first cut of using such a method for future work. He also has a plan to distinguish whether the ant has place fields or integrates snapshots.
Filippo talked about their project to study efficient spike coding using a basis set of wavelets and a single spike per neuron. They also tried to implement it on DYNAP but didn't get far with that part.
Jesse and Stein showed results of learning drone pose from events and motor RPMs, finally finished at 2:30am last night. So it works! Next step is to put it on the drone!
Elvin, Paulo and Ton showed ODORINC, which incrementally learns odors from a 5-element odor sensor (three nonspecific and two more specific elements). They connected it to Loihi and got a first implementation running; demo to follow later.
Thomas Nowotny showed results from a 3-layer network processing the Heidelberg TIDIGITS on Loihi. In floating point it got 89% correct, but the SNN on Loihi so far got only 5%; debugging needed!
Valentin, Pouya and Steve showed results on learning a body schema, using a robot arm to learn a mapping from reaching movements to a prototype touchpad surface; mesmerizing demo later.
Sanja and Felix showed first results of implementing an improved SNN for gesture recognition on Speck, the SynSense DVS+SNN chip. In software they get 90% accuracy, but so far on Speck it does not really work; at least they got it running now.
Felix showed an EI-clustered network running on BrainScaleS-2 hardware.
Julien and Jimmy showed their DelGrad work, targeting Maryada's suggestion to try Morse code decoding (NeuroMorsick). They will show a demo later.
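For reference, the task itself: Morse code maps dot/dash tokens to letters, which DelGrad learns end-to-end with trainable spike delays. A plain lookup decoder (symbol table truncated to a few letters, purely for illustration) shows what the trained network must reproduce:

```python
# Minimal reference decoder for the Morse task; the project learns this
# mapping with spiking delays, while here it is just a lookup table.
MORSE = {'...': 'S', '---': 'O', '-.-.': 'C', '.-': 'A', '.--.': 'P'}

def decode(msg):
    """Decode space-separated dot/dash tokens into letters."""
    return ''.join(MORSE[token] for token in msg.split())

print(decode('... --- ...'))   # SOS
```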
Jean, Muhammad Aitsam and Gauvri showed off the results of the event-camera workgroup, along with all the participants and their scores in their adaptation of the Stroop task.
Alessio, Yannick and Sebastian showed results, started last year, on using BS2's multicompartment capabilities and their positive-feedback abilities to implement more classes of bursting neural behavior. This year they implemented a feedback controller that stabilizes the bursting behavior, a simple form of homeostasis: in response to a big perturbation, the neuron could recover its original bursting behavior.
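Such homeostatic stabilization can be caricatured as proportional feedback: a slow variable is nudged whenever a measured burst statistic deviates from its target. The plant model, target, and gain below are invented purely for illustration, not their BS2 controller:

```python
def homeostat(burst, g, target=4.0, eta=0.05):
    """Proportional homeostatic update: bursting above target strengthens
    the adaptation variable g, which in turn damps bursting."""
    return g + eta * (burst - target)

def plant(g):
    """Toy neuron model: burst measure shrinks monotonically as g grows."""
    return 8.0 / (1.0 + g)

g = 0.0                       # start far from equilibrium (a big perturbation)
for _ in range(200):
    g = homeostat(plant(g), g)

print(abs(plant(g) - 4.0) < 0.1)   # True: controller settled at the target
```

The fixed point is stable because the feedback gain is small; the same logic underlies biological homeostasis operating on slow conductance-like variables.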
Ed and Eleni showed their results on Matthias Kampa's "slime mold robot". "Controlling a robot with a slime" is actually a cellular neural network: some LED cells are inputs and some are outputs, and a little DNN is trained to interpret these units to produce behavior, e.g. steering and speed.
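A hedged sketch of that architecture, with a Game-of-Life update standing in for the robot's actual cellular dynamics and an untrained random linear readout standing in for the trained DNN (grid size and all parameters are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
GRID = (8, 8)                           # hypothetical LED grid size

def step(grid):
    """One Game-of-Life update (stand-in for the robot's cellular dynamics)."""
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - grid  # neighbor count
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

# Random linear readout standing in for the trained DNN: maps cell states
# to two motor commands
W_out = rng.normal(size=(2, GRID[0] * GRID[1]))

def control(grid):
    """Read out (steering, speed) from the grid, squashed to bounded ranges."""
    s, v = W_out @ grid.ravel()
    return np.tanh(s), 1.0 / (1.0 + np.exp(-v))

grid = rng.integers(0, 2, GRID)
grid[0, :] = 1                          # "input" row driven by a sensor
for _ in range(5):
    grid = step(grid)
steering, speed = control(grid)
print(-1 <= steering <= 1 and 0 <= speed <= 1)   # True
```

Training only the readout while leaving the cellular dynamics fixed is the reservoir-computing trick that makes such a system cheap to adapt.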
Gabriel Bena, Ismael Jaras and Mahmoud Akl showed SpiNNaker2 neuron models ("+ useless meta-learning") with a teacher-student approach to meta-learning. Maryeme Ouafoudi, Naresh Ravichandren and Sirine Arfa also got first SpiNNaker2 results.
Jakub Fil talked about the projects involving Ameca (one of the stars of the opera) and thanked everyone for their hard work.
Then we had a coffee break. An amazing list of projects already!
After lunch, there was a great demo session with the following list of live demos:
- Ant robot navigating back to starting point using ellipsoid body ring attractor path integration
- The ODORINC odor robot successfully learning the smell of Tobi's Bulleit bourbon and recalling this memory later
- A ring oscillator running on DYNAP where the rotating pattern could be disrupted by a perturbation of the inhibitory population
- Neuromorsick Morse code decoder
- The Spikey DVS piano
- A lovely cellular automata running on the LED display of the slime mold robot
- A cool demo of a new Ameca capability: using simple human pose estimation measured from the SL camera to generate arm curls and shoulder lifts, plus the ability for Ameca to describe what it sees around itself by panning over the room. Finally, Ameca was asked to tell a joke: "What do neuromorphic engineers do? They build machines with the intelligence of squids"
- A cool demo of the CML learning the mapping between the touchpad (with topography unknown to the machine) and two arm joints, panning to a location on the touchpad selected by a tap
The afternoon and evening were filled with last rounds of swimming, snorkeling and tennis, and a party that combined Eurovision viewing, beach trance music and night snorkeling. The last partygoers came back from the beach long after watching the sun rise. A great conclusion to a fantastic workshop.
On behalf of your blog authors Eleni, Mohammad and Tobi: goodbye until 2025!