Evolving Spiking Neurons from Wheels to Wings

Dario Floreano, Jean-Christophe Zufferey, Claudio Mattiussi
Autonomous Systems Lab, Institute of Systems Engineering
Swiss Federal Institute of Technology (EPFL), CH-1015 Lausanne, Switzerland
WWW: http://asl.epfl.ch

E-mail: [email protected]

Abstract. We give an overview of the EPFL indoor flying project, whose goal is to evolve autonomous, adaptive, indoor micro-flyers. Indoor flight remains a challenge because it requires miniaturization, energy efficiency, and smart control. This ongoing project consists of developing an autonomous vision-based flying micro-robot, a bio-inspired controller composed of adaptive spiking neurons mapped directly into digital micro-controllers, and a method to evolve such a network without human intervention. This document describes the motivation and methodology used to reach our goal, as well as the results of a number of experiments on vision-based wheeled and flying robots.

1  Issues and Challenges

Flying a small aircraft in a sitting room is probably more challenging than flying in the open sky because the space is small and closed, there may be several obstacles of different shape and texture, and illumination may vary quite strongly within a few meters. Realising an autonomous, indoor flying robot is a formidable challenge that requires novel solutions in mechatronics, energy efficiency, and artificial intelligence. To date, there are no flying vehicles capable of autonomously navigating within a house.

Insects are very good at flying inside a room and therefore represent a rich source of inspiration. A team in Berkeley is attempting to create a miniature flying robot modelled on the wing mechanics and dynamics of flies [1]. However, these micro-mechatronic devices cannot yet fly and do not have sufficient payload for the sensors and microelectronics required by autonomous flight. Therefore, in the first stage of our project at EPFL we aim at building micro-airplanes that can carry microelectronics, sensors, and batteries, equipped with two mechanisms that make insects so successful at flying in diverse and cluttered environments: vision and spiking neural networks.

Vision is a very rich source of information about the environment and is also more energy efficient than other types of sensors used in robotics, such as active infrared sensors, sonar, and laser. Furthermore, the miniaturization trend driven by the demand for multimedia consumer electronics is bringing to the market increasingly smaller and cheaper vision devices. A commercial, fully packaged vision chip, composed of a few hundred photoreceptors and fitted with plastic optics, can weigh less than 0.4 grams.

Biological vision systems deal mainly with spatial and temporal change in the image. Spatial change is given by the relationship among activation values of adjacent pixels measured at the same time; it is used to detect contrast, shapes, and landmarks. Temporal change is given by the relationship among activation values of the same pixels measured over time; it provides information about self-motion, motion of objects, and imminent collision.
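As an illustration of these two kinds of image change, the following sketch (hypothetical Python, not part of the original system) computes spatial contrast and temporal change from a small one-dimensional pixel array of the sort delivered by a linear camera:

```python
import numpy as np

def spatial_contrast(pixels):
    """Spatial change: differences between adjacent pixels at one instant."""
    return np.abs(np.diff(pixels))

def temporal_change(prev_frame, curr_frame):
    """Temporal change: differences of the same pixels across two frames."""
    return np.abs(curr_frame - prev_frame)

# Hypothetical 8-pixel linear-camera frames: a dark-to-light edge that
# shifts one pixel to the right between the two time steps.
frame_t0 = np.array([0.1, 0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.9])
frame_t1 = np.array([0.1, 0.1, 0.1, 0.1, 0.9, 0.9, 0.9, 0.9])

edges = spatial_contrast(frame_t0)            # peaks at the contrast edge
motion = temporal_change(frame_t0, frame_t1)  # non-zero only where the edge moved
```

The spatial signal localizes the contrast edge within a single frame, while the temporal signal is non-zero only at the pixel where the edge moved between frames.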

In biological systems, spatial and temporal information is captured and mapped into motor actions by neuronal networks with evolved architectures and time-dependent dynamics. In man-made systems, there are two major classes of artificial neural networks that can capture spatial and temporal information: Continuous Time Recurrent Neural Networks (CTRNN) [2] and Spiking Neural Networks (SNN) [3]. Some scientists have been trying to unveil the mechanisms of vision-guided behaviour by combining behavioural and neurophysiological analysis with modelling and development on vision-guided mobile robots. Major actors in this field include the teams led by Franceschini [4] at CNRS in Marseilles, by Buelthoff [5] at the Max Planck Institute in Tuebingen, and by Srinivasan [6] at the Australian National University in Canberra. Some of these models can be formalized in terms of CTRNN or as collections of non-linear filters. However, these methods require relatively large memory storage and computational power to handle time constants, synaptic weights, and other parameters and functions.

Spiking neurons have been studied and formalized mainly within the biology-oriented community. In this project, we decided to investigate spiking neurons as candidates for our micro-systems because they communicate by binary events that can easily be mapped into digital micro-controllers. Furthermore, simple leakage and refractoriness in a spiking neuron can provide rich non-linear and time-dependent dynamics. Designing functional spiking networks is still a major challenge, and there are not yet many learning algorithms that can find a suitable set of synaptic connections for a desired behaviour. Therefore, in this project we use artificial evolution to discover minimal networks of spiking neurons coupled to vision sensors and actuators.
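The time-dependent dynamics that leakage and refractoriness provide can be sketched with a minimal discrete-time leaky integrate-and-fire update rule; the parameter values below are illustrative, not those used in the project:

```python
def lif_step(v, inputs, refractory, leak=0.9, threshold=1.0, refrac_steps=3):
    """One discrete-time update of a leaky integrate-and-fire neuron.

    Returns (new_membrane_potential, spike_emitted, refractory_counter)."""
    if refractory > 0:                # ignore all input during refractoriness
        return 0.0, 0, refractory - 1
    v = leak * v + sum(inputs)        # leaky integration of incoming spikes
    if v >= threshold:
        return 0.0, 1, refrac_steps   # fire, reset, start refractory period
    return v, 0, 0

# Drive the neuron with a constant input and record its spike train.
v, spike, refrac = 0.0, 0, 0
spikes = []
for _ in range(6):
    v, spike, refrac = lif_step(v, [0.6], refrac)
    spikes.append(spike)
# The neuron integrates for two steps, fires, then stays silent while refractory.
```

Even this two-parameter neuron (leak and refractoriness) produces an input-history-dependent spike train, which is the non-linear temporal dynamics the text refers to.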

2  From Wheels to Wings

The first stage of the project consisted of assessing the feasibility of evolving networks of spiking neurons for vision-guided robots [7]. To keep things simple, we started our experiments with the miniature mobile robot Khepera equipped with a linear camera.

Figure 1. Evolution of vision-based navigation with the Khepera robot.

The robot was required to navigate as straight and as fast as possible for 40 seconds in a rectangular arena with randomly spaced stripes on the walls (if the stripes are regularly spaced, it is relatively trivial to detect the distance from the walls). A fully recurrent network of 10 spiking neurons connected to the photoreceptors of the robot was genetically encoded and evolved on the physical robot. In particular, this experiment was aimed at studying whether functional behaviours can be achieved by evolving only the connectivity among neurons, but not their synaptic weights (in other words, all existing connections are set to strength 1, and neurons can be either excitatory or inhibitory). Such a network and its genetic encoding require very little memory and computational power. The results showed that artificial evolution could reliably generate, in about 20 generations, robots that navigate without hitting the walls.

The second stage of the project consisted of developing a low-level implementation of the evolutionary spiking network in a PIC™ micro-controller with a few bytes of memory and a few MHz of clock speed. These micro-controllers are a suitable solution for micro-flyers because they require very little power, are extremely small and light, and include most of the circuitry required to interface sensors and actuators.
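A connectivity-only genetic encoding of this kind can be sketched as follows; the bit layout (one sign bit per neuron plus one bit per possible synapse) is an assumption for illustration, not the exact encoding used in the experiments:

```python
import random

N = 10  # fully recurrent network of 10 spiking neurons

def random_genome(n=N):
    """One sign bit per neuron plus one bit per possible synapse (n*n)."""
    return [random.randint(0, 1) for _ in range(n + n * n)]

def decode(genome, n=N):
    """Turn the bit string into signed connections of strength +1 or -1."""
    signs = [1 if bit else -1 for bit in genome[:n]]   # excitatory / inhibitory
    weights = [[0] * n for _ in range(n)]
    for pre in range(n):
        for post in range(n):
            if genome[n + pre * n + post]:             # synapse present?
                weights[pre][post] = signs[pre]        # every synapse has strength 1
    return weights
```

Because each connection costs one bit rather than a multi-byte weight, the whole genome of a 10-neuron network fits in a few dozen bytes, which is what makes on-chip evolution feasible.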

Figure 2. The autonomous micro-robot Alice equipped with microcontroller, infrared active sensors, batteries, and two wheels.

The implementation of a spiking neural network with 8 neurons and 8 input units, of its genetic encoding and fitness computation, and of a steady-state evolutionary algorithm took less than 35 bytes of memory, approximately 500 lines of assembly code, and ran at an update rate of 1 ms, which is comparable to the update speed of biological neural networks. This was achieved by mapping the neural dynamics and genetic operators directly into the architecture and functioning of the digital micro-controller without wasting a single bit. The system was then evaluated on the Alice micro-robot, which is equipped with the same family of PIC micro-controllers, to evolve a navigation and obstacle-avoidance behaviour using the same fitness function described in [8]. It took less than 20 minutes for the robot to develop and retain smooth navigation abilities in a simple maze [9]. However, in these experiments we used active infrared sensors instead of vision, because the vision module was not yet available for the Alice micro-robot.

The third stage of the project consisted of evaluating the evolutionary spiking network for its ability to drive a vision-based blimp in a 5 × 5 meter room. In these experiments, we used the same algorithm developed in stage one for the Khepera experiments described above. The development of the autonomous indoor blimp took significant effort in order to provide it with the technology necessary to carry out evolutionary experiments.
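A steady-state evolutionary algorithm of the kind mentioned above replaces one individual at a time instead of an entire generation, which keeps memory usage minimal. The following sketch shows the basic update, with a toy bit-counting fitness standing in for the robot's behavioural fitness:

```python
import random

def mutate(genome, rate=0.05):
    """Flip each bit independently with a small probability."""
    return [bit ^ (random.random() < rate) for bit in genome]

def steady_state_step(population, fitnesses, fitness_fn):
    """One steady-state update: pick two individuals at random and overwrite
    the worse one with a mutated copy of the better one."""
    i, j = random.sample(range(len(population)), 2)
    if fitnesses[i] < fitnesses[j]:
        i, j = j, i                       # make i the better of the pair
    population[j] = mutate(population[i])
    fitnesses[j] = fitness_fn(population[j])

# Toy fitness (hypothetical stand-in): count the 1-bits in the genome.
fitness_fn = sum
population = [[random.randint(0, 1) for _ in range(20)] for _ in range(6)]
fitnesses = [fitness_fn(g) for g in population]
best_before = max(fitnesses)
for _ in range(500):
    steady_state_step(population, fitnesses, fitness_fn)
```

Because the loser of each pairing is the one overwritten, the best individual is never lost and the best fitness in the population can only stay equal or improve.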

Figure 3. The evolutionary blimp in a room with randomly spaced stripes.

Our blimp is equipped with two propellers for horizontal displacement, one propeller for vertical displacement, one active infrared sensor to detect altitude, a forward-facing linear vision system, 6 antenna-like bumpers (not used in these experiments), a micro-controller, a Bluetooth™ chip for communication with a desktop computer, rechargeable batteries, and an anemometer to estimate forward speed. At this stage, the entire algorithm runs on the desktop PC, which exchanges vision data and motor commands with the blimp every 100 ms.

The evolutionary blimp is asked to move forward as fast as possible for 60 seconds using only visual information (altitude control is provided by an automatic on-board routine). The fitness is proportional to the reading of the anemometer, which is mounted on the front of the robot. A preliminary set of experiments indicated that artificial evolution can generate, in about 20 generations, spiking controllers that drive the blimp around the room [10].

A number of experiments remain to be done with the blimp. These include an experiment where altitude control is left to the evolutionary spiking network and one using the micro-controller implementation that was tested on the Alice micro-robot. These and other experiments are under way at the moment of writing.
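The evaluation of one blimp individual can be sketched as below; the function names for the sensor and motor exchange are hypothetical placeholders for the actual Bluetooth communication, and the structure (100 ms cycle, anemometer-based fitness) follows the description above:

```python
import time

def evaluate_individual(read_vision, send_motors, read_anemometer, controller,
                        duration_s=60.0, period_s=0.1):
    """One fitness trial: exchange sensor data and motor commands with the
    blimp every 100 ms and accumulate the anemometer reading, so that the
    fitness is proportional to forward speed averaged over the trial."""
    total, samples = 0.0, 0
    t_end = time.time() + duration_s
    while time.time() < t_end:
        pixels = read_vision()                # linear-camera frame from the blimp
        left, right = controller(pixels)      # spiking-network motor output
        send_motors(left, right)
        total += read_anemometer()            # fitness signal: forward speed
        samples += 1
        time.sleep(period_s)
    return total / max(samples, 1)
```

In testing, the hardware-facing callbacks can be replaced by stubs, which is also how such a loop can be run against a simulator instead of the physical blimp.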

Figure 4. A prototype of the indoor autonomous flyer.

The fourth stage of the project, currently in progress, is the development of a micro airplane capable of indoor flight. A major requirement for such an airplane is to be slow enough to allow on-board and on-line vision acquisition, network update, and motor control using simple micro-controllers that require little power. Various prototypes have been developed and tested in a wind tunnel [11]. The current prototype, shown in figure 4, weighs 45 grams, has an autonomy of 15 minutes when tele-operated, can fly within a room at walking speed, and is equipped with batteries, a micro-controller, and a Bluetooth™ chip. Although this may not yet be the final model, it already has a payload of 10 grams, which is sufficient for a vision system and the related microelectronics.

3  Future Work

The methodology used to evolve the spiking circuits for the Khepera, Alice, and blimp robots is not applicable to the indoor micro airplane because of its inability to recover from collisions with obstacles. The solution that we currently envisage is to evolve the control circuit in simulation and transfer the evolved individuals to the real airplane. Of course, a straightforward transfer is not going to work, because the difference between a simulated and a physical flyer is likely to be quite large. Therefore, instead of evolving the connectivity of the circuit, we will genetically encode and evolve the plasticity rules and let the spiking circuit develop suitable connection strengths literally on the fly. In previous work, we showed that this method generates circuits that adapt very quickly to the environment where they are located [12]. We also showed that such evolved systems transfer very well from simulated to physical robots (and even across different robotic platforms).

Our previous work on the evolution of plasticity rules was done with conventional neural networks. In that case, the chromosomes encoded four types of plasticity rules, each being a complementary variation of the Hebb rule. These rules will have to be mapped into the temporal domain by taking into account the time difference between pre-synaptic and post-synaptic spikes. Current work on the evolution of plasticity rules for spiking neurons, performed within another project aimed at creating an evolutionary and self-organizing electronic tissue [13], will help us to explore the best way of implementing such plastic circuits on micro-controllers.

Acknowledgements. The authors acknowledge important contributions by Jean-Daniel Nicoud, Cyril Halter, Michael Bonani and Tancredi Merenda for the design of the blimp and micro airplane, and Matthijs van Leeuwen for experiments with the blimp. This work is supported by the Swiss National Science Foundation, grant nr. 620-58049.
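A plasticity rule mapped into the temporal domain typically takes the form of spike-timing-dependent plasticity, where the sign of the weight change depends on the order of pre- and post-synaptic spikes. The following is a generic sketch of such a rule with illustrative constants, not the evolved rules themselves:

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.05, a_minus=0.06, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Spike-timing-dependent weight change: potentiate when the pre-synaptic
    spike precedes the post-synaptic one, depress otherwise; the magnitude
    decays exponentially with the time difference (in ms)."""
    dt = t_post - t_pre
    if dt > 0:
        w += a_plus * math.exp(-dt / tau)    # pre before post: strengthen
    else:
        w -= a_minus * math.exp(dt / tau)    # post before (or with) pre: weaken
    return min(max(w, w_min), w_max)         # keep the weight within bounds
```

In the evolutionary setting described above, parameters such as `a_plus`, `a_minus`, and `tau` (or the choice among several such rules) would be what the genome encodes, while the weights themselves develop during the individual's lifetime.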

References

[1] R. S. Fearing, K. H. Chiang, M. H. Dickinson, D. L. Pick, M. Sitti, and J. Yan, Wing transmission for a micromechanical flying insect, IEEE International Conference on Robotics and Automation, 2000.
[2] R. Beer and J. Gallagher, Evolving Dynamical Neural Networks for Adaptive Behavior, Adaptive Behavior, MIT Press, 1992.
[3] W. Maas and C. Bishop, Pulsed Neural Networks, Cambridge, MA: MIT Press, 1998.
[4] N. Franceschini, J. M. Pichon, and C. Blanes, From insect vision to robot vision, Philosophical Transactions of the Royal Society B, 337, 283-294, 1992.
[5] T. R. Neumann and H. H. Buelthoff, Insect inspired visual control of translatory flight. In J. Keulemen et al. (Eds.), Proceedings of the 6th European Conference on Artificial Life, Berlin: Springer, 2001.
[6] K. Weber, S. Venkatesh, and M. Srinivasan, Insect Inspired Behaviours for the Autonomous Control of Mobile Robots. In From Living Eyes to Seeing Machines, 1997.
[7] D. Floreano and C. Mattiussi, Evolution of Spiking Neural Controllers for Autonomous Vision-based Robots. In T. Gomi (Ed.), Evolutionary Robotics IV, Berlin: Springer, 2001.
[8] D. Floreano and F. Mondada, Automatic Creation of an Autonomous Agent: Genetic Evolution of a Neural Network Driven Robot. In D. Cliff, P. Husbands, J.-A. Meyer, and S. Wilson (Eds.), From Animals to Animats 3: Proceedings of the Third International Conference on Simulation of Adaptive Behavior, Cambridge, MA: MIT Press, 1994.
[9] D. Floreano, N. Schoeni, G. Caprari, and J. Blynel, Evolutionary Bits'n'Spikes. To be published in Proceedings of Artificial Life (ALIFE'02), MIT Press, 2002.
[10] J.-C. Zufferey, D. Floreano, M. van Leeuwen, and T. Merenda, Evolving Vision-based Flying Robots. In Bülthoff, Lee, Poggio, and Wallraven (Eds.), Proceedings of the 2nd International Workshop on Biologically Motivated Computer Vision (BMCV), Berlin: Springer, 2002.
[11] J.-D. Nicoud and J.-C. Zufferey, Toward Indoor Flying Robots. Proceedings of the International Conference on Intelligent Robots (IROS'02), 2002.
[12] J. Urzelai and D. Floreano, Evolution of Adaptive Synapses: Robots with Fast Adaptive Behavior in New Environments. Evolutionary Computation, 9, 495-524, 2001.
[13] D. Roggen, D. Floreano, and C. Mattiussi, A Morphogenetic Evolutionary System: Phylogenesis of the POEtic Tissue. Accepted for publication in Proceedings of the Fifth International Conference on Evolvable Systems (ICES), 2003.
