
Behavioral Mapping System of Animal and Robot Locomotion
Jun. 2018 - Aug. 2018
Terradynamics Lab, Johns Hopkins University

Figure: The 2-D behavioral mapping result of a robot crossing the beams

Natural terrain is uneven, heterogeneous, and dynamically changing: properties such as geometry, stiffness, and friction vary in both space and time. Yet even in such complex terrain, animals move robustly by switching among locomotor modes. For example, a cockroach traverses grass by combining modes such as pushing against the grass, climbing, and running over it.

A frequent assumption in behavioral science is that most of an animal's activities can be described by a small set of stereotyped motifs. A method for mapping an animal's actions that relies only on the underlying structure of its postural movement data to organize and classify behaviors is therefore important for revealing the low-dimensional nature of animal motion.

I developed a motion-mapper framework in MATLAB, based on the method of Gordon J. Berman, that can process 1) kinematic data of robot/animal motion and 2) videos of robot/animal motion. The framework outputs a 2-D embedding space that represents the robot's motion; regions of high point density in this space likely correspond to stereotyped behaviors of the robot.
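As a hedged illustration of this density-based reading of the map (not the exact estimator used in the framework), a Gaussian kernel density estimate over a synthetic 2-D embedding can locate its densest region; all data and names below are made up for illustration:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Synthetic embedding: two clusters standing in for stereotyped behaviors.
cluster_a = rng.normal(loc=(-3.0, 0.0), scale=0.4, size=(200, 2))
cluster_b = rng.normal(loc=(3.0, 1.0), scale=0.4, size=(200, 2))
points = np.vstack([cluster_a, cluster_b])

# Kernel density estimate over the embedding plane.
kde = gaussian_kde(points.T)

# Evaluate the density on a grid and take its peak as a candidate
# stereotyped-behavior region.
xs, ys = np.meshgrid(np.linspace(-5, 5, 60), np.linspace(-4, 4, 60))
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
peak = np.unravel_index(np.argmax(density), density.shape)
print(xs[peak], ys[peak])  # lands near one of the cluster centers
```

A full pipeline would typically segment the density map into regions (e.g. by watershed) rather than take a single peak, but the idea is the same: dense patches of the embedding are read as behaviors.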

Several signal processing and manifold learning techniques were adopted. Image segmentation and alignment were performed as pre-processing. PCA was used to decompose the images into low-dimensional postural modes. A Morlet continuous wavelet transform was applied to provide a multiple-time-scale representation of the postural mode dynamics. Finally, t-SNE was used to embed the data points into 2-D space.
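The steps above can be sketched in Python as a stand-in for the MATLAB implementation; the frame data, mode count, and wavelet scales below are synthetic and purely illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

def morlet_cwt(x, widths, w0=5.0):
    """Minimal Morlet continuous wavelet transform via direct convolution."""
    out = np.empty((len(widths), len(x)), dtype=complex)
    for i, s in enumerate(widths):
        M = int(min(10 * s, len(x)))
        t = (np.arange(M) - (M - 1) / 2) / s          # time axis at scale s
        wavelet = (np.pi ** -0.25 / np.sqrt(s)
                   * np.exp(1j * w0 * t) * np.exp(-t ** 2 / 2))
        out[i] = np.convolve(x, wavelet, mode="same")
    return out

rng = np.random.default_rng(0)
# Placeholder for segmented, aligned frames: 300 flattened 32x32 images.
frames = rng.standard_normal((300, 1024))

# 1) PCA: compress each frame into a few postural modes.
modes = PCA(n_components=5).fit_transform(frames)     # shape (300, 5)

# 2) Morlet CWT of each mode's time series -> multi-scale spectral features.
widths = np.arange(1, 9)                              # 8 scales (illustrative)
spectra = np.hstack([np.abs(morlet_cwt(modes[:, k], widths)).T
                     for k in range(modes.shape[1])]) # shape (300, 40)

# 3) t-SNE: embed the spectral features into a 2-D behavioral map.
embedding = TSNE(n_components=2, perplexity=30,
                 random_state=0).fit_transform(spectra)
print(embedding.shape)                                # (300, 2)
```

Each frame thus ends up as one point in the 2-D map, positioned by the multi-scale dynamics of its postural modes rather than by raw pixels.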


View Project Source Code