We describe detailed protocols for using FLLIT, a fully automated machine learning method for leg claw movement tracking in freely moving Drosophila melanogaster and other insects. These protocols can be used to quantitatively measure subtle walking gait movements in wild type flies, mutant flies and fly models of neurodegeneration.
The Drosophila model has been invaluable for studying neurological function and for understanding the molecular and cellular mechanisms that underlie neurodegeneration. While fly techniques for manipulating and studying neuronal subsets have grown increasingly sophisticated, the richness of the resulting behavioral phenotypes has not been captured in similar detail. Studying subtle leg movements for comparison among mutants requires the ability to automatically measure and quantify rapid leg movements from high-speed recordings. Hence, we developed a machine-learning algorithm for automated leg claw tracking in freely walking flies, Feature Learning-based Limb segmentation and Tracking (FLLIT). Unlike most deep learning methods, FLLIT is fully automated and generates its own training sets without the need for user annotation, using morphological parameters built into the learning algorithm. This article describes an in-depth protocol for carrying out gait analysis using FLLIT. It details the procedures for camera setup, arena construction, video recording, leg segmentation and leg claw tracking. It also gives an overview of the data produced by FLLIT, which include the raw tracked body and leg positions in every video frame, 20 gait parameters, 5 plots and a tracked video. To demonstrate the use of FLLIT, we quantify relevant diseased gait parameters in a fly model of Spinocerebellar ataxia 3.
In the last few decades, neurodegenerative diseases and movement disorders have grown more prevalent in our aging populations. Although our understanding of many neurodegenerative diseases has advanced at the molecular and cellular level, fundamental features of the affected neuronal circuitry underlying disease remain poorly understood. Recently developed behavioral tracking tools1,2,3,4 now allow us to study movement abnormalities in animal disease models in order to identify molecular, cellular and circuit dysregulation underlying disease.
Molecular pathways involved in many neurodegenerative diseases are conserved in the fruit fly Drosophila melanogaster, and Drosophila disease models have helped to elucidate fundamental mechanisms underlying neurodegeneration5,6. We recently showed that fly models of Parkinson’s Disease (PD) and Spinocerebellar ataxia 3 (SCA3) exhibit distinct, conserved gait signatures that resemble those of the respective human diseases1, demonstrating that the fly model can be used to understand circuit mechanisms underlying movement dysfunction in specific movement disorders. The rich and continually growing arsenal of tools in the fly model for targeted manipulation and visualization of neurons at the single gene and single cell level7,8,9,10 makes the fly an ideal model in which to probe the relationship between disease pathways, neuronal circuitry and behavioral phenotypic manifestation in vivo. To enable precise, automated insect gait analysis, we recently developed a machine learning method, Feature Learning-based LImb segmentation and Tracking (FLLIT)1.
FLLIT consists of a fully automated multi-stage algorithm that first segments the leg pixels, which are subsequently used to locate and track the corresponding leg claws. FLLIT employs a boosting algorithm for segmentation, in contrast to the deep learning algorithms used in recent work2,3. There are some similarities with convolutional neural networks in that, for both frameworks, feature extraction is performed automatically through learned convolutional kernels. The first step in FLLIT uses morphological operations (edge detection and skeletonization) to automatically generate high-confidence positive (pixels on the legs) and negative (background or pixels on the fly body) training samples. Hence, FLLIT is fully automated and does not require user-annotated training samples. Using these training samples, a classifier is then trained in the framework of a boosting algorithm. An ensemble of weak classifiers is learnt iteratively, each consisting of a set of convolutional kernels for feature extraction and a decision tree. The final learnt classifier is used for leg segmentation and discerns difficult regions/hard samples better than morphological operations can, producing an overall much more accurate segmentation for tracking1. From the segmented legs, FLLIT locates the tips and tracks them using the Hungarian algorithm, matching tips across frames such that the sum of the distances moved by all tips is minimized. FLLIT handles occlusion by remembering the last seen location (in fly-centered coordinates), so that a leg tip is recovered once it is no longer occluded.
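The tip-matching step can be illustrated with a short sketch. This is not FLLIT's implementation; for the small number of legs on a fly, a brute-force search over all assignments returns the same minimal-total-distance matching that the Hungarian algorithm computes (function and variable names are illustrative):

```python
from itertools import permutations
from math import dist

def match_tips(prev_tips, curr_tips):
    """Assign current tip detections to previous tips by minimising the
    total distance moved between frames. Brute force over permutations;
    for six legs this yields the same optimal assignment as the
    Hungarian algorithm used by FLLIT."""
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(len(curr_tips))):
        cost = sum(dist(prev_tips[i], curr_tips[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best_perm, best_cost = perm, cost
    return best_perm  # best_perm[i] = index of current tip matched to previous tip i

# Detections arrive in arbitrary order each frame; matching restores identity.
prev = [(0, 0), (5, 0), (10, 0)]
curr = [(10.5, 0.2), (0.3, -0.1), (5.1, 0.4)]  # shuffled detections
print(match_tips(prev, curr))  # (1, 2, 0)
```

In practice an O(n³) Hungarian solver scales better, but for six tips the exhaustive search (720 permutations) is trivially fast and easier to follow.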
We previously showed that FLLIT can automatically and accurately track leg movements and analyze gait in an unmarked, freely moving fly or spider from high-speed video1; FLLIT should hence be broadly applicable for arthropod leg tracking. By extracting machine learning training sets using morphological parameters, FLLIT automatically trains itself to segment and track insect legs without the need for laborious manual annotation, which is required for most deep learning methods. FLLIT is hence fully automated. After leg segmentation and tracking, FLLIT automatically produces raw tracked body and leg positions in every video frame, 20 gait parameters, 5 plots and a tracked video for gait analysis and visualization of gait movements. This protocol provides a step-by-step guide to using FLLIT.
1. System setup
2. Preparation of flies for recording
3. Generation of videos for FLLIT analysis
NOTE: This step is specific to the video camera used. In this case, a commercially available video camera is used (see Table of Materials).
4. Installation of FLLIT program
NOTE: Up-to-date instructions can be found at: https://github.com/BII-wushuang/FLLIT/blob/master/Compiled/Readme.pdf
5. Running FLLIT for automated leg tracking
Following leg segmentation, tracking and data processing, FLLIT automatically generates raw data for the positions of the body and each leg claw, 20 gait parameters, 5 plots and a tracked video (Table 1).
Here, we demonstrate these analyses using a fly model of Spinocerebellar ataxia 3 (SCA3). The pan-neuronal driver Elav-GAL4 was used to drive either full-length wildtype human SCA3 with 27 glutamines in the polyQ tract (UAS-SCA3-flQ27) or full-length mutant human SCA3 with 84 glutamines in the polyQ tract (UAS-SCA3-flQ84)11. SCA3 is typified by an ataxic gait with body veering, erratic foot placement and short, lurching steps12,13 (Table 2). To characterize the gait of mutant SCA3 flies and investigate whether they display a gait similar to that of human patients, we analyzed the relevant gait parameters generated by FLLIT, namely the number of body turns, footprint regularity, leg domain overlap and size, and leg stride length (Table 2).
We found that SCA3-Q84 flies exhibited more turns (Figure 4A,A’), erratic foot placement as shown by low footprint regularity (enlarged standard deviations of the anterior extreme position, AEP14) (Figure 4B), increased leg domain overlap (Figure 4C-D), leg domains enlarged in both length and area (Figure 4E,F), and decreased stride length (Figure 4G).
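As a rough illustration of how a footprint-regularity measure of this kind can be computed, the sketch below takes one leg's body-centred AEP landing points and returns the spread of placements. This is our own minimal sketch, not FLLIT's implementation; combining the x and y standard deviations in quadrature is an assumption:

```python
from statistics import pstdev

def footprint_spread(aep_positions):
    """Spread of a leg's anterior extreme positions (AEP): larger
    values indicate more erratic foot placement. aep_positions is a
    list of (x, y) body-centred landing points, one per stride
    (a hypothetical layout, not FLLIT's file format)."""
    sx = pstdev(p[0] for p in aep_positions)  # spread along the body axis
    sy = pstdev(p[1] for p in aep_positions)  # spread across the body axis
    return (sx ** 2 + sy ** 2) ** 0.5        # assumption: combine in quadrature
```

A perfectly regular walker lands each stride on the same body-centred spot (spread 0), while an ataxic gait produces a large spread.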
FLLIT also generates a video showing the tracked fly and legs in the arena-centered and body-centered views, body trajectory and heading direction, and vertical and lateral displacements of each leg (Figure 5). The tracked videos allow side-by-side comparison of leg movements in different flies. Representative videos of Elav-GAL4>SCA3-flQ27 (Video 1) and Elav-GAL4>SCA3-flQ84 (Video 2) flies demonstrate that compared to Elav-GAL4>UAS-SCA3-flQ27 flies (Figure 5A), Elav-GAL4>UAS-SCA3-flQ84 flies (Figure 5B) exhibit irregular, intersecting leg domains of different sizes, indicative of a lurching, ataxic gait.
Figure 1. Setup of the recording station and arena. Recordings from the (A) front and (B) side views. (C) An example of an arena used for making fly recordings for FLLIT tracking. Please click here to view a larger version of this figure.
Figure 2: View of the active window during fly gait recording using a dual head camera, which allows simultaneous recording of two flies. Please click here to view a larger version of this figure.
Figure 3: Active FLLIT window showing the button panel and labelled legs after segmentation and tracking. Please click here to view a larger version of this figure.
Figure 4: Representative FLLIT-generated data for relevant gait parameters of flies expressing wildtype (SCA3-flQ27) vs. mutant (SCA3-flQ84) SCA3. (A) Number of turns in the body trajectory. (B) Mid-leg footprint regularity normalized to body length. (C-C’) Traversed leg domains of each leg. (D) Domain overlap between legs. (E) Mid-leg domain length normalized to body length. (F) Mid-leg domain area normalized to body length². (G) Mid-leg stride length normalized to body length. Please click here to view a larger version of this figure.
Figure 5: Snapshot of representative FLLIT-generated videos. (A) Elav-GAL4>UAS-SCA3-flQ27 and (B) Elav-GAL4>UAS-SCA3-flQ84 flies. Please click here to view a larger version of this figure.
Video 1: Representative video of a fly expressing pan-neuronal wild-type human full-length SCA3 (Elav-GAL4>UAS-SCA3-flQ27). Please click here to view this video. (Right-click to download.)
Video 2: Representative video of a fly expressing pan-neuronal mutant human full-length SCA3 (Elav-GAL4>UAS-SCA3-flQ84). Please click here to view this video. (Right-click to download.)
Supplemental Figure 1: Configurations for VcXSrv. Please click here to view a larger version of this figure.
Supplemental Figure 2: Configuration for Xquartz. Please click here to view a larger version of this figure.
Supplemental Figure 3: Image labelled with the dimensions needed for calculating the field of view. Please click here to view a larger version of this figure.
Category | Parameters | Description | File/Plot (if applicable) |
Raw data | Body position | Positional coordinates of the body centroid in each frame | First two columns of CoM.csv |
| Body trajectory | Angle of rotation of the body axis in degrees (relative to the y-axis) | Third column of CoM.csv |
| Arena-centred leg claw positions | Positional coordinates of each leg claw in each frame, in arena coordinates | trajectory.csv |
| Body-centred leg claw positions | Positional coordinates of each leg claw in each frame, in body-centred coordinates | norm_trajectory.csv |
Body motion | Body length (mm) | Length of the sample animal estimated in each frame (anterior-most position on the head to posterior-most position on the wings) | bodylength.csv |
| Instantaneous body velocity (mm/s) | Instantaneous velocity of the body centroid of the sample animal | BodyVelocity.csv; BodyVelocity.pdf |
| Turning points of the body trajectory | To locate the turning points, the trajectory is reduced to a piecewise-linear curve using the Douglas–Peucker algorithm; a turning event is then identified as an angle > 50° between two neighbouring linear segments of the simplified trajectory | BodyTrajectory.pdf |
Individual stride parameters | Stride duration (ms) | The duration of a stride event | StrideParameters.csv |
| Stride period (ms) | The duration from one stride event to the next | |
| Stride displacement (mm) | Displacement of the leg claw during a stride event | |
| Stride path covered (mm) | Total path covered by the leg claw during a stride event | |
| Anterior extreme position (mm) | Landing position (relative to the body) of a leg claw at the end of a stride event | |
| Posterior extreme position (mm) | Take-off position (relative to the body) of a leg claw at the start of a stride event | |
| Stride amplitude (mm) | Displacement along the direction of motion for a stride event | |
| Stance linearity (mm) | Deviation of the stride path from a curve smoothed (at 20 ms intervals) over the corresponding anterior and posterior extreme positions of the stride | |
| Stride stretch (mm) | Distance of the leg claw position from the body centre at the middle of a stride event | |
Leg motion | Leg speed (mm/s) | The instantaneous speed of each leg | LegSpeed.csv; Gait.pdf |
| Gait index | The type of gait coordination exhibited by the (six-legged) sample animal during its motion: a gait index of 1 corresponds to a tripod gait, −1 corresponds to a tetrapod gait, and 0 constitutes a non-canonical gait. In our implementation, the gait index is obtained by a moving average over a 120 ms window | GaitIndex.csv; GaitIndex.pdf |
| Movement percentage | Percentage of the time that a leg is in motion | LegParameters.csv |
| Mean stride period (ms) | Average duration from one stride event to the next | LegParameters.csv |
| Footprint regularity (mm) | Measured as the standard deviations of the anterior and posterior extreme positions of a leg | LegParameters.csv |
| Leg trajectory domain area (mm²) | The area of the minimal convex hull that contains the entire leg trajectory in the body-centred frame of reference | LegParameters.csv; LegDomain.pdf |
| Length and width of the leg trajectory domain (mm) | Obtained via the maximum projected distance of the claw positions onto the major (domain length) and minor (domain width) principal axes | LegParameters.csv |
| Leg domain intersection/overlap (mm²) | The intersection/overlap between each possible pair of leg domains | LegDomainOverlap.csv |
| Stance width (mm) | Average of the distance between the AEP and PEP of the left and middle legs | StanceWidth.csv |
Table 1: Gait parameters generated by FLLIT.
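The turning-point detection summarized in Table 1 can be sketched in a few lines: simplify the body trajectory with the Douglas–Peucker algorithm, then flag vertices where neighbouring segments differ in heading by more than 50°. The 50° threshold follows Table 1, while the simplification tolerance `eps` is an illustrative assumption, not FLLIT's value:

```python
import math

def rdp(points, eps):
    """Ramer-Douglas-Peucker: reduce a trajectory to a piecewise-linear curve."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    chord = math.hypot(x2 - x1, y2 - y1) or 1e-12
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):  # farthest interior point from the chord
        x0, y0 = points[i]
        d = abs((x2 - x1) * (y1 - y0) - (x1 - x0) * (y2 - y1)) / chord
        if d > dmax:
            dmax, idx = d, i
    if dmax <= eps:
        return [points[0], points[-1]]
    # split at the farthest point and simplify each half recursively
    return rdp(points[: idx + 1], eps)[:-1] + rdp(points[idx:], eps)

def turning_points(traj, eps=2.0, min_angle_deg=50.0):
    """Vertices of the simplified trajectory where the heading changes by
    more than min_angle_deg between neighbouring linear segments."""
    simp = rdp(traj, eps)
    turns = []
    for a, b, c in zip(simp, simp[1:], simp[2:]):
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        ang = abs(math.degrees(math.atan2(v1[0] * v2[1] - v1[1] * v2[0],
                                          v1[0] * v2[0] + v1[1] * v2[1])))
        if ang > min_angle_deg:
            turns.append(b)
    return turns

# A walk along the x-axis followed by a right-angle turn yields one turning event.
traj = [(i, 0) for i in range(10)] + [(9, j) for j in range(1, 10)]
print(turning_points(traj))  # [(9, 0)]
```

The tolerance `eps` trades noise suppression against sensitivity: too small and jitter in the centroid creates spurious segments, too large and genuine turns are smoothed away.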
Gait features of Spinocerebellar ataxia 3 (SCA3) | Veering | Erratic foot placement and leg crossing over | Lurching steps | Short strides |
Measurement parameter | Number of body turn events | Footprint regularity | Size of leg domains, degree of domain overlap | Stride length |
FLLIT file | BodyTrajectory.pdf | LegParameters.csv | LegDomainOverlap.csv | StrideParameters.csv |
Table 2: Table showing hallmark SCA3 gait features in human patients with their corresponding FLLIT parameters and output files.
In this manuscript, we describe in detail the steps involved in using FLLIT, an automated machine-learning program1, to analyze gait in freely walking Drosophila. After tracking and data analysis, FLLIT automatically generates raw data for the positional information of the body and leg claws, producing twenty body and gait features as well as a video of the tracked fly to enable gait visualization.
There are now a number of methods for leg movement tracking in Drosophila and other animals1,2,3,4,14,15,16, giving researchers a wide range of options depending on the goals of the experiment. Some of these are footprint-based approaches, which are highly accurate but report only claw contact points with the detection surface4,14. On the other hand, recent deep learning approaches2,3,16 are highly versatile, allowing analysis of behaviors that require tracking of leg joints and other body parts in any animal, with the caveat that the algorithms must first be trained with user-annotated datasets. A third type of approach uses morphology- or image-contrast-based methods1,15,17 to find the outline of each leg and identify claw positions. In general, these methods deal poorly with behaviors in which the legs cross over (e.g., during grooming). FLLIT combines the second and third approaches, using morphological parameters to train a boosting algorithm for leg segmentation. This allows FLLIT to bypass the laborious task of generating the training dataset through user annotation, while enhancing accuracy using machine learning. Future improvements to FLLIT will have to deal with instances where legs cross over, to allow for analysis of more complex behaviors.
FLLIT is robust to slight changes in illumination, recording resolution and frame rate1. However, the frame rate of recorded videos should not fall below 250 fps, and FLLIT runs optimally on videos recorded at 1000 fps. If there is motion blur in the images, such that a human annotator would find it challenging to identify the leg position, FLLIT will not be able to accurately identify leg tips in those frames. In light of this, it is essential that the camera be focused sharply on the leg tips. To prevent segmentation artifacts, the arena should be cleaned thoroughly and should not be moved during the recording. For accurate background subtraction and clean segmentation, the fly should move at least one body length during the recording, without pausing. After automatic segmentation and tracking, the labeling of all legs must be checked. If the fly gait is not tracked or is tracked wrongly, the file should be retracked manually using the Manually Initiate Tracking option (steps 5.2.7–5.2.10).
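The recording requirements above (a frame rate of at least 250 fps, and a fly that travels at least one body length) can be captured in a simple pre-check before segmentation. This is a hypothetical helper, not part of FLLIT, shown here only to make the acceptance criteria concrete:

```python
from math import dist

def recording_ok(fps, centroid_traj, body_length_px):
    """Check a recording against the guidelines in the text:
    frame rate of at least 250 fps (1000 fps is optimal), and the
    fly's centroid path must cover at least one body length.
    centroid_traj is a list of per-frame (x, y) centroid positions
    in pixels; body_length_px is the fly's body length in pixels."""
    if fps < 250:
        return False, "frame rate below 250 fps"
    travelled = sum(dist(a, b)
                    for a, b in zip(centroid_traj, centroid_traj[1:]))
    if travelled < body_length_px:
        return False, "fly moved less than one body length"
    return True, "ok"
```

Using path length rather than net displacement is an assumption here; a fly that walks in a tight circle would pass this check but could still give a poor background model.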
Neurodegenerative diseases and movement disorders are increasingly prevalent in our aging societies. Fly models of neurodegeneration have been studied for more than two decades, during which advances have been made in understanding the molecular and cellular aspects of disease pathophysiology. However, specific behavioral consequences of disease have been technically difficult to assess. For example, while trembling movements in the fly have been reported18,19, these were not quantitatively studied until recently1. The climbing assay has been a useful and quantitative, yet relatively coarse, measure6. This technical deficit has similarly hampered high-resolution movement analysis in other animal models. The advent of new tools for behavioral analysis therefore promises to rejuvenate the field of movement disorders, enabling researchers to study how molecular and cellular mechanisms of neuromuscular diseases lead to specific behavioral outcomes in animal models. In this paper and in our previous work1, we used FLLIT to show that fly models of SCA3 exhibit a hyperkinetic ataxic gait, while PD fly models exhibit a hypokinetic rigid gait, recapitulating movement hallmarks of the respective human diseases1. Gait analysis also enabled us to identify distinct neuronal populations underlying specific movement dysfunctions. Going forward, detailed movement analysis, combined with the powerful imaging and functional tools available in the fly, will allow us to gain novel insight into mechanisms of locomotor dysfunction, illuminating our understanding of neurodegenerative diseases at the level of circuit mechanisms.
FLLIT should be widely applicable for studying gait in other small arthropods, as it was previously demonstrated to be highly accurate for tracking spider leg movements1. While we focus here on the use of detailed movement phenotyping to quantify pathogenic gait and its underlying circuitry, advances in movement tracking have already revolutionized, and will continue to shape, our understanding of normal walking coordination and its underlying circuits across myriad branches of the evolutionary tree.
The authors have nothing to disclose.
The authors would like to thank Moumita Chatterjee and Alice Liu for technical support, and the Bloomington Drosophila Stock Centre (Indiana, USA) for making available the Drosophila strains used in this work. This work was supported by the Institute of Molecular and Cell Biology, Singapore; the Bioinformatics Institute, Singapore; the Agency for Science Technology and Research Joint Council Organization (grant number 15302FG149 to SA and LC); the Clinical Research Flagship Programme (Parkinson’s Disease) administered by the Singapore Ministry of Health’s National Medical Research Council (grant number NMRC/TCR/013-NNI/2014 to SA), the University of Alberta (startup grant to LC), and the Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery Grant (grant number RGPIN-2019-04575 to LC).
Acrylic Sheets | Dama | 1.6 mm thickness, Clear sheets | Singapore, Singapore |
Clear Glass slides | Biomedia | BMH 880101 | Singapore, Singapore |
High speed camera | Photron | Fastcam MC2.1 | Tokyo, Japan. A shutter speed of 1 msec or faster is ideal to reduce motion blur of captured images |
Infrared LED | Any – generic from hardware store | 940 nm infrared light board | Singapore, Singapore |
Kimwipe | Kimberly Clark | 34155-01LS | Irving, Texas, USA |