Frictionless TreadScan is the latest addition to our product line for motor function, and the first designed to measure gait and neural function simultaneously in a head-fixed animal.
Frictionless TreadScan is a fully automated test for detecting coordinated neural activity while the animal walks on a frictionless treadmill with its head fixed. The system is built around a special treadmill with a transparent belt, similar to the ExerGait treadmill used in the TreadScan system; this belt, however, has almost no friction, so the animal's own steps propel the belt backward as it walks.
The system includes a head-fixing device that keeps the animal's head stationary while it walks actively, so that neural recording, optogenetics, fiber photometry, and similar procedures can be performed on the behaving animal.
A mirror installed underneath the transparent frictionless belt reflects the footprints of the animal walking on the belt. A high-speed camera, capable of capturing video at 100 frames per second or greater, is aimed at the mirror and records the video using the BCamCapture software.
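BCamCapture is the system's own capture tool, so its internals are not shown here; as a minimal sketch of the general idea, high-frame-rate capture with per-frame timestamps could look like the following, assuming OpenCV, a camera at index 0, and illustrative output file names.

```python
# Minimal sketch of high-speed video capture with per-frame timestamps.
# This is NOT BCamCapture; the camera index, the 100 fps request, and
# the output file names are illustrative assumptions.
import time
import cv2

cap = cv2.VideoCapture(0)                      # hypothetical camera index
cap.set(cv2.CAP_PROP_FPS, 100)                 # request >= 100 fps
fps = cap.get(cv2.CAP_PROP_FPS) or 100.0       # fall back if unreported
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

writer = cv2.VideoWriter("gait.avi",
                         cv2.VideoWriter_fourcc(*"MJPG"), fps, (w, h))
timestamps = []                                # one timestamp per frame

try:
    while len(timestamps) < 1000:              # ~10 s of video at 100 fps
        ok, frame = cap.read()
        if not ok:
            break
        timestamps.append(time.monotonic())    # used later for signal sync
        writer.write(frame)
finally:
    cap.release()
    writer.release()

with open("gait_timestamps.txt", "w") as f:
    f.writelines(f"{t:.6f}\n" for t in timestamps)
```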
Liquid reward is delivered through a pump that dispenses liquid for the animal to drink; the system both controls and measures each delivery.
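The pump's control interface is not documented on this page; as a purely hypothetical illustration, a syringe pump on a serial line could be driven and logged as sketched below (the port name, baud rate, and "DISPENSE" command string are invented for the sketch, not a real pump protocol).

```python
# Hypothetical sketch of controlled, logged liquid-reward delivery.
# The serial port, baud rate, and "DISPENSE" command are assumptions;
# a real pump's command set would come from its manual.
import time
import serial  # pyserial

PORT = "/dev/ttyUSB0"        # assumed port
VOLUME_UL = 10               # assumed reward size in microliters

def deliver_reward(pump: serial.Serial, log: list) -> None:
    """Send one dispense command and record when it was sent."""
    pump.write(f"DISPENSE {VOLUME_UL}\r\n".encode())
    log.append((time.monotonic(), VOLUME_UL))  # measurement of delivery

with serial.Serial(PORT, 9600, timeout=1) as pump:
    delivery_log = []
    deliver_reward(pump, delivery_log)
    print(f"Delivered {sum(v for _, v in delivery_log)} uL total")
```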
When neural recording, optogenetics, fiber photometry, or similar procedures are performed, the generated signals are read and recorded in sync with the video.
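How the system implements this synchronization internally is not described here; one common approach is to timestamp both streams against a shared clock and match each signal sample to its nearest video frame. The sketch below assumes exactly that, reusing the hypothetical timestamp file from the capture sketch and a stand-in signal trace.

```python
# Sketch of post-hoc alignment of a signal trace to video frames.
# Assumes both streams were timestamped against the same clock; the
# file name is the hypothetical output of the capture sketch above.
import numpy as np

frame_times = np.loadtxt("gait_timestamps.txt")      # one entry per frame
signal_times = np.linspace(frame_times[0], frame_times[-1], 5000)
signal = np.random.rand(5000)                        # stand-in for photometry data

# For each signal sample, find the video frame whose timestamp is
# closest, so signal values can be overlaid on the matching frames.
idx = np.searchsorted(frame_times, signal_times)
idx = np.clip(idx, 1, len(frame_times) - 1)
left_closer = (signal_times - frame_times[idx - 1]) < (frame_times[idx] - signal_times)
frame_of_sample = np.where(left_closer, idx - 1, idx)

print(f"Sample 0 maps to frame {frame_of_sample[0]}")
```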
Enhanced TreadScan software, the same software used in the TreadScan system, analyzes the recorded video together with the recorded signals and generates a complete set of gait data alongside the signal data.
TreadScan exports the detailed results for these parameters into Microsoft Excel and provides statistical results to meet research requirements. Advanced functionality such as batch-mode analysis, group export, direct analysis of captured AVI files, and post-hoc conversion to compressed MPEG video to save disk space is available. A sample screenshot of the TreadScan software is shown to the right; stride data are displayed for each foot individually, and the results update as the analysis proceeds.
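The actual parameter set and export schema are defined by the TreadScan software itself; as a rough sketch of what per-foot stride results and summary statistics in Excel might look like, the snippet below uses pandas with invented column names and values.

```python
# Hypothetical sketch of writing per-foot stride parameters to Excel.
# Column names and values are invented examples; TreadScan's real
# export schema is defined by the software itself.
import pandas as pd

strides = pd.DataFrame({
    "foot":             ["front_left", "front_right", "rear_left", "rear_right"],
    "stride_length_mm": [62.1, 61.8, 70.4, 69.9],
    "stride_time_ms":   [310, 305, 298, 301],
})

# Summary statistics alongside the raw table, mirroring the idea of
# "statistical results to meet research requirements".
summary = strides.describe()

with pd.ExcelWriter("gait_results.xlsx") as xl:
    strides.to_excel(xl, sheet_name="strides", index=False)
    summary.to_excel(xl, sheet_name="summary")
```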
November 23, 2018