LabVIEW application for motion tracking using a USB camera

The technical condition of the contact line and of the auxiliary equipment in electric rail transport is essential for planning the repair and maintenance of the contact line. During operation, the pantograph motion must remain within standard limits. This paper proposes a LabVIEW application that tracks the motion of a laboratory pantograph in real time and acquires the tracking images. A USB webcam connected to a computer acquires the desired images. The laboratory pantograph includes an automatic system that simulates the real motion. The tracked parameters are the horizontal motion (zigzag) and the vertical motion, which can be studied in separate diagrams. The LabVIEW application requires the appropriate vision development toolkits; therefore, the paper describes the subroutines programmed for real-time image acquisition and for data processing.


Introduction
This paper presents technologies for optical signal processing in motion tracking using a USB camera. A laboratory pantograph stand is used for experimental validation. The laboratory pantograph is built at 1:2 scale and is controlled by a pneumatic system. During the operation of this mechanism, the pantograph and contact-wire assembly describes one trajectory in the horizontal plane (zigzag) and another in the vertical plane [1], [2]. A LabVIEW application was developed to optically track the pantograph motion and to acquire the trajectory coordinates for statistical analysis.

Principle formulation
In order to acquire images in real time, the system composed of the USB camera and the LabVIEW application has to work properly. The minimum requirements for the USB camera are: a resolution of 1280x720 pixels, a refresh rate of 25 fps, and an autofocus system. Figure 1 presents the block diagram of the proposed system. The motion of the laboratory pantograph in the vertical and horizontal planes is tracked by the USB camera. Figure 2 depicts the image acquisition flow.
Configuring the image acquisition system implies determining the minimum feature size in physical units such as millimetres and determining the number of pixels to allocate to this minimum-sized feature. It is very important to arrange the lighting system so as to obtain high contrast for the features of interest and to suppress distracting background features that would complicate the subsequent software processing. The present application uses RGB colour images, but grayscale images are preferred for better efficiency: colour images tend to slow down the display, especially at higher resolutions.
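The sizing step described above reduces to simple proportionality between the physical field of view and the sensor width. The following Python sketch illustrates the calculation; the marker size and field-of-view values are assumed for illustration and are not taken from the experimental setup.

```python
def pixels_per_feature(feature_mm, field_of_view_mm, sensor_pixels):
    """Number of sensor pixels covering a feature of the given physical size,
    assuming the sensor width spans the whole field of view."""
    return feature_mm / field_of_view_mm * sensor_pixels

# Assumed example: a 10 mm marker seen across a 500 mm wide field of view,
# imaged on the 1280-pixel-wide sensor named as the minimum requirement.
px = pixels_per_feature(10.0, 500.0, 1280)
```

If the result falls below the few pixels needed for reliable matching, the camera must be moved closer or a higher resolution selected.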
The proposed solution for optically tracking the pantograph motion is a LabVIEW application that uses the Colour Pattern Matching technology to check for the presence of a template in the entire colour image or in a region of interest [3], [4].
Two colour markers are programmed on the pantograph image: a green marker for tracking the horizontal motion (zigzag) and an orange marker for tracking the vertical motion (Figure 3). These markers follow the same motion as the contact line and the pantograph. The application saves the coordinates of the motion trajectories into .txt files.
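The essential output of each marker is the centre of the matched colour region. The paper's application obtains this through NI's Colour Pattern Matching; the NumPy sketch below illustrates the same idea with a simpler stand-in technique, locating the centre of a colour marker by thresholding. The colour bounds and the synthetic frame are assumptions for illustration only.

```python
import numpy as np

def marker_centre(rgb, lower, upper):
    """Return the (x, y) centre of the pixels whose RGB values fall inside
    the [lower, upper] range, or None if no pixel matches."""
    mask = np.all((rgb >= lower) & (rgb <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 frame with a green 5x5 patch centred at (52, 32).
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[30:35, 50:55] = (0, 200, 0)
centre = marker_centre(frame, np.array([0, 150, 0]), np.array([80, 255, 80]))
```

The pattern-matching approach used in the actual application is more robust than plain thresholding, since it also scores the spatial arrangement of the template's pixels.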
For acquiring images in real time and for inline processing, special toolkits are necessary: Vision Acquisition Software and the Vision Development Module Run-Time.

Description of the application for optical tracking
The logic scheme for the proposed tracking method is presented in Figure 4.
Once Vision Acquisition Software is installed, the main application can call the Vision Acquisition.vi subroutine, which acquires, saves and displays in real time the images from the camera connected to the computer.
Configuring this subroutine (Figure 5) involves several steps in which the acquisition method is selected: finite or continuous, with inline or post-processing of the images. The configuration also includes the selection of the image parameters: resolution (which affects the refresh rate), brightness, contrast, sharpness, gamma, etc.
These elements can be introduced into the main program as variables, so they can be modified in real time.
The Vision Assistant.vi subroutine processes the images acquired by the camera using a large variety of programming technologies. For measuring the vertical and horizontal motion, the Colour Pattern Matching technology is used. The application follows the motion of a colour pattern, saving the coordinates of the chosen pattern centre into a .txt file.
These coordinates are saved for statistical purposes. The refresh rate is essential, because it determines the reconstruction of the element trajectories [5], [6]. For an image with a continuity effect, a refresh rate of at least 25 fps is necessary.
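The refresh rate fixes the temporal spacing of the saved coordinate samples, and therefore the time axis of any reconstructed trajectory. A minimal Python sketch of this relationship (not part of the LabVIEW application):

```python
def frame_times(n_frames, fps):
    """Timestamps, in seconds, of n_frames acquired at a constant frame rate."""
    period = 1.0 / fps          # time between consecutive samples
    return [i * period for i in range(n_frames)]

# At the 25 fps minimum named above, consecutive samples are 40 ms apart.
t = frame_times(5, 25)
```

At a lower rate such as 15 fps the samples are 67 ms apart, which explains why fast zigzag motion is reconstructed more coarsely.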
The attributes which can be modified with this technology are: the number of matches, the minimum score, the colour sensitivity, the search strategy, the possibility of searching for rotated patterns, etc. (Figure 6).
The source code of the main application is presented in Figure 7. Continuous acquisition with inline processing is performed inside a While loop. Because two elements must be tracked (using the green and orange markers), two Vision Assistant.vi subroutines must be configured in series. The information describing the templates to be optically tracked flows in cluster form. At each iteration of the main structure, the templates are followed by the two markers (green and orange). The coordinates of the marker centres are saved into .txt documents. The subroutines that save this information into .txt files are presented in Figure 8.
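The control flow of that While loop can be sketched in Python as follows. This is only a textual stand-in for the LabVIEW block diagram: acquire_frame and find_marker are hypothetical stubs replacing the camera driver and the two Vision Assistant.vi calls, here fed with synthetic positions.

```python
def acquire_frame(i):
    # Hypothetical stand-in for the camera: returns synthetic
    # marker positions for frame i.
    return {"green": (10 + i, 50), "orange": (40, 20 + i)}

def find_marker(frame, colour):
    # Stand-in for one Vision Assistant.vi colour-pattern-matching call.
    return frame[colour]

def tracking_loop(n_iterations):
    green_path, orange_path = [], []
    for i in range(n_iterations):                         # the While loop
        frame = acquire_frame(i)
        green_path.append(find_marker(frame, "green"))    # horizontal (zigzag)
        orange_path.append(find_marker(frame, "orange"))  # vertical motion
    return green_path, orange_path

g, o = tracking_loop(3)
```

Each iteration thus appends one coordinate pair per marker, which is exactly the data later written to the .txt files.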
The coordinates of the marker centres form a vector which is written into a text file together with the saving folder path and the file name.
A series of subroutines (Create File.vi, Write to Text File.vi, Close File.vi and Simple Error Handler.vi) serves to create the file and save the trajectory information into it.
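The effect of that create/write/close chain can be sketched in a few lines of Python. The file name, folder, and tab-separated layout below are assumptions for illustration; the paper does not specify the exact file format.

```python
import os
import tempfile

def save_trajectory(folder, file_name, coords):
    """Append (x, y) marker-centre coordinates to a text file, one pair per
    line, mirroring the Create File / Write to Text File / Close File chain."""
    path = os.path.join(folder, file_name)
    with open(path, "a") as f:      # open or create, write, then close
        for x, y in coords:
            f.write(f"{x}\t{y}\n")
    return path

folder = tempfile.mkdtemp()
path = save_trajectory(folder, "zigzag.txt", [(12.5, 40.0), (13.1, 41.2)])
```

Appending rather than overwriting lets successive loop iterations extend the same trajectory file, as the main application does.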
To obtain a graphical representation of the vertical and horizontal motion of the pantograph, another LabVIEW application is designed.
The source code of this application is presented in Figure 9. Figure 10 presents the variation of the pantograph motion in the vertical and in the horizontal plane.
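The post-processing step amounts to reading the saved coordinates back and examining each motion component, for instance its peak-to-peak amplitude. A hedged Python sketch, using synthetic sample values rather than the measured data:

```python
def load_trajectory(lines):
    """Parse tab-separated x/y pairs, one pair per line, as saved
    by the tracking application."""
    return [tuple(float(v) for v in line.split("\t")) for line in lines]

def amplitude(values):
    """Peak-to-peak amplitude of one motion component."""
    return max(values) - min(values)

# Synthetic vertical-motion samples (values in centimetres, assumed
# purely for illustration).
samples = load_trajectory(["0.0\t1.0", "0.1\t3.5", "0.2\t5.0", "0.3\t2.0"])
vertical = [y for _, y in samples]
amp = amplitude(vertical)
```

The same parsing applied to both files yields the vertical curve and the zigzag trajectory shown in Figure 10.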

Conclusions
Optical object tracking is a current method for measuring distances between moving objects without requiring any additional sensing devices. Figure 10 (a) shows an amplitude of approximately 4 cm in the vertical motion of the pantograph. The zigzag motion (b) follows an elliptical trajectory. The low quality of this trajectory, caused by the camera refresh rate (15 fps), can also be observed. The represented trajectory can be improved by using higher-definition imaging from an appropriate camera.