Infoscience

Student project

Automatic Vehicle Recognition from Multi-spectral and LiDAR Elevation Data with Object-oriented Analysis

The objective of this diploma thesis was the recognition of vehicles, with the longer-term aim of building a knowledge base for the automatic recognition of vehicles from similar types of data. The goal was achieved by developing a knowledge base in the object-oriented image-processing software eCognition, using multi-spectral data from an RGB/NIR line scanner together with intensity and elevation data from a LiDAR scanner. The data were first processed and analyzed with two approaches based on object-oriented analysis, and the resulting classes were described with fuzzy logic. The "top-down" approach produced large surfaces of buildings, soil, vegetation, asphalt and several types of vehicles. In the second, "bottom-up" approach, various types of vehicles emerged as compact objects after merging the resulting objects, which precisely described parts of the actual class objects.

The object-oriented analysis comprised three segmentation levels, which were then classified according to descriptions drawn up for the photointerpretation classes recognized in the two parts of the examined dataset. Classification of the coarse level of the first dataset, that of the urban environment, revealed surfaces of null-value pixels in various positions, both in the blue spectral channel and in the Digital Terrain Model (DTM), probably due to errors during data acquisition by the opto-electronic LiDAR system. Classification and further analysis of the first dataset was therefore impossible, and the work on automatic vehicle recognition was limited to the second dataset.

The coarse segmentation level was produced by the top-down approach. At this level the aim was primarily to extract the majority of building surfaces, and secondarily traffic routes, dense or sparse vegetation surfaces, and soil.
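The fuzzy-logic class descriptions mentioned above could be sketched as follows. This is a minimal illustration only: the feature names, thresholds, and the use of the minimum operator as a fuzzy AND are assumptions in the style of eCognition-like rule bases, not the thesis's actual class descriptions.

```python
# Illustrative sketch of fuzzy class description (NOT the thesis's rule base).
# Each feature is mapped to a [0, 1] membership and combined with a fuzzy AND.

def ramp_membership(value, low, high):
    """Linear ramp: 0 below `low`, 1 above `high`, linear in between."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def vehicle_membership(ndvi, height_m, area_m2):
    """Combine hypothetical feature memberships with the minimum operator."""
    low_vegetation = 1.0 - ramp_membership(ndvi, 0.2, 0.4)   # vehicles are not vegetated
    raised_object = ramp_membership(height_m, 0.5, 1.2)      # raised above the road surface
    compact_size = (ramp_membership(area_m2, 3.0, 8.0)
                    * (1.0 - ramp_membership(area_m2, 25.0, 40.0)))  # car-sized, not building-sized
    return min(low_vegetation, raised_object, compact_size)

# A dark, raised, car-sized object scores high; a vegetated one scores zero.
strong = vehicle_membership(ndvi=0.1, height_m=1.5, area_m2=10.0)
weak = vehicle_membership(ndvi=0.5, height_m=1.5, area_m2=10.0)
```

Combining memberships with `min` is one common fuzzy-AND choice; a product operator would weight partial evidence differently.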
The boundaries of building surfaces were corrected by merging and reclassifying similar land-cover surfaces (classification-based segmentation). The recognition of all building surfaces was completed with the classification of a medium segmentation level, whose purpose was to distinguish more clearly the borders and surfaces of buildings, in particular low flat-roofed buildings, so that lorries would not be misclassified into this category.

The fine level of analysis, also derived from the top-down approach, aimed mainly at the recognition of vehicles. Given the limited size of the objects, it also allowed the identification of low bushy, tall arborescent and thin mixed vegetation, soil, traffic routes, and light and dark asphalt surfaces. By projecting the building surfaces from the coarse and medium levels onto this level, no further classification of buildings was required, which considerably reduced the search field for vehicles. Because the focus was on clearly identifying vehicles and their parts, the classification accuracy for objects depicting vehicles was greater than that of the other classes; this is partly because the vehicle class was described with features that overlapped little with those used to describe the other classes, which may account for the difference. Despite efforts to produce suitable segments during the segmentation of this level, the shape of the vehicles could not be properly defined, owing to their small size at the given data resolution and to errors propagated from the general classification.

The developed methodology is offered as a guide for recognizing vehicles in both urban and industrial environments from multi-spectral and LiDAR elevation data of the same resolution, with minor changes to the range of features employed.
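The effect of projecting building surfaces from the coarser levels onto the fine level can be sketched as a masking step: cells already labelled as building are excluded from the vehicle search, shrinking the search field. The array layout and label convention below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch: excluding already-classified building surfaces
# from the fine-level search field for vehicles.
import numpy as np

def vehicle_search_mask(fine_labels, building_mask):
    """Boolean mask of fine-level cells still eligible as vehicle candidates.

    fine_labels: integer label grid at the fine level (0 = not yet classified).
    building_mask: boolean grid of building surfaces projected from coarser levels.
    """
    return (~building_mask) & (fine_labels == 0)

# Toy 2x3 scene: two cells were classified as building at the coarse level.
coarse_buildings = np.array([[True, True, False],
                             [False, False, False]])
fine = np.zeros((2, 3), dtype=int)  # nothing classified at the fine level yet

mask = vehicle_search_mask(fine, coarse_buildings)
remaining = int(mask.sum())  # 4 of 6 cells remain in the vehicle search field
```

The same idea scales to real label images: every building cell removed up front is a cell the fine-level vehicle classifier never has to evaluate.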

Fulltext

  • There is no available fulltext. Please contact the lab or the authors.

