Paper. This article is open access.

Analysis and Comparison of Hough Transform Algorithms and Feature Detection to Find Available Parking Spaces


Published under licence by IOP Publishing Ltd
Citation: S Rahman et al 2020 J. Phys.: Conf. Ser. 1566 012092. DOI: 10.1088/1742-6596/1566/1/012092


Abstract

Parking space is one of the most critical needs in people's lives, especially in Indonesia. According to the Central Statistics Agency, the number of vehicles in Indonesia has grown by 9% per year over the last ten years. Meanwhile, land available for parking is being eroded by settlements, shops, and public service buildings. Limited parking lots make it hard for drivers to find available spaces, and the search itself causes traffic jams, air pollution, noise, and panic. An intelligent parking system is a solution to this problem: it can provide information on available parking slots. In this study, each parking location is marked with a circle. If the circle is visible, the parking slot is available; if not, the slot is occupied by a vehicle. Circle objects in images taken with a camera can be identified either by the Hough transform method or by feature extraction. These two methods are compared in terms of accuracy and processing speed. Experiments and observations on the performance of both methods show that both can recognize the location of an available parking slot. The feature extraction method detects faster, with an average processing time of 1.1 seconds, while the Hough transform algorithm averages 4.1 seconds. It can therefore be concluded that the feature extraction method is better suited to a smart parking system.
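The circle-detection idea the abstract compares can be illustrated with a minimal sketch of the Hough circle transform in pure Python. This is not the authors' implementation: the synthetic edge points, grid, candidate radii, and angular sampling below are all assumptions chosen for illustration. Each edge point votes for every candidate centre/radius pair whose circle could pass through it, and the accumulator maximum names the detected circle.

```python
import math
from collections import defaultdict

def hough_circles(edge_points, radii, theta_steps=360):
    """Vote in (a, b, r) space: each edge point lies on a circle of
    radius r around every candidate centre, so it casts one vote per
    accumulator cell its locus passes through (deduplicated per point)."""
    acc = defaultdict(int)
    for (x, y) in edge_points:
        for r in radii:
            cells = set()
            for k in range(theta_steps):
                t = 2 * math.pi * k / theta_steps
                a = round(x - r * math.cos(t))
                b = round(y - r * math.sin(t))
                cells.add((a, b, r))
            for cell in cells:  # at most one vote per cell per point
                acc[cell] += 1
    best = max(acc, key=acc.get)
    return best, acc

# Synthetic "edge image" (an assumption for this sketch): 12 exact
# lattice points on a circle of radius 5 centred at (10, 10),
# built from the 3-4-5 Pythagorean triple.
edges = [(10 + dx, 10 + dy)
         for dx, dy in [(5, 0), (-5, 0), (0, 5), (0, -5),
                        (3, 4), (4, 3), (-3, 4), (-4, 3),
                        (3, -4), (4, -3), (-3, -4), (-4, -3)]]

best, acc = hough_circles(edges, radii=(4, 5, 6))
print("detected (a, b, r):", best, "with", acc[best], "votes")
```

In a real smart-parking pipeline this brute-force voting is what makes the Hough approach slow, which is consistent with the longer processing time the abstract reports; production code would instead use an optimized routine such as OpenCV's `cv2.HoughCircles`.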


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
