Paper (open access)

Object detection on dental x-ray images using deep learning method


Published under licence by IOP Publishing Ltd
Citation: D Suryani et al 2021 IOP Conf. Ser.: Mater. Sci. Eng. 1073 012058. DOI: 10.1088/1757-899X/1073/1/012058


Abstract

Radiological examination plays an important role in diagnosing dental problems and in deciding on the right type of treatment for the indications of each case. A dental x-ray is a medical imaging procedure that uses radiation to capture pictures of the inside of the mouth; the results are used diagnostically to help the dentist see the entire structure of the jaw bone and teeth, as well as dental problems that cannot be observed directly. Dental radiographic interpretation, generally performed by dentists, is a time-consuming and error-prone process because of high variation in tooth structure, differing levels of experience, and the fatigue that dentists experience. A system that can automatically interpret x-ray results can reduce a dentist's workload and the occurrence of misdiagnosis. To address these problems, a model is developed to detect objects in dental panoramic x-ray images using Mask R-CNN, a deep learning method. Deep learning is a branch of artificial intelligence that models the way the human brain processes data and creates patterns for use in decision making. By detecting objects in panoramic x-ray images automatically, the system is expected to save time and improve both the quality of dental care and the quality of the diagnoses made by dentists.


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
