Paper. Open access.

CNN with residual learning extensions in neutrino high energy physics

Miroslav Kubu and Petr Bour

Published under licence by IOP Publishing Ltd.
Citation: Miroslav Kubu and Petr Bour 2021 J. Phys.: Conf. Ser. 1730 012133. DOI: 10.1088/1742-6596/1730/1/012133


Abstract

As many reconstruction steps in neutrino high energy physics (HEP) resemble image pattern recognition tasks, we explore the potential of Convolutional Neural Networks (CNNs) combined with residual learning. Characteristic features are extracted from neutrino track image pixelmaps at different scales and used to classify the type of neutrino interaction. In this contribution, we summarize the observed performance of residual neural networks (ResNets) for detecting neutrino charged current (CC) interactions, using image-like Monte Carlo simulated data for muon and electron neutrinos. The two topologies recorded in neutrino detectors differ: a muon neutrino CC interaction is dominated by a slowly ionizing muon track, while an electron neutrino CC interaction is usually recorded as a wide shower. We evaluate ResNet performance using the area under the ROC curve (AUC) as the metric. We observe an improvement over a plain CNN architecture when using residual learning, which we attribute to the more stable training of ResNets and their reduced vulnerability to vanishing gradients. Moreover, stacking additional hidden layers within our ResNet model substantially increased the AUC on the test neutrino dataset without signs of unstable training or overfitting.
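To make the residual learning idea concrete, the sketch below shows a minimal convolutional residual block of the kind the abstract describes: the block's input is added back to its output through a skip connection, so the layers learn a residual correction rather than a full mapping, which is what mitigates vanishing gradients in deep stacks. The framework (PyTorch), layer widths, pixelmap size, and the AUC evaluation via scikit-learn are all illustrative assumptions; the paper does not specify its implementation.

```python
# A minimal residual-block sketch (assumptions: PyTorch, single-channel
# pixelmaps, binary nu_mu CC vs nu_e CC classification).
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Skip connection: the block learns the residual F(x) = H(x) - x,
        # and gradients flow through the identity path unattenuated.
        return self.relu(out + x)

# Hypothetical classifier: stacking more ResidualBlocks deepens the net,
# mirroring the abstract's observation that added layers improved AUC.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    ResidualBlock(16),
    ResidualBlock(16),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 1),  # logit; sigmoid gives P(nu_e CC)
)

# Forward pass on a batch of hypothetical 80x100 pixelmaps.
scores = torch.sigmoid(model(torch.randn(4, 1, 80, 100)))

# AUC evaluation as in the paper, e.g. via scikit-learn:
# from sklearn.metrics import roc_auc_score
# auc = roc_auc_score(labels, scores.detach().numpy().ravel())
```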


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
