Abstract
In this contribution, statistical inference based on score functions is developed with the aim of future use across different fields of physics, for example in detector collision data processing or neutrino prong matching. New score functions between theoretical and empirical probability measures are defined, and the corresponding minimum score estimators are presented. We find that consistency of different estimators under various score functions leads to the well-known consistency in commonly used statistical distances or disparity measures between probability distributions. Conditions under which a specific score function passes to a φ–divergence are formulated; conversely, each φ–divergence is a score function. Furthermore, minimization of an arbitrary divergence score function leads to the classical histogram density estimator and can thus serve as an alternative interpretation of histogram-based calculations in (high energy) physics. The Kolmogorov–Smirnov test statistic can be obtained from the absolute score function over the class of mutually complementary interval partitions of the real line. This means that the most popular statistical methods used in physics, such as histogram estimation and Kolmogorov goodness-of-fit testing, can be covered by one unifying score-based statistical approach. These methods were also previously applied successfully to data sets originating from material elasticity testing (nondestructive defectoscopy) within Preisach–Mayergoyz space modeling.
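As a concrete illustration of two quantities named in the abstract, the sketch below computes the Kolmogorov–Smirnov statistic (the sup-distance between an empirical and a theoretical CDF) and a normalized histogram density estimate with NumPy. This is a generic numerical illustration under standard definitions, not the score-function construction developed in the paper itself; all function names are our own.

```python
import math
import numpy as np

def normal_cdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """CDF of N(mu, sigma^2), expressed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample: np.ndarray, cdf) -> float:
    """Kolmogorov-Smirnov statistic D_n = sup_x |F_n(x) - F(x)|.

    The supremum is attained at the order statistics, so it suffices
    to check the two one-sided gaps at each sorted sample point.
    """
    x = np.sort(sample)
    n = len(x)
    f = np.array([cdf(v) for v in x])
    i = np.arange(1, n + 1)
    d_plus = np.max(i / n - f)         # empirical CDF above theoretical
    d_minus = np.max(f - (i - 1) / n)  # theoretical CDF above empirical
    return float(max(d_plus, d_minus))

rng = np.random.default_rng(0)
sample = rng.standard_normal(200)

# KS statistic of a standard normal sample against its true CDF.
D = ks_statistic(sample, normal_cdf)

# Classical histogram density estimator (per the abstract, the minimizer
# of any divergence score); density=True normalizes it to unit area.
hist, edges = np.histogram(sample, bins=20, density=True)
area = float(np.sum(hist * np.diff(edges)))
```

For a well-specified model, `D` shrinks at rate O(n^{-1/2}), which is what the Kolmogorov goodness-of-fit test exploits; `area` equals 1 by construction of the density-normalized histogram.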
Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.