Table of contents

Volume 459

2013

2013 Joint IMEKO (International Measurement Confederation) TC1-TC7-TC13 Symposium: Measurement Across Physical and Behavioural Sciences 4–6 September 2013, Genoa, Italy

Accepted papers received: 22 July 2013
Published online: 06 September 2013

Preface

011001

The 2013 Joint IMEKO (International Measurement Confederation) TC1-TC7-TC13 Symposium was organised by the University of Genova - DIME/MEC, Measurement Laboratory, Italy, on 4–6 September 2013. The work of this symposium is reported in this volume.

The scope of the symposium includes the main topics covered by the above Technical Committees:

TC1 Education and Training in Measurement and Instrumentation
TC7 Measurement Science
TC13 Measurements in Biology and Medicine

This is in keeping with the tradition set by the previous events of this well-established series. There has been a special focus on measurement across the physical and behavioural sciences, with the aim of highlighting the interdisciplinary character of measurement science and of promoting constructive interactions with scientists in other disciplines. The discussion was introduced by keynote lectures on measurement challenges in psychophysics, psychometrics and quantum physics. The symposium was attended by experts working in these areas from 18 countries, including the USA, Australia and Japan, and provided a useful forum for them to share and exchange their work and ideas.

In total, over sixty papers are included in the volume, organised according to the presentation sessions. Each paper was independently peer-reviewed by two reviewers from a distinguished international panel.

The Symposium was held in Genova, which was the European Capital of Culture in 2004, and took place in Palazzo Ducale, an important historical building whose construction started in the 13th century and which was the seat of the Doges of Genova from the 14th century. Genova, whose name comes from the Latin word 'Janua' (meaning 'door', just as January is the door month of the year), has been regarded over the centuries as a door connecting Europe with the different countries and cultures of the Mediterranean basin, and was thus an appropriate site for an international symposium involving different and new scientific visions of, and approaches to, measurement, focused on a common objective: the human being.

We would like to take this opportunity to thank the members of the Symposium Steering and International Programme Committees, many of whom acted as reviewers of the papers presented here on a very short timescale, as well as our sponsor, National Instruments.

The editors hope that this volume will provide a useful contribution to enhancing the science, technology, education, and training in measurement and instrumentation.

Giovanni Battista Rossi, Francesco Crenna and Vittorio Belotti, Editors

Università degli Studi di Genova - DIME/MEC Laboratorio di Misure Via all'Opera Pia 15 a I - 16145 Genova Italy

011002

All papers published in this volume of Journal of Physics: Conference Series have been peer reviewed through processes administered by the proceedings Editors. Reviews were conducted by expert referees to the professional and scientific standards expected of a proceedings journal published by IOP Publishing.

Papers

Keynote lectures

012001

The imagined existence of "quantity" in psychology is a prerequisite for measurement. Any person (researcher or subject) may imagine homogeneous concepts as quantities, for example the (perceived) redness, loudness or heaviness of different kinds of objects or events. Conversely, heterogeneous concepts may be merged into imagined quantities such as intelligence or personality, which are both constructed from agreed-upon sets of theoretical concepts. Measurement in physics, psychophysics and psychology will be contemplated comparatively.

012002

The talk introduces the basics of Rasch models by systematically interpreting them in the conceptual and lexical framework of the International Vocabulary of Metrology, third edition (VIM3). An admittedly simple example of physical measurement highlights the analogies between physical transducers and tests, as they can be understood as measuring instruments of Rasch models and psychometrics in general.

From the talk natural scientists and engineers might learn something of Rasch models, as a specifically relevant case of social measurement, and social scientists might re-interpret something of their knowledge of measurement in the light of the current physical measurement models.
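
For orientation, the dichotomous Rasch model referred to here (and in several later papers in this volume) has the standard logistic form

$$ P(X_{ni} = 1) = \frac{e^{\theta_n - \delta_i}}{1 + e^{\theta_n - \delta_i}}, $$

where $\theta_n$ is the location (ability) of person $n$ and $\delta_i$ the difficulty of item $i$. This formula is standard background, not quoted from the abstract; in the analogy sketched in the talk, the test item plays the role of a transducer converting the latent difference $\theta_n - \delta_i$ into an observable response probability.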

012003

The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century.

At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic model (the Kolmogorov model).

In geometry, the mathematical construction of several non-Euclidean models of space preceded their application in physics, which came with the theory of relativity, by about one century. In physics the opposite situation took place: while the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s.

In this long time interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena, and they negatively interfered with each other because of the absence, for many decades, of a mathematical theory clearly delimiting their respective domains of application.

The result of this interference was the emergence of the so-called "paradoxes of quantum theory".

For several decades there have been many different attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years.

However, these attempts have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity.

Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements, which "read pre-existent properties" (the urn metaphor), and measurements consisting in reading "a response to an interaction" (the chameleon metaphor).

The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside classical probability, response-based measurements can give rise to non-classical statistics.

The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.

Foundations of measurement

012004

Within the representational theory of measurement, Stevens presented a classification of scales based on the set of empirical operations preserved by each type of scale. In this classification, the group structure of the admissible transformations of each scale type was exposed. In this paper, we present recent studies that focus the scale classification on the group structure of admissible transformations. The classification is then extended to other types of scales.

012005

Measurement can be defined on the basis of the notion of empirical relation, which appears as a primitive term in the representational theory. Here a language is introduced for dealing with empirical relations and a probabilistic logic is applied to statements concerning them.

012006

The Kemeny rule is one of the most deeply justified ways of solving the problem: it finds a linear order (the Kemeny ranking) of alternatives whose distance to the initial rankings (the input preference profile) is minimal. The approach can yield considerably more than one optimal solution. The multiple solutions (the output profile) can reflect intransitivity of the input profile. A favourable circumstance in dealing with an intransitive output profile is that the intransitive cycles are lexicographically ordered, which can help when revealing them algorithmically.
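
As background to the abstract above, a minimal brute-force sketch of Kemeny ranking in Python; the function names and the toy profile are ours, not the authors' algorithm:

```python
# A brute-force Kemeny sketch: enumerate all linear orders and keep those
# minimising the total Kendall tau distance to the input profile.
# Feasible only for a handful of alternatives (n! candidate orders).
from itertools import permutations

def kendall_tau_distance(a, b):
    """Number of pairwise disagreements between two rankings (lists)."""
    pos_a = {x: i for i, x in enumerate(a)}
    pos_b = {x: i for i, x in enumerate(b)}
    items = list(a)
    return sum(
        1
        for i in range(len(items))
        for j in range(i + 1, len(items))
        if (pos_a[items[i]] - pos_a[items[j]])
        * (pos_b[items[i]] - pos_b[items[j]]) < 0
    )

def kemeny_rankings(profile):
    """Return every optimal linear order (the output profile) and its score."""
    best, best_score = [], None
    for candidate in permutations(profile[0]):
        score = sum(kendall_tau_distance(candidate, r) for r in profile)
        if best_score is None or score < best_score:
            best, best_score = [list(candidate)], score
        elif score == best_score:
            best.append(list(candidate))
    return best, best_score

# A cyclic (Condorcet-style) profile: three optimal rankings are returned,
# and the multiplicity reflects the intransitivity of the input.
profile = [['a', 'b', 'c'], ['b', 'c', 'a'], ['c', 'a', 'b']]
print(kemeny_rankings(profile))  # three rankings, total distance 4 each
```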

012007

Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

012008

Measurement has long been an important element of epistemology in the physical sciences and natural philosophy. More recently, the psychological sciences have developed a variety of techniques that purport to be instances of measurement as well. However, it is not clear how the understanding of measurement invoked in psychological science applications accords with the understanding of measurement found in other scientific disciplines. A sharper focus on conceptual clarity and coherence across the psychological and physical sciences has the potential to add a great deal to efforts to improve such practices. In this paper, we argue that it is possible to formulate a philosophically coherent account of how measurement works in both the physical and the human sciences.

012009

Affective engineering uses mathematical models to convert the information obtained from persons' attitudes to physical elements into an ergonomic design. However, applications in the domain have in many cases not met measurement assumptions. This paper proposes a novel approach based on Rasch measurement theory to overcome the problem. The research demonstrates that if data fit the model, further variables can be added to a scale. An empirical study was designed to determine the range of compliance within which consumers could obtain an impression of a moisturizer cream when touching some product containers. Persons, variables and stimulus objects were parameterised independently on a linear continuum. The results showed that a calibrated scale preserves comparability while incorporating further variables.

Education in measurement

012010

The aim of the paper is to demonstrate a paradigm shift in shape, color and spectral measurements in industry, biology and medicine, as well as in measurement science, education and training. Laboratory applications will be supplemented and replaced by innovative in-field and point-of-care applications. The innovative functional modules are smartphones and/or smartpads supplemented by additional hardware apps and software apps. Specific examples are given for numerous practical applications concerning optodigital methods. The methodological classification distinguishes between different levels of combinations of hardware apps (hwapps) and software apps (swapps) with smartphones and/or smartpads. These methods are fundamental enablers for the transformation from conventional stationary working places in industry, biology, medicine, science, education and training towards innovative mobile working places with in-field and point-of-care characteristics, as well as mobile open online courses (MOOCs). The innovative approach opens enormous, so far untapped markets for measurement science and engineering. These working conditions will become very common owing to their convenience, reliability and affordability. A highly visible advantage of smartphones and smartpads is their huge installed base, their worldwide connectivity via Internet and cloud services, and the practical experience of their users. Young people are becoming the pioneers.

012011

This paper describes the current course X38KLS (Construction of Medical Systems) at the Czech Technical University in Prague, together with the proposed improvements and the differences between the new and the old approach. The changes are in a state of preparation and have not been fully implemented. The new course should take over from the old one in September 2013.

012012

This paper focuses on the innovation of the laboratory exercises in the course Distributed Systems and Computer Networks. The new exercises were introduced in November 2012 and replaced the older ones in order to reflect real-life applications.

Clinical measurements

012013

Lateral inhibition is described as an emergent property of the Delta-Notch signalling network. Two separate model representations of lateral inhibition are proposed for different purposes. One provides information about bioenergetics while the other has the capability to produce a physical representation. It is proposed that both can be used in further studies of the sensory pathways in the human connectome model of brain function.

012014

Metrology in regenerative medicine aims to develop traceable measurement technologies for characterizing cellular and macromolecule behaviour in regenerative medicine products and processes. One key component in regenerative medicine is the use of three-dimensional porous scaffolds to guide cells during the regeneration process. The regeneration of specific tissues guided by tissue-analogous substrates depends on diverse scaffold architectural properties that can be derived quantitatively from scaffold images. This paper discusses the results obtained with the multimodal NLO microscope recently realized in our laboratory in characterizing 3D tissue-engineered (TE) scaffolds colonized by human mesenchymal stem cells (hMSCs), focusing on the study of the three-dimensional metrological parameters.

012015

A method is proposed for recognizing respiration cycle lengths from the electrocardiographic (ECG) signal recorded with textile electrodes attached to a bed sheet. The method uses two features extracted from the ECG that are affected by respiration: respiratory sinus arrhythmia and the amplitude of the R-peaks. The proposed method was tested on one-hour-long recordings of ten healthy young adults. A relative mean absolute error of 5.6% was achieved when the algorithm was able to provide a result for approximately 40% of the time. 90% of the values were within 0.5 s and 97% within 1 s of the reference respiration value. In addition to the instantaneous respiration cycle lengths, the mean values over 1- and 5-minute epochs are also calculated. The effect of the ECG signal source is evaluated by also calculating the result from a simultaneously recorded reference ECG signal. The acquired respiration information can be used in the estimation of sleep quality and the detection of sleep disorders.
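
For illustration only, a minimal Python sketch of the two ECG-derived features named above, run on a synthetic signal; the signal model and detection thresholds are assumptions, not the authors' algorithm:

```python
# Synthetic demonstration of the two ECG-derived respiration features:
# R-R intervals (respiratory sinus arrhythmia) and R-peak amplitudes.
import numpy as np
from scipy.signal import find_peaks

fs = 250                               # assumed sampling rate, Hz
t = np.arange(60 * fs) / fs            # one minute of synthetic signal
resp_rate = 0.25                       # 15 breaths per minute

# Crude ECG stand-in: one narrow beat per second whose amplitude is
# modulated by respiration.
ecg = np.zeros_like(t)
for bt in np.arange(1.0, 59.0, 1.0):
    amp = 1.0 + 0.2 * np.sin(2 * np.pi * resp_rate * bt)
    ecg += amp * np.exp(-0.5 * ((t - bt) / 0.01) ** 2)

# R-peak detection; height/distance thresholds would be tuned on real data.
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
rr = np.diff(peaks) / fs               # R-R intervals (RSA feature)
r_amp = ecg[peaks]                     # R-peak amplitudes (amplitude feature)

# The respiratory oscillation appears in the R-amplitude series; its
# dominant period estimates the respiration cycle length.
spectrum = np.abs(np.fft.rfft(r_amp - r_amp.mean()))
freqs = np.fft.rfftfreq(len(r_amp), d=float(np.mean(rr)))
print(1.0 / freqs[spectrum.argmax()])  # close to the 4 s true cycle length
```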

012016

We designed a novel sensor specifically aimed at ex vivo measurements of white thrombus volume growth; a white thrombus is induced within an artificial micro-channel where hemostasis takes place, starting from whole blood under flow conditions. The advantage of the proposed methodology is that it identifies the time evolution of the thrombus volume by means of an original data-fusion methodology based on 2D optical and electrical impedance data processed simultaneously. By contrast, present state-of-the-art optical imaging methodologies allow thrombus volume estimation only at the end of the hemostatic process.

012017

Electromyography (EMG) is the gold-standard technique for the evaluation of muscle activity. The technique is used in biomechanics, sports medicine, neurology and rehabilitation therapy, and it records the electrical activity produced by skeletal muscles. Among the parameters measured with EMG are signal amplitude, duration of muscle contraction, muscle fatigue and maximum muscle power. Recently, a new measurement procedure, named Laser Doppler Myography (LDMi), has been proposed for the non-contact assessment of muscle activity by measuring the vibro-mechanical behaviour of the muscle.

The aim of this study is to present the LDMi technique and to evaluate its capacity to measure some characteristic features of the muscle. In this paper LDMi is compared with standard surface EMG (sEMG), which requires the application of sensors on the skin of each patient. sEMG and LDMi signals have been simultaneously acquired and processed to test correlations. Three parameters have been analyzed to compare the techniques: muscle activation timing, signal amplitude and muscle fatigue. LDMi appears to be a reliable and promising measurement technique, allowing measurements without contact with the patient's skin.

012018

Clinical thermometers are probably the most used measurement instruments in medical facilities (hospitals, clinics, etc.) all around the world. A good part of the physician's assessment of the patient's health status depends on the result of such a measurement. In this work, a system to assess the quality of non-contact clinical thermometers is developed and presented; the accuracy of the system is designed so that it is a useful tool in the instrument-verification phase and a basis for future automated calibration facilities.

Measurement of human behaviour

012019

Risk-aversion is a fundamental parameter determining how humans act when required to operate in situations of risk. Its general applicability has been discussed in a companion presentation, and this paper examines methods that have been used in the past to measure it and their attendant problems. It needs to be borne in mind that risk-aversion varies with the size of the possible loss, growing strongly as the possible loss becomes comparable with the decision maker's assets. Hence measuring risk-aversion when the potential loss or gain is small will produce values close to the risk-neutral value of zero, irrespective of who the decision maker is. It will also be shown how the generally accepted practice of basing a measurement on the results of a three-term Taylor series will estimate a limiting value, minimum or maximum, rather than the value utilised in the decision. A solution is to match the correct utility function to the results instead.

012020

A probabilistic model was applied to the problem of measuring pre-literacy in young children. First, semiotic philosophy and contemporary cognition research were conceptually integrated to establish theoretical foundations for rating 14 characteristics of children's drawings and narratives (N = 120). The ratings were then transformed with a Rasch model, which estimated linear item parameter values that accounted for 79 percent of rater variance. Principal Components Analysis of the item residual matrix confirmed that the variance remaining after item calibration was largely unsystematic. Validation analyses found positive correlations between the semiotic measures and preschool literacy outcomes. Practical implications of a semiotics dimension for preschool practice are discussed.

012021

Vulnerable travellers experience various problems in the transport environment. These may reduce public travel confidence and consequently lead to decreased mobility. A goal of our research is to find out how to improve the accessibility of railway travelling, especially for persons with functional limitations. By reducing barriers, the ability to travel would be improved, consequently allowing more flexible travel behaviours. In order to develop a model and a method of measurement for accessibility, we (a) constructed a reference group of representative 'typical older persons' (65–85 years) from questionnaire data, and (b) developed an accessibility measure for persons with functional limitations. In this measure, barriers have different weights for different persons depending on their functional ability and travel behaviour. This gives the probability of facing a certain barrier when travelling to a certain destination; that is, a measure of accessibility for the individual. The more weight placed on a certain barrier, the less probable it is that the particular journey will take place. These weights will be obtained in forthcoming research on the perception of a set of various travel scenarios representing barriers.

012022

This article reviews existing study habit measurement instruments and discusses their drawbacks in the light of new evidence from neuroscience on the workings of the brain. It is suggested that, in addition to traditional frequency-based measures of past behaviour, the predictive accuracy of study habit measurement instruments could be improved by including measures of habit strength that take into account behaviour automaticity and efficacy, such as the Self-Report Habit Index (SRHI) developed by [1]. The SRHI has shown high reliability and internal validity in a wide range of contexts, but its applicability and validity in the context of learning and higher education, as an enhancement to study habit measurement instruments, have yet to be tested.

012023

Numerous subjects have trouble understanding various concepts connected to statistical problems. Research reports how students' ability to solve problems (including statistical problems) can be influenced by exhibiting proofs. In this work we aim to contrive an original and easy instrument able to assess statistical reasoning on uncertainty and on association with regard to two different forms of proof presentation: pictorial–graphical and verbal–numerical. We conceived eleven pairs of simple problems in the verbal–numerical and pictorial–graphical forms and presented the proofs to 47 undergraduate students. The purpose of our work was to evaluate the goodness and reliability of these problems in the assessment of statistical reasoning. Each subject solved each pair of proofs in the verbal–numerical and in the pictorial–graphical form, in different problem presentation orders. Data analyses highlighted that six of the eleven pairs of problems appear useful and adequate for estimating statistical reasoning on uncertainty, and that there is no effect due to the order of presentation in the verbal–numerical and pictorial–graphical forms.

Human-related measurements I

012024

The public and researchers in psychology and the social sciences are largely unaware of the huge resources invested in metrology and standards in science and commerce, for understandable reasons, but with unfortunate consequences. Measurement quality varies widely in fields lacking uniform standards, making it impossible to coordinate local behaviours and decisions in tune with individually observed instrument readings. However, recent developments in reading measurement have effectively instituted metrological traceability methods within elementary and secondary English and Spanish language reading education in the U.S., Canada, Mexico, and Australia. Given established patterns in the history of science, it may be reasonable to expect that widespread routine reproduction of controlled effects expressed in uniform units in the social sciences may lead to significant developments in theory and practice.

012025

The central importance of reading ability in learning makes it the natural place to start in formative and summative assessments in education. The Lexile Framework for Reading constitutes a commercial metrological traceability network linking books, test results, instructional materials, and students in elementary and secondary English and Spanish language reading education in the U.S., Canada, Mexico, and Australia.

012026

The concatenation of units of length is widely viewed as the paradigmatic expression of fundamental measurement. Survey, assessment, and test scores in educational and psychological measurement are often interpreted in ways that assume a concatenation of units to have been established, even though these assumptions are rarely stated or tested. A concatenation model for measurement is shown to be equivalent to a Rasch model: any two units of measurement placed end to end must together be of the same length as either one of them added to itself. This additive principle and a concatenation model of measurement together serve as a heuristic guide for organizing two experimental approaches to calibrating instruments for measuring length. The capacity to reproduce the unit of measurement from theory with no need for repeated empirical calibration experiments, as in the geometrical bisection of the line and the resultant halving of the length measure, is highlighted as essential to demonstrating a thorough understanding of the construct.

012027

Visual acuity, a forced-choice psychophysical measure of visual spatial resolution, is the sine qua non of clinical visual impairment testing in ophthalmology and optometry patients with visual system disorders ranging from refractive error to retinal, optic nerve, or central visual system pathology. Visual acuity measures are standardized against a norm, but it is well known that visual acuity depends on a variety of stimulus parameters, including contrast and exposure duration. This paper asks if it is possible to estimate a single global visual state measure from visual acuity measures as a function of stimulus parameters that can represent the patient's overall visual health state with a single variable. Psychophysical theory (at the sensory level) and psychometric theory (at the decision level) are merged to identify the conditions that must be satisfied to derive a global visual state measure from parameterised visual acuity measures. A global visual state measurement model is developed and tested with forced-choice visual acuity measures from 116 subjects with no visual impairments and 560 subjects with uncorrected refractive error. The results are in agreement with the expectations of the model.

012028

Measurement is analyzed as a process of task-oriented actions with material and model objects. It is emphasized that model elements of measurement are of crucial importance in the formulation of measurement tasks and implementation of measurement experiments. It is shown that there is a methodological gap between the rows of material and model elements of measurement, which belong to fundamentally incompatible systems. This gap determines an adaptive nature of measurement, during which the attribute of an object being measured is specified more accurately. The need for constructing equivalence systems aimed at overcoming the gap is substantiated. The measurement traceability system fulfils the role of an internal equivalence system for measurement as a cognitive procedure. The application system in the interests of which the measurements are made fulfils the role of an external equivalence system.

Human-related measurements II

012029

Spectrophotometric analysers of food, being instruments for determining the composition of food products and ingredients, are today of growing importance for the food industry, as well as for food distributors and consumers. Their metrological performance significantly depends on the numerical performance of the available means for spectrophotometric data processing, in particular the means for calibration of the analysers. In this paper, a new algorithm for this purpose is proposed, viz. an algorithm using principal component analysis (PCA). It is almost as efficient as PLS-based calibration algorithms, but much simpler.
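
For orientation, a minimal sketch of PCA-based calibration of the kind described (principal component regression) on synthetic spectra; the data, component count and function names are assumptions, not the authors' algorithm:

```python
# Principal component regression on synthetic single-analyte spectra.
import numpy as np

def pcr_fit(X, y, n_components):
    """Regress y on the first n_components principal components of X."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)  # principal directions
    P = Vt[:n_components].T                            # loadings
    b = np.linalg.lstsq(Xc @ P, y - y_mean, rcond=None)[0]
    return x_mean, y_mean, P, b

def pcr_predict(model, X):
    x_mean, y_mean, P, b = model
    return (X - x_mean) @ P @ b + y_mean

# Synthetic calibration set: 50 spectra of 200 points, one analyte whose
# concentration scales a Gaussian absorption band, plus noise.
rng = np.random.default_rng(0)
y = rng.uniform(0.0, 1.0, 50)
band = np.exp(-0.5 * ((np.arange(200) - 100.0) / 15.0) ** 2)
X = np.outer(y, band) + 0.01 * rng.standard_normal((50, 200))

model = pcr_fit(X, y, n_components=3)
print(np.abs(pcr_predict(model, X) - y).max())  # small calibration residual
```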

012030

This paper concerns the detection of solid particles suspended in conductive media by an impedance technique. The technique is based on changes in the impedance measured between two electrodes placed across a given volume of conducting medium. The paper presents a methodology for modelling and investigating the feasibility of such a technique for particle detection by 2D finite element (FE) field modelling, based on the modelling and computation of the electric field distribution between the above electrodes. It establishes the modelling approach and the complexity involved, and justifies the need for modelling in 3D to incorporate some of the effects that cannot be taken into account in 2D models. It reports on the modelling investigation for a specific case, the detection by the impedance technique of cholesterol particles suspended in human blood, and points to a possible instrument for non-invasive measurement of blood cholesterol level.

012031

Automatic face recognition is a biometric technique particularly appreciated in security applications. In fact, face recognition offers the opportunity to operate at a low invasive level without the collaboration of the subjects under test, with face images gathered either from surveillance systems or from specific cameras located at strategic points. The automatic recognition algorithms perform a measurement, on the face images, of a set of specific characteristics of the subject and provide a recognition decision based on the measurement results. Unfortunately, several quantities may influence the measurement of the face geometry, such as its orientation, the lighting conditions, the expression and so on, affecting the recognition rate. On the other hand, human recognition of faces is a very robust process, far less influenced by the surrounding conditions. For this reason it may be interesting to insert perceptual aspects into an automatic facial-based recognition algorithm to improve its robustness. This paper presents a first study in this direction, investigating the correlation between the results of a perception experiment and the facial geometry, estimated by means of the positions of a set of landmark ('repère') points.

012032

Research on tactile sensitivity has been conducted since the last century, and many devices have been proposed to study this sense in detail through experimental tests. The sense of touch is essential in the everyday life of human beings, but it can also play a fundamental role in the assessment of some neurological disabilities and pathologies. In fact, the level of tactile perception can provide information on the health state of the nervous system. In this paper, the authors propose the design and development of a novel test apparatus, named DITA (Dynamic Investigation Test-rig on hAptics), aimed at providing the measurement of tactile sensitivity through the determination of the Just Noticeable Difference (JND) curve of a subject. The paper reports the solutions adopted for the system design and the results obtained in a set of experiments carried out on volunteers.

Uncertainty evaluation

012033

In this paper we give the results of four methods of calculating the uncertainty associated with a mass calibration problem: three based on different implementations of the general methodology described by the Guide to the Expression of Uncertainty in Measurement (the first-order and second-order law of propagation of uncertainty and the Monte Carlo method), and the fourth based on a Bayesian formulation. The nonlinearities present in the model for the calibration problem mean that the first-order approach can be an unreliable method for evaluating uncertainties, relative to the other three approaches.
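
For illustration, a minimal Monte Carlo propagation sketch in the style of the GUM methodology, applied to an assumed toy nonlinear model rather than the paper's mass-calibration model:

```python
# Monte Carlo uncertainty propagation for a toy nonlinear model.
import numpy as np

rng = np.random.default_rng(42)
M = 200_000  # number of Monte Carlo trials

# Assumed input quantities: x1 ~ N(10, 0.1^2), x2 ~ N(2, 0.05^2).
x1 = rng.normal(10.0, 0.1, M)
x2 = rng.normal(2.0, 0.05, M)

# Nonlinear measurement model y = f(x1, x2); curvature is what makes the
# first-order law of propagation of uncertainty potentially unreliable.
y = x1 * np.exp(0.1 * x2)

print("estimate:", y.mean())
print("standard uncertainty:", y.std(ddof=1))
print("95% coverage interval:", np.percentile(y, [2.5, 97.5]))
```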

012034

When flat plates are submitted to wind tunnel tests and the upstream Mach number is greater than 1.0, shock and expansion waves arise. This study experimentally evaluates the airflow field parameters ahead of and behind the shock region and compares the data obtained in the tests with the expected theoretical results. For the case of expansion waves, only theoretical values are supplied in this preliminary part of the study. The drag and lift forces acting on the model being tested are also estimated from the pressure values measured on the plate surface. Data reduction includes evaluation of the uncertainties associated with the measured quantities, as well as their propagation to the output quantities, employing the Monte Carlo method. Numerical simulation of the phenomena using Computational Fluid Dynamics is also performed. Data obtained analytically, experimentally and numerically are compared.
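
For background, the theoretical comparison mentioned typically rests on the normal-shock relations; for example, the static pressure rise across a normal shock at upstream Mach number $M_1$ is

$$ \frac{p_2}{p_1} = 1 + \frac{2\gamma}{\gamma + 1}\left(M_1^2 - 1\right), $$

with $\gamma \approx 1.4$ for air. This is a standard gas-dynamics result; the paper's exact working is not given in the abstract.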

012035

This paper describes a proposal for applying the GUM Type A uncertainty evaluation to measurements with autocorrelated observations. The first step is the identification and removal of the regularly variable components from the raw sample data. Then formulas for the standard deviation of the sample and of the mean value are expressed with the use of correction coefficients or the so-called "effective number" of observations. These quantities depend on the number of observations and on the sample autocorrelation function, and allow one to calculate the expanded uncertainty according to the GUM recommendations. A method for estimating the autocorrelation function from the sample data is also given. The considerations are illustrated by examples.
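
One common form of the "effective number" of observations mentioned here (a standard result for autocorrelated samples; whether the paper uses exactly this form is not stated) replaces $n$ in the variance of the mean, $u^2(\bar{x}) = s^2/n_{\mathrm{eff}}$, with

$$ n_{\mathrm{eff}} = \frac{n}{1 + 2\sum_{k=1}^{n-1}\left(1 - \frac{k}{n}\right)\hat{\rho}(k)}, $$

where $\hat{\rho}(k)$ is the estimated autocorrelation function of the sample; for uncorrelated observations $n_{\mathrm{eff}} = n$ and the usual GUM Type A formula is recovered.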

012036

Since the beginning of the history of modern measurement science, experimenters have faced the problem of dealing with systematic effects, as distinct from, and opposed to, random effects. Two main schools of thinking stemmed from the empirical and theoretical exploration of the problem: one dictating that the two species should be kept and reported separately, the other indicating ways to combine the two species into a single numerical value for the total uncertainty (often indicated as 'error'). The second way of thinking was adopted by the GUM, which generally assumes that the expected value of the systematic effects is null, by requiring that, for all systematic effects taken into account in the model, corresponding 'corrections' are applied to the measured values before the uncertainty analysis is performed. On the other hand, concerning the value of the measurand intended to be the object of measurement, classical statistics calls it the 'true value', admitting that a value should exist objectively (e.g. the value of a fundamental constant), and that any experimental operation aims at obtaining an ideally exact measure of it. However, owing to the uncertainty affecting every measurement process, this goal can be attained only approximately, in the sense that nobody can ever know exactly how much any measured value differs from the true value. The paper discusses the credibility of the numerical value attributed to an estimated correction, compared with the credibility of the estimate of the location of the true value, concluding that the true value of a correction should be considered as imprecisely evaluable as the true value of any 'input quantity', and of the measurand itself. From this conclusion, one should derive that the distinction between 'input quantities' and 'corrections' is not justified and not useful.

012037

The Statistical Dynamic Specifications Method (SDSM) relies on in-line measurements to manage dimensional specifications, target and tolerance. SDSM is the cornerstone of an innovative assembling technique based on the Statistical Feed-Forward Control Model (SFFCM) for processes in which the stacked dimensional variation of the assembled components reaches the order of the total allowed tolerance. Since the magnitude of such variation might jeopardize the process capability, it is of interest to study the inclusion of the measurement uncertainty when applying SDSM to a target subprocess. By simulating the production of assemblies made of parts having high dimensional variation, a set of experiments was designed to compare the impact of different levels of measurement uncertainty on the capability of a target subprocess. Simulation results showed that, depending on the magnitude of the uncertainty, the capability index c_p of the target subprocess increases between 2.5% and 34.5% (from 1.27 to 1.82) as a direct consequence of adjusting the respective tolerance. Thus, the inclusion of the measurement uncertainty in the proposed SDSM has a significant impact on its practical realization, since a decrement in c_p implies an increment of the scrap percentage of the target subprocess.

012038

The paper discusses the computation of the worst-case uncertainty (WCU) in common measurement problems. The usefulness of computing the WCU besides the standard uncertainty is illustrated. A set of equations to compute the WCU in almost all practical situations is presented, and the application of the equations to real-world cases is shown.
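
For a linearised model $y = f(x_1, \ldots, x_N)$ with inputs confined to intervals $x_i \pm a_i$, the worst-case uncertainty takes the familiar bound form (a textbook expression, given here for orientation; the paper's full equation set also covers less simple situations):

$$ \mathrm{WCU} = \sum_{i=1}^{N} \left| \frac{\partial f}{\partial x_i} \right| a_i. $$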

Modelling in measurement

012039

We describe a model selection methodology for partial differential equation (PDE) based models and show the results of applying it to a test problem derived from a model of the laser flash thermal diffusivity measurement technique. A methodology for comparing and choosing simplified models is of benefit to the model development process in metrology. It is assumed that the computational aim is not only to solve the model to obtain the results that the metrologist requires, but to ensure that the model is no more complex than necessary to achieve this and that results can be obtained in a reasonable time using the available computing resources. The advantage of the proposed method is that it avoids the need to solve directly the underlying complex model. We present the results of comparisons of four models of the laser flash problem and identify the further work needed to extend the approach to a wider range of problems and to identify suitable measures for comparing residuals.

012040

The paper emphasizes the importance that fundamental concepts in measurement science are defined according to a structured strategy, which provides both a general, qualitative characterization and a specific, type-related, quantitative definition. As a significant case, the concept 'sensitivity' is discussed and a definition for it proposed.

012041

Valid inference drawn from the analysis of experimental results needs scientific grounds, whereas conclusions based on statistical significance tests or hypothesis testing may be problematic, especially when dealing with a multiplicity of tested hypotheses, as in experiments performed on bio-molecules. The present paper focuses on the problem of the false discovery rate, aiming to elicit the application of sound criteria for the rejection/acceptance of hypotheses and related methods of uncertainty characterisation.

012042

The paper illustrates and discusses some problems that should be taken into account should the proposed use of fundamental constants in the definition of the SI measurement units be implemented: (a) more base units becoming multi-dimensional, instead of the present problems in this respect being fixed; (b) the multi-dimensionality in the definitions; (c) the use of CODATA adjusted values of the constants for this specific purpose; (d) formal issues in stipulating algebraic expressions of the definitions, and in the rounding or truncation of the numerical values in their transformation from uncertain to exact values; (e) formal issues with the use of the integer number N_A; (f) limitations that can arise from the stipulation of the values of several constants on the ability of the CODATA Task Group to continue performing meaningful least-squares adjustments of the fundamental constants in the future, taking into account future data.

012043

We describe properties of measurement quantities and measurement processes verbally, using accepted terms, some of which are sanctioned by international standards and guides. However, some of these terms are used inconsistently compared with their definitions in base sciences such as mathematics, signal and system theory, estimation and optimisation theory, stochastics and statistics. This is the case for several terms in metrology, such as stability, drift and stationarity. In this paper we show systematic relations between these and similar terms and discuss the adequacy of their application in practice. We have to admit that we often apply the terms stable/unstable incorrectly and unconsciously when describing properties of measurement results, owing to common colloquial habits.

Advanced instrumentation

012045

This paper presents the network management and operation of a system to monitor landslide disasters on mountain and hill slopes. Natural disasters can easily damage a measuring system, so the measuring system must be flexible and robust. The measuring network proposed in this paper is a telemetry system consisting of a host system (HS) and a local sensing nodes network system (LSNNS). The LSNNS operates autonomously and is sometimes controlled by commands from the HS. The HS collects data and information on landslide disasters from the LSNNS, and controls the LSNNS remotely. HS and LSNNS communicate using a "cloud" system. The dual communication is very effective and convenient for managing the operation of the network system.

012046

The article presents a systematic analysis of the conformity assessment of prepackaged products that could help in practice when applying the general provisions of the Directives. Manufacturers have to choose the method of conformity assessment that best suits the needs of their firm, selecting the optimal packing structure, methods and tools based on the recommendations of the article. Procedures for the maintenance of measuring instruments (calibration and verification) depend on the composition of the packing line and its level of automation; they should be chosen only after accurate identification of the packaging line. When choosing the measuring instruments for packing, manufacturers must ensure the traceability of the measuring instruments used in the weighing process, properly using their calibration and verification results. Measurement results demonstrated that the procedures mentioned below can be applied for the evaluation of the average mass of prepackages. The statistical analysis procedure is more accurate than the random sample procedure. It was shown that the accuracy (repeatability and systematic error) of measurement results for sticky products is more than two times worse than for powdery products (especially at low mass); manufacturers should pay attention to this fact when adjusting a packing line for sticky products.

012047

In this article we present a novel approach for the characterization of cold appliances, in particular refrigerators based on the standard vapour compression cycle with a reciprocating on/off compressor. The test procedure is based on a virtual instrument that performs both the stimulus and the data acquisition on the device under test. Acquired data are processed to fit a semi-empirical model based on the energy balances between the thermal and electrical subsystems and the heat exchanged with the environment. This approach results in a simple method to calculate useful parameters of the refrigerator, such as energetic performance, cooling effect and limit values of thermal loads. The test procedure requires only a few temperatures and the electric power consumption to be monitored, resulting in a low impact on the refrigerator. Preliminary tests showed a good estimation of the parameters and prediction of the energy consumption and heat extraction capacity of the refrigerator under test.

012048

This paper presents a study carried out with the commonly used experimental techniques of ballistic pressure measurement. The comparison criteria were the peak chamber pressure and its standard deviation for specific weapon/ammunition system configurations. It is impossible to determine exactly how precise the crusher, direct or conformal transducer methods are, as there is no way to know exactly what the actual pressure is; nevertheless, the combined use of these measuring techniques can improve accuracy. Furthermore, particular attention has been devoted to the problem of calibration. Calibration of crusher gauges and piezoelectric transducers is paramount and an essential task for a correct determination of the pressure inside a weapon. This topic has not been completely addressed yet and still requires further investigation. In this work, state-of-the-art calibration methods are presented together with their specific aspects. Many solutions have been developed to satisfy this demand; nevertheless, current systems do not cover the whole range of needs, calling for further development effort. Research being carried out for the development of suitable practical calibration methods is presented. In particular, the behaviour of copper crushers under different high strain rates is investigated by means of the Split Hopkinson Pressure Bar (SHPB) technique. The Johnson-Cook model was employed as a suitable model for the numerical study using an FEM code.

Cell-related measurements

012049

The paper presents a novel methodology to measure fibril formation in protein solutions. We designed a bench consisting of a sensor with interdigitated electrodes, a PDMS hermetic reservoir and an impedance meter automatically driven by a computer. The impedance data are interpolated with a lumped-element model, and their change over time can provide information on the aggregation process. Encouraging results have been obtained by testing the methodology on K-casein, a milk protein, with and without the addition of a drug inhibiting the aggregation. The amount of sample needed to perform this measurement is far lower than the amount needed by fluorescence analysis.

012050

The elasticity of a cell gives information about its pathological state and metastatic potential. The aim of this paper is to study the AFM force spectroscopy technique with the future goal of realizing a reference method for accurate elastic modulus measurement in the elasticity range of living cells. This biological range has not yet been explored with a metrological approach. Practical hints are given for the realization of a Sylgard elasticity scale. Systematic effects of the sample curing thickness and of the nanoindenter geometry on the measured elastic modulus have been found. AFM measurement reproducibility better than 20% is obtained over the entire investigated elastic modulus scale of 10^1–10^4 kPa.
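
For orientation, AFM force-spectroscopy estimates of the elastic modulus with a spherical nanoindenter are commonly based on the Hertz contact model (a standard model for a rigid sphere on a soft half-space; the abstract does not state which contact model the authors adopt):

$$ F = \frac{4}{3}\,\frac{E}{1-\nu^2}\,\sqrt{R}\;\delta^{3/2}, $$

where $F$ is the applied force, $\delta$ the indentation depth, $R$ the indenter radius, and $E$ and $\nu$ the Young's modulus and Poisson ratio of the sample.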

012051

Cell counting is a fundamental procedure in living cell culture-based experiments and in protocols where quantification of the cell number is required. The number of cells is one of the parameters necessary to investigate several cell culture features that need to be monitored as a function of time, such as cell viability, proliferation, growth, fitness and metabolism. The aim of this paper is to contribute to declaring a comprehensive uncertainty budget for cell counting through metabolic assays, according to the EURACHEM/CITAC Guide 'Quantifying Uncertainty in Analytical Measurement'.
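
The EURACHEM/CITAC budget referred to combines the standard uncertainties of the input quantities of a measurement model $y = f(x_1, \ldots, x_N)$ in the usual root-sum-of-squares form for uncorrelated inputs (general GUM/EURACHEM background, not a formula quoted from the paper):

$$ u_c(y) = \sqrt{\sum_{i=1}^{N}\left(\frac{\partial f}{\partial x_i}\right)^2 u^2(x_i)}, \qquad U = k\,u_c(y), $$

with coverage factor $k = 2$ for approximately 95% coverage.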

Posters

012052

A utility function with risk-aversion as its sole parameter is developed and used to examine the well-known psychological phenomenon whereby risk-averse people adopt behavioural strategies that are extreme and apparently highly risky. The pioneering work of the psychologist John W. Atkinson is revisited, and utility theory is used to extend his mathematical model. His explanation of the psychology involved is improved by regarding risk-aversion not as a discrete variable with three possible states (risk averse, risk neutral and risk confident), but as continuous and covering a large range. A probability distribution, the "motivational density", is derived to describe the process of selecting tasks of different degrees of difficulty. An assessment is then made of practicable methods for measuring risk-aversion.
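
A frequently used single-parameter family of the kind described is the constant-absolute-risk-aversion (exponential) utility, shown here for orientation only; the abstract does not specify the author's exact function:

$$ U(x) = \frac{1 - e^{-\gamma x}}{\gamma}, $$

where $\gamma > 0$ corresponds to risk aversion, $\gamma < 0$ to risk confidence, and the risk-neutral case is recovered in the limit $\gamma \to 0$, where $U(x) \to x$.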

012053

This article deals with the wide-frequency-range behaviour of DC-tolerant current transformers that are usually used in modern static energy meters. In this application, current transformers must comply with European and international standards regarding their accuracy and DC tolerance. Therefore, linear DC-tolerant current transformers and double-core current transformers are used in this field. More details about the problems of these particular types of transformers can be found in our previous works. Although these transformers are designed mainly for the power distribution network frequency (50/60 Hz), it is interesting to understand their behaviour over a wider frequency range. Based on this knowledge, new generations of energy meters capable of measuring the quality of electric energy will be produced. This solution brings better measurement of the consumption of nonlinear loads, and of non-sinusoidal voltage and current sources such as solar cells or fuel cells. The determination of actual power consumption in such energy meters uses the particular harmonic components of current and voltage. We measured the phase and ratio errors, which are the most important parameters of current transformers, to characterize several samples of current transformers of both types.
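
The harmonic-by-harmonic determination of active power mentioned at the end is, in the standard formulation (background, not quoted from the paper):

$$ P = \sum_{k} U_k I_k \cos\varphi_k, $$

where $U_k$ and $I_k$ are the RMS values of the $k$-th voltage and current harmonics and $\varphi_k$ is the phase shift between them; accurate values of $U_k$, $I_k$ and $\varphi_k$ are exactly what the transformers' ratio and phase errors limit.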

012054

Calibration curves and dynamic characteristics are widely used as metrological models of the properties of measuring instruments, but the initial concepts and formal representations have not been clearly formulated so far. This paper focuses on the construction of a formal framework which should enable the introduction and estimation of adequacy as a characteristic of model quality. A formal system is constructed that is similar to the formalization of the representational theory. Thus, an initial model for a calibration curve is a complex {U, C0, Φ}, containing a set of dependences U, a space of continuous monotonic functions C0, and a monotonic homomorphism Φ: U → C0. Construction of a calibration curve starts from an initial table of experimental data, and the transfer to the analytical form of the calibration curve is analysed. It is demonstrated that adequacy parameters should be expressed by functionals of both measurement accuracy and frequency range. The analogy between measurement and the modelling of metrological characteristics should be extended up to the construction of a system ensuring the traceability of calibration curves and dynamic characteristics. Such a system can be created using the experience gained in constructing the system ensuring measurement traceability.

012055

This paper describes the conception and development of an optical system applied to suspension bridge structural monitoring, aiming at real-time and long-distance measurement of dynamic three-dimensional displacement, namely in the central section of the main span. The main innovative issues related to this optical approach are described, and a comparison with other optical and non-optical measurement systems is performed. Moreover, a computational simulator tool developed for the optical system design and for validation of the implemented image processing and calculation algorithms is also presented.

012056

In planetary exploration space missions, motion measurement of a vehicle on the surface of a planet is a very important task. In this work, a non-linear vision-based algorithm for ego-motion measurement is described and calibrated using telephoto lenses. Several motion types, including displacement, rotation and their combination, are considered, and the evaluated uncertainties are compared, pointing out the strengths and weaknesses of employing telephoto lenses for motion measurement.

012057

A better understanding of how to characterise human response is essential to improved person-centred care and other situations where human factors are crucial. Challenges to introducing classical metrological concepts such as measurement uncertainty and traceability when characterising Man as a Measurement Instrument include the failure of many statistical tools when applied to ordinal measurement scales and a lack of metrological references in, for instance, healthcare. The present work attempts to link metrological and psychometric (Rasch) characterisation of Man as a Measurement Instrument in a study of elementary tasks, such as counting dots, where one knows independently the expected value because the measurement object (collection of dots) is prepared in advance. The analysis is compared and contrasted with recent approaches to this problem by others, for instance using signal error fidelity.

012058

The paper presents a method for evaluating the accuracy of the indications of direct-current Watt-hour meters in a purpose-designed and constructed measuring system. The measuring system is composed of two multi-function calibrators and a specialised high-voltage attachment dedicated to this system, which makes it possible to generate direct voltages in the required range up to 4 kV with suitably high precision. The authors describe in detail the particular elements of the measuring system together with the results of its calibration, and show the results of experiments carried out on a large and representative number of LE3000plus Watt-hour meters.

012059

The proposed exercise is focused on the measurement of temperature using a thermocouple. Students acquire theoretical knowledge of the Seebeck effect and of the design and application of thermocouples, including the differences between their various types. The students measure the voltage at the thermocouple by various methods: directly, with a compensation box, using operational amplifiers, and with a USB module. The exercise also explains general principles of low-voltage measurement, error compensation and uncertainty evaluation.

012060

The method of fast impedance spectroscopy of technical objects with high impedance (|Z_x| > 1 GΩ) is evaluated in this paper. An object is excited with a signal generated by a digital-to-analog converter (DAC) located on a U2531A DAQ module. Response signals proportional to the current flowing through and the voltage across the measured object are sampled by analog-to-digital converters (ADCs) in the DAQ module. The object impedance spectrum is obtained with the use of the continuous Fourier transform on the basis of the acquired signals. Different excitation signals (square, triangle, sawtooth and sinc) were compared in order to estimate the accuracy of the impedance spectrum evaluation. The method is evaluated by means of simulation and a practical experiment in a measuring system, on an exemplary object in the form of a multielement two-terminal RC network modelling an anticorrosion coating impedance.
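
A minimal sketch of the transform-ratio idea behind such measurements, on simulated data; the component values, sampling rate and energy-mask threshold are assumptions, and this is not the paper's DAQ code:

```python
# Transform-ratio impedance estimation on a simulated two-terminal object.
import numpy as np

fs = 100_000                               # assumed sampling rate, Hz
n = 2 ** 14
t = np.arange(n) / fs
f = np.fft.rfftfreq(n, 1 / fs)

# Assumed object: R1 in series with a parallel R2-C branch.
R1, R2, C = 1e3, 1e4, 1e-7
Z_true = R1 + R2 / (1 + 2j * np.pi * f * R2 * C)

# Square-wave voltage excitation; the current response is synthesised in
# the frequency domain for this illustration.
v = np.sign(np.sin(2 * np.pi * 200 * t))
V = np.fft.rfft(v)
i = np.fft.irfft(V / Z_true, n=n)

# Impedance spectrum: ratio of the transforms, evaluated only at bins
# where the excitation carries significant energy.
Vm, Im = np.fft.rfft(v), np.fft.rfft(i)
mask = np.abs(Vm) > 0.01 * np.abs(Vm).max()
Z_est = Vm[mask] / Im[mask]
print(np.allclose(Z_est, Z_true[mask]))    # True: spectrum recovered
```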

012061

In the representational approach, measurement is defined on the basis of the primitive (in this context) notion of empirical relation. To account for measurement uncertainty, it is quite natural to consider probabilistic (or, more generally, non-deterministic) empirical relations. Here the notion of probabilistic relation is investigated. After a brief historical outline, we point out that probabilistic relations appear in everyday measurement situations. To demonstrate this, a simple illustrative experiment has been developed, whose results are presented and discussed.

012062

In humans, sodium is essential for the regulation of blood volume and pressure. During hemodialysis, sodium measurement is important to protect the patient from hypo- or hyper-natremia. Usually, sodium measurement is performed with laboratory equipment which is typically expensive and requires manual intervention. We propose a new method based on conductivity measurement after treatment of the dialysate solution with an ion-exchange resin. To test this method, we performed in vitro experiments. We prepared 40 ml sodium chloride (NaCl) samples at 280, 140, 70, 35, 17.5, 8.75 and 4.375 mEq/l, and some "mixed samples", i.e. with potassium chloride (KCl) added at different concentrations (4.375–17.5 mEq/l), to simulate the confounding factors in a conductivity-based sodium measurement. We measured the conductivity of all samples. Afterwards, each sample was treated for 1 min with 1 g of Dowex G-26 resin, and the conductivity was measured again. On average, the difference ε in conductivity between mixed samples and the corresponding pure NaCl samples (at the same NaCl concentration) was 20.9%; after treatment with the resin, it was only 9.9%. We conclude that ion-exchange resin treatment coupled with conductivity measurement may be a possible simple approach for continuous and automatic sodium measurement during hemodialysis.

012063

In 2010, the World Health Organization estimated that there were about 285 million people in the world with disabling eyesight loss (246 million visually impaired (VI) and 39 million totally blind). For such users, hits during mobility tasks are a source of major concern and can reduce their quality of life. The white cane is the primary device used by the majority of blind or VI users to explore and possibly avoid obstacles; it can monitor only the ground (< 1 m) and it does not provide protection for the legs, the trunk and the head. In this paper, the authors propose a novel stand-alone Electronic Travel Aid (ETA) device for obstacle detection based on multi-sensing (by 4 ultrasonic transducers) and a microcontroller. Portability, simplicity, reduced dimensions and cost are among the major pros of the reported system, which can detect and localize (angular position and distance from the user) obstacles possibly present in the volume in front of the user and on the ground in front of the user.

012064

Optical coherence tomography (OCT) has proved to be an efficient technique for imaging in vivo tissues, an optical biopsy with important perspectives as a diagnostic tool for the quantitative characterization of tissue structures. Despite its established clinical use, there is no international standard addressing the specific requirements for the basic safety and essential performance of OCT devices for biomedical imaging. The present work studies the parameters necessary for the conformity assessment of optoelectronic equipment used in biomedical applications, such as lasers, Intense Pulsed Light (IPL) and OCT, in order to identify the potential requirements to be considered should a particular standard for OCT equipment be developed in the future. In addition to some of the particular requirements in the standards for laser and IPL equipment, which are also applicable to the metrological reliability analysis of OCT equipment, specific parameters for the evaluation of OCT have been identified, considering its biomedical application. For each parameter identified, it is recommended either that information be given in the accompanying documents or that the parameter be measured (or both). Among the parameters for which measurement, including uncertainty evaluation, is recommended, the following are highlighted: optical radiation output, axial and transverse resolution, pulse duration and interval, and beam divergence.

012065

In this work the authors propose a novel approach to obtain the electrocardiogram at the forearm using non-contact sensing. This new solution should be at the same time portable, ergonomic and robust, enabling its use in different sets of applications. A system of four electrodes was used in an adjustable sleeve to be wrapped around the forearm. No additional reference electrodes were used on other body parts. In order to increase the sensitivity of the system, a harmonium-like approach was used in the design of the electrodes. The prototype was then compared with a similar system with a flat conformation. The developed prototype enabled the acquisition of an ECG signal at the forearm, and the inclusion of the harmonium-like electrode conformation resulted in a considerable increase in the sensitivity of the system. The acquired signal did not enable the identification of all characteristic cardiac waves. However, it was possible to clearly identify a signal pattern characteristic of the QRS complex. The properties of the acquired signal restrict its use in rigorous electrocardiographic studies, allowing, however, its application in heart rate variability monitoring and biometric identification without the disadvantages usually associated with conventional electrodes. This makes it especially useful for man-machine interfaces and automated identification.