IMEKO Event Proceedings Search

Page 556 of 977 Results 5551 - 5560 of 9762

Igor Zakharov, Natalya Shtefan
MINIMIZATION OF UNCERTAINTIES IN MEASUREMENTS WITH REPEATED OBSERVATIONS

Algorithms are presented that minimize Type A uncertainties when processing the results of measurements with repeated observations whose distributions differ from the normal law. A comparative evaluation of the developed and traditional algorithms is given.
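The abstract contrasts traditional and alternative estimators of Type A uncertainty. As a minimal sketch (not the authors' actual algorithms), the snippet below compares the classical estimate, the experimental standard deviation of the mean, with one illustrative robust variant based on the median absolute deviation, which is less affected by heavy-tailed, non-normal data; the sample values are invented for illustration.

```python
import math
import statistics

def type_a_uncertainty(observations):
    """Classical Type A uncertainty: experimental standard deviation of the mean."""
    n = len(observations)
    s = statistics.stdev(observations)  # sample standard deviation
    return s / math.sqrt(n)

def robust_type_a_uncertainty(observations):
    """Illustrative robust variant for non-normal data: scaled median absolute
    deviation (MAD) of the observations, divided by sqrt(n)."""
    n = len(observations)
    med = statistics.median(observations)
    mad = statistics.median([abs(x - med) for x in observations])
    # 1.4826 scales the MAD to the standard deviation of a normal distribution
    return 1.4826 * mad / math.sqrt(n)

readings = [10.02, 10.01, 9.99, 10.00, 10.03, 9.98, 10.50]  # one outlier
u_a = type_a_uncertainty(readings)
u_rob = robust_type_a_uncertainty(readings)
print(u_a, u_rob)
```

With the single outlier present, the robust estimate is much smaller than the classical one, which is the kind of reduction such algorithms aim for.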

Ulf Persson
ESTIMATE OF MEASUREMENT UNCERTAINTY USING ANALOG SWITCHES IN ELECTRONIC TEST

An analog switch is a component that can be used in testing analog and mixed-signal units. Analog switches offer the potential to sense nodes on printed board assemblies without physical contact. Generators and analyzers are connected to edge connectors on printed board assemblies and routed via analog switches to the appropriate nodes to be measured. This paper introduces the concept of using analog switches in analog test. Measurements of the dependence of on-resistance on input voltage, supply voltage and temperature are presented. Simulations show that the use of analog switches, in some cases, increases the measurement uncertainty. How this can be treated in a manufacturing test is also discussed.

Przemyslaw Otomanski
MODELLING OF THE COVERAGE FACTOR IN INDIRECT MEASUREMENTS

The paper presents the results of examining the error in evaluating the coverage factor in indirect measurements. Mathematical models of the analysed convolution of component distributions are used to examine the coverage factor value. The examination draws on the known characteristics of the coverage factor for the convolution of two Student's distributions and of two rectangular distributions.
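The coverage factor for a convolution of component distributions can be estimated numerically. The sketch below (an assumption of mine, not the paper's modelling method) uses a Monte Carlo simulation of the convolution of two rectangular distributions: the coverage factor k is the half-width of the symmetric 95 % coverage interval divided by the combined standard uncertainty.

```python
import random

def coverage_factor_rect_sum(a1=1.0, a2=1.0, p=0.95, n=200_000, seed=1):
    """Monte Carlo estimate of the coverage factor k for the convolution of
    two zero-mean rectangular distributions with half-widths a1, a2."""
    rng = random.Random(seed)
    samples = sorted(rng.uniform(-a1, a1) + rng.uniform(-a2, a2)
                     for _ in range(n))
    # symmetric coverage interval containing fraction p of the samples
    lo = samples[int(n * (1 - p) / 2)]
    hi = samples[int(n * (1 + p) / 2)]
    half_width = (hi - lo) / 2
    # combined standard uncertainty: the variance of U(-a, a) is a**2 / 3
    u_c = ((a1**2 + a2**2) / 3) ** 0.5
    return half_width / u_c

k = coverage_factor_rect_sum()
print(round(k, 2))
```

For two equal rectangular distributions the convolution is triangular, and the analytic 95 % coverage factor is about 1.90, noticeably below the normal-distribution value of 1.96.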

Anna G. Chunovkina, Patrizia Ciarlini, Maurice G. Cox, Franco Pavese
HANDLING AND TREATMENT OF MEASUREMENT DATA FROM DIFFERENT GROUPS OR SOURCES

This paper addresses the problem of evaluating data acquired from different groups or sources. It concentrates on a grouping of measurement tasks into (1) estimation of the measurand value, (2) quantification of the quality of the measurement method, and (3) quantification of the comparison of measurement quality. The emphasis is on the appropriate use of statistical methods for parameter estimation, experiment design and hypothesis testing in measurement data evaluation.
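One standard statistical tool for task (1), estimating a measurand value from several groups, is the inverse-variance weighted mean. The sketch below is a generic illustration of that estimator, not the paper's specific procedure; the group results and uncertainties are invented.

```python
def weighted_mean(values, uncertainties):
    """Inverse-variance weighted mean of group results and its standard
    uncertainty: each result is weighted by 1 / u_i**2."""
    weights = [1.0 / u**2 for u in uncertainties]
    wsum = sum(weights)
    mean = sum(w * x for w, x in zip(weights, values)) / wsum
    return mean, wsum ** -0.5

group_results = [10.12, 10.08, 10.15]       # hypothetical group estimates
group_u = [0.05, 0.03, 0.10]                # their standard uncertainties
m, u_m = weighted_mean(group_results, group_u)
print(m, u_m)
```

The pooled uncertainty is smaller than that of any single group, and the estimate is pulled towards the most precise result.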

Yasuo Iwaki, Tadao Inamura
STUDY OF "LAW OF PROPAGATION OF UNCERTAINTY" ON THE ISO-GUM – APPLICATION TO THE BLOOD CHEMICAL ANALYSIS

Analysis of variance (ANOVA) has generally been used for the analysis of measured values. On the other hand, J. W. Tukey introduced exploratory data analysis (EDA) in 1977. In 1993, the ISO (International Organization for Standardization) published the ISO-GUM (Guide to the Expression of Uncertainty), a guide that adopts the EDA viewpoint. In the ISO-GUM, the dispersion of data is evaluated as uncertainty together with error. The ISO-GUM has also been taken up in the medical field. At present, the accuracy of measurement data in blood chemical analysis (BCA) is improved under the WHO standard, which is based on ANOVA; ANOVA is also referred to as classical data analysis (CDA). The purpose of this study is to improve the accuracy of the quality control (QC) of measured values and to provide accuracy assurance. To this end, we verified how the results differ when the ISO-GUM rules and the WHO standard are applied to BCA.
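The law of propagation of uncertainty named in the title combines input uncertainties through sensitivity coefficients: for uncorrelated inputs, u_c(y)² = Σ (∂f/∂x_i)² u(x_i)². The sketch below is a generic GUM-style propagation with numerical sensitivities, applied to a hypothetical concentration c = m / V; it is not the paper's analysis of blood-chemistry data.

```python
def propagate_uncertainty(f, x, u, h=1e-6):
    """GUM law of propagation of uncertainty for uncorrelated inputs:
    u_c(y)**2 = sum_i (df/dx_i)**2 * u(x_i)**2, with the sensitivity
    coefficients df/dx_i estimated by central differences."""
    y = f(x)
    var = 0.0
    for i, (xi, ui) in enumerate(zip(x, u)):
        xp = list(x); xp[i] = xi + h
        xm = list(x); xm[i] = xi - h
        c_i = (f(xp) - f(xm)) / (2 * h)  # sensitivity coefficient
        var += (c_i * ui) ** 2
    return y, var ** 0.5

# hypothetical example: concentration c = m / V (mass 2.5, volume 0.100)
conc, u_conc = propagate_uncertainty(lambda v: v[0] / v[1],
                                     [2.5, 0.100], [0.01, 0.001])
print(conc, u_conc)
```

Here the volume term dominates: its sensitivity coefficient (-m/V²) magnifies a small volume uncertainty into most of the combined uncertainty.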

P. S. Smertenko, A. V. Maryenko, L. P. Pochekaylova
DIMENSIONLESS SENSITIVITY AS THE BASE FOR ESTIMATION OF ERROR OF METERS OF INTEGRAL CHARACTERISTICS

A new approach to developing a method of accuracy estimation for meters of integral characteristics is proposed. The method is based on the measurement or determination of the dimensionless sensitivity, and can be used in cases where calibration of meters of integral values is difficult.
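The dimensionless (logarithmic) sensitivity is commonly defined as S = (x / y) · dy/dx, the relative change of the output per relative change of the input. As a minimal illustration of that quantity (assuming this standard definition, since the abstract does not spell one out), the snippet below evaluates it numerically; for a power law y = c·xⁿ it equals the exponent n.

```python
def dimensionless_sensitivity(f, x, h=1e-6):
    """Dimensionless (logarithmic) sensitivity S = (x / f(x)) * df/dx,
    estimated with a central difference."""
    dfdx = (f(x + h) - f(x - h)) / (2 * h)
    return x * dfdx / f(x)

# for a power law y = 3 * x**2 the dimensionless sensitivity is the exponent, 2
s = dimensionless_sensitivity(lambda x: 3.0 * x**2, 5.0)
print(s)
```

Because S is a pure number, it can be compared across meters of different physical quantities, which is what makes it attractive as a base for accuracy estimation.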

G. O. Sukach, P. S. Smertenko, A. V. Bushma, Yu. V. Zarkov, L. P. Pochekailova
SOME ASPECTS OF HARD MEASUREMENT ASSURANCE OF ELECTRO-MAGNETIC IRRADIATION CONVERTERS

The factors influencing the inaccuracy of electromagnetic irradiation flux meters are considered. They are divided into informative, non-informative and destabilising factors, which may be systematic or random and spatially or temporally distributed. The combined action of non-informative and, especially, destabilising factors, which interact in varying ratios with the measured signal, is the main source of reduced accuracy in the control of electromagnetic irradiation flux with optoelectronic converters of physical quantities. Introducing additional correction channels, which use the correlation of internal signals with the destabilising and non-informative factors to minimise their influence, significantly increases the accuracy of control.

Salvatore Nuccio, Ciro Spataro, Giovanni Tine
PC-BASED MEASUREMENT INSTRUMENTS: UNCERTAINTY ASSESSMENT UNDER ELECTROMAGNETIC DISTURBANCES

In this paper we examine whether and how the measurement uncertainty of a generic measurement, performed with a generic PC-based measurement instrument, is influenced by electromagnetic disturbances. To this end, we check by means of experimental tests how electromagnetic threats affect the individual uncertainty sources. In order to apply standard requirements and criteria, we consider the IEC 61326 standard and follow the test procedures prescribed by the IEC 61000-4 series of standards. The results show that, in many cases, both radiated and conducted emissions amplify the uncertainty sources and, consequently, increase the uncertainty values, worsening the measurement quality.

Dusan Agrez
UNCERTAINTY OF NON-UNIFORM QUANTIZATION AND INFORMATION

In many analog-to-digital conversion schemes, numerical values are obtained by successive approximation of the difference between the reference and the measured quantity. For differential tracking to be effective, the non-uniform quantization must fulfil three conditions: partition into halves, quantization uncertainty that increases with the difference, and little overlap between quantization intervals. The best trade-off between the number of decision levels and the settling time is obtained with a pure exponential quantization rule, and the fastest response is achieved with base 2. The information rate per step is a little below three. The number of bits decreases towards the end of the conversion, and the sharp stop of the uniform, constant information flow is smoothed.
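The base-2 case described here is the familiar successive-approximation scheme: at each step the remaining interval around the difference between reference and input is halved. The sketch below is a generic successive-approximation conversion loop, offered as an illustration of that halving rule rather than as the author's quantization scheme.

```python
def sar_convert(v, vref=1.0, bits=8):
    """Successive-approximation conversion with the base-2 rule: the trial
    code is refined bit by bit, halving the remaining search interval for
    the difference between the reference partition and the input each step."""
    code = 0
    for i in range(bits - 1, -1, -1):
        trial = code | (1 << i)
        # keep the bit only if the trial level does not exceed the input
        if trial * vref / (1 << bits) <= v:
            code = trial
    return code

code = sar_convert(0.6, vref=1.0, bits=8)
print(code)
```

Each of the 8 iterations settles exactly one bit, so the conversion time is fixed by the resolution, which is the decision-levels-versus-settling-time trade-off the abstract refers to.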

Alexander Zaiko, Taisiya Zaiko
COMPLEX APPROACH TO THE ERROR DEFINITION OF MEASUREMENTS

A complex approach to error definition is presented, in which the error of measurement is considered as a uniform and indivisible whole that transforms with changes in the measurement modes, operating conditions and other factors. It differs from the standard elementary approach in that the resulting measurement error, rather than its elementary components, is classified directly. In this way the variety of measurement modes and operating conditions is taken into account more fully, the transformation of errors is described from a unified standpoint, the objectivity of their estimates and the efficiency of accuracy-improvement methods increase, and experimental determinations may be simplified.
