Abstract
The aim of measurement is to gain quantitative knowledge of some object of study, and in most cases this is achieved by taking readings from an analog instrument. From the standpoint of information theory, measurement can be regarded as a reduction of entropy, which quantifies the uncertainty of our knowledge. In this light, we investigate the possibility of using an information-theoretic method of analysis to estimate how the uncertainty of our knowledge changes with the number of readings.
Following and generalizing a previous report, we evaluate, by means of a Bayesian estimation process, the average amount of information gained as n readings are taken in a static measurement. From the result of this evaluation we may expect that the accuracy of measurement in a wide sense, when measured by the amount of information, increases nearly in proportion to √n.
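The scaling claimed here can be illustrated with a minimal sketch. Assuming a Gaussian error model with known standard deviation sigma and a diffuse prior (my illustrative assumptions, not necessarily the paper's exact setup), the Bayesian posterior after n independent readings is N(mean, sigma²/n), so its differential entropy falls by ½·ln(n) relative to a single reading while the posterior precision grows as √n:

```python
import math

# Illustrative assumption: Gaussian error model with known sigma and a
# diffuse (flat) prior, so the posterior after n independent readings
# is N(sample mean, sigma**2 / n).
sigma = 1.0

def posterior_entropy(n, sigma=sigma):
    """Differential entropy (in nats) of the posterior after n readings."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2 / n)

def information_gain(n, sigma=sigma):
    """Entropy reduction relative to a single reading: equals 0.5 * ln(n)."""
    return posterior_entropy(1, sigma) - posterior_entropy(n, sigma)

for n in (1, 4, 16):
    # Posterior std shrinks as sigma / sqrt(n): the "accuracy" (its
    # reciprocal) grows in proportion to sqrt(n), while the amount of
    # information gained grows as ln(sqrt(n)) = 0.5 * ln(n).
    print(n, sigma / math.sqrt(n), information_gain(n))
```

In this Gaussian sketch the information gain is independent of sigma, which makes the ½·ln(n) transition particularly transparent.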
Finally, we estimate the parameter of the error distribution of the measuring system from the scatter of n readings, when n is moderately large. In this sense, knowing only the pattern of the n readings, we can estimate the transition of the expected amount of information as n = 2, 3, 4, …, and this transition is illustrated in a diagram.
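The final step can be sketched as follows. Under the same illustrative Gaussian assumption as above (the specific estimator and constants are mine, not the paper's), the error-distribution parameter is estimated from the scatter of the readings, and the expected information gain is then tabulated for n = 2, 3, 4, …:

```python
import math
import random
import statistics

# Simulated readings standing in for the instrument's output
# (true_sigma is an arbitrary illustrative value).
random.seed(0)
true_sigma = 0.5
readings = [random.gauss(10.0, true_sigma) for _ in range(50)]

# Estimate the error-distribution parameter from the scatter of readings.
sigma_hat = statistics.stdev(readings)

def posterior_entropy(n, sigma):
    """Differential entropy (nats) of the Gaussian posterior after n readings."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2 / n)

# Transition of the expected information gain (relative to one reading)
# as n = 2, 3, 4, ... -- the quantity charted in the transition diagram.
transition = [
    (n, posterior_entropy(1, sigma_hat) - posterior_entropy(n, sigma_hat))
    for n in range(2, 7)
]
for n, gain in transition:
    print(n, round(gain, 3))
```

Each additional reading yields a diminishing increment of information, consistent with the ½·ln(n) growth noted above.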