Heisenberg's uncertainty principle is usually explained as stating that if one attempts to measure a fundamental physical quantity and its conjugate simultaneously, then the product of the measurement errors has a lower bound set by the Planck constant, and this relation has been formally represented by an inequality for the product of the standard deviations of those quantities. There is an obvious conceptual difference between a measurement error and a standard deviation, yet it remains obscure how this difference came to be neglected. In this talk, we trace the confusion to the fact that quantum measurement theory in those days was not general enough, so that an additional assumption, called the repeatability hypothesis, was generally required. From the 1970s to the 1980s, quantum measurement theory was established in complete generality, covering all physically realizable measuring processes, and the repeatability hypothesis was abandoned. Nowadays, the relation for measurement errors is considered to be neither equivalent to, nor a straightforward consequence of, the relation for standard deviations.
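For concreteness, the two relations being contrasted can be sketched in standard notation (the symbols below are a conventional illustration, not taken from the talk itself): Heisenberg's original formulation concerns the error of a position measurement and the disturbance it causes to momentum, whereas the textbook inequality, due to Kennard and Robertson, bounds the product of the standard deviations of the state itself.

```latex
% Heisenberg's informal error-disturbance relation:
% \varepsilon(q) = error of the position measurement,
% \eta(p) = disturbance caused to the momentum.
\varepsilon(q)\,\eta(p) \;\ge\; \frac{\hbar}{2}

% Kennard's relation for standard deviations of the state
% (the inequality found in most textbooks):
\sigma(q)\,\sigma(p) \;\ge\; \frac{\hbar}{2}
```

Although the two inequalities look formally alike, the quantities on their left-hand sides are defined quite differently: the first refers to a measuring process, the second only to the quantum state being measured.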