Calibration

Jacob


No instrument is intrinsically perfect. Calibration techniques extend the usefulness of an instrument by correcting for offsets, nonlinearity, hysteresis, and other undesired characteristics. To calibrate an instrument one measures known quantities and then devises an error model, i.e. a set of equations that allow the instrument's raw readings to be corrected. Error models often involve lookup tables and interpolation, which means they are applicable only to measurements between the minimum and maximum values of the calibration standards used. Some calibration scheme is always present in commercial instruments, although it often consists of little more than adjusting a few trimmer potentiometers in the associated electronics.
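As a minimal sketch of such an error model, the snippet below builds a correction function from a handful of calibration points using piecewise-linear interpolation, and refuses raw readings outside the calibrated range. The instrument, standard values, and raw readings are hypothetical, chosen only to illustrate the idea.

```python
# A minimal sketch of a lookup-table error model built from calibration
# standards (all numbers below are hypothetical, for illustration only).

def make_error_model(standards, readings):
    """Return a correction function from known standards and raw readings.

    Uses piecewise-linear interpolation between calibration points; it is
    only valid for raw readings within [min(readings), max(readings)].
    """
    pairs = sorted(zip(readings, standards))
    xs = [p[0] for p in pairs]  # raw instrument readings
    ys = [p[1] for p in pairs]  # corresponding known (true) values

    def correct(raw):
        if raw < xs[0] or raw > xs[-1]:
            raise ValueError("raw reading outside calibrated range")
        # find the bracketing calibration points and interpolate linearly
        for i in range(len(xs) - 1):
            if xs[i] <= raw <= xs[i + 1]:
                t = (raw - xs[i]) / (xs[i + 1] - xs[i])
                return ys[i] + t * (ys[i + 1] - ys[i])

    return correct

# Hypothetical example: a voltmeter read 0.12, 5.30, and 10.55 V when
# measuring 0, 5, and 10 V calibration standards.
correct = make_error_model([0.0, 5.0, 10.0], [0.12, 5.30, 10.55])
print(correct(5.30))  # a reading at a calibration point maps to 5.0
```

Raising an error outside the calibrated range, rather than extrapolating, reflects the point above: the model is only trustworthy between the smallest and largest calibration standards.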

Calibration becomes essential, mathematically involved, and rather tricky to perform for high-frequency and/or high-precision measurements. Calibration is closely related to fitting and interpolation, discussed later in the course. We will return to calibration issues as we discuss specific measurement techniques.
