Definition of a phase meter
A phase meter is a measuring instrument that directly measures the phase difference between two sinusoidal signals of the same frequency. It consists of a shaper, a phase detector, and an indicator. Its working principle is to measure the time interval between corresponding zero crossings of the two signals. Because the signals share a common period T, a time interval Δt corresponds to a phase difference Δφ = 360° · Δt / T, so the interval is proportional to the phase difference, and measuring it yields the phase difference between the two signals directly.
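The zero-crossing principle translates directly into a short numerical sketch. The Python snippet below (the names rising_zero_crossings and phase_difference_deg are illustrative, not from the source) detects the rising zero crossings of two equal-frequency sine waves, measures the time interval between corresponding crossings, and converts it to a phase difference via Δφ = 360° · Δt / T; this mirrors in software what the shaper and phase detector do with comparators and a counter.

    import numpy as np

    def rising_zero_crossings(x, t):
        # Indices where the signal goes from negative to non-negative
        idx = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
        # Linear interpolation between samples for sub-sample timing
        frac = -x[idx] / (x[idx + 1] - x[idx])
        return t[idx] + frac * (t[idx + 1] - t[idx])

    def phase_difference_deg(sig_a, sig_b, t, freq):
        za = rising_zero_crossings(sig_a, t)
        zb = rising_zero_crossings(sig_b, t)
        period = 1.0 / freq
        # For each crossing of A, pair it with the next crossing of B
        dts = [zb[zb >= ta][0] - ta for ta in za if np.any(zb >= ta)]
        # Convert the average interval to phase: delta-phi = 360 * dt / T
        return float(np.mean(dts)) % period / period * 360.0

    # Demo with assumed parameters: 100 kHz sampling, 50 Hz signals
    fs, f = 100_000.0, 50.0
    t = np.arange(0.0, 0.1, 1.0 / fs)
    a = np.sin(2 * np.pi * f * t)
    b = np.sin(2 * np.pi * f * t - np.deg2rad(30.0))  # b lags a by 30 degrees
    print(phase_difference_deg(a, b, t, f))           # prints approximately 30.0

Averaging the interval over several cycles, as above, is one simple way to reduce the effect of noise on individual crossings.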
Phase meters are mainly used for ranging and positioning in phased-array radar, in radio navigation systems and automatic control systems, and for measuring the phase difference between phase voltages in power systems. A precision phase meter can serve as a phase-measurement standard for a metrology department.
Phase is the integral of frequency. This relationship generalizes mathematically even when the frequency f is itself a function of time. Returning to the physical world, we find there is no need to insist on strictly periodic signals: both frequency and phase can be defined as instantaneous values.
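In symbols (a standard formulation, not from the source; φ is in radians, f in hertz, and φ(t₀) is the initial phase), the instantaneous phase is the running integral of the instantaneous frequency, and conversely the instantaneous frequency is the scaled derivative of the phase:

    \phi(t) = \phi(t_0) + 2\pi \int_{t_0}^{t} f(\tau)\,\mathrm{d}\tau,
    \qquad
    f(t) = \frac{1}{2\pi}\,\frac{\mathrm{d}\phi(t)}{\mathrm{d}t}.

For a strictly periodic signal f is constant and the integral reduces to the familiar φ(t) = φ(t₀) + 2πf(t − t₀); the integral form is what allows both quantities to be treated as instantaneous values.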