Eastern SD | Believe it or not, there isn't an absolute standard for a radar or simulated-radar speed signal, so most devices provide a way to calibrate the speed signal they receive. All of the ones I have encountered use a change in frequency, not a change in amplitude.
Some documentation states the rate in Hz per MPH, which seems like a strained unit to me. One common 'standard' is 50 pulses per meter.
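To see how the two conventions relate, here is a minimal sketch of the speed-to-frequency conversion, assuming the 50 pulses-per-meter calibration mentioned above (the function name and constants are illustrative, not from any particular device's documentation):

```python
MPH_TO_MPS = 0.44704        # exact miles-per-hour to meters-per-second factor
PULSES_PER_METER = 50.0     # one common 'standard' calibration

def speed_to_frequency_hz(speed_mph: float) -> float:
    """Pulse frequency the simulated radar output should produce at a given speed."""
    return speed_mph * MPH_TO_MPS * PULSES_PER_METER

# At 50 pulses/meter the rate works out to 50 * 0.44704 = 22.352 Hz per MPH,
# which is how the same calibration would appear in Hz-per-MPH documentation.
print(speed_to_frequency_hz(1.0))   # 22.352 Hz
print(speed_to_frequency_hz(60.0))  # 1341.12 Hz
```

So the two units describe the same thing: a distance-based calibration (pulses per meter) is just a frequency-per-speed slope once you multiply by the unit conversion.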
If you get position data at 5 Hz, it will be up to you to do any necessary filtering before setting the output clock. That's probably not necessary, but you could also use algorithms that take acceleration into account. You can update your output faster than the 5 Hz GPS position rate, so you can smooth the change in output speed instead of having big step changes each time you get a new position.
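The smoothing idea above can be sketched as a simple first-order low-pass filter: run the output loop faster than the 5 Hz GPS updates and step the commanded speed a fraction of the way toward the latest fix on each tick. The class and the `alpha` tuning constant are hypothetical, just one way to avoid the step changes:

```python
class SpeedSmoother:
    """Ramp the output speed toward each new GPS speed instead of jumping.

    alpha is a hypothetical tuning constant (0 < alpha <= 1); smaller
    values give a smoother but slower-responding output.
    """

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha
        self.speed = 0.0            # current output speed, MPH

    def update(self, target_speed: float) -> float:
        # Close a fixed fraction of the remaining gap on each output tick.
        self.speed += self.alpha * (target_speed - self.speed)
        return self.speed

smoother = SpeedSmoother(alpha=0.25)
# GPS speed jumps from 0 to 40 MPH; the output ramps toward it over
# several ticks instead of stepping in one jump:
ramp = [smoother.update(40.0) for _ in range(5)]
print(ramp)  # rises from 10.0 toward 40.0, never overshooting
```

In practice you would call `update()` at your faster output rate (feeding it the most recent GPS speed each time) and convert the returned speed to a pulse frequency before programming the output clock.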