The FDC 300, 100, and C Series controls and the VR18 paperless recorder all share common input and output resolution and scan rate specifications.
What does scan rate really mean?
Typically, scan rate refers to how often the input value is updated to the microprocessor; with a 200 ms scan specification, the sensor value is read and its value updated to the microprocessor 5 times per second. Controls typically offer a filter to average the input values. FDC's filter, configurable in steps from 0 [none] to 60 seconds, averages the value provided to the PV display, not the value used by the control algorithm.
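To make the display-filter behavior concrete, here is a minimal sketch in Python (illustrative only, not FDC firmware): the raw sample still goes to the control algorithm every scan, while a moving average smooths only the displayed PV. The names and the default filter time are assumptions added for the example.

```python
from collections import deque

class DisplayFilter:
    """Moving-average filter applied only to the displayed PV (illustrative)."""

    def __init__(self, scan_period_s=0.2, filter_seconds=10):
        # Window length in samples for the chosen filter time (0 = no filtering)
        samples = max(1, int(filter_seconds / scan_period_s))
        self.window = deque(maxlen=samples)

    def update(self, raw_pv):
        self.window.append(raw_pv)
        display_pv = sum(self.window) / len(self.window)
        # Raw value feeds the control loop; averaged value feeds the display
        return raw_pv, display_pv
```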
FDC's scan specification also includes the control calculations, control output, and optional communications, all updated every 200 ms. The benefit is a higher-performing control.
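As a rough illustration of what a 200 ms scan covers, the sketch below (an assumed structure, not FDC's actual firmware) updates the input, control calculation, control output, and communications every cycle; the callback names are placeholders.

```python
import time

SCAN_PERIOD_S = 0.2  # 200 ms scan period

def scan_loop(read_sensor, calculate_control, write_output, service_comms):
    """Illustrative scan loop: every task is updated once per 200 ms cycle."""
    while True:
        start = time.monotonic()
        pv = read_sensor()            # input value updated to the processor
        out = calculate_control(pv)   # control calculation
        write_output(out)             # control output update
        service_comms(pv, out)        # optional communications update
        # Sleep out the remainder of the 200 ms period
        time.sleep(max(0.0, SCAN_PERIOD_S - (time.monotonic() - start)))
```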
What does input and output resolution really mean?
A/D Input Resolution [Analog to Digital]: Controls utilize microprocessors, digital devices that cannot accept the infinite resolution of a traditional analog device. Microprocessors require the analog input signal to be converted to digital values. These digital values are used as input to the control logic and display values. Higher resolution on the analog input results in a more precise and accurate input value to the microprocessor. As an example, with an A/D of 10,000 counts and a temperature range of 2,000F, the microprocessor would see temperature values in 0.2F increments [2,000F / 10,000 counts = 0.2F].
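The same arithmetic can be written as a small worked example; the 0F range low and the function name are assumptions added for illustration:

```python
def counts_to_temperature(counts, ad_counts=10_000, span_f=2_000.0, low_f=0.0):
    """Convert an A/D reading to degrees F (illustrative numbers from the text)."""
    degrees_per_count = span_f / ad_counts   # 2,000F / 10,000 counts = 0.2F
    return low_f + counts * degrees_per_count

print(counts_to_temperature(5_000))   # mid-scale reading -> 1000.0F
```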
D/A Output Resolution [Digital to Analog]: Analog outputs, whether control or retransmission, are not infinite in their resolution. The microprocessor increments the analog output in steps dependent upon its D/A resolution specification. As an example, with a D/A of 10,000, a retransmitted temperature range of 2,000F would have values in 0.2F increments.
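Going the other direction, a retransmitted value has to land on one of the available D/A steps; the sketch below shows that quantization, again with illustrative numbers and assumed scaling:

```python
def temperature_to_counts(temp_f, da_counts=10_000, span_f=2_000.0, low_f=0.0):
    """Quantize a retransmitted temperature into D/A counts (0.2F steps here)."""
    counts = round((temp_f - low_f) / span_f * da_counts)
    return max(0, min(da_counts, counts))   # clamp to the output range

# 1000.25F is output as the nearest available 0.2F step
print(temperature_to_counts(1000.25))   # -> 5001 counts, i.e. 1000.2F
```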
How A/D and D/A Resolution is Specified:
The input & output resolution specifications are normally described in a number of bits; i.e. 18 bits. Each bit represents a factor of 2: 18 bits = 2 x 2 x 2 ... (18 times) = 2^18 = 262,144 discrete counts. This bit value is used to determine an instrument's input and output resolution to and from the microprocessor. The calculation to determine resolution in degrees takes into account the number of bits, the mV span, and the gain of the specific range.
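A simplified version of that calculation is shown below; it assumes the full input span maps across all of the counts and ignores the mV span and gain terms the full vendor calculation would include:

```python
def degrees_per_count(bits, span_degrees):
    """Simplified resolution estimate: span divided by 2**bits counts."""
    counts = 2 ** bits                 # 18 bits -> 262,144 counts
    return span_degrees / counts

print(degrees_per_count(18, 2_000.0))   # ~0.0076F per count
```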
A/D Specifications and real world "Effective A/D":
A/D input specification is defined in bits, i.e. 18 bits. The calculated A/D resolution per degree using the number of bits specified assumes no degradation from electrical noise or other influences. Although there may be some variance from one instrument vendor to another, electrical noise up to the levels used in CE testing typically degrades the calculated resolution by a factor of three. Future Design Controls' 18-bit specification therefore has a real-world "effective A/D" of between 16 and 17 bits.
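The arithmetic behind that range can be sketched as follows; the formula is a simplified illustration of the factor-of-three degradation described above, not FDC's published method:

```python
import math

def effective_bits(specified_bits, degradation_factor=3):
    """Estimate effective A/D bits after noise reduces the usable counts."""
    usable_counts = 2 ** specified_bits / degradation_factor
    return math.log2(usable_counts)

print(f"{effective_bits(18):.1f} effective bits")   # ~16.4, i.e. between 16 and 17
```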
For further information, download the PDF from which this text was taken: FDC DIN Resolution.pdf