Re: '67 Temperature gauge problem...
Early temp gauges were calibrated at the factory by connecting them to a test fixture that simulated the voltage/current profile of a car running with the alternator charging the battery and the temp sender 'seeing' a potential overheat situation (up near the red-line warning band on the gauge). The test operator then installed the pointer needle onto the shaft to 'force' this high-end dial reading.
Later versions of the temp gauge were fully assembled first, and the test operator calibrated each one by hand-selecting a precision, wire-wound, ceramic resistor to install as a 'shunt' on the gauge, similarly 'forcing' the gauge's high-end reading to be correct.
So, it takes 'two to tango'... The temp gauge HAS to be properly calibrated before you can expect decent pointer readings, regardless of what temp sender you use. Then you need a temp sender whose temp-vs.-resistance profile matches the factory's original design curve.
Last, there was quite a bit of 'slop' in AC's temp-vs.-resistance specification to start with. The designers primarily cared that the system be accurate at the HIGH end of the gauge, not in the mid-to-low range, and I see folks trying to FORCE a mid-range accuracy into the system (around 180F) that really wasn't there in the first place...
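To make that "high end matters most" idea concrete, here's a rough sketch of how you might check a sender you've measured against a nominal curve, holding it to a tight tolerance only near the hot end. All the resistance values and tolerances below are made up for illustration; they are NOT the actual AC specification.

# Hypothetical nominal sender curve: coolant temp (F) -> resistance (ohms).
# Illustrative numbers only, NOT the real AC spec.
NOMINAL_CURVE = {100: 450, 150: 185, 180: 120, 210: 70, 240: 40}

# Allowable error as a fraction of nominal: loose in the mid/low range,
# tight near the red line -- mirroring the factory's priorities.
def tolerance(temp_f):
    return 0.10 if temp_f >= 210 else 0.30

def sender_matches(measured):
    """measured: dict of temp (F) -> resistance (ohms) read off your sender."""
    for temp_f, nominal_ohms in NOMINAL_CURVE.items():
        ohms = measured.get(temp_f)
        if ohms is None:
            continue  # no reading taken at this temperature
        error = abs(ohms - nominal_ohms) / nominal_ohms
        if error > tolerance(temp_f):
            print(f"{temp_f}F: {ohms} ohms is outside +/-{tolerance(temp_f):.0%}")
            return False
    return True

# Example: sender readings taken in a pan of heated water with a thermometer.
print(sender_matches({180: 150, 240: 42}))

The point of the loose mid-range tolerance is exactly the one above: a sender that reads a needle-width off at 180F can still be perfectly 'in spec'.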
If you substitute a 40-ohm resistor for the temp sender, you can effectively test the accuracy of your existing gauge. The 40-ohm resistor ought to give a high-end gauge reading that's in the red warning zone but not pegged off the scale to the hot side...
If you don't get that kind of reading (engine running, alternator charging, and electrical system stable), then it's time to question the calibration/accuracy of the temp gauge itself...
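For what it's worth, here's a minimal sketch of why a fixed 40-ohm substitute makes a useful pass/fail test: with the sender replaced by a known resistance, the only unknown left in the series loop is the gauge itself. The supply voltage, coil resistance, and current band below are hypothetical placeholders to show the idea, not measured Corvette values.

# Very simplified model: gauge coil in series with the sender, fed by
# the charging system. All numbers below are assumptions for illustration.
SUPPLY_VOLTS = 13.8      # alternator charging (assumed)
COIL_OHMS = 55.0         # gauge internal resistance (assumed)

def gauge_current_amps(sender_ohms):
    # Ohm's law for the series loop: I = V / (R_coil + R_sender)
    return SUPPLY_VOLTS / (COIL_OHMS + sender_ohms)

# Hypothetical current band that should park the needle in the red
# warning zone without pegging it.
RED_ZONE_AMPS = (0.13, 0.16)

def check_gauge_with_40_ohm_substitute():
    amps = gauge_current_amps(40.0)
    lo, hi = RED_ZONE_AMPS
    if lo <= amps <= hi:
        return "Needle in the red zone as expected -- suspect the sender instead."
    return "Needle outside the expected band -- question the gauge calibration."

print(check_gauge_with_40_ohm_substitute())

Same logic on the bench: known resistance in, known deflection out. If the deflection is wrong, the gauge (or its calibration shunt) is the variable to chase.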