The Evolution of Battery Monitoring

David Brown reviews different approaches to battery monitoring in his Battery Power Magazine article, "The Evolution of Battery Monitoring."

A Brief History of AC and DC Approaches to Battery Monitoring and the Latest Predictive Technology

David Brown, CEO and CTO, NDSL

As many electrical engineering graduates will recall, a lead acid battery is an analog device that is often represented in electrical circuits by resistors, capacitors and inductors. However, the actual behavior and performance of the battery are more complicated and can be misrepresented by such simplified models.

For the purpose of predicting performance and state of charge, batteries behave consistently across their entire operating range and can be measured equally well under heavy or light load. In 2000, the Electric Power Research Institute (EPRI) began a study of this topic, and the conclusions in its published report of 2002 are very clear. EPRI proved beyond any doubt that conductance, internal resistance and impedance are linearly correlated with each other across a wide range of battery capacities (1).

For this reason, the IEEE chose to standardize the terms "ohmic measurement" and "ohmic value" in its draft standard on the subject. It chose these terms to cover any of the three expressions, conductance, internal resistance and impedance, which are all essentially the same measurement (2).

Types of Battery Monitoring Instruments Through The Ages

The earliest and most popular type of standby battery measurement instrument was a manual device, not unlike those used by car mechanics today. These devices applied a large, precisely timed load to the battery and measured the resulting drop in voltage to judge its capacity and performance. The instrument measured the battery voltage just before and at the end of this load period, which was typically three seconds at 30 to 70 amps stressing the battery. Pioneers of this technique found that the value they measured accurately indicated the capacity, likely performance and life of the battery. Because this technique used DC measurements and ran for a relatively long time, its advocates called the measurement internal resistance.
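The arithmetic behind this DC load test is simply Ohm's law applied to the voltage sag under load. The sketch below illustrates it with hypothetical voltages and load current; the function name and figures are illustrative, not any particular instrument's algorithm.

```python
# Minimal sketch (illustrative only): estimating internal resistance from a
# manual DC load test. All values are hypothetical example numbers.

def internal_resistance_ohms(v_open: float, v_loaded: float, i_load: float) -> float:
    """Apply Ohm's law to the voltage drop seen under a timed DC load."""
    if i_load <= 0:
        raise ValueError("load current must be positive")
    return (v_open - v_loaded) / i_load

# Example: a 12 V jar sagging to 11.82 V under a 50 A load held for ~3 seconds
r_int = internal_resistance_ohms(v_open=12.00, v_loaded=11.82, i_load=50.0)
print(f"Internal resistance ≈ {r_int * 1000:.1f} mΩ")  # ≈ 3.6 mΩ
```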

While accurate, the test current is so large that the test takes its toll and, if over-used, can shorten the life of the lead acid battery. Advocates of this method therefore proposed relatively infrequent testing, monthly or even less often.

Why was such a big load used? When these tests were first attempted, it was very difficult for the existing technology to distinguish the background electrical noise of the UPS inverter and charger from the ‘signal’ produced by a battery monitoring system (BMS). It was like trying to have a conversation in a high school lunchroom: everyone has to speak loudly just to carry on their own conversation, so the noise grows louder and louder. With all of the background noise in a typical UPS environment, it seemed obvious that the best approach was to create a test signal that would stand out against that noise. So early battery monitoring systems hit the battery with a large current load so the instrument could see the test signal among all the noise. This is what the expression "signal to noise ratio" means.
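To put a rough number on that intuition, the sketch below compares the voltage signal produced by a small and a large test current against an assumed level of background ripple. The noise level, resistance and current values are hypothetical and chosen only to illustrate the ratio.

```python
# Minimal sketch (illustrative only): why a larger test load stands out
# against UPS ripple noise. All numbers are hypothetical.
import math

def snr_db(signal_mv: float, noise_mv: float) -> float:
    """Signal-to-noise ratio in decibels for voltage amplitudes."""
    return 20 * math.log10(signal_mv / noise_mv)

noise_rms_mv = 50.0                      # assumed charger/inverter ripple on the jar
for load_amps, r_milliohm in [(1, 3.0), (50, 3.0)]:
    signal_mv = load_amps * r_milliohm   # drop produced by the test load (V = I * R)
    print(f"{load_amps:>3} A load -> {signal_mv:5.1f} mV signal, "
          f"SNR = {snr_db(signal_mv, noise_rms_mv):6.1f} dB")
```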

The net effect was that the first inventors used large loads, non-reactive measurements and infrequent testing based on the technology they had, and the expression internal resistance therefore defined their comfort zone with that technology.

After a few years, companies came along with improved electronic technology. These inventors realized that if they used an AC load at a very specific frequency, they could discriminate the load signal by frequency from all the other random AC noise on the battery, much like a dog whistle in that same lunchroom. Even with all of the noise in a crowded room, the dog can still hear the whistle because its ears can pick out that frequency against the background noise.

With this testing method, each time the load is switched on there is a slight drop in voltage, and in the off half of the cycle there is a slight rise in voltage. By measuring the AC ripple induced on the battery at that specific frequency, the system can obtain a measure of the internal ohmic value of the battery. Using an AC load does mean the test equipment may pick up effects from the internal inductance and capacitance of the battery, which affect the signal quality.
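As a rough illustration of how frequency discrimination works, the sketch below synthesizes a noisy jar voltage with a small ripple injected at a known test frequency, picks that frequency out of the spectrum, and applies Ohm's law. It is a generic illustration with hypothetical numbers, not any vendor's method.

```python
# Minimal sketch (illustrative only): recovering the ohmic value from the AC
# ripple at a known test frequency. A synthetic noisy voltage is generated,
# the component at the injection frequency is read from the FFT, and Ohm's
# law gives the impedance magnitude. All values are hypothetical.
import numpy as np

fs, f_test = 10_000.0, 87.0       # sample rate and injected test frequency (Hz)
i_amp, z_true = 2.0, 0.004        # 2 A AC test current, 4 mΩ "true" impedance
t = np.arange(0, 1.0, 1 / fs)

ripple = i_amp * z_true * np.sin(2 * np.pi * f_test * t)   # ripple caused by the test load
noise = 0.02 * np.random.randn(t.size)                      # broadband charger/inverter noise
v = 12.0 + ripple + noise                                    # measured jar voltage

spectrum = np.fft.rfft(v - v.mean())
freqs = np.fft.rfftfreq(t.size, 1 / fs)
bin_idx = np.argmin(np.abs(freqs - f_test))                 # FFT bin nearest the test frequency
v_ripple = 2 * np.abs(spectrum[bin_idx]) / t.size           # ripple amplitude at that frequency

print(f"Estimated impedance ≈ {v_ripple / i_amp * 1000:.2f} mΩ")   # ≈ 4 mΩ
```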

Because the test signal used on the battery does not need to be as strong, or as large in terms of current, more frequent tests may be run, perhaps monthly or weekly. Users of this AC measurement technique found that the value they measured was a good indicator of battery capacity, performance and life. They became very keen on calling it internal impedance and promoted its merits hard against the internal resistance lobby.

It is clear from this evolution of battery monitoring that the periodicity of testing was determined by the test load used, which in turn was determined by the electronic sophistication of the designs. A stimulus then arrived that triggered the rapid development of the next step in the technology.

VRLA Technology

In the 1990s, VRLA batteries quickly became the norm in standby applications and started to replace the more reliable, better built (and much more expensive and code-restricted) flooded or wet cells in the battery room. While flooded cells take weeks and months to fail, VRLA batteries fail rapidly, and the ‘holy grail’ became a battery monitoring system with sufficient time resolution to detect the imminent failure of a VRLA jar before it actually failed. Testing each month or every two weeks simply would not do that.

The next challenge to be addressed was the test current level. Applying large load currents has a number of negative effects on a string of batteries. Many system application designers think of batteries as large, highly fault-tolerant entities. They are not. A string of jars connected in series has a different fault tolerance and behavior than the sum of the individual jars. Drawing large currents from sections of the string can cause effects that impact the life of the entire battery and even create risks to the assets. The problem with test currents is therefore not just how much current is drawn, but how and when it is drawn.

In the last twenty years, the problem has been approached by electronic design engineers rather than battery technologists. Using modern solid-state technologies involving lock-in amplifiers (3) and synchronous demodulators or digital signal processors, these engineers were able to generate test signals using switched DC circuits to apply the load to the battery. With this advanced technology they were able to characterize the internal behavior of the battery with tiny loads thought impossible until recently. The new designs use loads as low as one amp, applied over a series of pulses measured in thousandths of a second, and then process the resulting data into a clean signature that defines the ohmic value of the battery using Ohm’s law. This is analogous to hearing the sound of a butterfly’s wings flapping in that same high school cafeteria full of noisy students.
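The sketch below shows, in very simplified form, how a synchronous-demodulation (lock-in style) measurement can recover the millivolt-level voltage step produced by a one-amp pulsed load buried in noise. It is an assumed, generic illustration of the technique, not Cellwatch's actual implementation, and the pulse rate, noise level and ohmic value are hypothetical.

```python
# Minimal sketch (assumed generic technique, not a vendor implementation):
# lock-in style recovery of the ohmic value from a low-current pulsed DC load.
# The measured voltage is correlated with a zero-mean copy of the on/off pulse
# reference, averaging many millisecond-scale pulses out of the noise, and the
# recovered voltage step is converted to ohms with Ohm's law.
import numpy as np

fs, f_pulse = 20_000.0, 25.0            # sample rate and pulse repetition rate (Hz)
i_pulse, r_true = 1.0, 0.005            # 1 A test pulses, 5 mΩ "true" ohmic value
t = np.arange(0, 2.0, 1 / fs)           # average over 2 seconds of pulses

load_on = (np.sin(2 * np.pi * f_pulse * t) > 0).astype(float)   # on/off pulse train
v = 12.0 - i_pulse * r_true * load_on + 0.02 * np.random.randn(t.size)  # noisy jar voltage

# Synchronous demodulation: project the measurement onto the zero-mean reference.
ref = load_on - load_on.mean()
v_drop = -np.dot(v, ref) / np.dot(ref, ref)    # average on/off voltage step (volts)

print(f"Estimated ohmic value ≈ {v_drop / i_pulse * 1000:.2f} mΩ")   # ≈ 5 mΩ
```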

Figure 1 shows an actual battery reading where ohmic value testing has captured a battery failure. Note the rapid decline in the battery's performance: this particular jar went from fully functional to complete failure in just four days. With an older measurement technique the failure might have gone undetected for 28 days or longer. Had this battery been called on for active service following this degradation, it would not have performed as required and an outage would likely have occurred.

By perfecting this low-impact measurement technique, it became possible to carry out ohmic value measurements more frequently without damaging the battery. This gentle approach had negligible effect on the battery compared with the older technology.

These engineers found that the parameter they measured gave an excellent characterization of the capacity, performance and life of a wide range of batteries. Equally important, batteries could now be measured daily, allowing an operator to plot and predict with greater certainty the moment when a jar or cell becomes suspect and requires further investigation. This ability to predict failure ahead of the actual sharp rise in ohmic value gave rise to the concept of predictive analysis and puts battery monitoring squarely in the camp of condition-based monitoring (CBM). In addition, with daily measurement as the default, rapid VRLA failure, which is not uncommon, can be mitigated without putting critical loads at risk.
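A simple way to picture this predictive use of daily readings is a trend rule that flags a jar once its ohmic value rises a set percentage above its own healthy baseline. The sketch below is a generic condition-based-monitoring illustration; the function name, threshold and readings are hypothetical and do not describe any specific product's analytics.

```python
# Minimal sketch (generic CBM rule, hypothetical data): flag a jar as suspect
# once its daily ohmic reading rises a set fraction above its own baseline.
from statistics import mean

def flag_suspect(daily_mohm: list[float], baseline_days: int = 7,
                 rise_threshold: float = 0.25) -> int | None:
    """Return the index of the first day the reading exceeds baseline * (1 + threshold)."""
    if len(daily_mohm) <= baseline_days:
        return None
    baseline = mean(daily_mohm[:baseline_days])          # healthy reference value
    for day, value in enumerate(daily_mohm[baseline_days:], start=baseline_days):
        if value > baseline * (1 + rise_threshold):
            return day
    return None

# A jar drifting around ~3.5 mΩ and then climbing sharply over a few days
readings = [3.5, 3.5, 3.6, 3.5, 3.6, 3.5, 3.6, 3.7, 3.9, 4.6, 6.2, 9.8]
print(flag_suspect(readings))   # -> 9, the first day the reading exceeds ~4.4 mΩ
```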

References

1. Stationary Battery Monitoring by Internal Ohmic Measurements, EPRI, Palo Alto, CA: 2002. 1002925 (p. 5-2 ff.)

2. IEEE P1491 Draft Guide for Selection and Use of Battery Monitoring Equipment in Stationary Applications

3. E.G. & G. Princeton Applied Research, Explore the Lock-In Amplifier

David has more than 25 years of technical and international management experience in electronics engineering, manufacturing and microprocessor development. Throughout his many years with NDSL, David has been instrumental in the technical leadership of the company’s flagship product, the Cellwatch battery monitoring system.