We are seeing the same voltage and amps at the battery as we see at the alternator when measuring with your software, but when we measure the alternator directly with an amp clamp and multimeter we get the much higher readings we would expect.
Our config is two WS500s connected together across two engines feeding a lithium bank... everything seems to look right except the software's voltage and amp readings at the alternator.
It is key to remember that the ASCII output is intended for diagnostics, not day-to-day monitoring. As such, some of the values will vary depending on the install. This is such a case.
The WS500 has an option to have a remote device supply things like battery voltage, current, and temperature, e.g. a BMS. In that case, one option is to connect the WS500's sensing wires locally at the alternator rather than at the battery, and in the extreme case both voltage AND current can be sensed locally. Under that setup you will see a delta between the ‘battery’ and ‘alternator’ volts/amps. But in the simpler case of no remote instrumentation via CAN, where only the built-in WS500 sensors are used, those two values will be the same. It all depends on the capabilities of the deployed system and on how the alternator regulator is configured.
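To make the two cases concrete, here is a minimal sketch of that behaviour. It assumes a simplified model of the sensing sources; the names, structure, and numbers are illustrative only and are not the actual WS500 firmware or ASCII field layout.

```python
# Minimal sketch: why 'battery' and 'alternator' readings match when there is
# no remote instrumentation. Names and structure are hypothetical, not the
# actual WS500 firmware or ASCII status fields.

from dataclasses import dataclass
from typing import Optional


@dataclass
class RemoteSense:
    """Battery data supplied over CAN, e.g. by a BMS (hypothetical fields)."""
    volts: float
    amps: float


def reported_values(local_volts: float, local_amps: float,
                    remote: Optional[RemoteSense] = None):
    """Return (battery_v, battery_a, alt_v, alt_a) as a status line might report them.

    With no remote instrumentation, the regulator only has its own local
    sensors, so the 'battery' and 'alternator' values are the same numbers.
    With remote CAN data, the local sensors can be wired at the alternator
    and the two sets of values can differ.
    """
    alt_v, alt_a = local_volts, local_amps
    if remote is None:
        # Only the built-in sensors exist: both pairs are the same reading.
        return local_volts, local_amps, alt_v, alt_a
    # Remote BMS data fills the 'battery' fields; local sensing is at the alternator.
    return remote.volts, remote.amps, alt_v, alt_a


# No remote BMS data: battery and alternator values are identical.
print(reported_values(13.4, 62.0))

# Remote BMS supplies battery data; local sensors are wired at the alternator,
# so a delta between the two pairs appears.
print(reported_values(13.6, 118.0, RemoteSense(volts=13.4, amps=62.0)))
```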