I'm hoping I missed something obvious here.
I'm running HS2 with mcsTemperature. It has been working fine with an outdoor weather station, and I decided to add a 1-wire rain gauge that I had in my misc parts bin. It has a 6-inch bowl, so it takes about 400 ml of water to make an inch of measurable rain.
When I pour in an inch of "rain", the tipping bucket increments the status (count) by 36.
My first place to start scaling was the mcsTemperature startup screen, in the Devices/Files tab. Moving down to the Rain Count Calibration field, I set a count of 0.028 (1/36th of a gallon per click) and a rate of 1. This seems to get me graphs in the ballpark of reality, although the unit of measure is still "gallons". I assume I could do something similar for the rainfall rate.
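For what it's worth, the arithmetic behind that calibration factor is just this (a quick sanity-check sketch, not anything from mcsTemperature itself; the names are mine):

```python
# Tipping-bucket calibration: one inch of "rain" in the 6-inch bowl
# produced 36 tips, so each tip represents 1/36 of an inch.
TIPS_PER_INCH = 36
CAL_PER_TIP = 1 / TIPS_PER_INCH  # ≈ 0.028, the value entered in the plugin

def tips_to_inches(tips):
    """Convert a raw tip count to inches of rainfall."""
    return tips * CAL_PER_TIP

print(round(CAL_PER_TIP, 3))        # 0.028
print(tips_to_inches(36))           # 1.0 -- one full inch of test "rain"
```

So the numbers line up; the only thing wrong is the "gallons" label.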
This seems like kind of a roundabout way to measure rainfall and rainfall rate. Is there a cleaner solution? I've googled around, read the docs, and haven't found anything obvious.
Many thanks.