Many thanks...that did it. I'm a little suspicious of the magnet/sensor placement, since it only counts when the bucket tips in one direction, not the other.
If you set your rain bucket counter to the Todays Rainfall sensor type, the units displayed should be inches. If it is set as a water flow counter, it will show gallons.
The "standard" domestic rain bucket is 0.01 inches/count. The calibration approach you used appears to be the correct way to let mcsTemperature recognize that your count corresponds to something other than 0.01 inches.
I'm running HS2 with mcsTemperature. It has been working fine with my outdoor weather station, and I decided to add a 1-wire rain gauge I had in my misc parts bin. It has a 6-inch bowl, so it takes about 400 ml of water to represent an inch of measurable rain.
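For anyone wanting to double-check that figure, here's a quick geometry sketch. It assumes the collecting opening is a simple 6-inch-diameter circle, which is a simplification of the real funnel:

```python
import math

# Water volume that represents 1 inch of rain over a 6-inch circular opening.
ML_PER_CUBIC_INCH = 16.387

diameter_in = 6.0
area_sq_in = math.pi * (diameter_in / 2) ** 2   # ~28.3 square inches

# 1 inch of rain depth over that area, converted to milliliters
ml_per_inch = area_sq_in * ML_PER_CUBIC_INCH
print(round(ml_per_inch))  # ~463 ml
```

That comes out around 460 ml for a full 6-inch opening, in the same ballpark as the roughly 400 ml quoted above; the difference is plausibly down to the effective funnel diameter being a bit under 6 inches.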
When I pour in an inch of "rain", the tipping bucket increments the status (count) by 36.
My first place to start scaling was the mcsTemperature startup screen, in the Devices/Files tab. In the Rain Count Calibration field, I set a count of 0.028 (1/36th of a gallon per click) and a rate of 1. This gets my graphs into the ballpark of reality, although the unit of measure is still "gallons". I assume I could do something similar for the rainfall rate.
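The arithmetic behind that calibration value can be sketched as follows (the 36 tips/inch figure is from the pour test above; the field names are as I understand them, not confirmed against the plug-in docs):

```python
# Converting tip counts to rainfall for a non-standard tipping bucket.
# The pour test showed 36 tips per inch of simulated rain,
# so each tip represents 1/36 inch.
tips_per_inch = 36
inches_per_tip = 1 / tips_per_inch
print(round(inches_per_tip, 4))  # 0.0278 -- the ~0.028 calibration value

# Rainfall implied by an arbitrary accumulated count:
count = 36
rainfall_in = count / tips_per_inch
print(rainfall_in)  # 1.0 inch
```

So the 0.028 entered in the calibration field is just 1/36, expressed in whatever unit the sensor type happens to display.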
This seems like a roundabout way to measure rainfall and rainfall rate. Is there a cleaner solution? I've googled around, read the docs, and haven't found anything obvious.