
timer() losing a minute?


resseger Jun 27, 2017 07:27 PM

I am trying to write a CR1000 program that triggers a PVS5120 sampler based on a water-stage trigger and time. Once the stage rises above a certain threshold, I start a timer and check it every scan, assigning the reading to the variable "elapsedsampletime". Once that timer exceeds a time threshold (called "risingSampleRate"), I take a sample and reset the timer. The problem I'm finding is that after the timer is reset at the end of the scan, the next scan one minute later does not acknowledge the minute that has already gone by. The timer is then 1 minute behind, so it takes an extra minute to reach the minimum time interval before sampling again. For example, if I set "risingSampleRate" to 10 minutes, it will only sample after 11 minutes have gone by.

 

General example:

Timer starts (Timer(0,min,2)) and the program goes through several scans.

Next scan:

elapsedsampletime = Timer(0,min,4), which returns 10

elapsedsampletime >= risingSampleRate, so a sample is triggered.

Timer is reset to 0. (Timer(0,min,2))

 

Next scan:

elapsedsampletime = 0, even though 1 minute has gone by since the timer was reset.

Is there any way to have the program not lose that minute? Or do I need to set all of my threshold times 1 minute less than desired?
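For reference, here is a minimal CRBasic sketch of the logic described above. Variable names other than elapsedsampletime and risingSampleRate, the threshold values, and the sampler-trigger line are assumptions, not from the actual program:

'Hypothetical sketch of the sampling logic (assumed names and values)
Public stage, elapsedsampletime
Const stageThreshold = 2.0    'example trigger stage
Const risingSampleRate = 10   'minimum minutes between samples

BeginProg
  Scan (1,Min,0,0)
    'stage measurement would go here
    If stage > stageThreshold Then
      elapsedsampletime = Timer(0,min,4)       'read the timer
      If elapsedsampletime >= risingSampleRate Then
        'trigger the PVS5120 here (e.g. pulse a control port)
        Timer(0,min,2)                         'reset and restart the timer
      EndIf
    EndIf
  NextScan
EndProg

With this structure, the reset at the end of the scan is what seems to cost the minute: by the time the next scan reads the timer, it reports 0 rather than the scan interval that has elapsed.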
