Do Software Timers cause setup() to be called on every firing of the timer?


I am experimenting with software timers and I am wondering how they are supposed to work. I am using 2.1.50 Beta 3, so perhaps this is a bug?

I basically have:

Timer timer(60000, processTimer);

void setup() {
}

void loop() {
}

void processTimer() {
    // Stuff
}
Every 60 seconds, the board reconnects to the Particle cloud, and I can tell that setup() is being called again. Is this intentional?


What are you doing in the callback? Keep in mind this is an interrupt function, so holding or blocking in here will cause issues. For example, if this function blocked for more than 3 seconds, it would block the WDT from being notified and the system would restart.

It is usually best practice to set flags in interrupt functions and then process those flags back in the main context, in the loop() function. That ensures you don't stall the system.


I am reading from some I2C sensors and then calling Particle.publish(). I will set a flag and move that work into loop().

My reason for experimenting with software timers was to increase battery life.


I have started a low power optimization tutorial in our docs on staging:

This needs a lot more work before it is pushed to production, especially around connection intervals, which is the next major part I will add. If you have any feedback or questions, let me know so I can try to make this as comprehensive as possible. Thanks!


Do you have code that you use to test various TX power options? Something that would start at the highest level and lower it over some period of time to arrive at the lowest stable option. It could publish a status report periodically so we could monitor its progress remotely. If that's possible, why not take it a step further and allow this algorithm to be part of a normal program, ratcheting the TX power up and down as needed?


I don’t have a script like that, no.

I am not sure I would want to include something like that in the default firmware, as TX power is pretty specific to certain applications. Some apps may want a fixed low power at all times because things are stationary, while others may want fixed high power to get the best range. The risk of changing it on the fly is a disconnect, and if disconnects start piling up you lose any benefit of lower-power transmissions, since connecting to the cloud is the highest point of power consumption.

Certainly the user code can do whatever it would like, and I think that is a better default behavior.


Yeah, I meant within user code: something that seeks out an ideal TX power. To save power in the long run, you could have it set up to only do this at startup, or perhaps hard-code the result and set up a function that you can call remotely to re-optimize if needed. Initially I was just looking for something to automate the testing and report back, since flashing with different static values and then monitoring the outcome over a sufficient period of time sounded a bit time-consuming. If I end up writing something to do that I'll share it :wink: