home

Creating reliable timers in JavaScript

Disclaimer: here I talk about how JavaScript handles code execution and why it's a terrible idea to trust its built-in intervals to work accurately. If you just want to see some code, you can take a look at this component


For the last month or so I have been working on what is pretty much just a pomodoro timer app in React. The workflow of the app would be as follows:

  1. The users go in and create their tasks for the day defining the title and how many pomodoros there would be for each task.

  2. The users would pick which task they would do next, immediately starting the timer.

  3. After 25 minutes, the timer would ring, starting a 5-minute break. After the break, the user would either start the next 25-minute timer (if the task has more than one pomodoro of duration) or complete and clear the task.

  4. If the user completes their task before the timer ends, they can skip the 25 minutes and go directly to a 5-minute break, or they can also skip that break and go back to step 1.

Why?

Why would you want to waste your time remaking a project idea already made by thousands of people? Well, all of the pomodoro timers out there will have at least one of the following issues, upon which I wanted to improve:

  1. Not great UX
  2. Distracting/dated UI
  3. Flawed timer logic

It's easy to understand what I mean by bad UX and UI, but what do I mean by flawed timer logic?

The flaw

If you're a curious person you might have the urge to try and break software to see if the creators behind apps tested for all edge cases. I like to think that the best programmers build their apps so that even if the user tries to break them, they keep working.

On almost all pomodoro timers available online you'll see this particular behaviour where if you spam click the pause/resume button, time won't pass. Even those milliseconds between each click aren't being accounted for by the timer. But why is that? In order to understand this flaw, it's necessary to know how you would code a timer.

Basically, a timer is composed of some piece of code that periodically subtracts from some variable the amount of time passed since the last update. This timeleft variable is displayed, and then the update cycle repeats.

This would translate to something like this in JavaScript:

let timeleft = 5000;
let interval = setInterval(() => {
  if (timeleft > 0) {
    timeleft -= 1000;
  }
}, 1000);


The problem with almost all timer apps and components available is that they make the interval between executions exactly as long as the smallest unit of time they display. What I mean by this is that if they're only going to show how many minutes and seconds are left, they write code that updates the timer every 1 second. The problem with that is that every time I click pause and then resume, the next timeleft update gets scheduled for 1 second from now. So if I keep clicking that button at intervals shorter than the 1-second timeleft update, my timer will just freeze unless I wait more than 1 second to click the pause button again.
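To make the flaw concrete, here is a sketch of the pause/resume logic most of these timers end up with. The function names are my own, but the logic is typical:

```javascript
let timeleft = 5000; // milliseconds remaining
let interval = null;

function resume() {
  // Restart the countdown: the first tick is a full second away
  interval = setInterval(() => {
    if (timeleft > 0) {
      timeleft -= 1000;
    }
  }, 1000);
}

function pause() {
  // Cancel the pending tick; any time already waited is thrown away
  clearInterval(interval);
}

// Pausing faster than once per second means the callback never
// fires, so timeleft never decreases
resume();
pause();
```

Every resume() schedules the first tick a full second into the future, so clicking pause within that second discards the partially elapsed time entirely.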

It's an easy fix. All you've got to do is decrease the interval at which timeleft updates while also decreasing how much you subtract from it each time.

Updating the code I showed you before, it would look like this:

let timeleft = 5000;
let interval = setInterval(() => {
  if (timeleft > 0) {
    timeleft -= 100;
  }
}, 100);


This technically solves our problem. If it ended here, I would be a happy man, but there is a huge issue with the code above that can't be detected without some extra knowledge, and especially can't be fixed without some thinking.

The problem

So I went on with the development of my app and after implementing all of the features I wanted in order for it to be useful (including a sound for when the timer completed), I decided to test it while studying for tests. I added my tasks, immediately started the timer and opened some tabs with materials for my studying session.

I studied for some time until I looked at my phone and noticed that more than 25 minutes had passed without any indication that the time had ended. So I went to check, and when I opened my timer's tab only 5 minutes had passed for the timer.

What's happening?

I took some time to understand what was going on with my code. At first I thought it was some sort of performance problem related to React and how it handled re-renders. I knew that there was some connection between my timer lagging and the way the code was being executed, but I couldn't figure out why.

After some time debugging and not understanding what caused the issue, I decided to compare my timer with Google's built-in (and pretty accurate) timer that shows up on top of the results when you search for any x min timer. I noticed that sometimes my timer would have some delay in comparison to Google's even when my timer's window was focused and without any other background tabs running. So what was going on?

I searched around for a solution, but I couldn't find any. There were a couple of people online who had the same problem I did, but across many different frameworks, even in vanilla JavaScript.

After a while, I started noticing that the problem was related not only to the setInterval function but to the way JavaScript code gets executed as a whole.

The event loop

What I, a noob CRUD-making naive webdev, didn't think about was what was happening under the hood when I executed the code that made my timer work.

You've probably heard about something called the JavaScript event loop. If you haven't, I suggest you watch Philip Roberts' talk on JSConf about it; he goes pretty deep into it and the talk might make you start thinking about some of the low-level stuff you've never thought about while making your little web application. But even if you don't watch it, I'll explain the basics so you can understand what it has to do with our issue.

For our purposes, all that you need to know is that the event loop is the way JavaScript manages code execution. JavaScript, being a single-threaded programming language, can only run one piece of code at a time. That means that whenever your code is doing something, everything else waits until that computation has finished before execution resumes. This happens until your program has reached the end. The event loop is the pattern the runtime follows in order to make the single-threaded aspect of the language not suck so bad: callbacks (from timers, clicks, network responses and so on) are placed in queues, and each one only runs once the call stack is empty.
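You can see the queueing behaviour with a classic example: even a callback scheduled with a 0 ms delay only runs after all of the current synchronous code has finished.

```javascript
const order = [];

// Ask the event loop to run this "as soon as possible" (0 ms)
setTimeout(() => order.push("timeout callback"), 0);

// This synchronous code runs first, because the callback only
// gets picked up once the call stack is empty
order.push("synchronous code");

setTimeout(() => console.log(order.join(" -> ")), 10);
// prints "synchronous code -> timeout callback"
```

The 0 ms delay is not a promise, it's a minimum: the callback waits in a queue until the thread is free.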

The naive implementation

So what's the problem after all? It's simple: our implementation assumes that our code will execute instantly after those 100 milliseconds we defined as the second argument to the setInterval function. JavaScript, being a single-threaded language, can't guarantee our code will be executed as soon as those 100 milliseconds go by, because there could be some other computation going on right when my timer code should execute, creating a delay. Sometimes the delay was acceptable, but other times it looked like the timer had frozen.

It's possible to see the effects of being single threaded with the following code:

for (let i = 0; i < 100000; i++) {
  for (let j = 0; j < 10000; j++) {}
}

console.log("hi!");


If you run this code with Node.js or in your browser, you will see that "hi!" takes a good while to log because our thread gets busy running the nested for loops.

So if we can't trust JavaScript to execute our code on time, how can we code a program like a timer? And if accurate timers are impossible, how come the Google timer works and mine doesn't?

The attempts

I tried every type of fix I could think of.

Web workers

At first I tried creating another JavaScript thread just for the timer; that way I could ensure that my timer would always run on time (no pun intended). To do that you need to use something called a Web Worker, which exists exactly so you can have more than one thread running in your browser.

That didn't really work. Writing the code felt way too hard and awkward, and for something as simple as this there was no way I was making the right choice. So I had a feeling that I should go looking elsewhere.

window.requestAnimationFrame()

Then, I found a web API called window.requestAnimationFrame(). According to the MDN docs, this API tells the browser you want to perform an animation and asks it to call a function to update that animation before the next repaint. At first I thought that it would be weird to use this to update our timer, but after seeing that it takes a callback function as its argument, I thought to myself that this might work...

It didn't work, but I got one step closer to glory.

The lag was eliminated completely, but the approach didn't play well with my pause logic. Sometimes it paused but then couldn't resume the timer, and sometimes it just went crazy and wouldn't pause at all.

The depression

I didn't know what to do. Why was it so hard to make a pomodoro timer app? I just wanted to go on with my life, accepting that I had failed at coding a CRUD app.

I was prepared to give up on this project and call it a day after getting to some old cursed web APIs and 10-year-old questions on StackOverflow. But then I thought: how can I ensure that the way I measure elapsed time is accurate in any language? And the answer is: dates.

Dates

JavaScript (and computers in general, as a matter of fact) represents dates as the number of milliseconds since January 1, 1970 at 00:00:00 UTC. In order to get the current date in JavaScript you can use the Date.now() method on the Date global object. Why they picked 1970 exactly is beside the point, but I suggest you look it up.

I can't be sure about when JavaScript will execute my code, but I can definitely be certain that the Date.now() method will always give me an accurate representation of time. So now that I know that, how can I use it to make my timer not lag?
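A quick way to convince yourself: block the thread on purpose and measure with Date.now(). No matter how delayed your callbacks are, the difference between two Date.now() calls reflects real wall-clock time. (The loop bound here is arbitrary; tune it for your machine.)

```javascript
const start = Date.now();

// Keep the single thread busy with useless work
for (let i = 0; i < 10000000; i++) {}

// Date.now() still knows exactly how much real time passed,
// no matter what the thread was doing in between
const elapsed = Date.now() - start;
console.log(`the busy loop took ${elapsed} ms of real time`);
```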

The solution

I started thinking about how I would bring the accuracy of dates into my timer code. It's not complicated at all, but it's a bit harder to make the code work while also not having to rewrite the rest of the logic in my React app.

In regular JavaScript, it's actually an easy problem to solve.

What you want to do is stop thinking about the timer code that decreases timeleft as: every x milliseconds, subtract 100 milliseconds from timeleft. And start thinking about it as: every x milliseconds, subtract from timeleft the amount of time that actually passed since the last time I decreased it.

To visualize that better, let's code:

let deltaT = 0;
let tZero = Date.now();
let timeleft = 5000;

setInterval(() => {
  deltaT = Date.now() - tZero;
  timeleft -= deltaT;
  tZero = Date.now();
}, 100);


And that's it.
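For completeness, here is how the same idea extends to pause and resume. This is a sketch with names I made up (createTimer, tick and so on), not the exact code from my app, but it shows why the date-based approach stays accurate: each tick subtracts however much real time actually passed, so a late callback can't make the timer drift.

```javascript
function createTimer(durationMs) {
  return { timeleft: durationMs, tZero: Date.now(), interval: null };
}

// Subtract the real time elapsed since the last tick. Passing `now`
// explicitly makes the function easy to test.
function tick(timer, now = Date.now()) {
  timer.timeleft -= now - timer.tZero;
  timer.tZero = now;
}

function resume(timer) {
  timer.tZero = Date.now(); // don't count the time spent paused
  timer.interval = setInterval(() => tick(timer), 100);
}

function pause(timer) {
  clearInterval(timer.interval);
  tick(timer); // account for the time since the last tick
}
```

Spam-clicking pause/resume no longer freezes anything: pause() books whatever fraction of a tick had elapsed before stopping.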

If you want to see how I implemented this in React, check the disclaimer at the beginning of this article.

Final thoughts

Thank you for reading, it means a lot.