Todd Esposito
Synaptic Spillage

When a Spinner Just Won't Do

Streaming is Magic. Ok, not magic, but still pretty neat.

I'm not talking about video or audio streaming here. I'm talking about using streaming to keep a web-app user aware of the progress of a possibly long-running server-side process. Because sometimes, a spinner just doesn't cut it.

A Little Background

I have a client using a cloud-based inventory control system. It works really well, but there are a couple places where the system doesn't line up with my client's processes. Luckily, there is a very nice REST API to work with the system's data.

One of the processes is provided by a legacy system which doesn't know from HTTP, but can export a nice CSV file, so I've connected their business process via a web-app they can kick off as needed, which happens several times a day. Basically, we have to adjust some values in some records based on some factors from the data from the legacy system. You know, business rules.

The App

The app (applet? it's part of a larger collection, so maybe applet) uses the system's API to grab a particular set of records, match them up with the data in the file from the legacy system, and update some subset of the original records with data from the file. This means multiple round-trips to the API, and even though the API is quick, it's not instantaneous. Plus, there's a small amount of data munging and list searching going on, so that takes time, too.

The Problem

On busy days, a single run of this process might need to update upwards of a hundred records at a time, and the process is run by the user (no, it can't be a background task), so they have to sit and wait for it to complete before moving on with their work. And this can happen several times a day.

A simple spinner sitting there for even a few minutes will make my users nervous. They need real-time feedback to be reassured that things are going well. Or going at all.

The Solution: Streaming!

Rather than POST the file from a form and wait for the process to complete and return a status page, we'll leverage the power of streaming to deliver real-time status updates as the process processes.

Client-Side

We'll use JS on the app page to send the file. There's an <input type="file"> field on the page, whose change handler POSTs the file to our server-side code:

function onFileSelected() {
    fetch('/processHandler', {
        method: 'POST',
        body: this.files[0],
    })
}

Nothing unusual there, and this alone doesn't solve the problem. We also need to wait for results from the server and show the stream of updates to the user. We wait for the fetch Promise to resolve, grab the response's body (a ReadableStream), and then pass the "reader" for the body into a function to handle the stream-y nature of the body.

fetch('/processHandler', {
  method: 'POST',
  body: this.files[0],
})
.then(response => response.body)
.then(body => updateProgress(body.getReader()))

The neat part is in that updateProgress function:

async function updateProgress(reader) {
    const decoder = new TextDecoder('utf-8')
    let done, value
    const status = document.createElement('pre')
    const container = document.getElementById('statusMessage')
    container.appendChild(status)
    while (!done) {
        ({value, done} = await reader.read())
        if (done) {
            const conclude = document.createElement('p')
            conclude.innerHTML = "Done."
            container.appendChild(conclude)
            return
        }
        status.innerHTML += decoder.decode(value)
    }
}

We create an element called status to append to our #statusMessage element (I'm not showing the HTML because you can probably build it yourself, but it's a <div> in my source). The status element is where the data returned periodically from the server will live. In my case, I'm returning plain text, which renders nicely in a <pre>, but you can use whatever you'd like.

Then, we repeatedly wait for the reader to give us some text, which we append to the status element's innerHTML property as it arrives. When the reader returns a truthy done, we append a paragraph to #statusMessage to say so.

Server-Side

Our server side is built in Python using Flask. We have the normal scaffolding around our code in the form of a @route decorator, and we return a Flask Response object. Here we're building the Response object directly, rather than using one of the utility functions, such as render_template. If we were using the usual templating feature, the entire template would have to be built before anything was sent to the client. Not what we want.

We give our Response object a generator, made by calling our innerHandler function, which will produce the data to send to the client, and we set the content type to text/event-stream so the client browser understands it should expect the data to be delivered in discrete chunks rather than all at once.

@app.route("/processHandler", methods=["POST"])
def processHandler():
    @stream_with_context
    def innerHandler():
        ...
    return Response(innerHandler(), content_type='text/event-stream')

Note that innerHandler is decorated with @stream_with_context. This decorator, imported from Flask (from flask import stream_with_context), ensures our request context is still available when innerHandler runs, even though processHandler itself has already returned. Gotta have that.

We'll leverage Python's yield keyword to send little bits of text back to the client as we move along our process. Flask will dutifully send these bits to the client as we emit them.
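To see the shape of this before wiring in Flask, here's a minimal sketch in plain Python: a generator yields its chunks one at a time, and the caller (Flask, in our app) can forward each chunk as it's produced. The record keys here are made up for illustration.

```python
# Minimal sketch: a generator produces status chunks one at a time.
# In the real app, Flask sends each chunk to the client as it is yielded.
def progress_messages():
    yield "Examining data...\n"
    for key in ("A1", "B2"):          # hypothetical record keys
        yield f"Retrieving data for record {key}..."
        yield "Updating record\n"

chunks = list(progress_messages())
```

Each step of the generator runs only as far as the next yield, which is exactly what lets Flask trickle the output out as it happens instead of buffering the whole response.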

So that our user gets some feedback right away, we'll yield a status message before we actually pull the request data.

@app.route("/processHandler", methods=["POST"])
def processHandler():
    @stream_with_context
    def innerHandler():
        yield "Examining data...\n"
        filedata = str(request.data, 'utf-8')
        ...
    return Response(innerHandler(), content_type='text/event-stream')

We munge each line in the uploaded file to extract a key we can use to get the corresponding data from the API. Since this call may take some time, we signal the user before we call the API.

@app.route("/processHandler", methods=["POST"])
def processHandler():
    @stream_with_context
    def innerHandler():
        yield "Examining data...\n"
        filedata = str(request.data, 'utf-8')
        for line in filedata.split("\n"):
            # ... snip ... : here we get a key from the line
            yield f"Retrieving data for record {key}..."
            record = getDataFromAPI(key)
            ...
    return Response(innerHandler(), content_type='text/event-stream')
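The "... snip ..." comment above stands in for the munging, and the real rules depend on the legacy system's export format. As a purely hypothetical sketch, assuming the record key is the first CSV column:

```python
# Hypothetical key extraction, assuming the key is the first CSV column.
# The real munging depends on the legacy system's export format.
def key_from_line(line: str) -> str:
    return line.split(",", 1)[0].strip()

key = key_from_line("SKU-1001,42,widget")   # "SKU-1001"
```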

Finally, update the data in the object, then hit the API again with the updated record. Again, we signal the user before we hit the API.

@app.route("/processHandler", methods=["POST"])
def processHandler():
    @stream_with_context
    def innerHandler():
        yield "Examining data...\n"
        filedata = str(request.data, 'utf-8')
        for line in filedata.split("\n"):
            # ... snip ... : here we get a key from the line
            yield f"Retrieving data for record {key}..."
            record = getDataFromAPI(key)
            # ... snip ... : update the record with data from the line
            yield "Updating record\n"
            updateDataViaAPI(record)
    return Response(innerHandler(), content_type='text/event-stream')

Notice that, inside the loop, the first yield doesn't end with a newline, while the second does. This keeps the getting and updating for a single record on the same line of the status updates (assuming the line fits in the window). This is why we use a <pre> element for the status updates.
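You can see the effect by joining a couple of chunks the way the browser effectively does as they arrive:

```python
# Two chunks as yielded by the loop: no newline after the first,
# so the second lands on the same line in the <pre>.
chunks = ["Retrieving data for record 42...", "Updating record\n"]
combined = "".join(chunks)   # "Retrieving data for record 42...Updating record\n"
```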

Voilà

So there you have a pretty simple client-server streaming solution to keep a user aware of the progress of a long-running process.

There are other features you may need to add, such as error handling, but I leave that as an exercise for another day.

Happy streaming!
