[SOLVED] Support for bulk data bucket writes not working on private instance

I’m trying to take advantage of a feature introduced in platform version 5.1.1, the ability to send bulk data as described on this page: Platform Version 5.1.1.

I have a private deployment and data is provided by The Things Stack.

Does anyone have an example of how to handle bulk data in the decoder?

To validate that I can receive and process bulk data in the Thinger platform, I wrote the following NodeJS decoder, which is executed every time the webhook is triggered on the TTS side. Although I’m returning an array of datasets, what shows up in the data bucket is not expanded into multiple datasets as I expected.

module.exports.uplink = function(payload)
{
    const buffer = Buffer.from(payload, 'base64'); // raw TTS payload (unused in this test)
    let processed = {};

    // three hard-coded test datasets, each with its own timestamp
    let set1 = {};
    let set2 = {};
    let set3 = {};
    set1.ts = 1687189580000;
    set1.temperature = 31;
    set1.humidity = 91;
    set2.ts = 1687189680000;
    set2.temperature = 32;
    set2.humidity = 92;
    set3.ts = 1687169780000;
    set3.temperature = 33;
    set3.humidity = 93;
    processed[0] = set1;
    processed[1] = set2;
    processed[2] = set3;
    return processed;
};

This is what shows up in the bucket:

Hi!

I think something like the following should work:

module.exports.uplink = function(payload) {
    const buffer = Buffer.from(payload, 'base64');
    let processed = [];

    processed.push({
        ts: 1687189580000,
        temperature: 31,
        humidity: 91
    });

    processed.push({
        ts: 1687189680000,
        temperature: 32,
        humidity: 92
    });

    processed.push({
        ts: 1687169780000,
        temperature: 33,
        humidity: 93
    });

    return processed;
};

The main difference is that your processed is an Object, while Thinger is expecting an Array. However, I have never tested this from the plugins, so please let me know if it works!
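
As a side note, with real uplinks the same pattern applies to values decoded from the buffer instead of hard-coded ones. A minimal sketch, assuming a hypothetical payload layout where each record is 10 bytes (a 6-byte millisecond timestamp, a signed 16-bit temperature in tenths of a degree, and an unsigned 16-bit humidity in tenths of a percent):

module.exports.uplink = function(payload) {
    const buffer = Buffer.from(payload, 'base64');
    const RECORD_SIZE = 10; // hypothetical layout: 6-byte ts + int16 temperature + uint16 humidity
    let processed = [];

    // walk the buffer record by record and push one dataset per record
    for (let offset = 0; offset + RECORD_SIZE <= buffer.length; offset += RECORD_SIZE) {
        processed.push({
            ts: buffer.readUIntBE(offset, 6),                  // milliseconds since epoch
            temperature: buffer.readInt16BE(offset + 6) / 10,  // tenths of a degree
            humidity: buffer.readUInt16BE(offset + 8) / 10     // tenths of a percent
        });
    }

    return processed;
};

The important part is only that the decoder returns an Array, with one object per dataset and a ts field on each; the record layout above is just an assumption for illustration.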

Good news!!! This construct worked like a charm. Thank you so much, Alvaro!

I just re-tested it and got three separate entries in the auto-provisioned bucket, each with its own timestamp. Here is what it looks like after applying your suggestion:

[Screenshot: DataBucket_fixed]

Regards.
