Update rate of a data bucket

I am trying to upload readings from my Arduino to a data bucket. The refresh mode is set to "stream by device", which should record 1 sample per second.


In the dashboard, I can see the samples every second without errors. However, the data bucket only receives a sample every 59 seconds.
I tried setting the sampling interval to 1 second (refresh mode: sampling interval, 1 second), but it fails without any warning.
Does anyone know how to increase the sample rate of the data bucket, or is 59 seconds per sample the limit?

Hi, the public server has a 1-minute write limitation on buckets; on private instances this limit is higher.

Hope this helps.

Thank you for your reply.
May I ask if there is any documentation showing how it can be increased? There are different licenses for private instances, and I want to know which one satisfies my requirements: not only the update rate but also the maximum number of records, etc.

The write rate is limited basically by the instance's capability. If you are using just one device writing one variable, you can write even more than once per second. Of course, this rate is limited by the number of devices writing simultaneously, and the maximum number of records is limited by the instance's storage capacity.

In my humble opinion, I strongly recommend reconsidering whether you truly need to store that amount of information. For example, electrical power logging analyzes a lot of data: according to the IEEE recommendation, you sample every 200 ms and calculate parameters, so there are 5 measurements per second, but the values actually kept are the maximum, the minimum and the mean over each 10-minute window. Basically, that huge number of samples is summarized into those 3 values for the analysis.

Yes, the data upload frequency does not need to be as high as I first expected.
I am new to IoT, so I just modified some of my existing code to upload the collected data, which is sampled at 1 fps. I can definitely pack the data into batches to reduce the write rate.
Thank you again for the suggestions.