
Automatically chunk messages to not exceed 1000-message limit #83

Closed
hypesystem opened this issue Jan 19, 2015 · 6 comments

Comments

@hypesystem
Collaborator

GCM can take at most 1000 registration ids per request. If more than this are sent, the call will fail.

This is a documented problem (#82).

It can be solved by automatically chunking the calls, so that each call contains at most 1000 registration ids.
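To illustrate the idea, a minimal batching helper could look like this (a sketch only; `chunk` is a hypothetical helper, not part of node-gcm's API):

```javascript
// Hypothetical helper: split an array of registration ids into
// batches of at most `size` entries each.
function chunk(arr, size) {
    var batches = [];
    for (var i = 0; i < arr.length; i += size) {
        batches.push(arr.slice(i, i + size));
    }
    return batches;
}
```

A send with 2500 registration ids would then become three calls with 1000, 1000, and 500 ids respectively.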

@hypesystem
Collaborator Author

It is a bit tricky to find the right place to put this. I would consider putting it in Sender#sendNoRetry. But with Sender#send retrying on any failure, we need some sort of intelligence. If the first 1000 messages are sent successfully, but the next 1000 are not, how should this be handled?

Postponed to next release for timing reasons.

@hypesystem
Collaborator Author

One potential way of doing this would be simply checking in send. If the number of registrationIds is larger than 1000, it should use a library like async to call itself as many times as necessary in parallel.

Sender.prototype.send = function(message, registrationIds, ...) {
    // ...
    if(registrationIds.length > 1000) {
        var registrationIdBatches = splitEvery(registrationIds, 1000);
        // async.parallel expects an array of task functions, so async.map
        // (parallel by default) fits better here. Each batch reports back
        // through its own `next` callback, not the outer `callback`.
        async.map(registrationIdBatches, function(batch, next) {
            self.send(message, batch, ..., next);
        }, function(error, results) {
            if(error) {
                return callback(error);
            }
            // Flatten the per-batch result entries back into one list
            var resultEntries = _.flatten(_.map(results, function(result) {
                return result.results;
            }));
            var result = results[0];
            result.results = resultEntries;
            callback(null, result);
        });
        return;
    }
    // ...
}

This already isn't pretty, because it takes the chunks apart and then puts them back together again. Then take into account what happens if one of the run-in-parallel sends fails completely (returns an error) ... do we just fail completely? That's wrong. Some of the items were sent. Do we create custom error entries for all the missing ones? That seems like a lot of work...
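The "custom error entries" idea could be sketched like this (all names here are hypothetical, and the shape of the per-batch outcomes is an assumption, not node-gcm's actual API):

```javascript
// Hypothetical sketch: merge per-batch outcomes into one GCM-style result
// object, synthesizing an { error: ... } entry for every registration id in
// a batch that failed outright, instead of failing the whole send.
function mergeBatchResults(batches, outcomes) {
    var merged = { results: [] };
    outcomes.forEach(function(outcome, i) {
        if (outcome.error) {
            // One synthetic error entry per registration id in the failed batch
            batches[i].forEach(function() {
                merged.results.push({ error: outcome.error.message });
            });
        } else {
            merged.results = merged.results.concat(outcome.response.results);
        }
    });
    return merged;
}
```

That way the caller still gets one result entry per registration id, in order, and can tell which ids were never delivered.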

@hypesystem hypesystem modified the milestones: v0.10, v0.9.16 Feb 20, 2015
@hypesystem
Collaborator Author

This adds new behaviour, hence staged for v0.10 instead of v0.9.16.

@hypesystem hypesystem removed this from the v0.10 milestone Jul 15, 2015
@robertrossmann

Hi guys/gals!

I was unable to find a good place to submit a PR for this, so I decided it would be best handled within my own code, but I'd like to share my implementation of a function that splits the array into chunks. At least as a start. 😄

It's written in ES2015, but you can just replace all const and let with var and it will work on older runtimes.

/**
 * Slice an array into size-long arrays
 *
 * Given an array with 10 items, if we slice this array by 3, we will get the following:
 * [ [ 1, 2, 3 ]
 * , [ 4, 5, 6 ]
 * , [ 7, 8, 9 ]
 * , [ 10 ]
 * ]
 *
 * @param     {Array}     arr     Array to be sliced
 * @param     {Number}    size    The number of items each slice should have at most
 * @return    {Array[]}           An array of size-long arrays. The last item will be
 *                                potentially shorter, only containing the remaining
 *                                pieces
 */
function sliceBy (arr, size) {

  // Determine how many slices we will need to make
  const slices = Math.ceil(arr.length / size)
  // Here we will put the sliced pieces
      , res = []

  for (let slice = 0; slice < slices; slice++)
    res[slice] = arr.slice(slice * size, slice * size + size)

  return res
}

@hypesystem
Collaborator Author

Alternatively, you could use parallel-batch or something like lodash's chunk.

@eladnava
Collaborator

Let's continue discussion on #167.
