Scheduled Notifications Part II: Handling bigger databases

Reading Time: 6 minutes

Have you ever wondered how apps like Tinder, Facebook, Reddit and many more send those annoying recurrent notifications reminding you that you haven’t read your messages, or letting you know there are new people near you? Well, in this post we show you an approach to creating your own annoying notifications when working with big databases.

 

Background

In the first part of this series we created Scheduled Notifications for tiny databases. There, the structure was the following:

users -> <unique user device> -> <user information>

The problem with that structure is that the Firebase free tier only allows us 50k Firestore reads per day. That means that if we get tons of users, we will exceed our free quota as soon as the scheduled notification finishes its execution. This happens because Firestore is designed around big collections being handled by a number of partitions (documents). So, we need to change the structure a bit to handle a greater user capacity. To do so, I will share some things I have learned since my last blog post.

Here we go...

The first thing I learned is that Firestore does not like doing lots of individual writes and reads. As pointed out in its documentation: “Firestore is optimized for storing large collections of small documents.”

Because of that, we need to find a way to categorize our information, so we have to choose something that describes our users (or whatever we are trying to describe). For this example, I will go with device detection (iOS or Android). This will work for now, but I do not think it is the best approach. Better candidates might be “subdomains” or “cities” (anything that partitions the data will be useful). So, try to find something that fits your needs.

Second, we are going to use array chunks, which means we are going to divide our full array into smaller ones. This is necessary since Firestore limits a single batch to 500 operations, so we are going to divide our array into chunks that respect that limit.
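As a rough sketch of the idea (the helper name `chunkArrayInGroups` matches the one used in the Cloud Function later, but any equivalent splitter works):

```javascript
// Split an array into consecutive chunks of at most `size` elements.
// This is what keeps each Firestore batch under the 500-operation limit.
function chunkArrayInGroups(arr, size) {
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}

console.log(chunkArrayInGroups([1, 2, 3, 4, 5], 2)); // → [[1, 2], [3, 4], [5]]
```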

Third (and related to chunks), we are going to use batches. A batch is Firestore’s way of committing multiple writes in a single transaction. Keep in mind that this does not mean our writes are billed as one: if the batch touches multiple documents, it counts one write for each of those documents.
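To make the pattern concrete, here is a minimal sketch of distributing writes across batches. The helper name `buildBatches` and the callback shape are my own for illustration; `db.batch()` and `batch.commit()` are the real Admin SDK calls:

```javascript
// Group write callbacks into Firestore batches of at most `limit` writes each.
// Firestore caps a single batch at 500 write operations.
function buildBatches(db, writes, limit = 500) {
  const batches = [db.batch()];
  writes.forEach((write, i) => {
    if (i > 0 && i % limit === 0) batches.push(db.batch()); // start a fresh batch
    write(batches[batches.length - 1]);                     // apply the write to it
  });
  return batches;
}

// Usage with the real Admin SDK would look roughly like:
//   const batches = buildBatches(db, users.map((u) => (b) => b.update(refFor(u), dataFor(u))));
//   await Promise.all(batches.map((b) => b.commit()));
```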

For this example I am assuming we have the code from the previous entry. On the React Native side, the function we are going to overwrite is firestoreNotificationData, and on the Node side we are going to update scheduleNotification.

Then, let’s start with React Native. We want to change the code so it looks like the following:

 

import { Platform } from 'react-native';
import moment from 'moment';
import messaging from '@react-native-firebase/messaging';
import firestore from '@react-native-firebase/firestore';
import AsyncStorage from '@react-native-async-storage/async-storage';

export const firestoreNotificationData = async (userId) => {
  const enabled = await messaging().hasPermission();
  if (enabled) {
    const fcmToken = await messaging().getToken();
    let token = await AsyncStorage.getItem('firebaseToken');
    if (fcmToken !== token) {
      await AsyncStorage.setItem('firebaseToken', fcmToken);
      token = fcmToken;
    }
    if (token) {
      const ref = firestore().collection('users').doc(Platform.OS);
      firestore()
        .runTransaction(async (transaction) => {
          const doc = await transaction.get(ref);
          if (!doc.exists) { // Case: the device platform document does not exist yet
            transaction.set(ref,
              {
                tokens: [
                  {
                    token,
                    uuid: userId,
                    lastUpdate: moment().unix()
                  }
                ]
              }
            );
            return userId;
          } else { // Case: the device platform document exists
            const collection = await firestore().collection('users').doc(Platform.OS).get();
            const tokensCollection = collection.data()['tokens'].find(element => element.uuid === userId);
            if (!tokensCollection) { // Case: this is a new uuid
              transaction.update(ref, {
                tokens: firestore.FieldValue.arrayUnion({
                  token,
                  uuid: userId,
                  lastUpdate: moment().unix()
                })
              });
              return userId;
            } // Case: the uuid already exists and needs an update
            // NOTE: At the moment, the only way to update an object inside an array
            // is to delete the object and add it again
            await firestore().collection('users').doc(Platform.OS).update({
              tokens: firestore.FieldValue.arrayRemove({
                token: tokensCollection.token,
                uuid: tokensCollection.uuid,
                lastUpdate: tokensCollection.lastUpdate
              })
            });
            await firestore().collection('users').doc(Platform.OS).update({
              tokens: firestore.FieldValue.arrayUnion({
                token,
                uuid: userId,
                lastUpdate: moment().unix()
              })
            });
            return userId;
          }
        })
        .catch((error) => {
          console.log('Transaction failed: ', error);
        });
      return 'ok';
    }
  }
  return 'ok';
};


So let’s try to break it down a little:

 

transaction.set(ref,
  {
    tokens: [{
      token,
      uuid: userId,
      lastUpdate: moment().unix()
    }]
  }
);

Overview

In the first case, we handle the situation where the document does not exist yet, so we need to create it and add all the information we want. Here we just create an array called “tokens”, where we store the uuid, the token, and the last time we updated the object.

If the document already exists, then we have two options: the user already exists or it does not. In case we have a new user, we simply add it to the tokens array:

 

transaction.update(ref, {
  tokens: firestore.FieldValue.arrayUnion({
    token,
    uuid: userId,
    lastUpdate: moment().unix()
  })
});

If we need to update an existing user, then we have to do the previous step plus removing the existing element from the array. Sadly, there is no way to avoid the deletion step. Our code should look like this:

await firestore().collection('users').doc(Platform.OS).update({
  tokens: firestore.FieldValue.arrayRemove({
    token: tokensCollection.token,
    uuid: tokensCollection.uuid,
    lastUpdate: tokensCollection.lastUpdate
  })
});

await firestore().collection('users').doc(Platform.OS).update({
  tokens: firestore.FieldValue.arrayUnion({
    token,
    uuid: userId,
    lastUpdate: moment().unix()
  })
});

Cool, we have our new structure. We are only missing the function that will execute our new schedule notification. So let’s check it out.

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const moment = require('moment');
admin.initializeApp();

const SEVEN_DAYS = 604800;
const runtimeOpts = {
  timeoutSeconds: 540,
  memory: '1GB'
};

// Helper used below: split an array into chunks of at most `size` elements
const chunkArrayInGroups = (arr, size) => {
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
};

exports.scheduleNotification = functions
  .runWith(runtimeOpts)
  .pubsub.schedule("0 10,14,20 * * *")
  .timeZone("America/New_York")
  .onRun(async (context) => {
    try {
      let db = admin.firestore();
      let batchIndex = 0;
      let success = 0;
      let failed = 0;
      const snapshot = await db.collection("users").get();
      const currentUnixTime = moment().unix();
      const chunkSize = 250;
      const batchArray = [];
      const users = [];
      const payload = {
        notification: {
          title: 'Annoying Notification',
          body: 'You just sent this automatically, good for you!'
        }
      };
      batchArray.push(db.batch());
      snapshot.forEach((doc) => {
        // This will create our new array with the users that have more than 7 days
        if (doc.data()["tokens"]) {
          doc.data()["tokens"].forEach((user) => {
            if (user.lastUpdate + SEVEN_DAYS <= currentUnixTime) {
              user.microsite = doc.id;
              users.push(user);
            }
          });
        }
      });
      const chunckyArray = chunkArrayInGroups(users, chunkSize);
      functions.logger.log("users number", users.length);
      functions.logger.log("chunckyArray: ", chunckyArray);
      functions.logger.log("currentUnixTime", currentUnixTime);
      if (users.length > 0) {
        await Promise.all(
          chunckyArray.map(async (FiveHundredArray, FHAindex) => {
            const tokens = FiveHundredArray.map((element) => element.token);
            const response = await admin.messaging().sendMulticast({
              notification: payload.notification,
              tokens,
            });
            failed = failed + response.failureCount;
            success = success + response.successCount;
            const promisesResults = await Promise.all(
              FiveHundredArray.map(async (user, index) => {
                await batchArray[batchIndex].update(
                  db.collection("users").doc(user.microsite),
                  {
                    tokens: admin.firestore.FieldValue.arrayRemove({
                      token: user.token,
                      uuid: user.uuid,
                      lastUpdate: user.lastUpdate,
                    }),
                  }
                );
                if (response.responses[index].success) {
                  return batchArray[batchIndex].update(
                    db.collection("users").doc(user.microsite),
                    {
                      tokens: admin.firestore.FieldValue.arrayUnion({
                        token: user.token,
                        uuid: user.uuid,
                        lastUpdate: moment().unix(),
                      }),
                    }
                  );
                }
              })
            );
            functions.logger.log("promisesResults: ", promisesResults);
            if (response.failureCount > 0) {
              batchArray.push(db.batch());
              batchIndex++;
              const failedTokens = await Promise.all(
                response.responses.map(async (resp, index) => {
                  const userToDelete = chunckyArray[FHAindex][index];
                  if (!resp.success) {
                    functions.logger.log("userToDelete: ", userToDelete);
                    await batchArray[batchIndex].update(
                      db.collection("users").doc(userToDelete.microsite),
                      {
                        tokens: admin.firestore.FieldValue.arrayRemove({
                          token: userToDelete.token,
                          uuid: userToDelete.uuid,
                          lastUpdate: userToDelete.lastUpdate,
                        }),
                      }
                    );
                  }
                  return userToDelete;
                })
              );
              functions.logger.log("failedTokens: ", failedTokens);
            }
            functions.logger.log("batchIndex", batchIndex);
            batchIndex++;
            batchArray.push(db.batch());
          })
        );
        const commitResults = await Promise.all(
          batchArray.map(async (batch) => {
            functions.logger.log("batchItem", batch);
            await batch.commit();
          })
        );
        functions.logger.log("commitResults", commitResults);
      }
      return {
        data: {
          total: success + failed,
          success,
          failed,
        },
      };
    } catch (error) {
      functions.logger.log("Error", error);
      return error;
    }
  });

All good so far?

Alright, lots of changes here. Let's break it down so it is easier to understand.

snapshot.forEach((doc) => {
  // This will create our new array with the users that have more than 7 days
  if (doc.data()["tokens"]) {
    doc.data()["tokens"].forEach((user) => {
      if (user.lastUpdate + SEVEN_DAYS <= currentUnixTime) {
        user.microsite = doc.id;
        users.push(user);
      }
    });
  }
});

In this first part, we build the array of users we are going to update: we fetch the full collection and keep only the tokens whose lastUpdate happened more than seven days ago.
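The staleness check itself is plain unix-timestamp arithmetic. A standalone sketch of the same filter (the helper name `staleTokens` is my own; the field names mirror the snippet above):

```javascript
const SEVEN_DAYS = 604800; // seconds in a week

// Keep only the tokens whose lastUpdate is at least seven days in the past.
function staleTokens(tokens, currentUnixTime) {
  return tokens.filter((user) => user.lastUpdate + SEVEN_DAYS <= currentUnixTime);
}

const now = 1700000000;
const tokens = [
  { uuid: 'a', lastUpdate: now - SEVEN_DAYS - 1 }, // stale: updated over a week ago
  { uuid: 'b', lastUpdate: now - 3600 },           // fresh: updated an hour ago
];
console.log(staleTokens(tokens, now).map((u) => u.uuid)); // → ['a']
```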

chunckyArray.map(async (FiveHundredArray, FHAindex) => {
  const tokens = FiveHundredArray.map((element) => element.token);
  const response = await admin.messaging().sendMulticast({
    notification: payload.notification,
    tokens,
  });

...

Next, we break the array we just created into chunks of 250 (the maximum number of operations Firebase allows in a single batch is 500). Since each of those 250 users may require two writes to update the array, the total number of operations can get close to 500. That is why the array is called FiveHundredArray.
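In other words, the worst case is two batch writes per user (one arrayRemove plus one arrayUnion), which is why 250 is the safe chunk size:

```javascript
const BATCH_LIMIT = 500;   // Firestore's maximum writes per batch
const WRITES_PER_USER = 2; // arrayRemove + arrayUnion in the worst case

// The largest chunk that can never overflow a batch:
const chunkSize = Math.floor(BATCH_LIMIT / WRITES_PER_USER);
console.log(chunkSize); // → 250
```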

failed = failed + response.failureCount;
success = success + response.successCount;
const promisesResults = await Promise.all(
  FiveHundredArray.map(async (user, index) => {
    await batchArray[batchIndex].update(
      db.collection("users").doc(user.microsite),
      {
        tokens: admin.firestore.FieldValue.arrayRemove({
          token: user.token,
          uuid: user.uuid,
          lastUpdate: user.lastUpdate,
        }),
      }
    );
    if (response.responses[index].success) {
      return batchArray[batchIndex].update(
        db.collection("users").doc(user.microsite),
        {
          tokens: admin.firestore.FieldValue.arrayUnion({
            token: user.token,
            uuid: user.uuid,
            lastUpdate: moment().unix(),
          }),
        }
      );
    }
  })
);

We are almost finished.

Finally, similar to what we did in React Native, we just need to update each user. We delete the users whose tokens no longer work, since keeping them only adds weight to your database. If the token is still valid, we add the user back into the array with a fresh lastUpdate.
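That pruning decision can be isolated from the Firestore calls. A sketch (the helper name `usersToPrune` is my own) that pairs each multicast response with its user, in the same order, and returns the ones whose tokens should be removed for good:

```javascript
// Given the per-token results of a multicast send and the users they map to
// (same order), return the users whose tokens failed and should be deleted.
function usersToPrune(responses, users) {
  return responses
    .map((resp, index) => (resp.success ? null : users[index]))
    .filter((user) => user !== null);
}

const responses = [{ success: true }, { success: false }, { success: true }];
const users = [{ uuid: 'a' }, { uuid: 'b' }, { uuid: 'c' }];
console.log(usersToPrune(responses, users)); // → [{ uuid: 'b' }]
```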

If you want to test this, you might want to add data to the Firestore database first. For that you can simply run your React Native application and try to log in with two different devices (one Android and one iOS device would be the best option).

To test the function itself you have a couple of options. I suggest creating a simple Node file to upload a JSON. I use this simple script, which takes a JSON file and uploads it to your Firestore database.

 

const firestoreService = require('firestore-export-import');
const firebaseConfig = require('./config.js');
const serviceAccount = require('./serviceAccount2.json');

// JSON To Firestore
const jsonToFirestore = async () => {
  try {
    console.log('Initializing Firebase');
    await firestoreService.initializeApp(serviceAccount, firebaseConfig.databaseURL);
    console.log('Firebase Initialized');

    await firestoreService.restore('./converted.json');
    console.log('Upload Success');
  } catch (error) {
    console.log(error);
  }
};

jsonToFirestore();

Then just wait, and you should get the notifications. Alternatively, you can remove

.pubsub.schedule("0 10,14,20 * * *")
  .timeZone("America/New_York")
  .onRun(async (context) =>

and change it to:

functions.runWith(runtimeOpts).https.onCall(async (input, context) =>

This way you will have a callable function. Deploy it to your Firebase functions and call it with Postman.

These few changes will help you handle your information in a better way. I hope you find this useful. If something else seems important to add, I will create a new blog post for it. Have a good one, and until next time.
