Exceeded maximum execution time in Google Apps Script [duplicate]

My Google Apps Script iterates through the user's Google Drive files, copying and sometimes moving them to other folders. The script always stops after about 5 minutes with no error message in the log.

I am sorting tens, and sometimes thousands, of files in one run.

Are there any settings or workarounds?


Solution 1:

One thing you could do (this of course depends on what you are trying to accomplish) is:

  1. Store the necessary information (e.g. a loop counter) in a spreadsheet or another permanent store (e.g. PropertiesService).
  2. Have your script terminate every five minutes or so.
  3. Set up a time-driven trigger to run the script every five minutes (or create a trigger programmatically using the Script service).
  4. On each run, read the saved data from the permanent store you've used and continue the script from where it left off.

This is not a one-size-fits-all solution; if you post your code, people will be able to assist you better.

Here is a simplified code excerpt from a script that I use every day:

var MAX_RUNNING_TIME = 5 * 60 * 1000;         // stop starting new work after ~5 minutes (the hard limit is 6)
var REASONABLE_TIME_TO_WAIT = 5 * 60 * 1000;  // give the new trigger enough time to fire

function runMe() {
  var startTime = (new Date()).getTime();

  //do some work here

  var scriptProperties = PropertiesService.getScriptProperties();
  // Properties are stored as strings, so convert back to a number
  // (fall back to 1 on the very first run, when the property is unset).
  var startRow = Number(scriptProperties.getProperty('start_row')) || 1;
  for (var ii = startRow; ii <= size; ii++) {  // 'size' = total number of items to process
    var currTime = (new Date()).getTime();
    if (currTime - startTime >= MAX_RUNNING_TIME) {
      scriptProperties.setProperty('start_row', ii);
      ScriptApp.newTrigger('runMe')
               .timeBased()
               .at(new Date(currTime + REASONABLE_TIME_TO_WAIT))
               .create();
      break;
    } else {
      doSomeWork();
    }
  }

  //do some more work here

}

NOTE#1: The variable REASONABLE_TIME_TO_WAIT should be large enough for the new trigger to fire. (I set it to 5 minutes, but I think it could be less than that.)

NOTE#2: doSomeWork() must be a function that executes relatively quickly (I would say in less than 1 minute).

NOTE#3: Google has deprecated Script Properties and introduced the Properties Service in its stead. The function has been modified accordingly.

NOTE#4: The second time the function is called, the loop counter comes back from the property store as a string, so it has to be converted back to a number (the Number() call in the code above takes care of that).

Solution 2:

Quotas

The maximum execution time for a single script is 6 mins / execution
- https://developers.google.com/apps-script/guides/services/quotas

But there are other limitations to familiarize yourself with. For example, you're only allowed a total trigger runtime of 1 hour / day, so you can't just break up a long function into 12 different 5 minute blocks.

Optimization

That said, there are very few reasons why you'd really need to take six minutes to execute. JavaScript should have no problem sorting thousands of rows of data in a couple seconds. What's likely hurting your performance are service calls to Google Apps itself.

You can write scripts to take maximum advantage of the built-in caching, by minimizing the number of reads and writes. Alternating read and write commands is slow. To speed up a script, read all data into an array with one command, perform any operations on the data in the array, and write the data out with one command.
- https://developers.google.com/apps-script/best_practices
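
As a rough illustration of that advice (sheet here is just a placeholder for any Sheet object), compare reading 100 cells one at a time with reading them in a single call:

// Slow: one call to the Spreadsheet service per cell.
var slowSum = 0;
for (var row = 1; row <= 100; row++) {
  slowSum += sheet.getRange(row, 1).getValue();
}

// Fast: one call for the whole column, then plain JavaScript.
var fastSum = sheet.getRange(1, 1, 100, 1).getValues()
                   .reduce(function(acc, r) { return acc + r[0]; }, 0);

The second version makes a single service call no matter how many rows are involved; the loop version makes one per row.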

Batching

The best thing you can possibly do is reduce the number of service calls. Google enables this by allowing batch versions of most of their API calls.

As a trivial example, instead of this:

for (var i = 1; i <= 100; i++) {
  SpreadsheetApp.getActiveSheet().deleteRow(i);
}

Do this:

SpreadsheetApp.getActiveSheet().deleteRows(1, 100);

In the first loop, not only did you need 100 calls to deleteRow on the sheet, but you also needed to get the active sheet 100 times as well. The second variation should perform several orders of magnitude better than the first.

Interweaving Reads and Writes

Additionally, you should also be very careful to not go back and forth frequently between reading and writing. Not only will you lose potential gains in batch operations, but Google won't be able to use its built-in caching.

Every time you do a read, we must first empty (commit) the write cache to ensure that you're reading the latest data (you can force a write of the cache by calling SpreadsheetApp.flush()). Likewise, every time you do a write, we have to throw away the read cache because it's no longer valid. Therefore if you can avoid interleaving reads and writes, you'll get full benefit of the cache.
- http://googleappsscript.blogspot.com/2010/06/optimizing-spreadsheet-operations.html

For example, instead of this:

sheet.getRange("A1").setValue(1);
sheet.getRange("B1").setValue(2);
sheet.getRange("C1").setValue(3);
sheet.getRange("D1").setValue(4);

Do this:

sheet.getRange("A1:D1").setValues([[1,2,3,4]]);

Chaining Function Calls

As a last resort, if your function really can't finish in under six minutes, you can chain together calls or break up your function to work on a smaller segment of data.

You can store data in the Cache Service (temporary) or Properties Service (permanent) buckets for retrieval across executions (since Google Apps Script executions are stateless).
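
For instance, a resume point could be stashed and read back like this (a rough sketch; lastProcessedIndex and the 'resume_index' key are made-up names):

// Hypothetical sketch: persist a resume point between executions.
// Cache entries expire (21600 seconds = 6 hours is the maximum),
// while script properties stick around until you delete them.
var props = PropertiesService.getScriptProperties();
props.setProperty('resume_index', String(lastProcessedIndex));

var cache = CacheService.getScriptCache();
cache.put('resume_index', String(lastProcessedIndex), 21600);

// On the next execution:
var resumeIndex = Number(props.getProperty('resume_index')) || 0;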

If you want to kick off another event, you can create your own trigger with the Trigger Builder Class or set up a recurring trigger on a tight schedule.
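
For example, a recurring time-driven trigger can be created in a single chain of calls (processNextChunk is a placeholder for whatever function picks up the remaining work):

// Hypothetical sketch: run processNextChunk() every 10 minutes.
ScriptApp.newTrigger('processNextChunk')
         .timeBased()
         .everyMinutes(10)
         .create();

Just remember to remove triggers you no longer need (ScriptApp.getProjectTriggers() plus ScriptApp.deleteTrigger()) once the work is done, so they don't keep firing.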

Solution 3:

Also, try to minimize the number of calls to Google services. For example, if you want to change a range of cells in a spreadsheet, don't read each one, mutate it, and store it back. Instead, read the whole range into memory (using Range.getValues()), mutate it there, and write all of it back at once (using Range.setValues()).

This should save you a lot of execution time.
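
A minimal sketch of that pattern (assuming a sheet variable and that, say, you want to double every numeric cell in the range):

// Hypothetical sketch: one read, in-memory work, one write.
var range = sheet.getRange("A1:C100");
var values = range.getValues();            // single read of the whole block
for (var r = 0; r < values.length; r++) {
  for (var c = 0; c < values[r].length; c++) {
    if (typeof values[r][c] === 'number') {
      values[r][c] = values[r][c] * 2;     // mutate in memory only
    }
  }
}
range.setValues(values);                   // single write of the whole block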