
Salesforce Apex trigger: process records in batches of fewer than 200 per transaction using Queueable chaining

Last updated on April 16, 2024

Consider a situation where you are bulk inserting records into a custom or standard object that has an Apex trigger. The trigger contains complex business logic, and you are running into Salesforce governor limits because the default batch size of 200 records is processed in a single transaction. You need a way to process these records in smaller batches to stay within the limits.

SOLUTION

The remedy lies in leveraging Queueable Apex jobs to process records in smaller, more manageable batches, chaining the remaining records into a new instance of the same job.

Take a look at the following Apex trigger. It invokes a Queueable Apex job and passes it the list of records to be processed. The job handles five (5) records at a time; once processed, those five records are removed from the list, and the remaining records are passed to a new instance of the same queueable job for further processing.

You can set the batch_size variable according to your specific requirements, choosing the exact number of records to process in each batch.

trigger CustomObjectTrigger on Custom_Object__c (after insert) {

    // copy Trigger.new into a new list, because the queueable job mutates the list it receives
    List<Custom_Object__c> records = new List<Custom_Object__c>();
    records.addAll(Trigger.new);
    System.enqueueJob(new ProcessRecordsInBatchesQable(records));
}
public class ProcessRecordsInBatchesQable implements Queueable {
    List<Custom_Object__c> recordsToProcess;
    Integer batch_size = 5;

    public ProcessRecordsInBatchesQable(List<Custom_Object__c> records) {
        this.recordsToProcess = records;
    }

    public void execute(QueueableContext context) {

        List<Custom_Object__c> batchRecords = new List<Custom_Object__c>();

        // take the first batch_size records off the front of the list
        while (!recordsToProcess.isEmpty() && batchRecords.size() < batch_size) {
            batchRecords.add(recordsToProcess.remove(0));
        }

        if (!batchRecords.isEmpty()) {
            processBatch(batchRecords);
        }
        // chain a new job for the remaining records
        if (!recordsToProcess.isEmpty()) {
            System.enqueueJob(new ProcessRecordsInBatchesQable(recordsToProcess));
        }
    }

    public void processBatch(List<Custom_Object__c> records) {
        // put your business logic and processing here
    }
}
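
To see the chain in action, you can bulk-insert a handful of records from Anonymous Apex and then watch the jobs appear under Apex Jobs in Setup. This is a minimal sketch; it assumes Custom_Object__c has a writable Name text field.

// Anonymous Apex: bulk-insert sample records to fire the trigger.
// Assumes Custom_Object__c has a writable Name field.
List<Custom_Object__c> samples = new List<Custom_Object__c>();
for (Integer i = 0; i < 23; i++) {
    samples.add(new Custom_Object__c(Name = 'Sample ' + i));
}
insert samples; // the trigger enqueues the first job; the rest are chained in batches of 5

With 23 records and a batch size of 5, you should see five executions of ProcessRecordsInBatchesQable in the Apex Jobs list (four full batches plus one batch of three).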

Furthermore, the Queueable interface offers distinct advantages: certain governor limits, such as the heap size limit, are more forgiving in asynchronous Apex than in synchronous Apex.
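
If you want to confirm the limits the job is actually running under, the Limits class reports both current usage and the ceiling at run time. A minimal sketch you could drop into processBatch:

// Inside processBatch: log consumption against the asynchronous limits.
System.debug('Heap used: ' + Limits.getHeapSize()
    + ' of ' + Limits.getLimitHeapSize() + ' bytes');
System.debug('SOQL queries used: ' + Limits.getQueries()
    + ' of ' + Limits.getLimitQueries());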

Resources

https://developer.salesforce.com/docs/atlas.en-us.apexcode.meta/apexcode/apex_queueing_jobs.htm

https://trailhead.salesforce.com/content/learn/modules/asynchronous_apex/async_apex_queueable