We have a pretty complex trigger suite on our org. We’ve built a lot of custom functionality. I’m starting to worry about governor limits and don’t really understand how to manage them. I’ve read through http://wiki.developerforce.com/page/Apex_Code_Best_Practices and I’ve written all my triggers accordingly: using as few SOQL and DML statements as possible, querying the database only when I have to, getting all the records I need at once, etc.
What I can’t figure out is – what if I just need to exceed the governor limit? What if I need to update 4,000 records when a certain other record they’re all related to changes? Am I out of luck? Is Salesforce really viable with large data sets?
Right now my client wants to update 20 different Product records, which could potentially update as many as 46,061 related opportunity line item records. They want a trigger that will update every related opportunity line item, which in turn fires triggers that update other custom objects related to the opportunity line items – it multiplies fast, and hits governor limits hard. I’m not sure how to implement @future methods, and even those have limits that I may or may not hit.
What do I do?
I would look at using Batch Apex. It can handle the large data sets. I have found that the code is generally cleaner to write than triggers (e.g., different logic for insert/update, etc.) when the data model is very complex.
Even if your list size is greater than 200, Salesforce will break up your lists into chunks of 200 for trigger processing. Each chunk contributes to the same shared limit, though. If you performed one bulkified SOQL query per trigger invocation, you’d be able to process 100 * 200 = 20,000 records of that object in a single transaction before hitting the synchronous limit of 100 SOQL queries (the 101st query throws the limit exception). If you are just using DML, you’d have 150 * 200 = 30,000. That only accounts for a situation where a single query or single DML statement is issued per invocation.
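To make that accounting concrete, here is a minimal sketch of a bulkified trigger (the trigger name and queried fields are illustrative, not from the original post). Each 200-record chunk fires the trigger once and spends one of the transaction’s shared 100 SOQL queries; the `Limits` class lets you watch consumption as you go.

```apex
// Hypothetical bulkified trigger: one SOQL query per 200-record chunk.
// Updating 20,000 line items = 100 chunks = 100 queries, which is
// exactly the synchronous per-transaction limit of 100 SOQL queries.
trigger OliLimitDemo on OpportunityLineItem (before update) {
    // Trigger.new holds at most 200 records per invocation.
    Set<Id> oppIds = new Set<Id>();
    for (OpportunityLineItem oli : Trigger.new) {
        oppIds.add(oli.OpportunityId);
    }
    // A single query for the whole chunk (1 of up to 100 per transaction).
    Map<Id, Opportunity> opps = new Map<Id, Opportunity>(
        [SELECT Id, StageName FROM Opportunity WHERE Id IN :oppIds]
    );
    // Limits apply to the whole transaction, not per chunk.
    System.debug('Queries used: ' + Limits.getQueries()
        + ' of ' + Limits.getLimitQueries());
}
```

If the trigger instead queried inside the `for` loop, each chunk would consume up to 200 queries and the transaction would fail on the very first chunk – which is why bulkification is the baseline, not an optimization.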
In a complex data model you can definitely have many more queries occurring in your triggers, because your single save will trigger saves on other related records (e.g., a roll-up summary triggering a save on the parent). It is worth keeping in mind what happens when you save a record as you design. If you put triggers on every object, it can be cumbersome to figure out why something is getting updated on a related record, and the solution becomes harder to maintain.
On top of all of that, if your data model is really that complex you have to consider what will happen when you set up unit tests. If you have triggers on every object with many queries and updates you can definitely get into a situation where you approach the 101 SOQL Query limit when you are setting up the unit test data.
There’s a developer force blog post that has some high-level summaries on working with large data sets and links to other resources.
When dealing with large data sets your best bet is to leverage Batch Apex. It has the limitation of being asynchronous, but you can process far more data than you normally could. It’s simpler than getting bogged down with @future-based calls (an older mechanism on the platform), and each batch execution gets its own fresh set of governor limits to work with.
The trigger could create an instance of your batch class, passing to it (via the constructor or similar) the list of IDs for the records to be processed.
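Here is a sketch of that pattern under some assumptions: the trigger fires on Product2, the batch recalculates related line items, and the class and field names (`OliUpdateBatch`, the recalculation itself) are illustrative placeholders for whatever your org actually does.

```apex
// Hypothetical trigger: hand the changed Product IDs off to Batch Apex
// instead of updating tens of thousands of line items synchronously.
trigger ProductTrigger on Product2 (after update) {
    Database.executeBatch(new OliUpdateBatch(Trigger.newMap.keySet()), 200);
}

// Batch class: receives the Product IDs via its constructor, then
// processes related line items in scopes of 200, each execute() call
// running in its own transaction with its own governor limits.
public class OliUpdateBatch implements Database.Batchable<SObject> {
    private Set<Id> productIds;

    public OliUpdateBatch(Set<Id> productIds) {
        this.productIds = productIds;
    }

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // A QueryLocator can return up to 50 million records.
        return Database.getQueryLocator(
            'SELECT Id, Quantity FROM OpportunityLineItem ' +
            'WHERE Product2Id IN :productIds'
        );
    }

    public void execute(Database.BatchableContext bc,
                        List<OpportunityLineItem> scope) {
        // Apply whatever recalculation your org needs here; this is
        // a placeholder DML to show where the work happens.
        update scope;
    }

    public void finish(Database.BatchableContext bc) {
        // Optionally chain another batch or send a completion email.
    }
}
```

Note that the downstream triggers on OpportunityLineItem still fire for each scope of 200 – but now their limit consumption is spread across many transactions instead of one.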
We do something similar and process tens of thousands of records every day as part of our “end of day” process. You have several tools in your toolbox. Batch Apex is already mentioned in the other answers. The other three tools you can use are @future jobs, queueable jobs, and scheduled jobs. If you don’t need to run everything synchronously, these let you increase your headroom by breaking the work into multiple independent transactions, each with its own governor limits. Queueable jobs are a relatively recent addition and are very powerful and flexible.
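A minimal Queueable sketch, with illustrative names (`OliUpdateQueueable`, the per-slice DML): the trigger enqueues the job with a list of record IDs, and because queueables can chain (unlike @future methods), the job can re-enqueue itself to process the remainder in a fresh limit context.

```apex
// Hypothetical Queueable: process a slice of line items per job,
// then chain another job for the rest. Each job gets its own
// asynchronous governor limits.
public class OliUpdateQueueable implements Queueable {
    private List<Id> oliIds;

    public OliUpdateQueueable(List<Id> oliIds) {
        this.oliIds = oliIds;
    }

    public void execute(QueueableContext ctx) {
        // Take the first slice of IDs for this job.
        Integer sliceSize = Math.min(200, oliIds.size());
        List<Id> slice = new List<Id>();
        for (Integer i = 0; i < sliceSize; i++) {
            slice.add(oliIds[i]);
        }
        // Placeholder work: re-save the slice so downstream
        // triggers run within this job's own limits.
        update [SELECT Id FROM OpportunityLineItem WHERE Id IN :slice];

        // Chain the remainder. Chaining is not allowed from tests,
        // so guard with Test.isRunningTest().
        if (oliIds.size() > sliceSize && !Test.isRunningTest()) {
            List<Id> rest = new List<Id>();
            for (Integer i = sliceSize; i < oliIds.size(); i++) {
                rest.add(oliIds[i]);
            }
            System.enqueueJob(new OliUpdateQueueable(rest));
        }
    }
}
```

Invocation from a trigger is one line: `System.enqueueJob(new OliUpdateQueueable(lineItemIds));`. For your 46,061-record scenario, Batch Apex is still the better fit, but queueables shine for medium-sized follow-up work kicked off from a trigger.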