I'm trying to write an "after update" trigger that does a batch update on all child records of the record that has just been updated. This needs to be able to handle 15k+ child records at a time. Unfortunately, the limit appears to be 100, which is so far below my needs it's not even close to acceptable. I haven't tried splitting the records into batches of 100 each, since this will still put me at a cap of 10k updates per trigger execution. (Maybe I could just daisy-chain triggers together? ugh.)

Does anyone know what series of hoops I can jump through to overcome yet another ridiculously short-sighted limitation by this awful development "platform"?

Edit: I'm calling the following @future function from my trigger, but it never updates the child records:

global class ParentChildBulkUpdater
{
    @future 
    public static void UpdateChildDistributors(String parentId) {
        Account[] children = [SELECT Id FROM Account WHERE ParentId = :parentId];

        for(Account child : children)
            child.Site = 'Bulk Updater Fired';
        update children;

    }
}
+1  A: 

It's worse than that: you're not even going to be able to get those 15k records in the first place, because there is a 1,000-row query limit within a trigger. (This scales with the number of rows the trigger is being called for, but that probably doesn't help.)

I guess your only way to do it is with the @future tag - read up on that in the docs. It gives you much higher limits. You can only make so many @future calls per day, though, so you may need to somehow keep track of which parent objects still have children to update, and then process them offline.

A final option may be to use the API via some external tool. But you'll still have to make sure everything in your code is batched up.

I thought these limits were draconian at first, but actually you can do a hell of a lot within them if you batch things correctly - we regularly update thousands of rows from triggers. And from an architectural point of view, much more than that and you're really talking batch processing anyway, which isn't normally kicked off by a trigger. One thing's for sure - they make you jump through hoops to do it.
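
A rough sketch of what I mean by batching - splitting the DML into fixed-size chunks so no single statement blows the per-statement limit (names are just placeholders, adapt to your objects):

    // Hypothetical helper: perform DML in chunks to stay under per-statement limits
    public static void updateInChunks(List<Account> records, Integer chunkSize) {
        List<Account> chunk = new List<Account>();
        for (Account a : records) {
            chunk.add(a);
            if (chunk.size() == chunkSize) {
                update chunk;
                chunk = new List<Account>();
            }
        }
        if (!chunk.isEmpty()) {
            update chunk; // remainder smaller than chunkSize
        }
    }

Note that all chunks still count against the same transaction's total DML row limit, so this only helps with per-statement caps.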

Codek
I tried the @future method route, but it doesn't seem to be firing in my sandbox - how long will I need to wait for these to go off?
Jake
@Jake In my experience, @future methods are executed within a few seconds. Check Setup > Monitoring > Apex Jobs to see if yours are being executed.
Adam
@Adam/@Codek, Can one of you take a look at the @future method I put in the main post? Is there any reason that wouldn't update the children like I would expect?
Jake
You need to add the "Site" column to the SOQL statement for the update to work correctly. One way to test something like this is to invoke it manually from the "System Log".
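That is, the query in the posted @future method would become:

    Account[] children = [SELECT Id, Site FROM Account WHERE ParentId = :parentId];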
Adam
Thanks a lot, you've been a huge help. I'm going to go with an external API tool for now and move to an API web service as soon as we get some hosting.
Jake
+1  A: 

I think Codek is right, going the API / external tool route is a good way to go. The governor limits still apply, but are much less strict with API calls. Salesforce recently revamped their DataLoader tool, so that might be something to look into.

Another thing you could try is using a Workflow rule with an Outbound Message to call a web service on your end. Just send over the parent object and let a process on your end handle the child record updates via the API. One thing to be aware of with outbound messages: it's best to queue up the process on your end somehow and respond to Salesforce immediately; otherwise Salesforce will resend the message.

Adam
It might come to this...thanks for the warning about responding immediately, by the way.
Jake
A: 

I believe in version 18 of the API the 1,000-row limit has been removed (so the documentation says, but in some cases I still hit a limit).

So you may be able to do it with batch Apex, or even a single Apex update statement. Something like:

List<ChildObject__c> children = new List<ChildObject__c>();

for (ChildObject__c c : [SELECT ... ]) {
    c.foo__c = 'bar';
    children.add(c);
}
update children;

Be sure you bulkify your trigger too - see http://sfdc.arrowpointe.com/2008/09/13/bulkifying-a-trigger-an-example/

Daveo
I was using a v18 trigger and I hit the limit at 400. It was already bulkified, so it only had one select and one update. The error I got stated that the problem was with the size of the DML statement.
Jake
Interesting - where do you see this? The docs here still show the limit is 1,000: http://www.salesforce.com/us/developer/docs/apexcode/index_Left.htm#StartTopic=Content%2Fapex_gov_limits.htm|SkinName=webhelp But also, there are good notes on batch Apex, which may be the way to go here: https://na1.salesforce.com/help/doc/en/salesforce_winter10_release_notes.pdf
Codek
+1  A: 

@future doesn't work (does not update records at all)? Weird. Did you try calling your function from an automated test? It should work, and the annotation should be ignored (during a test it is executed instantly; test methods have higher limits). I suggest you investigate this a bit more - it seems like the best solution for what you want to accomplish.

Also - maybe try to call it from your class, not the trigger?

Daisy-chaining triggers together will not work, I've tried it in the past.

Your last option might be batch Apex (from the Winter '10 release, so all organisations should have it by now). It's meant for mass data update/validation jobs - things you would typically run overnight in normal databases (and it can be scheduled). See http://www.salesforce.com/community/winter10/custom-cloud/program-cloud-logic/batch-code.jsp and the release notes PDF.
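
A bare-bones batch class for this case would look something like the following (class name and the Site field usage are just an illustration based on the code in the question):

    global class ChildAccountBatchUpdater implements Database.Batchable<SObject> {
        private final Id parentId;

        global ChildAccountBatchUpdater(Id parentId) {
            this.parentId = parentId;
        }

        global Database.QueryLocator start(Database.BatchableContext bc) {
            // A QueryLocator can cover far more rows than a trigger-context query
            return Database.getQueryLocator(
                [SELECT Id, Site FROM Account WHERE ParentId = :parentId]);
        }

        global void execute(Database.BatchableContext bc, List<Account> scope) {
            // Each execute() invocation gets a fresh set of governor limits
            for (Account child : scope) {
                child.Site = 'Bulk Updater Fired';
            }
            update scope;
        }

        global void finish(Database.BatchableContext bc) {}
    }

You kick it off with something like Database.executeBatch(new ChildAccountBatchUpdater(someParentId), 200); - the second argument is the chunk size passed to each execute() call.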

eyescream