Mike Shepherd - 1 year ago
This lack of functionality kills the clicks-vs-code approach. I've developed many Process Builder solutions combined with flows to address our business needs, but I can no longer do anything en masse using Data Loader because of the CPU time limit timeouts and the lack of detailed debug logs to figure out which items could be improved. I've had to turn off every Process Builder or resort to using list view updates, which allow 200 record updates at a time, but that isn't the best use of my time when I have 250K records to update.
Tobias Nygren - 1 year ago
Desperately needed. We constantly have to disable our processes and replace them with either multiple workflows and actions or with triggers. Both of those work, but processes constantly give us CPU time limit errors for both our integrations and our regular bulk uploads.
Having 20+ triggers and workflows causes no issues, yet one process that never even fires any actions causes it all to fail. Please improve this, or clarify that Process Builder should not be used if you ever intend to run integrations or mass uploads against the object.
Lisa Johnson - 1 year ago
I just ran an NCOA update file where the batch size was set to 100 (3100 records total) just so each batch was small enough to run with our triggers and Process Builder routines active. I fully agree with Mr. Boddepalli's suggestion: please make sure that the backend code for Process Builder flows is as optimized as possible so it can work in cooperation (not conflict) with our daily work. Thank you!