Bulk API V2 throws an "Invalid Batch" error when uploading a CSV containing column headers exceeding 4000 characters
Last updated 2020-01-04 · Reference W-6571401 · Reported By 4 users
When the column headers in the CSV exceed 4000 characters in total, an "Invalid Batch" error is thrown: the header row is truncated after the 4000th character, so the field name at the cut-off point is not recognized.
- Actual Result
An "Invalid Batch" error is thrown.
- Expected Result
Bulk insertion should succeed.
1) Create a custom object with enough fields that the combined CSV column headers exceed 4000 characters.
2) Follow the steps in the walkthrough (https://developer.salesforce.com/docs/atlas.en-us.api_bulk_v2.meta/api_bulk_v2/walkthrough_upload_data.htm) and attempt to upload CSV data containing all of the fields in the custom object.
3) View the bulk data job in the "Bulk Data Jobs" menu and observe that the batch fails with an error similar to:
"null:InvalidBatch : InvalidBatch : Field name not found : MY_TEST_. Batch will not be retried."
- This issue appears to affect only Bulk API V2. As a workaround, use Bulk API V1 if the CSV column headers exceed 4000 characters.
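The workaround can be applied programmatically. A hedged sketch (the helper name is invented for illustration; the 4000-character threshold comes from the behavior described in this article) that picks the API version based on the header row length:

```python
# Observed truncation point for CSV headers in Bulk API V2.
HEADER_LIMIT = 4000

def bulk_api_version_for(csv_header_row: str) -> int:
    """Return 1 (fall back to Bulk API V1) when the header row is
    too long for Bulk API V2, otherwise 2."""
    return 1 if len(csv_header_row) > HEADER_LIMIT else 2
```

A caller would read the first line of the CSV and route the upload accordingly, e.g. `bulk_api_version_for(open("data.csv").readline().rstrip("\n"))`.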
Is it Fixed?
Any unreleased services, features, statuses, or dates referenced in this or other public statements are not currently available and may not be delivered on time or at all. Customers who purchase our services should make their purchase decisions based upon features that are currently available.