Tom Merrow

Bulk loader API

What are the limits to importing data via the bulk loader?
Steve Molis
Are you looking for this?

Bulk API Limits

Please note the following Bulk API limits:

API Usage Limits
Bulk API use is subject to the standard API usage limits. Each HTTP request counts as one call for the purposes of calculating usage limits.
Batch content
Each batch must contain exactly one CSV or XML file containing records for a single object, or the batch is not processed and stateMessage is updated. Refer to the enterprise WSDL for the correct format of object records.
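To make that concrete, here is a rough sketch of posting a single-object CSV batch with Bulk API 1.0. The instance URL, session ID, job ID, and API version are placeholders, not values from this post:

    import requests  # third-party HTTP library

    # Placeholder org values; substitute your own.
    INSTANCE = "https://na1.salesforce.com"
    SESSION_ID = "<session id from login>"
    JOB_ID = "<id of an open job for the Account object>"
    API_VERSION = "24.0"

    # One CSV file per batch, records for a single object only.
    # The header row uses field API names from the enterprise WSDL.
    csv_batch = (
        "Name,BillingCity,Phone\n"
        "Acme Corp,Boston,555-0100\n"
        "Globex Inc,Denver,555-0101\n"
    )

    resp = requests.post(
        f"{INSTANCE}/services/async/{API_VERSION}/job/{JOB_ID}/batch",
        headers={
            "X-SFDC-Session": SESSION_ID,
            "Content-Type": "text/csv; charset=UTF-8",
        },
        data=csv_batch.encode("utf-8"),
    )
    print(resp.status_code, resp.text)  # batchInfo XML with the batch id and state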
Batch fields
You can't import attachments or any other base64 (binary) fields into Salesforce.com with the Bulk API. An alternative is to use the Data Loader without the Bulk API enabled to import base64 fields.
Batch limit
You can submit up to 1,000 batches per rolling 24-hour period. You cannot create new batches associated with a job that is more than 24 hours old.
Batch lifespan
Batches and jobs that are older than seven days are removed from the queue regardless of job status. The seven-day period is measured from the youngest batch associated with a job, or from the age of the job if there are no batches. You cannot create new batches associated with a job that is more than 24 hours old.
Batch size
  • Batches consist of a single CSV or XML file that can be no larger than 10 MB.
  • A batch can contain a maximum of 10,000 records.
  • A batch can contain a maximum of 10,000,000 characters for all the data in a batch.
  • A field can contain a maximum of 32,000 characters.
  • A record can contain a maximum of 5,000 fields.
  • A record can contain a maximum of 400,000 characters for all its fields.
  • A batch must contain some content or an error occurs.
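If your source file is larger than these caps, split it before submitting. Here is a rough sketch in Python, assuming a plain CSV with a header row; the 10,000-record and 10 MB limits come from the list above:

    import csv
    import io

    MAX_RECORDS = 10_000          # records per batch
    MAX_BYTES = 10 * 1024 * 1024  # 10 MB per batch file

    def _csv_line(row):
        """Render one row as a properly quoted CSV line."""
        buf = io.StringIO()
        csv.writer(buf, lineterminator="\n").writerow(row)
        return buf.getvalue()

    def split_csv(path):
        """Yield CSV batch strings that stay under the record and size limits."""
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            header_line = _csv_line(next(reader))
            header_size = len(header_line.encode("utf-8"))

            rows, count, size = [header_line], 0, header_size
            for row in reader:
                line = _csv_line(row)
                line_size = len(line.encode("utf-8"))
                # Start a new batch when either cap would be exceeded.
                if count and (count >= MAX_RECORDS or size + line_size > MAX_BYTES):
                    yield "".join(rows)
                    rows, count, size = [header_line], 0, header_size
                rows.append(line)
                count += 1
                size += line_size
            if count:
                yield "".join(rows)

    # Usage: each yielded string is one batch file to submit to the job.
    # for batch in split_csv("accounts.csv"):
    #     ...POST batch to /services/async/<version>/job/<job id>/batch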
Batch processing time
There is a 10-minute limit for processing a batch and a five-minute limit for processing 100 records. If you get a timeout error when processing a batch, split your batch into smaller batches, and try again.
Compression
The only valid compression value is gzip. Compression is optional, but strongly recommended. Note that compression does not affect the character limits defined in Batch size.
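As a sketch, a gzip-compressed batch request looks roughly like this (placeholders again for the org-specific values); remember the character limits above apply to the uncompressed data:

    import gzip
    import requests

    INSTANCE = "https://na1.salesforce.com"   # placeholder
    SESSION_ID = "<session id>"               # placeholder
    JOB_ID = "<open job id>"                  # placeholder
    API_VERSION = "24.0"

    csv_batch = "Name\nAcme Corp\nGlobex Inc\n"

    resp = requests.post(
        f"{INSTANCE}/services/async/{API_VERSION}/job/{JOB_ID}/batch",
        headers={
            "X-SFDC-Session": SESSION_ID,
            "Content-Type": "text/csv; charset=UTF-8",
            "Content-Encoding": "gzip",   # request body is gzip-compressed
            "Accept-Encoding": "gzip",    # ask for a compressed response as well
        },
        data=gzip.compress(csv_batch.encode("utf-8")),
    )
    print(resp.status_code)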
Job abort
Any user with the correct permission can abort a job.
Job close
Only the user who created a job can close it.
Job content
Each job can specify one operation and one object. Batches associated with the job contain records of only that object. Optionally, the job can specify serial processing mode, which is intended only for cases where previously submitted asynchronous jobs have accidentally produced lock contention. Use serial mode only when advised by Salesforce.com.
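For illustration, creating a job in Bulk API 1.0 looks roughly like this; the concurrencyMode element is the optional serial switch, and the connection details are placeholders:

    import requests

    INSTANCE = "https://na1.salesforce.com"   # placeholder
    SESSION_ID = "<session id>"               # placeholder
    API_VERSION = "24.0"

    # One operation and one object per job. concurrencyMode is optional;
    # leave it out for normal parallel processing and use "Serial" only
    # when Salesforce.com advises it for lock contention.
    job_xml = """<?xml version="1.0" encoding="UTF-8"?>
    <jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
      <operation>insert</operation>
      <object>Account</object>
      <concurrencyMode>Serial</concurrencyMode>
      <contentType>CSV</contentType>
    </jobInfo>"""

    resp = requests.post(
        f"{INSTANCE}/services/async/{API_VERSION}/job",
        headers={
            "X-SFDC-Session": SESSION_ID,
            "Content-Type": "application/xml; charset=UTF-8",
        },
        data=job_xml.encode("utf-8"),
    )
    print(resp.text)  # jobInfo XML; the <id> element is the job id used for batches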
Job external ID
You cannot edit the value of an external ID field in JobInfo. When specifying an external ID, the operation must be upsert. If you try to use it with create or update, an error is generated.
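So an upsert job names the external ID field in the jobInfo it is created with, for example (Legacy_Id__c is a made-up custom external ID field):

    # jobInfo for an upsert job; Legacy_Id__c is a hypothetical external ID field.
    # POST this to the same /services/async/<version>/job endpoint shown above.
    # externalIdFieldName is only valid with upsert; with create or update the
    # request returns an error, and the value can't be edited afterwards.
    upsert_job_xml = """<?xml version="1.0" encoding="UTF-8"?>
    <jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
      <operation>upsert</operation>
      <object>Account</object>
      <externalIdFieldName>Legacy_Id__c</externalIdFieldName>
      <contentType>CSV</contentType>
    </jobInfo>"""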
Job lifespan
Batches and jobs that are older than seven days are removed from the queue regardless of job status. The seven-day period is measured from the youngest batch associated with a job, or from the age of the job if there are no batches. You cannot create new batches associated with a job that is more than 24 hours old.
Job open time
The maximum time that a job can remain open is 24 hours. The Bulk API does not support clients that, for example, post one batch every hour for many hours.
Job status in job history
After a job has completed, the job status and batch result sets are available for seven days, after which the data is deleted permanently.
Job status change
When you submit a POST body with a change in job status, you can only specify the status field value. If operation or entity field values are specified, an error occurs.
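In other words, the body of a close (or abort) request carries nothing but the state, roughly like this (placeholders for the org values):

    import requests

    INSTANCE = "https://na1.salesforce.com"   # placeholder
    SESSION_ID = "<session id>"               # placeholder
    JOB_ID = "<job id>"                       # placeholder
    API_VERSION = "24.0"

    # Only the state field goes in the body: "Closed" to close, "Aborted" to abort.
    # Including operation or object here causes an error.
    close_xml = """<?xml version="1.0" encoding="UTF-8"?>
    <jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
      <state>Closed</state>
    </jobInfo>"""

    resp = requests.post(
        f"{INSTANCE}/services/async/{API_VERSION}/job/{JOB_ID}",
        headers={
            "X-SFDC-Session": SESSION_ID,
            "Content-Type": "application/xml; charset=UTF-8",
        },
        data=close_xml.encode("utf-8"),
    )
    print(resp.status_code)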
Portal users
Regardless of whether the “API Enabled” profile permission is granted, portal users (Customer Portal, Self-Service Portal, Partner Portal, and PRM Portal) cannot access the Bulk API.