Update Salesforce Data

This article describes how to use Single Dataloader to update data in Salesforce. The data is updated via a CSV file.
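For an update operation, the CSV file typically needs the Salesforce record Id of each row plus the columns whose values should change. A minimal sketch of preparing such a file with Python's standard `csv` module (the record Ids and field names below are hypothetical):

```python
import csv
import io

# Hypothetical records to update: each row carries the Salesforce record Id
# plus the fields whose values should change.
rows = [
    {"Id": "0015g00000AbCdEfG", "Name": "Acme Corp", "Phone": "555-0100"},
    {"Id": "0015g00000HiJkLmN", "Name": "Globex Inc", "Phone": "555-0199"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Id", "Name", "Phone"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()  # this text would be saved as the upload CSV
```

In practice you would write `csv_text` to a file (e.g. `open("update.csv", "w", newline="", encoding="utf-8")`) and upload that file in the steps below.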

  1. Log in to your AutoRABIT account.
  2. Hover your mouse over the DataLoader module and choose the option: Dataloader.
  3. Click Update on the right side of the screen.
  4. Choose your Salesforce Org and your Org Environment (production or development edition, sandbox, or pre-release).
  5. The corresponding URL and your Username are automatically generated based on the above selection.
  6. Click Login and fetch objects to fetch all the objects from your Salesforce Org.
  7. Select the object whose data you wish to update; for example, Account, Contact, or Lead. You can use the Search function to search through your objects and the Filter button to quickly filter your Standard/Custom objects.
  8. Click Next.
  9. On the next screen, you can import your file from your local directory. Upload the CSV file you wish to import by clicking the Upload button.
  10. The next step is to prepare your field mappings. Field mapping is the process of matching columns in your CSV file to fields in your Salesforce Org.
  11. You can automatically map the fields using Automap. It compares the destination fields with the fields available in the uploaded CSV file; wherever they match, the value is selected automatically.
  12. Make sure you have mapped all the required fields; otherwise, you cannot move forward. Click Next.
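Automatic mapping of this kind usually boils down to normalizing names on both sides and matching them up. The sketch below illustrates the idea with a case- and separator-insensitive comparison; the actual matching rules are AutoRABIT's own, and the headers and field names are hypothetical:

```python
def automap(csv_headers, org_fields):
    """Sketch of an Automap step: match CSV column names to Salesforce
    field names, ignoring case, spaces, and underscores."""
    def norm(s):
        return s.replace("_", "").replace(" ", "").lower()

    fields_by_key = {norm(f): f for f in org_fields}
    mapping = {}
    for col in csv_headers:
        field = fields_by_key.get(norm(col))
        if field is not None:
            mapping[col] = field  # matched; pre-select this field
    return mapping

# Hypothetical CSV headers and org fields:
mapping = automap(["Id", "account name", "Phone"],
                  ["Id", "Account_Name", "Phone", "Fax"])
# "account name" pairs with "Account_Name"; "Fax" stays unmapped.
```

Columns that find no match (and unmatched required fields) would still need to be mapped by hand before you can click Next.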
  13. On the Process Summary screen, you can:
    1. Give the process/job a name.
    2. Select the Category. Categories are used to classify and group processes with similar functionality. You can either select an existing category or create a new one by clicking the + icon.
    3. View the main object.
    4. View the operation type (update).
    5. Use Bulk API.
      About Bulk API
      The Bulk API is based on REST principles and is optimized for inserting, updating, and deleting large sets of data. You can use the Bulk API to process jobs either in serial mode or in parallel mode. Processing batches serially means running them one after another; processing batches in parallel means running multiple batches at the same time. Running more of a job's batches in parallel gives that job a higher degree of parallelism, which in turn gives your overall run better data throughput.
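The serial-versus-parallel distinction above can be sketched as follows: the records are split into batches (the Bulk API caps a batch at 10,000 records), and the batches are then processed one after another or concurrently. `process_batch` below is a placeholder for the real API request, so only the control flow is illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(records, size):
    """Split records into batches of at most `size` rows."""
    return [records[i:i + size] for i in range(0, len(records), size)]

def run_job(records, batch_size, parallel):
    """Sketch of serial vs. parallel batch processing."""
    def process_batch(batch):
        return len(batch)  # stand-in for an actual Bulk API request

    batches = chunk(records, batch_size)
    if parallel:
        # Parallel mode: several batches in flight at once.
        with ThreadPoolExecutor(max_workers=4) as pool:
            return sum(pool.map(process_batch, batches))
    # Serial mode: one batch after another.
    return sum(process_batch(b) for b in batches)
```

Either mode processes every record; parallel mode simply overlaps the batches, which is where the throughput gain comes from.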
  14. You can schedule your tasks to run regularly. You can choose between Daily, Weekly, or On-demand schedules.
  15. Finally, click Save to save your task and run it later.
  16. Your task will be listed at the top of the list on the Dataloader Summary screen.
  17. Click the Run button to run the dataloader immediately, before the scheduled time.
  18. Select the configurations here:
    • Use Batch Size.
      About Batch Size
      Whenever the Bulk API checkbox is left unchecked, the Batch API is in use.

      Salesforce Batch API is based on SOAP principles and is optimized for real-time client applications that update small numbers of records at a time. Although SOAP API can also be used for processing large numbers of records, it becomes less practical when data sets contain hundreds of thousands of records; in those cases, Bulk API is the best option. Batch API processes data in smaller batches than Bulk API, resulting in higher API call usage per operation on large volumes of data.
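The API-call difference is easy to see with a back-of-the-envelope calculation: a SOAP-based call handles up to 200 records, while a single Bulk API batch holds up to 10,000 (limits as commonly documented by Salesforce):

```python
import math

def api_calls(num_records, records_per_call):
    """How many calls/batches are needed to cover all records."""
    return math.ceil(num_records / records_per_call)

records = 100_000
batch_api_calls = api_calls(records, 200)      # SOAP/Batch API: 200 records per call
bulk_api_batches = api_calls(records, 10_000)  # Bulk API: 10,000 records per batch
# 500 calls vs. 10 batches for the same 100,000 records.
```

This is why, on large volumes, leaving the Bulk API checkbox checked is usually the better choice.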
    • Disable workflow rules.
      About Disable workflow rules
      The workflows of the Salesforce objects are deactivated, and the data is transferred from the source to the destination sandbox. Once the migration is complete, the workflows are reactivated.
    • Disable validation rules.
      About Disable validation rules
      Validation rules verify that the data a user enters in a record meets the criteria you specify before the user can save the record. On selection, all the validation rules of the Salesforce objects are deactivated, and the data is transferred from the source to the destination sandbox. Once the migration is complete, the validation rules are reactivated.
    • Insert/update with null values.
    • Use UTF-8 file encoding for file read and write operations.
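The last two options above can be sketched together: the file is read with an explicit UTF-8 encoding, and empty cells are either left as-is or normalized to nulls. How AutoRABIT represents nulls internally is not documented here, so the normalization below is an illustrative assumption:

```python
import csv
import io

# Sample CSV text; the first record has an empty Phone cell. When reading
# from disk you would use open(path, encoding="utf-8", newline="").
csv_text = "Id,Phone\n0015g00000AbCdEfG,\n0015g00000HiJkLmN,555-0199\n"

def load_rows(text, nulls=True):
    """Parse CSV rows; with nulls=True, empty cells become None,
    mimicking an 'insert/update with null values' setting."""
    rows = list(csv.DictReader(io.StringIO(text)))
    if nulls:
        rows = [{k: (v if v != "" else None) for k, v in r.items()}
                for r in rows]
    return rows
```

With the null option on, an empty cell would clear the field in Salesforce rather than leave its existing value untouched.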
  19. Click Run.
  20. The number of successful or failed records can be seen in the Results of Last Run section. Here, you can view the records or download them to your local system. The records are generated in CSV format.
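Because the results are plain CSV, they can be post-processed with a few lines of code. A sketch of tallying successes and failures; the column names below are hypothetical, as the actual layout of AutoRABIT's results file may differ:

```python
import csv
import io

# Hypothetical results-file layout: one row per record, with a Status
# column and an Error column populated for failures.
results_csv = (
    "Id,Status,Error\n"
    "001A,Success,\n"
    "001B,Failed,REQUIRED_FIELD_MISSING\n"
    "001C,Success,\n"
)

def summarize(text):
    """Count successful and failed records in a results CSV."""
    counts = {"Success": 0, "Failed": 0}
    for row in csv.DictReader(io.StringIO(text)):
        counts[row["Status"]] += 1
    return counts
```

A tally like this makes it easy to spot runs that need their failed rows fixed and re-uploaded.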

More Options

More options to update

  1. Edit: Modify or update the process details.
  2. Schedule: Set the schedule at which the process must be run.
  3. Delete: Deletes the update process.
  4. Clone: Creates a copy (clone) of the update process.
  5. Log: Logs provide information about the execution of the update task.
  6. VR/WFR: AutoRABIT lists all the validation/workflow rules that were set. The UI lists all the validation rules, and users will have to re-enable the disabled validation rules (if required). For more info, refer to the article: Validation/Workflow Rules.
