Delete Salesforce Data
  • 14 Dec 2022
  • 4 Minutes to read

This article describes how to use Single Dataloader to delete data from Salesforce. To delete records, all you need is a CSV file that contains the IDs of the records you want to delete in one of its columns. Once you have this file, proceed with the instructions below.
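For reference, the delete file only needs one column holding the record IDs; any other columns are ignored. A minimal sketch of validating such a file before upload (the file contents, column header `Id`, and the 15/18-character length check are illustrative assumptions, not ARM requirements beyond "the IDs must be in one column"):

```python
import csv
import io

# A minimal delete CSV: one column holding Salesforce record IDs.
# Other columns (Name here) are simply ignored by the delete job.
sample = """Id,Name
0015g00000XXXXXAAA,Acme Corp
0015g00000YYYYYAAA,Globex Inc
"""

# Sanity-check before uploading: collect the IDs and confirm each one
# has a plausible Salesforce ID length (15 or 18 characters).
ids = [row["Id"] for row in csv.DictReader(io.StringIO(sample))]
assert all(len(record_id) in (15, 18) for record_id in ids)
print(ids)
```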

  1. Log in to your ARM account.
  2. Hover your mouse over the DATALOADER module and select DATALOADER.
  3. Click DELETE on the right side of the screen.
  4. Choose your Salesforce Org and your org environment (Production or Development Edition, Sandbox, or Pre-Release).
  5. The corresponding URL and your Username are automatically generated based on the above selection.
  6. Click LOGIN AND FETCH OBJECTS to fetch all the objects from your Salesforce Org.
  7. Select the object whose data you wish to delete, for example Account, Contact, or Lead. You can use the search function to search through your objects and the filter button to quickly filter standard/custom objects.
  8. Click NEXT.
  9. On the next screen, import your file from your local directory: click the UPLOAD button and select the CSV file you wish to import.
  10. A notification pop-up will display the number of records that will be impacted. Click OK.
  11. The next step is to prepare your field mappings. Field mappings match columns in your CSV to fields in your Salesforce Org. 
  12. You can map the fields automatically using Automap. It compares the destination fields with the columns available in the uploaded CSV file and, where the names match, selects the mapping automatically.
  13. The number of fields mapped out of the total number of fields is displayed below the Automap checkbox.
  14. Use the search option to look up a field by name from the long list in order to map it. 
  15. Use the Filter dropdown to choose which fields to display:
    • ALL: Displays all fields, whether they have been mapped or not.
    • MAPPED: Displays only the fields that have already been mapped.
    • UNMAPPED: Displays only the fields that haven't been mapped yet.
      After selecting the filter, the list updates automatically as you map or unmap each field.
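The Automap comparison from step 12 and the ALL/MAPPED/UNMAPPED filter from step 15 can be sketched roughly as follows (the case-insensitive name match and the example field names are assumptions for illustration; ARM's actual matching rules are its own):

```python
def automap(csv_columns, sf_fields):
    """Map each Salesforce field to a CSV column with the same name
    (case-insensitive), mimicking the Automap behaviour described above.
    Unmatched fields map to None."""
    lookup = {col.lower(): col for col in csv_columns}
    return {field: lookup.get(field.lower()) for field in sf_fields}

def filter_fields(mapping, mode):
    """Apply the ALL / MAPPED / UNMAPPED filter to the current mapping."""
    if mode == "MAPPED":
        return [f for f, col in mapping.items() if col is not None]
    if mode == "UNMAPPED":
        return [f for f, col in mapping.items() if col is None]
    return list(mapping)  # ALL

mapping = automap(["Id", "name"], ["Id", "Name", "Industry"])
# "Name" matches column "name" case-insensitively; "Industry" stays unmapped.
print(f"{len(filter_fields(mapping, 'MAPPED'))} of {len(mapping)} fields mapped")
```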
  16. Make sure you have mapped all the required fields. Otherwise, you can't move forward. Click NEXT.
  17. On the Process Summary screen, you can:
    1. Give the process/job a Name.
    2. Select the Category. Categories are used to classify and group processes with similar functionality. You can either select an existing category or create a new one by clicking the + icon.
    3. View the main Object.
    4. View the operation Type (Delete).
    5. View the number of impacted Records.
    6. Use Bulk API.
      About Bulk API
      The Bulk API is based on REST principles and is optimized for inserting, updating, and deleting large sets of data. You can use the Bulk API to process jobs either in serial or parallel mode. Processing batches serially means running them one after another, while processing batches in parallel means running multiple batches simultaneously. When you run a Bulk API job, processing more batches in parallel means giving that job a higher degree of parallelism, providing better data throughput overall.
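The serial-versus-parallel distinction above can be illustrated with a small sketch (the `delete_batch` function is a hypothetical stand-in for submitting one batch to the Bulk API, not a real ARM or Salesforce call):

```python
from concurrent.futures import ThreadPoolExecutor

def delete_batch(batch):
    # Stand-in for submitting one batch of record IDs to the Bulk API.
    # Here it just reports how many records the batch contained.
    return len(batch)

batches = [["id1", "id2"], ["id3"], ["id4", "id5", "id6"]]

# Serial mode: run one batch after another.
serial_counts = [delete_batch(b) for b in batches]

# Parallel mode: several batches in flight at once, which is what gives
# a Bulk API job its higher overall throughput.
with ThreadPoolExecutor(max_workers=3) as pool:
    parallel_counts = list(pool.map(delete_batch, batches))

# Both modes process the same work; only the degree of parallelism differs.
assert serial_counts == parallel_counts == [2, 1, 3]
```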
  18. You can schedule your tasks so they start running regularly. You can choose between Daily, Weekly, or On-demand schedules.
  19. Finally, click on SAVE to save your task and run it later.
  20. Your task is listed at the top of the list on the Dataloader Summary screen.
  21. Click Run to run the dataloader immediately, without waiting for the scheduled time.
  22. Select the configurations here:
    • Use Batch Size.
      About Batch Size
      Whenever the Bulk API checkbox is left unchecked, the Batch API is used.
      Salesforce Batch API is based on SOAP principles and is optimized for real-time client applications that update small numbers of records at a time. Although the SOAP-based API can also process large record sets, it becomes impractical when data sets reach hundreds of thousands of records; in those cases, Bulk API is the better option. Batch API processes data in smaller batches than Bulk API, resulting in higher API call usage per operation on large volumes of data.
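Conceptually, the batch size controls how many records go into each API call: smaller batches mean more calls for the same data set. A minimal sketch (the batch size of 2 and the record IDs are illustrative; check your ARM configuration for the actual value):

```python
def chunk(records, batch_size):
    """Split the record list into batches of at most batch_size records,
    one batch per API call."""
    return [records[i:i + batch_size]
            for i in range(0, len(records), batch_size)]

ids = ["id0", "id1", "id2", "id3", "id4"]
batches = chunk(ids, 2)
# 5 records with a batch size of 2 -> 3 API calls.
print(len(batches))
```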
    • Use UTF-8 file encoding for file read and write operations.
  23. Click RUN.
  24. The numbers of successfully deleted and failed records can be seen in the Results of Last Run section. The values in this field are updated dynamically while the job is still running. You can view the records or download them to your local system; they are generated in CSV format.
  25. The number of impacted records can be seen in the Records section. The value in this field is updated dynamically while the job is still running.

More Options

  1. Edit: Modifies or updates the process details. 
  2. Abort: Aborts the process while it is still running.
  3. Schedule: Sets the schedule at which the process must run.
  4. Delete: Deletes the delete process.
  5. Log: Provides information about the execution of the delete task.
  6. VR/WFR: ARM lists all the validation/workflow rules that were set. The UI lists all the validation rules, and users have to enable any disabled validation rules (if required). For more information, refer to the article Validation/Workflow Rules. Sample VR/WFR files are attached: validation rules, workflow rules.
  7. Clone: Creates a copy (clone) of the delete process. The operation type and object name are displayed. Enter the Process Name in the field. The default Salesforce Org is automatically selected; to choose a different org, use the dropdown list. To upload a different CSV file, select the Choose Different Data CSV File check box. Finally, click Clone.
