How to automate the download of weekly export service files

Last time I checked, there was no way to access the backup file status (or actual files) over the API. I suspect they have made this process difficult to automate by design.

I use the Salesforce scheduler to prepare the files weekly; a scheduled task on a local server then downloads them. Assuming you have the ability to automate/script some web requests, here are the steps you can use to download the files (a sketch in Python follows the list):

  1. Get an active Salesforce session ID/token
    • Enterprise API: login() SOAP method
  2. Get your organization ID ("org ID")
    • Setup > Company Profile > Company Information OR
      • use the Enterprise API getUserInfo() SOAP call to retrieve your org ID (the login() response also returns it in userInfo)
  3. Send an HTTP GET request to https://{your sf.com instance}.salesforce.com/ui/setup/export/DataExportPage/d?setupid=DataManagementExport
    • Set the request cookie as follows:
      • oid={your org ID}; sid={your session ID};
  4. Parse the resulting HTML for instances of <a href="/servlet/servlet.OrgExport?fileName=
    • (The filename begins after fileName=)
  5. Plug the file names into this URL to download (and save):
    • https://{your sf.com instance}.salesforce.com/servlet/servlet.OrgExport?fileName={filename}
    • Use the same cookie as in step 3 when downloading the files
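Putting the steps together, here is a minimal sketch in Python (chosen for illustration; any language that can send HTTP requests works). It assumes Python 3.8+ and the third-party requests library; the API version in the login URL, the SOAPAction value, and the scraping regex are my assumptions, so treat it as a starting point rather than a finished tool:

    # Sketch: log in over SOAP, scrape the export page, download the files.
    # Assumes Python 3.8+ (for the {*} wildcard in ElementTree) and requests.
    import re
    import xml.etree.ElementTree as ET
    from urllib.parse import urlparse

    import requests

    USERNAME = "user@example.com"         # hypothetical credentials
    PASSWORD = "password+security-token"  # append your security token if your org
                                          # requires it; XML-escape special characters

    # Steps 1 and 2: the Enterprise API login() call returns the session ID,
    # your instance's server URL, and userInfo (which includes the org ID).
    LOGIN_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
    <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                      xmlns:urn="urn:enterprise.soap.sforce.com">
      <soapenv:Body>
        <urn:login>
          <urn:username>{username}</urn:username>
          <urn:password>{password}</urn:password>
        </urn:login>
      </soapenv:Body>
    </soapenv:Envelope>"""

    resp = requests.post(
        "https://login.salesforce.com/services/Soap/c/57.0",  # API version is an assumption
        data=LOGIN_ENVELOPE.format(username=USERNAME, password=PASSWORD),
        headers={"Content-Type": "text/xml; charset=UTF-8", "SOAPAction": "login"},
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.text)
    session_id = root.find(".//{*}sessionId").text
    org_id = root.find(".//{*}organizationId").text
    instance = urlparse(root.find(".//{*}serverUrl").text).netloc  # e.g. "na1.salesforce.com"

    cookies = {"oid": org_id, "sid": session_id}

    # Steps 3 and 4: fetch the Data Export page and scrape the file names.
    page = requests.get(
        f"https://{instance}/ui/setup/export/DataExportPage/d",
        params={"setupid": "DataManagementExport"},
        cookies=cookies,
    )
    page.raise_for_status()
    names = re.findall(r'href="/servlet/servlet\.OrgExport\?fileName=([^"&]+)', page.text)

    # Step 5: download each file with the same cookies, streaming to disk.
    for name in dict.fromkeys(names):  # de-duplicate, preserving order
        with requests.get(
            f"https://{instance}/servlet/servlet.OrgExport",
            params={"fileName": name},
            cookies=cookies,
            stream=True,
        ) as download:
            download.raise_for_status()
            with open(name, "wb") as fh:
                for chunk in download.iter_content(chunk_size=1 << 20):
                    fh.write(chunk)
        print(f"saved {name}")

If the login fails (bad credentials, missing security token), Salesforce returns the SOAP fault with an HTTP 500 status, so raise_for_status() will surface it.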

This is by no means a best practice, but it gets the job done. It should go without saying that if Salesforce changes the layout of the page in question, this probably won't work anymore.
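For the scheduled-task half of this setup, a weekly cron entry is enough on a Linux server; the script path and schedule below are hypothetical:

    # run the downloader every Monday at 03:00, keeping a log
    0 3 * * 1 /usr/bin/python3 /opt/sf-backup/download_exports.py >> /var/log/sf-backup.log 2>&1

Schedule it to fire after your weekly Salesforce export has run, so the files are ready when the job starts. Hope this helps.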


A script to download the Salesforce backup files is available at https://github.com/carojkov/salesforce-export-downloader/

It's written in Ruby and can run on any platform. The supplied configuration file provides fields for your username, password, and download location.

With a little configuration, you can get your downloads going. The script sends email notifications on completion or failure.

It's simple enough to figure out the sequence of steps needed to write your own program if the Ruby solution does not work for you.


I'm Naomi, CMO and co-founder of cloudHQ, so I feel like this is a question I should probably answer. :-)

cloudHQ is a SaaS service that syncs cloud services with one another. In your case, you'd never need to run a manual data export from Salesforce; your reports would always be backed up to a folder labeled "Salesforce Reports" in whichever service you sync Salesforce with: Dropbox, Google Drive, Box, Egnyte, SharePoint, etc.

The service is not free, but there's a free 15-day trial. To date, there's no other service that syncs your Salesforce reports with other cloud storage providers in real time.

Here's where you can try it out: https://cloudhq.net/salesforce

I hope this helps you!

Cheers, Naomi
