Scheduling Automated Data Exports

Automate recurring FieldKo data exports via Data Loader CLI or scheduled Salesforce reports

Rather than manually exporting data, you can schedule regular exports of FieldKo records to run automatically. Salesforce offers a few approaches using standard tools:

  • The Data Loader in command-line (CLI) mode for automated exports.

  • Scheduled Reports (with email) for smaller-scale exports of report data.

  • The built-in Data Export Service (weekly/monthly exports via Setup) – typically used for full org backups (we’ll touch on this as well).

This article will focus on using Data Loader’s CLI and scheduled reports, with tips on automation and file storage.

Using Data Loader CLI for Scheduled Exports

Data Loader CLI: Salesforce’s Data Loader (available for Windows and macOS) has a command-line interface that lets you run exports via scripts. This is perfect for scheduling nightly or weekly jobs. The Data Loader CLI supports all data operations (extract, insert, update, delete) and can handle very large volumes (up to millions of records, especially when using Bulk API 2.0 under the hood).

Setup Steps:

  1. Install Data Loader: Ensure you have the Data Loader installed. During installation, take note of the install directory (it contains a /bin folder with the CLI tools).

  2. Prepare Config Files: Data Loader uses configuration files (often process-conf.xml and config.properties) to define the export job details. You can use the sample files provided in the Data Loader installation (samples/conf/ directory) as a starting point. Key things to configure:

    • SOQL query: Define the SOQL query to extract FieldKo records. For example, to export all Visit__c records from the past month, you might use a query in the config like SELECT Id, Name, Date__c, ... FROM Visit__c WHERE Date__c = LAST_MONTH.

    • Target file name: Specify where to output the CSV (e.g., C:\Exports\FieldKo_Visits.csv). This is done via the dataAccess.name property in the process configuration.

    • Login credentials: You’ll include the Salesforce username and password (plus security token) in config.properties. Important: for security, don’t store the raw password; use Data Loader’s encryption utility instead. Data Loader comes with an encrypt.bat file; run it to generate an encryption key and an encrypted password value, put the encrypted value in config.properties (as the sfdc.password property), and store the generated key file safely.

    Salesforce requires an encrypted password for CLI use; a sketch of the commands follows these bullets. Also include your username (sfdc.username) and the login URL for production or sandbox (sfdc.endpoint).

    • Batch settings: Optionally, configure the batch size or enable the Bulk API. Data Loader uses the SOAP-based API by default; set sfdc.useBulkApi=true in the config to route large extracts through the Bulk API.
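
    To make the credential setup concrete, here’s a minimal sketch of the encryption commands. The flags shown match recent Data Loader releases (check encrypt.bat’s usage output for your version), and the paths are examples:

    rem Run from the Data Loader bin directory.
    cd C:\Users\YourUser\dataloader\bin

    rem 1. Generate an encryption key file (restrict access to the account running the task).
    encrypt.bat -k C:\Users\YourUser\dataloader\conf\dataloader.key

    rem 2. Encrypt the password (append your security token if your org requires one).
    encrypt.bat -e MyPasswordMySecurityToken C:\Users\YourUser\dataloader\conf\dataloader.key

    Paste the encrypted output into config.properties as sfdc.password, and point the process.encryptionKeyFile property at the key file.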

  3. Define the Process in process-conf.xml: In process-conf.xml, define a “process” bean for your export. For example:

    <bean id="exportFieldKoVisits"
          class="com.salesforce.dataloader.process.ProcessRunner"
          scope="prototype">
      <description>Export FieldKo Visit records</description>
      <property name="configOverrideMap">
        <map>
          <entry key="process.operation" value="extract"/>
          <entry key="sfdc.entity" value="Visit__c"/>
          <entry key="sfdc.extractionSOQL" value="SELECT Id, Name, Date__c, ... FROM Visit__c WHERE Date__c = LAST_MONTH"/>
          <entry key="dataAccess.type" value="csvWrite"/>
          <entry key="dataAccess.name" value="C:/Exports/FieldKo_Visits_LastMonth.csv"/>
          <!-- other settings such as sfdc.useBulkApi go here -->
        </map>
      </property>
    </bean>

    The bean’s id (here "exportFieldKoVisits") will be used when running the CLI to refer to this job.

  4. Test the Export Command: Open a command prompt and navigate to the Data Loader \bin directory (e.g., C:\Users\YourUser\dataloader\bin). Run a test:

    process.bat "..\config\YOUR_CONFIG_FOLDER" exportFieldKoVisits

    The first parameter is the path to your config directory (where your process-conf.xml is), and the second is the process bean id. If the bean id is omitted, Data Loader falls back to the default process name defined in the config, but it’s clearer to specify it.

    When you run this, it should connect to Salesforce and perform the SOQL query, saving the results to the CSV you specified. Check the output and logs for any errors. Common issues could be login errors (check encryption and credentials) or SOQL errors.

  5. Schedule the Export: Once the command works, schedule it to run automatically. In Windows, use Task Scheduler:

    • Open Task Scheduler > Create Basic Task.

    • Set a trigger (e.g., “Daily at 1:00 AM” or “Weekly on Fridays at 11:00 PM”).

    • Action: Start a Program. Point it to the process.bat in the Data Loader bin. For arguments, you can provide the config directory and process id. For example:
      Program: C:\Users\YourUser\dataloader\bin\process.bat
      Arguments: C:\Users\YourUser\dataloader\config exportFieldKoVisits

    • Make sure the task runs under a user account that has permissions and that the machine is on at that time. You might need to check “Run whether user is logged in or not” for server scenarios.

    • Test the scheduled task by running it manually to ensure it calls the script properly. (If you’d rather script the task creation, see the schtasks sketch below.)
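
If you prefer scripting the task creation over clicking through the GUI, the Windows schtasks command can register the same job. A minimal sketch, assuming the example paths and bean id used above:

    rem Registers a weekly task (Fridays, 11:00 PM) that runs the export.
    rem Adjust /RU and /RP (run-as credentials) for unattended server runs.
    schtasks /Create /TN "FieldKo Weekly Export" /SC WEEKLY /D FRI /ST 23:00 ^
      /TR "\"C:\Users\YourUser\dataloader\bin\process.bat\" \"C:\Users\YourUser\dataloader\config\" exportFieldKoVisits"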

Automation strategies: With the setup above, you can automate exports of FieldKo data on a schedule. If you have multiple objects to export, define a bean per object and either create a separate scheduled task for each, or use one task that runs a wrapper batch file calling process.bat for each bean sequentially, producing one CSV per object (a sketch follows). Choose whichever fits your needs.
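
For the multi-object case, a small wrapper batch file keeps everything in one scheduled task. A sketch, with one hypothetical extra bean id (exportFieldKoSurveys); use the bean ids from your own process-conf.xml:

    @echo off
    rem Runs each export bean in sequence from a single scheduled task.
    set "DL_BIN=C:\Users\YourUser\dataloader\bin"
    set "DL_CONF=C:\Users\YourUser\dataloader\config"

    call "%DL_BIN%\process.bat" "%DL_CONF%" exportFieldKoVisits
    call "%DL_BIN%\process.bat" "%DL_CONF%" exportFieldKoSurveys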

File storage tips: Decide where the export files will live. We recommend:

  • Include date in filename: Each run should ideally produce a file stamped with the date, to avoid overwriting previous backups. One approach is to run the export via a custom batch script that appends the date to the filename, for example using the %DATE% environment variable (or by passing an override like dataAccess.name=C:\Exports\Visits_%DATE%.csv to the Data Loader CLI). Note that %DATE%’s format depends on the system locale; the sketch after this list sidesteps that by renaming the file after the run.

  • Archive old files: Keep an organized archive. For instance, have a folder per object or per year/month. Periodically clean up files older than your retention policy (e.g., keep last 6 months of daily exports).

  • Secure storage: These CSV files contain potentially sensitive business data. Ensure the storage location is secure (if on a server, limit access or move them to a secure cloud storage). Consider compressing or encrypting the files if needed.

  • Verify on occasion: It’s good practice to occasionally open a recent export file to verify it’s capturing data as expected (no unexpected empty files due to a login failure, etc.).
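
Here’s a minimal wrapper sketch for the date-in-filename tip. Rather than relying on the locale-dependent %DATE% format, it derives a YYYYMMDD stamp from wmic (present on most Windows versions, though deprecated on the newest) and renames the CSV written by the bean defined earlier:

    @echo off
    rem Run the export, then stamp the output file with today's date.
    set "STAMP="
    for /f "skip=1 tokens=1" %%d in ('wmic os get localdatetime') do if not defined STAMP set "STAMP=%%d"
    set "STAMP=%STAMP:~0,8%"

    call C:\Users\YourUser\dataloader\bin\process.bat "C:\Users\YourUser\dataloader\config" exportFieldKoVisits
    move /Y "C:\Exports\FieldKo_Visits_LastMonth.csv" "C:\Exports\FieldKo_Visits_%STAMP%.csv"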

Scheduling Salesforce Reports (Subscribed Reports)

For smaller data exports or summary data, Salesforce’s report scheduling (subscriptions) can be useful. Scheduled Reports allow you to have a report run on a schedule and email the results to chosen recipients. This is handy if, for example, you want a weekly email of “Survey Responses collected this week” or “Tasks completed this month.”

How to schedule a report:

  1. Create or identify a report that contains the data you want (e.g., a custom Report Type on the FieldKo Visit object).

  2. In Lightning Experience, open the report and click Subscribe (or in Classic, “Schedule Future Runs”).

  3. Set the schedule (daily, weekly, monthly) and time. You can specify certain days of week, etc.

  4. Choose recipients – yourself or other users, roles, etc., who should get the emailed report.

  5. (Optional) In Lightning, you can set conditions so the email is sent only when certain criteria are met – but for a regular export, you likely want it to send every time.

  6. Save the subscription.

Once activated, the report will run at the scheduled times and send an email. The email contains the report summary and a link to Salesforce. Tip: If you want the actual data attached, use the option “Attach results as Excel/CSV”. Salesforce allows attaching the report as a .csv file, but note that attached report data in subscriptions is limited to 15,000 rows and 30 columns. Also, scheduled report emails (especially the older HTML-style emails in Classic) have a 2,000-row limit for the inline results, so if your report is larger, the email will only show the first 2,000 rows (though the .csv attachment can include up to 15,000 rows, as noted). This method is therefore best for moderate amounts of data.

Use cases for scheduled reports: If FieldKo’s data is relatively modest or you only need a subset (like a weekly summary of new accounts or a list of surveys completed), a scheduled report is quick to set up. It’s also great for non-admin users who just want a report delivered to their inbox. However, for huge record exports (tens of thousands of rows), this isn’t viable due to limits.

Storage and automation: The scheduled report will send to your email; from there, if you need to automate further (like drop the attachment to a drive), you’d need additional scripting outside Salesforce (e.g., using an email processing script). Generally, if you need fully automated file drops, the Data Loader CLI method is more direct. Scheduled reports are more for convenience and human consumption.

Additional Tips

  • Salesforce Data Export Service: In Setup, you can also use Data Export to schedule a full org export (including all objects, attachments, etc.) weekly or monthly. This is an out-of-the-box backup option. For instance, in Enterprise Edition you can schedule a weekly export; Salesforce then generates a set of ZIP files containing CSVs of all your data and emails you when they’re ready to download. This is a good backup mechanism, though less flexible on timing (weekly is the most frequent schedule). We cover backup strategies more in a later article.

  • Use a Dedicated Integration User: If scheduling Data Loader jobs, consider using a dedicated API user account for running these exports. That way, the credentials and permissions are controlled and not tied to an admin’s personal account (which might get locked or have password changes). Ensure the user has API Enabled and appropriate object permissions.

  • Monitor the jobs: For Data Loader CLI jobs, regularly check the log files, or set up notifications on failure (a minimal failure-alert sketch follows this list). You can have the script email the log, or schedule a second task that verifies file timestamps. For report subscriptions, Salesforce notifies the running user via email if a scheduled report fails to run.
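
As a simple failure alert for the CLI jobs, a wrapper can check the exit code and write a Windows event that Task Scheduler or a monitoring tool can react to. A sketch, assuming a recent Data Loader version (older releases didn’t always return a nonzero exit code on failure) and a hypothetical event source name:

    @echo off
    rem Run the export; on failure, log an error event for alerting.
    call C:\Users\YourUser\dataloader\bin\process.bat "C:\Users\YourUser\dataloader\config" exportFieldKoVisits
    if errorlevel 1 (
      eventcreate /T ERROR /ID 100 /L APPLICATION /SO FieldKoExport /D "FieldKo Data Loader export failed - check the Data Loader logs."
    )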

By setting up automated exports, you ensure your FieldKo data is regularly backed up or available outside Salesforce when needed, without manual effort each time. This is especially useful for feeding data to external systems or simply safeguarding your data on a schedule.