Overview of HCM Data Loader (HDL)

Introduction to HCM Data Loader

HCM Data Loader (HDL) supports a flexible, pipe-delimited file format that allows you to provide just the business objects, components, and component attributes required for your use case. You can load full sets of data or just incremental changes. To achieve this flexibility, each file submitted for loading must define the business object attributes it provides.

Oracle HCM Cloud business objects can be complex and are usually hierarchical, allowing multiple child records to exist for a business object, e.g. multiple phone numbers for a person, or multiple columns and rows for a user-defined table. Each delimited file contains the data for a single business object hierarchy. The file is named after the business object and has a .dat file extension. For example, Worker.dat contains data for workers, Job.dat contains data for jobs, and ElementEntry.dat contains data for element entries.

When you supply data to be processed by HDL, you must uniquely identify each record in the file. For new records, two mechanisms are supported:

User Key – A combination of user-friendly attributes, visible in the user interface, that uniquely identifies the record. For example, the JobCode and SetCode for a job, or the Person Number for a worker.

Source Key – A combination of two attributes, SourceSystemId and SourceSystemOwner, that uniquely identifies the record. The SourceSystemId can be any value, but is often the identifier from the source system, or a value generated by an algorithm. The SourceSystemOwner keeps the source key unique when multiple source systems exist.
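As an illustration, a grade identified by a source key rather than a user key might look like the fragment below. The SourceSystemOwner and SourceSystemId values, and the exact attribute list, are assumptions for this sketch; check the Grade business object reference for the attributes supported in your release.

```
METADATA|Grade|SourceSystemOwner|SourceSystemId|EffectiveStartDate|ActiveStatus|GradeName
MERGE|Grade|LEGACY_HR|GRADE_1001|2000/01/01|A|Individual Contributor 1
```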

What Do You Need?

  1. Access to import and load data using HCM Data Loader, granted through the appropriate functional security privileges.
  2. A text editor, in order to create your files.
  3. A file compressor, in order to zip your files.

Creating Your First File

In this step, you will create a simple file to load new grades. User keys are used to uniquely identify each grade.

Using any text editor, create a new file and enter the following line:

METADATA|Grade|GradeCode|SetCode|GradeName|EffectiveStartDate|ActiveStatus
All files must include METADATA lines to explain which attributes are included in the file and the order in which their values are supplied.

All attribute names and values are delimited by the pipe ‘|’ character by default. The string immediately after the METADATA instruction identifies the record type the attributes are for, in this case ‘Grade’. The values that follow are the names of the attributes on the Grade record that you want to supply data for.

Note: Ensure your text editor is using UTF-8 encoding.

On the next line enter the following:

MERGE|Grade|IC1|COMMON|Individual Contributor 1|2000/01/01|A

The MERGE instruction tells HDL to create the grade if it doesn’t already exist, or update it if it does. Again, the value immediately after the MERGE instruction identifies the record type the attributes are for. The values that follow are the values for the attributes named in the corresponding METADATA line.

Add these MERGE lines to your file:

MERGE|Grade|IC2|COMMON|Individual Contributor 2|2000/01/01|A

MERGE|Grade|IC3|COMMON|Individual Contributor 3|2000/01/01|A

MERGE|Grade|M1|COMMON|Manager 1|2000/01/01|A

MERGE|Grade|M2|COMMON|Manager 2|2000/01/01|A

Each record needs to be uniquely identified. For grade records, the user key is the combination of GradeCode and SetCode, for example IC2 and COMMON.
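The pairing of METADATA attribute names with MERGE values can be sketched in a few lines of Python. This is illustrative only — HDL performs this parsing itself when it imports the file — and the attribute names follow the example above:

```python
# Sketch: pair each MERGE line's values with the attribute names
# declared on the corresponding METADATA line.
dat = """METADATA|Grade|GradeCode|SetCode|GradeName|EffectiveStartDate|ActiveStatus
MERGE|Grade|IC1|COMMON|Individual Contributor 1|2000/01/01|A
MERGE|Grade|M1|COMMON|Manager 1|2000/01/01|A"""

metadata = {}   # record type -> list of attribute names
records = []    # one dict per MERGE line

for line in dat.splitlines():
    instruction, record_type, *values = line.split("|")
    if instruction == "METADATA":
        metadata[record_type] = values
    elif instruction == "MERGE":
        # Attribute order comes from the METADATA line for this record type.
        records.append(dict(zip(metadata[record_type], values)))

print(records[0]["GradeCode"])   # IC1
print(records[1]["GradeName"])   # Manager 1
```

The user key of each record is then simply the (GradeCode, SetCode) pair taken from the parsed attributes.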

Save your file, naming it Grade.dat.

Compress (zip) Grade.dat into a file with a name of your choice and a .zip file extension. You have created your first HCM Data Loader file for bulk loading grades. Follow the next step to import it into the HDL staging tables and load the data into the application tables.
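The save-and-zip step can also be scripted. A minimal sketch using the Python standard library, with the file contents abbreviated and the zip name (GradeLoad.zip) chosen arbitrarily:

```python
import zipfile

# Write Grade.dat in UTF-8, per the encoding note above (contents abbreviated).
lines = [
    "METADATA|Grade|GradeCode|SetCode|GradeName|EffectiveStartDate|ActiveStatus",
    "MERGE|Grade|IC1|COMMON|Individual Contributor 1|2000/01/01|A",
]
with open("Grade.dat", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))

# The zip file name is your choice, but the .dat name inside it must
# match the business object (Grade.dat for grades).
with zipfile.ZipFile("GradeLoad.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("Grade.dat")

print(zipfile.ZipFile("GradeLoad.zip").namelist())  # ['Grade.dat']
```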

Importing and Loading Your File

1. In the application, on the home page, click My Client Groups > Data Exchange.

2. On the Data Exchange page, click Import and Load Data.

3. Click Import File on the page header.


4. Drag and drop your .zip file from your file explorer onto the Choose File button. Alternatively, click the Choose File button to browse for and select your file. Click Submit.

5. The process parameters are displayed. Click Submit.

Tip: You don’t need to change the parameter values.

6. Click OK on the Submitted confirmation page.

7. You’re returned to the Import and Load Data page. Click Refresh to see your data set.


Tip: Data sets have the same name as your zip file.

The Import Status indicates whether the business object .dat files in your zip file imported into the staging tables correctly.

The Load Status indicates whether the data loaded successfully into the Oracle HCM Cloud application tables. A clock icon indicates that the load is still in progress.

There are various counts: your file contained 5 data lines, so the Total Lines should be 5. In this simple file the 5 lines represented 5 grade objects, so the Total Objects should also be 5.
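For a single-component file like this one, both counts can be reproduced from the file itself; each MERGE line is one data line and one grade object. A rough sketch:

```python
# Sketch: count the data lines in the Grade.dat contents built earlier.
# In this single-record-type file, each MERGE line is also one Grade object.
dat_lines = """METADATA|Grade|GradeCode|SetCode|GradeName|EffectiveStartDate|ActiveStatus
MERGE|Grade|IC1|COMMON|Individual Contributor 1|2000/01/01|A
MERGE|Grade|IC2|COMMON|Individual Contributor 2|2000/01/01|A
MERGE|Grade|IC3|COMMON|Individual Contributor 3|2000/01/01|A
MERGE|Grade|M1|COMMON|Manager 1|2000/01/01|A
MERGE|Grade|M2|COMMON|Manager 2|2000/01/01|A""".splitlines()

total_lines = sum(1 for line in dat_lines if line.startswith("MERGE|"))
print(total_lines)  # 5
```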

Advantages of HCM Data Loader (HDL)

  1. Access to most HCM business objects.
  2. Support for partial and incremental data loading; you don't need to supply the full record.
  3. If your HCM instance is configured with flexfields or user-defined fields, you can load values into those fields.
  4. Comprehensive bulk-loading capabilities.
  5. User-managed loading, or automated loading using web services, which allows data loads to run without user initiation.
  6. Images and documents can be loaded with records so that the system can associate them.
  7. Bulk loading of HCM data from any source.
  8. Flexible, pipe-delimited file format.
  9. Stage table maintenance.
  10. Real-time tracking and monitoring.
  11. Ability to set the number of threads driving a process.
