How to load and clean up test data efficiently in Cycle.

Properly evaluating system behavior requires that known inputs produce expected outputs. This is especially critical when working with the JDA WMS, whose multitude of configuration options drives specific system behavior.

Equally important is being able to reuse the known inputs to repeat tests efficiently and accurately. Accomplishing this in transactional systems requires a mechanism to purge (clean up) test data.


Cycle includes steps to load and clean up test data using MOCA datasets. 

Once a MOCA connection is established in Cycle, Local Syntax and MOCA commands can be executed directly.
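For example, once connected, an ad-hoc Local Syntax query could be run from a Feature. The step wording and query below are illustrative assumptions, not exact Cycle syntax; ord is a standard JDA WMS table:

```gherkin
# Illustrative only: exact step text may differ by Cycle version
I execute MOCA
"""
[select ordnum, ordtyp
   from ord
  where wh_id = 'WMD1']
"""
```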

The most commonly used MOCA-backed steps load and clean up data. The Cycle steps that perform those actions are:


I load MOCA dataset "<DATASET_DIRECTORY_PATH>"
This step loads CSV files in the given directory into the current MOCA connection. The path to the dataset should be relative to the Resource Directory. Execution proceeds in this order:

1. Any cleanup*.msql files are run (ordered alphabetically).
2. Any load*.msql files are run.
3. Data from any CSV files in the dataset (ordered alphabetically) is inserted into the database table whose name matches the CSV file name. Only columns in the CSV file matching known table columns are inserted. Any failure during insertion will cause this Step to generate an error.
4. Any validate*.msql files found in the dataset are executed.
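To make the ordering concrete, a dataset directory might look like the following sketch (file names are examples; ord and ord_line are standard JDA WMS table names):

```text
wave-rule-dataset/
    cleanup01.msql     (runs first, purges prior test data)
    load01.msql        (runs next, e.g. creates test data via MOCA commands)
    ord.csv            (rows inserted into the ord table)
    ord_line.csv       (rows inserted into the ord_line table)
    validate01.msql    (runs last, confirms the data landed)
```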

I execute cleanup script for MOCA dataset "<DATASET_DIRECTORY_PATH>"

Looks for any cleanup*.msql files in the specified directory and runs them (if they exist). The Step will generate an error if a script fails.

Loading data customarily takes place prior to executing the business process. In Cycle, data loading occurs after establishing a MOCA connection and setting any MOCA environment variables.


Let’s use the example of testing Wave Planning and more specifically a new Wave Rule.

To test Wave Planning the system must contain a waveable order with lines. To effectively test the new rule, the order and lines must contain the evaluation criteria.

Loading Data

The first step is to build the MSQL file or files responsible for creating orders and order lines. More than likely the MSQL will contain the MOCA commands 'create order' and 'create order line' with the necessary arguments and error handling. Using the MOCA commands enables all inherent validations as well as any standard or custom triggers and wrappers. Although this manually adds an order into the system, all existing configurations and business rules are followed, ensuring valid data.
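A load script along these lines might look like the following sketch. The order number, warehouse, and argument lists are placeholder assumptions; a real script needs the full argument set your configuration requires, plus error handling:

```sql
/* load01.msql -- illustrative sketch, not a complete script */
create order
  where ordnum    = 'CYCTEST001'
    and wh_id     = 'WMD1'
    and client_id = '----'
|
create order line
  where ordnum    = @ordnum    /* piped from the create order result */
    and wh_id     = @wh_id
    and client_id = @client_id
    and ordlin    = '00001'
    and prtnum    = 'TESTPART'
    and ordqty    = 10
```

The MOCA pipe (|) passes the columns returned by the first command into the second, which is why @ordnum and @wh_id need no explicit values on the order line.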

Clean Up Data

The next step is to build the MSQL file or files responsible for cleaning up test data.

One important detail to remember when creating a cleanup file is that it is not enough to clean up only the data that was loaded, in this case the order and order lines. You must also account for the data created as a result of executing the test. This test will potentially add records to shipment, shipment_line, ordact, dlytrn and pckbat; cleanup for these tables is required as well.

Another important detail when building a cleanup MSQL file is that it must be constructed to always return a MOCA status of 0. This is necessary because cleanup*.msql is the first script run by the loading step, and because the Feature may fail at different points, meaning not all downstream tables will have been written to.

Not handling no-rows-found errors in the cleanup will cause a false negative for the entire test.
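A cleanup sketch illustrating this: each delete catches MOCA's no-rows-affected error (-1403) so the script still returns status 0 when a table was never written to. The table names follow the standard JDA WMS schema, but the order number is a placeholder and the remaining joins are assumptions you would adapt:

```sql
/* cleanup01.msql -- illustrative sketch */
[delete from ordact        where ordnum = 'CYCTEST001'] catch(-1403)
;
[delete from shipment_line where ordnum = 'CYCTEST001'] catch(-1403)
;
[delete from ord_line      where ordnum = 'CYCTEST001'] catch(-1403)
;
[delete from ord           where ordnum = 'CYCTEST001'] catch(-1403)
/* shipment, pckbat and dlytrn would be purged similarly,
   joining through their own keys (ship_id, schbat, ...) */
```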

Putting It All Together

Now that both the load and cleanup files are created the steps can be incorporated into the Feature file.

Depending on the intent of the Feature, the data load may occur in a Background or in the main Scenario, prior to the business logic executing.

It is Best Practice to execute the cleanup script in an After Scenario to ensure it always runs.

Below is an example Feature with all the pieces in place.
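A minimal sketch of such a Feature follows. The connection step and Scenario body are illustrative assumptions, and the dataset path is an example:

```gherkin
# Illustrative sketch: connection step text varies by environment and Cycle version
Feature: Wave Planning - new Wave Rule

Background:
  I connect to MOCA  # plus any MOCA environment variables
  I load MOCA dataset "datasets/wave-rule"

Scenario: New Wave Rule selects the test order
  # execute wave planning and validate the order was waved
  ...

After Scenario:
  I execute cleanup script for MOCA dataset "datasets/wave-rule"
```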



The example Feature connects to MOCA in the Background and then executes the dataset which populates the destination instance with the required order structure.

The Feature will then execute the main Scenario validating the business process for the wave rule.

Finally, when the main Scenario completes (pass or fail), the After Scenario runs and cleans up the data introduced in this execution.

Related Articles:

  • Using Inbound Integrator transactions
  • Parameterized MSQL
  • Backgrounds/After Scenarios
  • Using Cycle's Data Extract Tool
  • Can Cycle execute MOCA and Local Syntax?
  • How to use MOCA Examples as a Scenario Outline data source
  • How to configure Blue Yonder WMS to record Cycle's MOCA executions in the System Auditing table
  • Can I use Cycle's Data Store reporting even if I don't have a database available?