Extended (Phase 2)

Settings


(1) Trim SQL values in phase 4: See section "Phase 4".

(2) Continue with next record on SQL error: See section "Phase 4".

(3) Write data to disk temporarily, On memory use of: See section "Swapping".

(4) Skip record on error: See the section below with the same name.

(5) Empty value of a field is considered not existing: This checkbox applies to source structure fields. If the checkbox is not set, fields without a value will be filled with an empty string or the default value. If the checkbox is set, the Empty Flag will be set for fields without a value.

(6) Skip empty fields (no memory usage): This checkbox applies to target structure fields. If this checkbox is set, no internal data objects are generated for empty fields. This can save memory if empty fields in the target tree are not needed.
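The effect of options (5) and (6) can be illustrated with a minimal Python sketch (the field map and function name are hypothetical and not part of the product; this only mirrors the idea of not creating data objects for empty fields):

```python
def build_target_fields(fields, skip_empty=True):
    """Create target data entries from a field map.

    With skip_empty=True, no entry is created for a field whose
    value is empty, mirroring the memory-saving effect of the
    "Skip empty fields" checkbox. With skip_empty=False, empty
    fields are kept as empty strings.
    """
    if skip_empty:
        return {name: value for name, value in fields.items() if value != ""}
    return dict(fields)

# Empty "phone" field is dropped entirely instead of being stored:
print(build_target_fields({"name": "Smith", "phone": "", "city": "Berlin"}))
```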

(7) Ignore empty source fields (no memory usage): If this checkbox is set, only those source fields that are mapped or that are referenced as function parameters are kept in the main memory. This saves memory when only a few fields of a large input file are used. This option has no effect with XML parser version 4, and it cannot be used for document types EDIFACT in TradaComs format or BWA. If function copy field by name() is used in the profile, this checkbox must not be set; if it is set, it will be cleared when the profile is saved.


(8) Fill with "0" for FixRecord output: Indicates whether numeric data types ("Integer", "Float", "Double", "BigInteger", "BigDecimal") should be filled from the left with zeros (0), even if this formatting is not explicitly set in the fields. This setting is only relevant for the output format "Fix record".

(9) Use matchcode values for functions in mapping: If a node has several matchcodes and you want to know which one actually matched when the node was entered, set this checkbox. You can then use function get matched value() to read out the matched value. Attention: Setting this checkbox considerably increases the memory requirements for the source tree.

(10) Enter output channels for each record: See section "Execute Responses per record" below.

(11) Apply filler again after template operation: If this option is set, the fill settings of fields are reapplied after the template has been applied. This is only needed if you use content "Fix record" in a Response.

(12) Values can be Base64 encoded: Checks whether the input value is Base64-encoded and decodes it if it is. Note: This option only makes sense if the output format is not based on fixed lengths, since decoding changes the length of the data.
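The detect-and-decode behaviour can be sketched as follows (a simplified illustration, not the product's actual implementation; the function name is hypothetical):

```python
import base64
import binascii

def maybe_decode_base64(value: str) -> str:
    """Return the decoded text if value is valid Base64, else return it unchanged."""
    try:
        raw = base64.b64decode(value, validate=True)
        return raw.decode("utf-8")
    except (binascii.Error, UnicodeDecodeError):
        # Not valid Base64 (or not decodable text): keep the original value.
        return value

print(maybe_decode_base64("SGVsbG8="))    # decodes to "Hello"
print(maybe_decode_base64("plain text"))  # left unchanged
```

Note that such auto-detection is a heuristic: a plain value that happens to form valid Base64 could be decoded unintentionally, which is another reason to enable the checkbox deliberately.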

(13) Normal data mapper, Simple data mapper: The simple data mapper does not allow hierarchies below nodes. The normal data mapper allows complex hierarchies in the source and target structure. Note: See also section "Phase 3: Mapping (performance)".

(14) Create new record if only one of the possible field values changes: Concerns the source structure field attribute "New record if value changes". If this option is selected, the specified fields are ORed, i.e. a change in any one of them starts a new record. Otherwise they are ANDed, i.e. all field values must change.
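The difference between the two modes can be sketched as follows (hypothetical field names; the watched fields and comparison are illustrative only):

```python
def new_record_needed(prev, curr, watched, any_change=True):
    """Decide whether a new record starts when watched field values change.

    any_change=True  -> OR semantics: one changed field is enough.
    any_change=False -> AND semantics: all watched fields must change.
    """
    changed = [prev[f] != curr[f] for f in watched]
    return any(changed) if any_change else all(changed)

prev = {"customer": "A", "order": "1001"}
curr = {"customer": "A", "order": "1002"}  # only "order" changed
print(new_record_needed(prev, curr, ["customer", "order"], any_change=True))   # OR: new record
print(new_record_needed(prev, curr, ["customer", "order"], any_change=False))  # AND: no new record
```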

Skip record on error


Usually, a profile is terminated if an error (such as a violation of minimum/maximum constraints or a missing mandatory field) occurs during the mapping. Sometimes, however, you do not want the whole profile to be terminated, for example, when a large number of orders is submitted in a single file. In that case you want to skip only the order that caused the error and still process all the other orders.

The checkbox "Skip record on error" can be used to limit errors during the mapping process to the current record. This allows for the subsequent (error-free) orders to be processed.

Error handling can only be activated at the level of a record. As a consequence, the logical unit of a mapping (e.g. an order) must be represented by a record. If an error occurs in a record during the mapping, all subsequent phases behave as if the record did not exist.

Note: This is not a termination in the transactional sense. Variables that were changed before the error occurred in the record retain their changed values!

If an error occurs, an entry is made in the log files. In addition, it is possible to use variables to collect information about the erroneous/skipped datasets.

  • The system variable VAR_SYS_HAS_INVALID_RECORD has the value true if an error has occurred during processing of at least one dataset.

  • You can use system variable VAR_SYS_MESSAGE to store relevant information about an erroneous/skipped dataset.

  • The contents of variable VAR_SYS_MESSAGE are accumulated in the system variable VAR_SYS_COLLECTED_MSGS.


The variables can, for example, be used in a Response to send an email containing information about the erroneous/skipped datasets.
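Conceptually, the skip-and-collect behaviour resembles the following loop (a plain Python sketch; the variable names merely mirror the system variables, and the mapping function and records are hypothetical):

```python
def process_records(records, map_record):
    """Map each record; on error, skip that record and collect a message."""
    results = []
    collected_msgs = []          # plays the role of VAR_SYS_COLLECTED_MSGS
    has_invalid_record = False   # plays the role of VAR_SYS_HAS_INVALID_RECORD
    for i, record in enumerate(records, start=1):
        try:
            results.append(map_record(record))
        except ValueError as err:
            has_invalid_record = True
            collected_msgs.append(f"record {i}: {err}")  # like VAR_SYS_MESSAGE
    return results, has_invalid_record, collected_msgs

def map_record(rec):
    """Toy mapping: fails on a missing mandatory field."""
    if "qty" not in rec:
        raise ValueError("missing mandatory field 'qty'")
    return {"qty": int(rec["qty"])}

ok, invalid, msgs = process_records([{"qty": "3"}, {}, {"qty": "5"}], map_record)
print(ok, invalid, msgs)
```

The second record is skipped, the other two are processed, and the collected messages remain available afterwards, e.g. for an email Response.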

Execute Responses per record


Normally, the Responses in phase 6 are executed once per job.

By setting checkbox Enter output channels for each record in the extended settings of phase 2, this behaviour can be changed. The Responses will then be executed once per record.

Attention: By default, Responses whose content is set to "Output of IU" execute the Integration Unit only once per job, not once per record. Such Responses therefore receive the same data for every record. If you want the Integration Unit to be executed for each record, set checkbox "Incorporate IU".

See also section "When does the parser start a new record?"