Basic SQL (Input Agent Cron)

Introduction: You can find a description of this phase in section Phase 1 (Introduction).


The input data is read from a database using an SQL query.

When the Input Agent starts a job, it processes all data sets returned by the SQL query. The result set of the database query is stored as a backup file in CSV format. See also document type Database.


[Screenshot of the Input Agent settings. The numbered items below refer to the elements marked in this screenshot.]


(1) Alias for the connection to the database (selectable via the icon on the right). A database alias stands for a configured connection to a database system. See section Databases/connectors (Administration). In addition, all variables with the prefix MSG_CALL_ that are defined in the profile are available for selection. This makes it possible to pass the alias from a triggering profile or via an HTTP call. Note: For easy testability, these variables should be defined with a default value.

(2) Displays all tables and procedures for the selected alias (1).

(3) A simple wizard that can be used to generate the SELECT statement in (4).

(4) SQL query to read data from the database. Only a single SELECT statement can be specified. Comments can only be specified with the syntax /*comment*/, otherwise an error will occur. Queries can be saved, and saved queries can be selected via the context menu; a separate dialogue is available for managing them. You can use variables (syntax @MSG_CALL_MYVAR@), system constants (syntax %MYCONST%), parameters (see (8) for the syntax) and placeholders for the timestamp of the last profile run in the query.
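
A minimal sketch of such a query (the table, the columns, the variable MSG_CALL_COUNTRY and the system constant MIN_ORDER_DATE are invented for illustration):

  /* hypothetical example: read orders filtered by a profile variable and a system constant */
  SELECT order_id, customer_no, order_date, total_amount
  FROM orders
  WHERE country = '@MSG_CALL_COUNTRY@'
  AND order_date >= '%MIN_ORDER_DATE%'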

(5) Tests the SQL statement in (4). If the SQL statement uses * instead of explicit column names, the actual column names are read from the database and entered in (4).

(6) The columns of the SQL statement are created as fields below a new node in the source structure. Existing mapping instructions are retained. The data types and field lengths of the columns are taken from the database definition.

(7) If set, (8) is displayed.

(8) Only displayed if (7) is set. Here you can define parameters (via the context menu) that can be used in the WHERE clause of the query.

(9) If set, all data read from the table will be deleted. If isolation level Read Committed is used, data can be inserted by another program at the same time. Attention: This checkbox should only be used with extreme caution. If the number of deleted rows differs from the number of rows read, no job is started and an error is generated.

A WHERE condition can be used to optimise the deletion process. The WHERE condition from (4) is used with the same parameters. This is especially important if the data types do not match exactly.
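
A hedged illustration of this behaviour (table, columns and condition are invented): if the query in (4) is

  SELECT order_id, status FROM orders WHERE status = 'OPEN'

then the subsequent deletion is restricted by the same WHERE condition, conceptually equivalent to

  DELETE FROM orders WHERE status = 'OPEN'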

(10) If this checkbox is set, the content of BLOB data fields (binary data) will be encoded with Base64 before being passed to the profile. Note: Otherwise, binary data may cause an error when the source tree is created. Read-in fields can be decoded with the function decode Base64(a, b).

(11) If this option is selected, data will be supplied to the profile even if the SQL query returns no data. The specified replacement values are then transferred to the profile. The data must be in CSV format with a comma as the delimiter character.
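
A hedged sketch of such replacement values (the field values are invented); each line represents one data set in comma-delimited CSV form:

  0,NO_DATA,1970-01-01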

Blanks in Query Result (MSSQL/Informix)


In MSSQL and Informix databases, the result values of SQL calls are trimmed of leading and trailing spaces by default. The parameter skipTrimResultValues can be used in the database alias configuration file to change this behaviour.