HCL Workload Automation, Version 9.4

AWSBIO - Comm_api messages

This section lists error and warning comm_api messages that might be issued.

The message component code is BIO.

AWSBIO001E   The supplied parameter "parameter" is not valid.

AWSBIO002E   The supplied parameter "parameter" does not have a valid length.

AWSBIO003E   The supplied parameter "parameter" already exists.

AWSBIO004E   The supplied parameter "parameter" has a non-alphabetic first character.

AWSBIO005E   The supplied parameter "parameter" contains non-valid characters.

AWSBIO006E   Field "field" has a null value.

AWSBIO007E   An internal error has occurred. The MAE_CA comarea cannot be initialized.

AWSBIO008E   An internal error has occurred. The SEC_CA comarea cannot be initialized. The error was: !1.

AWSBIO009E   An internal error has occurred. The database could not be opened. The error was: !1.

AWSBIO010E   An internal error has occurred. The database files are not at the correct version.

AWSBIO011E   An internal error occurred allocating memory.

AWSBIO012E   The filter type "filter_type" is not valid.

AWSBIO013E   The specified HCL Workload Automation object is not valid.

AWSBIO014E   An incorrect time value "time" has been specified in the filter "filter".

AWSBIO015E   An incorrect time type "time_type" has been specified in the filter "filter".

AWSBIO016E   An incorrect priority value "priority" has been specified in a filter.

AWSBIO017E   An incorrect priority type "priority_type" has been specified in a filter.

AWSBIO018E   ID filter not found in filter array.

AWSBIO019E   File "file" could not be opened. The error was: "error".

AWSBIO020E   File "file" could not be closed. The error was: "error".

AWSBIO021E   File "file" could not be read. The error was: "error".

AWSBIO022E   File "file" could not be written. The error was: "error".

AWSBIO023E   TWS is not installed under group "Group"

AWSBIO024E   "Server" server is not installed.

AWSBIO025E   Error opening Symphony: "Symphony"

AWSBIO026E   Error opening Events files: "EventFile"

AWSBIO027E   The object "object" could not be found.

AWSBIO028E   An incorrect limit "limit" was specified in the filter.

AWSBIO029E   An incorrect limit type "limit_type" was specified in the filter.

AWSBIO030E   An incorrect status "status" was specified in the filter.

AWSBIO031E   An incorrect recovery option "recovery_option" was specified in the filter.

AWSBIO032E   An incorrect prompt status "prompt_status" was specified in the filter.

AWSBIO033E   An internal system error occurred.

AWSBIO034E   An incorrect limit "limit" was specified for the job stream "job_stream".

AWSBIO035E   Unable to initialize HCL Workload Automation Connector plan auditing.

AWSBIO036E   Unable to initialize HCL Workload Automation Connector database auditing.

AWSBIO037E   Workstation "workstation" does not support the task type "task_type".

AWSBIO038E   A method options file or jobdef keyword was not found for task type "task_type" on workstation "workstation".

AWSBIO039E   The parameter "parameter" could not be found in the local parameters file.

Explanation

The local parameters file is created on a fault-tolerant agent by the parms utility. However, it does not contain the indicated parameter.

System action

The operation is not performed.

Operator response

Check that the definition refers to the correct parameter. If it does, the parameter does not exist and must be created. Use the parms utility to create the parameter on the local workstation, then rerun the operation that needs to use the parameter.
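
For example, a sequence like the following creates and then verifies a local parameter. The parameter name and value are only illustrative, and the options shown are typical; check the parms utility reference for your version for the exact syntax:

  parms -c MYPARM "/opt/app/data"
  parms MYPARM

The first command creates or changes the parameter MYPARM in the local parameters file; the second displays its value to confirm that it now exists.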

AWSBIO040E   The supplied job stream alias "alias" cannot be used, because another job stream has been submitted in this plan instance with the same alias.

Explanation

When a job stream is submitted with an alias, the internal ID of the submitted job stream is set to the value of the alias. This allows follows dependencies to be created in which the aliased job stream instance is the predecessor. If you subsequently submit the same or a different job stream using the same alias, the software detects the duplication, does not submit the job stream, and issues this message.

The potential for duplication exists only within the same unextended plan. Whenever you extend the plan, the original job stream and its alias are assigned new internal IDs that make them unique, so the alias becomes available again for the same or another job stream.

The following is an example:
  1. You submit job streams JS-1 and JS-2. The plan now contains the following items, identified uniquely by their job stream internal IDs:
    • Job stream name=WS-1#JS-1 Job stream internal ID=058HNKHGRD8378
    • Job stream name=WS-1#JS-2 Job stream internal ID=058HNKYIJG8945
  2. You then submit job stream JS-1 with the alias myalias. The plan now contains the following items, identified uniquely by their job stream internal IDs:
    • Job stream name=WS-1#JS-1 Job stream internal ID=058HNKHGRD8378
    • Job stream name=WS-1#JS-2 Job stream internal ID=058HNKYIJG8945
    • Job stream name=WS-1#myalias Job stream internal ID=myalias
  3. You then try to submit job stream JS-2 with the alias myalias. The plan already contains a job stream with the internal ID myalias, so the job stream cannot be created, and this message is issued.
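
The same situation can be reproduced from the conman command line. The workstation, job stream, and alias names are only illustrative:

  sbs WS-1#JS-1
  sbs WS-1#JS-1;alias=myalias
  sbs WS-1#JS-2;alias=myalias

The first two submissions succeed. The third is rejected, because the alias myalias is already in use as a job stream internal ID in the current plan, and this message is issued.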

System action

The submit operation is not performed.

Operator response

Choose a different alias and resubmit the job stream. If there are particular reasons why you must use this alias, wait until after a plan extension before resubmitting the job stream.

AWSBIO041E   The job cannot be submitted because the job or job alias "job_name_or_alias" already exists.

Explanation

When a job is submitted, the program checks to see if a duplicate job or duplicate alias already exists in the Symphony file. The program follows these rules:
  • Two jobs without aliases cannot exist with the same name, irrespective of whether they were put in the plan by the planner or by a normal submission to the current plan.
  • Two aliases cannot exist with the same name, irrespective of whether they were supplied explicitly, or generated by the program.
  • The program generates an alias only if: a) you are performing an ad hoc submission, b) the job name already exists, and c) an alias has not been supplied. The program uses the docommand string of the job as the alias.

    This means that if the program has already created an alias for a job with a given docommand string, it cannot create an alias for any job that has the same docommand string. The job cannot be submitted because the alias is a duplicate.

The following is an example of the latter situation: the jobs Job-1 and Job-2 already exist in the plan in the same job stream for workstation WS-1. You submit Job-1 again as an ad hoc job, without an alias and with a docommand of runjob27.exe. The program generates the alias from the docommand, resulting in the following job in the plan: Job name=WS-1#0AAAAAAAAAAAAABG.RUNJOB27

You then submit Job-2 again as an ad hoc job without an alias. The program tries to generate the same job name, but cannot, because doing so would create a duplicate alias. This message is issued.

System action

The submit operation is not performed.

Operator response

Specify a different, explicit alias and resubmit the job.
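
For example, assuming the job was originally submitted from conman without an alias, resubmitting it with an explicit, unused alias avoids the duplicate. The workstation, job, and alias names are only illustrative:

  sbj WS-1#Job-2;alias=JOB2_RETRY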

AWSBIO044E   The conditional dependency expression specified for the predecessor "object" is not valid.

Explanation

See the message text.

System action

The operation is not performed.

Operator response

Determine the correct syntax for the conditional dependency expression and resubmit the operation using that syntax.
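
For example, a conditional follows dependency on a predecessor job generally has the form shown below, with one or more output or status conditions joined by the OR operator (|). The workstation, job stream, job, and condition names are only illustrative; see the scheduling language reference for the exact syntax supported by your version:

  follows WS-1#JS-1.JOB1 if STATUS_OK | RC_GT_4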