By this option you can specify datasets that can never be deleted, or from which members can never be deleted. If a user selects such a dataset in the DELETE process, he is informed that the deletion is not allowed.
You can use wildcards instead of the system id (SYSH_ID in the example). In the inner array you can specify datasets or masks with wildcards. If the real dataset that a user selects in the process matches an item in exclude_delete, the deletion is not allowed. If you specify ( and ), as in the first two lines of the example, then neither the dataset nor any of its members can be deleted. When the brackets are omitted, members of the dataset can be deleted, but not the dataset itself. The result of the example is as follows:
members can't be deleted from P391D.HOS.DATA.OUT
members can't be deleted from P391D.HOS.DATA.MIRROR1
P391D.HOS.DATA.OUT can't be deleted
P391D.HOS.DATA.MIRROR1 can't be deleted
P391D.HOS.DATA.MIRROR2 can't be deleted
P391D.HOS.DATA.MIRROR3 can't be deleted
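The original sample is not reproduced here; a hedged sketch of an exclude_delete setting consistent with these results (the exact array structure is an assumption) could look like:

```php
'exclude_delete' => array(
  'SYSH_ID' => array(                // system id, wildcards allowed
    '(P391D.HOS.DATA.OUT)',         // brackets: dataset and its members protected
    '(P391D.HOS.DATA.MIRROR1)',     // brackets: dataset and its members protected
    'P391D.HOS.DATA.MIRROR2',       // no brackets: only the dataset itself protected
    'P391D.HOS.DATA.MIRROR3',
  ),
),
```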
The value entered here is used when skeletons are substituted (when the ##CODEPAGE## variable is replaced). PRX code must know the codepage used on z/OS so that it is able to correctly convert special characters. The default value (if it is not explicitly set by the codepage option) is 273.
Example:
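A minimal sketch; the codepage value shown here is just an illustration:

```php
'codepage' => '1141',   // codepage used on z/OS; 273 is the default if omitted
```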
This is a very important option.
The process configuration defines a list of operations (tasks) executed in sequence. This sequence determines what the process does. But you must also set where and with what data the process works, which JCL skeletons are used, which datasets and systems are used, which JCL statements are supported, and so on. To set these kinds of options you use the environment. You can have a general part of the environment that is valid for all processes, and a specific part that is valid only for some process. The general part and the specific part are usually merged together, which produces the final set of environmental options.
The environment is sometimes very large, especially when you have many processes or several clients. In this case it is good practice to split the environment into several files, for example one per client, and include them like in the following example:
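The include list might look like this; hos_env_general.php is named in the text below, while the client file names are assumptions:

```php
include 'hos_env_general.php';    // generic part, valid for all clients
include 'hos_env_client_a.php';   // hypothetical client-specific part
include 'hos_env_client_b.php';   // hypothetical client-specific part
```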
The content of each included file looks like the sample below (only a part is displayed here, as the full setting is too large).
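Since the original sample is not reproduced here, the following is only a hedged sketch; the variable name and the exact nesting are assumptions based on the client/process/activity/environment-name comments explained below:

```php
$environment = array(
  '*' => array(                          // client name
    '*' => array(                        // process name
      '*' => array(                      // activity name
        'Test environment 1' => array(   // environment name
          'codepage' => '273',
          // templates, supported statements, ...
        ),
      ),
    ),
  ),
);
```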
This example is used for the generic part of the environment that is valid for all clients, processes, activities and environment names. This is because of the asterisks used for client, process, activity and name. This way you can predefine default values (typically templates, sometimes called skeletons, supported statements, ...). This general part should be placed at the top (therefore hos_env_general.php is the first included file in the sample). Some options can later be overwritten by other included files. The rule is that the last setting wins. Procman processes the complete environment definition from top to bottom. If some option is defined multiple times, the last definition is used.
For this reason always follow this simple rule: put the most general settings (with the most wildcards) at the top and the more restrictive ones (with named clients, processes, activities, ...) below.
Comments typed in green in the sample explain the meaning of the value on the line where they are coded.
You can type * (it means all clients) or a real client name on the line with '// client name' comment.
You can type * (it means all processes) or a real process name on the line with '// process name' comment.
You can type * (it means all activities) or a real activity name on the line with '// activity name' comment.
You can type * (it means all environment names) or a real environment name on the line with '// environment name' comment.
Of course, you can easily put the environment directly into hos_config2.php without splitting it into several files. Then the option looks like this (also not complete):
A sample with explicitly specified values (no asterisks are used):
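The original sample is not reproduced here; a hedged sketch using the explicit names that appear elsewhere in this chapter (the exact nesting is an assumption) could be:

```php
$environment = array(
  'HORIZONT' => array(                       // client name
    'JCL_CHANGE_PROD' => array(              // process name
      'JCL_CHANGE_PROD_APPROVE' => array(    // activity name
        'Test environment 1' => array(       // environment name
          // ...options...
        ),
      ),
    ),
  ),
);
```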
It is clear what client, process and activity are. But what is the environment? Every process contains the m_environment module at the beginning. This module scans your environment setting and finds all distinct values used as environment names. In the case of our samples only 'Test environment 1' is found, so we have only one environment name. When the module finds only one environment, it uses it automatically. If more environment names are found, the module offers a selection. Once the environment is selected, the whole process works with its settings. Therefore you can define more than one environment (each can contain system definitions, templates, ...) and allow the user to choose one at the beginning of the process. Although this idea can be useful in some cases, it is not usually configured in real installations. Most often you have only one environment name that is used throughout all your configuration files. Just make sure that once you have run some process, the environment name in your configuration is not changed any more.
In the following subsections all environmental options are described in detail.
Typically you have several user forms defined for DD statements in your Procman/HOS installation, and typically the PRX code recognizes which form to assign to every DD statement while the JCL analysis is running. After the analysis you can see all DD statements found in the analyzed job in a table and you can start to enter user input:
The content of the select boxes in the DD Type column is built from the jcl_dd_forms option. This option maps form names (names of the form XML without the ".XML" extension) to text ids used in a dictionary. As a result you see the correct texts in the select boxes and you can change the form type with the mouse.
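A minimal sketch of a jcl_dd_forms mapping; the form names and text ids below are assumptions, not standard names:

```php
'jcl_dd_forms' => array(
  'FORM_SYSIN' => 'txt_form_sysin',   // form XML name (without .XML) => dictionary text id
  'FORM_DUMMY' => 'txt_form_dummy',
),
```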
This option sets which log files are downloaded after the JCL analysis. You can deactivate some of the files if you don't need them; nevertheless this is not recommended, as every file contains useful information. Setting the value to true, like in the example, enables the log file and Procman/HOS downloads it.
JCKDTLO detailed listing of errors, warnings and notifications found by SmartJCL
JCKSUMO summary listing.
SYSPRINT SYSPRINT of SmartJCL.
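Since the original example is missing here, the following hedged sketch only shows the idea; the option key name is an assumption, while the file names come from the list above:

```php
'jck_listings' => array(   // hypothetical option name
  'JCKDTLO'  => true,      // detailed listing: downloaded
  'JCKSUMO'  => true,      // summary listing: downloaded
  'SYSPRINT' => false,     // SYSPRINT of SmartJCL: not downloaded
),
```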
Note that JOB, EXEC and DD are always enabled. You can disable only OUTPUT, SET and INCLUDE.
This option controls which JCL statements are supported by Procman/HOS for modification via user forms. You can assign forms in the PRX script (if you have their XML definitions ready) only for JCL statements that are enabled in this option. If you don't need forms for some of the statements (sometimes SET or INCLUDE is not needed), you should disable them by this option.
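Given the note above that JOB, EXEC and DD are always enabled, a hedged sketch (the option key name is an assumption) could be:

```php
'jcl_statements' => array(   // hypothetical option name
  'OUTPUT'  => true,         // forms for OUTPUT enabled
  'SET'     => false,        // forms for SET disabled
  'INCLUDE' => false,        // forms for INCLUDE disabled
),
```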
Use this option if you want the copy module to copy members also to additional libraries. Members written to mirrored datasets are not stored in the database. This option only ensures that a backup of the members copied to the target libraries is also copied to the mirrored libraries. If the real target system and dataset match the ones specified by the target_mirror option, then copying to the locations specified in the inner array is done. You can copy members to more than one library or system if you want. You can use wildcards both in the target system and dataset masks and also in the mirror system and dataset names, as is shown in the sample below.
In the case of the above example this mirroring occurs:
If the target dataset is P391D.HOS.DATA.SYS on SYSH_ID system then members are mirrored to P391D.HOS.DATA2.SYS on the same SYSH_ID system.
If the target dataset is P391D.HOS.DATA.OUT on SYSH_ID system then members are mirrored to:
P391D.HOS.DATA.MIRROR1, ...MIRROR2, ...MIRROR3 on SYSH_ID system
P391D.HOS.DATA.PRE.MIRROR1, ...MIRROR2 on SYSH2_ID system.
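The sample itself is not reproduced here; a hedged sketch consistent with the mirroring described above (the exact array structure is an assumption) might be:

```php
'target_mirror' => array(
  'SYSH_ID' => array(                        // target system mask
    'P391D.HOS.DATA.SYS' => array(           // target dataset mask
      'SYSH_ID' => array('P391D.HOS.DATA2.SYS'),
    ),
    'P391D.HOS.DATA.OUT' => array(
      'SYSH_ID'  => array('P391D.HOS.DATA.MIRROR1',
                          'P391D.HOS.DATA.MIRROR2',
                          'P391D.HOS.DATA.MIRROR3'),
      'SYSH2_ID' => array('P391D.HOS.DATA.PRE.MIRROR1',
                          'P391D.HOS.DATA.PRE.MIRROR2'),
    ),
  ),
),
```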
In the following examples the mapping takes effect when any qualifier(s) appear in place of the asterisk:
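A hedged sketch of such a wildcard mapping; the structure is an assumption, and the dataset masks follow the mapping results listed below:

```php
'DDD.PROCMAN.TARGET.*.STEUKA' => array(          // * matches any qualifier(s)
  'SYSH_ID' => array('DDD.PROCMAN.TARGET.*.STEUKA.MIRROR'),
),
```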
Or with their variables:
It will map:
DDD.PROCMAN.TARGET.AAA.STEUKA to DDD.PROCMAN.TARGET.AAA.STEUKA.MIRROR
DDD.PROCMAN.TARGET.BBB.STEUKA to DDD.PROCMAN.TARGET.BBB.STEUKA.MIRROR
DDD.PROCMAN.TARGET.AAA.BBB.STEUKA to DDD.PROCMAN.TARGET.AAA.BBB.STEUKA.MIRROR
The mirroring also works when files are deleted. In this case approved members are deleted from the target libraries as well as from all their mirrors.
template template name used as a base for the job that is generated and submitted (calls PGM=OSJIBAT).
tws_sys id of the system where the substitution runs.
app_id_pad letter used for padding the application id to 16 letters if app_id_prefix is not specified.
app_id_prefix prefix used for padding the application id to 16 letters. When it exists, app_id_pad is ignored.
wsid id of the workstation that appears in the substituted job.
iat input arrival time in HHMM format that appears in the substituted job.
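A hedged sketch of these options; all values below are illustrations, not defaults:

```php
'template'      => 'tws_substitute',   // hypothetical template name; the job calls PGM=OSJIBAT
'tws_sys'       => 'SYSH_ID',          // system where the substitution runs
'app_id_pad'    => 'X',                // padding letter for the application id
'app_id_prefix' => '',                 // if non-empty, app_id_pad is ignored
'wsid'          => 'CPU1',             // workstation id in the substituted job
'iat'           => '0800',             // input arrival time (HHMM)
```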
This option is needed for specifying parameters of tasks used in the process configuration. The process configuration consists of several tasks grouped into activities. Many of the tasks used there need to know some input data, for example system names, dataset names, and much more. The required options are described in detail in the online help of Procman/HOS, therefore we show only a sample demonstrating how to configure these options. The fact that the process definition and the task options are separated helps to separate the process logic from data like systems, datasets, the content of select boxes, ...
The example is just a small part of the environment definition. All preceding options of the environment are usually inserted with the process and activity specified with wildcards. But this is usually not the case for the task option, because task options must be valid for a concrete process definition, and the task names in the environment and in the process definition must match. In the example we use the t0400 task and we define several options in the inner array. These options will therefore be available in the t0400 task of the JCL_CHANGE_PROD_APPROVE activity of the JCL_CHANGE_PROD process in the HORIZONT client. In this case t0400 in the corresponding process configuration calls the m_jcl_approve module, which shows the approve web page. Systems and datasets coded in the environment are required by this module so that it is able to offer the correct systems and libraries in its select boxes.
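The example is not reproduced here; a hedged sketch of a task option (the inner option names and values are assumptions) might look like:

```php
'task' => array(
  't0400' => array(   // task name must match the process definition
    'target_system'  => array('SYSH_ID:System SYSH', 'SYSH2_ID:System SYSH2'),
    'target_library' => array('P391D.HOS.DATA.OUT', 'P391D.HOS.DATA.SYS'),
  ),
),
```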
There are a few methods how to define items in the task array. Each one produces a different result on the web page. All available methods are listed below.
In this case there is no visible field on the web page where the user could select or see the value. The value is always SYSH_ID; it can't be edited and it is hidden.
In this case an empty text field is displayed on the web page:
In this case a text field is rendered on the web page and its content is initialized with the value specified in the array.
In this case the value can't be changed (like when you define a constant), but it is visible on the screen. You make the field read-only by adding ! in front of the value. The result is:
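A hedged sketch; the option key is an illustration:

```php
't0400' => array(
  'target_library' => '!P391D.HOS.DATA.OUT',   // ! makes the field read-only but visible
),
```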
In this case a select box is rendered on the web page. When you specify only values, like in the case of the target library, then you see exactly the values from the array. You can also specify a value and a label separated by a colon. In this case you see the label, but the value is used when the item is selected. This is common when you specify systems, as their ID is important for Procman/HOS while the label is important for users.
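A hedged sketch; the key names and labels are illustrations:

```php
't0400' => array(
  'target_library' => array('P391D.HOS.DATA.OUT', 'P391D.HOS.DATA.SYS'),      // values only
  'target_system'  => array('SYSH_ID:System SYSH', 'SYSH2_ID:System SYSH2'),  // value:label pairs
),
```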
Enter ! in front of the value you wish to select by default:
In this case P391D.PETRH.TEMP2 is selected by default when the page is first displayed:
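A hedged sketch:

```php
't0400' => array(
  'target_library' => array('P391D.PETRH.TEMP1', '!P391D.PETRH.TEMP2'),  // ! preselects TEMP2
),
```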
Use it when the value is the same as a value of another field:
It is possible to render a field by any of the preceding methods conditionally. That means the rendering method depends on the value of another field:
In this case, when the SYSH_ID export system is selected, the export library shows an edit field initialized with your SYSUID followed by '.DATA.TEST'. If the SYSH2_ID system is selected, the export library changes to a select box with two items.
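The original sample is missing here; the following is only a rough sketch of the idea, and the condition-key syntax is purely an assumption:

```php
// condition keys below are hypothetical; only the behavior matches the text
'export_library' => array(
  'export_system=SYSH_ID'  => '%SYSUID..DATA.TEST',                          // edit field
  'export_system=SYSH2_ID' => array('P391D.DATA.TEST1', 'P391D.DATA.TEST2'), // select box
),
```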
and
You can also use remote dependencies in conditional fields. They are available only in some cases (when the referenced task stores data to JCKIN database table). This shows a reference in the same activity:
The target library is rendered as a read-only edit field initialized with P391D.PETRH.TEMP1 if the target library selected at task t0500 in the same activity is P391D.NEW.TEMP1. The second row is analogous and should be clear.
This sample shows a reference to another activity:
In this case the condition is based on target_library option of t0500 task in JCL_CHG_REQUEST activity.
You can use variables in any value used in task options. Please note the mandatory dot at the end. Available variables are:
%SYSUID. it is replaced with the name of the user logged in to Procman, in uppercase.
%(VAR). it is replaced with the VAR HWM variable.
%(VAR,n). it is replaced with a substring of the VAR HWM variable that starts at position 'n'. The position is 1-based.
%(VAR,n,p). it is replaced with a substring of the VAR HWM variable that starts at position 'n' and has a maximal length of 'p'. The position is 1-based.
Let's now assume that the current user is P391D and that HWM variable DEPARTMENT equals 'DEP14':
%SYSUID..JOBLIB.* is substituted as P391D.JOBLIB.*
%SYSUID..JOBLIB.%(DEPARTMENT)..* is substituted as P391D.JOBLIB.DEP14.*
%SYSUID..JOBLIB.%(DEPARTMENT,3)..* is substituted as P391D.JOBLIB.P14.*
%SYSUID..JOBLIB.%(DEPARTMENT,1,4)..* is substituted as P391D.JOBLIB.DEP1.*
You can use variables also in keys of conditions. A few examples:
If the selected source_system equals the value of TESTVAR1 HWM variable then use the first value. If the selected source_system equals the value of TESTVAR2 HWM variable then use the second value. Else use the last default value:
If the value of client HWM variable equals TEST then use the first value. If the value of client HWM variable equals HORIZONT then use the second value.
Else use the last default value:
This option configures the IWS variable substitution. In order to run the substitution, a setting for the selected target system and dataset must exist. You can type the target system explicitly, or you can use an asterisk if the setting is valid for all systems. The same holds true for libraries: enter an asterisk or specify a DSN mask. If the current target system and dataset match TargetSysIdMask and TargetDsnMask, the setting in the inner array is used. The meaning of the individual options is as follows:
By this option you can specify parameters used for allocation of temporary PO datasets. These datasets are allocated for jobs that Procman/HOS submits and for members that are analyzed. You can specify units (tr, cy), primary and secondary quantities, directory blocks and type.
Example:
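The example itself is missing here; a hedged sketch (the option key and parameter names are assumptions):

```php
'po_alloc' => array(   // hypothetical option name
  'unit' => 'tr',      // tracks (tr) or cylinders (cy)
  'prim' => 30,        // primary quantity
  'sec'  => 15,        // secondary quantity
  'dir'  => 50,        // directory blocks
  'type' => 'LIBRARY', // dataset type
),
```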
By this option you can specify parameters used for the allocation of temporary PS datasets. This is used when the HOSXIN dataset is created. You can specify the space and units (tr, cy).
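A hedged sketch (the option key and parameter names are assumptions):

```php
'ps_alloc' => array(   // hypothetical option name
  'unit'  => 'cy',     // tracks (tr) or cylinders (cy)
  'space' => 10,       // allocated space
),
```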
For PS datasets use instead.
For PO datasets use instead.
This option allows you to overwrite the default zos_tech_user setting defined outside of the environment. If you want to set the technical user differently for various activities, you can put zos_tech_user into the environment. This example sets technical users for the DEMO_ID and SYSH_ID systems only in the JCL_CHANGE_PROC_APPROVE activity of the JCL_CHANGE_PROD process in the HORIZONT client.
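The example is not reproduced here; a hedged sketch (the nesting and the technical user names are assumptions):

```php
'HORIZONT' => array(                         // client name
  'JCL_CHANGE_PROD' => array(                // process name
    'JCL_CHANGE_PROC_APPROVE' => array(      // activity name
      'Test environment 1' => array(         // environment name
        'zos_tech_user' => array(
          'DEMO_ID' => 'TECHUSR1',           // hypothetical technical users
          'SYSH_ID' => 'TECHUSR2',
        ),
      ),
    ),
  ),
),
```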
When Procman/HOS needs to submit a job on the host, it first reads an appropriate template (a skeleton of the JOB) and substitutes special variables (they are surrounded with ##). The result of this substitution is a real, valid job that is subsequently copied to a temporarily allocated dataset and submitted. There are many templates in Procman/HOS and they are used in many scripts: templates for JCL analysis and generation, for checking JCL after it has been edited, for copying to the host, and much more. In PHP scripts, template names are used instead of member names in order to allow member names to be set freely. The mapping of template names to member names must be correctly defined by the template option. The example below is not complete; it only shows the syntax. A list of standard templates follows the sample.
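The example itself is missing here; a hedged sketch of the mapping (the member names on the right are assumptions):

```php
'template' => array(
  'jck_job_new'    => 'JCKJOBN',   // template name => member name (hypothetical members)
  'jck_job_change' => 'JCKJOBC',
  'copy'           => 'HOSCOPY',
  'delete'         => 'HOSDELET',
),
```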
Standard templates:
jck_job_new JCL analysis in JCL NEW processes (m_jcl_analyse_1 module).
jck_job_change JCL analysis in JCL CHANGE processes (m_jcl_analyse_1 module).
jck_job_editchk1 JCL check of jobs in the request activity (m_jcl_change, m_jcl_change_objects, m_jcl_generate modules and in the preview started from m_jcl_fillforms module).
jck_job_editchk1_tws JCL check of jobs with IWS variable substitution in the request activity (m_jcl_change, m_jcl_change_objects, m_jcl_generate modules and in the preview started from m_jcl_fillforms module).
jck_job_ref JCL generator in JCL processes (m_jcl_analyse_2 module).
jck_job_approve JCL check of jobs started from m_jcl_approve module.
jck_job_tws_approve JCL check of jobs with TWS variable substitution started from m_jcl_approve module.
jck_job_final final JCL check of jobs executed after members have been copied to the target libraries (m_jcl_final_check module).
jck_job_fast fast JCL check of jobs (from target libraries, members are not copied to temporary dataset), available in m_hos_jcl_approve module.
approve_regenerate_job JCL generator started from m_jcl_approve module (uses selected jobs as the input).
jck_proc_new JCL analysis in PROC NEW processes (m_jcl_analyse_1 module).
jck_proc_change JCL analysis in PROC CHANGE processes (m_jcl_analyse_1 module).
jck_proc_editchk1 JCL check of procedures in the request activity (m_jcl_change, m_jcl_change_objects, m_jcl_generate modules and in the preview started from m_jcl_fillforms module).
jck_proc_editchk1_tws JCL check of procedures with IWS variable substitution in the request activity (m_jcl_change, m_jcl_change_objects, m_jcl_generate modules and in the preview started from m_jcl_fillforms module).
jck_proc_ref JCL generator in PROC processes (m_jcl_analyse_2 module).
jck_proc_approve JCL check of procedures started from m_jcl_approve module.
jck_proc_tws_approve JCL check of procedures with TWS variable substitution started from m_jcl_approve module.
jck_proc_final final JCL check of procedures executed after members have been copied to the target libraries (m_jcl_final_check module).
jck_proc_fast fast JCL check of procedures (from target libraries, members are not copied to temporary dataset), available in m_hos_jcl_approve module.
approve_regenerate_proc JCL generator started from m_jcl_approve module (uses selected procedures as the input).
change_by_parm_jobname1 analysis of selected JCL/PROC members in 'Change by parameters' function in m_jcl_fillforms module.
change_by_parm_jobname2 generation of new JCL/PROC members that performs required changes in 'Change by parameters' function in m_jcl_fillforms module.
change_cc when control cards are processed in JCL processes then a special job with variables that are substituted is analyzed instead of the selected member (because valid JCL is required by the analyzer). The member that is analyzed (by subsequent m_jcl_analyse_1 module) is specified by this template.
copy copying of members to target libraries by m_jcl_copy module (members are first copied to temporary library and then the copy job is submitted).
fast_copy copying of members to target libraries by m_jcl_fast_copy module (assumes the members already exist on the host, which is typically a case of INIT processes).
delete deletion of members from target libraries by m_jcl_delete module.
gdg_change_limit changing GDG limit in DSN processes by m_dsn_gdg_submit module.
dsn_append_tape appending (merging) datasets into the target dataset (newly created) in DSN_APPEND processes (m_dsn_append_submit module) in case of TAPE device type.
dsn_append_volume appending (merging) datasets into the target dataset (newly created) in DSN_APPEND processes (m_dsn_append_submit module) in case of VOLUME device type.
dsn_rename renaming datasets in DSN_RENAME processes by m_dsn_rename_submit module.
dsn_copy copying datasets in DSN_COPY processes by m_dsn_copy_submit module.
idcams an alternative template used instead of gdg_change_limit. When this template is defined then it has a higher priority (by default it calls IDCAMS utility).
dsn_gtyp finds current GDG limits of selected datasets in DSN processes by m_dsn_gdg_enter module.
dsn_parm finds parameters of selected datasets in DSN processes by m_dsn_append_enter module.
split members that are checked by the JCL checker, or members where IWS variables should be substituted, have to be copied to the host into a temporary dataset. This copying takes a while when members are copied one by one. If more members are copied, Procman/HOS can use a fast method of saving members to the host: all files are concatenated into one single PS dataset and split into members by the IEBUPDTE utility. To enable this fast method, the split template must be defined. Even when the template is defined, Procman/HOS may decide to use the standard one-by-one method if the number of analyzed files is small.
universal job that is submitted in UNIVERSAL processes by m_universal_submit module.
There can be other user-defined templates as well. They are used in interfaces of process configuration in cases when the template name is configurable by the user. Names of such templates are fully under control of the administrator who prepares your process definition. A typical example of such a template is (extracted from process configuration):
members_download job submitted when members are downloaded by the fast method when the Import button is pressed in the m_jcl_edit and m_jcl_approve modules. The fast method is used when it is enabled by the corresponding option.
members_upload job submitted when members are uploaded by the fast method when the Export button is pressed in the m_jcl_edit and m_jcl_approve modules. The fast method is used when it is enabled by the corresponding option.
You can use HWM variables and %SYSUID. (which is replaced with the current user name).
Setting at activity level for SYSH_ID system:
Setting at task level for SYSH_ID system:
You can use HWM variables and %SYSUID. (which is replaced with the current user name).
Setting at activity level for SYSH_ID system:
Setting at task level for SYSH_ID system:
This option allows you to overwrite the default set outside of the environment. If you want to set the template dataset name differently for various activities, you can put this option into the environment. It is also possible to set it at the task level, as is clear from the samples below.
This option allows you to overwrite the default set outside of the environment. If you want to set the prefix of temporary dataset names differently for various activities, you can put this option into the environment. It is also possible to set it at the task level, as is clear from the samples below.