CI Scripting Techniques

Writing Command Interpreter (CI) scripts often involves manipulating the output of other commands. Command I/O Redirection (CIOR) is used to direct the output of commands and programs into files for processing. Because the CI has no built-in file processing commands, CIOR is also used to read the contents of files into CI variables.

Stepping through each record of a file is another challenge altogether. When a program processes a file, it first FOPENs the file and then FREADs records one at a time until it reaches end-of-file (EOF). The file system automatically places the record pointer at the first record when the file is opened, and moves the pointer forward with each subsequent FREAD. But when an MPE command is redirected to read its input from a file, the first record is read every time, because each command performs its own FOPEN.
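The same behavior is easy to demonstrate in a POSIX shell, which MPE/iX also provides. This is an analogy, not CI code: each separately redirected command performs its own open, so the record pointer starts over every time.

```shell
# Each redirected command opens the file anew, so it always sees record 1.
printf 'rec1\nrec2\nrec3\n' > demo.txt

read line < demo.txt   # opens demo.txt, reads rec1, closes it
first=$line
read line < demo.txt   # opens it again: rec1 again, not rec2
second=$line

echo "$first $second"  # prints "rec1 rec1"
rm demo.txt
```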

There are several common techniques for working around this behavior. I have written three different versions of the same command file to illustrate these approaches. Each one works and produces the same output: a list of files with a specific file code. Each captures the output of a LISTF command into a file, then reads and processes the contents of that file, searching for and displaying lines that contain the file code. It is the manner in which the files are handled that distinguishes each style.

The PRINT Method

The first method uses the PRINT command to step through each record of the file. The number of records in the file is determined (NumRecs), and a loop increments a counter (RecCount) until the counter passes the number of records, indicating that the end of the file has been reached.

Common Technique 1. Using the PRINT command to step through a file. (Least efficient):


parm FileSet="@",FileCode
comment This script uses the PRINT command to read one record
comment at a time from an input file, write that record to
comment another file, and input the record into a CI variable
comment for processing. Any file that has a filecode matching
comment the FileCode parameter is displayed.

purge infile >$null
build infile;rec=-80,,f,ascii
file infile,old;dev=disc
listf !FileSet,2;*infile
setvar NumRecs,finfo("infile","EOF")
setvar RecCount,1
setvar HoldVar,""
while RecCount <= NumRecs
print infile;start=!RecCount;end=!RecCount > HoldFile
input HoldVar < HoldFile
if rtrim(str(HoldVar,11,6)) = rtrim("!FileCode")
echo !HoldVar
endif
setvar RecCount,RecCount + 1
endwhile
deletevar NumRecs
deletevar RecCount
deletevar HoldVar
purge HoldFile >$null
purge infile

The Message File Option

The second method is to use message files (sometimes called FIFO files) to control the input of data. What differentiates message files from standard flat files is that records are deleted from the file after they are read. This is handled for you automatically by the MPE file system. So while each FOPEN still places the record pointer at the first record of the file, the previous first record has been removed and the next record in the file has bubbled up to the top. As each record is read and deleted, the end-of-file counter changes. The loop runs until the end-of-file count reaches zero.
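A POSIX-shell sketch of these destructive-read semantics, for readers who have not used message files (an analogy only; MPE does the record deletion for you, here it is simulated by rewriting the file):

```shell
# Simulate a message file: after each read, drop the first record so the
# next open finds the former second record at the top of the file.
printf 'rec1\nrec2\nrec3\n' > msgdemo.txt

while [ -s msgdemo.txt ]; do          # loop until "EOF" reaches zero
  read line < msgdemo.txt             # each open lands on the first record
  tail -n +2 msgdemo.txt > msgdemo.tmp && mv msgdemo.tmp msgdemo.txt
  echo "got: $line"
done
rm -f msgdemo.txt
```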

Common Technique 2. Using message files. The most common technique, but still not the most efficient:


parm FileSet="@",FileCode
comment This script uses a MESSAGE file to read one record at
comment a time from an input file into a CI variable for
comment processing. Any file that has a filecode matching the
comment FileCode parameter is displayed

purge infile >$null
build infile;rec=-80,,f,ascii;msg
file infile,old;dev=disc
listf !FileSet,2;*infile
setvar HoldVar,""
while finfo("infile","EOF") > 0
input HoldVar < infile
if rtrim(str(HoldVar,11,6)) = rtrim("!FileCode")
echo !HoldVar
endif
endwhile
deletevar HoldVar
purge infile

Multiple Entry Points

While both of the previous approaches are easy to write and follow, they are terribly inefficient, especially when you are processing a large file. Each record read from or written to a file using CIOR requires an FOPEN, an FREAD or FWRITE, and an FCLOSE. FOPENs are expensive operations to perform, and larger input files require more of them.

Here is the same command script written in a much different fashion. By using I/O redirection for the entire command script instead of for the individual MPE commands within it, we cause MPE to FOPEN the input file only once. Why? Because $STDIN for the entire script is redirected, much as when you redirect keyboard input to a file for a compiled program.
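The POSIX-shell equivalent of this idea makes the difference plain (again an analogy, not CI code): redirect the whole loop, not each command, and the file is opened once with the record pointer advancing across reads.

```shell
# One open for the entire loop; each read continues where the last stopped.
printf 'rec1\nrec2\nrec3\n' > demo.txt

while read line; do
  echo "got: $line"
done < demo.txt        # the redirection covers the whole loop

rm demo.txt
```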

I have created a single command file with multiple entry points, or subroutines. This gives me a single command file to maintain instead of two separate files.

Technique 3. Most efficient, especially for very large file sets.


parm FileSet="@",FileCode="",sub="_main"
comment This command script will list all files contained in
comment FileSet that have a file code matching the FileCode
comment parameter. There are two distinctly different subroutines
comment contained within this file. The _main subroutine performs
comment LISTFILE to qualify the filenames and places the output
comment into a file. It then invokes itself utilizing the _sub1
comment subroutine to process the output of the LISTFILE command

if "!sub" = "_main" then
purge infile >$null
build infile;rec=-80,,f,ascii
file infile,old;dev=disc
listf !FileSet,2;*infile
xeq !hpfile FileCode="!FileCode",sub="_sub1" <infile
purge infile
endif

if "!sub" = "_sub1" then
setvar NumRecs,finfo(HPSTDIN,"EOF")
while setvar(NumRecs,NumRecs-1) >= 0 do
setvar HoldVar,""
input HoldVar
if rtrim(str(HoldVar,11,6)) = rtrim("!FileCode")
echo !HoldVar
endif
endwhile
deletevar HoldVar
deletevar NumRecs
endif

There are actually several techniques within this script that are worth discussing. The key to understanding this scripting approach is the following line, which invokes the command file a second time.


xeq !hpfile FileCode="!FileCode",sub="_sub1" <infile

xeq !hpfile

Notice the use of the !HPFILE variable. HPFILE always contains the name of the currently running command file. By executing the variable instead of a hardcoded command file name, I can rename the file or move it to another location without having to modify the name stored inside the file.

FileCode="!FileCode",sub="_sub1" <infile

This component of the command has three important pieces. First, it passes the FileCode parameter along for processing; FileCode is not actually used by the _main section but must be passed on to _sub1. Second, it invokes the command file with instructions to use the "_sub1" subroutine. Third, it redirects $STDIN for the command file to INFILE. The INPUT command, which would normally obtain all input from $STDIN, will now get it from INFILE. Furthermore, since the file is not closed and reopened between reads, the record pointer is not reset to the beginning of the file.
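This multiple-entry-point pattern translates almost directly into a POSIX shell script, where "$0" plays the role of the CI's HPFILE (a sketch for comparison, not CI code):

```shell
#!/bin/sh
# Shell sketch of the multiple-entry-point pattern: "$0" names the
# currently running script, and the first argument selects the entry point.
sub=${1:-_main}

if [ "$sub" = "_main" ]; then
  printf 'rec1\nrec2\nrec3\n' > infile.txt
  sh "$0" _sub1 < infile.txt   # re-invoke self with stdin redirected once
  rm infile.txt
fi

if [ "$sub" = "_sub1" ]; then
  while read line; do          # one open; the pointer advances each read
    echo "got: $line"
  done
  exit 0                       # nothing more to do in the subroutine
fi
```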


while setvar(NumRecs,NumRecs-1) >= 0 do

This command is both simple and complicated at the same time. It is simply the beginning of a WHILE loop. The interesting part is that it uses SETVAR as both a command and a function simultaneously: the value of NumRecs is decremented by one on each iteration of the loop, and the result is not only stored back into NumRecs but also evaluated by the WHILE test. It is a shortcut for the following code:


setvar RecsLeft,NumRecs
setvar RecsProcessed,0
while RecsProcessed < RecsLeft do
<< insert code here >>
setvar RecsProcessed,RecsProcessed + 1
endwhile
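POSIX shells offer the same decrement-and-test idiom: an assignment inside arithmetic expansion both updates the variable and yields the new value for the test (a shell analogy, not CI code).

```shell
# $((NumRecs -= 1)) decrements NumRecs AND evaluates to the new value,
# just as the SETVAR function does in the CI's WHILE condition.
NumRecs=3
while [ $((NumRecs -= 1)) -ge 0 ]; do
  echo "pass with NumRecs = $NumRecs"
done                   # three passes: 2, 1, 0
```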

POSIX Anyone?

POSIX aficionados, here is a solution for you too! This script uses GREP and regular expressions to replace the entire loop above. If you are not familiar with GREP, it is a utility that searches for strings within files.


parm FileSet="@",FileCode=""

comment This command script will list all files contained in
comment FileSet that have a file code matching the FileCode
comment parameter.

purge infile >$null
build infile;rec=-80,,f,ascii
file infile,old;dev=disc
listf !FileSet,2;*infile
grep "' !FileCode '" infile
purge infile
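A stand-alone look at the grep pattern itself may help: surrounding the file code with spaces restricts the match to the code column, so a filename that merely contains the same digits is not reported. (The sample lines below are invented for illustration; real LISTF,2 output is formatted differently.)

```shell
# Two made-up LISTF-like lines: the digits 1030 appear in a filename on
# the first line and in the file-code column on the second.
printf 'FILE1030      80B  FA      10    10\n'  > listf.txt
printf 'FILEB    1030 80B  FA      10    10\n' >> listf.txt

grep ' 1030 ' listf.txt   # space-delimited pattern: matches FILEB only
rm listf.txt
```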

Summary

Command Interpreter scripting can be a convenient method for quickly writing programs. But inefficiently written command scripts can be a drain on system resources. If you need help incorporating any of the techniques here and applying them in your environment please feel free to call.