zshmisc - everything and then some
echo foo
is a simple command with arguments.
A pipeline is either a simple command, or a sequence of two or more simple commands where each command is separated from the next by `|' or `|&'. Where commands are separated by `|', the standard output of the first command is connected to the standard input of the next. `|&' is shorthand for `2>&1 |', which connects both the standard output and the standard error of the command to the standard input of the next. The value of a pipeline is the value of the last command, unless the pipeline is preceded by `!' in which case the value is the logical inverse of the value of the last command. For example,
echo foo | sed 's/foo/bar/'
is a pipeline, where the output (`foo' plus a newline) of the first command will be passed to the input of the second.
If a pipeline is preceded by `coproc', it is executed as a coprocess; a two-way pipe is established between it and the parent shell. The shell can read from or write to the coprocess by means of the `>&p' and `<&p' redirection operators or with `print -p' and `read -p'. A pipeline cannot be preceded by both `coproc' and `!'. If job control is active, the coprocess can be treated, in respects other than input and output, as an ordinary background job.
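For example, the following minimal sketch uses `cat' as the coprocess (any line-oriented filter would serve) and exchanges a single line with it:

coproc cat
print -p hello       # write a line to the coprocess
read -p line         # read the line back from it
print -r -- $line    # prints `hello'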
A sublist is either a single pipeline, or a sequence of two or more pipelines separated by `&&' or `||'. If two pipelines are separated by `&&', the second pipeline is executed only if the first succeeds (returns a zero value). If two pipelines are separated by `||', the second is executed only if the first fails (returns a nonzero value). Both operators have equal precedence and are left associative. The value of the sublist is the value of the last pipeline executed. For example,
dmesg | grep panic && print yes
is a sublist consisting of two pipelines, the second just a simple command which will be executed if and only if the grep command returns a zero value. If it does not, the value of the sublist is that return value, else it is the value returned by the print (almost certainly zero).
A list is a sequence of zero or more sublists, in which each sublist is terminated by `;', `&', `&|', `&!', or a newline. This terminator may optionally be omitted from the last sublist in the list when the list appears as a complex command inside `(...)' or `{...}'. When a sublist is terminated by `;' or newline, the shell waits for it to finish before executing the next sublist. If a sublist is terminated by a `&', `&|', or `&!', the shell executes the last pipeline in it in the background, and does not wait for it to finish (note the difference from other shells which execute the whole sublist in the background). A backgrounded pipeline returns a status of zero.
More generally, a list can be seen as a set of any shell commands whatsoever, including the complex commands below; this is implied wherever the word `list' appears in later descriptions. For example, the commands in a shell function form a special sort of list.
More than one parameter name can appear before the list of words. If N names are given, then on each execution of the loop the next N words are assigned to the corresponding parameters. If there are more names than remaining words, the remaining parameters are each set to the empty string. Execution of the loop ends when there is no remaining word to assign to the first name. It is only possible for in to appear as the first name in the list, else it will be treated as marking the end of the list.
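As an illustration (the parameter names and words here are arbitrary), the following loop assigns two words to two parameters on each iteration:

for key value in alpha 1 beta 2 gamma 3; do
  print -r -- "$key -> $value"
done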
Optional newlines or semicolons may appear after the always; note, however, that they may not appear between the preceding closing brace and the always.
An `error' in this context is a condition such as a syntax error which causes the shell to abort execution of the current function, script, or list. Syntax errors encountered while the shell is parsing the code do not cause the always-list to be executed. For example, an erroneously constructed if block in try-list would cause the shell to abort during parsing, so that always-list would not be executed, while an erroneous substitution such as ${*foo*} would cause a run-time error, after which always-list would be executed.
An error condition can be tested and reset with the special integer variable TRY_BLOCK_ERROR. Outside an always-list the value is irrelevant, but it is initialised to -1. Inside always-list, the value is 1 if an error occurred in the try-list, else 0. If TRY_BLOCK_ERROR is set to 0 during the always-list, the error condition caused by the try-list is reset, and shell execution continues normally after the end of always-list. Altering the value during the try-list is not useful (unless this forms part of an enclosing always block).
Regardless of TRY_BLOCK_ERROR, after the end of always-list the normal shell status $? is the value returned from always-list. This will be non-zero if there was an error, even if TRY_BLOCK_ERROR was set to zero.
The following executes the given code, ignoring any errors it causes. This is an alternative to the usual convention of protecting code by executing it in a subshell.
{
    # code which may cause an error
} always {
    # This code is executed regardless of the error.
    (( TRY_BLOCK_ERROR = 0 ))
}
# The error condition has been reset.
An exit command encountered in try-list does not cause the execution of always-list. Instead, the shell exits immediately after any EXIT trap has been executed.
If the option SH_GLOB is set for compatibility with other shells, then whitespace may appear between the left and right parentheses when there is a single word; otherwise, the parentheses will be treated as forming a globbing pattern in that case.
The short versions below only work if sublist is of the form `{ list }' or if the SHORT_LOOPS option is set. For the if, while and until commands, in both these cases the test part of the loop must also be suitably delimited, such as by `[[ ... ]]' or `(( ... ))', else the end of the test will not be recognized. For the for, repeat, case and select commands no such special form for the arguments is necessary, but the other condition (the special form of sublist or use of the SHORT_LOOPS option) still applies.
if [[ -o ignorebraces ]] { print yes }
works, but
if true {
  # Does not work!
  print yes
}
does not, since the test is not suitably delimited.
do done esac then elif else fi for case if while function repeat time until select coproc nocorrect foreach end ! [[ { }
Additionally, `}' is recognized in any position if the IGNORE_BRACES option is not set.
Alias expansion is done on the shell input before any other expansion except history expansion. Therefore, if an alias is defined for the word foo, alias expansion may be avoided by quoting part of the word, e.g. \foo. But there is nothing to prevent an alias being defined for \foo as well.
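For instance (using an arbitrary alias definition):

alias foo='print an alias'
foo       # alias expansion: prints `an alias'
\foo      # no alias expansion; the shell looks for a real command named foo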
A string enclosed between `$'' and `'' is processed the same way as the string arguments of the print builtin, and the resulting string is considered to be entirely quoted. A literal `'' character can be included in the string by using the `\'' escape.
All characters enclosed between a pair of single quotes ('') that is not preceded by a `$' are quoted. A single quote cannot appear within single quotes unless the option RC_QUOTES is set, in which case a pair of single quotes are turned into a single quote. For example,
print ''''
outputs nothing apart from a newline if RC_QUOTES is not set, but one single quote if it is set.
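To illustrate with a longer string:

setopt rc_quotes
print 'don''t panic'     # don't panic
unsetopt rc_quotes
print 'don''t panic'     # dont panic (two adjacent quoted strings)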
Inside double quotes (""), parameter and command substitution occur, and `\' quotes the characters `\', ``', `"', and `$'.
The following may appear anywhere in a simple command or may precede or follow a complex command. Expansion occurs before word or digit is used except as noted below. If the result of substitution on word produces more than one filename, redirection occurs for each separate filename in turn.
If any character of word is quoted with single or double quotes or a `\', no interpretation is placed upon the characters of the document. Otherwise, parameter and command substitution occurs, `\' followed by a newline is removed, and `\' must be used to quote the characters `\', `$', ``' and the first character of word.
Note that word itself does not undergo shell expansion. Backquotes in word do not have their usual effect; instead they behave similarly to double quotes, except that the backquotes themselves are passed through unchanged. (This information is given for completeness and it is not recommended that backquotes be used.) Quotes in the form $'...' have their standard effect of expanding backslashed references to special characters.
If <<- is used, then all leading tabs are stripped from word and from the document.
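For example (the indentation shown must consist of literal tab characters):

cat <<-EOF
	This line is indented with a tab, which <<- removes.
	EOF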
If one of the above is preceded by a digit, then the file descriptor referred to is that specified by the digit instead of the default 0 or 1. The order in which redirections are specified is significant. The shell evaluates each redirection in terms of the (file descriptor, file) association at the time of evaluation. For example:
... 1>fname 2>&1
first associates file descriptor 1 with file fname. It then associates file descriptor 2 with the file associated with file descriptor 1 (that is, fname). If the order of redirections were reversed, file descriptor 2 would be associated with the terminal (assuming file descriptor 1 had been) and then file descriptor 1 would be associated with file fname.
The `|&' command separator described in Simple Commands & Pipelines in zshmisc(1) is a shorthand for `2>&1 |'.
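For instance:

make |& less     # equivalent to `make 2>&1 | less'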
For output redirections only, if word is of the form `>(list)' then the output is piped to the command represented by list. See Process Substitution in zshexpn(1).
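A minimal sketch of this form (the filter chosen here is arbitrary):

print -l one two three > >(grep two)    # `two' is printed, possibly after the command returns, since list runs asynchronously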
If the user tries to open a file descriptor for writing more than once, the shell opens the file descriptor as a pipe to a process that copies its input to all the specified outputs, similar to tee, provided the MULTIOS option is set. Thus

date >foo >bar
writes the date to two files, named `foo' and `bar'. Note that a pipe is an implicit redirection; thus
date >foo | cat
writes the date to the file `foo', and also pipes it to cat.
If the MULTIOS option is set, the word after a redirection operator is also subjected to filename generation (globbing). Thus
: > *
will truncate all files in the current directory, assuming there's at least one. (Without the MULTIOS option, it would create an empty file called `*'.) Similarly, you can do
echo exit 0 >> *.sh
If the user tries to open a file descriptor for reading more than once, the shell opens the file descriptor as a pipe to a process that copies all the specified inputs to its output in the order specified, similar to cat, provided the MULTIOS option is set. Thus
sort <foo <fubar
or even
sort <f{oo,ubar}
is equivalent to `cat foo fubar | sort'.
Note that a pipe is an implicit redirection; thus
cat bar | sort <foo
is equivalent to `cat bar foo | sort' (note the order of the inputs).
If the MULTIOS option is unset, each redirection replaces the previous redirection for that file descriptor. However, all files redirected to are actually opened, so
echo foo > bar > baz
when MULTIOS is unset will truncate bar, and write `foo' into baz.
There is a problem when an output multio is attached to an external program. A simple example shows this:
cat file >file1 >file2
cat file1 file2
Here, it is possible that the second `cat' will not display the full contents of file1 and file2 (i.e. the original contents of file repeated twice).
The reason for this is that the multios are spawned after the cat process is forked from the parent shell, so the parent shell does not wait for the multios to finish writing data. This means the command as shown can exit before file1 and file2 are completely written. As a workaround, it is possible to run the cat process as part of a job in the current shell:
{ cat file } >file1 >file2
Here, the {...} job will pause to wait for both files to be written.
If the parameter NULLCMD is not set or the option CSH_NULLCMD is set, an error is caused. This is the csh behavior and CSH_NULLCMD is set by default when emulating csh.
If the option SH_NULLCMD is set, the builtin `:' is inserted as a command with the given redirections. This is the default when emulating sh or ksh.
Otherwise, if the parameter NULLCMD is set, its value will be used as a command with the given redirections. If both NULLCMD and READNULLCMD are set, then the value of the latter will be used instead of that of the former when the redirection is an input. The default for NULLCMD is `cat' and for READNULLCMD is `more'. Thus
< file
shows the contents of file on standard output, with paging if that is a terminal. NULLCMD and READNULLCMD may refer to shell functions.
Otherwise, the shell searches each element of $path for a directory containing an executable file by that name. If the search is unsuccessful, the shell prints an error message and returns a nonzero exit status.
If execution fails because the file is not in executable format, and the file is not a directory, it is assumed to be a shell script. /bin/sh is spawned to execute it. If the program is a file beginning with `#!', the remainder of the first line specifies an interpreter for the program. The shell will execute the specified interpreter on operating systems that do not handle this executable format in the kernel.
Functions execute in the same process as the caller and share all files and present working directory with the caller. A trap on EXIT set inside a function is executed after the function completes in the environment of the caller.
The return builtin is used to return from function calls.
Function identifiers can be listed with the functions builtin. Functions can be undefined with the unfunction builtin.
A function can be marked as undefined using the autoload builtin (or `functions -u' or `typeset -fu'). Such a function has no body. When the function is first executed, the shell searches for its definition using the elements of the fpath variable. Thus to define functions for autoloading, a typical sequence is:
fpath=(~/myfuncs $fpath)
autoload myfunc1 myfunc2 ...
The usual alias expansion during reading will be suppressed if the autoload builtin or its equivalent is given the option -U. This is recommended for the use of functions supplied with the zsh distribution. Note that for functions precompiled with the zcompile builtin command the flag -U must be provided when the .zwc file is created, as the corresponding information is compiled into the latter.
For each element in fpath, the shell looks for three possible files, the newest of which is used to load the definition for the function:
If element already includes a .zwc extension (i.e. the extension was explicitly given by the user), element is searched for the definition of the function without comparing its age to that of other files; in fact, there does not need to be any directory named element without the suffix. Thus including an element such as `/usr/local/funcs.zwc' in fpath will speed up the search for functions, with the disadvantage that functions included must be explicitly recompiled by hand before the shell notices any changes.
In summary, the order of searching is, first, in the parents of directories in fpath for the newer of either a compiled directory or a directory in fpath; second, if more than one of these contains a definition for the function that is sought, the leftmost in the fpath is chosen; and third, within a directory, the newer of either a compiled function or an ordinary function definition is used.
If the KSH_AUTOLOAD option is set, or the file contains only a simple definition of the function, the file's contents will be executed. This will normally define the function in question, but may also perform initialization, which is executed in the context of the function execution, and may therefore define local parameters. It is an error if the function is not defined by loading the file.
Otherwise, the function body (with no surrounding `funcname() {...}') is taken to be the complete contents of the file. This form allows the file to be used directly as an executable shell script. If processing of the file results in the function being re-defined, the function itself is not re-executed. To force the shell to perform initialization and then call the function defined, the file should contain initialization code (which will be executed then discarded) in addition to a complete function definition (which will be retained for subsequent calls to the function), and a call to the shell function, including any arguments, at the end.
For example, suppose the autoload file func contains
func() {
  print This is func;
}
print func is initialized
then `func; func' with KSH_AUTOLOAD set will produce both messages on the first call, but only the message `This is func' on the second and subsequent calls. Without KSH_AUTOLOAD set, it will produce the initialization message on the first call, and the other message on the second and subsequent calls.
It is also possible to create a function that is not marked as autoloaded, but which loads its own definition by searching fpath, by using `autoload -X' within a shell function. For example, the following are equivalent:
myfunc() {
  autoload -X
}
myfunc args...
and
unfunction myfunc    # if myfunc was defined
autoload myfunc
myfunc args...
In fact, the functions command outputs `builtin autoload -X' as the body of an autoloaded function. This is done so that
eval "$(functions)"
produces a reasonable result. A true autoloaded function can be identified by the presence of the comment `# undefined' in the body, because all comments are discarded from defined functions.
To load the definition of an autoloaded function myfunc without executing myfunc, use:
autoload +X myfunc
If a function of the form `TRAPNAL', where NAL is a signal name, is defined and null, the shell and processes spawned by it will ignore SIGNAL.
The return value from the function is handled specially. If it is zero, the signal is assumed to have been handled, and execution continues normally. Otherwise, the normal effect of the signal is produced; if this causes execution to terminate, the status returned to the shell is the status returned from the function.
Programs terminated by uncaught signals typically return the status 128 plus the signal number. Hence the following causes the handler for SIGINT to print a message, then mimic the usual effect of the signal.
TRAPINT() {
  print "Caught SIGINT, aborting."
  return $(( 128 + $1 ))
}
The functions TRAPZERR, TRAPDEBUG and TRAPEXIT are never executed inside other traps.
The functions beginning `TRAP' may alternatively be defined with the trap builtin: this may be preferable for some uses, as they are then run in the environment of the calling process, rather than in their own function environment. Apart from the difference in calling procedure and the fact that the function form appears in lists of functions, the forms
TRAPNAL() {
  # code
}
and
trap '
  # code
' NAL

are equivalent in most respects.
When a job is started asynchronously with `&', the shell prints a line which looks like:

[1] 1234
indicating that the job which was started asynchronously was job number 1 and had one (top-level) process, whose process ID was 1234.
If a job is started with `&|' or `&!', then that job is immediately disowned. After startup, it does not have a place in the job table, and is not subject to the job control features described here.
If you are running a job and wish to do something else you may hit the key ^Z (control-Z) which sends a TSTP signal to the current job: this key may be redefined by the susp option of the external stty command. The shell will then normally indicate that the job has been `suspended', and print another prompt. You can then manipulate the state of this job, putting it in the background with the bg command, or run some other commands and then eventually bring the job back into the foreground with the foreground command fg. A ^Z takes effect immediately and is like an interrupt in that pending output and unread input are discarded when it is typed.
A job being run in the background will suspend if it tries to read from the terminal. Background jobs are normally allowed to produce output, but this can be disabled by giving the command `stty tostop'. If you set this tty option, then background jobs will suspend when they try to produce output like they do when they try to read input.
When a command is suspended and continued later with the fg or wait builtins, zsh restores tty modes that were in effect when it was suspended. This (intentionally) does not apply if the command is continued via `kill -CONT', nor when it is continued with bg.
There are several ways to refer to jobs in the shell. A job can be referred to by the process ID of any process of the job or by one of the following:
The shell learns immediately whenever a process changes state. It normally informs you whenever a job becomes blocked so that no further progress is possible. If the NOTIFY option is not set, it waits until just before it prints a prompt before it informs you. All such notifications are sent directly to the terminal, not to the standard output or standard error.
When the monitor mode is on, each background job that completes triggers any trap set for CHLD.
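As a small sketch (run with job control active, for example in an interactive shell):

TRAPCHLD() { print "a background job changed state" }
sleep 2 &    # when this job finishes, the CHLD trap runs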
When you try to leave the shell while jobs are running or suspended, you will be warned that `You have suspended (running) jobs'. You may use the jobs command to see what they are. If you do this or immediately try to exit again, the shell will not warn you a second time; the suspended jobs will be terminated, and the running jobs will be sent a SIGHUP signal, if the HUP option is set.
To avoid having the shell terminate the running jobs, either use the nohup command (see nohup(1)) or the disown builtin.
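For example, where `mycmd' stands for any long-running command:

mycmd &      # start the job in the background
disown %1    # remove it from the job table; it will not be sent SIGHUP
mycmd &!     # equivalently, start and disown it in a single step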
The let builtin command takes arithmetic expressions as arguments; each is evaluated separately. Since many of the arithmetic operators, as well as spaces, require quoting, an alternative form is provided: for any command which begins with a `((', all the characters until a matching `))' are treated as a quoted expression and arithmetic expansion performed as for an argument of let. More precisely, `((...))' is equivalent to `let "..."'. For example, the following statement
(( val = 2 + 1 ))
is equivalent to
let "val = 2 + 1"
both assigning the value 3 to the shell variable val and returning a zero status.
Integers can be in bases other than 10. A leading `0x' or `0X' denotes hexadecimal. Integers may also be of the form `base#n', where base is a decimal number between two and thirty-six representing the arithmetic base and n is a number in that base (for example, `16#ff' is 255 in hexadecimal). The base# may also be omitted, in which case base 10 is used. For backwards compatibility the form `[base]n' is also accepted.
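For example:

print $(( 0xff ))      # 255
print $(( 16#ff ))     # 255
print $(( 2#1010 ))    # 10
print $(( 36#z ))      # 35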
It is also possible to specify a base to be used for output in the form `[#base]', for example `[#16]'. This is used when outputting arithmetical substitutions or when assigning to scalar parameters, but an explicitly defined integer or floating point parameter will not be affected. If an integer variable is implicitly defined by an arithmetic expression, any base specified in this way will be set as the variable's output arithmetic base as if the option `-i base' to the typeset builtin had been used. The expression has no precedence and if it occurs more than once in a mathematical expression, the last encountered is used. For clarity it is recommended that it appear at the beginning of an expression. As an example:
typeset -i 16 y
print $(( [#8] x = 32, y = 32 ))
print $x $y
outputs first `8#40', the rightmost value in the given output base, and then `8#40 16#20', because y has been explicitly declared to have output base 16, while x (assuming it does not already exist) is implicitly typed by the arithmetic evaluation, where it acquires the output base 8.
If the C_BASES option is set, hexadecimal numbers are output in the standard C format, for example `0xFF' instead of the usual `16#FF'. If the option OCTAL_ZEROES is also set (it is not by default), octal numbers will be treated similarly and hence appear as `077' instead of `8#77'. This option has no effect on the output of bases other than hexadecimal and octal, and these formats are always understood on input.
When an output base is specified using the `[#base]' syntax, an appropriate base prefix will be output if necessary, so that the value output is valid syntax for input. If the # is doubled, for example `[##16]', then no base prefix is output.
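For example:

print $(( [#16] 255 ))     # 16#FF, with the base prefix
print $(( [##16] 255 ))    # FF, without the prefix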
Floating point constants are recognized by the presence of a decimal point or an exponent. The decimal point may be the first character of the constant, but the exponent character e or E may not, as it will be taken for a parameter name.
An arithmetic expression uses nearly the same syntax, precedence, and associativity as expressions in C. The following operators are supported (listed in decreasing order of precedence):
The operators `&&', `||', `&&=', and `||=' are short-circuiting, and only one of the latter two expressions in a ternary operator is evaluated. Note the precedence of the bitwise AND, OR, and XOR operators.
Mathematical functions can be called with the syntax `func(args)', where the function decides if the args is used as a string or a comma-separated list of arithmetic expressions. The shell currently defines no mathematical functions by default, but the module zsh/mathfunc may be loaded with the zmodload builtin to provide standard floating point mathematical functions.
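For example, after loading the module, standard functions such as sqrt become available:

zmodload zsh/mathfunc
print $(( sqrt(2) ))     # approximately 1.41421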
An expression of the form `##x' where x is any character sequence such as `a', `^A', or `\M-\C-x' gives the ASCII value of this character and an expression of the form `#foo' gives the ASCII value of the first character of the value of the parameter foo. Note that this is different from the expression `$#foo', a standard parameter substitution which gives the length of the parameter foo. `#\' is accepted instead of `##', but its use is deprecated.
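For example:

print $(( ##A ))     # 65, the value of the character A
foo=hello
print $(( #foo ))    # 104, the value of `h', the first character of $foo
print $#foo          # 5, the length of $foo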
Named parameters and subscripted arrays can be referenced by name within an arithmetic expression without using the parameter expansion syntax. For example,
((val2 = val1 * 2))
assigns twice the value of $val1 to the parameter named val2.
An internal integer representation of a named parameter can be specified with the integer builtin. Arithmetic evaluation is performed on the value of each assignment to a named parameter declared integer in this manner. Assigning a floating point number to an integer results in rounding down to the next integer.
Likewise, floating point numbers can be declared with the float builtin; there are two types, differing only in their output format, as described for the typeset builtin. The output format can be bypassed by using arithmetic substitution instead of the parameter substitution, i.e. `${float}' uses the defined format, but `$((float))' uses a generic floating point format.
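A brief sketch of the difference in output formats:

typeset -F f=1.0/3    # -F requests fixed point output
print $f              # 0.3333333333, the declared format
print $(( f ))        # a generic format, e.g. 0.33333333333333331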
Promotion of integer to floating point values is performed where necessary. In addition, if any operator which requires an integer (`~', `&', `|', `^', `%', `<<', `>>' and their equivalents with assignment) is given a floating point argument, it will be silently rounded down to the next integer.
Scalar variables can hold integer or floating point values at different times; there is no memory of the numeric type in this case.
If a variable is first assigned in a numeric context without previously being declared, it will be implicitly typed as integer or float and retain that type either until the type is explicitly changed or until the end of the scope. This can have unforeseen consequences. For example, in the loop
for (( f = 0; f < 1; f += 0.1 )); do
  # use $f
done
if f has not already been declared, the first assignment will cause it to be created as an integer, and consequently the operation `f += 0.1' will always cause the result to be truncated to zero, so that the loop will fail. A simple fix would be to turn the initialization into `f = 0.0'. It is therefore best to declare numeric variables with explicit types.
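For example, declaring the variable first avoids the truncation:

typeset -F f    # or `float f'; either gives f a floating point type
for (( f = 0; f < 1; f += 0.1 )); do
  # use $f
done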
Normal shell expansion is performed on the file, string and pattern arguments, but the result of each expansion is constrained to be a single word, similar to the effect of double quotes. However, pattern metacharacters are active for the pattern arguments; the patterns are the same as those used for filename generation, see zshexpn(1), but there is no special behaviour of `/' nor initial dots, and no glob qualifiers are allowed.
In each of the above expressions, if file is of the form `/dev/fd/n', where n is an integer, then the test applied to the open file whose descriptor number is n, even if the underlying system does not support the /dev/fd directory.
In the forms which do numeric comparison, the expressions exp undergo arithmetic expansion as if they were enclosed in $((...)).
For example, the following:
[[ ( -f foo || -f bar ) && $report = y* ]] && print File exists.
tests if either file foo or file bar exists, and if so, if the value of the parameter report begins with `y'; if the complete condition is true, the message `File exists.' is printed.
If the PROMPT_SUBST option is set, the prompt string is first subjected to parameter expansion, command substitution and arithmetic expansion. See zshexpn(1).
Certain escape sequences may be recognised in the prompt string.
If the PROMPT_BANG option is set, a `!' in the prompt is replaced by the current history event number. A literal `!' may then be represented as `!!'.
If the PROMPT_PERCENT option is set, certain escape sequences that start with `%' are expanded. Some escapes take an optional integer argument, which should appear between the `%' and the next character of the sequence. The following escape sequences are recognized:
The left parenthesis may be preceded or followed by a positive integer n, which defaults to zero. A negative integer will be multiplied by -1. The test character x may be any of the following:
The forms with `<' truncate at the left of the string, and the forms with `>' truncate at the right of the string. For example, if the current directory is `/home/pike', the prompt `%8<..<%/' will expand to `..e/pike'. In this string, the terminating character (`<', `>' or `]'), or in fact any character, may be quoted by a preceding `\'; note when using print -P, however, that this must be doubled as the string is also subject to standard print processing, in addition to any backslashes removed by a double quoted string: the worst case is therefore `print -P "%<\\\\<<..."'.
If the string is longer than the specified truncation length, it will appear in full, completely replacing the truncated string.
The part of the prompt string to be truncated runs to the end of the string, or to the end of the next enclosing group of the `%(' construct, or to the next truncation encountered at the same grouping level (i.e. truncations inside a `%(' are separate), whichever comes first. In particular, a truncation with argument zero (e.g. `%<<') marks the end of the range of the string to be truncated while turning off truncation from there on. For example, the prompt '%10<...<%~%<<%# ' will print a truncated representation of the current directory, followed by a `%' or `#', followed by a space. Without the `%<<', those two characters would be included in the string to be truncated.