
Simplified Introduction to unix shell scripting

This note is for people who know very little of unix shell scripting, and would like a "quick" introduction to that subject.

(As such, this note is at a very introductory level, and is obviously incomplete.)

So, if you are really quite "blank" on unix shell scripting, this note should get you "on track" in a couple of hours (or a day), after which you can dive into more advanced material.

This note does not pretend to be anything more than a very simple introduction.

Version: 0.6

Date: 01/03/2009

Compiled by: Albert van der Sel - Antapex Technologies B.V.

It is assumed you are at least a bit comfortable using some basic shell commands, like changing and listing directories (using the "cd" and "ls" commands), and that you know how to view the contents of a file (for example, using the "cat", "more", "pg" commands).

In addition, some basic expertise with the editor "vi" (or another editor) would be great. But even if you do not have any vi knowledge, you can still follow this document, because the few basic vi actions needed are shown along the way.

Section 1. Let's start with a few simple examples.

Example 1:

Suppose you start a terminal session to your Unix host. A few moments later (after logging on), you get a shell prompt. You could see a very basic shell prompt (like the one below), which basically only shows a "$" (dollar sign), meaning that the shell is awaiting your commands.

$ _

But maybe your prompt is somewhat more informative, and, for example, shows you "where you are" on the filesystem and on which account you have logged on, like in this example:

albert@starboss:/home/albert > _

(How your prompt looks is simply a matter of how a system-wide profile, or your personal profile, is configured.)

Usually, you will "start" in your home directory, which could be, for example, something similar to "/home/albert". Suppose you know you have some txt documents in your "docs" directory, so you decide to move over there:

albert@starboss:/home/albert > cd docs

albert@starboss:/home/albert/docs >

Next, you want to list the contents of "/home/albert/docs", so you enter the "ls" command (using the "-al" flags to get a "long" listing).

albert@starboss:/home/albert/docs > ls -al

total 40

drwxr-xr-x 11 root system 4096 Oct 08 2002 .

drwxr-xr-x 30 root system 12288 Aug 08 11:41 ..

-rw-r--r-- 1 albert staff 2289 Sep 19 2002 todo.txt

-rw-r--r-- 1 albert staff 1289 Sep 20 2002 marlies.txt

-rw------- 1 albert staff 19523 Sep 20 2002 finance.txt

-rw-r----- 1 albert staff 2289 Sep 19 2002 checklist.txt

-rw-r--r-- 1 albert staff 33811 Sep 23 2002 math.txt

In this listing, a number of columns are worth mentioning. Obviously, the last column shows you the names of your files. The first column shows you entries like "rw-r--r--", which represent the "filemode", that is, the permissions that are in effect on your files. It means this: it shows first the permissions of the owner (you), then those of the group, and then those of "other" (which is everybody else). The permissions are thus always shown in three parts. The "maximum" that such a "three part" can get is: rwx, meaning "read" (r), "write" (w) and "execute" (x).

If such a permission is not in place, it shows you a "-".

So, for example, the file "todo.txt" has as a filemode rw-r--r--, and that means that you (the owner) can read and write it (rw-), that the group you belong to can only read the file (r--), and that the rest of the users (other) can also only read it (r--).

sort on "hammer", "tent" etc..

What can we do?

If you only had a couple of such files, you could give "interactive" commands like for example:

albert@starboss:/home/albert/docs > sort +1 checklist.txt | tr -d "[0-9]" > sorted_checklist.txt

Here we have used a few utilities (sort and tr, which are programs that you can call from the shell), and a few special "symbols" (that is, the "|" and ">" symbols), which the shell will interpret in a special way.

First of all, we call the "sort" program, which will sort the input (or file) that you have passed along to it. In our case, we essentially tell sort: "use the input "checklist.txt" and sort it on the second column". Of course, "sort" will do that, and normally it produces its output on a device called "stdout", which usually is your terminal screen. But you have used the pipe symbol "|" to signal the shell that the output from sort must not go to the screen, but must be passed along as input to the second program, "tr", which will handle that stream further.

So, what "tr" gets as input is the sorted information from the sort program, which is this:

2 backpack

0 hammer

3 money

4 passport

1 tent

That's the input "tr" will get, line for line. Now "tr" is a very handy utility. It's name comes f you could use it in many ways. In our example, we tell "tr" to delete from the input it gets (the the following characters 0,1,2,3,4,5,6,7,8,9 which in unix shells you comfortably can shorten to [ That's marvelous ofcourse. So tr handles that input, and will produce the following:

backpack

hammer

money

passport

tent

That's just what we expected of course, because essentially we told "tr" to delete (via "-d") all characters in the range [0-9]. Now "tr" too will normally write to "stdout", which is the screen. So if we had used the command without the final redirection, we would see the list above on our screen.

But, since we have used the "redirection" symbol ">", tr will now write its output to the file "sorted_checklist.txt". Mark my words, you will use the ">" symbol a lot. Most of the time, you will use it to redirect the output of some command to a file.

All of the above still has not much to do with shell scripting. But now we quickly move on to our first script. Suppose you have a lot of files like "checklist.txt", and you want to sort them and remove the digits, without always dreadfully typing in the command string we have seen above.

Now you will start up "vi" (or another flat file editor of your choice ) and we are about to creat

albert@starboss:/home/albert/docs > vi sortfile.ksh

Your screen should look similar to this:

-

~

~

~

~

~

~

~

~

~

~

"sortfile.ksh" [new file]

Press the "i" key, in order to tell vi, that you want to insert (i) text.

Now type the line that you will see below:

sort +1 $1 | tr -d "[0-9]" > sorted_$1

~

~

~

~

~

~

~

~

~

~

:wq!

Take notice of the fact that on purpose we typed the $1 (dollar 1) instead of a filename, which will be explained in a moment. Also take notice of the "sorted_$1" in the line shown above.

After you are ready with typing in the full command, press the "Esc" key, and type "wq!" (it will appear at the bottom of the screen).

By pressing the Esc key, you tell vi that you want to stop entering text and want to do "something else", in this case: write to the file (w) and then quit vi (q); hence that's why you entered "wq!".

If you didn't want to save anything, you could have entered "q!", which just ends the vi session without saving. You now have a new file in the "/home/albert/docs" directory, called "sortfile.ksh". It contains the command line we have seen before, with a few special noteworthy things.

First of all, we have used $1 instead of the filename that we want to process. In unix shells, $1, $2, $3 and so on will be interpreted as the parameters that you pass to your script.

So, what we mean is that we can process all our source files, with commands like for example:

albert@starboss:/home/albert/docs > sortfile.ksh finance.txt

albert@starboss:/home/albert/docs > sortfile.ksh checklist.txt

The $1 that's in the code will be interpreted as the first parameter you pass to your script, which in this case is a filename. So when executing the script, the $1 will be substituted with "checklist.txt" (or whatever other filename we pass). Note that we have used this in two occurrences. We tell "sort" to take $1 as its input (the file to be sorted), and the output of "tr" will be redirected into "sorted_$1", which will be a new file like "sorted_checklist.txt". As you see, the "$1" in "sorted_$1" will also be substituted with the filename (like checklist.txt).

If we call our script like this:

albert@starboss:/home/albert/docs > sortfile.ksh checklist.txt

We get a new sorted file "sorted_checklist.txt", from which all [0-9] characters are removed.

But... actually we cannot execute the script yet! You first need to make it "executable", which is nothing more than setting the "x" mode (permission) on the script. For that, we can use the "chmod" utility again:

albert@starboss:/home/albert/docs > chmod u+x sortfile.ksh

Because we used "u+x", only the owner (u) is allowed to execute the script. If you wanted the group (g) to be able to use the script as well, you might have used the following command:

albert@starboss:/home/albert/docs > chmod ug+x sortfile.ksh

So, do not forget to make your script executable after you have created it.

Ok, so now you can run your script. Even though it is just a "one liner" script, hopefully you appreciate what you have achieved. From now on, you only need to call your script with the source file as a parameter, and you get a new sorted file from which certain unwanted characters (like 0, 1, 2, etc. in our example) have been removed.

Example 2.

About the (positional) parameters (arguments) $1, $2 etc..

You could also use more than one positional parameter with your scripts. In the example above we used only one.

Suppose we create this script, called "echo_parameters.ksh"

albert@starboss:/home/albert/docs > cat echo_parameters.ksh

# Print all arguments (version 1)

#

echo $1

echo $2

echo $3

and let's execute the script in the following way:

albert@starboss:/home/albert/docs > ./echo_parameters.ksh pearls gold silver

pearls

gold

silver

So with this script, you are supposed to pass 3 positional parameters (arguments). The script itself does no more than simply echo the parameters (by using the "echo" command) to stdout (which is your screen). In the echo statements, $1 will be substituted by "pearls", $2 by "gold", and $3 by "silver". Note that we called our script with "./", which can be necessary at times.

For now, take it for granted that if you want to execute a script from the current directory, while that directory is not in your "path", you should precede it with "./".

Above you have seen the use of multiple positional parameters. Of course, this was a very primitive script. Let's take a look at a better version (note: anything "behind" a "#" symbol is regarded by the shell as comment):

albert@starboss:/home/albert/docs > cat echo_parameters.ksh

# Print all arguments (version 2)
#

for arg in $*

do

echo Argument $arg

done

echo The total number of arguments was $#

Let's run this one.

albert@starboss:/home/albert/docs > ./echo_parameters.ksh pearls gold silver

Argument pearls

Argument gold

Argument silver

The total number of arguments was 3

This is interesting! As you can see, you can not only use $1, $2, $3 etc. as input variables; this example also shows you that "a thing called" $# sums up the total number of variables passed to the script. That's not all. In the code you can see that I have built some sort of "loop" using "for arg in $*". In section 2, we dive into loops and such, but for now focus on that "other thing" called $*.

It is simply a representation of the entire list of arguments.

So:

The '$*' symbol stands for the entire list of arguments, and '$#' is the total number of arguments. You can use the $1, $2, $3 etc. variables in the body of your script, and they will be substituted with the values which you passed to your script.

Note:

Why did I use the extension ".ksh" in those examples? There are quite a few shells around, like

the bourne "sh" shell, the bash shell, the korn (ksh) shell etc.., just to name a few.

There are indeed some (minor) differences between those shells, mostly in terms of which features they support. It's more or less a convention to use the extension of the shell your script is supposed to run in. So, if I had used the bourne shell, I probably would have called my script "sortfile.sh".

Don't worry about "portability" of your script, and wonder whether your .ksh script can r

It usually does! Only the C shell can sometimes be a bit troublesome, and usually you can

with no problem in the bourne (sh), bash (bash) and korn (ksh) shells (though exceptions

As you will see later, you can also make clear in your code, which shell is supposed to r

Note:

Almost all unix systems have a superuser account, also called "root", which has almost unlimited power. That also means the power to destroy things, even if only "by an unfortunate mistake".

In almost all circumstances, it's not advisable to enter a system with the root account and test your scripts with it. Although that's a trivial remark, it's best to say it anyway, just in case.

Note:

You might have a personal Linux or Unix system. That would be great for studying unix and shell programming. Even if you have a Windows machine, you could still install a Linux distro (e.g. RedHat, Suse, Ubuntu) next to it. Preferably, you have Windows installed first (say on C:), and then install Linux (or a unix system like Solaris 10 x86) in its own partition, on (say 10GB or larger of) free unallocated disk space.

The modern Linux or Unix bootloaders are "intelligent" enough to let you boot to either Windows or Linux/Unix. If you really (yes, really) don't want to install Linux/Unix on your PC, you might check out "Cygwin", which gives you a nice bash shell on your PC (you can run a terminal from Windows). Just Google it.

Section 2: Some important theory.

This section will present some necessary theory before we move on to more advanced scripts.

It won't be too dry stuff, because examples will illustrate all the theory.

Even a script that contains only "one line" of code, as we have seen in example 1, can save you a lot of work. A script could contain just a series of sequential commands, like this:

command1

command2

etc..

But it might also contain "control structures" for decision taking, and "loops" in order to carry out certain statements multiple times.

Here we will discuss those loops and control structures, along with other essential information.

2.1. The very first line in your script.

Although not always mandatory, you should start your script with a reference to the "interpreter" (shell) this script is supposed to run in.

What we mean is this:

If your script is meant to run in the bash shell (very popular on Linux, but on other unixes as well), you should write as the first line in your script:

#!/bin/bash

If your script is meant to run in the korn shell, you should write as the first line in your script:

#!/usr/bin/ksh

The exact paths to the right shell can vary from system to system, so the above are just examples.

If you don't do this, you can probably "get away" with it (on most occasions), because many scripts are portable between shells like bourne, bash and korn.

But there are differences between the shells. So if your script is really meant for the korn shell, make sure you enter the correct info.

Your current shell will then simply start the other shell and pass the script to it.

Note: in normal use, the "#" character means that you have commented out anything that follows the "#" symbol.

For example:

If you have the following statement in your code:

echo $TMP_DIR

Then, when your script runs, and comes along that statement, it will echo (show on the screen) whatever value

was stored in the variable TMP_DIR.

Now, if you place the "#" symbol before that statement, like

# echo $TMP_DIR

that statement will be ignored (it is viewed as comment).

2.2. Your $PATH environment variable, and other variables.

Your $PATH variable will list a number of directories (or mountpoints) that your current shell will search, if you enter a command, or if you try to start a script.

If your script is located in some directory which is not in your $PATH, it will simply not be found (unless you specify the path, or precede it with "./").

If you want to see what is contained in your current $PATH variable, simply echo its contents:

albert@starboss:/home/albert/scripts > echo $PATH

/usr/local/bin:/usr/bin:/etc:/usr/sbin:/usr/ucb:/usr/bin/X11:/sbin

Now this is just an example. If your PATH is different, that's likely to be OK.

Your shell will search all directories in your $PATH "from left to right" when you call a command or script.

The individual directories are separated by a ":" symbol.

Can you "extend" your $PATH? Yes you can.

A variable like $PATH is called an "environment variable", because in some way it determines your working environment. So let's try to extend the $PATH to include some new directory.

Suppose you use the Oracle RDBMS v. 10g on your Unix/Linux system.

Suppose Oracle is installed in the following directory: "/opt/oracle/product/10g"

Now you need to call the Oracle utilities (like sqlplus and others) no matter what your current directory is. Those utilities are stored in "/opt/oracle/product/10g/bin".

Now you could do this:

First create a new environment variable "ORACLE_HOME" that points to "/opt/oracle/product/10g"

That will be accomplished by this:

albert@starboss:/home/albert/scripts > export ORACLE_HOME=/opt/oracle/product/10g

Believe it or not, we have now created a new environment variable called $ORACLE_HOME that points to that directory. You can easily check that with:

albert@starboss:/home/albert/scripts > echo $ORACLE_HOME

/opt/oracle/product/10g

Ok, now let's extend the $PATH environment variable to include that as well:

albert@starboss:/home/albert/scripts > export PATH=$PATH:$ORACLE_HOME/bin

At this point, your $PATH is extended to include the $ORACLE_HOME/bin directory as well.

But this is all valid only for your current terminal session. If you log off and log on again, your extension of the $PATH is lost.

The remedy to this is to include the extension in your "logon profile", which could be the file ".profile", or another file in your home directory (depending on your unix system and shell).

By the way, not only can you call the executables in $ORACLE_HOME/bin no matter "where you are", you can also quickly "jump" to $ORACLE_HOME in a single step. Just look at this:

albert@starboss:/home/albert/scripts > cd $ORACLE_HOME

albert@starboss:/opt/oracle/product/10g >

You should know that creating (declaring) an environment variable in the bourne, bash, and korn shells is done with the "export VAR=value" syntax.

But this goes in a somewhat different way when you use the C shell. We will not discuss that in this note; we stay in the "mainstream", that is, we discuss the stuff that most of us use.

You do not always have to use the "export" syntax. When you set a variable in your script that does not need to be global to your session, you can use the simple syntax as shown here:

ME=bill

BC="bill clinton"

i=10

Those variables are then just local to your script, while an exported variable is global to your environment. Only if, for example, "ME" needs to be global, do you use the statement:

export ME=bill
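To see the difference in action, here is a minimal sketch (reusing the example variables from above). A variable that is not exported is not visible in a child process, while an exported one is:

ME=bill
export BC="bill clinton"
ksh -c 'echo ME is: $ME and BC is: $BC'

The child ksh started by the last line will only show a value for BC, because ME was not exported.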

Here's a little task for you:

Try out what happens if you use the "set" and "env" commands. Both might give a lot of information about your environment. Try to spot the differences, if any.

Because the output of both commands can be quite large, pipe the output to the "more" command, so you can view the info screen by screen, instead of having it all speed over your screen (in which case you can only read the last part that fits on your screen).

So, try this:

albert@starboss:/home/albert/scripts > env | more

and then compare it with this:

albert@starboss:/home/albert/scripts > set | more

2.3. Positional parameters.

In section 1, example 2, we have already seen the use of the positional parameters (or arguments).

Also, when your script is running, some other $ variables get their values. Your shell will do that for you automatically. You have also seen, for example, the $# variable, which is the numerical count of all parameters that were passed. For easy reference, we list here the most important $ variables:

scriptname                                        $0
argument 1 through 9                              $1 .. $9
nth argument                                      ${n}
number of positional parameters                   $#
every positional parameter                        $@, $*
decimal value returned by last executed command   $?   (0 if the former command was successful)
pid of shell                                      $$
pid of last backgrounded command                  $!

As you can see from that list, when your script runs, the variable "$0" takes on the name of the script itself.

The "$?" can also be interesting. Your script might contain a series of commands. Any time such a command finishes, "$?" will contain the "returncode" indicating success or failure. Almost always, "0" means that the command was successful.

2.4. Assigning a value (or a list of values) to a variable from command output.

You can give a variable a static value like we have seen before, like in these examples:

me=bill

i=0

But you can also use the output of a command. You only need to enclose the command in ` quotes (backticks); on many keyboards you can find that symbol right below the Esc key.

So this works:

albert@starboss:/home/albert/scripts > datetime=`date`

albert@starboss:/home/albert/scripts > echo $datetime

Thu Jan 1 20:29:04 2009

Here we have assigned the output of the "date" command to the variable "datetime".

Also this works:

albert@starboss:/home/albert/scripts > listing=`ls`

and the variable $listing will contain all filenames residing in that directory. Note that we did not use any parameters with the ls command, so it produces a basic list with filenames only.
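As a small illustration of what you can do with such a variable (a sketch; the "for" loop syntax is explained in section 2.6), you could walk through that list of filenames like this:

listing=`ls`
for f in $listing
do
echo "Found file: $f"
done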

2.5. Sed, awk, grep and tr.

There are a couple of utilities (or commands) that you will use a lot in scripting. Those are "sed", "awk", "grep" and "tr".

The usage of the "tr" command was illustrated in section 1, example 1.

There is too much to say about those tools to fit in this simple note, so we (severely) limit ourselves to some common usages of them.

A few notes on the "sed" utility:

>> using sed in replacing strings in a file:

One of the most important usages of sed is to replace strings in a file, or in whatever input it receives.

Suppose you have the following txt file:

albert@starboss:/home/albert/docs > cat finance.txt

Hammers: bevore: 1000 after: 900

Tents: bevore: 1700 after: 1500

Bycicles: bevore: 850 after: 730

Now this is a small file, but imagine it contains thousands of records. Then you want an easy way to replace a string with another string. You can use sed for that purpose.

You use it in this way:

sed s/string/newstring/ old_file > new_file

The character after the "s" is the delimiter. It is conventionally a slash "/", but you c

So let's try this:

albert@starboss:/home/albert/docs > sed s/bevore/before/ finance.txt > new_finance.txt

albert@starboss:/home/albert/docs > cat new_finance.txt

Hammers: before: 1000 after: 900

Tents: before: 1700 after: 1500

Bycicles: before: 850 after: 730

As you see, we have replaced "bevore" with "before" throughout the entire file, and stored the result in "new_finance.txt".

So the "/" is the commonly used delimiter. But what if this "/" character is also part of the strings themselves? Suppose we have this file:

albert@starboss:/home/albert/docs > cat software.txt

was: /appl/software/was/was6.0

oracle: /appl/software/oracle/product

db2: /appl/software/db2v8

What if we want to replace "/appl/software/" with "/swstore/"?

Let's try this, where we use "_" as another delimiter:

albert@starboss:/home/albert/docs > sed s_/appl/software/_/swstore/_ software.txt > new_software.txt

albert@starboss:/home/albert/docs > cat new_software.txt

was: /swstore/was/was6.0

oracle: /swstore/oracle/product

db2: /swstore/db2v8

Note: it may be, on your specific unix/linux version and/or your shell, that you have to quote the sed expression, like this:

sed 's/string/newstring/' old_file > new_file

>> using sed as a stream editor:

The sed utility is always a stream editor, but in the above examples we have let it operate on files.

The input does not have to be a file, because you can also pass to sed the output stream of another command. Let's write down a few examples.

In some script, we could find a line of code such as:

for i in `cat /etc/cistab | sed -e 's/:/ /g'| awk '{ print $1 }'`

In another script, we could find something like:

export OPSYSVER=`uname -r | sed 's/^[A-Z]\.//'|cut -c1-2`

Ok, those statements might look a bit intimidating at first sight (which they are not, by the way).

The important thing here is that I only want you to see that we have a "|" symbol just before the sed command. So, "whatever it is" that's before the "|" symbol will send its output to sed. Then, sed will regard that as input to work with. That's all!

It's really the same principle as in the simple example of

$ cat finance.txt | more

where the "|" pipe, will make sure that the output of "cat", will be the input of "more".

In the examples above, we see stuff like sed 's/^[A-Z]\.//', where we want to know what that ^[A-Z]\. part means. So let's talk a minute about "matches".

A few notes about matches:

As you have already seen in example 1, if you want to write a shortcut for "0, or 1, or 2, or 3, or 4, or 5, or 6, or 7, or 8, or 9", you can simply write [0-9].

The same is true for all capital letters, you may simply write [A-Z].

Would you be surprised if [a-z] were a shorthand for all possible lowercase letters?

Now if you see an expression where a "^" is present at the beginning, it means "match at the beginning of the line". If you see an expression where a "$" is present at the end, it means "match at the end of the line". Look at this. Suppose we have the file "experiment.txt":

albert@starboss:/home/albert/docs > cat experiment.txt

Harry 1256 YouTube

What! Did I hear that good? 17 Now.

23 BLA BLA no way 12 OK.

It is nonsense of course, but let's see what happens if we do this:

albert@starboss:/home/albert/docs > sed "s/^[A-Z]//" experiment.txt > experiment_2.txt

albert@starboss:/home/albert/docs > cat experiment_2.txt

arry 1256 YouTube

hat! Did I hear that good? 17 Now.

23 BLA BLA no way 12 OK.

You see? sed "s/^[A-Z]//" removes the first capital letter at the beginning of each line, throughout the whole file.

Now, sed will behave in the same way if we feed it a stream of input coming from some program.

If we take a look again at the statement

export OPSYSVER=`uname -r | sed 's/^[A-Z]\.//'|cut -c1-2`

Then whatever that "uname -r" command will produce for output, sed considers it as input and will from all lines all first encountered Capital letters, just as you have seen happening wit

Here's a little task for you:

Create a couple of txt files, and experiment with the sed command using the ^ and $ symbols, similar to the examples above.

A few notes about the grep utility:

The grep utility "grabs" the string you search for, from a file, or from the output that

Actually, it will show you the "lines" from all the inputlines (from the file, or stream).

Suppose we have a file called people.txt.

albert@starboss:/home/albert/docs > more people.txt

Harry Smith

Albert van der Sel

Sally White

Roger Rabbit

Donald Duck

Now suppose this is a really big file, and you want to find out if a Sally is listed in that file. If you want to search, you need grep. We try the following command:

albert@starboss:/home/albert/docs > grep Sally people.txt

Sally White

You are not limited to files. As we have seen before, the output of some command can be fed as input to grep. Unix sysadmins do the following all the time:

If they want to see what processes are running on their machine, they usually use the command:

$ ps -ef

This gives them the list of all running processes. This could be a list of hundreds of processes. So, suppose they only want to see the Oracle processes (which carry "ora" in their process names); then they would use a command like the one shown below, just to "filter out" only the processes with "ora" in their name:

$ ps -ef | grep ora

As another example of searching, or filtering, take a look at this. Suppose you have a large logfile of some application. You do not want to browse through the entire file, but you are only interested in lines that contain an error message. You know that such lines start with "ERROR:". Then you might do this:

$ cat logfile.log | grep ERROR > /tmp/err.log

This leaves a (usually) much smaller file in the "/tmp" directory, which is much easier to browse.

There is so much more to say about "grep", but this is a simplified note, so I will end it here.

But not before this:

-- Using "grep -i" makes the search (or filter) case insensitive.

-- If you grep processnames from the "ps -ef" command, then the grep command itself will also be visible in the output. That's because you gave grep itself the (partial) name to search for: it "sees" itself.

If you want to omit this, add a "grep -v grep" to the pipeline (the "-v" flag inverts the match, that is, it excludes the matching lines).
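For example (a small sketch, reusing the Oracle processes from above):

$ ps -ef | grep ora | grep -v grep

This first filters the lines containing "ora", and then removes the line of the grep command itself from the result.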

A small note about the "awk" utility:

The "awk" utility is a programming environment, by itself. The only thing we will do here, is ment truly humble notes on this awesome utility (as if I would discuss a 1 dimensional line, w

Awk might appear as a strange name, but this comes from it's creators, Alfred Aho, Peter Weinberg In order to create shell scripts, you certainly do not need to know everything about "awk".

The awk utility (as all the other tools) can operate on files, or on any input that we feed it.

If you use "awk", it will often be in the form of awk '{ command fields}'

In general, the information you pipe to tools (via "|") are actually lines coming from a

of another program.

The awk utility thinks in terms of fields, so if you send it a line containing words, you might si awk to do this: awk '{print $1}', which displays the first field of the current line.

So if you take a look at this:

df -k |awk '{print $4,$7}' | grep -v "Filesystem"| grep -v tmp > /tmp/tmp.txt

then you know that "df -k" produces output, that goes to awk, and it will "print" only the 4th an that will be handled further by grep etc..

Or take a look at this:

An administrator, sometimes through a script, needs to (hard) terminate some process. If he or she does not know the process-id (pid), he or she could produce a list of processes with the "ps -ef" command, which produces a lot of information. Actually, it produces lines containing the "process-id" (pid), the name of the process, etc. When the pid is found, the Admin could then do this: "kill pid", and that process should terminate. So if this Admin knows that this particular process runs from the "/dir1/dir2/abc" directory, he or she could put something like this in a script:

kill `ps -ef | grep /dir1/dir2/abc | grep -v grep | awk '{print $2}'`

The part between the `` backticks should resolve into the wanted process-id of that particular program, if it is indeed running. The second field of the ps -ef output is the pid, which awk will faithfully print.

Note: be a bit careful in killing processes. This is of course a trivial remark.

2.6. Conditional logic, loops, and tests.

I'm quite sure that you have seen many of these before, from other scripting or programming languages.

All those constructs (logic and loops) are used in combination with a "boolean" test that returns "true" or "false". Take a look at this while loop:

while (( i < 100 ))

do

some statements

done

So, as long as i stays smaller than 100, the statements in the while loop will be executed.

What you can use in shell scripts, is the following:

conditionals:

if-then-else-fi

case-esac

&&

||

loops:

for-do-done

while-do-done

until-do-done

As you will see in a minute, it's all quite easy to use. Let's take a look at the conditional "blocks" first.

If then else fi:

The if-then-else-fi block uses a conditional test: if [condition] is true, then "do this", else "do that". The conditional test takes the form [ some_condition_is_true_or_false ].

Here are 2 examples:

if [ `whoami` != root ]

then

echo RUN THIS SCRIPT AS ROOT !

exit

fi

Where the command "whoami" resolves to what account you have logged on. That can be root, or anoth If you are not root (decided by the "!=", meaning "not equal"), then the script terminates.

if [ $? -eq 0 ] ; then Later, we will see that an alternative print the command exitcode is 0way to write that is:

else if (( $VAR != 0 )) meaning: if VAR is not eq print not good if (( $VAR == 0 )) meaning: if VAR is equal fi

As you know from paragraph 2.3, the "$?" viariable contains the returncode from the last command t was run from your script. Usually, "0" means succes, and any other value means a failure

The actual test is " $? -eg 0 ", meaning "is the returncode 0 or not?"

Did you see that we did a "string comparison" in the first example ("are you indeed root?"), and a "numerical comparison" in the second example ("is the returncode 0?")?

There is indeed some difference here. It's just something we need to accept, because this is the way the shell works.

The following list explains how to compare strings and numerical values:

string comparisons:

[ s1 = s2 ] True if strings s1 and s2 are equal

[ s1 != s2 ] True if strings s1 and s2 are unequal

numeric comparisons:

[ x -eq y ] True if the integers x and y are numerically equal

[ x -ne y ] True if integers are not equal

[ x -gt y ] True if x is greater than y

[ x -lt y ] True if x is less than y

[ x -ge y ] True if x>=y

[ x -le y ] True if x <= y

! : is Logical NOT operator

-a : is Logical AND

-o : is Logical OR

Apart from the string or numerical comparisons, you can also do all sorts of tests on objects like files and directories, such as the test "does the file abc.log exist?"

Take a look at the next example:

if [ -f /tmp/errlog ]

then

rm -rf /tmp/errlog      # be very, very careful in using the rm -rf command

else

echo "no errorlog found"

fi

Here we test if the object "/tmp/errlog" exists. If true, delete it. If false, then print a message. It's really a test of whether a file (-f) exists. If you want to test whether a directory exists, you would use "-d" instead (see the sketch after the list below). Again, don't wonder about that syntax. It's just the way the system works.

tests on objects:

[ -f file ] True if the file is a plain file

[ -d file ] True if the file is a directory

[ -r file ] True if the file is readable

[ -w file ] True if the file is writable

[ -x file ] True if the file is executable

[ -h file ] True if the file is a symbolic link

[ -s file ] True if the file contains something

[ -g file ] True if setgid bit is set

[ -u file ] True if setuid bit is set
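As a small sketch of such an object test (the directory name is just an example), testing for a directory goes the same way as for a file, but with "-d":

if [ -d /data/finance/received ]
then
echo "the directory exists"
else
echo "no such directory"
fi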

A few very important remarks:

1. Use spaces in "[ condition ]"

If we take a look at the "[" symbol, used in "if [ condition ] then ..", we must tell you that it could actually be a separate program (the "test" command), or it could be built into the shell itself.

Anyway, always keep a space between the "[", your condition, and the "]".

WHILE
=====

The following is an example of a 'while' loop in the korn shell. It keeps reading input until you type "yes" or "q":

keeplooping=1;

while [[ $keeplooping -eq 1 ]] ; do

read quitnow

if [[ "$quitnow" = "yes" ]] ; then

keeplooping=0

fi

if [[ "$quitnow" = "q" ]] ; then

break;

fi

done

UNTIL

=====

The other kind of loop in ksh is 'until'. The difference between them is that 'while' implies looping as long as something remains true, whereas 'until' implies looping as long as something remains false; at some point during the loop it becomes true, and the loop ends:

until [[ $stopnow -eq 1 ]] ; do

echo just run this once

stopnow=1;

echo we should not be here again.

done

2.7 The use of functions in your script.

You can define "functions" in your script, which you can (if you want) call "later" (some

There is really nothing difficult about it. You can simply use the same code as you have seen befo like for example "if .. Then..else" statements, or "for..do..done" etc.. Just write whatever you in your function. That's all up to you, and what you want the function to accomplish.

The use of functions gives you a way, to nicely "group" code for a certain task, and to make your So, a script could be "shaped" like in this example:

#!/usr/bin/ksh

CONTINUE=true      # set the variable CONTINUE to have a value of true

functionA()

{

Whatever code

}

functionB()

{

Whatever code

And some code that (for example) re-evaluates the value of the variable CONTINUE (which was set to "true" at the top of the script), on the basis of some sort of test inside this function.

}

functionC()

{

Whatever code

}

# main program

functionA      # call functionA at all times
functionB      # call functionB at all times. It will also determine what value
               # the variable CONTINUE gets.
if [ $CONTINUE = true ]
then
functionC      # only call functionC if CONTINUE is still true
fi

Now let's look at a more concrete (though still simplified) example. Suppose that, once a day, exactly one file should be uploaded into a directory $UPLOAD_DIR, and that our script should then hand that file over to the Finance department. The script below uses four functions. (The directory names below are just examples.)

#!/usr/bin/ksh

CONTINUE=true                            # set the variable CONTINUE to have a value of true
UPLOAD_DIR=/data/upload                  # the directory where the file gets uploaded (just an example path)

cd $UPLOAD_DIR

DetermineIfWeGoOn()                      # This is our first function. It counts the number of files
{                                        # in $UPLOAD_DIR and decides (via CONTINUE) whether we go on.
NO_OF_FILES=`ls | wc -l | tr -d ' '`     # count the files (tr removes any padding spaces)
case $NO_OF_FILES in
0 )
MAIL_TEXT="There were no files found in the $UPLOAD_DIR"
CONTINUE=false
;;

1 )

MAIL_TEXT="There was 1 file found in the $UPLOAD_DIR"

# Note that this is good, and CONTINUE is still true

;;

* )

MAIL_TEXT="There were more than one file found in the $UPLOAD_DIR"

CONTINUE=false

esac

}

ProcessTheFile()                         # This is our second function.
{
NAME_OF_FILE=`ls -1 | grep '[0-9a-zA-Z]'`            # Determine the name of the file.
tr -d '\r' < $NAME_OF_FILE > FILE_FOR_FINANCE.txt    # We remove the carriage returns and store the result
}                                                    # in the file "FILE_FOR_FINANCE.txt".

MoveTheFileToFinance()                   # This is our third function.
{                                        # It will move the file to the Finance directory.
mv FILE_FOR_FINANCE.txt /data/finance/received

}

CleanupUpload()                          # This is our fourth function.
{                                        # If the number of files found was wrong, something is off,
rm -rf *                                 # and we throw it all away. (Be very careful with rm -rf!)
}

# Main:

DetermineIfWeGoOn                        # This function we call always. It checks the number
                                         # of files found. If the number is not exactly 1,
                                         # the variable CONTINUE will be set to false.
if [ ${CONTINUE} = "true" ]
then
ProcessTheFile                           # If CONTINUE is still true, we process the file
MoveTheFileToFinance                     # and move it to the Finance directory.
else
CleanupUpload                            # Otherwise, we clean up the upload directory.
fi

echo "$MAIL_TEXT" | mail -s "Result of the Upload" ops@example.com      # As the last step, we mail the result
                                                                        # (the address is just an example).

Of course, you could refine such a script in many ways. The sole purpose was to demonstrate the use of functions. Don't you agree that in this way, the code is very structured and easy to read?

2.8. Just a few other tips.

Tip 1: append to a file, using ">>".

You know that you can redirect output (from some program) to a file, using the ">" symbol.

So if you do this:

$ cat some_logfile.log | grep -i ERROR > /tmp/err.txt

You know that you then filter the lines containing upper- or lowercase "error" strings, and redirect them to the file /tmp/err.txt. If you now do this:

$ cat some_logfile.log | grep -i WARNING > /tmp/err.txt

The err.txt file will be recreated, and the former info is lost.

If you want to "append" to a file, use the ">>" symbol, like in:

$ cat some_logfile.log | grep -i WARNING >> /tmp/err.txt

Now the first lines in err.txt contain the errors, followed by the warnings (from "some_logfile.log").

Tip 2: determining the type of file.

Sometimes, if you see a file that's not yours, you may wonder whether it's a binary file, or a flat ascii file (which you can view with the "cat", "more", "pg" and other commands).

Just use the "file file_you_want_to_test" command for that.

Of course, you cannot just "edit" a binary file in the way you can with text files.

Example:

albert@starboss:/home/albert/docs > file software.txt

software.txt: ascii readable file

albert@starboss:/home/albert/docs > cd /opt/tuxedo/product/bin

albert@starboss:/opt/tuxedo/product/bin > file tmboot

tmboot: executable or object module

Tip 3: Finding files.

There are multiple ways, to find files in a complex directory tree.

1. ls -alR

On some unixes, the "ls" command can be used with the recursive option, R.

Suppose you want to list all your files that lie "somewhere" in directories under your home directory:

albert@starboss:/home/albert > ls -alR

Or, if you only want to find specific types of files, like .ksh scripts:

albert@starboss:/home/albert > ls -alR *.ksh

2. find command

You can also use the "find" command. Most of the times, you will use it as follows.

You go to the most "toplevel" directory from which you want to search.

Then you use find as follows: "find . -name file_name_you_want_to_search -print"

Example:

Suppose you want to search for the file "software.txt" which is somewhere in a subdir in "/data". Then you can try this:

albert@starboss:/data > find . -name software.txt -print

You can also use "wildcards" like "*" or "?" to find files which have something common in their na For example:

albert@starboss:/data > find . -name "*.txt" -print

In this case, I was searching for all files with the .txt extension.

Because you used the ".", it means to find to start searching from your current directory, which i this example was "/data".

Note that the "." always means "the current directory".

3. which or whereis

Maybe the "which" and "whereis" commands are available on your system.

Task: Just try them. What information do they provide?
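As a hint of what you might see (a sketch; the exact output depends on your system):

albert@starboss:/home/albert > which ls
/usr/bin/ls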

Tip 4: Use of the IFS, or internal field separator, variable

If you consider some line of text, you can tell your shell what the field separator is.

In normal text, a separator will be a "space", or a "tab" etc., but if you look, for example, at the $PATH variable, your field separator will be the ":" symbol.

#!/usr/bin/ksh
# Search every directory of $PATH for the program given as the first argument.
IFS=:                    # Here we say that the field separator is the ":" character.
for p in $PATH
do
    if [ -x $p/$1 ]
    then
        echo $p/$1       # found it: print the full path and stop
        exit 0
    fi
done
echo "No $1 found in path"
exit 1
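Assuming you saved the script above as, say, "findpath.ksh" (a name chosen here just for illustration) and made it executable, you could use it like this. If you extended your $PATH as in paragraph 2.2, the output could be:

albert@starboss:/home/albert/scripts > ./findpath.ksh sqlplus
/opt/oracle/product/10g/bin/sqlplus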

Tip 5: Getting user input from keyboard.

Sometimes you want to interact with the user of your script. Maybe you wrote a "menu" script of some sort, where the user can choose from a list of programs to run.

Take a look at this example:

echo Tell us what you want: yes or no

read answer

case $answer in

yes|Yes|y)

echo "You typed a yes."

;;

no)

echo "You typed a no."

;;

q*|Q*)

# Quit the menu

exit

;;

*)

# This is a sort of catch all

echo Did not understand what you want. Sorry.

;;

esac

Tip 6: A small note on Arrays.

In the korn shell, it is possible to use arrays. Not all shells support that functionality.

The most common type is the indexed array, which is a group of similar elements, each characterized by its position, or index, in the array.

An element "in the array" which has index (or position, counting from 0) k, can be addressed as ${array[k]}.

Take a look at the next example:

set -A colors RED GREEN BLUE      # or use:
                                  # colors[0]=RED
                                  # colors[1]=GREEN
                                  # colors[2]=BLUE

i=0

while [ $i -lt 3 ]      # meaning: while i is lower than 3, do ...

do

echo ${colors[$i]}

(( i=i+1 ))

done

Output:

RED

GREEN

BLUE

Tip 7: Scheduling your scripts.

There are several ways to schedule your scripts. The most common one is to use cron, the standard unix scheduling daemon.
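As a minimal sketch (the schedule and the script path are just examples), a line in your crontab could look like this; it runs a script every night at 23:00:

0 23 * * * /home/albert/scripts/myscript.ksh

You can list your current crontab with "crontab -l", and edit it with "crontab -e".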
