- How To Add Linux Commands To The Queue And Execute Them One By One
- Install Task Spooler On Debian, Ubuntu, Linux Mint
- Add Linux Commands To The Queue And Execute Them One By One Using Task Spooler
- Conclusion
- Run one command after another, even if I suspend the first one (Ctrl-z)
- 2 Answers
- Running multiple commands in one line in shell
- 6 Answers
- How can I run multiple commands which have & in one command line?
- 2 Answers
- Lists
- Job Control
- Combine and execute multiple Linux commands in one line
- 11 Answers
How To Add Linux Commands To The Queue And Execute Them One By One
Today, I stumbled upon a cool Linux command-line utility called «Task Spooler». As the name says, Task Spooler is a Unix batch system that can be used to add Linux commands to a queue and execute them one after the other in numerical order (ascending order, to be precise). Please do not confuse it with the ‘at’ command, which is used to execute Linux commands at a specified time. Unlike at, Task Spooler runs the next command from the queue as soon as the previous command is finished.
This utility can be very useful when you have to run a lot of commands but don’t want to waste time waiting for one command to finish before running the next. You can queue them all up and Task Spooler will execute them one by one; in the meantime, you can do other activities. Each user on each system has his/her own job queue. It is also very useful when you know that your commands need a lot of RAM or disk, produce a lot of output, or for whatever reason are better not run at the same time. In a nutshell, Task Spooler is a command-line program to queue up other commands for batch execution.
In this brief tutorial, let me show you how to install and use Task Spooler in Unix-like operating systems.
Install Task Spooler On Debian, Ubuntu, Linux Mint
Task Spooler is available in the default repositories of Debian, Ubuntu and other DEB based systems. So, you can install it using command:
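On Debian-based systems the package is named task-spooler, so the install step is:

```
sudo apt-get install task-spooler
```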
For other systems, you can download the Task Spooler source from this link, build it as a native package for the Linux distribution you use, and install it as explained in any one of the following methods.
Add Linux Commands To The Queue And Execute Them One By One Using Task Spooler
Let us see some practical examples. All examples provided here are tested in Ubuntu 18.04 LTS system.
Note: On Debian/Ubuntu systems, Task Spooler should be executed with the ‘tsp’ command, because there is another program with the same name, ts (the Time Stamping Authority tool (client/server)). For Linux distributions other than Ubuntu/Debian, you can run it using the ‘ts’ command.
Run tsp command:
Right now, there is nothing in the queue. Let us add some commands to the queue. To do so, run:
Now, run tsp command again to view the commands in the queue:
Sample output:
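The session might look like this (job IDs, output paths and column layout are illustrative; the exact format varies with the ts version):

```
$ tsp echo "Welcome to OSTechNix"
0
$ tsp echo "Task Spooler test"
1
$ tsp
ID   State      Output               E-Level  Times(r/u/s)   Command [run=0/1]
0    finished   /tmp/ts-out.abc123   0        0.00/0.00/0.00 echo Welcome to OSTechNix
1    finished   /tmp/ts-out.def456   0        0.00/0.00/0.00 echo Task Spooler test
```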
As you see in the above output, each command has a unique ID (0, 1, 2, etc.) in ascending order. It also shows the current state of each command (e.g. finished or running) in the queue. The echo commands are very simple and short, so we got the state ‘finished’ right away.
Let us run a command that takes more time to finish. Take a look at the following command:
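A command along those lines (a hedged reconstruction; the exact invocation in the original article may differ):

```
find / -type f -printf '%T+ %p\n' 2>/dev/null | sort | head -n 20
```

Sorting the %T+ timestamps in ascending order puts the oldest files first; head then keeps the top 20.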
This command will find and display the top 20 oldest files in the root (/) file system.
Now add the above command to queue:
Sample output:
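Enqueueing it might look like this, with tsp echoing back the assigned job ID (illustrative):

```
$ tsp find / -type f -printf '%T+ %p\n'
2
```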
Now, run tsp command to view the list of commands in the queue.
Sample output:
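Something like (illustrative; paths and timings will differ):

```
$ tsp
ID   State      Output               E-Level  Times(r/u/s)   Command [run=1/1]
0    finished   /tmp/ts-out.abc123   0        0.00/0.00/0.00 echo Welcome to OSTechNix
1    finished   /tmp/ts-out.def456   0        0.00/0.00/0.00 echo Task Spooler test
2    running    /tmp/ts-out.ghi789                           find / -type f -printf %T+ %p\n
```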
Add Linux Commands To The Queue Using Task Spooler
As you see in the above output, the command with ID 2 is running. Similarly, you can add as many commands as you want using Task Spooler.
Update:
As one of our readers mentioned in the comment section, the find command should be run like below:
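Presumably the point of the correction is quoting: without it, the shell interprets any pipe or redirection itself before tsp ever sees it, so only part of the pipeline gets queued. Wrapping the whole pipeline keeps it intact, e.g.:

```
tsp sh -c "find / -type f -printf '%T+ %p\n' 2>/dev/null | sort | head -n 20"
```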
To view the output of a running job to check what’s going on, enter the following command:
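With ts/tsp this is the -t flag, which tails the job's output file (a hedged example; -c prints the whole file instead):

```
tsp -t 2
```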
Here, 2 is the ID of the running command. Press CTRL+C to return to the Terminal. It won’t cancel the running command; it will only take you back to the Terminal. The job will still run in the background.
You can remove a command (running, finished, or queued up) from the queue using the -r flag followed by its ID, like below.
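For example:

```
tsp -r 2
```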
The above command will remove the command that has ID 2 from the queue.
To clear all commands from the queue, simply run:
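That is:

```
tsp -C
```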
Please note that here C is capital. The above command will clear the completed commands from the queue. It will not remove any running or queued commands.
Remember, on distributions other than Debian/Ubuntu you need to run Task Spooler using the ts command.
For more details, refer to the man pages.
Conclusion
I find Task Spooler very useful when I have to run multiple commands. I am too lazy to wait for one command to finish before executing another. Using Task Spooler, I queue up the list of commands to be executed, and it executes them from the queue one by one in ascending order. I can also view the output of any running command using its ID at any time. Please be mindful that it won’t run all commands at once; instead, it runs them one after another. That said, Task Spooler is apt for executing batch jobs.
Run one command after another, even if I suspend the first one (Ctrl-z)
I know in bash I can run one command after another by separating them with semicolons, like
Or if I only want command2 to run only if command1 succeeds, using && :
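Spelled out with echo standing in for the real commands:

```shell
echo "first" ; echo "second"     # ';' runs the second command unconditionally
echo "first" && echo "second"    # '&&' runs the second only if the first exits 0
```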
This works, but if I suspend command1 using Ctrl-z , in the first case, it runs command2 immediately, and in the second case, it doesn’t run it at all. How can I run commands in sequence, but still be able to suspend the first command, but not have the second run until I have restarted it (with fg ) and it finishes? I’d prefer something as simple to type as possible, as I would like to do this interactively. Or maybe I just need to set an option somewhere.
By the way, what is the proper term for what Ctrl-z does?
2 Answers
The following should do it:
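Presumably:

```
(command1; command2)
```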
Note the added parentheses.
In Bash, when you place a job into the background (using CTRL+Z or &) it does not wait for the job to finish, and gives an exit code of zero (success). That much you have observed, and it is documented in the man pages.
The behaviour of logical «AND», &&, is that it tests from left to right. Each part must be successful, so if the first is unsuccessful then the second will not run. So with && it runs commands from left to right until one of them fails. The definition of success is an exit code ($?) of zero.
Contrast this with logical «OR», ||, which runs commands from left to right until one of them works.
An explanation of the subshell solution given by @NPE can also be found in the man pages:
Compound commands and command sequences of the form ‘a ; b ; c’ are not handled gracefully when process suspension is attempted. When a process is stopped, the shell immediately executes the next command in the sequence. It suffices to place the sequence of commands between parentheses to force it into a subshell, which may be stopped as a unit.
The proper term for CTRL+Z is the suspend character, again from the man pages:
Typing the suspend character (typically ^Z, Control-Z) while a process is running causes that process to be stopped and returns control to bash.
(Sorry to quote the man pages so much, but they really are your friends and worth reading)
If you look at stty -a you will see something like this:
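Typical output (details vary per system and terminal) includes a block of control-character assignments, among them the suspend character:

```
$ stty -a
speed 38400 baud; rows 24; columns 80; line = 0;
intr = ^C; quit = ^\; erase = ^?; kill = ^U; eof = ^D; eol = <undef>;
susp = ^Z; rprnt = ^R; werase = ^W; lnext = ^V; flush = ^O; ...
```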
So you can alter it, hence the phrase «typically». Don’t do that though, it will confuse the heck out of everyone. The terminal driver raises a SIGTSTP signal which is trapped by Bash.
Running multiple commands in one line in shell
Say I have a file /templates/apple and I want to
- put it in two different places and then
- remove the original.
So, /templates/apple will be copied to /templates/used AND /templates/inuse and then after that I’d like to remove the original.
Is cp the best way to do this, followed by rm ? Or is there a better way?
I want to do it all in one line so I’m thinking it would look something like:
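Judging from the answer's remark about the pipe operator, the attempted one-liner was presumably:

```
cp /templates/apple /templates/used | cp /templates/apple /templates/inuse | rm /templates/apple
```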
Is this the correct syntax?
6 Answers
You are using | (pipe) to direct the output of a command into another command. What you are looking for is && operator to execute the next command only if the previous one succeeded:
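Applied to the question's paths:

```shell
cp /templates/apple /templates/used && cp /templates/apple /templates/inuse && rm /templates/apple
```

If either cp fails, the rm never runs, so the original file is not lost.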
To summarize (non-exhaustively) bash’s command operators/separators:
- | pipes (pipelines) the standard output ( stdout ) of one command into the standard input of another one. Note that stderr still goes into its default destination, whatever that happens to be.
- |& pipes both stdout and stderr of one command into the standard input of another one. Very useful, available in bash version 4 and above.
- && executes the right-hand command of && only if the previous one succeeded.
- || executes the right-hand command of || only if the previous one failed.
- ; executes the right-hand command of ; always, regardless of whether the previous command succeeded or failed. Unless set -e was previously invoked, which causes bash to exit on an error.
How can I run multiple commands which have & in one command line?
I’ve run into a headache of a problem.
I want to execute multiple commands in the background, so I want to start them in bash one by one. It’s easy to start one command in the Linux shell in the background, just like this:
It’s also easy to start multiple commands, just like this:
But if I want to run multiple commands in background, I tried the following command format, but failed:
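The failing attempts were presumably along these lines; `&;` is a syntax error because & already terminates the command, leaving an empty command before the ;:

```
myCommand1 &; myCommand2 &       # bash: syntax error near unexpected token `;'
(myCommand1 &; myCommand2 &)     # same error inside a subshell
```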
Both formats fail. How can I run multiple commands which have & in one command line?
2 Answers
If you want to run them sequentially:
If you want them to run in parallel:
In bash you can also use this (the space behind the { and the ; are mandatory):
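Putting the three cases together (myCommand1/myCommand2 are placeholders):

```
(myCommand1; myCommand2) &       # sequential, the whole subshell in the background
myCommand1 & myCommand2 &        # parallel, each as its own background job
{ myCommand1; myCommand2; } &    # brace group: sequential in the background, bash syntax
```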
I suppose you want this:
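That is:

```
myCommand1 & myCommand2 &
```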
This starts myCommand1 and sends it to the background as it’s followed by an ampersand, then immediately starts myCommand2 and sends this to the background too, therefore releasing the shell again.
Lists
For better understanding, you may substitute pipeline with command here.
A list is a sequence of one or more pipelines separated by one of the operators ;, &, &&, or ||, and optionally terminated by one of ;, &, or a newline.
If a command is terminated by the control operator &, the shell executes the command in the background in a subshell. The shell does not wait for the command to finish, and the return status is 0. Commands separated by a ; are executed sequentially; the shell waits for each command to terminate in turn. The return status is the exit status of the last command executed.
AND and OR lists are sequences of one or more pipelines separated by the && and || control operators, respectively.
Source: man bash
Let’s break that down into examples. You can build a list by combining commands and separating them with one of these: ; & && || :
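For instance (each line is a valid list):

```shell
echo one ;  echo two    # sequence: both run
echo one &  echo two    # first in the background, second in the foreground
echo one && echo two    # second runs only if the first succeeds
echo one || echo two    # second runs only if the first fails (not here)
```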
You can terminate lists with one of these: ; & or a newline.
Normally you execute a command or a list by pressing Enter, which equals a newline. The semicolon ; serves the very same purpose, especially in scripts. The ampersand & however starts the command(s) in a subshell in the background, immediately releasing the shell.
You can use round brackets () or curly brackets {} to further group lists, the difference being that round brackets spawn a subshell and curly ones don’t. Curly brackets need a space after the first bracket and a semicolon or a newline before the closing one. For example:
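For example, cd makes the difference visible, because a subshell cannot change its caller's working directory:

```shell
(cd /tmp && pwd)       # prints /tmp, but the caller's directory is unchanged
{ cd /tmp && pwd; }    # prints /tmp, and the current shell is now in /tmp
```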
This can get quite complicated, if you’re unsure use true and false to test whether the construction works as expected:
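For example:

```shell
(true && false) ; echo $?      # prints 1: the list's status is that of false
{ true || false; } ; echo $?   # prints 0: false is never reached
```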
Job Control
The jobs command displays a list of the background jobs that are running or have recently been finished in the current shell. There are a number of keyboard shortcuts and commands for job control:
- Ctrl + Z types the suspend character that causes the process currently running in the foreground to be stopped, it is not terminated, but remains in the jobs list
- Ctrl + Y types the delayed suspend character that causes the process currently running in the foreground to be stopped when it attempts to read input from the terminal
fg = % brings a process into the foreground starting it if necessary, you can specify the process as follows:
bg = %& takes a process into the background starting it if necessary:
wait waits for a background process to be finished and returns its termination status:
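Interactive usage, assuming the job list contains a job number 3 (job specs: %n by job number, %string by command prefix, %+ for the current and %- for the previous job):

```
fg %3      # bring job 3 into the foreground
bg %3      # resume job 3 in the background
wait %3    # block until job 3 finishes; the exit status is job 3's
```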
Imagine you started a lengthy process ( jobs reveals it is number 3) and then realize you want the computer to be suspended when it finishes, plus echo a message if the process didn’t succeed:
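A hedged sketch; systemctl suspend is one way to suspend the machine, substitute your system's equivalent:

```
wait %3 || echo "job 3 failed"; systemctl suspend
```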
Combine and execute multiple Linux commands in one line
I am trying to merge multiple Linux commands into one line to perform a deployment operation. For example:
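The sequence presumably looked something like this (the folder, repository URL and start script are placeholders):

```
cd /my_folder
rm *.jar
svn co <repository-url>
./start.sh
```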
11 Answers
If you want to execute each command only if the previous one succeeded, then combine them using the && operator:
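For example (placeholder deployment commands):

```
cd /my_folder && rm *.jar && svn co <repository-url> && ./start.sh
```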
If one of the commands fails, then all other commands following it won’t be executed.
If you want to execute all commands regardless of whether the previous ones failed or not, separate them with semicolons:
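For example (placeholder deployment commands):

```
cd /my_folder ; rm *.jar ; svn co <repository-url> ; ./start.sh
```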
In your case, I think you want the first case where execution of the next command depends on the success of the previous one.
You can also put all commands in a script and execute that instead:
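A sketch of such a script (placeholder commands again):

```
#!/bin/sh
cd /my_folder \
&& rm *.jar \
&& svn co <repository-url> \
&& ./start.sh
```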
(The backslashes at the end of the line are there to prevent the shell from thinking that the next line is a new command; if you omit the backslashes, you would need to write the whole command in a single line.)
Save that to a file, for example myscript , and make it executable:
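For example:

```shell
chmod +x myscript    # add the execute bit so the file can be run as a program
```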
You can now execute that script like other programs on the machine. But if you don’t place it inside a directory listed in your PATH environment variable (for example /usr/local/bin , or on some Linux distributions ~/bin ), then you will need to specify the path to that script. If it’s in the current directory, you execute it with:
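That is:

```
./myscript
```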
The commands in the script work the same way as the commands in the first example; the next command only executes if the previous one succeeded. For unconditional execution of all commands, simply list each command on its own line:
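A sketch (placeholder commands):

```
#!/bin/sh
cd /my_folder
rm *.jar
svn co <repository-url>
./start.sh
```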