- Delete files older than 10 days using shell script in Unix [duplicate]
- How to delete files older than X hours
- Find And Delete Oldest File If There Are More Than X Files In A Directory In Linux
- Find and delete oldest file in a directory in Linux
- How to Delete Files Older than 30 days in Linux
- 1. Delete Files older Than 30 Days
- 2. Delete Files with Specific Extension
- 3. Delete Old Directory Recursively
- Conclusion
- Bash, find and delete old files
Delete files older than 10 days using shell script in Unix [duplicate]
I’m new to shell scripts; can anyone help? I want to delete the scripts in a folder that are older than 10 days, counting back from the current date. The scripts look like:
The script will run every 10 days via crontab, which is why I need the current date.
3 Answers
find is the common tool for this kind of task:
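Putting together the pieces explained below, the command looks like this:

```sh
find ./my_dir -mtime +10 -type f -delete
```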
EXPLANATIONS
- ./my_dir your directory (replace with your own)
- -mtime +10 older than 10 days
- -type f only files
- -delete no surprise. Remove it to test your find filter before executing the whole command
And take care that ./my_dir exists to avoid bad surprises!
Just spicing up the shell script above to delete older files, but with logging and calculation of the elapsed time.
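A sketch of such a script; the log folder path, log file name, and the 7-day retention used here are assumptions, so adjust them to your setup:

```sh
#!/bin/bash
# Folder that holds the backup logs (assumption: adjust to your setup)
LOG_DIR="/var/log/backup"

# Log file named with a timestamp
LOG_FILE="$LOG_DIR/cleanup_$(date +%Y%m%d_%H%M%S).log"

START=$(date +%s)
echo "Cleanup started at $(date)" >> "$LOG_FILE"

# Delete *.txt log files older than 7 days, only in $LOG_DIR itself (no subfolders),
# and record every deleted file in the log
find "$LOG_DIR" -maxdepth 1 -type f -name '*.txt' -mtime +7 -print -delete >> "$LOG_FILE"

END=$(date +%s)
echo "Cleanup finished at $(date)" >> "$LOG_FILE"
echo "Elapsed time: $((END - START)) seconds" >> "$LOG_FILE"
```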
The code adds a few things.
- log files named with a timestamp
- log folder specified
- find looks for *.txt files only in the log folder
- -type f ensures you only delete files
- -maxdepth 1 ensures you don't enter subfolders
- log files older than 7 days are deleted (assuming this is for a backup log)
- notes the start / end time
- calculates the elapsed time for the backup operation.
Note: to test the code, just use -print instead of -print -delete. Do check your path carefully, though.
Note: do ensure your server time is set correctly via date (set up the timezone/NTP properly). Additionally, check file times with ‘stat filename’.
Note: mtime can be replaced with mmin for finer control, because mtime discards fractional days when working in units of days (so "older than 2 days", i.e. +2, actually means at least 3 days old).
How to delete files older than X hours
I’m writing a bash script that needs to delete old files.
It’s currently implemented using:
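Presumably something along these lines, with a placeholder path:

```sh
# -mtime +0 matches files modified more than 24 hours ago
# (find truncates the age to whole days, and + means "greater than")
find /path/to/files -type f -mtime +0 -exec rm {} \;
```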
This will delete all of the files older than 1 day.
However, what if I need a finer resolution than 1 day, say 6 hours old? Is there a nice clean way to do it, like there is with find and -mtime?
9 Answers
Does your find have the -mmin option? That lets you test the number of minutes since the last modification:
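For example, removing files more than six hours (360 minutes) old could look like this, assuming GNU find and a placeholder path:

```sh
find /path/to/files -type f -mmin +360 -exec rm {} \;
```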
Or maybe look at using tmpwatch to do the same job. phjr also recommended tmpreaper in the comments.
Here is the approach that worked for me (and I don’t see it being used above): deleting all the files older than 59 minutes while leaving the folders intact.
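A sketch of that approach, with a placeholder path:

```sh
# -type f keeps directories intact; only plain files older than 59 minutes are removed
find /path/to/files -type f -mmin +59 -delete
```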
You could do this trick: create a file with a timestamp of 1 hour ago, and use find’s ! -newer test against that file.
(Or use touch -t to create such a file).
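A sketch of the trick, assuming GNU touch (for the -d date string) and placeholder paths:

```sh
# create a reference file whose mtime is one hour in the past
touch -d '1 hour ago' /tmp/one_hour_ago

# delete files that are NOT newer than the reference, i.e. at least an hour old
find /path/to/files -type f ! -newer /tmp/one_hour_ago -exec rm {} \;

rm /tmp/one_hour_ago
```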
-mmin is for minutes.
Try looking at the man page.
If one’s find does not have -mmin, and if one is also stuck with a find that accepts only integer values for -mtime, then all is not necessarily lost if one considers that "older than" is similar to "not newer than".
If we were able to create a file that has an mtime of our cut-off time, we could ask find to locate the files that are "not newer than" our reference file.
Creating a file that has the correct time stamp is a bit involved, because a system that doesn’t have an adequate find probably also has a less-than-capable date command that cannot do things like date +%Y%m%d%H%M%S -d "6 hours ago".
Fortunately, other old tools can manage this, albeit in a more unwieldy way.
To begin finding a way to delete files that are over six hours old, we first have to find the time that is six hours ago. Consider that six hours is 21600 seconds:
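For example, a perl one-liner along these lines prints the time 21600 seconds ago, formatted so that touch -t can use it:

```sh
perl -e 'my @t = localtime(time - 21600);
         printf "%04d%02d%02d%02d%02d.%02d\n",
                $t[5]+1900, $t[4]+1, $t[3], $t[2], $t[1], $t[0];'
```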
Since the perl statement produces the date/time information we need, use it to create a reference file that is exactly six hours old:
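Feeding that output to touch -t creates the reference file; the file name here is just a placeholder:

```sh
touch -t "$(perl -e 'my @t = localtime(time - 21600);
                     printf "%04d%02d%02d%02d%02d.%02d",
                            $t[5]+1900, $t[4]+1, $t[3], $t[2], $t[1], $t[0];')" /tmp/six_hours_ago
```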
Now that we have a reference file exactly six hours old, the "old UNIX" solution for "delete all files older than six hours" becomes something along the lines of:
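Roughly, with a placeholder path:

```sh
find /path/to/files -type f ! -newer /tmp/six_hours_ago -exec rm -f {} \;
```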
It might also be a good idea to clean up our reference file.
Find And Delete Oldest File If There Are More Than X Files In A Directory In Linux
I have many movies on my hard drive, and I have stored them in different folders based on the movie genre. Now, I want to keep only a particular number of movie files in each directory and delete everything else. More importantly, I want to delete only the oldest files. This way I can maintain a constant number of files in each folder. Since I have so many files scattered across many folders, it is quite a time-consuming process to go to each folder, search for the oldest files and manually delete them one by one. While looking for an easy way to do this, I found the following solution. Read on. It’s not that difficult.
Find and delete oldest file in a directory in Linux
Let us say you wanted to find and delete the oldest files if there are more than 10 files in a directory. How would you do it? It’s very simple.
Take the following directory named ostechnix as an example. Let us check how many files are in this directory using this command:
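Presumably something like:

```sh
ls ostechnix/ | wc -l
```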
Or cd into that directory and run:
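That is, something like:

```sh
ls | wc -l
```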
Sample output:
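Given the example directory discussed here, the count is:

```
33
```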
As you see in the above example, the directory ostechnix contains 33 files. I don’t want 33 files in this directory. I want to delete the oldest files and keep only 10.
Now, let us find and delete the oldest file(s) in this directory if it contains more than 10 files. To do so, go to that directory:
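For the example directory above:

```sh
cd ostechnix/
```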
And, run the following command:
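Reconstructed from the explanation of each part below:

```sh
ls -1t | tail -n +11 | xargs rm -f
```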
- ls : List directory contents.
- -1t : 1 (number one) indicates that the output of ls should be one file per line. t indicates sorting the contents by modification time, newest first.
- tail : Output the last part of files.
- -n +11 : output starting with line 11, i.e. skip the 10 newest files and print everything from the 11th line onward.
- xargs : Build and execute command lines from standard input.
- rm -f : Remove files or directories. f indicates ignore nonexistent files and arguments, never prompt. It means this command won’t display any error messages if there are fewer than 10 files.
- | : a pipe. A pipeline is a sequence of one or more commands separated by the control operator | (or |&); the output of each command becomes the input of the next.
So, the above command will delete the oldest files if there are more than 10 files in the current working directory. To verify how many files are in the directory after deleting the oldest file(s), just run:
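That is, the same count command as before:

```sh
ls | wc -l
```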
Update:
If the filenames contain spaces, the above command will not work, because xargs treats whitespace characters (tabs, spaces, newlines) as delimiters. In that case, you can restrict the delimiter to the newline character ( ‘\n’ ) with the -d option, like below:
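That is, something like:

```sh
ls -1t | tail -n +11 | xargs -d '\n' rm -f
```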
How to Delete Files Older than 30 days in Linux
It is good practice to remove old, unused files from your server. For example, if you run daily or hourly backups of files or a database on the server, a lot of junk accumulates there over time, so clean it up regularly. To do that, you can find older files in the backup directory and remove them.
This article describes how to find and delete files older than 30 days. Here, "older than 30 days" means the last modification date is more than 30 days ago.
1. Delete Files Older Than 30 Days
You can use the find command to search for all files modified more than X days ago, and also delete them, if required, in a single command.
First of all, list all files older than 30 days under the /opt/backup directory.
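The listing command would be along these lines:

```sh
find /opt/backup -type f -mtime +30
```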
Verify the file list and make sure no useful file is listed by the above command. Once confirmed, you can delete those files with the following command.
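That is, the same expression with -delete appended:

```sh
find /opt/backup -type f -mtime +30 -delete
```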
2. Delete Files with Specific Extension
Instead of deleting all files, you can also add more filters to the find command. For example, suppose you only need to delete files with the “.log” extension that were modified more than 30 days ago.
To be on the safe side, first do a dry run and list the files matching the criteria.
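For example, assuming the same /opt/backup directory as above:

```sh
find /opt/backup -name "*.log" -type f -mtime +30
```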
Once the list is verified, delete those files by running the following command:
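Again with -delete appended:

```sh
find /opt/backup -name "*.log" -type f -mtime +30 -delete
```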
The above command will delete only files that have the .log extension and a last modification date older than 30 days.
3. Delete Old Directory Recursively
Using the -delete option may fail if the directory is not empty. In that case we will use the Linux rm command together with the find command.
The command below will search for all directories under /var/log that were modified more than 90 days ago.
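Along these lines:

```sh
find /var/log -type d -mtime +90
```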
Here we can execute the rm command using find’s -exec command line option, so each matching directory is passed to rm as an argument.
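A sketch of the combined command; note that rm needs -r (and usually -f) to remove directories, so use it with care:

```sh
find /var/log -type d -mtime +90 -exec rm -rf {} \;
```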
Conclusion
In this tutorial, you have learned how to search for and delete files modified more than a specific number of days ago on the Linux command line.
Bash, find and delete old files
I am writing a script to back up files and delete old backups:
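Based on the variables and options discussed in the answer below, the script was roughly of this shape (the paths are placeholders):

```sh
#!/bin/sh
BACKUPDIR=/path/to/backups           # placeholder
SOURCEDIR=/path/to/data              # placeholder
ROTATE=1                             # days of backups to keep
CURRENT=$BACKUPDIR/backup-$(date +%Y%m%d).tar.gz

tar czf $CURRENT $SOURCEDIR
chmod 600 $CURRENT

# remove old backups
find $BACKUPDIR -mtime $ROTATE -exec rm {} \;
```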
Command Line (for testing):
Somehow, the script does not work. If I execute the command in the console, everything is OK and old files get deleted. But if I run the script, files are not deleted (the tar archive does get created when I run the script). Why?
Script permissions are 755.
1 Answer
As suggested by Fiximan, you probably want to use the -delete option. Although that should make no difference in your situation, if you were to hit a filename with spaces or other special characters, your script would fail.
ROTATE=1 does not seem to make much sense; you probably want more than 1 backup, just in case. You often notice something’s wrong a few days in; good luck if you only have a backup from last night!
chmod 600 $CURRENT should be done as soon as possible; if you are really concerned about the security of the file, do it before creating the tarball (as shown in my sample below).
Fix the find by adding a + in front of the $ROTATE number. This is something that gets me each time, so don’t feel bad. With a +, the test means "older than or equal to"; with a -, the test is inverted (so -mtime +7 is more or less equivalent to ! -mtime -7, give or take the one day in between, which is not unlikely to match one side or the other). With a plain number as you used ( -mtime 1 ), it will only delete files that were modified on that particular day. If the script does not run for 3 days, then those 3 days’ worth of files will never get deleted.
It is easier to use -delete so you do not have to think about the quoting you missed in your sample code (i.e. -exec rm "{}" \; ) in case the filename includes special characters.
I suggest you add a -name filter, because you have a simple way to know whether the file in question is a backup. This is just for safety. If you never ever put any other file in that directory (like a copy of a backup you want to keep for a longer period of time), then you do not need it.
Adding the -e option in the hash bang (#!/bin/sh -e) is a good idea so the script stops on the very first error. At times scripts run, generate errors, and you never see them.
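Putting the suggestions together, a corrected sketch might look like this (the paths, the backup name pattern, and the 7-day rotation are assumptions):

```sh
#!/bin/sh -e
BACKUPDIR=/path/to/backups           # placeholder
SOURCEDIR=/path/to/data              # placeholder
ROTATE=7                             # keep a week of backups instead of just one
CURRENT="$BACKUPDIR/backup-$(date +%Y%m%d).tar.gz"

# restrict permissions before any data is written to the file
touch "$CURRENT"
chmod 600 "$CURRENT"
tar czf "$CURRENT" "$SOURCEDIR"

# only remove files that look like backups, and note the + in front of $ROTATE
find "$BACKUPDIR" -name 'backup-*.tar.gz' -mtime +"$ROTATE" -delete
```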