(thanks to Matt) to prevent file collisions from same-named files. Thanks, Matt! A bashrc version and a bash script are both available.
Warning: I am not currently using this script. It works well in most cases, but I discovered it has problems when compiling. During a compile, some files get force-removed (the
-rf options) in a way that
mv will not reproduce. When this happens, files don't get removed and compile errors can occur. I am still trying to figure out how to handle this.
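My guess (not verified) at the root cause: rm -f silently succeeds when its target doesn't exist, while mv treats a missing source as an error, which can break build scripts that clean up files defensively:

```shell
# rm -f tolerates a missing file and still exits 0
rm -f /tmp/no-such-file-here
echo "rm -f exit status: $?"    # 0: success even though nothing was removed

# mv reports a missing source as an error
mv -t /tmp/ /tmp/no-such-file-here 2>/dev/null
echo "mv exit status: $?"       # non-zero
```

A build system that runs cleanup steps like rm -f under set -e would abort on the aliased mv where the real rm would have carried on.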
I’ve deleted files before that I wished I could get back and resorted to file-recovery utilities to try to save them, but I had never done this before:
rm -rf ~/ .Arachnophilia
I was doing a bit of spring (fall?) cleaning and, as you may have guessed, the space before
.Arachnophilia was not intended. An accidental slip of the thumb caused it, and yes, it deleted my whole home directory! I remembered a post I had read and quickly ran
sudo init 1 to drop into single-user mode. That post was about recovering a text file, though, and single-user mode didn’t accept any input anyhow, so I rebooted into my beloved install disk.
R! and R? (Request!! and Recover?)
Luckily I had backed up several days earlier. I wrote down the files I had been working on since then to try to recover them. The Arch Wiki has a good page on this. I tried extundelete first, and while it’s probably the best and most thorough way to recover all files, it recovers them like this:
ls . | head -n 2
Since the files I’d been working on were just text files and scripts, PhotoRec was a better fit. PhotoRec is able to analyze and identify certain file types, including text files and scripts.
Afterward, I was left with directories and files that looked like
recup_dir/f1029394.txt: some 14,000 text files, if I remember right. To find which ones were the right ones,
grep is awesome, and quick too. I just had to remember some text within a file and search for it like this:
grep -rn --include="*.txt" "the text here" .
Using grep alone, instead of combining it with a program like
find, is much quicker. The
-r does a recursive search, the
-n prints the line number, and
--include restricts the search to a file type, .txt in this case. Piping grep works well for more refined searches, like matching two patterns on the same line or excluding lines that contain a string:
grep -rn --include="*.txt" "the text here" . | grep -n "this too"
grep -rn --include="*.txt" "the text here" . | grep -v "string not with this"
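For instance, with a hypothetical pair of recovered files (made-up names and contents for illustration), the second pipe keeps only the lines without the excluded string:

```shell
# set up a throwaway directory with sample "recovered" text files
dir=$(mktemp -d)
printf 'meeting notes draft\nmeeting notes final\n' > "$dir/f0000001.txt"
printf 'shopping list\n' > "$dir/f0000002.txt"

# find lines mentioning "meeting notes" but exclude the drafts
grep -rn --include="*.txt" "meeting notes" "$dir" | grep -v "draft"
# prints only the "meeting notes final" line, with its file and line number
```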
I found the files I needed, thankfully, and was able to get them back.
The first thing I did after restoring from my backup was to figure out how to prevent this from happening again. The answer was pretty simple: move files to the trash instead of deleting them. By adding:
alias rm="mv -t ~/.local/share/Trash/files --backup=t --verbose"
to ~/.bashrc, files will get moved to the trash instead of deleted. The
~/.local/share/Trash/files location is defined by the freedesktop.org specification, so it interoperates with GNOME and other desktops. With the
--backup=t option, mv will rename files with duplicate names by appending a number (e.g.
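A quick way to see the numbered backups in action (a sketch using throwaway directories as stand-ins for the real trash and home, GNU mv assumed):

```shell
# simulate the trash and a working directory with temporary stand-ins
trash=$(mktemp -d)
work=$(mktemp -d)

echo "first version" > "$work/note.txt"
mv -t "$trash" --backup=t "$work/note.txt"   # trash now holds note.txt

echo "second version" > "$work/note.txt"
mv -t "$trash" --backup=t "$work/note.txt"   # old copy renamed to note.txt.~1~

ls "$trash"
```

Nothing gets clobbered: the newest copy keeps its original name, and earlier same-named files survive with .~1~, .~2~, and so on appended.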
Here’s a more detailed version as a bash script that includes feedback: