A more desirable rm

Update: Added the mv options -t and --backup=t (thanks, Matt!) to prevent collisions between same-named files. A bashrc version and a bash script are both available.
Warning: I am not currently using this script. It works well in most cases, but I discovered it has problems when compiling. During a compile, some files get force-removed (the rm -f, -fr, or -rf options) in ways that mv will not reproduce; when that happens, files don't get removed and compile errors can occur. I am still trying to figure out how to handle this.

I’ve deleted files before that I wished I could get back, and resorted to file-recovery utilities to try to save them, but I had never done this before:

rm -rf ~/ .Arachnophilia

I was doing a bit of spring (fall?) cleaning, and as you may have guessed, the space between ~/ and .Arachnophilia was not intended. An accidental slip of the thumb caused it, and yes, it deleted my whole home directory! I remembered a post I had read and quickly ran sudo init 1 to drop into single-user mode. That post was about recovering a text file, though, and single-user mode didn’t accept any input anyhow, so I rebooted into my beloved install disk.

R! and R? (Request!! and Recover?)

Luckily I had backed up several days earlier. I wrote down the files I had been working on since then so I could try to recover them. The Arch Wiki has a good page on this. I tried extundelete first, and while it’s probably the best and most thorough way to recover all files, it recovers them like this:

ls . | head -n 2
010392
010394

Since the files I’d been working on were just text files and scripts, PhotoRec was a better fit. PhotoRec can analyze and identify certain file types, including text files and scripts.

Afterward, I was left with directories and files that looked like recup_dir/f1029394.txt — 14,000-some text files, if I remember right. Being able to find the right ones with grep was awesome, and quick too. I just had to remember some text within a file and search for it like this:

grep -rn --include=*.txt "the text here" .

Using grep alone, instead of combining it with a program like find, is much quicker. The -r does a recursive search, -n prints the line number, and --include restricts the search to a file pattern (here, *.txt). Piping grep works well for more refined searches, like matching two patterns on a line or excluding a string:

grep -rn --include=*.txt "the text here" . | grep -n "this too"
grep -rn --include=*.txt "the text here" . | grep -v "string not with this"
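Once a pattern reliably identifies the files you want, grep’s -l option (print only the names of matching files) can feed a copy step to pull the matches out of the recovery tree. A rough sketch — the ~/recovered destination directory and the pipeline itself are my own illustration, not part of the original workflow:

```shell
# Copy every recovered .txt file containing the pattern into one directory.
# -r: recurse; -l: print only the names of matching files.
mkdir -p ~/recovered          # hypothetical destination directory
grep -rl --include='*.txt' "the text here" . \
  | xargs -d '\n' cp -t ~/recovered
```

Note that -d '\n' and cp -t are GNU extensions; they keep filenames with spaces intact without needing find -print0.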

I found the files I needed, thankfully, and was able to get them back.

~/Never\ Again\?/

The first thing I did after restoring from my backup was to see whether I could prevent this from happening again. The answer was pretty simple: move files to the trash instead of deleting them. By adding:

alias rm="mv -t ~/.local/share/Trash/files --backup=t --verbose"

to ~/.bashrc, files get moved to the trash instead of deleted. The ~/.local/share/Trash/files location is defined by the freedesktop.org Trash specification, so it interoperates with GNOME and other desktops. With the -t and --backup=t options, mv renames duplicate file names by appending numbers (e.g. abc.~2~).
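To see those numbered backups in action, here is a quick sandbox run that uses a stand-in directory for the trash (mktemp paths, so nothing real is touched):

```shell
# Trash two different files that share the same name.
trash=$(mktemp -d)              # stand-in for ~/.local/share/Trash/files
work=$(mktemp -d) && cd "$work"
echo first  > abc
mv -t "$trash" --backup=t abc   # first abc lands as-is
echo second > abc
mv -t "$trash" --backup=t abc   # existing abc is renamed to abc.~1~
ls "$trash"                     # abc  abc.~1~
```

So nothing is silently overwritten: the newest trashed copy keeps the plain name and older ones get the numbered suffixes.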

Here is a more detailed version as a bash script, with feedback:

#!/bin/bash
# rm replacement for regular users: moves files to the trash rather than
# deleting them directly.
reg_rm=/bin/rm

# Display usage if no parameters are given
if (( $# == 0 )); then
    echo " ${0##*/} <file, folder, link...> - rm replacement: move to trash rather than delete"
    exit
fi

# Text color variables
txtund=$(tput sgr 0 1)          # Underline
txtbld=$(tput bold)             # Bold
bldred=${txtbld}$(tput setaf 1) # red
bldblu=${txtbld}$(tput setaf 4) # blue
bldwht=${txtbld}$(tput setaf 7) # white
txtrst=$(tput sgr0)             # Reset
info=${bldwht}*${txtrst}        # Feedback
pass=${bldblu}*${txtrst}
warn=${bldred}*${txtrst}
ques=${bldblu}?${txtrst}

# Run the regular 'rm' if root.
if [[ $(whoami) == root ]]; then
    $reg_rm "$@"
    exit
fi

# Ignore leading options (e.g. -f, -rf) rather than trying to trash them.
while [[ "$1" == -* ]]; do
    echo " Ignored: ${bldred}${1}${txtrst}"
    shift
done

# Trash each file independently, giving feedback.
for e in "$@"; do
    if [[ ! -e "$e" && ! -L "$e" ]]; then
        echo " Missing: ${bldred}${e}${txtrst}"
        continue
    fi
    mv -t ~/.local/share/Trash/files --backup=t "$e"
    echo " Trashed: ${bldblu}${e}${txtrst}"
done

12 thoughts on “A more desirable rm”

  1. karol

    Is there a way to get back file A after
    $ mv B A
    or do I have to use my backup? (Let's forget about 'alias mv="mv -i"' and the like for a moment.)

  2. Todd Partridge (Gen2ly) Post author

    Yeah, that is a good question. I’d think that the ext filesystem (which I’m guessing you are using) would decide where to put the file. ext puts files (by default, I’ve heard) in the nearest contiguous space toward the beginning of the hard disk, so physically it likely still exists on the drive. I’d like to be 100% certain, though; if you try Parted Magic (or another tool), I’d be interested in hearing the results.

  3. Fernando Canizo

    Sad we’re still struggling with this kind of stuff. This should have been solved at file system (fs) level looooooong ago. That was the first thing I missed from my switch to gnulinux years ago. An automatic fs level trash can should be provided that also allowed for good fs performance.

    Another gripe I carry from those times is the unimportance file systems give to creation time. We have access times, file-status-change times, and file-modification times, but there’s no way to know the creation time of a living file. Even a simple ‘cp a b’ will lose the creation time forever.

  4. matclab

    A good practice is to write :

    rm files

    Then check that you’re OK with the result, and only then add “-rf”:

    rm files -rf

    Putting -rf at the end also limits the damage when you accidentally hit “enter” before the end of your line (“rm ~/.xxx -rf” is much less dangerous than “rm -rf ~/.xxx”).

  5. Todd Partridge (Gen2ly) Post author

    irtigor :

    gvfs-trash can be a better option.

    Thought I’d try this, but I’m sticking with what I have. It looks like gvfs-trash isn’t following the freedesktop standard:

    # touch abc
    # gvfs-trash abc 
    Error trashing file: Unable to find or create trash directory

    Looks like gvfs-trash uses its own specification. According to this post:

    https://bbs.archlinux.org/viewtopic.php?id=103194

    You have to create a .Trash-1000 directory in your home directory for this to work.

  6. Todd Partridge (Gen2ly) Post author

    Well, after having it installed for a while, it’s now working. For Ubuntu:

    sudo apt-get install gvfs-bin

    Files are getting moved to ~/.local/share/Trash/files:

    pavilion ~/.local/share/Trash/files:
    ls -1X ab*
    abc.2.txt
    abc.3.txt
    abc.txt

    A simple line in the ~/.bashrc does the job:

    alias rm="gvfs-trash"

    Not sure what got it working all of a sudden.

  7. Harvey Kelly

    Thanks for that, I’ll bookmark this page. About 8 or 9 years ago I thought I was in /floppy and issued the ‘command of doom’:

    rm -rf *

    Instead, I was in /home/harvey/ and I’ll never forget the feeling as a minute passed and I had absolutely no idea what to do… backups? Pah!

  8. David Sure

    Sorry for being late to the party, but still:
    I used to use the ‘mv’ approach as well, but that discards the benefits of the freedesktop implementation, which remembers the deletion date and where the deleted file came from. I wouldn’t like the gvfs-trash version, as I wouldn’t want dependencies on GNOME just to use the command line (also, I’m on KDE). However, there’s a desktop-agnostic implementation of the standard that I’ve been using happily. It’s available in Ubuntu/Debian through the package trash-cli; the same goes for Arch via the AUR (https://aur.archlinux.org/packages.php?ID=19076).

