A More Desirable rm

Update: Added the mv options -t and --backup=t to prevent collisions between same-named files (thanks, Matt!). A bashrc version and a bash script are both available.

Warning: I am not currently using this script. It works well in most cases, but I discovered it has problems when compiling. During a compile, some files get force-removed (the rm -f, -fr, or -rf options) in a way that mv apparently will not mimic; when that happens, files don’t get removed and compile errors can occur. I am still trying to figure out how to handle this.

I’ve deleted files before that I wished I could get back, and I’ve resorted to file-recovery utilities to try to save them, but I had never done this before:

rm -rf ~/ .Arachnophilia

I was doing a bit of spring (fall?) cleaning, and as you may have guessed, the space between ~/ and .Arachnophilia was not intended. An accidental slip of the thumb caused it, and yes, it deleted my whole home directory! I remembered a post I had read at cyberciti, so I quickly ran sudo init 1 to drop into single-user mode. That post was about recovering a text file, though, and single-user mode didn’t accept any input anyway, so I rebooted into my beloved Parted Magic.

R! and R? (Request!! and Recover?)

Parted Magic now includes Clonezilla once again, and luckily I had backed up several days earlier. I wrote down the files I had been working on since then so I could try to recover them. The Arch Wiki has a good page on this. I tried extundelete first, and while it’s probably the best and most thorough way to recover all files, it recovers them with names like this:

ls . | head -n 2
010392
010394

Since the files I’d been working on were just text files and scripts, Photorec was a better fit: it can analyze files and identify certain types, including text files and script files.

Afterward, I was left with directories and files that looked like recup_dir/f1029394.txt (some 14,000 text files, if I remember right). Finding the correct ones with grep was easy and quick; I just had to remember some text within a file and search for it like this:

grep -rn --include=*.txt "the text here" .

Using grep alone, instead of combining it with a program like find, is much quicker. The -r option does a recursive search, -n prints the line number, and --include restricts the search to a file pattern, in this case text files. Piping grep into grep works well for more refined searches, such as requiring two patterns on the same line or excluding lines containing a certain string:

grep -rn --include=*.txt "the text here" . | grep -n "this too"
grep -rn --include=*.txt "the text here" . | grep -v "string not with this"

I found the files I needed, thankfully, and was able to get them back.

~/Never\ Again\?/

First thing I did after restoring from my backup was to see if I could prevent this again. The answer was pretty simple: move the files to the trash instead. By adding:

alias rm="mv -t ~/.local/share/Trash/files --backup=t --verbose"

to ~/.bashrc, files get moved to the trash instead of deleted. The ~/.local/share/Trash/files location is defined by freedesktop.org, so it interoperates with both KDE and GNOME. With the -t and --backup=t options, mv renames files with duplicate names by appending numbers (e.g. abc.~2~).
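The collision handling is easy to verify in a scratch directory; everything below is a throwaway example, not part of the real trash setup:

```shell
# Sketch: demonstrate how mv -t --backup=t handles name collisions.
demo=$(mktemp -d)                 # throwaway directory standing in for the trash
mkdir "$demo/files"
cd "$demo"

echo first > abc.txt
mv -t files --backup=t abc.txt    # first "delete": lands as files/abc.txt

echo second > abc.txt
mv -t files --backup=t abc.txt    # collision: old copy is renamed files/abc.txt.~1~

ls files                          # abc.txt  abc.txt.~1~
```

Note that --backup=t numbers the *existing* file in the trash, so the most recently deleted copy always keeps the plain name.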

Here’s a more detailed version as a bash script that includes feedback:
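The linked script isn’t reproduced in this copy of the post, but a minimal sketch of the idea might look like the following. Only the trash path and the mv options come from the post (plus the root check mentioned in a later comment); the `trash` function name and the flag handling are my reconstruction:

```shell
#!/bin/bash
# Sketch of a "trash instead of delete" script. The trash path and mv options
# come from the post; the function name, flag skipping, and error message
# are reconstructions of what such a script might do.

trash() {
    local dest="$HOME/.local/share/Trash/files"   # freedesktop trash location

    # When root, defer to the real rm (as the later update in the comments
    # describes) rather than filling root's trash.
    if [ "$(id -u)" -eq 0 ]; then
        /bin/rm "$@"
        return
    fi

    mkdir -p "$dest"

    local f
    for f in "$@"; do
        case "$f" in -*) continue ;; esac         # skip rm-style flags like -rf
        if [ -e "$f" ] || [ -L "$f" ]; then
            mv -t "$dest" --backup=t --verbose -- "$f"
        else
            echo "trash: cannot remove '$f': No such file or directory" >&2
        fi
    done
}

trash "$@"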


About Todd Partridge (Gently)

Good times, good people, good fun.

Posted on 2011-09-27, in Command Line, Linux. Bookmark the permalink. 12 Comments.

  1. Is there a way to get back file A after
    $ mv B A
    or do I have to use my backup? (let's forget about 'alias mv="mv -i"' and the like for a moment)

  2. Yeah, that is a good question. I’d think the ext filesystem (which I’m guessing you are using) decides where to put the file. By default, I’ve heard, ext puts a file in the nearest contiguous space toward the beginning of the hard disk, so I’d bet the data likely still exists physically on the drive. I’d like to be 100% certain, though; if you try Parted Magic (or another tool), I’d be interested in hearing the results.

  3. There’s also this:

    https://code.google.com/p/trash-cli/

    Used to use this a few years back. Grew more confident with rm, but seeing your post, I think I’ll start using this again.

  4. Sad we’re still struggling with this kind of stuff. This should have been solved at file system (fs) level looooooong ago. That was the first thing I missed from my switch to gnulinux years ago. An automatic fs level trash can should be provided that also allowed for good fs performance.

    Another gripe I carry from those times is the unimportance filesystems give to creation time. We have access times, file-status-change times, and file-modification times, but there’s no way to know the creation time of a living file. Even a simple ‘cp a b’ will lose the creation time forever.

  5. gvfs-trash can be a better option.

  6. A good practice is to write:

    rm files

    Then check you’re OK with that, and only then add “-rf”:

    rm files -rf

    Putting -rf at the end also limits the problems when you accidentally hit “enter” before the end of your line (then “rm ~/.xxx -rf” is much less dangerous than “rm -rf ~/.xxx”).

  7. irtigor :

    gvfs-trash can be a better option.

    Thought I’d try this but am sticking to what I have. Looks like gvfs-trash isn’t sticking to the freedesktop standard:

    # touch abc
    # gvfs-trash abc 
    Error trashing file: Unable to find or create trash directory

    Looks like gvfs-trash uses its own specification. According to this post:

    https://bbs.archlinux.org/viewtopic.php?id=103194

    You have to create a .Trash-1000 directory in your home directory for this to work.

  8. Well, after having it installed for a while, it’s now working. For Ubuntu:

    sudo apt-get install gvfs-bin

    Files are getting moved to ~/.local/share/Trash/files:

    pavilion ~/.local/share/Trash/files:
    ls -1X ab*
    abc.2.txt
    abc.3.txt
    abc.txt

    A simple line in the ~/.bashrc does the job:

    alias rm="gvfs-trash"

    Not sure what got it working all of a sudden.

  9. Thanks for that, I’ll bookmark this page. About 8 or 9 years ago I thought I was in /floppy and issued the ‘command of doom’:

    rm -rf *

    Instead, I was in /home/harvey/ and I’ll never forget the feeling as a minute passed and I had absolutely no idea what to do… backups? Pah!

  10. Sorry for being late to the party, but still:
    I used to use the ‘mv’ approach as well, but that discards the benefits of the freedesktop implementation, which remembers the delete date and where a deleted file came from. I wouldn’t like the gvfs-trash version, as I wouldn’t want dependencies on GNOME just to use the command line (also, I’m on KDE). However, there’s a desktop-agnostic implementation of the standard that I’ve been using happily. It’s available in Ubuntu/Debian through the package trash-cli; the same goes for Arch via the AUR (https://aur.archlinux.org/packages.php?ID=19076)

  11. Added the script; I’m using it because it’s a bit more thorough.

  12. Edited the script to defer to rm directly when run as root.
