
makepkg Shortcuts Script

This one is not as important as the pacman-related script, but I find I use it often too. I maintain several packages in the AUR, and it comes in handy for quickly running common makepkg tasks. The md5sum function still needs a bit of work (i.e. it requires that the PKGBUILD be arranged so the md5sums can be placed in nicely). Otherwise it’s pretty ready. Here’s what it does:

 mp <option> - common makepkg building tasks
  b - build package (install dependencies if required)
  m - update md5sums
  r - remove previous build directories
  s - create source files for upload, prompts for submission to the AUR
  t - create PKGBUILD template
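The md5sum update can be sketched roughly like this. This is an assumption about how the m task works, based on the caveat above; the function name is illustrative, and the new line would normally come from `makepkg -g` (passed in here so the sketch stands alone):

```shell
#!/bin/bash
# Hypothetical core of the 'm' task: swap the old md5sums line in a
# PKGBUILD for a freshly generated one. Per the caveat above, only a
# single-line md5sums=(...) array is handled.
replace_md5sums() {
  local pkgbuild=$1 newline=$2   # $2: e.g. output of `makepkg -g`
  sed -i "s|^md5sums=.*|$newline|" "$pkgbuild"
}
```

In the script itself the call would presumably look like `replace_md5sums PKGBUILD "$(makepkg -g)"`.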

(ar)ch (pa)ckages – a generic package tasks script for Arch Linux

I once saw a wrapper script for pacman in the forums that was basically a shorthand version of common pacman tasks. I thought this was a good idea, and over the last couple of years I’ve expanded on it. It does just about everything I need it to. It’s really basic and I call it arpa. Here is a basic synopsis:

arpa [option] [*package] - a generic package tasks wrapper script
  -e, --explicit - install a package as explicit
  -g, --get      - get/download package upgrade(s)    : -G get pkg upgrades all
  -i, --install  - install a package                  : -I install as dependency
  -l, --list     - list package files                 : -L list pkgs installed
  -o, --owns     - owning package of a file
  -q, --query    - query for an installed package     : -Q query w/ description
  -r, --remove   - remove a pkg and its deps          : -R force, no argue orphs
  -s, --search   - search for a package               : -S search w/ description
  -u, --upgrade  - upgrade system                     : -U upgrade AUR
  -y, --sync     - sync package db
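For a sense of what such a wrapper boils down to, here is a minimal dispatch sketch. The pacman mappings below are my guesses from the synopsis, not arpa’s actual code:

```shell
#!/bin/bash
# Sketch of an arpa-style dispatcher (guessed mappings, not the real script).
arpa_sketch() {
  case "$1" in
    -s|--search)  shift; pacman -Ss "$@" ;;  # search for a package
    -q|--query)   shift; pacman -Q  "$@" ;;  # query installed packages
    -o|--owns)    shift; pacman -Qo "$@" ;;  # owning package of a file
    -y|--sync)    sudo pacman -Sy ;;         # sync package db
    -u|--upgrade) sudo pacman -Su ;;         # upgrade system
    *) echo "arpa [option] [*package] - a generic package tasks wrapper script" >&2
       return 1 ;;
  esac
}
```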

It’s good for me to have this around so I can remember everything :), and it is in the AUR.

Embedded Scripts in WordPress with GitHub Gist and Update Script

I asked at Stack Overflow recently if I could embed a text file into a webpage. My reason was basic: I wanted my newly created GitHub script repository to be the source for the scripts I post on this blog. If I were able to do this, I reasoned, then the scripts on the blog would be up to date whenever I updated my GitHub repository. Unfortunately, there appears to be no direct way to do this that I could find, so I looked for an alternative and found GitHub Gist. GitHub Gist’s description:

Gist is a simple way to share snippets and pastes with others. All gists are git repositories, so they are automatically versioned, forkable and usable as a git repository.

I was hoping there would be a way to link a script directly, but there isn’t. Basically, the standard process is to visit the GitHub Gist WebUI, paste the script, config, etc., and then post the link on its own line in WordPress.

Because this creates a git repository, it means it can be updated. So I wrote a script that does two things: 1) creates a repository for a file; 2) updates all files listed in the script that have a Gist repository.

Works pretty well; there are a couple of caveats, though. First, Gist does not recognize the interpreter on the first line of a script and instead uses the extension. I tend not to use the .sh extension, but I wanted syntax highlighting, so the scripts on the blog now carry the extension, which I guess isn’t a huge deal. Second, each script must have its own repository, or all the scripts, configs, etc. would be embedded together when put into a post. Not sure if this is a breach of etiquette, but I think I’m OK.

The script requires defunkt’s excellent gist command-line upload tool.

The syntax is such:

 ghsync-gist   - Add or update gist repo(s)
  a - Add gist repo for file(s)
  u - Update all gist repos for all files
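Since every gist is a git repository (as the description above notes), the update half reduces to copy, commit, push. A sketch, with hypothetical names (the actual script and its bookkeeping aren’t shown here):

```shell
#!/bin/bash
# Sketch of updating one file's gist: copy the file into its local gist
# clone, commit, and push. (Illustrative; not the actual ghsync-gist code.)
update_gist() {
  local file=$1 repo=$2   # $2: local clone of the file's gist repository
  cp -- "$file" "$repo/" &&
  git -C "$repo" add -A &&
  git -C "$repo" commit -m "Update $(basename "$file")" &&
  git -C "$repo" push origin HEAD
}
```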

Managing Scripts and Configurations on Github with a script

This post is a follow-up to Michael Smalley’s excellent post on how to manage your dotfiles: Use Git and Github to Manage Your Dotfiles. I wanted a way to regularly have my configurations and scripts updated on Github that didn’t require me remembering how to do it :). So I created a script that would do it for me:

Works pretty well. I then put these in my crontab to have them updated every week.
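The cron entries look something like this (the script names and location are placeholders, not the actual ones):

```
# Hypothetical crontab entries: sync configs and scripts to Github weekly
@weekly $HOME/.scripts/github-sync-configs
@weekly $HOME/.scripts/github-sync-scripts
```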

Sudoers Permissions as a File

I learned that when granting root permissions to certain programs, it is easier and cleaner to use a separate file. sudo must be told in its configuration to look in a separate directory to be able to do so:

sudo visudo

Likely all distributions have this available and it will be listed at the end:

#includedir /etc/sudoers.d/  

The # is necessary. The trailing forward slash also appears to be necessary (I had to add it); without it, files in /etc/sudoers.d/ would not always get recognized.

Here’s my configuration, built from an excellent tutorial in the Ubuntu forums. I usually build these per user, naming them user_<USER>.

# Allowed root permissions of programs for user USER

# Aliases
Host_Alias HOST    = aspire
Cmnd_Alias G9LED   = /usr/bin/g9led
Cmnd_Alias IOTOP   = /usr/bin/iotop
Cmnd_Alias PACKER  = /usr/bin/packer
Cmnd_Alias PACMAN  = /usr/bin/pacman
Cmnd_Alias SANDFOX = /usr/bin/sandfox

# Programs allowed for user or computer
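The grant lines themselves would go under that last comment; for a user USER on the host aspire they would look something like this (sudoers syntax, with NOPASSWD so no password is prompted; this line is my reconstruction, not quoted from the original file):

```
# Hypothetical grant, using the aliases defined above
USER HOST = NOPASSWD: G9LED, IOTOP, PACKER, PACMAN, SANDFOX
```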

The configuration will need the proper permissions:

sudo chown root:root /etc/sudoers.d/user_<USER>
sudo chmod 0440      /etc/sudoers.d/user_<USER>

Allotting iotop

Recently, iotop was changed so that it can only be run as root. iotop is a great program for measuring disk throughput, and I am unable to figure out the logic of why it was made root-only. To run iotop as a regular user again, root permission must be granted to that user for the program. A good way is to create a per-user sudoers file of allowed programs.

Then iotop can be invoked by (without need of a password):

sudo iotop


Power Management from the Command Line

Not so long ago, invoking commands like suspend and hibernate from the command line required root privileges or the desktop environment’s built-in tools. Now suspend, hibernate, shutdown, and restart can be invoked over D-Bus as a regular user. I created a script called pwrman to ease the task (it requires UPower to be installed).

(I got this idea from a person from the Arch Linux forums. I forgot who you are, so sorry, but thank you.)
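The D-Bus calls pwrman wraps look roughly like this. The method names are assumed from UPower’s old D-Bus interface of that era (later UPower versions moved suspend/hibernate elsewhere), and the sketch prints the command rather than running it:

```shell
#!/bin/sh
# pwrman-style sketch: map an action keyword to a UPower D-Bus method.
action_to_method() {
  case "$1" in
    suspend)   echo Suspend ;;
    hibernate) echo Hibernate ;;
    *) echo "usage: pwrman suspend|hibernate" >&2; return 1 ;;
  esac
}

# Build the dbus-send invocation for a given method name. Pipe to sh, or
# drop the printf-as-string approach, to actually invoke it.
upower_cmd() {
  printf 'dbus-send --system --print-reply --dest=org.freedesktop.UPower /org/freedesktop/UPower org.freedesktop.UPower.%s\n' "$1"
}

method=$(action_to_method "${1:-suspend}") && upower_cmd "$method"
```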


The bash shell’s settings file is a really nice thing to have documented in detail, so I looked into it pretty thoroughly. For anyone who would like to use it, it can be found in the AUR.

The ABC’s of creating MP3s

New link, read Fine DAE scripts for more.

A more Desirable rm

Update: Added the mv options -t and --backup=t to prevent file collisions from same-named files (thanks, Matt!). A bashrc version and a bash script are both available.

Warning: I am not currently using this script. It works well in most instances, but I discovered it has problems when compiling. During compiles, some files get force-removed (the rm -f, -fr, or -rf options) in a way that mv will not reproduce. When this happens, files don’t get removed and compile errors can occur. I am still trying to figure out how to handle this.

I’ve deleted files before that I wished I could have back, and resorted to file-recovery utilities to try to save them, but I had never done this before:

rm -rf ~/ .Arachnophilia

I was doing a bit of spring (fall?) cleaning, and as you may have guessed: the space between ~/ and .Arachnophilia was not intended. An accidental slip of the thumb caused this, and yes, it caused my computer to start deleting my whole home directory! I remembered a post I had read at cyberciti, so I quickly did sudo init 1 to drop into single-user mode. The post was for recovering a text file, though, and single-user mode didn’t accept any input anyhow, so I rebooted into my beloved Parted Magic.

R! and R? (Request!! and Recover?)

Parted Magic now has Clonezilla once again, and luckily I had backed up several days earlier. I wrote down the files I had been working on since then to try to recover them. The Arch Wiki has a good page on this. I tried extundelete first, and while it’s probably the best and most thorough way to recover all files, it recovers the files like this:

ls . | head -n 2

Since the files I’ve been working on were just text files and scripts, Photorec was a better application for this. Photorec is able to analyze and determine certain file types including text files and script files.

Afterward, I was left with directories and files that looked like recup_dir/f1029394.txt: 14,000-some text files, if I remember right. To find which ones were the correct ones, grep is awesome, and so quick too. I just had to remember some text within a file and search for it like this:

grep -rn --include=*.txt "the text here" .

Using grep alone instead of combining it with a program like find is much quicker. The -r does a recursive search, the -n prints the line number, and --include specifies the file type, in this case. Piping grep works well for more refined searches, like matching two patterns on a line or excluding a string:

grep -rn --include=*.txt "the text here" . | grep -n "this too"
grep -rn --include=*.txt "the text here" . | grep -v "string not with this"

I found the files I needed, thankfully, and was able to get them back.

~/Never\ Again\?/

First thing I did after restoring from my backup was to see if I could prevent this again. The answer was pretty simple: move the files to the trash instead. By adding:

alias rm="mv -t ~/.local/share/Trash/files --backup=t --verbose"

to the ~/.bashrc, files will get moved to the trash instead of deleted. The ~/.local/share/Trash/files location is defined by the trash specification, so it has interoperability with both KDE and Gnome. With the -t and --backup=t options, mv will rename files with duplicate file names to ones with appended numbers (e.g. abc.~2~).

Here’s a more detailed version in a bash script that includes feedback:
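A minimal sketch of such a script (names are illustrative, not the original; it skips rm-style options so a stray -f or -r isn’t treated as a file):

```shell
#!/bin/bash
# Trash-instead-of-delete rm replacement with feedback (illustrative sketch).
trash_dir="$HOME/.local/share/Trash/files"

trash() {
  mkdir -p "$trash_dir"
  local f
  for f in "$@"; do
    # Ignore rm-style options (-f, -r, -rf, ...) rather than "moving" them
    [[ $f == -* ]] && continue
    if mv -t "$trash_dir" --backup=t -- "$f" 2>/dev/null; then
      echo "Trashed: $f"
    else
      echo "Could not trash: $f" >&2
    fi
  done
}
```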

Setting Up a Scripting Environment

When first learning Linux, I didn’t realize that a lot of it lies beneath the surface. Linux still holds on to its developmental roots, and a good deal of its power can be found directly on the command line. Windows doesn’t have this type of functionality, and though Mac OS X has some of it, few people know about it. When I need to do something powerful or automated in Linux (whether it be switching mouse buttons or launching multiple programs at once), I can often turn to the command line and write a bash script for it. The command line can be very powerful: there are few things that can be done only from a window, and many things that can be done from the command line but not from a window.

Setting up a scripting environment means creating a place to store the scripts, easily getting to them, and executing them like a regular command.

Directory Setup

The first thing I do is set up a directory to place the scripts in. This directory usually belongs in the home folder and is preferably hidden, as it’s not necessary to see it all the time. This may sound inconvenient at first, but since the commands will be run from the terminal, it’s quick to get used to. I like to name the directory ~/.scripts; others follow Linux filesystem convention and use ~/.local/bin (dot files are hidden files and are not shown unless explicitly requested):

mkdir ~/.scripts

The tilde character (~) is a shortcut for the home directory, quicker than typing /home/user. To quickly switch to the script directory, I create a shortcut in the bash configuration file, ~/.bashrc; shortcuts are defined there using aliases. Adding the shortcut:

alias cds="cd ~/.scripts"

cds tells me: change to the directory of scripts. After saving, I re-source the bash configuration file to load the new settings.

source ~/.bashrc

Now typing the shortcut cds will change to the script directory.

Run Scripts Just Like Regular Commands

I create new scripts here or put ones I find here. Creating a script is outside the scope of this post, but once scripts are here they will need to be executable:

chmod +x script-name

To run a script like a regular command, the bash shell needs to be told about the new executable path (~/.scripts). Any time a command is run, bash looks for programs or scripts in the directories listed in its path. The currently known paths can be discovered with:

echo $PATH

To add the script directory to the known paths, it needs to be defined in the ~/.bashrc file. The bash configuration file may already have some paths defined in an export PATH... line. If it does, the script directory can be added to that line. If it doesn’t, I add both the script directory and the current paths ($PATH) to be sure the new path doesn’t override the old:

export PATH="$HOME/.scripts:$PATH"

Different paths are separated by a colon (:) and as many can be added as needed ($HOME is used rather than ~ here because tilde expansion does not happen inside double quotes). Saving and sourcing ~/.bashrc will have the new directory recognized by the bash shell.


  • If you’d like to learn more about copying scripts (or text) from a window and pasting them to a file from the command line, see Command Line to Clipboard.
