Using date and time in Bash scripts

Date and time are useful in scripts. You might want your script to check whether today is Monday, or whether it last ran 2 days ago. Maybe you need to save a file with the current date and time in its name. Here are some variables that I like to put in my Bash scripts:

[bash]
TODAY=$(date | awk '{ print $1 }')      # short day name, e.g. Mon
TODAY=$(date '+%A')                     # full day name, e.g. Monday
DAYOFWEEK=$(date +%u)                   # day of week as a number, 1..7 (Monday is 1)
MONTH=$(date | awk '{ print $2 }')      # short month name, e.g. Feb
MONTHNAME=$(date +%b --date '0 month')  # short month name from date itself
DAYINMONTH=$(date | awk '{ print $3 }') # day of month, e.g. 7
YEAR=$(date | awk '{ print $6 }')       # year, e.g. 2014
WEEKNUMBER=$(date +%V)                  # ISO week number
[/bash]
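
For example, to save a copy of a file with the current date and time in its name (the filename mydata.txt is just an illustration):

[bash]
cp mydata.txt "mydata_$(date +%Y-%m-%d_%H%M%S).txt"  # gives e.g. mydata_2014-02-07_111333.txt
[/bash]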

[bash]
# Date to unix timestamp
date2stamp () {
    date --utc --date "$1" +%s
}
# Unix timestamp to date
stamp2date () {
    date --utc --date "1970-01-01 $1 sec" "+%Y-%m-%d %T"
}
# Difference between two dates, in seconds/minutes/hours/days
dateDiff () {
    case $1 in
        -s) sec=1; shift;;
        -m) sec=60; shift;;
        -h) sec=3600; shift;;
        -d) sec=86400; shift;;
        *)  sec=86400;;
    esac
    dte1=$(date2stamp "$1")
    dte2=$(date2stamp "$2")
    diffSec=$((dte2 - dte1))
    if ((diffSec < 0)); then abs=-1; else abs=1; fi
    echo $((diffSec / sec * abs))
}
[/bash]
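
A quick check of the functions (the dates are arbitrary examples):

[bash]
date2stamp "2014-02-07"                            # 1391731200
stamp2date 1391731200                              # 2014-02-07 00:00:00
dateDiff -d "2014-01-01" "2014-02-07"              # 37
dateDiff -h "2014-02-07 00:00" "2014-02-07 12:00"  # 12
[/bash]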

Here is some useful reading:
http://www.cyberciti.biz/faq/linux-unix-formatting-dates-for-display/

PostgreSQL 9.1 backup dump Bash script

We wrote an improved PostgreSQL dump Bash script for PostgreSQL version 9.1. It saves each dump file with the database name and the day of the week in the filename: daily_database_name_DAYNAME.sql.bz2.
This way we only keep 7 daily backups at any time, because each file is overwritten after seven days. Since our backup system (TSM) keeps 7 versions of each file, we have 49 versions at any time. That means we can go up to 49 days back in time to restore a certain dump file. In addition, the script saves a monthly dump file named monthly_database_name_MONTHNAME.sql.bz2.
You can also set the variable TESTSYSTEM to 'yes'; the script will then dump to fixed filenames without any date information, e.g. daily_database_name.sql.bz2. This can be useful on test servers, where you might not be interested in a backup history.
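
With the default setting, the dump files for a database called mydb (a made-up example name) would end up like this:

[bash]
/var/backup/postgresql_dumps/mydb/daily/daily_mydb_Mon.sql.bz2
/var/backup/postgresql_dumps/mydb/daily/daily_mydb_Mon_copy.sql.bz2
...
/var/backup/postgresql_dumps/mydb/daily/daily_mydb_Sun_copy.sql.bz2
/var/backup/postgresql_dumps/mydb/monthly/monthly_mydb_Feb.sql.bz2
/var/backup/postgresql_dumps/all-databases_Mon.sql.bz2
[/bash]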

Here it is:

[bash]
#!/bin/bash

## This script dumps all the databases in a PostgreSQL 9.1 server on localhost.
## Two dump files are made for each database: one with INSERT statements, one without (plain COPY).

## TODO
# - implement an 'if system is Test' option to minimize the number of dump files UNDER PROGRESS
# - use functions instead?
# - some kind of integration with Jenkins?
# - fix the 2 strange '|' that appear in the DATABASE list FIXED?
# - add a timer so we can optimize the script's execution time
# - enable use of the logfile LOGFILE. Could be nice to log what this script is/has been doing and when.
# - the number of days to keep a dump file could be a parameter to this script
# - enable printing of the script's name and where it runs (hostname and directory). Makes it easy to find the script on a server
# - would be nice to add an incremental feature to this script. Then one could dump several times a day without worrying about disk space DIFFICULT?
## TODO END

# Timer
start_time=$(date +%s)

# Variables
LOGFILE="/var/lib/pgsql/9.1/data/pg_log/pgsql_dump.log"
BACKUP_DIR="/var/backup/postgresql_dumps"
BACKUP_DIR2="var/backup/postgresql_dumps" # same path without the leading slash, for tar -C / (gosh..)
HOSTNAME=`hostname`
MAILLIST="someone att somewhere" # should be edited
# Is this a test system? Set TESTSYSTEM to 'yes' in order to remove date and time information from dumpfile names (in order to minimize number of dumpfiles).
TESTSYSTEM="no"
TODAY=$(date | awk '{ print $1 }')      # short day name, e.g. Mon
MONTH=$(date | awk '{ print $2 }')      # short month name, e.g. Feb
MONTHNAME=`date +%b --date '0 month'`   # short month name from date itself
DAYINMONTH=$(date | awk '{ print $3 }') # day of month, e.g. 7
YEAR=$(date | awk '{ print $6 }')       # year, e.g. 2014

# Only postgres can run this script
if [ `whoami` != "postgres" ]; then
    echo "pgsql_dump tried to run, but user is not postgres!" >> $LOGFILE
    echo "You are not postgres, can not run."
    echo "Try: su -c ./pgsql_dump.sh postgres"
    exit 1
fi

# Check if there are any recent backup files. If not, something is wrong!
if [ `find $BACKUP_DIR -type f -name '*.sql.bz2' -mtime -2 | wc -l` -eq 0 ]; then
    echo "There are no pgsql dumps for the last 2 days at $HOSTNAME. Something is wrong!" | mail -s "[PGSQLDUMP ERROR] $HOSTNAME" $MAILLIST
fi

# A logfile might be nice to have (or maybe Jenkins is the way to go?)
if [ ! -e $LOGFILE ]; then
    touch $LOGFILE
fi

if [ "$TESTSYSTEM" == "yes" ]; then
    # For testing purposes, use a fixed database list
    DATABASES="database-1
database-2"
else
    DATABASES=`psql -q -c "\l" | sed -n 4,/\eof/p | grep -v rows | grep -v template0 | awk '{ print $1 }' | sed 's/^://g' | sed -e '/^$/d' | grep -v '|'`
fi

for i in $DATABASES; do

    ## Create folders for each database if they don't exist
    if [ ! -d "$BACKUP_DIR/$i/" ]; then
        mkdir $BACKUP_DIR/$i
    fi
    if [ ! -d "$BACKUP_DIR/$i/daily" ]; then
        mkdir $BACKUP_DIR/$i/daily
    fi
    if [ ! -d "$BACKUP_DIR/$i/monthly" ]; then
        mkdir $BACKUP_DIR/$i/monthly
    fi

    # On test servers we don't want dump files with date and time information
    if [ "$TESTSYSTEM" == "yes" ]; then
        DAILYFILENAME="daily_$i"
        MONTHLYFILENAME="monthly_$i"
        ALLDATABASESFILENAME="all-databases"
    else
        # note the ${i}: plain $i_ would be read as a variable named 'i_'
        DAILYFILENAME="daily_${i}_$TODAY"
        MONTHLYFILENAME="monthly_${i}_$MONTHNAME"
        ALLDATABASESFILENAME="all-databases_$TODAY"
    fi

    # backup for each weekday (Mon, Tue, ...), with INSERT statements
    nice -n 10 /usr/pgsql-9.1/bin/pg_dump --column-inserts $i > $BACKUP_DIR/$i/daily/"$DAILYFILENAME".sql
    nice -n 10 tar cjf $BACKUP_DIR/$i/daily/"$DAILYFILENAME".sql.bz2 -C / $BACKUP_DIR2/$i/daily/"$DAILYFILENAME".sql
    rm -f $BACKUP_DIR/$i/daily/"$DAILYFILENAME".sql

    # dump with COPY statements
    nice -n 10 /usr/pgsql-9.1/bin/pg_dump $i > $BACKUP_DIR/$i/daily/"$DAILYFILENAME"_copy.sql
    nice -n 10 tar cjf $BACKUP_DIR/$i/daily/"$DAILYFILENAME"_copy.sql.bz2 -C / $BACKUP_DIR2/$i/daily/"$DAILYFILENAME"_copy.sql
    rm -f $BACKUP_DIR/$i/daily/"$DAILYFILENAME"_copy.sql

    # monthly backup (Jan, Feb, ...), taken on day 10 of the month
    if [ "$DAYINMONTH" -eq 10 ]; then
        cp -f $BACKUP_DIR/$i/daily/"$DAILYFILENAME".sql.bz2 $BACKUP_DIR/$i/monthly/"$MONTHLYFILENAME".sql.bz2
        cp -f $BACKUP_DIR/$i/daily/"$DAILYFILENAME"_copy.sql.bz2 $BACKUP_DIR/$i/monthly/"$MONTHLYFILENAME"_copy.sql.bz2
    fi

    # Year backup
    # coming after a while

done

## Full backup
nice -n 10 /usr/pgsql-9.1/bin/pg_dumpall --column-inserts > $BACKUP_DIR/"$ALLDATABASESFILENAME".sql
nice -n 10 /usr/pgsql-9.1/bin/pg_dumpall > $BACKUP_DIR/"$ALLDATABASESFILENAME"_copy.sql
nice -n 10 tar cjf $BACKUP_DIR/"$ALLDATABASESFILENAME".sql.bz2 -C / $BACKUP_DIR2/"$ALLDATABASESFILENAME".sql
nice -n 10 tar cjf $BACKUP_DIR/"$ALLDATABASESFILENAME"_copy.sql.bz2 -C / $BACKUP_DIR2/"$ALLDATABASESFILENAME"_copy.sql
rm -f $BACKUP_DIR/"$ALLDATABASESFILENAME".sql
rm -f $BACKUP_DIR/"$ALLDATABASESFILENAME"_copy.sql

## Vacuuming (is it really necessary for PG 9.1? Don't think so...)
#nice -n 10 vacuumdb -a -f -z -q

finish_time=$(date +%s)
echo "Time duration for pg_dump script at $HOSTNAME: $((finish_time - start_time)) secs." | mail -s "[PGSQLDUMP] $HOSTNAME" $MAILLIST
[/bash]

MySQL SQL in a Bash one-liner

If you just need a quick way to get some data from a MySQL database in your shell (Bash), you could do something like this in one line:

mysql -h your.server.edu -u db_username -p`cat /path/to/your/homedir/secretpasswordfile` -e "use databasename; SELECT tablename.columnname FROM tablename where id like '421111' and something like '1' and option like '23';" > /tmp/datayouwant.txt; while read i; do echo "$i"; done < /tmp/datayouwant.txt | sort | uniq

If you don't like to scroll:
-bash-3.2$ mysql -h your.server.edu -u db_username -p`cat /path/to/your/homedir/secretpasswordfile` -e "use databasename; SELECT tablename.columnname FROM tablename where id like '421111' and something like '1' and option like '23';" > /tmp/datayouwant.txt; while read i; do echo "$i"; done < /tmp/datayouwant.txt | sort | uniq

On my server this gives a list of the words/numbers (or whatever you keep in that column), which you can then use in another script or command:

Dikult
Drupal
DSpace
Mediawiki
Open Journal Systems
Piwik
Postgresql og Mysql
Redhat Enterprise Linux 6 (RHEL6)
Redmine
Solr
Webmail (RoundCubemail)
Wordpress
Xibo
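
If the one-liner gets unwieldy, the same thing can be split over several lines (host, credentials, and table/column names are the same placeholders as above):

[bash]
QUERY="SELECT tablename.columnname FROM tablename
       WHERE id LIKE '421111' AND something LIKE '1' AND option LIKE '23';"
mysql -h your.server.edu -u db_username \
      -p"$(cat /path/to/your/homedir/secretpasswordfile)" \
      databasename -e "$QUERY" > /tmp/datayouwant.txt
# the while/echo loop above is equivalent to a plain sort -u:
sort -u /tmp/datayouwant.txt
[/bash]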

MachForm JavaScript embed form in a WordPress post

If you have a form in MachForm, you can add it to a WordPress post or page with a simple JavaScript embed. MachForm provides such a snippet when you embed a form; it looks roughly like this (the hostname and form id below are placeholders):

[javascript]
<script type="text/javascript" src="http://your.server.edu/machform/embed.php?id=12345"></script>
[/javascript]

The form is then rendered inline in the post.

Move a local git repo to a remote repo

Challenge: you have a local Git repo on a Linux server where a big bunch of commits have been made. Now you want to put all the files and commits on a remote server, for easier sharing with others.

Here is how it might be done:

First create the remote repo. We use gitolite; in the gitolite-admin repo we typically add:
# Explanation of the repo
repo path/to/remote/git/repo
    RW+ = adminuser
    RW  = user1 user2 user3

then commit the change and push the gitolite-admin repo:

git push
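
That is, inside your clone of gitolite-admin (the config file there is conf/gitolite.conf), the whole sequence is something like:

git add conf/gitolite.conf
git commit -m "add repo path/to/remote/git/repo"
git push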

Now, since there is already a local Git repo on the server with commits, we need to point it to the remote repo.
Check that no remote is configured yet:
git remote -v
should not show anything, then:
git remote add origin git@git.uib.no:path/to/remote/git/repo

and possibly this one too:

git branch --set-upstream master origin/master

(on newer Git versions --set-upstream is deprecated; the equivalent is git branch --set-upstream-to=origin/master master)

What I had to do now was to clone the remote repo I had created with gitolite-admin into another directory, create a dummy file, and push it. Then I could push and pull from the local repo.
So:
cd /tmp/
git clone git@git.uib.no:path/to/remote/git/repo
vim dummyfile
git add dummyfile
git commit -m "dummyfile" dummyfile
git push origin master

Then:
cd /back/to/the/local/repo/you/want/to/clone/to/remote
git pull
git push origin master

I guess there must be an easier way, but I am no Git master…
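
For what it's worth, pushing straight from the existing local repo to the freshly created, empty remote usually works without the dummy-commit detour. A minimal sketch, assuming the same remote as above:

cd /back/to/the/local/repo
git remote add origin git@git.uib.no:path/to/remote/git/repo
git push -u origin master    # -u sets origin/master as upstream in the same step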

Or, one can do it this way:

cd folder/
git init
# make a lot of files, add, commit and so on..
# REMOTE GIT REPO AVAILABLE AT git@something.url.edu:sys/path/remoterepo.git
# First push the files to the remote repo (push directly to the URL with an explicit branch)
git push git@something.url.edu:sys/path/remoterepo.git master
# clone the remote repo into a new folder
cd ..
git clone git@something.url.edu:sys/path/remoterepo.git folder2/
mv folder/ folder-old/
mv folder2/ folder/
# Test, everything ok? If so:
rm -rf folder-old/

Check HTTP headers with wget

If you want to see the HTTP headers from your shell, you can do it with:

wget --no-check-certificate --server-response --spider https://yourwebsite.something

The result would be something like:

[bash]
Spider mode enabled. Check if remote file exists.
--2014-02-07 11:13:33--  https://yourwebsite.something/something
Resolving yourwebsite.something... 129.177.5.226
Connecting to yourwebsite.something|129.177.5.226|:443... connected.
HTTP request sent, awaiting response...
HTTP/1.1 301 Moved Permanently
Date: Fri, 07 Feb 2014 10:13:33 GMT
Server: Apache
Location: https://yourwebsite.something/something/
Vary: Accept-Encoding
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=iso-8859-1
Location: https://yourwebsite.something/something/ [following]
Spider mode enabled. Check if remote file exists.
--2014-02-07 11:13:33--  https://yourwebsite.something/something/
Connecting to yourwebsite.something|129.177.5.226|:443... connected.
HTTP request sent, awaiting response...
HTTP/1.1 301 Moved Permanently
Date: Fri, 07 Feb 2014 10:13:33 GMT
Server: Apache
X-Powered-By: PHP/5.3.3
X-Content-Type-Options: nosniff
Vary: Accept-Encoding,Cookie,User-Agent
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: private, must-revalidate, max-age=0
Last-Modified: Fri, 07 Feb 2014 10:13:33 GMT
Location: http://yourwebsite.something/something/index.php/Hovudside
Connection: keep-alive, Keep-Alive
Keep-Alive: timeout=15, max=100
Content-Type: text/html; charset=utf-8
Location: http://yourwebsite.something/something/index.php/Hovudside [following]
Spider mode enabled. Check if remote file exists.
--2014-02-07 11:13:33--  http://yourwebsite.something/something/index.php/Hovudside
Connecting to yourwebsite.something|129.177.5.226|:80... connected.
HTTP request sent, awaiting response...
HTTP/1.1 200 OK
Date: Fri, 07 Feb 2014 10:13:33 GMT
Server: Apache
X-Powered-By: PHP/5.3.3
X-Content-Type-Options: nosniff
Content-language: nn
Vary: Accept-Encoding,Cookie,User-Agent
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: private, must-revalidate, max-age=0
Last-Modified: Tue, 14 Jan 2014 11:52:09 GMT
Connection: keep-alive, Keep-Alive
Keep-Alive: timeout=15, max=100
Content-Type: text/html; charset=UTF-8
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.
[/bash]
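
If you have curl available, a similar check can be done with a HEAD request (-I), following redirects (-L), and silencing the progress output (-s):

[bash]
curl -sIL https://yourwebsite.something
[/bash]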