Emails from Linux with æ, ø and å

Sometimes one needs to send an email from Linux (automated messages, warnings and so on).
The problem is that the content looks strange when it arrives in the email client. The issue is usually character encoding: the message must be declared as UTF-8.

Here is a PHP script I used to send an email with the correct character encoding.

[php]
<?php

# If you are sending the email to several people, sometimes it is good to use BCC (blind carbon copy)

$bcc = "person1@something,person2@something,person3@something";
$subject = "Here is some text with special characters ø æ å";
$body = "Hello, these are the Norwegian characters ø, æ and å";
$headers = "From: apache" . "\r\n" .
"Reply-To: me@something" . "\r\n" .
"Bcc: $bcc" . "\r\n" .
"X-Mailer: PHP/" . phpversion();
$header_ = 'MIME-Version: 1.0' . "\r\n" . 'Content-type: text/plain; charset=UTF-8' . "\r\n";

# Send the email with headers! The "To" field is left empty since all recipients
# are on the Bcc line. The subject is base64-encoded as a MIME encoded-word so
# that æ, ø and å survive in the subject line.
mail('', '=?UTF-8?B?' . base64_encode($subject) . '?=', $body, $header_ . $headers);
?>
[/php]

Use ‘mutt’ to send email with attachment

The unix ‘mutt’ command can be used to send an email with an attachment:

mutt -s "Test message" name@something -a files.tar < message.txt

where files.tar is the attached file, and message.txt is the file containing the message.

The subject of the email is added with the -s option, in this case:
-s "Test message"

name@something is the recipient's email address.

You can also do this:

echo "Message text" | mutt -s "Subject text" name@something -a files.tar

Reference: http://www.cyberciti.biz/tips/sending-mail-with-attachment.html

Why would you want to use a unix command to send an email with an attachment?
The answer is automation. Say you want to pack some important files together and send them to a certain person every Sunday. It could be statistical data, for instance, that the person wants as a weekly report to read on Monday before lunch.
You could then write a simple bash script that uses ‘tar’ to collect the files into a single tar file and then calls ‘mutt’ to send the message together with that tar file. The whole thing can be automated by running the bash script from cron (the unix job scheduler), as sketched below.
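
A minimal sketch of such a script could look like the following. The directory, tar file name, script path and email address are placeholders to adapt to your own setup.

[code lang="bash"]
#!/bin/bash
# weekly_report.sh - collect report files and mail them as a tar attachment.
# The directory, tar file name and recipient below are example values.

REPORT_DIR="/data/reports"
TARFILE="/tmp/weekly-report-$(date +%Y-%m-%d).tar"
RECIPIENT="person@something"

# Pack everything in the report directory into a single tar file
tar -cf "$TARFILE" -C "$REPORT_DIR" .

# Send it with mutt; the message body is piped in on stdin.
# Note: newer mutt versions want the recipient last: mutt -s "..." -a "$TARFILE" -- "$RECIPIENT"
echo "Weekly report attached." | mutt -s "Weekly report" "$RECIPIENT" -a "$TARFILE"

# Clean up the temporary tar file
rm -f "$TARFILE"
[/code]

A crontab entry like this one would then run the script every Sunday at 07:00:

[code lang="bash"]
0 7 * * 0 /usr/local/bin/weekly_report.sh
[/code]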

Postgresql 8.1 database bash dump script

I wrote a simple bash script to dump the databases from a PostgreSQL 8.1 database server,
using pg_dump and pg_dumpall.


#!/bin/bash
## This script dumps all the databases in a Postgres 8.1 server (localhost).
## Two dump files are made for each database: one with INSERT statements, another without.
## some variables, change them to fit yours
 LOGFILE="/var/lib/pgsql/data/pg_log/pgsql_dump.log"
 HOSTNAME=`hostname`
 MAILLIST="name1@something.com,name2@something.com"
 BACKUP_DIR="/backup/postgresql_dumps"

# Date and time variables
 DATE=`date +%Y-%m-%d`
 TIME=`date +%k:%M:%S`
 TODAY=$DATE"T"$TIME
# only postgres can run this script!
 if [ `whoami` != "postgres" ]; then
   echo "pgsql_dump tried to run, but user is not postgres!" >> $LOGFILE
   echo "You are not postgres, can not run."
   echo "Try: su -c ./pgsql_dump.sh postgres"
   exit;
 fi

# clean up old dumps! (find all types which are files, with the name that ends with .sql,
# their date older than 7 days, and execute the command "rm" (remove) )
 find $BACKUP_DIR -type f -name '*.sql' -mtime +7 -exec rm {} \;

# Check if there any backup files.
# Action: find in folder BACKUP_DIR all files with file-extension .sql.
# Count the number of files with wc (word count, option -l, which counts the number of lines).
# If this number is 0, then there are no files, and clearly something is wrong,
# because you don't have any backups!
if [ `find $BACKUP_DIR -type f -name '*.sql' | wc -l` -eq 0 ]; then
 echo "There are no pgsql dumps for the last 2 days at $HOSTNAME. Something is wrong!" | mail -s "[PGSQLDUMP ERROR] $HOSTNAME" $MAILLIST
fi

# Create the log-file if it doesn't exist
if [ ! -e $LOGFILE ]; then
 touch $LOGFILE
fi

# Find which databases you have in your Postgresql server
# Action: list all the databases, remove unwanted lines (headers, row counts, template0),
# extract the first column (the database name) with awk, and strip empty lines with 'sed':
DATABASES=`psql -q -c "\l" | sed -n 4,/\eof/p | grep -v rows | grep -v template0 | awk {'print $1'} | sed -e '/^$/d'`
# Dump the databases in individual files
 for i in $DATABASES; do
   FILENAME="$i-$TODAY.sql"
   pg_dump -D $i > "$BACKUP_DIR/$FILENAME"
 done

# we also like a dump with copy statements
 for i in $DATABASES; do
   FILENAME="$i-cp-$TODAY.sql"
   pg_dump $i > "$BACKUP_DIR/$FILENAME"
 done

# A full backup of the whole cluster is also useful, again with and without INSERT statements
# (two different file names, so the second dump does not overwrite the first)
 pg_dumpall -D > "$BACKUP_DIR/all-databases-$TODAY.sql"
 pg_dumpall > "$BACKUP_DIR/all-databases-cp-$TODAY.sql"

# Finally vacuum the database
 vacuumdb -a -f -z -q

tar

Create a tar file of files located in a folder:
tar -cvf filename.tar foldername/

where:
c = create
v = verbose
f = file

filename.tar = the created file

----------
To show the contents of a tar file:
tar -tvf filename.tar

----------

Extract a tar file:
tar -xvf filename.tar

where

x = extract
v = verbose
f = file

Extract img url links from HTML document

I needed to extract a webpage’s img src URL links, and I wanted to do it with a script on a regular basis. The solution I found was to use PHP’s DOMDocument class:

http://stackoverflow.com/questions/138313/how-to-extract-img-src-title-and-alt-from-html-using-php

[php]
<?php
$url="http://example.com";

$html = file_get_contents($url);

$doc = new DOMDocument();
@$doc->loadHTML($html);

$tags = $doc->getElementsByTagName('img');

foreach ($tags as $tag) {
    echo $tag->getAttribute('src');
}
?>

[/php]

After spending some time using wget, cat, grep and so on to solve my problem, this little php code made my life easier 🙂

ldapsearch users and places that contain æ, ø and å

We needed to collect “Place” information per user from our LDAP server.
The problem was that the description of the “Place” came out strangely encoded whenever it contained one of the Norwegian characters æ, ø or å.

The ldap command:

ldapsearch -x -H ldap://ourldapserver.uib.no x121Address=XXXXXX

where XXXXXX is the “place” code, gave a place description that looked like this:

description:: SW5zdGl0dXR0IGZvciBmaWxvc29maSBvZyBmw7hyc3Rlc2VtZXN0ZXJzdHVkaWVy

where the real name of the “Place” could be something like “Institutt for ..” followed by a word containing æ, ø or å.
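
The double colon after “description” is LDIF’s way of marking a base64-encoded value, so the text can be recovered on the command line (here with GNU coreutils’ base64, using the value from the output above):

[code lang="bash"]
# Decode the base64-encoded attribute value from the ldapsearch output
echo "SW5zdGl0dXR0IGZvciBmaWxvc29maSBvZyBmw7hyc3Rlc2VtZXN0ZXJzdHVkaWVy" | base64 -d
[/code]

which prints the UTF-8 description, Norwegian characters included: “Institutt for filosofi og førstesemesterstudier”.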

The solution was to use ldapsearch as follows:
ldapsearch -x -z 1 -t departmentNumber=XXXXXX ou

where XXXXXX is the University of Bergen “placecode” for a “Place”. For instance the number 567123 could be the place code for our IT department.
-z 1 reduces the list of hits to one (1), and ou specifies the “Place” description.

The list of users was already collected in a text file, people.txt, in the form:

username1
username2

The bash script that solved the issue for me was:

[code lang="bash"]

#!/bin/bash
# People collected with:
# ls -al /www/folk/ | awk {'print $9'} | grep -v unwanted_line | sort > people.txt

PEOPLE=`cat people.txt`

for USERNAME in $PEOPLE; do
  PLACE=""   # reset, so a user without a place code does not inherit the previous user's value
  PLACECODE=`ldapsearch -x -H ldap://ourldapserver.uib.no uid=$USERNAME | grep departmentNumber | awk {'print $2'}`

  if [ ! -z "$PLACECODE" ]; then

    # Sometimes the name of the place is written to screen, other times to a file under /tmp
    OUINFO=`ldapsearch -x -z 1 -t departmentNumber=$PLACECODE ou | grep 'ou:'`

    if [ `echo $OUINFO | grep 'file:' | wc -l` -eq 0 ]; then
      PLACE=`echo $OUINFO | sed -e s/"ou:\ "//g`
    else
      THEFILE=`echo $OUINFO | grep 'file:' | sed -e s/".*file:\/\/"//g`
      PLACE=`cat $THEFILE`

      #echo "The file is: " $THEFILE
      #echo "and the place is: " $PLACE

    fi
  fi
  echo $USERNAME, $PLACE
done

[/code]

Simple ldapsearch

Search for information from a ldap server:
Log in to the server with:
ssh servername

Then type the command:

ldapsearch -x -H ldap://ldapservername.uib.no uid=userid
ldapsearch -x -H ldap://ldapservername.uib.no departmentNumber=XXXXXX

where XXXXXX is the place code.