Simple bash website monitoring script

This simple script tries to download a page from every site in the list; if the download times out or otherwise fails, it emails every address in $EMAIL.

#!/bin/bash
HOSTS="google.com example.org/test.html yahoo.com"
SUBJECT="Website is down"
EMAIL="example@example.com xxx@xxx.com"
TIMEOUT=0.5   # seconds; raise this if your connection is slow

for myHost in $HOSTS
do
    # wget exits non-zero on a timeout, DNS failure, refused connection or
    # HTTP error; checking the exit status catches all of these, whereas
    # grepping the log for "failed" misses some of them.
    if ! wget -q -T "$TIMEOUT" -t 1 -O /dev/null "$myHost"; then
        for myEMAIL in $EMAIL
        do
            echo -e "$(date) - $myHost is down!\nThis is an automated message." | mail -s "$SUBJECT" "$myEMAIL"
        done
    fi
done
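The per-address mail loop can be tried out without a working mail server by substituting a plain echo for mail. This is only a sketch: the notify function and the addresses below are made up for illustration and are not part of the script above.

```shell
#!/bin/bash
# Sketch of the notification loop with `echo` standing in for `mail`,
# so it runs without a configured mail system. `notify` and the
# addresses are illustrative, not part of the original script.
notify() {
    local host="$1"
    for addr in $EMAIL; do
        # Real script: echo -e "... is down!" | mail -s "$SUBJECT" "$addr"
        echo "To $addr: $host is down!"
    done
}

EMAIL="a@example.com b@example.com"
notify "example.org"
```

Once this prints one line per address, swapping the echo back to the mail pipeline gives the behaviour of the full script.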


You should run this from cron. Open a terminal, type crontab -e and add a line like */10 * * * * /home/deconectat/monitor.sh to run the script every 10 minutes.
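If you also want a record of each run, the same crontab line can redirect the script's output to a file (the log path here is just an example):

```shell
# Run every 10 minutes, appending any output and errors to a log file.
*/10 * * * * /home/deconectat/monitor.sh >> /home/deconectat/monitor.log 2>&1
```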


2 Responses to “Simple bash website monitoring script”


  1. Alexandro 30/06/2010 at 09:58

    Hi,
    It’s a good idea to monitor your websites directly from your PC without any special software. I have a question – what should I change in the script to make it check my FTP site? (i.e. I need a connection on port 21)

  2. Josef Irineo 01/11/2010 at 00:11

    Set up multiple website monitoring plans to increase reliability.

