Monitor websites without monitoring software (bash script)

In my daily work I deal with a large number of websites. Because of the nature of the projects, I need to make sure the sites remain available after a project is completed.

I wanted to automate this task without deploying full monitoring software such as Nagios or Zabbix, so I crafted a small bash script that checks the state of the websites and prints the results as an HTML table.

#!/bin/bash

serverlist=("https://example.org" "https://example.net")  # sites to check (placeholders)
html="/var/www/html/status.html"                          # output page (adjust the path)
mailto="admin@example.org"                                # notification recipient (placeholder)

while true; do
    i=1
    echo "<br><br><br>" > "$html"
    echo '<center><table width="40%" border="1" cellspacing="0" cellpadding="5" >' >> "$html"
    echo "<tr><td></td><td><b>HOST</b></td><td><b>State</b></td></tr>" >> "$html"
    for web in "${serverlist[@]}"; do
        state=$(HEAD -t 8 "$web" | sed -n 1p)
        echo "<tr><td>$i</td><td>$web</td><td>$state</td></tr>" >> "$html"
        if [ "$state" != "200 OK" ]; then
            echo "Something is wrong with $web" | mail -s "Invalid response from $web" "$mailto"
        fi
        let "i = $i + 1"
    done
    echo "</table>" >> "$html"
    echo "<br>" >> "$html"
    echo "<p>Last updated at $(date +%H:%M:%S)</p>" >> "$html"
    echo "</center>" >> "$html"
    sleep 30
done

The serverlist variable holds the list of websites/servers to check. Adjust the variables at the top of the script (serverlist, html, and the mail recipient) to match your environment.
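The status check in the script relies on the HEAD tool (from the libwww-perl package) printing the HTTP status line as its first line of output, which is why piping through sed -n 1p is enough to extract it. A minimal sketch with simulated output, since the exact header set varies per server:

```shell
# HEAD (libwww-perl) prints the status line first, e.g. "200 OK".
# Simulate that output and extract the first line the way the script does:
printf '200 OK\nConnection: close\n' | sed -n 1p
# prints: 200 OK
```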

Every 30 seconds the script walks through all entries of serverlist, queries each one with the HEAD tool, and writes the resulting table to the file named in the html variable.

If the returned status line is anything other than 200 OK, I get an email notification.
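Because the comparison is against the full "200 OK" status line, any other response, and also a timeout (which leaves state empty), triggers the alert branch. A sketch of that branch with a simulated failure; mail is replaced by echo here so it runs without a configured MTA:

```shell
web="https://example.org"            # hypothetical entry from serverlist
state="503 Service Unavailable"      # simulated failing status line
if [ "$state" != "200 OK" ]; then
    # the real script pipes this message into mail(1) instead of printing it
    echo "Invalid response from $web: $state"
fi
# prints: Invalid response from https://example.org: 503 Service Unavailable
```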

Serving that HTML file with any web server makes the status page available in a browser.

The mailbox thus accumulates a status history at 30-second intervals.
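A site that stays down generates one notification per check at this interval, so the mail volume is worth estimating; quick arithmetic:

```shell
interval=30                     # seconds between checks, as in the script
echo $(( 86400 / interval ))    # checks (and potential alert mails) per day
# prints: 2880
```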

Alternatively you can send notifications to Skype.
