Linux: Command line to monitor a website's status
Steps to monitor a website using curl:
1. Create the main script file
2. Create a site list file
3. Create a cron job
1. Main script
Working directory: /home/username/scripts
vi /home/username/scripts/checkmysites
#!/bin/bash
SITESFILE=/home/username/scripts/sites.txt # list the sites you want to monitor in this file
EMAILS="[email protected]" # list of email addresses to receive alerts (comma separated)
while read -r site; do
    if [ -n "$site" ]; then
        checker=$(/usr/bin/curl -s --head -L --request GET "$site") # The -L is very important to handle load balancer redirects
        if echo "$checker" | grep -q "HTTP/[0-9.]* 200" # match the status code itself, not "200 OK", so HTTP/2 replies also pass
        then
            echo "The HTTP server on $site is up!"
        else
            MESSAGE="This is an alert from MonitoringServer1 that the connection to the website $site has failed to respond."
            for EMAIL in $(echo "$EMAILS" | tr "," " "); do
                SUBJECT="The connection to $site (http) failed"
                echo "$MESSAGE" | mail -s "$SUBJECT" "$EMAIL"
                echo "$SUBJECT"
                echo "Alert sent to $EMAIL"
            done
        fi
    fi
done < "$SITESFILE"
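The heart of the script is the grep against curl's response headers. Note that HTTP/1.x servers answer with a status line like "HTTP/1.1 200 OK", while HTTP/2 servers answer "HTTP/2 200" with no "OK" text, so matching on the status code is safer. A small sketch you can try without any network access (check_headers is a hypothetical helper, not part of the script above):

```shell
# check_headers isolates the status test from the script so it can be
# exercised against canned header strings instead of a live curl call.
check_headers() {
    # Match "HTTP/1.1 200 ..." as well as "HTTP/2 200".
    echo "$1" | grep -q "HTTP/[0-9.]* 200"
}

check_headers "HTTP/1.1 200 OK" && echo "up"
check_headers "HTTP/1.1 503 Service Unavailable" || echo "down"
```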
2. Site list file
Working directory: /home/username/scripts
vi /home/username/scripts/sites.txt
http://www.techpository.com
https://www.google.com
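The script's while-read loop processes sites.txt one URL per line, and the non-empty test means blank lines are harmless. A self-contained sketch of that loop, using a throwaway temp file rather than the real sites.txt:

```shell
# Build a throwaway sites file with a blank line in the middle.
SITESFILE=$(mktemp)
printf 'http://www.techpository.com\n\nhttps://www.google.com\n' > "$SITESFILE"

count=0
while read -r site; do
    if [ -n "$site" ]; then        # skip blank lines, as checkmysites does
        count=$((count + 1))
        echo "would check: $site"
    fi
done < "$SITESFILE"

echo "sites found: $count"         # the blank line is not counted
rm -f "$SITESFILE"
```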
3. Cron job
crontab -e
* * * * * /home/username/scripts/checkmysites
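The five asterisks run the check every minute; cron's fields are minute, hour, day of month, month, and day of week, and a step value such as */5 in the minute field thins the schedule to every five minutes. A small sketch that splits a candidate entry into its schedule and command (the */5 variant is a suggestion, not from the original):

```shell
# A less chatty variant of the crontab line: run every five minutes.
entry='*/5 * * * * /home/username/scripts/checkmysites'

# The first five space-separated fields are the schedule; the rest is the command.
schedule=$(echo "$entry" | cut -d' ' -f1-5)
cmd=$(echo "$entry" | cut -d' ' -f6-)
echo "schedule: $schedule"
echo "command:  $cmd"
```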
Initial script by: Axel
Script modifications and expansions by: Timothy Conrad