Hey, wondering if anyone has any ideas that might be better than this one I yanked off GitHub. I'm looking for a bash script I can run from cron several times a day that automatically checks a list of domains to see if any are down.
Bash:
#!/bin/bash
# List of URLs to check.
urls=(
    https://www.google.com
    https://www.yahoo.com
)

for url in "${urls[@]}"; do
    if ! /usr/bin/wget --server-response -O /dev/null "$url" > /dev/null 2>&1; then
        # Full URLs can disappear in text messages. Extract the domain and
        # use that for all error reporting instead.
        domain=$(echo "$url" | awk -F/ '{print $3}')
        # If the website appears down, check again in five minutes to
        # eliminate false positives.
        echo "$domain down. Waiting 5 minutes."
        sleep 300
        if ! /usr/bin/wget --server-response -O /dev/null "$url" > /dev/null 2>&1; then
            echo "$domain still down. Sending email and/or text messages."
            echo "$domain down" | /usr/sbin/sendmail [email protected]
        fi
    fi
done
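Note the original had a stray `down` inside the awk program (`'{print $3} down'`), which is a syntax error in awk and breaks the domain extraction, plus unquoted `$url` expansions; the version above fixes both.

If you want something a bit smarter than wget's exit status, here's a sketch of a curl-based check that looks at the actual HTTP status code, so a site serving 500s or 503s counts as "down" even though the server answered. This is an assumption-heavy sketch, not a drop-in replacement: the 10-second timeout, the "anything under 400 is up" rule, and the function names are all my choices.

```shell
#!/bin/bash
# Sketch: treat a site as up only if it returns an HTTP status below 400.
# curl prints 000 when the connection itself fails (DNS error, timeout,
# connection refused), which the numeric comparison below rejects too.

# Return 0 (up) for a 2xx/3xx status code, 1 (down) otherwise.
status_ok() {
    local code=$1
    [ "$code" -ge 200 ] && [ "$code" -lt 400 ]
}

# Fetch the URL, discard the body, and capture only the status code.
# --max-time bounds a hung server so cron jobs don't pile up.
check_url() {
    local url=$1
    local code
    code=$(curl -s -o /dev/null --max-time 10 -w '%{http_code}' "$url")
    status_ok "$code"
}
```

You'd call `check_url "$url"` in place of the wget line and keep the same retry-then-notify logic. To run it several times a day, a crontab entry along these lines works (the path is a placeholder): `0 */4 * * * /path/to/check-sites.sh`.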