This tutorial will show you how to quickly check if a particular website is working from the Linux Terminal.

You probably already know some commands for this, such as ping, curl and wget, but this tutorial also covers several other commands.

In addition, it shows various options for running these checks against one or more servers.

This article helps you check on demand whether a site is up or down. If you maintain websites and want real-time notifications when they go down, you should use a dedicated site monitoring tool instead. There are many such tools, some free and most paid, so choose one based on your needs.

How to check whether a website is working from the Linux Terminal

  1. Method 1: How to check the status of a website using the fping command
  2. Method 2: Quickly check the status of a website using the http command
  3. Method 3: How to check the status of a website using the curl command
  4. Method 4: Quickly check the status of a website using the wget command
  5. Method 5: Quickly check if a website is working by using the lynx command
  6. Method 6: How to check if a website is working by using the ping command
  7. Method 7: Quickly check if a website is working by using the telnet command
  8. Method 8: How to check if a website is working by using Bash Script

Method 1: How to check the status of a website using the fping command

The fping command is a program like ping that uses the Internet Control Message Protocol (ICMP) echo request to determine if the destination server is responding.

fping differs from ping in that it can ping any number of servers at once. Alternatively, the list of targets can be read from a text file.

fping sends an ICMP echo request and moves on to the next target in a round-robin fashion, without waiting for the destination server to respond.

If a target server responds, it is marked as alive and removed from the list of targets to check. If a target does not respond within a certain time limit and/or retry limit, it is marked as unreachable.

 # fping 2daygeek.com linuxtechnews.com magesh.co.in
 2daygeek.com is alive
 linuxtechnews.com is alive
 magesh.co.in is alive
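
As mentioned above, fping can also read the target list from a text file via its -f option. A minimal sketch, assuming a hypothetical servers.txt file containing one hostname per line:

 # printf '2daygeek.com\nlinuxtechnews.com\n' > servers.txt
 # fping -f servers.txt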

Method 2: Quickly check the status of a website using the http command

HTTPie (pronounced aitch-tee-tee-pie) is a command-line HTTP client.

It is a modern tool that makes it easy to interact with web services from the command line.

It provides a simple http command that lets you send arbitrary HTTP requests using a natural syntax, and it displays colorized output.

HTTPie can be used for testing, debugging, and general interaction with HTTP servers.

 # http 2daygeek.com
 HTTP/1.1 301 Moved Permanently
 CF-RAY: 535b66722ab6e5fc-LHR
 Cache-Control: max-age=3600
 Connection: keep-alive
 Date: Thu, 14 Nov 2019 19:30:28 GMT
 Expires: Thu, 14 Nov 2019 20:30:28 GMT
 Location: https://2daygeek.com/
 Server: cloudflare
 Transfer-Encoding: chunked
 Vary: Accept-Encoding
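
If you only want the status line rather than the full header block, here is a minimal sketch using HTTPie's --headers option, which limits the output to the response headers:

 # http --headers 2daygeek.com | head -n 1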

Method 3: How to check the status of a website using the curl command

The curl command is a tool to transfer data to or from a server using one of the supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP).

The command is designed to operate without user interaction.

The curl command supports proxies, user authentication, FTP upload, HTTP POST, SSL connections, cookies, file transfer resume, Metalink, and more.

 # curl -I https://www.magesh.co.in
 HTTP/2 200
 date: Thu, 14 Nov 2019 19:39:47 GMT
 content-type: text/html
 set-cookie: __cfduid=db16c3aee6a75c46a504c15131ead3e7f1573760386; expires=Fri, 13-Nov-20 19:39:46 GMT; path=/; domain=.magesh.co.in; HttpOnly
 vary: Accept-Encoding
 last-modified: Sun, 14 Jun 2015 11:52:38 GMT
 x-cache: HIT from Backend
 cf-cache-status: DYNAMIC
 expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
 server: cloudflare
 cf-ray: 535b74123ca4dbf3-LHR

Use the following curl command if you only want to see the HTTP status code instead of the entire output:

 # curl -I "www.magesh.co.in" 2>&1 | awk '/HTTP// {print $2}' 200 

If you want to see if a particular website is working, use the following Bash script.

 # vi curl-url-check.sh

 #!/bin/bash
 if curl -I "https://www.magesh.co.in" 2>&1 | grep -Ew "200|301" ; then
    echo "magesh.co.in is up"
 else
    echo "magesh.co.in is down"
 fi

Once you've added the above script to a file, run the file to see the output.

 # sh curl-url-check.sh
 HTTP/2 200
 magesh.co.in is up

Use the following shell script if you want to see the status of many websites.

 # vi curl-url-check-1.sh

 #!/bin/bash
 for site in www.google.com google.co.in www.xyzzz.com
 do
    if curl -I "$site" 2>&1 | grep -Ew "200|301" ; then
       echo "$site is up"
    else
       echo "$site is down"
    fi
    echo "----------------------------------"
 done

Once you've added the above script to a file, run the file to see the output.

 # sh curl-url-check-1.sh
 HTTP/1.1 200 OK
 www.google.com is up
 ----------------------------------
 HTTP/1.1 301 Moved Permanently
 google.co.in is up
 ----------------------------------
 www.xyzzz.com is down
 ----------------------------------
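
If you would rather not hard-code the site list, the same loop can read URLs from a file. A minimal sketch, assuming a hypothetical sites.txt containing one URL per line and a hypothetical script name:

 # vi curl-url-check-file.sh

 #!/bin/bash
 # Read each site from sites.txt (one URL per line) and test it with curl.
 while read -r site
 do
    if curl -I "$site" 2>&1 | grep -Ew "200|301" > /dev/null ; then
       echo "$site is up"
    else
       echo "$site is down"
    fi
 done < sites.txt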

Method 4: Quickly check the status of a website using the wget command

The wget command (formerly known as Geturl) is a free, open-source, command-line download tool that retrieves files using HTTP, HTTPS and FTP, the most widely used Internet protocols.

The wget command is a non-interactive command-line tool. Its name comes from World Wide Web and get.

Compared with other tools, wget handles downloads robustly: it can work in the background, download multiple files, run non-interactively and cope with large downloads.

 # wget -S --spider https://www.magesh.co.in
 Spider mode enabled. Check if remote file exists.
 --2019-11-15 01:22:00--  https://www.magesh.co.in/
 Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
 Resolving www.magesh.co.in (www.magesh.co.in)… 104.18.35.52, 104.18.34.52, 2606:4700:30::6812:2334, …
 Connecting to www.magesh.co.in (www.magesh.co.in)|104.18.35.52|:443… connected.
 HTTP request sent, awaiting response…
   HTTP/1.1 200 OK
   Date: Thu, 14 Nov 2019 19:52:01 GMT
   Content-Type: text/html
   Connection: keep-alive
   Set-Cookie: __cfduid=db73306a2f1c72c1318ad4709ef49a3a01573761121; expires=Fri, 13-Nov-20 19:52:01 GMT; path=/; domain=.magesh.co.in; HttpOnly
   Vary: Accept-Encoding
   Last-Modified: Sun, 14 Jun 2015 11:52:38 GMT
   X-Cache: HIT from Backend
   CF-Cache-Status: DYNAMIC
   Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
   Server: cloudflare
   CF-RAY: 535b85fe381ee684-LHR
 Length: unspecified [text/html]
 Remote file exists and could contain further links, but recursion is disabled -- not retrieving.

Use the following wget command if you only want to see the HTTP status code instead of the entire output:

 # wget --spider -S "www.magesh.co.in" 2>&1 | awk '/HTTP\// {print $2}'
 200

If you want to see if a particular website is working, use the following Bash script:

 # vi wget-url-check.sh

 #!/bin/bash
 if wget --spider -S "https://www.google.com" 2>&1 | grep -Ew "200|301" ; then
    echo "Google.com is up"
 else
    echo "Google.com is down"
 fi

Once you've added the above script to a file, run the file to see the output.

 # sh wget-url-check.sh
 HTTP/1.1 200 OK
 Google.com is up
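
Instead of grepping the headers, you can also rely on wget's exit status, which is non-zero when the request fails; a minimal sketch:

 # wget -q --spider "https://www.google.com" && echo "Google.com is up" || echo "Google.com is down"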

Use the following shell script if you want to see the status of many websites.

 # vi wget-url-check-1.sh

 #!/bin/bash
 for site in www.google.com google.co.in www.xyzzz.com
 do
    if wget --spider -S "$site" 2>&1 | grep -Ew "200|301" ; then
       echo "$site is up"
    else
       echo "$site is down"
    fi
    echo "----------------------------------"
 done

Once you've added the above script to a file, run the file to see the output.

 # sh wget-url-check-1.sh
 HTTP/1.1 200 OK
 www.google.com is up
 ----------------------------------
 HTTP/1.1 301 Moved Permanently
 google.co.in is up
 ----------------------------------
 www.xyzzz.com is down
 ----------------------------------

Method 5: Quickly check if a website is working by using the lynx command

Lynx is a highly configurable, text-based web browser for use on cursor-addressable character-cell terminals (simple devices that display received text, usually over a serial port, and send the user's keystrokes back on the same port). It is the oldest web browser that is still being actively developed.

 # lynx -head -dump http://www.magesh.co.in
 HTTP/1.1 200 OK
 Date: Fri, 15 Nov 2019 08:14:23 GMT
 Content-Type: text/html
 Connection: close
 Set-Cookie: __cfduid=df3cb624024b81df7362f42ede71300951573805662; expires=Sat, 14-Nov-20 08:14:22 GMT; path=/; domain=.magesh.co.in; HttpOnly
 Vary: Accept-Encoding
 Last-Modified: Sun, 14 Jun 2015 11:52:38 GMT
 X-Cache: HIT from Backend
 CF-Cache-Status: DYNAMIC
 Server: cloudflare
 CF-RAY: 535fc5704a43e694-LHR

Use the following lynx command if you only want to see HTTP status codes instead of the entire output:

 # lynx -head -dump https://www.magesh.co.in 2>&1 | awk '/HTTP\// {print $2}'
 200

If you want to see if a particular website is working, use the following Bash script:

 # vi lynx-url-check.sh

 #!/bin/bash
 if lynx -head -dump http://www.magesh.co.in 2>&1 | grep -Ew "200|301" ; then
    echo "magesh.co.in is up"
 else
    echo "magesh.co.in is down"
 fi

Once you've added the above script to a file, run the file to see the output.

 # sh lynx-url-check.sh
 HTTP/1.1 200 OK
 magesh.co.in is up

Use the following shell script if you want to see the status of many websites.

 # vi lynx-url-check-1.sh

 #!/bin/bash
 for site in http://www.google.com https://google.co.in http://www.xyzzz.com
 do
    if lynx -head -dump "$site" 2>&1 | grep -Ew "200|301" ; then
       echo "$site is up"
    else
       echo "$site is down"
    fi
    echo "----------------------------------"
 done

Once you've added the above script to a file, run the file to see the output.

 # sh lynx-url-check-1.sh
 HTTP/1.0 200 OK
 http://www.google.com is up
 ----------------------------------
 HTTP/1.0 301 Moved Permanently
 https://google.co.in is up
 ----------------------------------
 www.xyzzz.com is down
 ----------------------------------

Method 6: How to check if a website is working by using the ping command

The ping command is a network utility used to check the availability and connectivity of a target server on an Internet Protocol (IP) network.

The command confirms server availability by sending an Internet Control Message Protocol (ICMP) Echo Request to the destination server and waiting for an ICMP Echo Reply.

Ping summarizes the statistical results based on the transmitted, received, and lost packets (packet loss), usually including min / avg / max (minimum / average / maximum).

 # ping -c 5 2daygeek.com
 PING 2daygeek.com (104.27.157.177) 56(84) bytes of data.
 64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=1 ttl=58 time=228 ms
 64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=2 ttl=58 time=227 ms
 64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=3 ttl=58 time=250 ms
 64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=4 ttl=58 time=171 ms
 64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=5 ttl=58 time=193 ms

 --- 2daygeek.com ping statistics ---
 5 packets transmitted, 5 received, 0% packet loss, time 13244ms
 rtt min/avg/max/mdev = 170.668/213.824/250.295/28.320 ms
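
Because ping exits with a non-zero status when no reply is received, it can also be scripted directly. A minimal sketch (the -c packet count and -W timeout values are assumptions; adjust as needed). Keep in mind that this only shows the host answers ICMP, not that the web server itself is serving pages:

 # ping -c 2 -W 2 2daygeek.com > /dev/null && echo "2daygeek.com is reachable" || echo "2daygeek.com is unreachable"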

Method 7: Quickly check if a website is working by using the telnet command

The telnet command is an old network utility used to communicate with another server over a TCP/IP network using the TELNET protocol.

By default it uses port 23 to connect to other devices, such as computers and network equipment.

Telnet is not a secure protocol and is no longer recommended, because data sent over it is unencrypted and can be intercepted by attackers.

Most people now use the SSH protocol, which is encrypted and much more secure, instead of Telnet.

 # telnet google.com 80
 Trying 216.58.194.46…
 Connected to google.com.
 Escape character is '^]'.
 ^]
 telnet> quit
 Connection closed.
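
The telnet session above is interactive. For scripting the same TCP-level check non-interactively, one option is bash's /dev/tcp pseudo-device together with the coreutils timeout command; a minimal sketch:

 # timeout 3 bash -c 'cat < /dev/null > /dev/tcp/google.com/80' && echo "google.com port 80 is open" || echo "google.com port 80 is not reachable"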

Method 8: How to check if a website is working by using Bash Script

Put simply, a shell script is a file containing a sequence of commands. The shell reads the file and executes the commands as if they had been typed directly on the command line.

To make such a script more useful, you can add conditions, which takes some of the routine burden off Linux administrators.

If you want to see the status of many websites using the wget command, use the following shell script.

 # vi wget-url-check-2.sh

 #!/bin/bash
 for site in www.google.com google.co.in www.xyzzz.com
 do
    if wget --spider -S "$site" 2>&1 | grep -Ew "200|301" > /dev/null ; then
       echo "$site is up"
    else
       echo "$site is down"
    fi
 done

Once you've added the above script to a file, run the file to see the output:

 # sh wget-url-check-2.sh
 www.google.com is up
 google.co.in is up
 www.xyzzz.com is down

If you want to view the status of many websites with the curl command, use the following Bash script:

 # vi curl-url-check-2.sh

 #!/bin/bash
 for site in www.google.com google.co.in www.xyzzz.com
 do
    if curl -I "$site" 2>&1 | grep -Ew "200|301" > /dev/null ; then
       echo "$site is up"
    else
       echo "$site is down"
    fi
 done

Once you've added the above script to a file, run the file to see the output.

 # sh curl-url-check-2.sh
 www.google.com is up
 google.co.in is up
 www.xyzzz.com is down
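
As a variation, curl's --write-out option can report the actual status code for each site (curl prints 000 when it receives no response). A minimal sketch, using a hypothetical script name:

 # vi curl-status-report.sh

 #!/bin/bash
 # Print the numeric HTTP status for each site; 000 means no response was received.
 for site in www.google.com google.co.in www.xyzzz.com
 do
    code=$(curl -s -o /dev/null -w "%{http_code}" "$site")
    echo "$site returned HTTP status $code"
 done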

Hope you are successful.


