Bash Display Web Page Content In Terminal

How can I fetch HTML web page content from bash and display it on screen (terminal) using shell utilities under Linux, macOS, or Unix?

You can use any one of the following tools, or a combination of them, to get or display the contents of a web page in a shell/terminal session:

  1. curl command – Tool to transfer data from or to a server using http/https/ftp and much more.
  2. lynx command – Fully-featured World Wide Web (WWW) client/browser for users running terminals.
  3. wget command – Free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
  4. w3m command – A text-based Web browser and pager.
  5. elinks command – Another text-based Web browser and pager.

Installing CLI utilities to display web pages in a terminal

Prerequisite
By default, lynx, elinks, w3m, wget, and curl may not be installed on your system. Hence, use the apk command on Alpine Linux, the dnf command/yum command on RHEL & co., the apt command/apt-get command on Debian, Ubuntu & co., the zypper command on SUSE/OpenSUSE, or the pacman command on Arch Linux to install them. Here are some common examples to install these tools on your operating system.
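
Unsure which of these are already present? Here is a minimal sketch that loops over the tool names and uses the shell built-in command -v to report what is installed (adjust the list to taste):

## a minimal sketch: report which fetchers are already installed ##
for tool in curl wget lynx w3m elinks
do
  command -v "$tool" >/dev/null 2>&1 && echo "Found: $tool"
done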

Debian / Ubuntu Linux install curl, wget, lynx, and w3m

Open a terminal and then type the apt command or apt-get command:
$ sudo apt-get install curl wget lynx w3m elinks

(Figure: installing Debian/Ubuntu Linux CLI utilities to get the contents of a web page into a shell variable)

Fedora / RHEL / CentOS Linux install curl, wget, lynx, and w3m

Open a terminal app and then type the yum command or dnf command:
$ sudo yum install curl wget lynx w3m elinks

FreeBSD Unix install curl, wget, lynx, and w3m (binary package)

Open a terminal and then type the pkg command:
$ sudo pkg_add -v -r curl lynx w3m wget elinks
## newer versions of FreeBSD use the pkg command instead of pkg_add ##
$ sudo pkg install curl lynx w3m wget elinks

Installing curl, wget, and others on macOS

First, install Homebrew on macOS to use the brew package manager and then type the brew command:
$ brew install curl wget w3m lynx elinks
Bash Display Web Page Content In Terminal Command Examples


WARNING! Some Web Application Firewalls (WAFs), such as mod_security for Apache/Nginx or the Cloudflare WAF, may block standard terminal utilities such as curl and wget to stop bots from hammering their web servers. Be respectful when downloading many web pages from the Internet; not every website owner has unlimited resources.

Now that the packages are installed, it is time to see some common examples that show web page content in a terminal. You can use the curl command to download a page as follows:

curl https://your-domain-path/file.html
curl https://www.cyberciti.biz/
curl https://www.cyberciti.biz/faq/bash-for-loop/

Use curl and store the output in a variable as follows:

page="$(curl https://www.cyberciti.biz/)"
page="$(curl https://www.cyberciti.biz/faq/bash-for-loop/)"

To display the content, use the printf command or echo command:

echo "$page"
printf "%s" $page

lynx command examples

Use the lynx command as follows:

lynx -dump https://www.cyberciti.biz
lynx -dump https://www.nixcraft.com
lynx -dump https://www.cyberciti.biz/faq/bash-for-loop/

The -dump option dumps the formatted output of the default document or those specified on the command line to standard output. Unlike interactive mode, all documents are processed.
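
By default, the dump ends with a numbered list of every link found on the page. If you only want the rendered text, add the -nolist option (a small illustrative example):

## -nolist drops the trailing link list from the dump ##
lynx -dump -nolist https://www.cyberciti.biz/faq/bash-for-loop/ | less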

wget command examples

The syntax is as follows for the wget command:

wget -O - http://www.cyberciti.biz
wget -O - http://www.cyberciti.biz/faq/bash-for-loop/

OR use the wget command to grab the page and store it in a variable called page:

page="$(wget -O - http://www.cyberciti.biz)"
## display the page ##
echo "$page"
## or pass it to lynx / w3m ##
echo "$page" | w3m -dump -T text/html
echo "$page" | lynx -dump -stdin

w3m command examples

The syntax is as follows to dump web page content in a terminal using the w3m command:

w3m -dump https://www.cyberciti.biz/
w3m -dump https://www.cyberciti.biz/faq/bash-for-loop/

OR use the w3m command to grab the page and store it in a variable called page:

page="$(w3m -dump http://www.cyberciti.biz/)"
echo "$page"

Practical examples

Get the definition of the word linux from a dictionary over the DICT protocol:
$ curl dict://dict.org/d:linux
Here is what bash displayed on my screen:

220 dict.dict.org dictd 1.12.1/rf on Linux 4.19.0-10-amd64 <auth.mime> <107284616.15383.1649137441@dict.dict.org>
250 ok
150 1 definitions retrieved
151 "linux" wn "WordNet (r) 3.0 (2006)"
Linux
    n 1: an open-source version of the UNIX operating system
.
250 ok [d/m/c = 1/0/30; 0.000r 0.000u 0.000s]
221 bye [d/m/c = 0/0/0; 0.000r 0.000u 0.000s]
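
curl's DICT support is not limited to definitions. For instance, the m: form asks the server for matching headwords instead of a full definition (a hedged example based on curl's dict:// URL syntax):

$ curl dict://dict.org/m:linux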

Back up your del.icio.us bookmarks:
$ wget --user=Your-Username-Here --password=Your-Password-Here https://api.del.icio.us/v1/posts/all -O my-old-bookmarks.xml
$ more my-old-bookmarks.xml

Grab all .mp3 files from a URL:

mp3=$(lynx -dump https://server1.cyberciti.biz/media/index.html | grep 'https://' | awk '/mp3/{print $2}')
cd /nas/music/mp3/ || exit
for i in $mp3
do
 wget "$i"
done
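
One caveat: the loop above relies on word splitting of $mp3 and will break if a URL ever contains spaces. A slightly safer sketch (same hypothetical server URL as above) reads the list line by line instead:

lynx -dump https://server1.cyberciti.biz/media/index.html | awk '/https:\/\// && /mp3/{print $2}' | while read -r url
do
 wget "$url"
done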

Getting the contents of a web page in a shell variable using elinks

The syntax is simple:

elinks url
elinks https://www.nixcraft.com
# Let us print formatted plain-text versions of given URLs
elinks --dump https://www.nixcraft.com

Try the following CLI options to save the output directly to a variable named $OUTPUTS and then display it in your bash terminal session:

# Bash Display Web Page examples #
OUTPUTS="$(elinks --dump https://www.nixcraft.com)"
echo "$OUTPUTS"

Summing up

I hope this quick introduction to downloading and displaying web page content in a bash terminal is helpful. Of course, you may want to use Perl or Python to scrape web pages for serious work. The Linux and Unix commands I discussed here have many more options; therefore, reading the manual pages using the help command or man command is an essential task for command-line users:

man curl
man w3m
man lynx
man wget
man elinks
