Hey guys, it's me, Fariz A-gain #yeaaah!!! *crowd gets louder*

Okay, enough of that.

Today I'll be focusing my post on the Data Communication and Computer Networks subject, specifically on the terminal. And FYI, we're not going to talk about a terminal like a bus station or an airport; we're going to talk about the terminal as a computer interface, a command line interface.

The terminal is a command line interface on your computer. It exists on OS X and Linux, and maybe other OSes too. For you Windows users, you don't have a terminal on your system; instead you have CMD. But don't worry, it's the same idea as the terminal, with different commands of course.

The terminal can do many things. Most things, actually. You can copy and move files, create directories, set up your computer, install and remove packages, and even download files. And for the sake of time, let's skip to the part where we download files from the network to our computer using a simple command called wget.
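Just to give you a taste of those everyday tasks, here's a tiny sketch (the `demo` directory and the file names are made up for this example):

```shell
# make a directory, create a file in it, copy it, then rename the copy
mkdir -p demo
echo "hello" > demo/notes.txt
cp demo/notes.txt demo/copy.txt
mv demo/copy.txt demo/backup.txt
ls demo
```

Each of those commands follows the same pattern you'll see below with wget: a command name, some options, and its arguments.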

wget is the non-interactive network downloader.
Here's some description about this command:

      GNU Wget is a free utility for non-interactive download of files from
       the Web.  It supports HTTP, HTTPS, and FTP protocols, as well as
       retrieval through HTTP proxies.

       Wget is non-interactive, meaning that it can work in the background,
       while the user is not logged on.  This allows you to start a retrieval
       and disconnect from the system, letting Wget finish the work.  By
       contrast, most of the Web browsers require constant user's presence,
       which can be a great hindrance when transferring a lot of data.

       Wget can follow links in HTML, XHTML, and CSS pages, to create local
       versions of remote web sites, fully recreating the directory structure
       of the original site.  This is sometimes referred to as "recursive
       downloading."  While doing that, Wget respects the Robot Exclusion
       Standard (/robots.txt).  Wget can be instructed to convert the links in
       downloaded files to point at the local files, for offline viewing.

       Wget has been designed for robustness over slow or unstable network
       connections; if a download fails due to a network problem, it will keep
       retrying until the whole file has been retrieved.  If the server
       supports regetting, it will instruct the server to continue the
       download from where it left off.

P.S. I honestly didn't read (and didn't want to read) that.

In the terminal, commands also have syntax and options, and here's how it looks for wget:


wget [option]... [URL]...
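So a real command just fills in those placeholders. For instance (the URL here is a made-up example, not the actual song):

```shell
# download a single file; with no options, wget saves it
# under its remote name (here that would be song.mp3)
wget https://example.com/music/song.mp3
```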
 
Is it too long to read? Well, let's download something, shall we?
I'm going to download this jam from 1993 (wow, I hadn't even been born yet) by Michael Bolton...

Downloading using wget
The command line shows some simple information. In this wget case, you get the download progress, the download speed, and also (one of my favourite features) an ETA (Estimated Time of Arrival) :)

When you press Enter, it reprints your current download status. But you don't have to do that, because it automatically shows the current progress.
Enter pressed 4 times

And when your download is finished, you get a message like this:

 2015-03-25 11:05:20 (17,4 KB/s) - ‘5755d.mp3’ saved [6262332/6262332]

Download finished
There are more things you can do with wget, but I'm not gonna detail them all in this post. You can add options to your download using the syntax I mentioned before, and you can see all the option codes using this command in your terminal:
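To show what I mean by options, here are a few standard ones from wget's man page (the URLs are made-up examples):

```shell
# -O chooses the output filename instead of the remote name
wget -O jam.mp3 https://example.com/music/song.mp3

# -c continues (resumes) a partially downloaded file
wget -c https://example.com/big-file.iso

# -r downloads recursively, following links on the page
wget -r https://example.com/
```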

man wget

And yeah, that's one of my longer posts about computer stuff. Thank you for reading such a long, boring post about this subject.

And don't forget to read my blog so I can become more popular than plastic. Love y'all, peace, buhbye *wink wink*