Is cURL a standard part of all Unix-like operating systems?

I'm writing a shell script that needs to download some data from the Internet, and I was wondering whether I can rely on cURL being pre-installed on all Unix distributions.
I know that wget is not on OS X by default but cURL is. I also remember installing cURL on Ubuntu Server, but that was a year or two ago. I tried Ubuntu Server 12.04 today and it comes with cURL out of the box.


All else being equal, I would say it is more likely that you have wget installed.

Still, why not simply make a conditional that looks for both wget and curl in the PATH and uses whichever is available, if any? If you want to be ambitious, feel free to throw lynx, w3m, etc. into the mix, as in the sketch below.
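A minimal sketch of that approach (the get helper is my own name, and the flags are assumptions: curl -fsSL, wget -qO-, lynx -source):

#!/bin/sh
# define get() using the first downloader found in PATH
if command -v curl >/dev/null 2>&1; then
    get() { curl -fsSL "$1"; }
elif command -v wget >/dev/null 2>&1; then
    get() { wget -qO- "$1"; }
elif command -v lynx >/dev/null 2>&1; then
    get() { lynx -source "$1"; }
else
    echo "no downloader found in PATH" >&2
    exit 1
fi

get "http://example.com/data.txt" > data.txt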


No, cURL is not a standard part of operating systems. It isn't even standard on all Linux-based distributions.


Neither curl nor wget is "guaranteed" to be installed anywhere, especially on proper UNIX systems. They are not specified by POSIX. Neither is ftp, ssh / scp / sftp, rsync, telnet, nc / netcat, openssl, or probably any related tool that comes to mind. It seems like an odd oversight to me, but that is how it is.

Various GNU/Linux distros may include curl and/or wget, but YMMV.

FreeBSD comes standard with the "fetch" tool for cases like this, and OpenBSD comes with a souped-up "ftp" client that can do the job with its "AUTO-FETCHING" feature.
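For example (flags from the man pages; double-check on your release):

# FreeBSD
fetch -o out.txt http://example.com/file.txt
# OpenBSD: ftp auto-fetches when handed a URL
ftp -o out.txt http://example.com/file.txt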

For reference, here is the list of utilities POSIX does specify: https://pubs.opengroup.org/onlinepubs/9699919799/utilities/

To my knowledge, POSIX does not define any TCP file transfer tool at all. uucp is standard, but I do not know if you could even make it work without config changes on both ends.

You can test for both of them (as many have suggested), but to be certain you would either need to install something or write something.

A shell with TCP socket support (like ksh93 or Bash) should let you write such a download function in a pinch. Of course, you still need the proper permissions to read / write to the socket.

GNU awk ("gawk") can do it too, though the odds of having gawk but nothing more convenient like curl or wget seem slim to me. Pretty sure POSIX awk does not support networking, and I do not recall seeing anything about it in "The AWK Programming Language", but it has been a while. Of course, Perl, Python, Ruby, C, etc. can do it as well.
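For the record, a rough gawk sketch using its /inet/tcp special files and the |& coprocess operator (prints headers and body; treat it as an illustration, not a robust client):

gawk 'BEGIN {
    s = "/inet/tcp/0/gutenberg.org/80"   # local port 0 = pick any
    print "GET /files/84/84-0.txt HTTP/1.0\r\nHost: gutenberg.org\r\n\r" |& s
    while ((s |& getline line) > 0)
        print line
    close(s)
}'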

See: https://www.shell-tips.com/bash/download-files-from-shell/

https://unix.stackexchange.com/questions/83926/how-to-download-a-file-using-just-bash-and-nothing-else-no-curl-wget-perl-et

https://unix.stackexchange.com/questions/336876/simple-shell-script-to-send-socket-message

#!/bin/ksh
## Tested with ksh93u+ and Bash v3.2x
## Not tested with Binary files

HOST=gutenberg.org
DOC=/files/84/84-0.txt
## Change port (80) as needed

## open a bidirectional TCP connection to HOST, port 80, as fd 3
exec 3<>/dev/tcp/${HOST}/80
## send a minimal HTTP/1.0 request; keep the variables out of the format string
printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "$DOC" "$HOST" >&3

# drop the header from the data stream (headers end at an empty CRLF line)
while IFS= read -r line ; do
   [ "$line" = $'\r' ] && break
done <&3

# write the body to the file and to stdout
tee Frankenstein.txt <&3

Pro tip: "tar" is not POSIX either! Use "pax" instead.
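E.g., the rough pax equivalents of "tar cf" and "tar xf":

pax -wf archive.tar somedir/    # write (create) an archive
pax -rf archive.tar             # read (extract) it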


Since some machines tend to have curl and others tend to have wget pre-installed, I sometimes use this in Bash scripts:

# use curl or wget, depending on which one we find
curl_or_wget=$(if hash curl 2>/dev/null; then echo "curl -s"; elif hash wget 2>/dev/null; then echo "wget -qO-"; fi);

if [ -z "$curl_or_wget" ]; then
        echo "Neither curl nor wget found. Cannot use http method." >&2
        exit 1
fi

# note: $curl_or_wget is intentionally unquoted so it splits into the command and its flag
x=$($curl_or_wget "$url")