/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
"                                  _   _ ____  _     \n"
"  Project                     ___| | | |  _ \\| |    \n"
"                             / __| | | | |_) | |    \n"
"                            | (__| |_| |  _ <| |___ \n"
"                             \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
"     curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n"
"     HTTP or HTTPS syntax.\n"
"\n"
"SYNOPSIS\n"
"     curl [options] url\n"
"\n"
"DESCRIPTION\n"
"     curl is a client to get documents/files from servers,  using\n"
"     any  of  the supported protocols. The command is designed to\n"
"     work without user interaction or any kind of  interactivity.\n"
"\n"
"     curl  offers  a busload of useful tricks like proxy support,\n"
"     user authentication, ftp upload,  HTTP  post,  SSL  (https:)\n"
"     connections, cookies, file transfer resume and more.\n"
"\n"
"URL\n"
"     The URL syntax is protocol dependent. You'll find a detailed\n"
"     description in RFC 2396.\n"
"\n"
"     You can specify multiple URLs or parts of  URLs  by  writing\n"
"     part sets within braces as in:\n"
"\n"
"      http://site.{one,two,three}.com\n"
"\n"
"     or  you can get sequences of alphanumeric series by using []\n"
"     as in:\n"
"\n"
"      ftp://ftp.numericals.com/file[1-100].txt\n"
"      ftp://ftp.numericals.com/file[001-100].txt    (with leading\n"
"     zeros)\n"
"      ftp://ftp.letters.com/file[a-z].txt\n"
"\n"
"     It  is possible to specify up to 9 sets or series for a URL,\n"
"     but no nesting is supported at the moment:\n"
"\n"
"      http://www.any.org/archive[1996-1999]/vol­\n"
"     ume[1-4]part{a,b,c,index}.html\n"
"\n"
"OPTIONS\n"
"     -a/--append\n"
"          (FTP) When used in a ftp upload, this will tell curl to\n"
"          append to the target file instead of overwriting it. If\n"
"          the file doesn't exist, it will be created.\n"
"\n"
"     -A/--user-agent <agent string>\n"
"          (HTTP)  Specify  the  User-Agent  string to send to the\n"
"          HTTP server. Some badly done CGIs fail if its  not  set\n"
"          to \"Mozilla/4.0\".  To encode blanks in the string, sur­\n"
"          round the string with single  quote  marks.   This  can\n"
"          also be set with the -H/--header flag of course.\n"
"     -b/--cookie <name=data>\n"
"          (HTTP) Pass the data to the HTTP server as a cookie. It\n"
"          is supposedly the data  previously  received  from  the\n"
"          server  in a \"Set-Cookie:\" line.  The data should be in\n"
"          the format \"NAME1=VALUE1; NAME2=VALUE2\".\n"
"\n"
"          If no '=' letter is used in the line, it is treated  as\n"
"          a  filename  to  use  to  read previously stored cookie\n"
"          lines from, which should be used  in  this  session  if\n"
"          they  match.  Using  this  method  also  activates  the\n"
"          \"cookie parser\" which will make  curl  record  incoming\n"
"          cookies too, which may be handy if you're using this in\n"
"          combination with the  -L/--location  option.  The  file\n"
"          format of the file to read cookies from should be plain\n"
"          HTTP headers or the netscape cookie file format.\n"
"\n"
"     -B/--ftp-ascii\n"
"          (FTP/LDAP) Use ASCII transfer when getting an FTP  file\n"
"          or  LDAP  info.  For  FTP, this can also be enforced by\n"
"          using an URL that ends with \";type=A\".\n"
"\n"
"     -c/--continue\n"
"          Continue/Resume  a   previous   file   transfer.   This\n"
"          instructs  curl  to continue appending data on the file\n"
"          where it was previously left,  possibly  because  of  a\n"
"          broken  connection to the server. There must be a named\n"
"          physical file to append to for  this  to  work.   Note:\n"
"          Upload  resume  is depening on a command named SIZE not\n"
"          always present in all ftp servers! Upload resume is for\n"
"          FTP  only.   HTTP resume is only possible with HTTP/1.1\n"
"          or later servers.\n"
"\n"
"     -C/--continue-at <offset>\n"
"          Continue/Resume a previous file transfer at  the  given\n"
"          offset.  The  given offset is the exact number of bytes\n"
"          that will be skipped counted from the beginning of  the\n"
"          source file before it is transfered to the destination.\n"
"          If used with uploads, the ftp server command SIZE  will\n"
"          not  be  used  by  curl. Upload resume is for FTP only.\n"
"          HTTP resume is only possible  with  HTTP/1.1  or  later\n"
"          servers.\n"
"\n"
"     -d/--data <data>\n"
"          (HTTP)  Sends  the  specified data in a POST request to\n"
"          the HTTP server. Note that the data is sent exactly  as\n"
"          specified  with  no  extra  processing.   The  data  is\n"
"          expected to be \"url-encoded\". This will cause  curl  to\n"
"          pass  the  data  to  the  server using the content-type\n"
"          application/x-www-form-urlencoded. Compare to -F.\n"
"\n"
"          If you start the data  with  the  letter  @,  the  rest\n"
"          should  be  a  file name to read the data from, or - if\n"
"          you want curl to read the data from  stdin.   The  con­\n"
"          tents of the file must already be url-encoded.\n"
"\n"
"     -D/--dump-header <file>\n"
"          (HTTP/FTP)  Write  the HTTP headers to this file. Write\n"
"          the FTP file info to this file if -I/--head is used.\n"
"\n"
"     -e/--referer <URL>\n"
"          (HTTP) Sends the \"Referer Page\" information to the HTTP\n"
"          server. Some badly done CGIs fail if it's not set. This\n"
"          can also be set with the -H/--header flag of course.\n"
"\n"
"     -E/--cert <certificate[:password]>\n"
"          (HTTPS) Tells curl to  use  the  specified  certificate\n"
"          file  when  getting  a file with HTTPS. The certificate\n"
"          must be in PEM format.  If the optional password  isn't\n"
"          specified, it will be queried for on the terminal. Note\n"
"          that this certificate is the private key and  the  pri­\n"
"          vate certificate concatenated!\n"
"\n"
"     -f/--fail\n"
"          (HTTP)  Fail  silently  (no  output  at  all) on server\n"
"          errors. This is mostly done like this to better  enable\n"
"          scripts  etc  to  better  deal with failed attempts. In\n"
"          normal cases when a HTTP server fails to deliver a doc­\n"
"          ument,  it  returns  a  HTML document stating so (which\n"
"          often also describes why and more). This flag will pre­\n"
"          vent  curl  from  outputting  that  and  fail  silently\n"
"          instead.\n"
"\n"
"     -F/--form <name=content>\n"
"          (HTTP) This lets curl emulate a filled in form in which\n"
"          a  user has pressed the submit button. This causes curl\n"
"          to POST data using the content-type multipart/form-data\n"
"          according  to RFC1867. This enables uploading of binary\n"
"          files etc. To force the 'content' part to be read  from\n"
"          a  file,  prefix the file name with an @ sign. Example,\n"
"          to send your password file to the server, where  'pass­\n"
"          word'   is   the   name  of  the  form-field  to  which\n"
"          /etc/passwd will be the input:\n"
"\n"
"          curl -F password=@/etc/passwd www.mypasswords.com\n"
"\n"
"          To read the file's content from stdin insted of a file,\n"
"          use - where the file name should've been.\n"
"\n"
"     -h/--help\n"
"          Usage help.\n"
"\n"
"     -H/--header <header>\n"
"          (HTTP) Extra header to use when getting a web page. You\n"
"          may specify any number of extra headers. Note  that  if\n"
"          you  should  add a custom header that has the same name\n"
"          as one of the internal ones curl would use, your exter­\n"
"          nally  set  header will be used instead of the internal\n"
"          one. This allows you to make even trickier  stuff  than\n"
"          curl  would  normally do. You should not replace inter­\n"
"          nally set headers without knowing perfectly  well  what\n"
"          you're doing.\n"
"\n"
"     -i/--include\n"
"          (HTTP) Include the HTTP-header in the output. The HTTP-\n"
"          header includes things like server-name,  date  of  the\n"
"          document, HTTP-version and more...\n"
"\n"
"     -I/--head\n"
"          (HTTP/FTP)  Fetch  the  HTTP-header  only! HTTP-servers\n"
"          feature the command HEAD which this uses to get nothing\n"
"          but  the header of a document. When used on a FTP file,\n"
"          curl displays the file size only.\n"
"\n"
"     -K/--config <config file>\n"
"          Specify which config file to read curl arguments  from.\n"
"          The  config  file  is a text file in which command line\n"
"          arguments can be written which then will be used as  if\n"
"          they  were  written  on the actual command line. If the\n"
"          first column of a config line is a '#'  character,  the\n"
"          rest of the line will be treated as a comment.\n"
"\n"
"          Specify  the filename as '-' to make curl read the file\n"
"          from stdin.\n"
"\n"
"     -l/--list-only\n"
"          (FTP) When listing an FTP directory, this switch forces\n"
"          a  name-only  view.   Especially  useful if you want to\n"
"          machine-parse the contents of an  FTP  directory  since\n"
"          the  normal  directory view doesn't use a standard look\n"
"          or format.\n"
"\n"
"     -L/--location\n"
"          (HTTP/HTTPS) If the server reports that  the  requested\n"
"          page  has  a  different  location  (indicated  with the\n"
"          header line Location:) this flag will let curl  attempt\n"
"          to reattempt the get on the new place. If used together\n"
"          with -i or -I, headers from all requested pages will be\n"
"          shown.\n"
"\n"
"     -m/--max-time <seconds>\n"
"          Maximum time in seconds that you allow the whole opera­\n"
"          tion to take.  This is useful for preventing your batch\n"
"          jobs  from  hanging  for  hours due to slow networks or\n"
"          links going down.  This doesn't work properly in  win32\n"
"          systems.\n"
"     -M/--manual\n"
"          Manual. Display the huge help text.\n"
"\n"
"     -n/--netrc\n"
"          Makes  curl  scan  the  .netrc  file in the user's home\n"
"          directory for login name and password.  This  is  typi­\n"
"          cally  used  for  ftp  on unix. If used with http, curl\n"
"          will  enable  user  authentication.  See  netrc(4)  for\n"
"          details  on  the file format. Curl will not complain if\n"
"          that file hasn't the right permissions (it  should  not\n"
"          be  world nor group readable). The environment variable\n"
"          \"HOME\" is used to find the home directory.\n"
"\n"
"          A quick and very simple  example  of  how  to  setup  a\n"
"          .netrc   to   allow   curl   to   ftp  to  the  machine\n"
"          host.domain.com with user name\n"
"\n"
"          machine host.domain.com user myself password secret\n"
"\n"
"     -o/--output <file>\n"
"          Write output to <file> instead of stdout.  If  you  are\n"
"          using {} or [] to fetch multiple documents, you can use\n"
"          #<num> in the <file> specifier. That variable  will  be\n"
"          replaced  with  the  current  string  for the URL being\n"
"          fetched. Like in:\n"
"\n"
"            curl http://{one,two}.site.com -o \"file_#1.txt\"\n"
"\n"
"          or use several variables like:\n"
"\n"
"            curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n"
"\n"
"     -O/--remote-name\n"
"          Write output to a local file named like the remote file\n"
"          we get. (Only the file part of the remote file is used,\n"
"          the path is cut off.)\n"
"\n"
"     -P/--ftpport <address>\n"
"          (FTP) Reverses the initiator/listenor roles  when  con­\n"
"          necting  with  ftp. This switch makes Curl use the PORT\n"
"          command instead of PASV. In practice,  PORT  tells  the\n"
"          server to connect to the client's specified address and\n"
"          port, while PASV asks the server for an ip address  and\n"
"          port to connect to. <address> should be one of:\n"
"           interface - i.e \"eth0\" to specify which interface's IP\n"
"          address you want to use  (Unix only)\n"
"           IP address - i.e \"192.168.10.1\" to  specify  exact  IP\n"
"          number\n"
"           host name - i.e \"my.host.domain\" to specify machine\n"
"           \"-\"       - (any single-letter string) to make it pick\n"
"          the machine's default\n"
"     -q   If used as the first parameter on the command line, the\n"
"          $HOME/.curlrc  file will not be read and used as a con­\n"
"          fig file.\n"
"\n"
"     -Q/--quote <comand>\n"
"          (FTP) Send an  arbitrary  command  to  the  remote  FTP\n"
"          server,  by  using the QUOTE command of the server. Not\n"
"          all servers support this command, and the set of  QUOTE\n"
"          commands  are  server specific! Quote commands are sent\n"
"          BEFORE the transfer is taking place. To  make  commands\n"
"          take  place  after  a  successful transfer, prefix them\n"
"          with a dash '-'. You may specify any amount of commands\n"
"          to  be run before and after the transfer. If the server\n"
"          returns failure for one of  the  commands,  the  entire\n"
"          operation will be aborted.\n"
"\n"
"     -r/--range <range>\n"
"          (HTTP/FTP)  Retrieve  a byte range (i.e a partial docu­\n"
"          ment) from a HTTP/1.1 or  FTP  server.  Ranges  can  be\n"
"          specified in a number of ways.\n"
"           0-499           - specifies the first 500 bytes\n"
"           500-999         - specifies the second 500 bytes\n"
"           -500            - specifies the last 500 bytes\n"
"           9500-           - specifies the bytes from offset 9500\n"
"          and forward\n"
"           0-0,-1          - specifies the first  and  last  byte\n"
"          only(*)(H)\n"
"           500-700,600-799  -  specifies  300  bytes  from offset\n"
"          500(H)\n"
"           100-199,500-599 - specifies  two  separate  100  bytes\n"
"          ranges(*)(H)\n"
"\n"
"          (*)  =  NOTE  that  this will cause the server to reply\n"
"          with a multipart response!\n"
"\n"
"          You should also be aware that many HTTP/1.1 servers  do\n"
"          not have this feature enabled, so that when you attempt\n"
"          to get a range, you'll instead get the whole  document.\n"
"\n"
"          FTP  range  downloads  only  support  the simple syntax\n"
"          'start-stop' (optionally with one of the numbers  omit­\n"
"          ted). It depends on the non-RFC command SIZE.\n"
"\n"
"     -s/--silent\n"
"          Silent  mode.  Don't  show progress meter or error mes­\n"
"          sages.  Makes Curl mute.\n"
"\n"
"     -S/--show-error\n"
"          When used with -s it makes curl show error  message  if\n"
"          it fails.\n"
"\n"
"     -t/--upload\n"
"          Transfer  the  stdin  data  to the specified file. Curl\n"
"          will read everything from stdin  until  EOF  and  store\n"
"          with  the  supplied  name. If this is used on a http(s)\n"
"          server, the PUT command will be used.\n"
"\n"
"     -T/--upload-file <file>\n"
"          Like -t, but this transfers the specified  local  file.\n"
"          If  there  is  no  file part in the specified URL, Curl\n"
"          will append the local file name. NOTE that you must use\n"
"          a  trailing  / on the last directory to really prove to\n"
"          Curl that there is no file name or curl will think that\n"
"          your  last  directory  name  is the remote file name to\n"
"          use. That will most likely cause the  upload  operation\n"
"          to  fail.  If this is used on a http(s) server, the PUT\n"
"          command will be used.\n"
"\n"
"     -u/--user <user:password>\n"
"          Specify user and password to  use  when  fetching.  See\n"
"          README.curl  for  detailed examples of how to use this.\n"
"          If no password is  specified,  curl  will  ask  for  it\n"
"          interactively.\n"
"\n"
"     -U/--proxy-user <user:password>\n"
"          Specify  user and password to use for Proxy authentica­\n"
"          tion. If no password is specified, curl will ask for it\n"
"          interactively.\n"
"\n"
"     -v/--verbose\n"
"          Makes   the  fetching  more  verbose/talkative.  Mostly\n"
"          usable for debugging. Lines  starting  with  '>'  means\n"
"          data sent by curl, '<' means data received by curl that\n"
"          is hidden in normal cases and lines starting  with  '*'\n"
"          means additional info provided by curl.\n"
"\n"
"     -V/--version\n"
"          Displays  the  full  version of curl, libcurl and other\n"
"          3rd party libraries linked with the executable.\n"
"\n"
"     -x/--proxy <proxyhost[:port]>\n"
"          Use specified proxy. If the port number is  not  speci­\n"
"          fied, it is assumed at port 1080.\n"
"\n"
"     -X/--request <command>\n"
"          (HTTP)  Specifies a custom request to use when communi­\n"
"          cating with the HTTP  server.   The  specified  request\n"
"          will be used instead of the standard GET. Read the HTTP\n"
"          1.1 specification for details and explanations.\n"
"\n"
"          (FTP) Specifies a custom FTP command to use instead  of\n"
"          LIST when doing file lists with ftp.\n"
"\n"
"     -y/--speed-time <speed>\n"
"          Speed  Limit.  If  a download is slower than this given\n"
"          speed, in bytes per second, for Speed Time  seconds  it\n"
"          gets  aborted.  Speed  Time is set with -Y and is 30 if\n"
"          not set.\n"
"\n"
"     -Y/--speed-limit <time>\n"
"          Speed Time. If a download is slower  than  Speed  Limit\n"
"          bytes  per second during a Speed Time period, the down­\n"
"          load gets aborted. If Speed Time is used,  the  default\n"
"          Speed Limit will be 1 unless set with -y.\n"
"\n"
"     -z/--time-cond <date expression>\n"
"          (HTTP)  Request  to  get  a file that has been modified\n"
"          later than the given time and date,  or  one  that  has\n"
"          been modified before that time. The date expression can\n"
"          be all sorts of date strings or if it doesn't match any\n"
"          internal  ones,  it  tries to get the time from a given\n"
"          file name instead! See the GNU  date(1)  man  page  for\n"
"          date expression details.\n"
"\n"
"          Start  the  date  expression with a dash (-) to make it\n"
"          request for a document that is  older  than  the  given\n"
"          date/time, default is a document that is newer than the\n"
"          specified date/time.\n"
"\n"
"     -3/--sslv3\n"
"          (HTTPS) Forces curl to use SSL version 3 when negotiat­\n"
"          ing with a remote SSL server.\n"
"\n"
"     -2/--sslv2\n"
"          (HTTPS) Forces curl to use SSL version 2 when negotiat­\n"
"          ing with a remote SSL server.\n"
"\n"
"     -#/--progress-bar\n"
"          Make curl display progress information  as  a  progress\n"
"          bar instead of the default statistics.\n"
"\n"
"     --crlf\n"
"          (FTP)  Convert  LF  to  CRLF  in upload. Useful for MVS\n"
"          (OS/390).\n"
"\n"
"     --stderr <file>\n"
"          Redirect all writes to stderr  to  the  specified  file\n"
"          instead. If the file name is a plain '-', it is instead\n"
"          written to stdout. This option has no point when you're\n"
"          using a shell with decent redirecting capabilities.\n"
"\n"
"FILES\n"
"     ~/.curlrc\n"
"          Default config file.\n"
"ENVIRONMENT\n"
"     HTTP_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for HTTP.\n"
"\n"
"     HTTPS_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for HTTPS.\n"
"\n"
"     FTP_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for FTP.\n"
"\n"
"     GOPHER_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for GOPHER.\n"
"\n"
"     ALL_PROXY [protocol://]<host>[:port]\n"
"          Sets  proxy server to use if no protocol-specific proxy\n"
"          is set.\n"
"\n"
"     NO_PROXY <comma-separated list of hosts>\n"
"          list of host names that shouldn't go through any proxy.\n"
"          If set to a asterisk '*' only, it matches all hosts.\n"
"\n"
"     COLUMNS <integer>\n"
"          The  width of the terminal.  This variable only affects\n"
"          curl when the --progress-bar option is used.\n"
"\n"
"EXIT CODES\n"
"     There exists a bunch of different error codes and their cor­\n"
"     responding  error messages that may appear during bad condi­\n"
"     tions. At the time of this writing, the exit codes are:\n"
"\n"
"     1    Unsupported protocol. This build of curl has no support\n"
"          for this protocol.\n"
"\n"
"     2    Failed to initialize.\n"
"\n"
"     3    URL malformat. The syntax was not correct.\n"
"\n"
"     4    URL  user malformatted. The user-part of the URL syntax\n"
"          was not correct.\n"
"\n"
"     5    Couldn't resolve proxy. The given proxy host could  not\n"
"          be resolved.\n"
"\n"
"     6    Couldn't  resolve  host.  The given remote host was not\n"
"          resolved.\n"
"\n"
"     7    Failed to connect to host.\n"
"\n"
"     8    FTP weird server  reply.  The  server  sent  data  curl\n"
"          couldn't parse.\n"
"\n"
"     9    FTP access denied. The server denied login.\n"
"     10   FTP  user/password  incorrect.  Either one or both were\n"
"          not accepted by the server.\n"
"\n"
"     11   FTP weird PASS reply. Curl  couldn't  parse  the  reply\n"
"          sent to the PASS request.\n"
"\n"
"     12   FTP  weird  USER  reply.  Curl couldn't parse the reply\n"
"          sent to the USER request.\n"
"\n"
"     13   FTP weird PASV reply, Curl  couldn't  parse  the  reply\n"
"          sent to the PASV request.\n"
"\n"
"     14   FTP  weird 227 formay. Curl couldn't parse the 227-line\n"
"          the server sent.\n"
"\n"
"     15   FTP can't get host. Couldn't resolve the host IP we got\n"
"          in the 227-line.\n"
"\n"
"     16   FTP  can't  reconnect.  Couldn't connect to the host we\n"
"          got in the 227-line.\n"
"\n"
"     17   FTP  couldn't  set  binary.  Couldn't  change  transfer\n"
"          method to binary.\n"
"\n"
"     18   Partial file. Only a part of the file was transfered.\n"
"\n"
"     19   FTP couldn't RETR file. The RETR command failed.\n"
"\n"
"     20   FTP  write  error. The transfer was reported bad by the\n"
"          server.\n"
"\n"
"     21   FTP quote error. A quote command  returned  error  from\n"
"          the server.\n"
"\n"
"     22   HTTP  not found. The requested page was not found. This\n"
"          return code only appears if --fail is used.\n"
"\n"
"     23   Write error.  Curl  couldn't  write  data  to  a  local\n"
"          filesystem or similar.\n"
"\n"
"     24   Malformat user. User name badly specified.\n"
"\n"
"     25   FTP  couldn't  STOR  file.  The  server denied the STOR\n"
"          operation.\n"
"\n"
"     26   Read error. Various reading problems.\n"
"\n"
"     27   Out of memory. A memory allocation request failed.\n"
"\n"
"     28   Operation timeout. The specified  time-out  period  was\n"
"          reached according to the conditions.\n"
"     29   FTP  couldn't set ASCII. The server returned an unknown\n"
"          reply.\n"
"\n"
"     30   FTP PORT failed. The PORT command failed.\n"
"\n"
"     31   FTP couldn't use REST. The REST command failed.\n"
"\n"
"     32   FTP couldn't use SIZE. The  SIZE  command  failed.  The\n"
"          command  is  an  extension to the original FTP spec RFC\n"
"          959.\n"
"\n"
"     33   HTTP range error. The range \"command\" didn't work.\n"
"\n"
"     34   HTTP  post  error.  Internal  post-request   generation\n"
"          error.\n"
"\n"
"     35   SSL connect error. The SSL handshaking failed.\n"
"\n"
"     36   FTP  bad  download resume. Couldn't continue an earlier\n"
"          aborted download.\n"
"\n"
"     37   FILE couldn't read file. Failed to open the file.  Per­\n"
"          missions?\n"
"\n"
"     38   LDAP cannot bind. LDAP bind operation failed.\n"
"\n"
"     39   LDAP search failed.\n"
"\n"
"     40   Library not found. The LDAP library was not found.\n"
"\n"
"     41   Function  not  found.  A required LDAP function was not\n"
"          found.\n"
"\n"
"     XX   There will appear  more  error  codes  here  in  future\n"
"          releases.  The existing ones are meant to never change.\n"
"\n"
"BUGS\n"
"     If you do find any (or have other suggestions), mail  Daniel\n"
"     Stenberg <Daniel.Stenberg@haxx.nu>.\n"
"\n"
"AUTHORS / CONTRIBUTORS\n"
"      - Daniel Stenberg <Daniel.Stenberg@haxx.nu>\n"
"      - Rafael Sagula <sagula@inf.ufrgs.br>\n"
"      - Sampo Kellomaki <sampo@iki.fi>\n"
"      - Linas Vepstas <linas@linas.org>\n"
"      - Bjorn Reese <breese@mail1.stofanet.dk>\n"
"      - Johan Anderson <johan@homemail.com>\n"
"      - Kjell Ericson <Kjell.Ericson@sth.frontec.se>\n"
"      - Troy Engel <tengel@sonic.net>\n"
"      - Ryan Nelson <ryan@inch.com>\n"
"      - Bjorn Stenberg <Bjorn.Stenberg@sth.frontec.se>\n"
"      - Angus Mackay <amackay@gus.ml.org>\n"
"      - Eric Young <eay@cryptsoft.com>\n"
"      - Simon Dick <simond@totally.irrelevant.org>\n"
"      - Oren Tirosh <oren@monty.hishome.net>\n"
"      - Steven G. Johnson <stevenj@alum.mit.edu>\n"
"      - Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu>\n"
"      - Andrés García <ornalux@redestb.es>\n"
"      - Douglas E. Wegscheid <wegscd@whirlpool.com>\n"
"      - Mark Butler <butlerm@xmission.com>\n"
"      - Eric Thelin <eric@generation-i.com>\n"
"      - Marc Boucher <marc@mbsi.ca>\n"
"      - Greg Onufer <Greg.Onufer@Eng.Sun.COM>\n"
"      - Doug Kaufman <dkaufman@rahul.net>\n"
"      - David Eriksson <david@2good.com>\n"
"      - Ralph Beckmann <rabe@uni-paderborn.de>\n"
"      - T. Yamada <tai@imasy.or.jp>\n"
"      - Lars J. Aas <larsa@sim.no>\n"
"      - Jörn Hartroth <Joern.Hartroth@telekom.de>\n"
"      - Matthew Clarke <clamat@van.maves.ca>\n"
"      - Linus Nielsen <Linus.Nielsen@haxx.nu>\n"
"      - Felix von Leitner <felix@convergence.de>\n"
"      - Dan Zitter <dzitter@zitter.net>\n"
"      - Jongki Suwandi <Jongki.Suwandi@eng.sun.com>\n"
"      - Chris Maltby <chris@aurema.com>\n"
"\n"
"WWW\n"
"     http://curl.haxx.nu\n"
"\n"
"FTP\n"
"     ftp://ftp.sunet.se/pub/www/utilities/curl/\n"
"\n"
"SEE ALSO\n"
"     ftp(1), wget(1), snarf(1)\n"
"\n"
"LATEST VERSION\n"
"\n"
"  You always find news about what's going on as well as the latest versions\n"
"  from the curl web pages, located at:\n"
"\n"
"        http://curl.haxx.nu\n"
"\n"
"SIMPLE USAGE\n"
"\n"
"  Get the main page from netscape's web-server:\n"
"\n"
"        curl http://www.netscape.com/\n"
"\n"
"  Get the root README file from funet's ftp-server:\n"
"\n"
"        curl ftp://ftp.funet.fi/README\n"
"\n"
"  Get a gopher document from funet's gopher server:\n"
"\n"
"        curl gopher://gopher.funet.fi\n"
"\n"
"  Get a web page from a server using port 8000:\n"
"\n"
"        curl http://www.weirdserver.com:8000/\n"
"\n"
"  Get a list of the root directory of an FTP site:\n"
"\n"
"        curl ftp://ftp.fts.frontec.se/\n"
"\n"
"  Get the definition of curl from a dictionary:\n"
"\n"
"        curl dict://dict.org/m:curl\n"
"\n"
"DOWNLOAD TO A FILE\n"
"\n"
"  Get a web page and store in a local file:\n"
"\n"
"        curl -o thatpage.html http://www.netscape.com/\n"
"\n"
"  Get a web page and store in a local file, make the local file get the name\n"
"  of the remote document (if no file name part is specified in the URL, this\n"
"  will fail):\n"
"\n"
"        curl -O http://www.netscape.com/index.html\n"
"\n"
"USING PASSWORDS\n"
"\n"
" FTP\n"
"\n"
"   To ftp files using name+passwd, include them in the URL like:\n"
"\n"
"        curl ftp://name:passwd@machine.domain:port/full/path/to/file\n"
"\n"
"   or specify them with the -u flag like\n"
"\n"
"        curl -u name:passwd ftp://machine.domain:port/full/path/to/file\n"
"\n"
" HTTP\n"
"\n"
"   The HTTP URL doesn't support user and password in the URL string. Curl\n"
"   does support that anyway to provide a ftp-style interface and thus you can\n"
"   pick a file like:\n"
"\n"
"        curl http://name:passwd@machine.domain/full/path/to/file\n"
"\n"
"   or specify user and password separately like in\n"
"\n"
"        curl -u name:passwd http://machine.domain/full/path/to/file\n"
"\n"
"   NOTE! Since HTTP URLs don't support user and password, you can't use that\n"
"   style when using Curl via a proxy. You _must_ use the -u style fetch\n"
"   during such circumstances.\n"
"\n"
" HTTPS\n"
"\n"
"   Probably most commonly used with private certificates, as explained below.\n"
"\n"
" GOPHER\n"
"\n"
"   Curl features no password support for gopher.\n"
"\n"
"PROXY\n"
"\n"
" Get an ftp file using a proxy named my-proxy that uses port 888:\n"
"\n"
"        curl -x my-proxy:888 ftp://ftp.leachsite.com/README\n"
"\n"
" Get a file from a HTTP server that requires user and password, using the\n"
" same proxy as above:\n"
"\n"
"        curl -u user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
" Some proxies require special authentication. Specify by using -U as above:\n"
"\n"
"        curl -U user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
" See also the environment variables Curl support that offer further proxy\n"
" control.\n"
"\n"
"RANGES\n"
"\n"
"  With HTTP 1.1 byte-ranges were introduced. Using this, a client can request\n"
"  to get only one or more subparts of a specified document. Curl supports\n"
"  this with the -r flag.\n"
"\n"
"  Get the first 100 bytes of a document:\n"
"\n"
"        curl -r 0-99 http://www.get.this/\n"
"\n"
"  Get the last 500 bytes of a document:\n"
"\n"
"        curl -r -500 http://www.get.this/\n"
"\n"
"  Curl also supports simple ranges for FTP files as well. Then you can only\n"
"  specify start and stop position.\n"
"\n"
"  Get the first 100 bytes of a document using FTP:\n"
"\n"
"        curl -r 0-99 ftp://www.get.this/README  \n"
"\n"
"UPLOADING\n"
"\n"
" FTP\n"
"\n"
"   Upload all data on stdin to a specified ftp site:\n"
"\n"
"        curl -t ftp://ftp.upload.com/myfile\n"
"\n"
"   Upload data from a specified file, login with user and password:\n"
"\n"
"        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile\n"
"\n"
"   Upload a local file to the remote site, and use the local file name remote\n"
"   too:\n"
" \n"
"        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/\n"
"\n"
"   NOTE: Curl is not currently supporing ftp upload through a proxy! The reason\n"
"   for this is simply that proxies are seldomly configured to allow this and\n"
"   that no author has supplied code that makes it possible!\n"
"\n"
" HTTP\n"
"\n"
"   Upload all data on stdin to a specified http site:\n"
"\n"
"        curl -t http://www.upload.com/myfile\n"
"\n"
"   Note that the http server must've been configured to accept PUT before this\n"
"   can be done successfully.\n"
"\n"
"   For other ways to do http data upload, see the POST section below.\n"
"\n"
"VERBOSE / DEBUG\n"
"\n"
"  If curl fails where it isn't supposed to, if the servers don't let you\n"
"  in, if you can't understand the responses: use the -v flag to get VERBOSE\n"
"  fetching. Curl will output lots of info and all data it sends and\n"
"  receives in order to let the user see all client-server interaction.\n"
"\n"
"        curl -v ftp://ftp.upload.com/\n"
"\n"
"DETAILED INFORMATION\n"
"\n"
"  Different protocols provide different ways of getting detailed information\n"
"  about specific files/documents. To get curl to show detailed information\n"
"  about a single file, you should use -I/--head option. It displays all\n"
"  available info on a single file for HTTP and FTP. The HTTP information is a\n"
"  lot more extensive.\n"
"\n"
"  For HTTP, you can get the header information (the same as -I would show)\n"
"  shown before the data by using -i/--include. Curl understands the\n"
"  -D/--dump-header option when getting files from both FTP and HTTP, and it\n"
"  will then store the headers in the specified file.\n"
"\n"
"  Store the HTTP headers in a separate file:\n"
"\n"
"        curl --dump-header headers.txt curl.haxx.nu\n"
"\n"
"  Note that headers stored in a separate file can be very useful at a later\n"
"  time if you want curl to use cookies sent by the server. More about that in\n"
"  the cookies section.\n"
"\n"
"POST (HTTP)\n"
"\n"
"  It's easy to post data using curl. This is done using the -d <data>\n"
"  option.  The post data must be urlencoded.\n"
"\n"
"  Post a simple \"name\" and \"phone\" guestbook.\n"
"\n"
"        curl -d \"name=Rafael%20Sagula&phone=3320780\" \\\n"
"                http://www.where.com/guest.cgi\n"
"\n"
"  While -d uses the application/x-www-form-urlencoded mime-type, generally\n"
"  understood by CGI's and similar, curl also supports the more capable\n"
"  multipart/form-data type. This latter type supports things like file upload.\n"
"\n"
"  -F accepts parameters like -F \"name=contents\". If you want the contents to\n"
"  be read from a file, use <@filename> as contents. When specifying a file,\n"
"  you can also specify which content type the file is, by appending\n"
"  ';type=<mime type>' to the file name. You can also post contents of several\n"
"  files in one field. So that the field name 'coolfiles' can be sent three\n"
"  files with different content types in a manner similar to:\n"
"\n"
"        curl -F \"coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html\" \\\n"
"        http://www.post.com/postit.cgi\n"
"\n"
"  If content-type is not specified, curl will try to guess from the extension\n"
"  (it only knows a few), or use the previously specified type (from an earlier\n"
"  file if several files are specified in a list) or finally using the default\n"
"  type 'text/plain'.\n"
"\n"
"  Emulate a fill-in form with -F. Let's say you fill in three fields in a\n"
"  form. One field is a file name which to post, one field is your name and one\n"
"  field is a file description. We want to post the file we have written named\n"
"  \"cooltext.txt\". To let curl do the posting of this data instead of your\n"
"  favourite browser, you have to check out the HTML of the form page to get to\n"
"  know the names of the input fields. In our example, the input field names are\n"
"  'file', 'yourname' and 'filedescription'.\n"
"\n"
"        curl -F \"file=@cooltext.txt\" -F \"yourname=Daniel\" \\\n"
"             -F \"filedescription=Cool text file with cool text inside\" \\\n"
"             http://www.post.com/postit.cgi\n"
"\n"
"  So, to send two files in one post you can do it in two ways:\n"
"\n"
"  1. Send multiple files in a single \"field\" with a single field name:\n"
" \n"
"        curl -F \"pictures=@dog.gif,cat.gif\" \n"
" \n"
"  2. Send two fields with two field names: \n"
"\n"
"        curl -F \"docpicture=@dog.gif\" -F \"catpicture=@cat.gif\" \n"
"\n"
"REFERER\n"
"\n"
"  A HTTP request has the option to include information about which address\n"
"  that referred to actual page, and curl allows the user to specify that\n"
"  referrer to get specified on the command line. It is especially useful to\n"
"  fool or trick stupid servers or CGI scripts that rely on that information\n"
"  being available or contain certain data.\n"
"\n"
"        curl -e www.coolsite.com http://www.showme.com/\n"
"\n"
"USER AGENT\n"
"\n"
"  A HTTP request has the option to include information about the browser\n"
"  that generated the request. Curl allows it to be specified on the command\n"
"  line. It is especially useful to fool or trick stupid servers or CGI\n"
"  scripts that only accept certain browsers.\n"
"\n"
"  Example:\n"
"\n"
"  curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/\n"
"\n"
"  Other common strings:\n"
"    'Mozilla/3.0 (Win95; I)'     Netscape Version 3 for Windows 95\n"
"    'Mozilla/3.04 (Win95; U)'    Netscape Version 3 for Windows 95\n"
"    'Mozilla/2.02 (OS/2; U)'     Netscape Version 2 for OS/2\n"
"    'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)'           NS for AIX\n"
"    'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)'      NS for Linux\n"
"\n"
"  Note that Internet Explorer tries hard to be compatible in every way:\n"
"    'Mozilla/4.0 (compatible; MSIE 4.01; Windows 95)'    MSIE for W95\n"
"\n"
"  Mozilla is not the only possible User-Agent name:\n"
"    'Konqueror/1.0'             KDE File Manager desktop client\n"
"    'Lynx/2.7.1 libwww-FM/2.14' Lynx command line browser\n"
"\n"
"COOKIES\n"
"\n"
"  Cookies are generally used by web servers to keep state information at the\n"
"  client's side. The server sets cookies by sending a response line in the\n"
"  headers that looks like 'Set-Cookie: <data>' where the data part then\n"
"  typically contains a set of NAME=VALUE pairs (separated by semicolons ';'\n"
"  like \"NAME1=VALUE1; NAME2=VALUE2;\"). The server can also specify for what\n"
"  path the \"cookie\" should be used for (by specifying \"path=value\"), when the\n"
"  cookie should expire (\"expire=DATE\"), for what domain to use it\n"
"  (\"domain=NAME\") and if it should be used on secure connections only\n"
"  (\"secure\").\n"
"\n"
"  If you've received a page from a server that contains a header like:\n"
"        Set-Cookie: sessionid=boo123; path=\"/foo\";\n"
"\n"
"  it means the server wants that first pair passed on when we get anything in\n"
"  a path beginning with \"/foo\".\n"
"\n"
"  Example, get a page that wants my name passed in a cookie:\n"
"\n"
"        curl -b \"name=Daniel\" www.sillypage.com\n"
"\n"
"  Curl also has the ability to use previously received cookies in following\n"
"  sessions. If you get cookies from a server and store them in a file in a\n"
"  manner similar to:\n"
"\n"
"        curl --dump-header headers www.example.com\n"
"\n"
"  ... you can then in a second connect to that (or another) site, use the\n"
"  cookies from the 'headers' file like:\n"
"\n"
"        curl -b headers www.example.com\n"
"\n"
"  Note that by specifying -b you enable the \"cookie awareness\" and with -L\n"
"  you can make curl follow a location: (which often is used in combination\n"
"  with cookies). So that if a site sends cookies and a location, you can\n"
"  use a non-existing file to trig the cookie awareness like:\n"
"\n"
"        curl -L -b empty-file www.example.com\n"
"\n"
"  The file to read cookies from must be formatted using plain HTTP headers OR\n"
"  as netscape's cookie file. Curl will determine what kind it is based on the\n"
"  file contents.\n"
"\n"
"PROGRESS METER\n"
"\n"
"  The progress meter was introduced to better show a user that something\n"
"  actually is happening. The different fields in the output have the following\n"
"  meaning:\n"
"\n"
"   %   Received   Total  Speed   Time left  Total   Curr.Speed\n"
"  13   524140   3841536   4296   0:12:52   0:14:54    292     \n"
"\n"
"  From left-to-right:\n"
"  - The first column, is the percentage of the file currently transfered.\n"
"  - Received means the total number of bytes that has been transfered.\n"
"  - Total is the total number of bytes expected to transfer.\n"
"  - Speed is average speed in bytes per second for the whole transfer so far.\n"
"  - Time left is the estimated time left for this transfer to finnish if the\n"
"    current average speed will remain steady.\n"
"  - Total is the estimated total transfer time.\n"
"  - Curr.Speed is the average transfer speed the last 5 seconds (the first\n"
"    5 seconds of a transfer is based on less time of course.)\n"
"\n"
"  NOTE: Much of the output is based on the fact that the size of the transfer\n"
"  is known before it takes place. If it isn't, a much less fancy display will\n"
"  be used.\n"
"\n"
"SPEED LIMIT\n"
"\n"
"  Curl offers the user to set conditions regarding transfer speed that must\n"
"  be met to let the transfer keep going. By using the switch -y and -Y you\n"
"  can make curl abort transfers if the transfer speed doesn't exceed your\n"
"  given lowest limit for a specified time.\n"
"\n"
"  To let curl abandon downloading this page if its slower than 3000 bytes per\n"
"  second for 1 minute, run:\n"
"\n"
"        curl -y 3000 -Y 60 www.far-away-site.com\n"
"\n"
"  This can very well be used in combination with the overall time limit, so\n"
"  that the above operatioin must be completed in whole within 30 minutes:\n"
"\n"
"        curl -m 1800 -y 3000 -Y 60 www.far-away-site.com\n"
"\n"
"CONFIG FILE\n"
"\n"
"  Curl automatically tries to read the .curlrc file (or _curlrc file on win32\n"
"  systems) from the user's home dir on startup. The config file should be\n"
"  made up with normal command line switches. Comments can be used within the\n"
"  file. If the first letter on a line is a '#'-letter the rest of the line\n"
"  is treated as a comment.\n"
"\n"
"  Example, set default time out and proxy in a config file:\n"
"\n"
"        # We want a 30 minute timeout:\n"
"        -m 1800\n"
"        # ... and we use a proxy for all accesses:\n"
"        -x proxy.our.domain.com:8080\n"
"\n"
"  White spaces ARE significant at the end of lines, but all white spaces\n"
"  leading up to the first characters of each line are ignored.\n"
"\n"
"  Prevent curl from reading the default file by using -q as the first command\n"
"  line parameter, like:\n"
"\n"
"        curl -q www.thatsite.com\n"
"\n"
"  Force curl to get and display a local help page in case it is invoked\n"
"  without URL by making a config file similar to:\n"
"\n"
"        # default url to get\n"
"        http://help.with.curl.com/curlhelp.html\n"
"\n"
"  You can specify another config file to be read by using the -K/--config\n"
"  flag. If you set config file name to \"-\" it'll read the config from stdin,\n"
"  which can be handy if you want to hide options from being visible in process\n"
"  tables etc:\n"
"\n"
"        echo \"-u user:passwd\" | curl -K - http://that.secret.site.com\n"
"\n"
"EXTRA HEADERS\n"
"\n"