hugehelp.c 57.3 KiB
Daniel Stenberg committed
/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
"                                  _   _ ____  _     \n"
"  Project                     ___| | | |  _ \\| |    \n"
"                             / __| | | | |_) | |    \n"
"                            | (__| |_| |  _ <| |___ \n"
"                             \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
"     curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n"
"     HTTP or HTTPS syntax.\n"
"\n"
"SYNOPSIS\n"
"     curl [options] url\n"
"\n"
"DESCRIPTION\n"
"     curl is a client to get documents/files from servers,  using\n"
"     any  of  the supported protocols. The command is designed to\n"
"     work without user interaction or any kind of  interactivity.\n"
"\n"
"     curl  offers  a busload of useful tricks like proxy support,\n"
"     user authentication, ftp upload,  HTTP  post,  SSL  (https:)\n"
"     connections, cookies, file transfer resume and more.\n"
"\n"
"URL\n"
"     The URL syntax is protocol dependent. You'll find a detailed\n"
"     description in RFC 2396.\n"
"\n"
"     You can specify multiple URLs or parts of  URLs  by  writing\n"
"     part sets within braces as in:\n"
"\n"
"      http://site.{one,two,three}.com\n"
"\n"
"     or  you can get sequences of alphanumeric series by using []\n"
"     as in:\n"
"\n"
"      ftp://ftp.numericals.com/file[1-100].txt\n"
"      ftp://ftp.numericals.com/file[001-100].txt    (with leading\n"
"     zeros)\n"
"      ftp://ftp.letters.com/file[a-z].txt\n"
"\n"
"     It  is possible to specify up to 9 sets or series for a URL,\n"
"     but no nesting is supported at the moment:\n"
"\n"
"      http://www.any.org/archive[1996-1999]/volume[1-4]part{a,b,c,index}.html\n"
"\n"
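The brace and bracket expansion described above can be modeled in a few lines. The following is a minimal Python sketch of those globbing rules (set lists in {}, numeric series with optional leading zeros and letter series in []), written for illustration only; it is not curl's implementation and does not enforce the 9-set limit or reject nesting:

```python
import itertools
import re
import string

def expand_globs(url):
    """Expand {one,two} sets and [1-100] / [a-z] series in a URL."""
    # Split into literal text and glob tokens, keeping the tokens.
    tokens = re.split(r"(\{[^{}]*\}|\[[^][]+-[^][]+\])", url)
    choices = []
    for tok in tokens:
        if tok.startswith("{"):                    # {one,two,three}
            choices.append(tok[1:-1].split(","))
        elif tok.startswith("["):                  # [1-100], [001-100], [a-z]
            lo, hi = tok[1:-1].split("-", 1)
            if lo.isdigit():
                width = len(lo) if lo.startswith("0") else 0
                choices.append([str(n).zfill(width)
                                for n in range(int(lo), int(hi) + 1)])
            else:
                choices.append([c for c in string.ascii_lowercase
                                if lo <= c <= hi])
        else:                                      # plain text
            choices.append([tok])
    # Every set/series multiplies the list of URLs (cartesian product).
    return ["".join(parts) for parts in itertools.product(*choices)]
```

Each {} or [] contributes one axis to the product, which is how a single command line can expand into hundreds of URLs.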
"OPTIONS\n"
"     -a/--append\n"
"          (FTP) When used in a ftp upload, this will tell curl to\n"
"          append to the target file instead of overwriting it. If\n"
"          the file doesn't exist, it will be created.\n"
"\n"
"     -A/--user-agent <agent string>\n"
"          (HTTP)  Specify  the  User-Agent  string to send to the\n"
"          HTTP server. Some badly done CGIs fail if it's not set\n"
"          to \"Mozilla/4.0\".  To encode blanks in the string, sur-\n"
"          round the string with single  quote  marks.   This  can\n"
"          also be set with the -H/--header flag of course.\n"
"\n"
"     -b/--cookie <name=data>\n"
"          (HTTP) Pass the data to the HTTP server as a cookie. It\n"
"          is supposedly the data  previously  received  from  the\n"
"          server  in a \"Set-Cookie:\" line.  The data should be in\n"
"          the format \"NAME1=VALUE1; NAME2=VALUE2\".\n"
"\n"
"          If no '=' letter is used in the line, it is treated  as\n"
"          a  filename  to  use  to  read previously stored cookie\n"
"          lines from, which should be used  in  this  session  if\n"
"          they  match.  Using  this  method  also  activates  the\n"
"          \"cookie parser\" which will make  curl  record  incoming\n"
"          cookies too, which may be handy if you're using this in\n"
"          combination with the  -L/--location  option.  The  file\n"
"          format of the file to read cookies from should be plain\n"
"          HTTP headers or the netscape cookie file format.\n"
"\n"
"          NOTE that the file specified with -b/--cookie  is  only\n"
"          used  as  input. No cookies will be stored in the file.\n"
"          To store cookies, save the HTTP headers to a file using\n"
"          -D/--dump-header!\n"
"\n"
"     -B/--ftp-ascii\n"
"          (FTP/LDAP)  Use ASCII transfer when getting an FTP file\n"
"          or LDAP info. For FTP, this can  also  be  enforced  by\n"
"          using a URL that ends with \";type=A\".\n"
"\n"
"     -c/--continue\n"
"          Continue/Resume   a   previous   file   transfer.  This\n"
"          instructs curl to continue appending data on  the  file\n"
"          where  it  was  previously  left, possibly because of a\n"
"          broken connection to the server. There must be a  named\n"
"          physical  file  to  append  to for this to work.  Note:\n"
"          Upload resume depends on a command named SIZE that  is\n"
"          not always present in ftp servers! Upload resume is for\n"
"          FTP only.  HTTP resume is only possible  with  HTTP/1.1\n"
"          or later servers.\n"
"\n"
"     -C/--continue-at <offset>\n"
"          Continue/Resume  a  previous file transfer at the given\n"
"          offset. The given offset is the exact number  of  bytes\n"
"          that  will be skipped counted from the beginning of the\n"
"          source file before it is transferred to the destination.\n"
"          If  used with uploads, the ftp server command SIZE will\n"
"          not be used by curl. Upload resume  is  for  FTP  only.\n"
"          HTTP  resume  is  only  possible with HTTP/1.1 or later\n"
"          servers.\n"
"\n"
"     -d/--data <data>\n"
"          (HTTP) Sends the specified data in a  POST  request  to\n"
"          the  HTTP server. Note that the data is sent exactly as\n"
"          specified  with  no  extra  processing.   The  data  is\n"
"          expected  to  be \"url-encoded\". This will cause curl to\n"
"          pass the data to  the  server  using  the  content-type\n"
"          application/x-www-form-urlencoded. Compare to -F.\n"
"\n"
"          If  you  start  the  data  with  the letter @, the rest\n"
"          should be a file name to read the data from,  or  -  if\n"
"          you  want  curl  to read the data from stdin.  The con-\n"
"          tents of the file must already be url-encoded.\n"
"\n"
"     -D/--dump-header <file>\n"
"          (HTTP/FTP) Write the HTTP headers to this  file.  Write\n"
"          the FTP file info to this file if -I/--head is used.\n"
"\n"
"          This  option is handy to use when you want to store the\n"
"          cookies that a HTTP site  sends  to  you.  The  cookies\n"
"          could then be read in a second curl invocation using the\n"
"          -b/--cookie option!\n"
"\n"
"     -e/--referer <URL>\n"
"          (HTTP) Sends the \"Referer Page\" information to the HTTP\n"
"          server. Some badly done CGIs fail if it's not set. This\n"
"          can also be set with the -H/--header flag of course.\n"
"\n"
"     -E/--cert <certificate[:password]>\n"
"          (HTTPS) Tells curl to  use  the  specified  certificate\n"
"          file  when  getting  a file with HTTPS. The certificate\n"
"          must be in PEM format.  If the optional password  isn't\n"
"          specified, it will be queried for on the terminal. Note\n"
"          that this certificate is the private key and  the  pri-\n"
"          vate certificate concatenated!\n"
"\n"
"     -f/--fail\n"
"          (HTTP)  Fail  silently  (no  output  at  all) on server\n"
"          errors. This is mostly done like this to better  enable\n"
"          scripts  etc  to  better  deal with failed attempts. In\n"
"          normal cases when a HTTP server fails to deliver a doc-\n"
"          ument,  it  returns  a  HTML document stating so (which\n"
"          often also describes why and more). This flag will pre-\n"
"          vent  curl  from  outputting  that  and  fail  silently\n"
"          instead.\n"
"\n"
"     -F/--form <name=content>\n"
"          (HTTP) This lets curl emulate a filled in form in which\n"
"          a  user has pressed the submit button. This causes curl\n"
"          to POST data using the content-type multipart/form-data\n"
"          according  to RFC1867. This enables uploading of binary\n"
"          files etc. To force the 'content' part to be read  from\n"
"          a  file,  prefix the file name with an @ sign. Example,\n"
"          to send your password file to the server, where  'pass-\n"
"          word'   is   the   name  of  the  form-field  to  which\n"
"          /etc/passwd will be the input:\n"
"\n"
"          curl -F password=@/etc/passwd www.mypasswords.com\n"
"\n"
"          To read the file's content from stdin instead of a file,\n"
"          use - where the file name should've been.\n"
"\n"
"     -h/--help\n"
"          Usage help.\n"
"\n"
"     -H/--header <header>\n"
"          (HTTP) Extra header to use when getting a web page. You\n"
"          may specify any number of extra headers. Note  that  if\n"
"          you  should  add a custom header that has the same name\n"
"          as one of the internal ones curl would use, your exter-\n"
"          nally  set  header will be used instead of the internal\n"
"          one. This allows you to make even trickier  stuff  than\n"
"          curl  would  normally do. You should not replace inter-\n"
"          nally set headers without knowing perfectly  well  what\n"
"          you're doing.\n"
"\n"
"     -i/--include\n"
"          (HTTP) Include the HTTP-header in the output. The HTTP-\n"
"          header includes things like server-name,  date  of  the\n"
"          document, HTTP-version and more...\n"
"\n"
"     -I/--head\n"
"          (HTTP/FTP)  Fetch  the  HTTP-header  only! HTTP-servers\n"
"          feature the command HEAD which this uses to get nothing\n"
"          but  the header of a document. When used on a FTP file,\n"
"          curl displays the file size only.\n"
"\n"
"     -K/--config <config file>\n"
"          Specify which config file to read curl arguments  from.\n"
"          The  config  file  is a text file in which command line\n"
"          arguments can be written which then will be used as  if\n"
"          they  were  written  on the actual command line. If the\n"
"          first column of a config line is a '#'  character,  the\n"
"          rest of the line will be treated as a comment.\n"
"\n"
"          Specify  the filename as '-' to make curl read the file\n"
"          from stdin.\n"
"\n"
"     -l/--list-only\n"
"          (FTP) When listing an FTP directory, this switch forces\n"
"          a  name-only  view.   Especially  useful if you want to\n"
"          machine-parse the contents of an  FTP  directory  since\n"
"          the  normal  directory view doesn't use a standard look\n"
"          or format.\n"
"\n"
"     -L/--location\n"
"          (HTTP/HTTPS) If the server reports that  the  requested\n"
"          page  has  a  different  location  (indicated  with the\n"
"          header line Location:) this flag will make curl attempt\n"
"          the request again on the new location. If used together\n"
"          with -i or -I, headers from all requested pages will be\n"
"          shown.\n"
"\n"
"     -m/--max-time <seconds>\n"
"          Maximum time in seconds that you allow the whole opera-\n"
"          tion to take.  This is useful for preventing your batch\n"
"          jobs  from  hanging  for  hours due to slow networks or\n"
"          links going down.  This doesn't work properly in  win32\n"
"          systems.\n"
"\n"
"     -M/--manual\n"
"          Manual. Display the huge help text.\n"
"\n"
"     -n/--netrc\n"
"          Makes  curl  scan  the  .netrc  file in the user's home\n"
"          directory for login name and password.  This  is  typi-\n"
"          cally  used  for  ftp  on unix. If used with http, curl\n"
"          will  enable  user  authentication.  See  netrc(5)  for\n"
"          details  on  the file format. Curl will not complain if\n"
"          that file doesn't have the right permissions (it should not\n"
"          be  world nor group readable). The environment variable\n"
"          \"HOME\" is used to find the home directory.\n"
"\n"
"          A quick and very simple example of how to set up a\n"
"          .netrc to allow curl to ftp to the machine\n"
"          host.domain.com with user name 'myself' and password\n"
"          'secret':\n"
"\n"
"          machine host.domain.com login myself password secret\n"
"\n"
"     -N/--no-buffer\n"
"          Disables the buffering of the output stream. In  normal\n"
"          work situations, curl will use a standard buffered out-\n"
"          put stream that will have the effect that it will  out-\n"
"          put  the  data  in chunks, not necessarily exactly when\n"
"          the data arrives.  Using this option will disable  that\n"
"          buffering.\n"
"\n"
"     -o/--output <file>\n"
"          Write  output  to  <file> instead of stdout. If you are\n"
"          using {} or [] to fetch multiple documents, you can use\n"
"          '#'  followed by a number in the <file> specifier. That\n"
"          variable will be replaced with the current  string  for\n"
"          the URL being fetched. Like in:\n"
"\n"
"            curl http://{one,two}.site.com -o \"file_#1.txt\"\n"
"\n"
"          or use several variables like:\n"
"\n"
"            curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n"
"\n"
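The '#' substitution in -o can be sketched the same way. `substitute_output_name` below is a hypothetical helper, not curl code; `matches` stands for the string each {} or [] set matched while building the current URL:

```python
import re

def substitute_output_name(template, matches):
    """Replace #1, #2, ... in an -o template with the glob matches.

    Illustrative sketch only: matches[0] is what set #1 matched, etc.
    """
    return re.sub(r"#(\d)", lambda m: matches[int(m.group(1)) - 1], template)
```

So when fetching http://one.site.com from the first example, matches would be ["one"] and the output file becomes file_one.txt.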
"     -O/--remote-name\n"
"          Write output to a local file named like the remote file\n"
"          we get. (Only the file part of the remote file is used,\n"
"          the path is cut off.)\n"
"\n"
"     -P/--ftpport <address>\n"
"          (FTP)  Reverses  the initiator/listener roles when con-\n"
"          necting with ftp. This switch makes Curl use  the  PORT\n"
"          command  instead  of  PASV. In practice, PORT tells the\n"
"          server to connect to the client's specified address and\n"
"          port,  while PASV asks the server for an IP address and\n"
"          port to connect to. <address> should be one of:\n"
"\n"
"          interface   i.e \"eth0\" to specify which interface's  IP\n"
"                      address you want to use  (Unix only)\n"
"\n"
"          IP address  i.e \"192.168.10.1\" to specify exact IP num­\n"
"                      ber\n"
"\n"
"          host name   i.e \"my.host.domain\" to specify machine\n"
"\n"
"          -           (any single-letter string) to make it  pick\n"
"                      the machine's default\n"
"\n"
"     -q   If used as the first parameter on the command line, the\n"
"          $HOME/.curlrc file will not be read and used as a  con-\n"
"          fig file.\n"
"\n"
"     -Q/--quote <command>\n"
"          (FTP)  Send  an  arbitrary  command  to  the remote FTP\n"
"          server, by using the QUOTE command of the  server.  Not\n"
"          all  servers support this command, and the set of QUOTE\n"
"          commands are server specific! Quote commands  are  sent\n"
"          BEFORE  the  transfer is taking place. To make commands\n"
"          take place after a  successful  transfer,  prefix  them\n"
"          with a dash '-'. You may specify any number of commands\n"
"          to be run before and after the transfer. If the  server\n"
"          returns  failure  for  one  of the commands, the entire\n"
"          operation will be aborted.\n"
"\n"
"     -r/--range <range>\n"
"          (HTTP/FTP) Retrieve a byte range (i.e a  partial  docu-\n"
"          ment)  from  a  HTTP/1.1  or  FTP server. Ranges can be\n"
"          specified in a number of ways.\n"
"\n"
"          0-499     specifies the first 500 bytes\n"
"\n"
"          500-999   specifies the second 500 bytes\n"
"\n"
"          -500      specifies the last 500 bytes\n"
"\n"
"          9500-     specifies the bytes from offset 9500 and for-\n"
"                    ward\n"
"\n"
"          0-0,-1    specifies the first and last byte only(*)(H)\n"
"\n"
"          500-700,600-799\n"
"                    specifies 300 bytes from offset 500(H)\n"
"\n"
"          100-199,500-599\n"
"                    specifies two separate 100 bytes ranges(*)(H)\n"
"\n"
"     (*) = NOTE that this will cause the server to reply  with  a\n"
"     multipart response!\n"
"\n"
"     You  should  also be aware that many HTTP/1.1 servers do not\n"
"     have this feature enabled, so that when you attempt to get a\n"
"     range, you'll instead get the whole document.\n"
"\n"
"     FTP  range  downloads only support the simple syntax 'start-\n"
"     stop' (optionally with  one  of  the  numbers  omitted).  It\n"
"     depends on the non-RFC command SIZE.\n"
"\n"
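For HTTP, the -r value travels in an HTTP/1.1 `Range: bytes=...` request header. Below is a hedged sketch that validates the syntaxes listed above ('start-end', '-suffix', 'start-', and comma-separated lists of them) and forms that header; real curl does more, and FTP ranges go through the SIZE-dependent path described above instead:

```python
import re

def range_header(spec):
    """Build an HTTP/1.1 Range header from a curl -r/--range spec."""
    for part in spec.split(","):
        # Accept '0-499', '-500' (last 500 bytes) and '9500-' (from 9500 on).
        if not re.fullmatch(r"\d*-\d+|\d+-\d*", part):
            raise ValueError("bad range piece: %r" % part)
    return "Range: bytes=" + spec
```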
"     -s/--silent\n"
"          Silent  mode.  Don't  show progress meter or error mes-\n"
"          sages.  Makes Curl mute.\n"
"\n"
"     -S/--show-error\n"
"          When used with -s it makes curl show error  message  if\n"
"          it fails.\n"
"\n"
"     -t/--upload\n"
"          Transfer  the  stdin  data  to the specified file. Curl\n"
"          will read everything from stdin  until  EOF  and  store\n"
"          with  the  supplied  name. If this is used on a http(s)\n"
"          server, the PUT command will be used.\n"
"\n"
"     -T/--upload-file <file>\n"
"          Like -t, but this transfers the specified  local  file.\n"
"          If  there  is  no  file part in the specified URL, Curl\n"
"          will append the local file name. NOTE that you must use\n"
"          a  trailing  / on the last directory to really prove to\n"
"          Curl that there is no file name or curl will think that\n"
"          your  last  directory  name  is the remote file name to\n"
"          use. That will most likely cause the  upload  operation\n"
"          to  fail.  If this is used on a http(s) server, the PUT\n"
"          command will be used.\n"
"\n"
"     -u/--user <user:password>\n"
"          Specify user and password to  use  when  fetching.  See\n"
"          README.curl  for  detailed examples of how to use this.\n"
"          If no password is  specified,  curl  will  ask  for  it\n"
"          interactively.\n"
"\n"
"     -U/--proxy-user <user:password>\n"
"          Specify  user and password to use for Proxy authentica-\n"
"          tion. If no password is specified, curl will ask for it\n"
"          interactively.\n"
"\n"
"     -v/--verbose\n"
"          Makes   the  fetching  more  verbose/talkative.  Mostly\n"
"          usable for debugging. Lines  starting  with  '>'  mean\n"
"          data sent by curl, '<' means data received by curl that\n"
"          is hidden in normal cases, and lines starting  with  '*'\n"
"          mean additional info provided by curl.\n"
"\n"
"     -V/--version\n"
"          Displays  the  full  version of curl, libcurl and other\n"
"          3rd party libraries linked with the executable.\n"
"\n"
"     -w/--write-out <format>\n"
"          Defines what to display after a completed and  success­\n"
"          ful  operation. The format is a string that may contain\n"
"          plain text mixed with  any  number  of  variables.  The\n"
"          string can be specified as \"string\"; to have it read\n"
"          from a particular file you specify \"@filename\", and to\n"
"          tell curl to read the format from stdin you write \"@-\".\n"
"\n"
"          The variables present in the output format will be sub-\n"
"          stituted by the value or text that curl thinks fit,  as\n"
"          described  below.  All  variables  are  specified  like\n"
"          %{variable_name} and to output  a  normal  %  you  just\n"
"          write  it  as  %%. You can output a newline by  using\n"
"          \\n, a carriage return with \\r and a tab space with \\t.\n"
"\n"
"          NOTE:  The  %-letter  is  a  special  letter   in   the\n"
"          win32-environment,  where  all occurrences of % must be\n"
"          doubled when using this option.\n"
"\n"
"          Available variables are at this point:\n"
"\n"
"          url_effective  The URL that was fetched last.  This  is\n"
"                         mostly meaningful if you've told curl to\n"
"                         follow location: headers.\n"
"\n"
"          http_code      The numerical code that was found in the\n"
"                         last retrieved HTTP(S) page.\n"
"\n"
"          time_total     The  total  time,  in  seconds, that the\n"
"                         full operation lasted. The time will  be\n"
"                         displayed with millisecond resolution.\n"
"\n"
"          time_namelookup\n"
"                         The  time,  in seconds, it took from the\n"
"                         start until the name resolving was  com-\n"
"                         pleted.\n"
"\n"
"          time_connect   The  time,  in seconds, it took from the\n"
"                         start until the connect  to  the  remote\n"
"                         host (or proxy) was completed.\n"
"\n"
"          time_pretransfer\n"
"                         The  time,  in seconds, it took from the\n"
"                         start until the file  transfer  is  just\n"
"                         about  to  begin. This includes all pre-\n"
"                         transfer commands and negotiations  that\n"
"                         are  specific  to  the particular proto-\n"
"                         col(s) involved.\n"
"\n"
"          size_download  The total  amount  of  bytes  that  were\n"
"                         downloaded.\n"
"\n"
"          size_upload    The  total  amount  of  bytes  that were\n"
"                         uploaded.\n"
"\n"
"          speed_download The average  download  speed  that  curl\n"
"                         measured for the complete download.\n"
"\n"
"          speed_upload   The  average upload speed that curl mea-\n"
"                         sured for the complete upload.\n"
"\n"
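The substitution rules just described can be sketched as follows; this is an assumed model of %{name}, %% and the escape handling for illustration, not curl's source:

```python
def write_out(fmt, values):
    """Expand a -w/--write-out format: %{name} becomes the matching
    value, %% a literal percent, and the backslash escapes n, r, t
    their control characters. Simplified sketch for illustration."""
    for esc, ch in (("\\n", "\n"), ("\\r", "\r"), ("\\t", "\t")):
        fmt = fmt.replace(esc, ch)
    out, i = [], 0
    while i < len(fmt):
        if fmt.startswith("%%", i):          # %% -> literal %
            out.append("%")
            i += 2
        elif fmt.startswith("%{", i):        # %{variable_name}
            end = fmt.index("}", i)
            out.append(str(values[fmt[i + 2:end]]))
            i = end + 1
        else:
            out.append(fmt[i])
            i += 1
    return "".join(out)
```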
"     -x/--proxy <proxyhost[:port]>\n"
"          Use the specified proxy. If the port number is not speci-\n"
"          fied, port 1080 is assumed.\n"
"\n"
"     -X/--request <command>\n"
"          (HTTP)  Specifies a custom request to use when communi-\n"
"          cating with the HTTP  server.   The  specified  request\n"
"          will be used instead of the standard GET. Read the HTTP\n"
"          1.1 specification for details and explanations.\n"
"\n"
"          (FTP) Specifies a custom FTP command to use instead  of\n"
"          LIST when doing file lists with ftp.\n"
"\n"
"     -y/--speed-time <time>\n"
"          If a download is slower than speed-limit bytes per sec-\n"
"          ond during  a  speed-time  period,  the  download  gets\n"
"          aborted. If speed-time is used, the default speed-limit\n"
"          will be 1 unless set with -Y.\n"
"\n"
"     -Y/--speed-limit <speed>\n"
"          If a download is slower than this given speed, in bytes\n"
"          per  second,  for  speed-time  seconds it gets aborted.\n"
"          speed-time is set with -y and is 30 if not set.\n"
"\n"
"     -z/--time-cond <date expression>\n"
"          (HTTP) Request to get a file  that  has  been  modified\n"
"          later  than  the  given  time and date, or one that has\n"
"          been modified before that time. The date expression can\n"
"          be all sorts of date strings or if it doesn't match any\n"
"          internal ones, it tries to get the time  from  a  given\n"
"          file  name  instead!  See  the GNU date(1) man page for\n"
"          date expression details.\n"
"\n"
"          Start the date expression with a dash (-)  to  make  it\n"
"          request  for  a  document  that is older than the given\n"
"          date/time, default is a document that is newer than the\n"
"          specified date/time.\n"
"\n"
"     -3/--sslv3\n"
"          (HTTPS) Forces curl to use SSL version 3 when negotiat-\n"
"          ing with a remote SSL server.\n"
"\n"
"     -2/--sslv2\n"
"          (HTTPS) Forces curl to use SSL version 2 when negotiat-\n"
"          ing with a remote SSL server.\n"
"\n"
"     -#/--progress-bar\n"
"          Make  curl  display  progress information as a progress\n"
"          bar instead of the default statistics.\n"
"\n"
"     --crlf\n"
"          (FTP) Convert LF to CRLF  in  upload.  Useful  for  MVS\n"
"          (OS/390).\n"
"\n"
"     --stderr <file>\n"
"          Redirect  all  writes  to  stderr to the specified file\n"
"          instead. If the file name is a plain '-', it is instead\n"
"          written to stdout. This option has no point when you're\n"
"          using a shell with decent redirecting capabilities.\n"
"\n"
"FILES\n"
"     ~/.curlrc\n"
"          Default config file.\n"
"\n"
"ENVIRONMENT\n"
"     HTTP_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for HTTP.\n"
"\n"
"     HTTPS_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for HTTPS.\n"
"\n"
"     FTP_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for FTP.\n"
"\n"
"     GOPHER_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use for GOPHER.\n"
"\n"
"     ALL_PROXY [protocol://]<host>[:port]\n"
"          Sets proxy server to use if no protocol-specific  proxy\n"
"          is set.\n"
"\n"
"     NO_PROXY <comma-separated list of hosts>\n"
"          A list of host names that shouldn't go through any proxy.\n"
"          If set to an asterisk '*' only, it matches all hosts.\n"
"\n"
"     COLUMNS <integer>\n"
"          The width of the terminal.  This variable only  affects\n"
"          curl when the --progress-bar option is used.\n"
"\n"
"EXIT CODES\n"
"     There exists a bunch of different error codes and their cor-\n"
"     responding error messages that may appear during bad  condi-\n"
"     tions. At the time of this writing, the exit codes are:\n"
"\n"
"     1    Unsupported protocol. This build of curl has no support\n"
"          for this protocol.\n"
"\n"
"     2    Failed to initialize.\n"
"\n"
"     3    URL malformat. The syntax was not correct.\n"
"\n"
"     4    URL user malformatted. The user-part of the URL  syntax\n"
"          was not correct.\n"
"\n"
"     5    Couldn't  resolve proxy. The given proxy host could not\n"
"          be resolved.\n"
"\n"
"     6    Couldn't resolve host. The given remote  host  was  not\n"
"          resolved.\n"
"\n"
"     7    Failed to connect to host.\n"
"\n"
"     8    FTP  weird  server  reply.  The  server  sent data curl\n"
"          couldn't parse.\n"
"\n"
"     9    FTP access denied. The server denied login.\n"
"\n"
"     10   FTP user/password incorrect. Either one  or  both  were\n"
"          not accepted by the server.\n"
"\n"
"     11   FTP  weird  PASS  reply.  Curl couldn't parse the reply\n"
"          sent to the PASS request.\n"
"\n"
"     12   FTP weird USER reply. Curl  couldn't  parse  the  reply\n"
"          sent to the USER request.\n"
"\n"
"     13   FTP  weird  PASV  reply.  Curl couldn't parse the reply\n"
"          sent to the PASV request.\n"
"\n"
"     14   FTP weird 227 format. Curl couldn't parse the  227-line\n"
"          the server sent.\n"
"\n"
"     15   FTP can't get host. Couldn't resolve the host IP we got\n"
"          in the 227-line.\n"
"\n"
"     16   FTP can't reconnect. Couldn't connect to  the  host  we\n"
"          got in the 227-line.\n"
"\n"
"     17   FTP  couldn't  set  binary.  Couldn't  change  transfer\n"
"          method to binary.\n"
"\n"
"     18   Partial file. Only a part of the file was transferred.\n"
"\n"
"     19   FTP couldn't RETR file. The RETR command failed.\n"
"\n"
"     20   FTP write error. The transfer was reported bad  by  the\n"
"          server.\n"
"\n"
"     21   FTP  quote  error.  A quote command returned an error\n"
"          from the server.\n"
"\n"
"     22   HTTP not found. The requested page was not found.  This\n"
"          return code only appears if --fail is used.\n"
"\n"
"     23   Write  error.  Curl  couldn't  write  data  to  a local\n"
"          filesystem or similar.\n"
"\n"
"     24   Malformat user. User name badly specified.\n"
"\n"
"     25   FTP couldn't STOR file.  The  server  denied  the  STOR\n"
"          operation.\n"
"\n"
"     26   Read error. Various reading problems.\n"
"\n"
"     27   Out of memory. A memory allocation request failed.\n"
"\n"
"     28   Operation  timeout.  The  specified time-out period was\n"
"          reached according to the conditions.\n"
"\n"
"     29   FTP couldn't set ASCII. The server returned an  unknown\n"
"          reply.\n"
"\n"
"     30   FTP PORT failed. The PORT command failed.\n"
"\n"
"     31   FTP couldn't use REST. The REST command failed.\n"
"\n"
"     32   FTP  couldn't  use  SIZE.  The SIZE command failed. The\n"
"          command is an extension to the original  FTP  spec  RFC\n"
"          959.\n"
"\n"
"     33   HTTP range error. The range \"command\" didn't work.\n"
"\n"
"     34   HTTP   post  error.  Internal  post-request  generation\n"
"          error.\n"
"\n"
"     35   SSL connect error. The SSL handshaking failed.\n"
"\n"
"     36   FTP bad download resume. Couldn't continue  an  earlier\n"
"          aborted download.\n"
"\n"
"     37   FILE couldn't read file. Failed to open the file.\n"
"          Permissions?\n"
"\n"
"     38   LDAP cannot bind. LDAP bind operation failed.\n"
"\n"
"     39   LDAP search failed.\n"
"\n"
"     40   Library not found. The LDAP library was not found.\n"
"\n"
"     41   Function not found. A required LDAP  function  was  not\n"
"          found.\n"
"\n"
"     XX   More error codes will appear here in  future  releases.\n"
"          The existing ones are meant to never change.\n"
"\n"
"BUGS\n"
"     If  you do find any (or have other suggestions), mail Daniel\n"
"     Stenberg <Daniel.Stenberg@haxx.nu>.\n"
"\n"
"AUTHORS / CONTRIBUTORS\n"
"      - Daniel Stenberg <Daniel.Stenberg@haxx.nu>\n"
"      - Rafael Sagula <sagula@inf.ufrgs.br>\n"
"      - Sampo Kellomaki <sampo@iki.fi>\n"
"      - Linas Vepstas <linas@linas.org>\n"
"      - Bjorn Reese <breese@mail1.stofanet.dk>\n"
"      - Johan Anderson <johan@homemail.com>\n"
"      - Kjell Ericson <Kjell.Ericson@haxx.nu>\n"
"      - Troy Engel <tengel@sonic.net>\n"
"      - Ryan Nelson <ryan@inch.com>\n"
"      - Bjorn Stenberg <Bjorn.Stenberg@haxx.nu>\n"
"      - Angus Mackay <amackay@gus.ml.org>\n"
"      - Eric Young <eay@cryptsoft.com>\n"
"      - Simon Dick <simond@totally.irrelevant.org>\n"
"      - Oren Tirosh <oren@monty.hishome.net>\n"
"      - Steven G. Johnson <stevenj@alum.mit.edu>\n"
"      - Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu>\n"
"      - Andrés García <ornalux@redestb.es>\n"
"      - Douglas E. Wegscheid <wegscd@whirlpool.com>\n"
"      - Mark Butler <butlerm@xmission.com>\n"
"      - Eric Thelin <eric@generation-i.com>\n"
"      - Marc Boucher <marc@mbsi.ca>\n"
"      - Greg Onufer <Greg.Onufer@Eng.Sun.COM>\n"
"      - Doug Kaufman <dkaufman@rahul.net>\n"
"      - David Eriksson <david@2good.com>\n"
"      - Ralph Beckmann <rabe@uni-paderborn.de>\n"
"      - T. Yamada <tai@imasy.or.jp>\n"
"      - Lars J. Aas <larsa@sim.no>\n"
"      - Jörn Hartroth <Joern.Hartroth@telekom.de>\n"
"      - Matthew Clarke <clamat@van.maves.ca>\n"
"      - Linus Nielsen <Linus.Nielsen@haxx.nu>\n"
"      - Felix von Leitner <felix@convergence.de>\n"
"      - Dan Zitter <dzitter@zitter.net>\n"
"      - Jongki Suwandi <Jongki.Suwandi@eng.sun.com>\n"
"      - Chris Maltby <chris@aurema.com>\n"
"      - Ron Zapp <rzapper@yahoo.com>\n"
"      - Paul Marquis <pmarquis@iname.com>\n"
"      - Ellis Pritchard <ellis@citria.com>\n"
"      - Damien Adant <dams@usa.net>\n"
"      - Chris <cbayliss@csc.come>\n"
"      - Marco G. Salvagno <mgs@whiz.cjb.net>\n"
"\n"
"WWW\n"
"     http://curl.haxx.nu\n"
"\n"
"FTP\n"
"     ftp://ftp.sunet.se/pub/www/utilities/curl/\n"
"\n"
"SEE ALSO\n"
"     ftp(1), wget(1), snarf(1)\n"
"\n"
"LATEST VERSION\n"
"\n"
"  You always find news about what's going on as well as the latest versions\n"
"  from the curl web pages, located at:\n"
"\n"
"        http://curl.haxx.nu\n"
"\n"
"SIMPLE USAGE\n"
"\n"
"  Get the main page from netscape's web-server:\n"
"\n"
"        curl http://www.netscape.com/\n"
"\n"
"  Get the root README file from funet's ftp-server:\n"
"\n"
"        curl ftp://ftp.funet.fi/README\n"
"\n"
"  Get a gopher document from funet's gopher server:\n"
"\n"
"        curl gopher://gopher.funet.fi\n"
"\n"
"  Get a web page from a server using port 8000:\n"
"\n"
"        curl http://www.weirdserver.com:8000/\n"
"\n"
"  Get a list of the root directory of an FTP site:\n"
"\n"
"        curl ftp://ftp.fts.frontec.se/\n"
"\n"
"  Get the definition of curl from a dictionary:\n"
"\n"
"        curl dict://dict.org/m:curl\n"
"\n"
"DOWNLOAD TO A FILE\n"
"\n"
"  Get a web page and store in a local file:\n"
"\n"
"        curl -o thatpage.html http://www.netscape.com/\n"
"\n"
"  Get a web page and store in a local file, make the local file get the name\n"
"  of the remote document (if no file name part is specified in the URL, this\n"
"  will fail):\n"
"\n"
"        curl -O http://www.netscape.com/index.html\n"
"\n"
"USING PASSWORDS\n"
"\n"
" FTP\n"
"\n"
"   To ftp files using name+passwd, include them in the URL like:\n"
"\n"
"        curl ftp://name:passwd@machine.domain:port/full/path/to/file\n"
"\n"
"   or specify them with the -u flag like\n"
"\n"
"        curl -u name:passwd ftp://machine.domain:port/full/path/to/file\n"
"\n"
" HTTP\n"
"\n"
"   The HTTP URL doesn't officially support user and password in the URL\n"
"   string. Curl supports them anyway, to provide an ftp-style interface,\n"
"   so you can fetch a file like:\n"
"\n"
"        curl http://name:passwd@machine.domain/full/path/to/file\n"
"\n"
"   or specify user and password separately like in\n"
"\n"
"        curl -u name:passwd http://machine.domain/full/path/to/file\n"
"\n"
"   NOTE! Since HTTP URLs don't support user and password, you can't use\n"
"   that style when fetching via a proxy. You _must_ use the -u style in\n"
"   such circumstances.\n"
"\n"
" HTTPS\n"
"\n"
"   Probably most commonly used with private certificates, as explained below.\n"
"\n"
" GOPHER\n"
"\n"
"   Curl features no password support for gopher.\n"
"\n"
"PROXY\n"
"\n"
" Get an ftp file using a proxy named my-proxy that uses port 888:\n"
"\n"
"        curl -x my-proxy:888 ftp://ftp.leachsite.com/README\n"
"\n"
" Get a file from a HTTP server that requires user and password, using the\n"
" same proxy as above:\n"
"\n"
"        curl -u user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
" Some proxies require special authentication. Specify by using -U as above:\n"
"\n"
"        curl -U user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
" See also the environment variables Curl supports that offer further proxy\n"
" control.\n"
"\n"
"RANGES\n"
"\n"
"  With HTTP 1.1, byte-ranges were introduced. Using them, a client can\n"
"  request only one or more subparts of a specified document. Curl supports\n"
"  this with the -r flag.\n"
"\n"
"  Get the first 100 bytes of a document:\n"
"\n"
"        curl -r 0-99 http://www.get.this/\n"
"\n"
"  Get the last 500 bytes of a document:\n"
"\n"
"        curl -r -500 http://www.get.this/\n"
"\n"
"  Curl also supports simple ranges for FTP files. Only the start and stop\n"
"  positions can be specified.\n"
"\n"
"  Get the first 100 bytes of a document using FTP:\n"
"\n"
"        curl -r 0-99 ftp://www.get.this/README\n"
"\n"
"UPLOADING\n"
"\n"
" FTP\n"
"\n"
"  Upload all data on stdin to a specified ftp site:\n"
"\n"
"        curl -t ftp://ftp.upload.com/myfile\n"
"\n"
"  Upload data from a specified file, login with user and password:\n"
"\n"
"        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile\n"
"\n"
"  Upload a local file to the remote site, and use the local file name remote\n"
"  too:\n"
"\n"
"        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/\n"
"\n"
"  Upload a local file to get appended to the remote file using ftp:\n"
"\n"
"        curl -T localfile -a ftp://ftp.upload.com/remotefile\n"
"\n"
"  NOTE: Curl does not support ftp upload through a proxy! The reason for\n"
"  this is simply that proxies are seldom configured to allow this, and\n"
"  that no author has supplied code that makes it possible!\n"
"\n"
" HTTP\n"
"\n"
"  Upload all data on stdin to a specified http site:\n"
"\n"
"        curl -t http://www.upload.com/myfile\n"
"\n"
"  Note that the http server must've been configured to accept PUT before this\n"
"  can be done successfully.\n"
"\n"
"  For other ways to do http data upload, see the POST section below.\n"
"\n"
"VERBOSE / DEBUG\n"
"\n"
"  If curl fails where it isn't supposed to, if the servers don't let you\n"
"  in, if you can't understand the responses: use the -v flag to get VERBOSE\n"
"  fetching. Curl will output lots of info and all data it sends and\n"
"  receives in order to let the user see all client-server interaction.\n"
"\n"
"        curl -v ftp://ftp.upload.com/\n"
"\n"
"DETAILED INFORMATION\n"
"\n"
"  Different protocols provide different ways of getting detailed information\n"
"  about specific files/documents. To get curl to show detailed information\n"
"  about a single file, use the -I/--head option. It displays all\n"
"  available info on a single file for HTTP and FTP. The HTTP information is a\n"
"  lot more extensive.\n"
"\n"
"  For HTTP, you can get the header information (the same as -I would show)\n"
"  shown before the data by using -i/--include. Curl understands the\n"
"  -D/--dump-header option when getting files from both FTP and HTTP, and it\n"
"  will then store the headers in the specified file.\n"
"\n"
"  Store the HTTP headers in a separate file:\n"
"\n"
"        curl --dump-header headers.txt curl.haxx.nu\n"
"\n"
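"  Include the headers in the output itself (the same headers -I would\n"
"  show):\n"
"\n"
"        curl --include curl.haxx.nu\n"
"\n"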
"  Note that headers stored in a separate file can be very useful at a later\n"
"  time if you want curl to use cookies sent by the server. More about that in\n"
"  the cookies section.\n"
"\n"
"POST (HTTP)\n"
"\n"
"  It's easy to post data using curl. This is done using the -d <data>\n"
"  option.  The post data must be urlencoded.\n"
"\n"
"  Post a simple \"name\" and \"phone\" guestbook.\n"
"\n"
"        curl -d \"name=Rafael%20Sagula&phone=3320780\" \\\n"
"                http://www.where.com/guest.cgi\n"
"\n"
"  How to post a form with curl, lesson #1:\n"
"\n"
"  Dig out all the <input> tags in the form that you want to fill in. (There's\n"
"  a perl program called formfind.pl on the curl site that helps with this).\n"
"\n"
"  If there's a \"normal\" post, you use -d to post. -d takes a full \"post\n"
"  string\", which is in the format\n"
"\n"
"        <variable1>=<data1>&<variable2>=<data2>&...\n"
"\n"
"  The 'variable' names are the names set with \"name=\" in the <input> tags, and\n"
"  the data is the contents you want to fill in for the inputs. The data *must*\n"
"  be properly URL encoded. That means you replace space with + and that you\n"
"  write weird letters with %XX where XX is the hexadecimal representation of\n"
"  the letter's ASCII code.\n"
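"  For instance, a field value \"John Doe\" is sent as \"John+Doe\" or\n"
"  \"John%20Doe\".\n"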
"\n"
"  Example:\n"
"\n"
"  (page located at http://www.formpost.com/getthis/)\n"
"\n"
"        <form action=\"post.cgi\" method=\"post\">\n"
"        <input name=user size=10>\n"
"        <input name=pass type=password size=10>\n"
"        <input name=id type=hidden value=\"blablabla\">\n"
"        <input name=ding value=\"submit\">\n"
"        </form>\n"
"\n"
"  We want to enter user 'foobar' with password '12345'.\n"
"\n"
"  To post to this, you enter a curl command line like:\n"
"\n"
"        curl -d \"user=foobar&pass=12345&id=blablabla&ding=submit\" \\\n"
"          http://www.formpost.com/getthis/post.cgi\n"
"\n"
"\n"
"  While -d uses the application/x-www-form-urlencoded mime-type, generally\n"
"  understood by CGIs and similar, curl also supports the more capable\n"
"  multipart/form-data type. This latter type supports things like file upload.\n"
"\n"
"  -F accepts parameters like -F \"name=contents\". If you want the contents to\n"
"  be read from a file, use <@filename> as contents. When specifying a file,\n"
"  you can also specify which content type the file is, by appending\n"
"  ';type=<mime type>' to the file name. You can also post contents of several\n"
"  files in one field. The field name 'coolfiles' can thus be used to send\n"
"  three files with different content types, in a manner similar to:\n"
"\n"
"        curl -F \"coolfiles=@fil1.gif;type=image/gif,fil2.txt,fil3.html\" \\\n"
"        http://www.post.com/postit.cgi\n"
"\n"
"  If content-type is not specified, curl will try to guess it from the\n"
"  extension (it only knows a few), use the previously specified type (from\n"
"  an earlier file, if several files are specified in a list), or finally\n"
"  fall back to the default type 'text/plain'.\n"
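"\n"
"  For example, post a single file with an explicitly given content type:\n"
"\n"
"        curl -F \"coolfile=@story.txt;type=text/html\" \\\n"
"        http://www.post.com/postit.cgi\n"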
"\n"
"  Emulate a fill-in form with -F. Let's say you fill in three fields in a\n"
"  form. One field is a file name which to post, one field is your name and one\n"
"  field is a file description. We want to post the file we have written named\n"
"  \"cooltext.txt\". To let curl do the posting of this data instead of your\n"
"  favourite browser, you have to check out the HTML of the form page to get to\n"
"  know the names of the input fields. In our example, the input field names are\n"
"  'file', 'yourname' and 'filedescription'.\n"
"\n"
"        curl -F \"file=@cooltext.txt\" -F \"yourname=Daniel\" \\\n"
"             -F \"filedescription=Cool text file with cool text inside\" \\\n"
"             http://www.post.com/postit.cgi\n"
"\n"
"  So, to send two files in one post you can do it in two ways:\n"
"\n"
"  1. Send multiple files in a single \"field\" with a single field name:\n"
"\n"
"        curl -F \"pictures=@dog.gif,cat.gif\"\n"
"\n"
"  2. Send two fields with two field names:\n"
"\n"
"        curl -F \"docpicture=@dog.gif\" -F \"catpicture=@cat.gif\"\n"
"\n"
"REFERER\n"
"\n"
"  An HTTP request has the option to include information about which\n"
"  address referred to the actual page, and curl allows the user to specify\n"
"  that referrer on the command line. It is especially useful to fool or\n"
"  trick stupid servers or CGI scripts that rely on that information being\n"
"  available or containing certain data.\n"
"\n"
"        curl -e www.coolsite.com http://www.showme.com/\n"
"\n"
"USER AGENT\n"
"\n"
"  An HTTP request has the option to include information about the browser\n"
"  that generated the request. Curl allows it to be specified on the command\n"
"  line. It is especially useful to fool or trick stupid servers or CGI\n"
"  scripts that only accept certain browsers.\n"
"\n"
"  Example:\n"
"\n"
"  curl -A 'Mozilla/3.0 (Win95; I)' http://www.nationsbank.com/\n"
"\n"
"  Other common strings:\n"
"    'Mozilla/3.0 (Win95; I)'     Netscape Version 3 for Windows 95\n"
"    'Mozilla/3.04 (Win95; U)'    Netscape Version 3 for Windows 95\n"
"    'Mozilla/2.02 (OS/2; U)'     Netscape Version 2 for OS/2\n"
"    'Mozilla/4.04 [en] (X11; U; AIX 4.2; Nav)'           NS for AIX\n"
"    'Mozilla/4.05 [en] (X11; U; Linux 2.0.32 i586)'      NS for Linux\n"
"\n"