/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
" _ _ ____ _ \n"
" Project ___| | | | _ \\| | \n"
" / __| | | | |_) | | \n"
" | (__| |_| | _ <| |___ \n"
" \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
" curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n"
" HTTP or HTTPS syntax.\n"
" curl is a client to get documents/files from servers, using\n"
" any of the supported protocols. The command is designed to\n"
" work without user interaction or any kind of interactivity.\n"
" curl offers a busload of useful tricks like proxy support,\n"
" user authentication, ftp upload, HTTP post, SSL (https:)\n"
" connections, cookies, file transfer resume and more.\n"
" The URL syntax is protocol dependent. You'll find a detailed\n"
" description in RFC 2396.\n"
" You can specify multiple URLs or parts of URLs by writing\n"
" part sets within braces as in:\n"
" or you can get sequences of alphanumeric series by using []\n"
" as in:\n"
" ftp://ftp.numericals.com/file[1-100].txt\n"
" ftp://ftp.numericals.com/file[001-100].txt (with leading\n"
" zeros)\n"
" ftp://ftp.letters.com/file[a-z].txt\n"
" It is possible to specify up to 9 sets or series for a URL,\n"
" but no nesting is supported at the moment:\n"
" http://www.any.org/archive[1996-1999]/vol\n"
" ume[1-4]part{a,b,c,index}.html\n"
" -a/--append\n"
" (FTP) When used in a ftp upload, this will tell curl to\n"
" append to the target file instead of overwriting it. If\n"
" the file doesn't exist, it will be created.\n"
"\n"
" If this option is used twice, the second one will dis\n"
" able append mode again.\n"
"\n"
" (HTTP) Specify the User-Agent string to send to the\n"
" HTTP server. Some badly done CGIs fail if its not set\n"
" to \"Mozilla/4.0\". To encode blanks in the string,\n"
" surround the string with single quote marks. This can\n"
" also be set with the -H/--header flag of course.\n"
"\n"
" If this option is used more than once, the last one\n"
" will be the one to be used.\n"
"\n"
" -b/--cookie <name=data>\n"
" (HTTP) Pass the data to the HTTP server as a cookie. It\n"
" is supposedly the data previously received from the\n"
" server in a \"Set-Cookie:\" line. The data should be in\n"
" the format \"NAME1=VALUE1; NAME2=VALUE2\".\n"
"\n"
" If no '=' letter is used in the line, it is treated as\n"
" a filename to use to read previously stored cookie\n"
" lines from, which should be used in this session if\n"
" they match. Using this method also activates the\n"
" \"cookie parser\" which will make curl record incoming\n"
" cookies too, which may be handy if you're using this in\n"
" combination with the -L/--location option. The file\n"
" format of the file to read cookies from should be plain\n"
" HTTP headers or the netscape cookie file format.\n"
"\n"
" NOTE that the file specified with -b/--cookie is only\n"
" used as input. No cookies will be stored in the file.\n"
" To store cookies, save the HTTP headers to a file using\n"
" -D/--dump-header!\n"
"\n"
" If this option is used more than once, the last one\n"
" will be the one to be used.\n"
"\n"
" Use ASCII transfer when getting an FTP file or LDAP\n"
" info. For FTP, this can also be enforced by using an\n"
" URL that ends with \";type=A\". This option causes data\n"
" sent to stdout to be in text mode for win32 systems.\n"
" If this option is used twice, the second one will dis\n"
" able ASCII usage.\n"
"\n"
" Deprecated. Use '-C -' instead. Continue/Resume a pre\n"
" vious file transfer. This instructs curl to continue\n"
" appending data on the file where it was previously\n"
" left, possibly because of a broken connection to the\n"
" server. There must be a named physical file to append\n"
" to for this to work. Note: Upload resume is depening\n"
" on a command named SIZE not always present in all ftp\n"
" servers! Upload resume is for FTP only. HTTP resume is\n"
" only possible with HTTP/1.1 or later servers.\n"
"\n"
" -C/--continue-at <offset>\n"
" Continue/Resume a previous file transfer at the given\n"
" offset. The given offset is the exact number of bytes\n"
" that will be skipped counted from the beginning of the\n"
" source file before it is transfered to the destination.\n"
" If used with uploads, the ftp server command SIZE will\n"
" not be used by curl. Upload resume is for FTP only.\n"
" HTTP resume is only possible with HTTP/1.1 or later\n"
" If this option is used serveral times, the last one\n"
" will be used.\n"
"\n"
" (HTTP) Sends the specified data in a POST request to\n"
" the HTTP server. Note that the data is sent exactly as\n"
" specified with no extra processing (with all newlines\n"
" cut off). The data is expected to be \"url-encoded\".\n"
" This will cause curl to pass the data to the server\n"
" using the content-type application/x-www-form-urlen\n"
" coded. Compare to -F. If more than one -d/--data option\n"
" is used on the same command line, the data pieces spec\n"
" ified will be merged together with a separating &-let\n"
" ter. Thus, using '-d name=daniel -d skill=lousy' would\n"
" generate a post chunk that looks like\n"
"\n"
" If you start the data with the letter @, the rest\n"
" should be a file name to read the data from, or - if\n"
" you want curl to read the data from stdin. The con\n"
" tents of the file must already be url-encoded. Multiple\n"
" files can also be specified.\n"
"\n"
" To post data purely binary, you should instead use the\n"
" --data-binary option.\n"
"\n"
" -d/--data is the same as --data-ascii.\n"
"\n"
" If this option is used serveral times, the last one\n"
" will be used.\n"
"\n"
" --data-ascii <data>\n"
" (HTTP) This is an alias for the -d/--data option.\n"
"\n"
" If this option is used serveral times, the last one\n"
" will be used.\n"
"\n"
" (HTTP) This posts data in a similar manner as --data-\n"
" ascii does, although when using this option the entire\n"
" context of the posted data is kept as-is. If you want\n"
" to post a binary file without the strip-newlines fea\n"
" ture of the --data-ascii option, this is for you.\n"
"\n"
" If this option is used serveral times, the last one\n"
" will be used.\n"
"\n"
" (HTTP/FTP) Write the HTTP headers to this file. Write\n"
" the FTP file info to this file if -I/--head is used.\n"
"\n"
" This option is handy to use when you want to store the\n"
" cookies that a HTTP site sends to you. The cookies\n"
" could then be read in a second curl invoke by using the\n"
" -b/--cookie option!\n"
"\n"
" If this option is used serveral times, the last one\n"
" will be used.\n"
"\n"
" -e/--referer <URL>\n"
" (HTTP) Sends the \"Referer Page\" information to the HTTP\n"
" server. This can also be set with the -H/--header flag\n"
" of course. When used with -L/--location you can append\n"
" \";auto\" to the referer URL to make curl automatically\n"
" set the previous URL when it follows a Location:\n"
" header. The \";auto\" string can be used alone, even if\n"
" you don't set an initial referer.\n"
" If this option is used serveral times, the last one\n"
" will be used.\n"
"\n"
" -E/--cert <certificate[:password]>\n"
" (HTTPS) Tells curl to use the specified certificate\n"
" file when getting a file with HTTPS. The certificate\n"
" must be in PEM format. If the optional password isn't\n"
" specified, it will be queried for on the terminal. Note\n"
" that this certificate is the private key and the pri\n"
" vate certificate concatenated!\n"
"\n"