/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
" _ _ ____ _ \n"
" Project ___| | | | _ \\| | \n"
" / __| | | | |_) | | \n"
" | (__| |_| | _ <| |___ \n"
" \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
" curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n"
" HTTP or HTTPS syntax.\n"
"\n"
"SYNOPSIS\n"
" curl [options] [URL...]\n"
"\n"
"DESCRIPTION\n"
" curl is a client to get documents/files from or send docu\n"
);
puts(
" ments to a server, using any of the supported protocols\n"
" (HTTP, HTTPS, FTP, GOPHER, DICT, TELNET, LDAP or FILE). The\n"
" command is designed to work without user interaction or any\n"
" kind of interactivity.\n"
"\n"
" curl offers a busload of useful tricks like proxy support,\n"
" user authentication, ftp upload, HTTP post, SSL (https:)\n"
" connections, cookies, file transfer resume and more.\n"
"\n"
"URL\n"
" The URL syntax is protocol dependent. You'll find a detailed\n"
);
puts(
" description in RFC 2396.\n"
"\n"
" You can specify multiple URLs or parts of URLs by writing\n"
" part sets within braces as in:\n"
"\n"
" http://site.{one,two,three}.com\n"
"\n"
" or you can get sequences of alphanumeric series by using []\n"
" as in:\n"
"\n"
" ftp://ftp.numericals.com/file[1-100].txt\n"
" ftp://ftp.numericals.com/file[001-100].txt (with leading\n"
" zeros)\n"
" ftp://ftp.letters.com/file[a-z].txt\n"
"\n"
" It is possible to specify up to 9 sets or series for a URL,\n"
);
puts(
" but no nesting is supported at the moment:\n"
"\n"
" http://www.any.org/archive[1996-1999]/vol\n"
" ume[1-4]part{a,b,c,index}.html\n"
"\n"
" You can specify any amount of URLs on the command line. They\n"
" will be fetched in a sequential manner in the specified\n"
" order.\n"
"\n"
" Curl will attempt to re-use connections for multiple file\n"
" transfers, so that getting many files from the same server\n"
" will not do multiple connects / handshakes. This improves\n"
);
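/* For illustration, the globbing described above could be used like
   this (the host names are just the placeholder examples from the
   text); every expanded URL is fetched in sequence on the one command
   line, so connections can be re-used where possible:

     curl http://site.{one,two,three}.com
     curl ftp://ftp.numericals.com/file[001-100].txt
*/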
puts(
" speed. Of course this is only done on files specified on a\n"
" single command line and cannot be used between separate curl\n"
" invokes.\n"
"OPTIONS\n"
" -a/--append\n"
" (FTP) When used in a ftp upload, this will tell curl to\n"
" append to the target file instead of overwriting it. If\n"
" the file doesn't exist, it will be created.\n"
"\n"
" If this option is used twice, the second one will dis\n"
" able append mode again.\n"
"\n"
" -A/--user-agent <agent string>\n"
);
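/* A small sketch of FTP append mode, assuming the upload itself is
   done with -T/--upload-file and that host and file names are
   placeholders:

     curl -T log.txt -a ftp://ftp.example.com/incoming/log.txt

   With -a the remote file is appended to (or created); without it the
   remote file would be overwritten.
*/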
puts(
" (HTTP) Specify the User-Agent string to send to the\n"
" HTTP server. Some badly done CGIs fail if its not set\n"
" to \"Mozilla/4.0\". To encode blanks in the string, sur\n"
" round the string with single quote marks. This can\n"
" also be set with the -H/--header flag of course.\n"
"\n"
" If this option is set more than once, the last one will\n"
" be the one that's used.\n"
"\n"
" -b/--cookie <name=data>\n"
);
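/* One way the -A option above might be used; the agent string and URL
   are placeholders, and the single quotes keep the blanks intact as
   the text suggests:

     curl -A 'Mozilla/4.0 (compatible)' http://www.example.com/
*/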
puts(
" (HTTP) Pass the data to the HTTP server as a cookie. It\n"
" is supposedly the data previously received from the\n"
" server in a \"Set-Cookie:\" line. The data should be in\n"
" the format \"NAME1=VALUE1; NAME2=VALUE2\".\n"
"\n"
" If no '=' letter is used in the line, it is treated as\n"
" a filename to use to read previously stored cookie\n"
" lines from, which should be used in this session if\n"
);
puts(
" they match. Using this method also activates the\n"
" \"cookie parser\" which will make curl record incoming\n"
" cookies too, which may be handy if you're using this in\n"
" combination with the -L/--location option. The file\n"
" format of the file to read cookies from should be plain\n"
" HTTP headers or the Netscape/Mozilla cookie file for\n"
" mat.\n"
"\n"
" NOTE that the file specified with -b/--cookie is only\n"
);
puts(
" used as input. No cookies will be stored in the file.\n"
" To store cookies, save the HTTP headers to a file using\n"
" -D/--dump-header!\n"
"\n"
" If this option is set more than once, the last one will\n"
" be the one that's used.\n"
"\n"
" -B/--use-ascii\n"
" Use ASCII transfer when getting an FTP file or LDAP\n"
" info. For FTP, this can also be enforced by using an\n"
" URL that ends with \";type=A\". This option causes data\n"
);
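/* Two illustrative uses of the -b/--cookie forms described above; the
   cookie data, file name and URL are placeholders:

     curl -b "NAME1=VALUE1; NAME2=VALUE2" http://www.example.com/
     curl -b headers.txt -L http://www.example.com/

   The first form passes literal cookie data; the second (no '=' in
   the argument) reads cookies from headers saved earlier, e.g. with
   -D/--dump-header headers.txt.
*/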
puts(
" sent to stdout to be in text mode for win32 systems.\n"
"\n"
" If this option is used twice, the second one will dis\n"
" able ASCII usage.\n"
" --connect-timeout <seconds>\n"
" Maximum time in seconds that you allow the connection\n"
" to the server to take. This only limits the connection\n"
" phase, once curl has connected this option is of no\n"
" more use. This option didn't work in win32 systems\n"
);
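/* The ASCII transfer described above can be asked for either with the
   option or in the URL itself; host and file names are placeholders:

     curl -B ftp://ftp.example.com/README
     curl "ftp://ftp.example.com/README;type=A"
*/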
puts(
" until 7.7.2. See also the --max-time option.\n"
"\n"
" If this option is used several times, the last one will\n"
" be used.\n"
"\n"
" -c/--continue\n"
" Deprecated. Use '-C -' instead. Continue/Resume a pre\n"
" vious file transfer. This instructs curl to continue\n"
" appending data on the file where it was previously\n"
" left, possibly because of a broken connection to the\n"
" server. There must be a named physical file to append\n"
);
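/* A minimal example of limiting only the connection phase as described
   above; the timeout value and URL are arbitrary:

     curl --connect-timeout 30 http://www.example.com/

   --max-time, by contrast, limits the whole operation rather than just
   the connect.
*/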
puts(
" to for this to work. Note: Upload resume is depening\n"
" on a command named SIZE not always present in all ftp\n"
" servers! Upload resume is for FTP only. HTTP resume is\n"
" only possible with HTTP/1.1 or later servers.\n"
"\n"
" -C/--continue-at <offset>\n"
" Continue/Resume a previous file transfer at the given\n"
" offset. The given offset is the exact number of bytes\n"
" that will be skipped counted from the beginning of the\n"
);
puts(
" source file before it is transfered to the destination.\n"
" If used with uploads, the ftp server command SIZE will\n"
" not be used by curl. Upload resume is for FTP only.\n"
" HTTP resume is only possible with HTTP/1.1 or later\n"
" servers.\n"
"\n"
" If this option is used several times, the last one will\n"
" be used.\n"
"\n"
" -d/--data <data>\n"
" (HTTP) Sends the specified data in a POST request to\n"
);
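/* A sketch of resuming at an explicit offset with -C as described
   above, assuming -o/--output names the local file; offset, file and
   URL are placeholders:

     curl -C 409600 -o big.zip http://www.example.com/big.zip

   The first 409600 bytes of the source are skipped before the rest is
   transferred; '-C -' (see -c above) asks curl to work the offset out
   itself.
*/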
puts(
" the HTTP server, in a way that can emulate as if a user\n"
" has filled in a HTML form and pressed the submit but\n"
" ton. Note that the data is sent exactly as specified\n"
" with no extra processing (with all newlines cut off).\n"
" The data is expected to be \"url-encoded\". This will\n"
" cause curl to pass the data to the server using the\n"
" content-type application/x-www-form-urlencoded. Compare\n"
);
puts(
" to -F. If more than one -d/--data option is used on the\n"
" same command line, the data pieces specified will be\n"
" merged together with a separating &-letter. Thus, using\n"
" '-d name=daniel -d skill=lousy' would generate a post\n"
" chunk that looks like 'name=daniel&skill=lousy'.\n"
"\n"
" If you start the data with the letter @, the rest\n"
" should be a file name to read the data from, or - if\n"
);
puts(
" you want curl to read the data from stdin. The\n"
" contents of the file must already be url-encoded. Mul\n"
" tiple files can also be specified.\n"
"\n"
" To post data purely binary, you should instead use the\n"
" --data-binary option.\n"