      _   _ ____  _
  ___| | | |  _ \| |
 / __| | | | |_) | |
| (__| |_| |  _ <| |___
 \___|\___/|_| \_\_____|
History of Changes
Version XX
Daniel (31 January 2000):
- Paul Harrington <paul@pizza.org> found another core dump in the cookie
parser. Curl doesn't properly recognize the 'version' keyword and I think
that is what caused this. I need to re-read the cookie specs and see what
else curl lacks, to improve this once and for all.
RFC 2109 clearly specifies how cookies should be dealt with when they are
compliant with that spec. I don't think many servers are, though...
- Mark W. Eichin <eichin@thok.org> found that while curl is uploading a form
to a web site, it doesn't read incoming data, which is why it hangs after a
while once the socket "pipe" becomes full.
It took me two hours to rewrite Download() and Upload() into the new
single function Transfer(). It even seems to work! More testing is required
of course... I should gather the header-sending into a kind of queue
and let the headers get "uploaded" in Transfer() as well. A sketch of the
combined-loop idea follows below.
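(As an illustration only, not curl's actual source: a minimal sketch of
the combined-loop idea, using select() to watch both directions of the
socket so incoming data gets drained while the upload proceeds. The
function and variable names are made up for this sketch.)

    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/select.h>
    #include <sys/socket.h>

    /* One loop for both directions: read whenever the server has data,
       write as long as there is upload data left. This keeps the socket
       "pipe" from filling up with unread response data. */
    int transfer(int sockfd, FILE *upload, FILE *download)
    {
        char buf[4096];
        int uploading = (upload != NULL);

        for(;;) {
            fd_set readfds, writefds;
            FD_ZERO(&readfds);
            FD_ZERO(&writefds);
            FD_SET(sockfd, &readfds);        /* always willing to read */
            if(uploading)
                FD_SET(sockfd, &writefds);   /* write while data remains */

            if(select(sockfd + 1, &readfds, &writefds, NULL, NULL) < 0)
                return -1;

            if(FD_ISSET(sockfd, &readfds)) {
                ssize_t n = recv(sockfd, buf, sizeof(buf), 0);
                if(n <= 0)
                    break;                   /* server closed or error */
                fwrite(buf, 1, (size_t)n, download);
            }
            if(uploading && FD_ISSET(sockfd, &writefds)) {
                size_t n = fread(buf, 1, sizeof(buf), upload);
                if(n == 0)
                    uploading = 0;           /* upload complete */
                else if(send(sockfd, buf, n, 0) < 0)
                    return -1;
            }
        }
        return 0;
    }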
- Zhibiao Wu <wuzb@erols.com> pointed out a curl bug in the location: area,
although I did not get a reproducible way to trigger it, which is why I have
to hold off on fixing anything.
- Bob Schader <rschader@product-des.com> suggested I should implement resume
support for the HTTP PUT operation, and as I think it is a valid suggestion
I'll work on it.
Daniel (25 January 2000):
- M Travis Obenhaus <Travis.Obenhaus@aud.alcatel.com> pointed out a manual
mixup with -y and -Y that was corrected.
- Jens Schleusener <Jens.Schleusener@dlr.de> pointed out a problem compiling
curl on AIX 4.1.4 and gave me a solution. This problem was already fixed
by Jörn's recent #include modifications!
Daniel (19 January 2000):
- Oskar Liljeblad <osk@hem.passagen.se> pointed out and corrected a problem
in the Location: following system that made curl fail when following a
location: to a different protocol.
On January 31st I reconsidered this fix and the surrounding source code. I
could not really see that the patch made any difference, which is why I
removed it again for further research and debugging. (It disabled location:
following on servers not running on default ports.)
- Jörn Hartroth <Joern.Hartroth@telekom.de> brought a fix that once again
made it possible to select the progress bar.
- Jörn also fixed a few include problems.
Daniel (17 January 2000):
- Based on suggestions from Björn Stenberg (bjorn@haxx.nu), I made the
progress meter deal better with larger files and added a "Time" field which
shows the time spent on the download so far.
- I'm now using the CVS repository on sourceforge.net, which also allows web
browsing. See http://curl.haxx.nu.
Daniel (10 January 2000):
- Renumbered some enums in curl/curl.h since tag number 35 was used twice!
- Added "postquote" support to the ftp section that enables post-ftp-transfer
quote commands.
- Now made the -Q/--quote parameter recognize '-' as a prefix, which means
the command will be issued AFTER a successful ftp transfer. This can of
course be used to delete or rename a file after it has been uploaded or
downloaded. Use your imagination! ;-)
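(A hedged sketch of how the '-' prefix logic could look; the list names
and the function are made up for illustration, not taken from curl's
source.)

    #include <stdlib.h>
    #include <string.h>

    struct cmdlist { char *cmd; struct cmdlist *next; };

    /* Prepend a -Q argument to the proper list: a leading '-' marks it
       as a post-transfer ("postquote") command and is stripped off. */
    void add_quote_command(const char *arg,
                           struct cmdlist **quote,      /* before transfer */
                           struct cmdlist **postquote)  /* after transfer */
    {
        struct cmdlist **list = (arg[0] == '-') ? postquote : quote;
        struct cmdlist *node = malloc(sizeof(*node));
        if(!node)
            return;
        node->cmd = strdup(arg[0] == '-' ? arg + 1 : arg);
        node->next = *list;
        *list = node;
    }

Something like 'curl -Q "-DELE file.txt" ...' would then delete the file
on the server after the transfer has completed successfully.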
- Since I do the main development on solaris 2.6 now, I had to download and
install GNU groff to generate the hugehelp.c file. The solaris nroff cores
on the man page! So, in order to make the solaris configure script find a
working tool, I made it check for gnroff prior to the regular nroff.
- Added all the curl exit codes to the man page.
- Jim Gallagher <jmgallag@usa.net> properly tracked down a bug in autoconf
2.13. The AC_CHECK_LIB() macro wrongfully uses the -l flag before the -L
flag to 'ld', which causes the HP-UX 10.20 flavour to fail on all lib checks
and therefore you can't make the configure script find the openssl libs!
Daniel (28 December 1999):
- Tim Verhoeven <dj@walhalla.sin.khk.be> correctly identified that curl
doesn't support URL formatted file names when getting files over ftp. Now,
there's a problem with getting very weird file names off FTP servers. RFC 959
defines that the file name syntax to use should be the same as in the native
OS of the server. Since we don't know the peer server's system we currently
just translate the URL syntax into plain letters. It is still an improvement,
and with the solaris 2.6-supplied ftp server it works with spaces in file
names.
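(For illustration: "translate the URL syntax into plain letters" is
essentially percent-decoding. A minimal sketch of such a decoder, not
curl's actual routine:)

    #include <ctype.h>
    #include <stdio.h>

    /* Decode %XX escapes in place, turning "file%20name" into
       "file name". Returns the same buffer. */
    char *urldecode(char *s)
    {
        char *r = s, *w = s;
        while(*r) {
            if(r[0] == '%' && isxdigit((unsigned char)r[1]) &&
               isxdigit((unsigned char)r[2])) {
                unsigned int c;
                sscanf(r + 1, "%2x", &c);
                *w++ = (char)c;
                r += 3;
            }
            else
                *w++ = *r++;
        }
        *w = '\0';
        return s;
    }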
Daniel (27 December 1999):
- When curl parsed cookies straight off a remote site, it corrupted the input
data, which, if the downloaded headers were stored, left very odd characters
in the saved data. Correctly identified and reported by Paul Harrington
<paul@pizza.org>.
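(The classic cause of this kind of corruption is tokenizing the header
buffer in place; parsing a copy leaves the original data intact for
saving. A sketch of the safe pattern, with made-up names:)

    #include <stdlib.h>
    #include <string.h>

    /* strtok() writes NUL bytes into its input, so parse a copy and
       leave the original header untouched for later storage. */
    void parse_cookie_header(const char *header)
    {
        char *copy = strdup(header);
        char *tok;
        if(!copy)
            return;
        for(tok = strtok(copy, ";"); tok; tok = strtok(NULL, ";")) {
            /* ... handle one name=value pair ... */
        }
        free(copy);
    }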
Daniel (13 December 1999):
- General cleanups in the library interface. There had been some bad kludges
added during times of stress and I did my best to clean them up. This
concerned both the lib API and include file confusion.
Daniel (3 December 1999):
- A small --stderr bug was reported by Eetu Ojanen <esojanen@jyu.fi>...
- who also brought the suggestion of extending the -X flag to ftp list as
well. So now it does, and the long option is now --request instead. It
applies only to ftp list for now (and the former http stuff too, of course).
Lars J. Aas <larsa@sim.no> (24 November 1999):
- Patched curl to compile and build under BeOS. Doesn't work yet though!
- Corrected the Makefile.am files to allow putting object files in
different directories than the sources.
Version 6.3.1
Daniel (23 November 1999):
- I've had a major disk crash. My good old trustworthy source disk died
along with the machine that hosted it. Thank goodness most of the things
I've done are either backed up elsewhere or stored in this CVS server!
- Michael S. Steuer <michael@steuer.com> pointed out a bug in the -F handling
that made curl hang if you posted an empty variable such as '-F name='. It
was one of those old bugs that never worked properly...
- Jason Baietto <jason@durians.com> pointed out a general flaw in the HTTP
download. Curl didn't complain if it was prematurely aborted before the
entire download was completed. It does now.
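(The check amounts to comparing the number of bytes actually received
with the Content-Length the server announced. A sketch, with made-up
names:)

    #include <stdio.h>

    /* If the server announced a Content-Length, receiving fewer bytes
       than promised means the transfer was cut short. */
    int verify_complete(long content_length, long bytes_received)
    {
        if(content_length >= 0 && bytes_received < content_length) {
            fprintf(stderr, "transfer aborted with %ld bytes left to read\n",
                    content_length - bytes_received);
            return -1;
        }
        return 0;
    }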
Daniel (19 November 1999):
- Chris Maltby <chris@aurema.com> very accurately criticized the lack of
return code checks on the fwrite() calls. I did a thorough check for all
occurrences and corrected this.
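(The pattern in question: fwrite() returns the number of items actually
written, and a short write must be treated as an error rather than
silently ignored. A small illustrative wrapper:)

    #include <stdio.h>

    /* Write 'len' bytes and fail on a short write instead of
       discarding fwrite()'s return value. */
    int checked_write(const void *buf, size_t len, FILE *out)
    {
        if(fwrite(buf, 1, len, out) != len) {
            fprintf(stderr, "failed writing output\n");
            return -1;
        }
        return 0;
    }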
Daniel (17 November 1999):
- Paul Harrington <paul@pizza.org> pointed out that the -m/--max-time option
doesn't work for the slow system calls like gethostbyname()... I don't have
any good fix yet, just a slightly less bad one that makes curl exit hard
when the timeout is reached.
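(The "slightly less bad" approach amounts to a SIGALRM deadline that
terminates the process if the blocking call has not returned in time,
since gethostbyname() itself offers no timeout. A sketch under that
assumption, with made-up names:)

    #include <netdb.h>
    #include <signal.h>
    #include <unistd.h>

    static void timeout_handler(int sig)
    {
        (void)sig;
        _exit(28);     /* hard exit; 28 is curl's "operation timed out" */
    }

    int lookup_with_deadline(const char *host, unsigned int max_time)
    {
        struct hostent *he;
        signal(SIGALRM, timeout_handler);
        alarm(max_time);   /* hard deadline for the blocking call */
        he = gethostbyname(host);
        alarm(0);          /* made it in time; cancel the alarm */
        return he ? 0 : -1;
    }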
- Bjorn Reese helped me track down a possible problem that might be the
reason why Thomas Hurst experiences problems in his Amiga version.
Daniel (12 November 1999):
- I found a crash in the new cookie file parser. It crashed when you gave
a plain http header file as input...
Version 6.3
Daniel (10 November 1999):
- I kind of found out that the HTTP time-conditional GETs (-z) aren't always
respected by the web server and the document is therefore sent in whole
again, even though it doesn't match the requested condition. After reading
section 13.3.4 of RFC 2616, I think I'm doing the right thing now when I do
my own check as well. If curl thinks the condition isn't met, the transfer
is aborted prematurely (after all the headers have been received).
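(For illustration, the client-side check boils down to comparing the
document's date from the response headers against the -z condition once
the headers have arrived. A sketch with made-up names:)

    #include <time.h>

    enum timecond { IFMODSINCE, IFUNMODSINCE };

    /* Return 1 if the transfer should continue, 0 if the server ignored
       our time condition and we should abort after the headers. */
    int condition_met(enum timecond cond, time_t condtime, time_t docdate)
    {
        if(cond == IFMODSINCE)
            return docdate > condtime;   /* only if newer than ours */
        else
            return docdate <= condtime;  /* only if NOT newer */
    }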
- After comments from Robert Linden <robert.linden@postcom.deutschepost.de> I
also rewrote some parts of the man page to better describe how the -F option
works.
- Michael Anti <anti@pshowing.com> put up a new curl download mirror in
China: http://www.pshowing.com/curl/
- I added the list of download mirrors to the README file
- I did add more explanations to the man page
Daniel (8 November 1999):
- I made the -b/--cookie option capable of reading netscape formatted cookie
files as well as normal http-header files. It should be able to
transparently figure out what kind of file it got as input.
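(One way to "transparently figure out" the flavour: netscape cookie
files begin with a telltale comment and use tab-separated fields, while
plain header files hold Set-Cookie: lines. An illustrative heuristic,
not curl's actual parser:)

    #include <stdio.h>
    #include <string.h>

    /* Guess the file format from the first non-blank line: a netscape
       cookie file announces itself in a comment or uses tab-separated
       fields; otherwise we assume plain http headers. */
    int looks_like_netscape_format(FILE *f)
    {
        char line[512];
        while(fgets(line, sizeof(line), f)) {
            if(line[0] == '\n' || line[0] == '\r')
                continue;
            if(strncmp(line, "# Netscape", 10) == 0)
                return 1;
            return strchr(line, '\t') != NULL;
        }
        return 0;
    }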
Daniel (29 October 1999):
- Another one of Sebastiaan van Erk's ideas (one that has been requested
before, though I seem to have forgotten by whom) is to add support for
ranges in FTP downloads. As usual, one request is just a request; when there
are two it is a demand. I've added simple support for X-Y style fetches. X
has to be the lower number, though you may omit one of the numbers. Use the
-r/--range switch (previously HTTP-only).
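(A sketch of parsing the X-Y syntax where either end may be left out;
illustrative only, not curl's actual parser:)

    #include <stdlib.h>

    /* Parse "100-500", "100-" or "-500" into from/to; -1 means that end
       was left unspecified. X must be the lower number. */
    int parse_range(const char *range, long *from, long *to)
    {
        char *end;
        *from = *to = -1;
        if(*range != '-') {
            *from = strtol(range, &end, 10);
            if(*end != '-')
                return -1;          /* malformed: no dash after X */
            range = end;
        }
        range++;                    /* step past the '-' */
        if(*range)
            *to = strtol(range, NULL, 10);
        if(*from >= 0 && *to >= 0 && *from > *to)
            return -1;              /* X has to be the lower number */
        return 0;
    }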
- Sebastiaan van Erk <sebster@sebster.com> suggested that curl should be
able to show the file size of a specified file. I think this is a splendid
idea and the -I flag is now working for FTP. It displays the file size in
this manner:
Content-Length: XXXX
This resembles normal headers and leaves us the opportunity to add more
info in that display if we can come up with more in the future! It also
makes sense since if you access ftp through an HTTP proxy, you'd get the
file size reported the same way.
I changed the order of the QUOTE command executions. They're now executed
just after the login and before any other command. I did this to enable
quote commands to run before the -I stuff is done, too.