                                  _   _ ____  _     
                              ___| | | |  _ \| |    
                             / __| | | | |_) | |    
                            | (__| |_| |  _ <| |___ 
                             \___|\___/|_| \_\_____|

                               History of Changes


Version 6.4

Daniel (28 December 1999):
 - Tim Verhoeven <dj@walhalla.sin.khk.be> correctly identified that curl
   didn't support URL formatted file names when getting files over FTP. Now
   it does, though there's still a problem with getting very weird file
   names off FTP servers. RFC 959 defines that the file name syntax to use
   should be the same as in the native OS of the server. Since we don't know
   the peer server's system, we currently just translate the URL syntax into
   plain letters, as sketched below. It is still better than before, and
   with the Solaris 2.6-supplied ftp server it works with spaces in file
   names.
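
   As an illustration of the idea (a sketch only, not the actual curl
   source), the translation amounts to decoding URL %XX escapes back into
   plain characters:

      #include <ctype.h>
      #include <stdlib.h>

      /* Decode URL %XX escapes in 'src' into plain characters in 'dst'.
         'dst' must be at least as large as 'src'. */
      static void urldecode(char *dst, const char *src)
      {
        while(*src) {
          if(src[0] == '%' && isxdigit((unsigned char)src[1]) &&
             isxdigit((unsigned char)src[2])) {
            char hex[3] = { src[1], src[2], 0 };
            *dst++ = (char)strtol(hex, NULL, 16);
            src += 3;
          }
          else
            *dst++ = *src++;
        }
        *dst = '\0';
      }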

Daniel (27 December 1999):
 - When curl parsed cookies straight off a remote site, it corrupted the
   input data which, if the downloaded headers were stored, left very odd
   characters in the saved data. Correctly identified and reported by Paul
   Harrington <paul@pizza.org>.

Daniel (13 December 1999):
 - General cleanups in the library interface. Some bad kludges had been
   added during times of stress and I did my best to clean them off. This
   concerned both the lib API and include file confusion.

Daniel (3 December 1999):
 - A small --stderr bug was reported by Eetu Ojanen <esojanen@jyu.fi>...

 - who also brought the suggestion of extending the -X flag to FTP list
   commands as well. So now it does, and the long option is now --request
   instead. For now it applies only to FTP list (and the former HTTP stuff
   too, of course).

Lars J. Aas <larsa@sim.no> (24 November 1999):
 - Patched curl to compile and build under BeOS. Doesn't work yet though!

 - Corrected the Makefile.am files to allow putting object files in
   different directories than the sources.

Version 6.3.1

Daniel (23 November 1999):
 - I've had this major disk crash. My good old trust-worthy source disk
   died along with the machine that hosted it. Thank goodness most of the
   things I've done are either backed up elsewhere or stored in this CVS
   server!

 - Michael S. Steuer <michael@steuer.com> pointed out a bug in the -F
   handling that made curl hang if you posted an empty variable such as
   '-F name='. It was one of those old bugs that never worked properly...

 - Jason Baietto <jason@durians.com> pointed out a general flaw in the HTTP
   download. Curl didn't complain if it was prematurely aborted before the
   entire download was completed. It does now.

Daniel (19 November 1999):
 - Chris Maltby <chris@aurema.com> very accurately criticized the lack of
   return code checks on the fwrite() calls. I did a thorough check for all
   occurrences and corrected this.

Daniel (17 November 1999):
 - Paul Harrington <paul@pizza.org> pointed out that the -m/--max-time option
   doesn't work for the slow system calls like gethostbyname()... I don't have
   any good fix yet, just a slightly less bad one that makes curl exit hard
   when the timeout is reached.

 - Bjorn Reese helped me pin down a possible problem that might be the
   reason why Thomas Hurst experienced problems in his Amiga version.

 Daniel (12 November 1999):
 - I found a crash in the new cookie file parser. It crashed when you gave
   a plain http header file as input...

Version 6.3

 Daniel (10 November 1999):
 - I kind of found out that the HTTP time-conditional GETs (-z) aren't
   always respected by the web server, which then sends the whole document
   again even though it doesn't match the requested condition. After reading
   section 13.3.4 of RFC 2616, I think I'm doing the right thing now when I
   do my own check as well. If curl thinks the condition isn't met, the
   transfer is aborted prematurely (after all the headers have been
   received).
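
   For example, to fetch a document only if it changed after a given date
   (the URL here is just a placeholder):

        curl -z "27 Dec 1999" http://www.example.com/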

 - After comments from Robert Linden <robert.linden@postcom.deutschepost.de>
   I also rewrote some parts of the man page to better describe how -F
   works.

 - Michael Anti <anti@pshowing.com> put up a new curl download mirror in
   China:  http://www.pshowing.com/curl/

 - I added the list of download mirrors to the README file.

 - I added more explanations to the man page.

 Daniel (8 November 1999):
 - I made the -b/--cookie option capable of reading netscape formatted
   cookie files as well as normal http-header files. It should be able to
   transparently figure out what kind of file it got as input; a sketch of
   the idea follows below.
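
   A minimal sketch of such a check (illustrative only, not the actual curl
   logic): plain header files carry "Set-Cookie:" lines, while the netscape
   format uses TAB-separated fields.

      #include <string.h>
      #include <strings.h>   /* strncasecmp() */

      /* Guess the cookie file format from one non-comment line. */
      static int is_netscape_format(const char *line)
      {
        if(strncasecmp(line, "Set-Cookie:", 11) == 0)
          return 0;                        /* HTTP header style file */
        return strchr(line, '\t') != NULL; /* netscape: TAB separated */
      }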

 Daniel (29 October 1999):
 - Another one of Sebastiaan van Erk's ideas (that has been requested
   before, but I seem to have forgotten by whom) is to add support for
   ranges in FTP downloads. As usual, one request is just a request; when
   there are two it is a demand. I've added simple support for X-Y style
   fetches. X has to be the lower number, though you may omit one of the
   numbers. Use the -r/--range switch (previously HTTP-only), as in the
   examples below.
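
   For example (the URL is just a placeholder):

        curl -r 0-99 ftp://ftp.example.com/file    (the first 100 bytes)
        curl -r 100- ftp://ftp.example.com/file    (everything from byte 100)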

 - Sebastiaan van Erk <sebster@sebster.com> suggested that curl should be
   able to show the file size of a specified file. I think this is a
   splendid idea and the -I flag is now working for FTP. It displays the
   file size in this manner:
        Content-Length: XXXX
   As this resembles normal headers, it leaves us the opportunity to add
   more info in that display if we can come up with more in the future! It
   also makes sense since if you access ftp through an HTTP proxy, you'd
   get the file size the same way.

   I changed the order of the QUOTE command executions. They're now
   executed just after the login and before any other command. I made this
   change to let quote commands run before the -I stuff is done, too.

 - I found out that -D/--dump-header and -V/--version weren't documented in
   the man page.

 - Many HTTP/1.1 servers do not support ranges. Don't ask me why. I did add
   some text about this in the man page for the range option. The thread in
   the mailing list that started this was initiated by Michael Anti
   <anti@pshowing.com>.

 - I get reports about nroff crashes on Solaris 2.6+ when displaying the
   curl man page. Switch to gnroff instead; it is reported to work(!). Adam
   Barclay <adam@oz.org> reported the problem and brought the suggestion.

 - In a dialogue with Johannes G. Kristinsson <d98is@dtek.chalmers.se> we
   came up with the idea of letting -H/--header specified headers replace
   the internally generated ones, if you happen to add a header that curl
   normally uses by itself. The advantage of this is not entirely obvious,
   but in Johannes' case it means that he can use another Host: than the
   one curl would set, as in the example below.
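
   For example (the addresses are placeholders), this replaces the Host:
   header curl would otherwise generate by itself:

        curl -H "Host: www.example.com" http://10.0.0.1/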

 Daniel (27 October 1999):
 - Jongki Suwandi <Jongki.Suwandi@eng.sun.com> brought a nice patch for
   (yet another) crash when following a location:. This time you had to
   follow an https:// server's redirect to get the core dump.

Version 6.2

 Daniel (21 October 1999):
 - I think I managed to remove the suspicious (nil) that has been seen just
   before the "Host:" in HTTP requests when -v was used.
 - I found out that if you followed a location: when using a proxy, without
   having specified http:// in the URL, the protocol part was added once
   again when moving to the next URL! (The protocol part has to be added to
   the URL when going through a proxy, since the proxy has no
   protocol-guessing system such as curl has.)
 - Benjamin Ritcey <ritcey@tfn.com> reported a core dump under Solaris 2.6
   with OpenSSL 0.9.4. It turned out this was due to a bad free() in main.c
   that occurred after the download was completed.
 - Benjamin found that ftp downloads wrote the first line of the download
   meter twice; I removed that problem, which was introduced with the
   multiple URL support.
 - Dan Zitter <dzitter@zitter.net> correctly pointed out that curl 6.1 and
   earlier versions didn't honor RFC 2616 chapter 4 section 2, "Message
   Headers": "...Field names are case-insensitive..."
   HTTP header parsing assumed a certain casing. Dan also provided me with
   a patch that corrected this, which I took the liberty of editing slightly.
 - Dan Zitter also provided a nice patch for config.guess to better
   recognize Mac OS X.
 - Dan also corrected a minor problem in the lib/Makefile that caused linking
   to fail on OS X.

 Daniel (19 October 1999):
 - Len Marinaccio <len@goodnet.com> ran into some problems with curl. Since
   Windows has a crippled shell, it can't redirect stderr and that causes
   trouble. I added --stderr today, which allows the user to redirect the
   stderr stream to a file or stdout (example below).
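
   For example (the file name is arbitrary):

        curl --stderr errors.txt http://www.example.com/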

 Daniel (18 October 1999):
 - The configure script now understands the '--without-ssl' flag, which
   totally disables SSL/https support. Previously it wasn't possible to
   force the configure script to leave SSL alone. The previous
   auto-detection functionality has been retained. Troy Engel helped test
   this new one.
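
   That is:

        ./configure --without-ssl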

Version 6.1

 Daniel (17 October 1999):
 - I ifdef'ed or commented out all the zlib stuff in the sources and
   configure script. It turned out we needed to muck about with zlib more
   than I initially thought to make it capable of downloading compressed
   HTTP documents and uncompressing them on the fly. I never meant the
   zlib parts of curl to become more than minor, so this means I halt the
   zlib expedition for now and wait until someone either writes the code or
   zlib gets updated and better adjusted for this kind of usage. I won't
   get into details here, but a short summary is suitable:
   - zlib can't automatically detect whether to use zlib or gzip