   load the .curlrc file for command line parameters. The syntax for the
   config file is the standard command line argument style. Details in 'curl
   -h' or the README.
 - I removed the -k option. Keep-alive isn't really anything anyone would
   want to enable with curl anyway.
 - Martin Staael helped me add the 'irix' target. Now
   "make irix" should build curl successfully on non-gcc SGI machines.
 - Single switches now toggle behaviours. I.e. if you use -v -v, the second
   will switch off the verbose mode the first one enabled. This is so that
   you can disable a default setting that a .curlrc file enables, etc.
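   For example (the URL here is made up):

        curl -v -v http://www.example.com/

   The second -v switches verbose mode off again, which is handy if .curlrc
   turns it on by default.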

Version 4.9 (Oct 7, 1998)
 Daniel Stenberg
 - Martin Staael suggested that curl should support cookies.
   I added -b/--cookie to enable free-text cookie data to be passed. There's
   also a little blurb about general cookie stuff in the README/help text.
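   For example (the host name is made up; the cookie string is free text, as
   described above):

        curl -b "name1=value1; name2=value2" http://www.example.com/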
 - dmh <dmh at jet.es> suggested HTTP resume capabilities. Although you could
   manually get curl to resume HTTP documents, I made the -c resume flag work
   for HTTP too (unless -r is used too, which would be very odd anyway).
 - Added checklinks.pl to the archive. It is a still experimental perl script
   that checks all links of a web page by using curl.
 - Rearranged the archive hierarchy a little. Build the executable in the
   src/ dir from now on!
 - Version 4.9 and onwards is no longer released under the GPL license.
   I have updated the LEGAL file etc, and this is now released under the
   Mozilla Public License to avoid the plague known as "the GPL virus". You
   must make the source available if you decide to change and/or redistribute
   curl, but if you decide to use curl within something else you do not need
   to offer the world the source to that too.
 - Curl did not like HTTP servers that sent no headers at all on a GET
   request.  It is a violation of RFC2068 but apparently some servers do
   that anyway.  Thanks to Gordon Beaton for the report!
 - -L/--location was added after a suggestion from Martin Staael. This makes
   curl ATTEMPT to follow the Location: redirect if one is present in the HTTP
   headers. If -i or -I is used with this flag, you will see headers from all
   sites the Location: points to. Do note that the first server can point to a
   second that points to a third etc. It seems the Location: parameter (said
   to be an AbsoluteURI in RFC2068) isn't always absolute.. :-/ Anyway, I've
   made curl ATTEMPT to do the best it can to deal with the reality.
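   A made-up example (this URL does not exist):

        curl -L http://www.example.com/old-page.html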
 - Added getlinks.pl to the archive. getlinks.pl selectively downloads
   files that a web page links to.

Version 4.8.4
 Daniel Stenberg
 - As Julian Romero Nieto reported, curl reported the wrong version number.
 - As Teemu Yli-Elsila pointed out, the win32 version of 4.8 (and probably all
   other versions for win32) didn't work with binary files since I'm too used
   to the UNIX style fopen() where binary and text don't differ...
 - Ralph Beckmann brought me some changes that let curl compile error- and
   warning-free with -Wall -pedantic with g++. I also took the opportunity to
   clean out some unused variables and similar.
 - Ralph Beckmann made me aware of a really odd bug, now corrected. When curl
   read a set of headers from an HTTP server, divided into more than one read,
   and the first read showed a full line *exactly* (i.e. ending with a
   newline), curl did not behave well.

Version 4.8.3
 Daniel Stenberg
 - I was too quick to release 4.8.2 with too little testing. One of the
   changes is now reverted slightly to the 4.8.1 way since 4.8.2 couldn't
   upload files. I still think both problems corrected in 4.8.2 remain
   corrected.  Reported by Julian Romero Nieto.

Version 4.8.2
 Daniel Stenberg
 - Bernhard Iselborn reported two FTP protocol errors curl made. They're now
   corrected. Both appeared when getting files from an MS FTP server! :-)

Version 4.8.1
 Daniel Stenberg
 - Added a last update of the progress meter when the transfer is done. The
   final output on the screen previously didn't have to show the final size
   transferred, which sometimes made it look odd.
 - Thanks to David Long I got rid of a silly bug that happened if an HTTP page
   had nothing but headers. Apparently Solaris deals with negative sizes in
   fwrite() calls a lot better than Linux does... =B-]

Version 4.8
 Daniel Stenberg
 - Continue FTP file transfer. -c is the switch. Note that you need to
   specify a file name if you want to resume a download (you can't resume a
   download sent to stdout). Resuming upload may be limited by the server,
   since curl then uses the non-RFC959 command SIZE to get the size of
   the target file before the upload begins (to figure out which offset to
   use). Use -C to specify the offset yourself! -C is handy if you're sending
   the output to something other than a plain file or when you just want to
   get the end of a file.
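   Two made-up examples of the idea (the host name and the offset are
   invented; -o names the output file as described above):

        curl -c -o file.zip ftp://ftp.example.com/file.zip
        curl -C 50000 -o file.zip ftp://ftp.example.com/file.zip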
 - recursiveftpget.pl now features a maximum recursive level argument.

Version 4.7
 Daniel Stenberg
 - Added support to abort a download if the speed stays below a certain number
   of bytes per second (speed-limit) for a certain period of time (speed-time).
 - Wrote a perl script 'recursiveftpget.pl' to recursively use curl to get a
   whole ftp directory tree. It is meant as an example of how curl can be
   used. I agree that making a separate new connection for each file and
   directory isn't the wisest thing to do.

Version 4.6
 Daniel Stenberg
 - Added a first attempt to optionally parse the .netrc file for login user
   and password. If used with http, it enables user authentication. -n is
   the new switch.
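   An entry in .netrc uses the standard .netrc syntax; the host and
   credentials below are made up:

        machine ftp.example.com login myname password mysecret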
 - Removed the extra newlines on the default user-agent string.
 - Corrected the missing ftp upload error messages when it failed without the
   verbose flag set. Gary W. Swearingen found it.
 - Now using alarm() to enable second-precision timeouts even in the name
   resolving/connecting phase. The timeout is, however, reset after that first
   sequence. (This should be corrected.) Gary W. Swearingen reported.
 - Now spells "Unknown" properly, as in "Unknown option 'z'"... :-)
 - Added bug report email address in the README.
 - Added a "current speed" field to the progress meter. It shows the average
   speed over the last 5 seconds. The other speed field shows the average
   speed of the entire transfer so far.

Version 4.5.1
 Linas Vepstas
 - SSL through proxy fix
 - Added -A to allow User-Agent: changes

 Daniel Stenberg 
 - Made -A work when using SSL through a proxy.

Version 4.5
 Linas Vepstas
 - More SSL corrections
 - I've added a port to AIX.
 - Running SSL through a proxy causes a chunk of code to be executed twice;
   one of those blocks needs to be deleted.

 Daniel Stenberg
 - Made -i and -I work again

Version 4.4
 Linas Vepstas
 - -x can now also specify the proxy port when used as in 'proxyhost:proxyport'
 - SSL fixes

Version 4.3
 Daniel Stenberg
 - Adjusted to compile under win32 (VisualC++ 5). The -P switch does not
   support network interface names in win32. I couldn't figure out how!

Version 4.2
 Linas Vepstas / Sampo Kellomaki
 - Added SSL / SSLeay support (https://)
 - Added the -T usage for HTTP POST.

 Daniel Stenberg
 - Bugfixed the SSL implementation.
 - Made -P a lot better at using other IP addresses. It now accepts a following
   parameter that can be either
        interface  - i.e. "eth0" to specify which interface's IP address you
                     want to use
        IP address - i.e. "192.168.10.1" to specify the exact IP number
        host name  - i.e. "my.host.domain" to specify the machine
        "-"        - (any single-letter string) to make it pick the machine's
                     default
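   Made-up examples of the first three forms (the ftp URL is invented):

        curl -P eth0 ftp://ftp.example.com/file.txt
        curl -P 192.168.10.1 ftp://ftp.example.com/file.txt
        curl -P my.host.domain ftp://ftp.example.com/file.txt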
 - The Makefile is now ready to compile for solaris, sunos4 and linux right
   out of the box.
 - Better generated version string seen with 'curl -V'

Version 4.1
 Daniel Stenberg
 - The IP number returned by the ftp server as a reply to PASV no longer has
   to resolve through DNS. In fact, no IP-number-only addresses have to anymore.
 - Binds better to an available port when -P is used.
 - Now LISTs ./ instead of / when used as in ftp://ftp.funet.fi/. The reason
   for this is that exactly that site, ftp.funet.fi, does not allow LIST /
   while LIST ./ is fine. Any objections?

Version 4 (1998-03-20)
 Daniel Stenberg
 - I took another huge step and changed both version number and project name!
   The reason for the new name is that there are just one too many programs
   named urlget already, and this one can already do a lot more than merely
   get URLs. The reason for the version number is that I did add the pretty
   big change in -P, and since I changed the name I wanted to start with
   something fresh!
 - The --style flags are working better now.
 - Listing directories with FTP often reported that the file transfer was
   incomplete. Wrong size assumptions were too common for directories, so no
   size comparison will be attempted on them from now on.
 - Implemented the -P flag that lets the ftp control connection issue a PORT
   command instead of the standard PASV.
 - -a for appending FTP uploads works.

***************************************************************************

Version 3.12
 Daniel Stenberg
 - End-of-header tracking still lacked support for \r\n or just \n at the
   end of the last header line.
 Sergio Barresi
 - Added PROXY authentication.
 Rafael Sagula
 - Fixed some little bugs.

Version 3.11
 Daniel Stenberg
 - The header parsing was still not correct since the 3.2 modification...

Version 3.10
 Daniel Stenberg
 - 3.7 and 3.9 were simultaneously developed and merged into this version.
 - FTP upload did not work correctly since 3.2.

Version 3.9
 Rafael Sagula
 - Added the "-e <url> / --referer <url>" option where we can specify
   the referer page. Obviously, this is necessary only to fool the
   server, but...
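   For example (both URLs here are made up):

        urlget -e http://www.example.com/ http://www.example.com/page.html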

Version 3.7
 Daniel Stenberg
 - Now checks the last error code sent from the ftp server after a file has
   been received or uploaded. Wasn't done previously.
 - When 'urlget <host>' is used without a 'protocol://' first in the host part,
   it now checks for host names starting with ftp or gopher and, if so, uses
   that protocol by default instead of http.

Version 3.6
 Daniel Stenberg
 - A silly mistake caused the POST bug. This has now also been tested to work
   with a proxy.

Version 3.5
 Daniel Stenberg
 - Highly inspired by Rafael Sagula's changes to 3.1 that added an almost
   functional POST, I applied his changes to this version and made them work.
   (It seems POST requires the Content-Type and Content-Length headers.) It is
   now usable with the -d switch.

Version 3.3 - 3.4
 Skipped to avoid confusion

Version 3.2
 Daniel Stenberg
 - Major rewrite of two crucial parts of this code: upload and download.
   They are both now using select(), which allows a much better progress
   meter and time control (a rough sketch of the idea follows after this
   entry's items).
 - alarm() usage removed completely
 - FTP get can now list directory contents if the path ends with a slash '/'.
   Urlget on an ftp path that doesn't end with a slash means urlget will
   attempt to get it as a file name.
 - FTP directory view supports -l for "list-only" which lists the file names
   only.
 - All operations support -m for max time usage in seconds allowed.
 - FTP upload now allows the size of the uploaded file to be provided, and
   thus it can better check that it actually uploaded the whole file. It also
   makes the progress meter for uploads much better!
 - Made the parameter parsing fail in cases like 'urlget -r 900' which
   previously tried to connect to the host named '900'.
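
   A rough sketch in plain C of the select() idea mentioned in the first item
   above. This is not the actual urlget code: the function name, buffer size
   and structure are made up, and error handling is left out:

        #include <stdio.h>
        #include <time.h>
        #include <sys/select.h>
        #include <sys/types.h>
        #include <unistd.h>

        /* Read from an already-connected socket until EOF, waking up at
           least once a second so that a progress meter update and a
           max-time check (the -m option) can be done. */
        int transfer(int sockfd, FILE *out, long maxtime)
        {
          char buf[4096];
          time_t start = time(NULL);

          for(;;) {
            fd_set readfds;
            struct timeval tv;

            FD_ZERO(&readfds);
            FD_SET(sockfd, &readfds);
            tv.tv_sec = 1;
            tv.tv_usec = 0;

            if(select(sockfd + 1, &readfds, NULL, NULL, &tv) > 0) {
              ssize_t nread = read(sockfd, buf, sizeof(buf));
              if(nread <= 0)
                break;          /* done, or read error */
              fwrite(buf, 1, (size_t)nread, out);
            }
            /* progress meter update would go here */
            if(maxtime && (time(NULL) - start) > maxtime)
              return -1;        /* max allowed time exceeded */
          }
          return 0;
        }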

Version 3.1
 Kjell Ericson
 - Pointed out how to correct the 3 warnings in win32-compiles.

 Daniel Stenberg
 - Removed all calls to exit().
 - Made the short help text get written to stdout instead of stderr.
 - Made this file instead of keeping these comments in the source.
 - Made two callback hooks, that enable external programs to use urlget()
   easier and to grab the output/offer the input easier.
 - It is evident that Win32-compiles are painful. I watched the output from
   the Borland C++ v5 and it was awful. Just ignore all those warnings.

Version 3.0
 Daniel Stenberg
 - Added FTP upload capabilities. The name urlget gets a bit silly now
   when we can put too... =)
 - Restructured the source quite a lot.
   Changed the urlget() interface. This way, we will survive changes much
   better. New features can come and old can be removed without us needing
   to change the interface. I've written a small explanation in urlget.h
   that explains it.
 - New flags include -t, -T, -O and -h. The -h text is generated by the new
   mkhelp script.

Version 2.9
 Remco van Hooff
 - Added a fix to make it compile smoothly on Amiga using the SAS/C
   compiler.
  
 Daniel Stenberg
 - Believe it or not, but the STUPID Novell web server seems to require
   that the Host: keyword is used, so I now use it and I (re-)introduce the
   urlget User-Agent:. I still have to check that this Host: usage works with
   proxies... 'Host:' is required for HTTP/1.1 GET according to RFC2068.

Version 2.8
 Rafael Sagula
 - some little modifications

Version 2.7
 Daniel Stenberg
 - Removed the -l option and introduced the -f option instead. Now I'll
   rewrite the former -l kludge in an external script that'll use urlget to
   fetch multipart files like that.
 - '-f' is introduced. It means Fail without output in case of HTTP server
   errors (return code >= 300).
 - Added support for -r, ranges. Specify which part of a document you
   want, and only that part is returned. Only works with HTTP/1.1 servers.
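   For example, to get only the first 100 bytes of a document (the URL is
   made up; the range is given in HTTP byte-range style):

        urlget -r 0-99 http://www.example.com/doc.html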
 - Split up the source in 3 parts. Now all pure URL functions are in
   urlget.c and stuff that deals with the stand-alone program is in main.c.
 - I took a few minutes and wrote an embryo of a README file to explain
   a few things.

Version 2.6
 Daniel Stenberg
 - Made the -l (loop) thing use the new CONF_FAILONERROR which makes
   urlget() return an error code if non-successful. It also won't output
   anything then. Now finally removed the HTTP 1.0 and error 404 dependencies.
 - Added -I which uses the HEAD request to get the header only from a
   http-server.

Version 2.5
 Rafael Sagula
 - Made the progress meter use HHH:MM:SS instead of only seconds.

Version 2.4
 Daniel Stenberg
 - Added progress meter. It appears when downloading > BUFFER SIZE and
   mute is not selected. I found out that when downloading large files from
   really really slow sites, it is desirable to know the status of the
   download. Do note that some downloads are done unaware of the size, which
   makes the progress meter less thrilling ;) If the output is sent to a tty,
   the progress meter is shut off.
 - Increased buffer size used for reading.
 - Added length checks in the user+passwd parsing.
 - Made it grok user+passwd for HTTP fetches. The trick is to base64
   encode the user+passwd and send an extra header line. Read chapter 11.1 in
   RFC2068 for details. I added it to be used just like the ftp one.  To get a
   http document from a place that requires user and password, use an URL
   like:

        http://user:passwd@www.site.to.leach/doc.html

   I also added the -u flag, since WHEN USING A PROXY YOU CAN'T SPECIFY THE
   USER AND PASSWORD WITH HTTP LIKE THAT. The -u flag works for ftp too, but
   not if used with proxy. To do the same as the above one, you can invoke:

        urlget -u user:passwd http://www.site.to.leach/doc.html
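
   For reference, the extra header line mentioned above ends up looking like
   this (the encoded value is the base64 of the literal string "user:passwd"):

        Authorization: Basic dXNlcjpwYXNzd2Q=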

Version 2.3
 Rafael Sagula
 - Added "-o" option (output file)
 - Added URG_HTTP_NOT_FOUND return code.
   (Daniel's note:)
   Perhaps we should detect all kinds of errors and instead of writing that
   custom string for the particular 404-error, use the error text we actually
   get from the server. See further details in RFC2068 (HTTP 1.1
   definition). The current way also relies on an HTTP/1.0 reply, which newer
   servers might not do.
 - Looping mode ("-l" option). It's easier to get various split files.
   (Daniel's note:)
   Use it like 'urlget -l 1 http://from.this.site/file%d.html', which will
   make urlget attempt to fetch all files named file1.html, file2.html etc
   until no more files are found. This is only a modification of the
   STAND_ALONE part; nothing in the urlget() function was modified for this.
 Daniel Stenberg
 - Changed the -h to be -i instead. -h should be reserved for help use.
 - Bjorn Reese indicated that Borland _might_ use '_WIN32' instead of the
   VC++ WIN32 define and therefore I added a little fix for that.

Version 2.2
 Johan Andersson
 - The urlget function didn't set the path to the url when using a proxy.
 - Fixed bug with IMC proxy. Now using (almost) complete GET command.
  
 Daniel Stenberg
 - Made it compile on Solaris. Had to reorganize the includes a bit.
   (so Win32, Linux, SunOS 4 and Solaris 2 compile fine.)
 - Made Johan's keepalive keyword optional with the -k flag (since it
   makes a lot of urlgets take a lot longer).
 - Made a '-h' switch in case you want the HTTP-header in the output.

Version 2.1
 Daniel Stenberg and Kjell Ericson
 - Win32-compilable
 - No more global variables
 - Mute option (no output at all to stderr)
 - Full range of return codes from urlget(), which is now written to be a
   function that is easy to use in [other] programs.
 - Define STAND_ALONE to compile the stand alone urlget program
 - Now compiles with gcc options -ansi -Wall -pedantic ;)

Version 2.0
 - Introducing ftp GET support. The FTP URL type is recognized and used.
 - Renamed the project to 'urlget'.
 - Supports the user+passwd in the FTP URL (otherwise it tries anonymous
   login with a weird email address as password).

Version 1.5
 Daniel Stenberg
 - The skip_header() crap messed it up big-time. By simply removing that
   one we can all of a sudden download anything ;)
 - No longer requires a trailing slash on the URLs.
 - If the given URL isn't prefixed with 'http://', HTTP is assumed and
   given a try!
 - 'void main()' is history.

Version 1.4
 Daniel Stenberg
 - The gopher source used the ppath variable instead of path which could
   lead to disaster.

Version 1.3
 Daniel Stenberg
 - Well, I added a lame text about the time it took to get the data. I also
   fought against Johan to prevent his -f option (to specify a file name
   that should be written instead of stdout)! =)
 - Made it write 'connection refused' for that particular connect()
   problem.
 - Renumbered the version. Let's not make silly 1.0.X versions, this is
   a plain 1.3 instead.

Version 1.2
 Johan Andersson
 - Discovered and fixed the problem with getting binary files. puts() is
   now replaced with fwrite(). (Daniel's note: this also fixed the buffer
   overwrite problem I found in the previous version.)

 Rafael Sagula
 - Let "-p" before "-x".

 Daniel Stenberg
 - Bugfixed the proxy usage. It should *NOT* use nor strip the port number
   from the URL but simply pass that information to the proxy. This also
   made the user/password fields possible to use in proxy [ftp-] URLs.
   (like in ftp://user:password@ftp.my.site:8021/README)

 Johan Andersson
 - Implemented HTTP proxy support.
 - Receive byte counter added.

 Bjorn Reese
 - Implemented URLs (and skipped the old syntax).
 - Output is written to stdout, so to achieve the above example, do:
   httpget http://143.54.10.6/info_logo.gif > test.gif

Version 1.1
 Daniel Stenberg
 - Adjusted it slightly to accept named hosts on the command line. We
   wouldn't wanna use IP numbers for the rest of our lives, would we?

Version 1.0
  Rafael Sagula
  - Wrote the initial httpget, which started all this!