
Daniel (3 February 2001)
- Ingo Ralf Blum provided another fix that makes curl build under the more
  recent cygwin installations. It seems they've changed the preset defines to
  not include WIN32 anymore.

Version 7.6.1-pre2

Daniel (31 January 2001)
- Curl_read() and curl_read() now return a ssize_t for the size, as they had
  to be able to return -1. The telnet support crashed due to this and there
  was a possibility of weird behavior all over. Linus Nielsen Feltzing helped
  me find this.
- Added a configure.in check for a working getaddrinfo() if IPv6 is requested.
  I also made the configure script feature --enable-debug which sets a couple
  of compiler options when used. It assumes gcc.

Daniel (30 January 2001)
- I finally took a stab at the long-term FIXME item I've had on myself, and
  now libcurl will properly work when doing a HTTP range-request that follows
  a Location:. Previously that would make libcurl fail saying that the server
  doesn't seem to support range requests.

Daniel (29 January 2001)
- I added a test case for the HTTP PUT resume thing (test case 33).

Version 7.6.1-pre1

Daniel (29 January 2001)
- Yet another Content-Range change. Ok now? Bob Schader checks from his end 
  and it works for him.

Daniel (27 January 2001)
- So the HTTP PUT resume fix wasn't good. There should apparently be a
  Content-Range header when resuming a PUT.
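
  For illustration, a minimal sketch (a hypothetical helper, not curl's
  actual code) of what such a Content-Range header could look like when a
  PUT is resumed at a given offset:

```c
#include <stdio.h>

/* Hypothetical helper, not curl's code: format the Content-Range header
 * for a PUT resumed at 'offset' of a file of 'total' bytes.
 * The byte range is inclusive: bytes <first>-<last>/<total>. */
int build_content_range(char *buf, size_t len, long offset, long total)
{
  return snprintf(buf, len, "Content-Range: bytes %ld-%ld/%ld\r\n",
                  offset, total - 1, total);
}
```

  Resuming at offset 500 of a 1000-byte file would yield the header
  "Content-Range: bytes 500-999/1000".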

- I noticed I broke the download-check that verifies that a resumed HTTP
  download is actually resumed. It got broken because of my new 'httpreq'
  field in the main curl struct. I should get slapped. I added a test case
  for this now, so I won't be able to ruin this again without noticing.

- Added a test case for content-length verifying when downloading HTTP.

- Made the progress meter title say if the transfer is being resumed. It
  makes the output slightly better for resumes.

- When dealing with Location: and HTTP return codes, libcurl will now attempt
  to follow the spirit of RFC2616 better. It means that when POSTing to a
  URL that is being followed to a second place, the standard will judge on
  what to do. All HTTP codes except 303 and 305 will cause curl to make a
  second POST operation. 303 will make a GET and 305 is not yet supported.

  I also wrote two test cases for this POST/GET/Location stuff.
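
  The rule described above can be sketched as a small decision function
  (an illustrative simplification, not curl's actual code):

```c
#include <stddef.h>

/* Illustrative sketch of the redirect rule described above: after a POST
 * is answered with a 3xx + Location:, which method does the follow-up
 * request use? Not curl's actual code. */
const char *method_after_redirect(int code)
{
  if (code == 303)
    return "GET";    /* 303 always turns the follow-up request into a GET */
  if (code == 305)
    return NULL;     /* 305 (Use Proxy) is not yet supported */
  return "POST";     /* all other codes: make a second POST */
}
```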

Version 7.6

Daniel (26 January 2001)
- Lots of mails back and forth with Bob Schader finally made me add a small
  piece of code in the HTTP engine so that HTTP upload resume works. You can
  now do an operation like 'curl -T file -C <offset> <URL>' and curl will PUT
  the ending part of the file starting at the given offset to the specified
  URL.

Version 7.6-pre4

Daniel (25 January 2001)
- I took hold of Rick Jones' question why we don't use recv() and send() for
  reading/writing to the sockets and I've now modified the sread() and
  swrite() macros to use them instead. If nothing else, they could be tested
  in the next beta-round coming right up.

- Jeff Morrow found a problem with libcurl's usage of SSL_read() and supplied
  his research results in how to fix this. It turns out we have to invoke the
  function several times in some cases. The same goes for the SSL_write().

  I made some rather drastic changes all over libcurl to make all writes and
  reads get done on one single place so that this repeated-attempts thing
  would only have to be implemented at one point.
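
  The repeated-attempts idea can be sketched like this (a simplified model
  with a hypothetical callback type, not curl's internal interface):

```c
#include <stddef.h>
#include <string.h>
#include <sys/types.h>  /* ssize_t */

/* Hypothetical callback standing in for SSL_write()/swrite(). */
typedef ssize_t (*write_fn)(void *handle, const void *buf, size_t len);

/* Keep invoking the underlying write until the whole buffer is sent,
 * since a single call may accept only part of the data. */
ssize_t write_all(write_fn write_one, void *handle,
                  const void *buf, size_t len)
{
  const char *p = (const char *)buf;
  size_t left = len;
  while (left > 0) {
    ssize_t n = write_one(handle, p, left);
    if (n <= 0)
      return n;             /* hard error: report it upward */
    p += n;
    left -= (size_t)n;
  }
  return (ssize_t)len;
}

/* A fake transport for demonstration: it accepts at most 3 bytes per
 * call, mimicking a short write. */
static char sink[64];
static size_t sink_pos;
static ssize_t short_write(void *handle, const void *buf, size_t len)
{
  size_t n = len > 3 ? 3 : len;
  (void)handle;
  memcpy(sink + sink_pos, buf, n);
  sink_pos += n;
  return (ssize_t)n;
}
```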

- Rick Jones spotted that the 'total time' counter really didn't measure the
  total time very accurately on subsecond levels.

- Johan Nilsson pointed out the need to more clearly specify that the timeout
  value you set for a download is for the *entire* download. There's currently
  no option available that sets a timeout for the connection phase only.

- Ingo Ralf Blum submitted a series of patches required to get curl to compile
  properly with cygwin.

- Robert Weaver posted a fix for the win32 section of the curl_getenv() code
  that corrected a potential memory leak.

- Added comments in a few files in a sudden attempt to make the sources more
  easy to read and understand!

Daniel (23 January 2001)
- Added simple IPv6 detection in the configure script and made the version
  string add 'ipv6' to the enable section in that case. ENABLE_IPV6 will be
  set if curl is compiled with IPv6 support enabled.

- Added a parser for IPv6-style specified IP-addresses in a URL. Thus, when
  IPv6 gets enabled soon, we can use URLs like '[0::1]:80'...
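
  A minimal sketch of such a parser (a hypothetical helper, not the one
  added to libcurl):

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper, not libcurl's parser: split an IPv6-style
 * "[host]:port" spec into its host and port parts.
 * Returns 0 on success, -1 on malformed input. */
int split_ipv6_hostport(const char *spec, char *host, size_t hostlen,
                        int *port)
{
  const char *close = strchr(spec, ']');
  size_t n;
  if (spec[0] != '[' || !close)
    return -1;                      /* needs matching brackets */
  n = (size_t)(close - spec) - 1;
  if (n >= hostlen)
    return -1;                      /* host buffer too small */
  memcpy(host, spec + 1, n);
  host[n] = '\0';
  *port = (close[1] == ':') ? atoi(close + 2) : 0;
  return 0;
}
```

  Feeding it '[0::1]:80' would give the host '0::1' and port 80.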

- Made the URL globbing in the client fail silently if there's an error in
  the globbing. It makes it almost intuitive: when you don't follow the
  syntax rules, globbing is simply switched off and the raw string is used
  instead.

  I still think we'll get problems with IPv6-style IP-addresses when we *want*
  globbing on parts of the URL as the initial part of the URL will for sure
  seriously confuse the globber.

Daniel (22 January 2001)
- Björn Stenberg supplied a progress meter patch that makes it look better even
  during slow starts. Previously it made some silly assumptions...

- Added two FTP tests for -Q and -Q - stuff since it was being discussed on
  the mailing list. Had to correct the ftpserver.pl too as it bugged slightly.

Daniel (19 January 2001)
- Made the Location: parsers deal with any-length URLs. Thus I removed the last
  code that restricts the length of URLs that curl supports.

- Added a --globoff test case (#28) and it quickly identified a memory problem
  in src/main.c that I took care of.

Version 7.6-pre3

Daniel (17 January 2001)
- Made the two former files lib/download.c and lib/highlevel.c become the new
  lib/transfer.c which makes more sense. I also did the rename from Transfer()
  to Curl_Transfer() in the other source files that use the transfer function
  in the spirit of using Curl_ prefix for library-scoped global symbols.

- Added -g/--globoff that switches OFF the URL globbing and thus enables {}[]
  letters to be part of the URL. Do note that RFC2396 section 2.4.3
  explicitly mentions that these letters should be escaped. This was posted
  as a feature request by Jorge Gutierrez and as a bug by Terry.

- Short options to curl that require parameters can now be specified without
  having the option and its parameter separated with a space. -ofile works as
  well as -o file. -m20 is equal to -m 20. Do note that this goes for
  single-letter options only; verbose --long-style options still must be
  separated with a space from their parameters.
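
  The rule can be sketched as follows (a hypothetical helper, not curl's
  actual option parser):

```c
#include <stddef.h>

/* Hypothetical helper, not curl's parser: for a single-letter option
 * that takes a parameter, the value is either attached ("-ofile") or
 * the following argument ("-o file"). */
const char *short_opt_value(const char *arg, const char *next)
{
  if (arg == NULL || arg[0] != '-' || arg[1] == '\0')
    return NULL;        /* not a short option at all */
  if (arg[2] != '\0')
    return arg + 2;     /* "-ofile" or "-m20": value is attached */
  return next;          /* "-o file": value is the next argument */
}
```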

Daniel (8 January 2001)
- Francis Dagenais reported that the SCO compiler still fails when compiling
  curl due to the getpass_r() prototype. I've now wrapped it in #ifndef
  HAVE_GETPASS_R in an attempt to please the SCO systems.

- Made some minor corrections to get the client to cleanup properly and I made
  the separator work again when getting multiple globbed URLs to stdout.

- Worked with Loic Dachary to get the make dist and make distcheck work
  correctly. The 'maketgz' script is now using the automake generated 'make
  dist' when creating release archives. Loic successfully made 'make rpms'
  automatically build RPMs!

Loic Dachary (6 January 2001)
- Automated generation of rpm packages, no need to be root.

- make distcheck generates a proper distribution (EXTRA_DIST
  in all Makefile.am modified to match FILES).

- Huge client-side hack: now multiple URLs are supported. Any number of URLs
  can be specified on the command line, and they'll all be downloaded. There
  must be a corresponding -o or -O for each URL or the data will be written to
  stdout. This needs more testing, time to release a 7.6-pre package.

- The krb4 support was broken in the release. Fixed now.

- Huge internal symbol rename operation. All non-static but still lib-internal
  symbols should now be prefixed with 'Curl_' to prevent collisions with other
  libs. All public symbols should be prefixed with 'curl_' and the rest should
  be static and thus invisible to the outside world. I updated the INTERNALS
  document to say this as well.
Version 7.5.2

Daniel (4 January 2001)
- As Kevin P Roth suggested, I've added text to the man page for every command
  line option and what happens when you specify that option more than
  once. That hasn't been exactly crystal clear before.

- Made it possible to run the configure script from outside the source-tree.
  For odd reasons I can't build curl properly outside though. It has to do
  with curl's dependencies on libcurl...

- Cut off all older (dated 1999 and earlier) CHANGES entries from this file.
  The older piece is named CHANGES.0 and is added to the CVS repository in
  case anyone would need it.

- I added another file 'CVS-INFO' to the CVS. It contains information about
  files in the CVS that aren't included in release archives and how to build
  curl when you get the sources off CVS.

- Updated CONTRIBUTE and FAQ due to the new license.

Daniel (3 January 2001)
- Renamed README.libcurl to LIBCURL

- Changed headers in all sources files to the new dual license concept of
  curl: use the MIT/X derivate license *or* MPL. The LEGAL file was updated
  accordingly and the MPL 1.1 and MIT/X derivate licenses are now part of the
  release archive.