                                  _   _ ____  _     
                              ___| | | |  _ \| |    
                             / __| | | | |_) | |    
                            | (__| |_| |  _ <| |___ 
                             \___|\___/|_| \_\_____|

                               History of Changes

Version XX

Daniel (13 March 2000):
- <curl@spam.wolvesbane.net> pointed out that the way curl sent cookies in a
  single line wasn't enjoyed by IIS4.0 servers. In my view, that is not what
  the standards say, but I added a white space between the name/value pairs to
  perhaps make them work better.

- Added the perl check back in the configure.in again since the mkhelp.pl
  script needs it!

- Made some beautifications in the curl man page.

Daniel (3 March 2000):
- Jörn helped me update the config-win32.h files with HAVE_SETVBUF and
  HAVE_STRDUP.

Daniel (3 March 2000):
- Uploaded the 6.5pre2 package.

Daniel (2 March 2000):
- Removed the perl-programs from the distribution, they never made many people
  happy and I'll still keep them available on the web.

- Added the -w and -N stuff to the man page. Documented the new progress meter
  display in README.curl.

- Jörn Hartroth <Joern.Hartroth@telekom.de>, Chris <cbayliss@csc.come> and Ulf
  Möller from the openssl development team helped bring me the details for
  fixing an OpenSSL usage flaw. It became apparent when they released openssl
  0.9.5 since that barfed on curl's bad behaviour (not seeding a random number
  thing).

- Yet another option: -N/--no-buffer disables buffering in the output stream.
  Probably most useful for very slow transfers when you really want to get
  every byte curl receives within some preferred time. Andrew <tmr@gci.net>
  suggested this.
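  As a rough usage sketch (the URL and file name here are made-up examples):

        curl -N -o savefile http://slow.site.com/bigfile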

- Damien Adant <dams@usa.net> mailed me his fixes for making curl compile
  on Ultrix.

Daniel (24 February 2000):
- Applied Jörn Hartroth's fixes for config-win32.h and lib/Makefile.w32.

  I should also make a note here, if nothing else to myself, that when using
  the %-syntax for variables in DOS command prompts, you must use two %-
  letters for each one since that is an escape letter there! Maybe I should
  use another letter instead!

- Added more variables to -w:

  'http_code'
  'time_namelookup'
  'time_connect'
  'time_pretransfer'
  'url_effective'

- Made -w@filename read the syntax from a file and -w@- reads the syntax from
  stdin in the good old "standard" curl way.
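  For illustration only (file name and URL are made-up examples), that could
  look like:

        curl -w@fmt.txt http://www.site.com/

  where fmt.txt holds a string such as '%{http_code} %{url_effective}'.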

Daniel (22 February 2000):
- Released a 6.5pre1 version to get some test and user feedback.

Daniel (21 February 2000):

- I added the -w/--write-out flag and some variables to go with it. -w is a
  single string, whatever you enter there will be written out when curl has
  completed a successful request. There are some variable substitutions and
  they are specified as '%{variable}' (without the quotes). Variables that
  exist as of this moment are:

        total_time     - total transfer time in seconds (with 2 decimals)
        size_download  - total downloaded amount of bytes
        size_upload    - total uploaded amount of bytes
        speed_download - the average speed of the entire download
        speed_upload   - the average speed of the entire upload

  I will of course add more variables, but I need input on these and others.
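  A small sketch of how such a string could look (the URL is a made-up
  example):

        curl -w "%{size_download} bytes in %{total_time} seconds" www.site.com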

- It struck me that the -# progress bar will be hard to just apply on the new
  progress bar concept. I need some feedback on this before that'll get re-
  introduced! :-/

Daniel (16 February 2000):
- Jörn Hartroth brought me some fixes for the progress meter and I continued
  working on it. It seems to work for http download, http post, ftp download
  and ftp upload. That should be a pretty good test that it generally works
  well.

- Still need to add the -# progress bar into the new style progress interface.

- Gonna have a go at my new output option parameter next.

Daniel (15 February 2000):
- The progress meter stuff is slowly taking place. There's more left before it
  is working ok and everything is tested, but we're getting there. Slowly!

Daniel (11 February 2000):
- Paul Marquis <pmarquis@iname.com> fixed the config file parsing of curl to
  deal with any-length lines, removing the previous limit of 4K.

- Eetu Ojanen <esojanen@jyu.fi>'s suggestion of supporting the @-style for -b
  is implemented. Now -b@<filename> works as well as the old style. -b@- also
  similarly reads the cookies from stdin.
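  Just as a sketch (the file name and URL are made-up examples):

        curl -b@cookies.txt http://www.site.com/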

- Reminder: -D should not write to the file until it needs to, in the same way
  -o does. That would enable curl to use -b and -D on the same file...

- Ellis Pritchard <ellis@citria.com> made getdate.y work for MacOS X.

- Paul Harrington <paul@pizza.org> helped me out finding the crash in the
  cookie parser. He also pointed out curl's habit of sending empty cookies to
  the server.

Daniel (8 February 2000):
 - Ron Zapp <rzapper@yahoo.com> corrected a problem in src/urlglob.c that
   prevented curl from getting compiled on sunos 4. The problem had to do
   with the difference in sprintf() return code types.

 - Transfer() should now be able to download and upload simultaneously. Let's
   do some progress meter fixes later this week.

Daniel (31 January 2000):
 - Paul Harrington <paul@pizza.org> found another core dump in the cookie
   parser. Curl doesn't properly recognize the 'version' keyword and I think
   that is what caused this. I need to refresh some specs on cookies and see
   what else curl lacks to improve this a bit more once and for all.

   RFC 2109 clearly specifies how cookies should be dealt with when they are
   compliant with that spec. I don't think many servers are though...

 - Mark W. Eichin <eichin@thok.org> found that while curl is uploading a form
   to a web site, it doesn't read incoming data, which is why it'll hang after
   a while since the socket "pipe" becomes full.

   It took me two hours to rewrite Download() and Upload() into the new
   single function Transfer(). It even seems to work! More testing is required
   of course... I should get the header-sending together in a kind of queue
   and let them get "uploaded" in Transfer() as well.

 - Zhibiao Wu <wuzb@erols.com> pointed out a curl bug in the location: area,
   although I did not get a reproducible way to do this, so I have to wait
   with fixing anything.

 - Bob Schader <rschader@product-des.com> suggested I should implement resume
   support for the HTTP PUT operation, and as I think it is a valid suggestion
   I'll work on it.

Daniel (25 January 2000):
 - M Travis Obenhaus <Travis.Obenhaus@aud.alcatel.com> pointed out a manual
   mixup with -y and -Y that was corrected.

 - Jens Schleusener <Jens.Schleusener@dlr.de> pointed out a problem to compile
   curl on AIX 4.1.4 and gave me a solution. This problem was already fixed
   by Jörn's recent #include modifications!

Daniel (19 January 2000):
 - Oskar Liljeblad <osk@hem.passagen.se> pointed out and corrected a problem
   in the Location: following system that made curl fail when following a
   location: to a different protocol.

   On January 31st I re-considered this fix and the surrounding source code. I
   could not really see that the patch made any difference, so I removed it
   again for further research and debugging. (It disabled location: following
   on servers not running on default ports.)

 - Jörn Hartroth <Joern.Hartroth@telekom.de> brought a fix that once again
   made it possible to select progress bar.

 - Jörn also fixed a few include problems.

Version 6.4

Daniel (17 January 2000):
 - Based on suggestions from Björn Stenberg (bjorn@haxx.nu), I made the
   progress deal better with larger files and added a "Time" field which shows
   the time spent on the download so far.
 - I'm now using the CVS repository on sourceforge.net, which also allows web
   browsing. See http://curl.haxx.nu.

Daniel (10 January 2000):
 - Renumbered some enums in curl/curl.h since tag number 35 was used twice!
 - Added "postquote" support to the ftp section that enables post-ftp-transfer
   quote commands.
 - Now made the -Q/--quote parameter recognize '-' as a prefix, which means
   that command will be issued AFTER a successful ftp transfer. This can of
   course be used to delete or rename a file after it has been uploaded or
   downloaded. Use your imagination! ;-)
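   As a sketch only (host and file names are made up), deleting a file after
   it has been downloaded could look something like:

        curl -Q "-DELE file.txt" -o file.txt ftp://ftp.site.com/file.txt

   where the DELE command gets issued only after a successful transfer.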
 - Since I do the main development on solaris 2.6 now, I had to download and
   install GNU groff to generate the hugehelp.c file. The solaris nroff cores
   on the man page! So, in order to make the solaris configure script find a
   better result I made gnroff get checked prior to the regular nroff.
 - Added all the curl exit codes to the man page.
 - Jim Gallagher <jmgallag@usa.net> properly tracked down a bug in autoconf
   2.13. The AC_CHECK_LIB() macro wrongfully uses the -l flag before the -L
   flag to 'ld' which causes the HP-UX 10.20 flavour to fail on all libchecks
   and therefore you can't make the configure script find the openssl libs!

Daniel (28 December 1999):
 - Tim Verhoeven <dj@walhalla.sin.khk.be> correctly identified that curl
   doesn't support URL formatted file names when getting ftp. Now, there's a
   problem with getting very weird file names off FTP servers. RFC 959 defines
   that the file name syntax to use should be the same as in the native OS of
   the server. Since we don't know the peer server system we currently just
   translate the URL syntax into plain letters. It is still better than
   before, and with the solaris 2.6-supplied ftp server it works with spaces
   in the file names.

Daniel (27 December 1999):
 - When curl parsed cookies straight off a remote site, it corrupted the input
   data, which, if the downloaded headers were stored, made for very odd
   characters in the saved data. Correctly identified and reported by Paul
   Harrington <paul@pizza.org>.

Daniel (13 December 1999):
 - General cleanups in the library interface. There had been some bad kludges
   added during times of stress and I did my best to clean them off. It was
   both regarding the lib API as well as include file confusions.

Daniel (3 December 1999):
 - A small --stderr bug was reported by Eetu Ojanen <esojanen@jyu.fi>...

 - who also brought the suggestion of extending the -X flag to ftp list as
   well. So, now it is and the long option is now --request instead. It is
   only for ftp list for now (and the former http stuff too of course).

Lars J. Aas <larsa@sim.no> (24 November 1999):
 - Patched curl to compile and build under BeOS. Doesn't work yet though!

 - Corrected the Makefile.am files to allow putting object files in
   different directories than the sources.

Version 6.3.1

Daniel (23 November 1999):
 - I've had this major disk crash. My good old trust-worthy source disk died
   along with the machine that hosted it. Thank goodness most of all the
   things I've done are either backed up elsewhere or stored in this CVS
   server!

 - Michael S. Steuer <michael@steuer.com> pointed out a bug in the -F handling
   that made curl hang if you posted an empty variable such as '-F name='. It
   was one of those old bugs that have never worked properly...

 - Jason Baietto <jason@durians.com> pointed out a general flaw in the HTTP
   download. Curl didn't complain if it was prematurely aborted before the
   entire download was completed. It does now.

Daniel (19 November 1999):
 - Chris Maltby <chris@aurema.com> very accurately criticized the lack of
   return code checks on the fwrite() calls. I did a thorough check for all
   occurrences and corrected this.

Daniel (17 November 1999):
 - Paul Harrington <paul@pizza.org> pointed out that the -m/--max-time option
   doesn't work for the slow system calls like gethostbyname()... I don't have
   any good fix yet, just a slightly less bad one that makes curl exit hard
   when the timeout is reached.

 - Bjorn Reese helped me point out a possible problem that might be the reason
   why Thomas Hurst experiences problems in his Amiga version.

 Daniel (12 November 1999):
 - I found a crash in the new cookie file parser. It crashed when you gave
   a plain http header file as input...

Version 6.3

 Daniel (10 November 1999):
 - I kind of found out that the HTTP time-conditional GETs (-z) aren't always
   respected by the web server and the document is therefore sent in whole
   again, even though it doesn't match the requested condition. After reading
   section 13.3.4 of RFC 2616, I think I'm doing the right thing now when I do
   my own check as well. If curl thinks the condition isn't met, the transfer
   is aborted prematurely (after all the headers have been received).

 - After comments from Robert Linden <robert.linden@postcom.deutschepost.de> I
   also rewrote some parts of the man page to better describe how the -F
   works.

 - Michael Anti <anti@pshowing.com> put up a new curl download mirror in
   China:  http://www.pshowing.com/curl/

 - I added the list of download mirrors to the README file

 - I did add more explanations to the man page

 Daniel (8 November 1999):
 - I made the -b/--cookie option capable of reading netscape formatted cookie
   files as well as normal http-header files. It should be able to
   transparently figure out what kind of file it got as input.

 Daniel (29 October 1999):
 - Another one of Sebastiaan van Erk's ideas (that has been requested before
   but I seem to have forgotten who it was), is to add support for ranges in
   FTP downloads. As usual, one request is just a request, when they're two
   it is a demand. I've added simple support for X-Y style fetches. X has to
   be the lower number, though you may omit one of the numbers. Use the -r/
   --range switch (previously HTTP-only).
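   As an illustration only (host and file names are made up), getting the
   first 1000 bytes of a file could look like:

        curl -r 0-999 ftp://ftp.site.com/file.txt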

 - Sebastiaan van Erk <sebster@sebster.com> suggested that curl should be
   able to show the file size of a specified file. I think this is a splendid
   idea and the -I flag is now working for FTP. It displays the file size in
   this manner:
        Content-Length: XXXX
   It resembles normal headers, which leaves us the opportunity to add more
   info in that display if we can come up with more in the future! It also
   makes sense since if you access ftp through a HTTP proxy, you'd get the
   file size the same way.

   I changed the order of the QUOTE command executions. They're now executed
   just after the login and before any other command. I made this to enable
   quote commands to run before the -I stuff is done too.
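   For illustration only (host and file names are made up), checking a file
   size could then be done with something like:

        curl -I ftp://ftp.site.com/file.tar.gz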

 - I found out that -D/--dump-header and -V/--version weren't documented in
   the man page.

 - Many HTTP/1.1 servers do not support ranges. Don't ask me why. I did add
   some text about this in the man page for the range option. The thread in
   the mailing list that started this was initiated by Michael Anti
   <anti@pshowing.com>.

 - I get reports about nroff crashes on solaris 2.6+ when displaying the curl
   man page. Switch to gnroff instead, it is reported to work(!). Adam Barclay
   <adam@oz.org> reported and brought the suggestion.

 - In a dialogue with Johannes G. Kristinsson <d98is@dtek.chalmers.se> we came
   up with the idea to let -H/--header specified headers replace the
   internally generated headers, if you happened to select to add a header
   that curl normally uses by itself. The advantage with this is not entirely
   obvious, but in Johannes' case it means that he can use another Host: than
   the one curl would set.
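   A sketch of that use case (both host names are made-up examples):

        curl -H "Host: www.other-site.com" http://www.site.com/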

 Daniel (27 October 1999):
 - Jongki Suwandi <Jongki.Suwandi@eng.sun.com> brought a nice patch for
   (yet another) crash when following a location:. This time you had to
   follow a https:// server's redirect to get the core.

Version 6.2

 Daniel (21 October 1999):
 - I think I managed to remove the suspicious (nil) that has been seen just
   before the "Host:" in HTTP requests when -v was used.
 - I found out that if you followed a location: when using a proxy, without
   having specified http:// in the URL, the protocol part was added once again
   when moving to the next URL! (The protocol part has to be added to the
   URL when going through a proxy since it has no protocol-guessing system
   such as curl has.)
 - Benjamin Ritcey <ritcey@tfn.com> reported a core dump under solaris 2.6
   with OpenSSL 0.9.4. It turned out this was due to a bad free() in main.c
   that occurred after the download was done and completed.
 - Benjamin found that ftp downloads wrote the first line of the download
   meter twice, and I removed that problem. It was introduced with the
   multiple URL support.
 - Dan Zitter <dzitter@zitter.net> correctly pointed out that curl 6.1 and
   earlier versions didn't honor RFC 2616 chapter 4 section 2, "Message
   Headers": "...Field names are case-insensitive..."
   HTTP header parsing assumed a certain casing. Dan also provided me with
   a patch that corrected this, which I took the liberty of editing slightly.
 - Dan Zitter also provided a nice patch for config.guess to better recognize
   Mac OS X.
 - Dan also corrected a minor problem in the lib/Makefile that caused linking
   to fail on OS X.

 Daniel (19 October 1999):
 - Len Marinaccio <len@goodnet.com> came up with some problems with curl.
   Since Windows has a crippled shell, it can't redirect stderr and that
   causes trouble. I added --stderr today which allows the user to redirect
   the stderr stream to a file or stdout.
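   For illustration only (the file name and URL are made up), it could be used
   like:

        curl --stderr errors.txt http://www.site.com/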

 Daniel (18 October 1999):
 - The configure script now understands the '--without-ssl' flag, which now
   totally disables SSL/https support. Previously it wasn't possible to force
   the configure script to leave SSL alone. The previous functionality has
   been retained. Troy Engel helped test this new one.
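   That is, to build a completely SSL-free curl you would run something like:

        ./configure --without-ssl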

Version 6.1

 Daniel (17 October 1999):
 - I ifdef'ed or commented all the zlib stuff in the sources and configure
   script. It turned out we needed to mock more with zlib than I initially
   thought, to make it capable of downloading compressed HTTP documents and
   uncompress them on the fly. I didn't mean the zlib parts of curl to become
   more than minor so this means I halt the zlib expedition for now and wait
   until someone either writes the code or zlib gets updated and better
   adjusted for this kind of usage.  I won't get into details here, but a
   short summary is suitable:
   - zlib can't automatically detect whether to use zlib or gzip
     decompression methods.
   - zlib is very neat for reading gzipped files from a file descriptor,
     although not as nice for reading buffer-based data such as we would
     want.
   - there are still some problems with the win32 version when reading from
     a file descriptor if that is a socket

 Daniel (14 October 1999):
 - Moved the (external) include files for libcurl into a subdirectory named
   curl and adjusted all #include lines to use <curl/XXXX> to maintain a
   better name space and control of the headers. This has been requested.

 Daniel (12 October 1999):
 - I modified the 'maketgz' script to perform a 'make' too before a release
   archive is put together in an attempt to make the time stamps better and
   hopefully avoid the double configure-running that used to occur.

 Daniel (11 October 1999):
 - Applied Jörn's patches that fix zlib for mingw32 compiles, add some other
   missing zlib #ifdefs and add more text on the multiple URL docs in the man
   page.

Version 6.1beta

 Daniel (6 October 1999):
 - Douglas E. Wegscheid <wegscd@whirlpool.com> sent me a patch that did the
   exact same thing as I had just done: the -d switch is now capable of
   reading post data from a named file or stdin.  Use it similarly to the -F.
   To read the post data from a given file:

        curl -d @path/to/filename www.postsite.com

   or let curl read it out from stdin:

        curl -d @- www.postit.com

 Jörn Hartroth (3 October 1999):
 - Brought some more patches for multiple URL functionality. The MIME
   separation ideas are almost scrapped now, and a custom separator is being
   used instead. This is still compile-time "flagged".

 Daniel
 - Updated curl.1 with multiple URL info.

 Daniel (30 September 1999):
 - Felix von Leitner <felix@convergence.de> brought openssl-check fixes
   for configure.in to work out-of-the-box when the openssl files are
   installed in the system default dirs.

 Daniel (28 September 1999)
 - Added libz functionality. This should enable decompressing gzip, compress
   or deflate encoding HTTP documents. It also makes curl send a header saying
   that it accepts those kinds of encoding. Compressed content usually
   shortens download time. I *need* someone to tell me a site that uses
   compressed HTTP documents so that I can test this out properly.

 - As a result of the adding of zlib awareness, I changed the version string
   a little. I plan to add openldap version reporting in there too.

 Daniel (17 September 1999)
 - Made the -F option allow stdin when specifying files. By using '-' instead
   of file name, the data will be read from stdin.

Version 6.0

 Daniel (13 September 1999)
 - Added -X/--http-request <request> to enable any HTTP command to be sent.
   Do note that your server has to support the exact string you enter. This
   should possibly be a string like DELETE or TRACE.
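   As a sketch (the URL is a made-up example):

        curl -X DELETE http://www.site.com/file.txt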

 - Applied Douglas' mingw32-fixes for the makefiles.

 Daniel (10 September 1999)
 - Douglas E. Wegscheid <wegscd@whirlpool.com> pointed out a problem. Curl
   didn't check the FTP server's return code properly after the --quote
   commands were issued. It took anything non-200 as an error, when all 2XX
   codes should be accepted as OK.

 - Sending cookies to the same site in multiple lines like curl used to do
   turned out to be bad and breaking the cookie specs. Curl now sends all
   cookies on a single Cookie: line. Curl is not yet RFC 2109 compliant, but I
   doubt that many servers do use that syntax (yet).

 Daniel (8 September 1999)
 - Jörn helped me make sure it still compiles nicely with mingw32 under win32.

 Daniel (7 September 1999)
 - FTP upload through proxy is now turned into a HTTP PUT. Requested by
   Stefan Kanthak <Stefan.Kanthak@mchp.siemens.de>.

 - Added the ldap files to the .m32 makefile.

 Daniel (3 September 1999)
 - Made cookie matching work while using HTTP proxy.

 Bjorn Reese <breese@mail1.stofanet.dk> (31 August 1999)
 - Passed his ldap:// patch. Note that this requires the openldap shared
   library to be installed and that LD_LIBRARY_PATH points to the
   directory where the lib will be found when curl is run with a
   ldap:// URL.

 Jörn Hartroth <Joern.Hartroth@telekom.de> (31 August 1999)
 - Made the Mingw32 makefiles into single files.
 - Made file:// work for Win32. The same code is now used for unix as well for
   performance reasons.

 Douglas E. Wegscheid <wegscd@whirlpool.com> (30 August 1999)
 - Patched the Mingw32 makefiles for SSL builds.

 Matthew Clarke <clamat@van.maves.ca> (30 August 1999)
 - Made a cool patch for configure.in to allow --with-ssl to specify the
   root dir of the openssl installation, as in

        ./configure --with-ssl=/usr/ssl_here

 - Corrected the 'reconf' script to work better with some shells.

 Jörn Hartroth <Joern.Hartroth@telekom.de> (26 August 1999)
 - Fixed the Mingw32 makefiles in lib/ and corrected the file.c for win32
   compiles.

Version 5.11

 Daniel (25 August 1999)
 - John Weismiller <johnweis@home.com> pointed out a bug in the header-line
   realloc() system in download.c.

 - I added lib/file.[ch] to offer a first, simple, file:// support. It
   probably won't do much good on win32 systems at this point, but I see it
   as a start.

 - Made the release archives get a Makefile in the root dir, which can be
   used to start the compiling/building process more easily. I haven't really
   changed any INSTALL text yet, I wanted to get some feed-back on this
   first.

 Daniel (17 August 1999)
 - Another Location: bug. Curl didn't do proper relative locations if the
   original URL had cgi-parameters that contained a slash. Nusu's page
   again.

 - Corrected the NO_PROXY usage. It is a list of substrings; if one of them
   matches the tail of the host name curl is about to connect to, curl should
   not use a proxy for that connection. Pointed out to me by Douglas E.
   Wegscheid <wegscd@whirlpool.com>. I also changed the README text a little
   regarding this.

 Daniel (16 August 1999)
 - Fixed a memory bug with http-servers that sent Location: to a Location:
   page. Nusu's page showed this too.

 - Made cookies work a lot better. Setting the same cookie name several times
   used to add more cookies instead of replacing the former one which it
   should've. Nusu <nus@intergorj.ro> brought me an URL that made this
   painfully visible...

 Troy (15 August 1999)
 - Brought new .spec files as well as a patch for configure.in that lets the
   configure script find the openssl files better, even when the include
   files are in /usr/include/openssl

Version 5.10

 Daniel (13 August 1999)
 - SSL_CTX_set_default_passwd_cb() has been modified in the 0.9.4 version of
   OpenSSL. Now why couldn't they simply add a *new* function instead of
   modifying the parameters of an already existing function? This way, we get
   a compiler warning if compiling with 0.9.4 but not with earlier versions.
   So, I had to come up with a #if construction that deals with this...

 - Made the SSL version number get displayed properly in curl's output with
   0.9.4.

 Troy (12 August 1999)
 - Added MingW32 (GCC-2.95) support under Win32. The INSTALL file was also
   a bit rearranged.
 
 Daniel (12 August 1999)
 - I had to copy a good <arpa/telnet.h> include file into the curl source
   tree to enable the silly win32 systems to compile. The distribution rights
   allow us to do that as long as the file remains unmodified.

 - I corrected a few minor things that made the compiler complain when
   -Wall -pedantic was used.

 - I'm moving the official curl web page to http://curl.haxx.nu. I think it
   will make it easier to remember as it is a lot shorter and less cryptic.
   The old one still works and shows the same info.

 Daniel (11 August 1999)
 - Albert Chin-A-Young mailed me another correction for NROFF in the
   configure.in that is supposed to be better for IRIX users.

 Daniel (10 August 1999)
 - Albert Chin-A-Young <china@thewrittenword.com> helped me with some stupid
   Makefile things, as well as some fiddling with the getdate.c
   stuff that he had problems with under HP-UX v10. getdate.y will now be
   compiled into getdate.c if the appropriate yacc or bison is found by the
   configure script. Since this is slightly new, we need to test the output
   getdate.c with win32 systems to make sure it still compiles there.

 Daniel (5 August 1999)
 - I've just setup a new mailing list with the intention to keep discussions
   around libcurl development in it. I mainly expect it to be for thoughts and
   brainstorming around a "next generation" library, rather than nitpicking
   about the current implementation or details in the current libcurl.

   To join our happy bunch of future-looking geeks, enter 'subscribe
   <address>' in the body of a mail and send it to
   libcurl-request@listserv.fts.frontec.se.  Curl bug reports, the usual curl
   talk and everything else should still be kept in this mailing list. I've
   started to archive this mailing list and have put the libcurl web page at
   www.fts.frontec.se/~dast/libcurl/.

 - Stefan Kanthak <Stefan.Kanthak@mchp.siemens.de> contacted me regarding a
   few problems in the configure script which he discovered when trying to
   make curl compile and build under Siemens SINIX-Z V5.42B2004!

 - Marcus Klein <m.klein@in-olpe.de> very accurately informed me that
   src/version.h was not present in the CVS repository. Oh, how silly...

 - Linus Nielsen <Linus.Nielsen@sth.frontec.se> rewrote the telnet:// part and
   now curl offers limited telnet support. If you run curl like 'curl
   telnet://host' you'll get all output on the screen and curl will read input
   from stdin. You'll be able to login and run commands etc, but since the
   output is buffered, expect to get a little weird output.

   This is still in its infancy and it might get changed. We need your
   feed-back and input in how this is best done.

   WIN32 NOTE: I bet we'll get problems when trying to compile the current
   lib/telnet.c on win32, but I think we can sort them out in time.

 - David Sanderson <david@transarc.com> reported that FORCE_ALLOCA_H or
   HAVE_ALLOCA_H must be defined for getdate.c to compile properly on HP-UX
   11.0. I updated the configure script to check for alloca.h which should
   make it.

 Daniel (4 August 1999)
 - I finally got to understand Marcus Klein's ftp download resume problem,
   which turns out to be due to different outputs from different ftp
   servers. It makes ftp download resuming a little trickier, but I've made
   some modifications I really believe will work for most ftp servers and I do
   hope you report if you have problems with this!

 - Added text about file transfer resuming to README.curl.

 Daniel (2 August 1999)
 - Applied a progress-bar patch from Lars J. Aas <larsa@sim.no>. It offers
   a new styled progress bar enabled with -#/--progress-bar. 

 T. Yamada <tai@imasy.or.jp> (30 July 1999)
 - It breaks with segfault when 1) curl is using .netrc to obtain
   username/password (option '-n'), and 2) is automatically redirected to
   another location (option '-L').

   There is a small bug in lib/url.c (block starting from line 641), which
   tries to take out username/password from user-supplied command-line
   argument ('-u' option). This block is never executed on first attempt since
   CONF_USERPWD bit isn't set at first, but curl later turns it on when it
   checks for CONF_NETRC bit. So when curl tries to redo everything due to
   redirection, it segfaults trying to access *data->userpwd.

Version 5.9.1

 Daniel (30 July 1999)
 - Steve Walch <swalch@cisoft.com> pointed out that there is a memory leak in
   the formdata functions. I added a FormFree() function that is now used and
   supposed to correct this flaw.

 - Mark Wotton <mwotton@black.ug.cs.usyd.edu.au> reported:
   'curl -L https://www.cwa.com.au/' core dumps.  I managed to cure this by
   correcting the cleanup procedure. The bug seems to be gone with my OpenSSL
   0.9.2b, although it still occurs when I run the ~100 years old SSLeay
   0.8.0. I don't know whether it is curl or SSLeay that is to blame for that.

 - Marcus Klein <m.klein@in-olpe.de>:
   Reported an FTP upload resume bug that I really can't repeat nor understand.
   I leave it here so that it won't be forgotten.

 Daniel (29 July 1999)
 - Costya Shulyupin <costya@trivnet.com> suggested support for longer URLs
   when following Location: and I could only agree and fix it!

 - Leigh Purdie <leighp@defcen.gov.au> found a problem in the upload/POST
   department. It turned out that http.c accidentally cleared the pointer
   instead of the byte counter when supposed to.

 - Costya Shulyupin <costya@trivnet.com> pointed out a problem with port
   numbers and Location:. If you had a server at a non-standard port that
   redirected to an URL using a standard port number, curl still used that
   first port number.

 - Ralph Beckmann <rabe@uni-paderborn.de> pointed out a problem when using both
   CONF_FOLLOWLOCATION and CONF_FAILONERROR simultaneously. Since
   CONF_FAILONERROR exits on the 302 code that comes with the location:
   header, it will never show any html on location: pages. I have now made it
   look for >=400 codes if CONF_FOLLOWLOCATION is set.

 - 'struct slist' is now renamed to 'struct curl_slist' (as suggested by Ralph
   Beckmann).

 - Joshua Swink <jpswink@hotmail.com> and Rick Welykochy <rick@praxis.com.au>
   were the first to point out to me that the latest OpenSSL package has now
   moved the standard include path. It is now in
   /usr/local/ssl/include/openssl and I have now modified the --enable-ssl
   option for the configure script to use that as the primary path, and I
   keep the former path too, to work with older packages of OpenSSL.

 Daniel (9 June 1999)
 - I finally understood the IRIX problem and now it seems to compile on it!
   I am gonna remove those #define strcasecmp() things once and for all now.

 Daniel (4 June 1999)
 - I adjusted the FTP reply 227 parser to make the PASV command work better
   with more ftp servers. Apparently the Roxen Challenger server replied
   something curl 5.9 couldn't deal with! :-( Reported by Ashley Reid-Montanaro
   <ashley@compsoc.man.ac.uk> and Mark Butler <butlerm@xmission.com> brought a
   solution for it.

 Daniel (26 May 1999)
 - Rearranged. README is new, the old one is now README.curl and I added a
   README.libcurl with text I got from Ralph Beckmann <rabe@uni-paderborn.de>.

 - I also updated the INSTALL text.

 Daniel (25 May 1999)
 - David Jonathan Lowsky <dlowsky@leland.stanford.edu> correctly pointed out
   that curl didn't properly deal with form posting where the variable
   shouldn't have any content, as in curl -F "form=" www.site.com. It is
   now fixed.

Version 5.9

 Daniel (22 May 1999)
 - I've got a bug report from Aaron Scarisbrick <aaronsca@hotmail.com> in
   which he states he has some problems with -L under FreeBSD 3.0. I have
   previously got another bug report from Stefan Grether
   <stefan.grether@ubs.com> which points at an error with similar symptoms
   when using win32. I made the allocation of the new url string a bit faster
   and different, don't know if it actually improves anything though...

 Daniel (20 May 1999)
 - Made the cookie parser deal with CRLF newlines too.

 Daniel (19 May 1999)
 - Download() didn't properly deal with failing return codes from the
   sread() function. Adam Coyne <adam@gamespy.com> found the problem in the
   win32 version, and Troy Engel helped me out isolating it.

 Daniel (16 May 1999)
 - Richard Adams <Richard@Slayford.com> pointed out a bug I introduced in
   5.8. --dump-header doesn't work anymore! :-/ I fixed it now.

 - After a suggestion by Joshua Swink <jpswink@hotmail.com> I added -S /
   --show-error to force curl to display the error message in case of an
   error, even if -s/--silent was used.

 Daniel (10 May 1999)
 - I moved the stuff concerning HTTP, DICT and TELNET into their own source
   files now. It is a beginning on my clean-up of the sources to make them
   layer all those protocols better and enable more to be added more easily
   in the future!

 - Leon Breedt <ljb@debian.org> sent me some files I've not put into the main
   curl archive. They're for creating the Debian package thingie. He also sent
   me a debian package that I've made available for download at the web page.

 Daniel (9 May 1999)
 - Made it compile on cygwin too.

 Troy Engel (7 May 1999)
 - Brought a series of patches to allow curl to compile smoothly on MSVC++ 6
   again!

 Daniel (6 May 1999)
 - I changed the #ifdef HAVE_STRFTIME placement for the -z code so that it
   will be easier to discover systems that don't have that function and thus
   can't use -z successfully. Made the strftime() get used if WIN32 is defined
   too.

Version 5.8

 Daniel (5 May 1999)
 - I've had it with this autoconf/automake mess. It seems to work all right
   for most people who don't have automake installed, but for those who have
   there are problems all over.

   I've got like five different bug reports on this only the last
   week... Claudio Neves <claudio@nextis.com> and Federico Bianchi
   <bianchi@pc-arte2.arte.unipi.it> and root <duggerj001@hawaii.rr.com> are
   some of them reporting this.

   Currently, I have no really good fix since I want to use automake myself to
   generate the Makefile.in files. I've found out that the @SHELL@-problems
   can often be fixed by manually invoking 'automake' in the archive root
   before you run ./configure... I've hacked my maketgz script now to fiddle
   a bit with this and my tests seem to work better than before at least!

 Daniel (4 May 1999)
 - mkhelp.pl has been doing badly lately. I corrected a case problem in
   the regexes.

 - I've now remade the -o option to not touch the file unless it needs to.
   I had to do this to make the -z option really fine, since now you can make
   a curl fetch and use a local copy's time when downloading to that file, as
   in:

        curl -z dump -o dump remote.site.com/file.html

   This will only get the file if the remote one is newer than the local.
   I'm aware that this alters previous behaviour a little. Some scripts out
   there may depend on the file always being touched...

 - Corrected a bug in the SSLv2/v3 selection.

 - Felix von Leitner <leitner@math.fu-berlin.de> requested that curl should
   be able to send "If-Modified-Since" headers, which indeed is a fair idea.
   I implemented it right away! Try -z <expression> where expression is a full
   GNU date expression or a file name to get the date from!
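   For example, something like this should only fetch the page if it has
   changed since the given date (the URL is a made-up example):

        curl -z "January 1 1999" http://www.site.com/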

 Stephan Lagerholm <stephan@unilog.se> (30 Apr 1999)
 - Pointed out a problem with the src/Makefile for FreeBSD. The RM variable
   isn't set and causes the make to fail.

 Daniel (26 April 1999)
 - Am I silly or what? <Irving_Wolfe@wolfe.net> pointed out to me that the
   curl version number was not set properly. Hasn't been since 5.6. This was
   due to a bug in my maketgz script!

 David Eriksson <david@2good.com> (25 Apr 1999)
 - Found a bug in cookies.c that made it crash at times.

Version 5.7.1

 Doug Kaufman <dkaufman@rahul.net> (23 Apr 1999)
 - Brought two sunos 4 fixes. One of them being the hostip.c fix mentioned
   below and the other one a correction in include/stdcheaders.h

 - Added a paragraph about compiling with the US-version of openssl to the
   INSTALL file.

 Daniel
 - New mailing list address. Info updated on the web page as well as in the
   README file

 Greg Onufer <Greg.Onufer@Eng.Sun.COM> (20 Apr 1999)
 - hostip.c didn't compile properly on SunOS 5.5.1.
   It needs an #include <sys/types.h>

Version 5.7

 Daniel (Apr 20 1999)
 - Decided to upload a non-beta version right now!

 - Made curl support any-length HTTP headers. The destination buffer is now
   simply enlarged every time it turns out to be too small!

 - Added the FAQ file to the archive. Still a bit smallish, but it is a
   start.

 Eric Thelin <eric@generation-i.com> (15 Apr 1999)
 - Made -D accept '-' instead of filename to write to stdout.

Version 5.6.3beta

 Daniel (Apr 12 1999)

 - Changed two #ifdef WIN32 to better #ifdef <errorcode> when connect()ing
   in url.c and ftp.c. Makes cygwin32 deal with them better too. We should
   try to get some decent win32-replacement there. Anyone?

 - The old -3/--crlf option is now ONLY --crlf!

 - I changed the "SSL fix" to a more lame one, but that doesn't remove as
   much functionality. Now I've enabled the lib to select what SSL version it
   should try first. Appearantly some older SSL-servers don't like when you
   talk v3 with them so you need to be able to force curl to talk v2 from the
   start. The fix dated April 6 and posted on the mailing list forced curl to
   use v2 at all times using a modern OpenSSL version, but we don't really
   want such a crippled solution.
 
 - Marc Boucher <marc@mbsi.ca> sent me a patch that corrected a math error
   for the "Curr.Speed" progress meter.

 - Eric Thelin <eric@generation-i.com> sent me a patch that enables '-K -'
   to read a config file from stdin.

 - I found out we didn't close the file properly before so I added it!

 Daniel (Apr 9 1999)
 - Yu Xin <is@isee.za.net> pointed out a problem with ftp download resume.
   It didn't work at all! ;-O

 Daniel (Apr 6 1999)
 - Corrected the version string part generated for the SSL version.

 - I found a way to make some other SSL page work with openssl 0.9.1+ that
   previously didn't (ssleay 0.8.0 works with it though!). Trying to get
   some real info from the OpenSSL guys to see what I should do to behave
   the best way. SSLeay 0.8.0 shouldn't be that much in use anyway these
   days!

Version 5.6.2beta

 Daniel (Apr 4 1999)
 - Finally have curl more cookie "aware". Now read carefully. This is how
   it works.
   To make curl read cookies from an already existing file, in plain header-
   format (like from the headers of a previous fetch) invoke curl with the
   -b flag like:

        curl -b file http://site/foo.html

   Curl will then use all cookies it finds matching. The old style that sets
   a single cookie with -b is still supported and is used if the string
   following -b includes a '=' letter, as in "-b name=daniel".

   To make curl read the cookies sent in combination with a location: (which
   sites often do) point curl to read a non-existing file at first (i.e. to
   start with no existing cookies), like:

        curl -b nowhere http://site/setcookieandrelocate.html

 - Added a paragraph in the TODO file about the SSL problems recently
   reported. Evidently, there is some kind of SSL problem curl may need to
   address.

 - Better "Location:" following.

 Douglas E. Wegscheid <wegscd@whirlpool.com> (Tue, 30 Mar 1999)
 - A subsecond display patch.

 Daniel (Mar 14 1999)
 - I've separated the version number of libcurl and curl now. To make
   things a little easier, I decided to start the curl numbering from
   5.6 and the former version number known as "curl" is now the one
   set for libcurl.

 - Removed the 'enable-no-pass' from configure, I doubt anyone wanted
   that.

 - Made lots of tiny adjustments to compile smoothly with cygwin under
   win32. It's a killer for porting this to win32, bye bye VC++! ;-)
   Compiles and builds out-of-the-box now. See the new wordings in
   INSTALL for details.

 - Beginning experiments with downloading multiple documents from a http
   server while remaining connected.

Version 5.6beta

 Daniel (Mar 13 1999)
 - Since I've changed so much, I thought I'd just go ahead and implement
   the suggestion from Douglas E. Wegscheid <wegscd@whirlpool.com>. -D or
   --dump-header is now storing HTTP headers separately in the specified
   file.

 - Added new text to INSTALL on what to do to build this on win32 now.

 - Aaargh. I had to take a step back and prefix the shared #include files
   in the sources with "../include/" to please VC++...

 Daniel (Mar 12 1999)
 - Split the url.c source into many tiny sources for better readability
   and smaller size.

 Daniel (Mar 11 1999)
 - Started to change stuff for a move to make libcurl and a more separate
   curl application that uses the libcurl. Made the libcurl sources into
   the new lib directory while the curl application will remain in src as
   before. New makefiles, adjusted configure script and so.

   libcurl.a built quickly and easily. I better make a better interface to
   the lib functions though.

   The new root dir include/ is supposed to contain the public information
   about the new libcurl. It is a little ugly so far :-)


 Daniel (Mar 1 1999)
 - Todd Kaufmann <tkaufmann@adforce.com> sent me a good link to Netscape's
   cookie spec as well as the info that RFC 2109 specifies how to use them.
   The link is now in the README and the RFC in the RESOURCES.

 Daniel (Feb 23 1999)
 - Finally made configure accept --with-ssl to look for SSL libs and includes
   in the "standard" place /usr/local/ssl...

 Daniel (Feb 22 1999)
 - Verified that curl linked fine with OpenSSL 0.9.1c which seems to be
   the most recent.

 Henri Gomez <gomez@slib.fr> (Fri Feb  5 1999)
 - Sent in an updated curl-ssl.spec. I still miss the script that builds an
   RPM automatically...

Version 5.5.1

 Mark Butler <butlerm@xmission.com> (27 Jan 1999)
 - Corrected problems in Download().

 Daniel Stenberg (25 Jan 1999)
 - Jeremie Petit <Jeremie.Petit@Digital.com> pointed out a few flaws in the
   source that prevented it from compiling warning-free with the native
   compiler under Digital Unix v4.0d.

Version 5.5

 Daniel Stenberg (15 Jan 1999)
 - Added Bjorn's small text to the README about the DICT protocol.

 Daniel Stenberg (11 Jan 1999)
 - <jswink@softcom.net> reported about the win32-version: "Doesn't use
   ALL_PROXY environment variable". Turned out to be because of the static-
   buffer nature of the win32 environment variable calls!

 Bjorn Reese <breese@imada.ou.dk> (10 Jan 1999)
 - I have attached a simple addition for the DICT protocol (RFC 2229).
   It performs dictionary lookups. The output still needs to be better
   formatted.

   To test it try (the exact format, and more examples are described in
   the RFC)

        dict://dict.org/m:hello
        dict://dict.org/m:hello::soundex


 Vicente Garcia <verot@redestb.es> (10 Jan 1999)
 - Corrected the progress meter for files larger than 20MB.

 Daniel Stenberg (7 Jan 1999)
 - Corrected the -t and -T help texts. They claimed to be FTP only.

Version 5.4

 Daniel Stenberg
 (7 Jan 1999)
 - <Irving_Wolfe@Wolfe.Net> reported that curl -s didn't always suppress the
   progress reporting. It was the form post that automatically always switched
   it on again. This is now corrected!

 (4 Jan 1999)
 - Andreas Kostyrka <andreas@mtg.co.at> suggested I'd add PUT and he helped me
   out to test it. If you use -t or -T now on a http or https server, PUT will
   be used for file upload.

   I removed the former use of -T with HTTP. I doubt anyone ever really used
   that.

 (4 Jan 1999)
 - Erik Jacobsen <erik@mint.com> found a width bug in the mprintf() function.
   I corrected it now.

 (4 Jan 1999)
 - As John V. Chow <johnchow@brooklinetech.com> pointed out to me, curl
   accepted very limited URL sizes. It should now accept path parts that are
   up to at least 4096 bytes.

 - Somehow I screwed up when applying the AIX fix from Gilbert Ramirez, so
   I redid that now.

Version 5.3a (win32 only)

 Troy Engel
 - Corrected a win32 bug in the environment variable part.

Version 5.3

 Gilbert Ramirez Jr. (21 Dec 1998)
 - I have implemented the "quote" function of FTP clients. It allows you to
   send arbitrary commands to the remote FTP server. I chose the -Q/--quote
   command-line arguments.

   You can have more than one quoted string, and curl will apply them in
   order.  This is what I use for my MVS upload:

  curl -B --crlf -Q "site lrecl=80" -Q "site blk=8000" -T file ftp://os390/test

   Curl will send the two quoted "site" commands in the proper order.

 - Made it compile smoothly on AIX.

 Gilbert Ramirez Jr. <gram@verdict.uthscsa.edu> (18 Dec 1998)
 - Brought an MVS patch: -3/--mvs, for ftp upload to the MVS ftp server.

 Troy Engel <tengel@sonic.net> (17 Dec 1998)
 - Brought a correction that fixes the win32 curl bug.

 Daniel Stenberg
 - A bug, pointed out to me by Dr H. T. Leung <htl10@cus.cam.ac.uk>, caused
   curl to crash on the -A flag on certain systems. Actually, all systems
   should've!

 - Added a few defines to make directories/file names get built nicer (with _
   instead of . and \ instead of / in win32).

 - steve <fisk@polar.bowdoin.edu> reported a weird bug that occurred if the
   ftp server response line had a parenthesis on the line before the (size)
   info. I hope it works better now!

Version 5.2.1

 Steven G. Johnson <stevenj@alum.mit.edu> (Dec 14, 1998)
 - Brought a fix that corrected a crash in 5.2 due to bad treatment of the
   environment variables.

Version 5.2

 Daniel Stenberg (Dec 14, 1998)
 - Rewrote the mkhelp script and now, the mkhelp.pl script generates the
   hugehelp.c file from the README *and* the man page file curl.1. By using
   both files, I no longer need to have double information in both the man
   page and the README as well. So, win32-users will only have the hugehelp.c
   file for all info, but then, they download the plain binary most times
   anyway.

 - gcc2.8.1 with the -Wall flag complains a lot about "subscript has type
   `char'" if I don't explicitly typecast the argument to isdigit() or
   isspace() to int. So I did, to compile warning free with that too.

 - Added checks for 'long double' and 'long long' in the configure script. I
   need those for the mprintf.c source to compile well on non-long-long
   conforming systems!

Version 5.1 (not publicly released)

 Daniel Stenberg (Dec 10, 1998)
 - I got a request for a pre-compiled NT Alpha version. Anyone?

 - Added Lynx/CERN www lib proxy environment variable support. That means curl
   now reads and understands the following environment variables:

	HTTP_PROXY, HTTPS_PROXY, FTP_PROXY, GOPHER_PROXY

   They should be set for protocol-specific proxies. General proxy should be
   set with
	
	ALL_PROXY

   And a comma-separated list of host names that shouldn't go through any
   proxy is set in (a single asterisk, '*', matches all hosts).

	NO_PROXY

   The usage of the -x/--proxy flag overrides the environment variables.
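   A quick sketch of how this could be used from a Bourne-type shell (the
   proxy host name is a made-up example):

        HTTP_PROXY=proxy.site.com:8080 ; export HTTP_PROXY
        curl http://www.site.com/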

 - Proxy can now be specified with a protocol:// prefix.

 - Wrote the curl.1 man page.

 - Introduced a whole new dynamic buffer system for all sprintf()s. It is
   based on the *printf() package by yours truly and Bjorn Reese. Hopefully,
   there aren't that many buffer overflow risks left now.

 - Ah, I should mention I've compiled and built curl successfully under
   solaris 2.6 with gcc now, gcc 2.7.2 won't work but 2.8.1 did ok.

 Oren Tirosh <oren@hishome.net> (Dec 3, 1998)
 - Brought two .spec files, to use when creating (Linux) Redhat style RPM
   packages. They're named curl.spec and curl-ssl.spec.

 Troy Engel <tengel@sonic.net>
 - Supplied the src/Makefile.vc6 for easy compiling with VC++ under Win32.

Version 5.0

 Daniel Stenberg (Dec 1, 1998)
 - Not a single bug report in ages.
 - Corrected getpass.c and main.c to compile warning and error free with the
   Win32 VC++ crap.

Version 5.0 beta 24

 Daniel Stenberg (Nov 20, 1998)

 HOW TO BUILD A RELEASE ARCHIVE:

 * Pre-requisite software:
   What              To build what             Reads data from
   ====              =============             ===============
   GNU automake      Makefile.in, aclocal.m4   configure.in
     GNU make(1)      - " -
     GNU gcc(1)       - " -
   GNU autoconf      configure                 configure.in
   GNU autoheader(2) config.h.in               configure.in, acconfig.h

 * Make sure all files that should be part of the archive are put in FILES.

 * Run './maketgz' and enter the version number of the archive to be created.

   maketgz does:

   - Enters the newly created version number in url.h.
   - (If you don't have automake, this script will warn about that, but unless
     you have changed the Makefile.am files, that is nothing to care about.)
     If you have it, it'll run it.
   - If you have autoconf, the configure.in will be edited to get the newly
     created version number and autoconf will be run.
   - Creates a new directory named curl-<version>. (Actually, it uses the base
     name of the current directory up to the first '-'.)
   - Copies all files mentioned in FILES to the new directory, preserving
     permissions and directory structure.
   - Uses tar to create an archive of it all, named curl-<version>.tar
   - gzips the archive into curl-<version>.tar.gz
   - Removes the new directory and all its contents.

 * When done, you have an archive stored in your directory named
   curl-<version>.tar.gz.

   Done!

   (1) They're required to make automake run properly.
   (2) It is distributed as a part of the GNU autoconf archive.

 Daniel Stenberg (Nov 18, 1998)
 - I changed the TAG-system. If you ever used urlget() from this package in
   another product, you need to recompile with the new headers. I did this
   new stuff to better deal with different compilers and systems with different
   variable sizes. I think it makes it a little more portable. This proves
   to compile warning free with the problematic IRIX compiler!
 - Win32 compiled with a silly error. Corrected now.
 - Brian Chaplin <bchaplin@capital-mkts.com> reported yet another problem in
   multiline FTP responses. I've tried to correct it. I mailed him a new
   version and I hope he gets back soon with positive feedback!
 - Improved the 'maketgz' to create a temporary directory tree which it makes
   an archive from instead of the previous renaming of the current one.
 - Mailing list opened (see README).
 - Made -v more verbose on the PASV section of ftp transfers. Now it tells
   host name and IP of the new host (and port number). I also added a section
   about PORT vs PASV in the README.

Version 5.0 beta 21

 Angus Mackay (Nov 15, 1998)
 - Introduced automake stuff.

 Daniel Stenberg (Nov 13, 1998)
 - Just made a successful GET of a document from an SSL-server using my own
   private certificate for authentication! The certificate has to be in PEM
   format. You do that the easiest way (although not *that* easy) by
   downloading the SSLeay PKCS#12-patch by Dr Stephen N. Henson from his site
   at: http://www.drh-consultancy.demon.co.uk/. Using his tool, you can
   convert any modern Netscape or (even) MSIE certificate to PEM-format.  Use
   it with 'curl -E <certificate:password> https://site.com'.  If this isn't a
   cool feature, then I don't know what cool features look like! ;-)
 - Working slowly on telnet connections. #define TRY_TELNET to try it out.
   (curl -u user:passwd "telnet://host.com/cat .login" is one example) I do
   have problems defining how it should work. The prime purpose for this must
   be to get (8bit clean) files via telnet, and it really isn't that easy to
   get files this way. Still having problems with \n being converted to \r\n.

 Angus Mackay (Nov 12, 1998)
 - Corrected another bug in the long parameter name parser.
 - Modified getpass.c (NOTE: see the special licensing in the top of that
   source file).

 Daniel Stenberg (Nov 12, 1998)
 - We may have removed the silly warnings from url.c when compiled under IRIX.
   Thanks again to Bjorn Reese <breese@imada.ou.dk> and Martin Staael
   <martin@netgroup.dk>.
 - Wrote formfind.pl which is a new perl script intended to help you find out
   how a FORM submission should be done. This needs a little more work to get
   really good.

 Daniel Stenberg (Nov 11, 1998)
 - Made the HTTP header-checker accept white spaces before the HTTP/1.? line.
   Apparently some proxies/sites add such at times (my test proxy did when I
   downloaded a gopher page with it)!
 - Moved the former -h to -M and made -h show the short help text instead. I
   had to enable a forced help text option. Now an even shorter help text will
   be presented when an unknown option or similar is used.
 - stdcheaders.h didn't work with IRIX 6.4 native cc compiler. I hope my
   changes don't make other versions go nuts instead.

 Daniel Stenberg (Nov 10, 1998)
 - Added a weird check in the configure script to check for the silly AIX
   warnings about my #define strcasecmp() stuff. I do that define to prevent
   me and other contributors from accidentally using that function name instead
   of strequal()...
 - I bugfixed Angus's getpass.c very little.
 - Fixed the verbose flag names to getopt-style, i.e. 'curl --loc' will be
   sufficient instead of --location as "loc" is a unique prefix. Also, anything
   after a '--' is treated as a URL. So if you do have a host with a weeeird
   name you can do 'curl -- -host.com'.
 - Another getopt-adjust; curl now accepts flags after the URL on the command
   line. 'curl www.foo.com -O' is perfectly valid.
 - Corrected the .curlrc parser so that strtok() is no longer used and I
   believe it works better. Even URLs can be specified in it now.

 Angus Mackay (Nov 9, 1998)
 - Replaced getpass.c with a newly written one, not under GPL license
 - Changed OS to a #define in config.h instead of compiler flag
 - Makefile now uses -DHAVE_CONFIG_H

 Daniel Stenberg (Nov 9, 1998)
 - Ok, I expanded the tgz-target to update the version string on each occasion
   I build a release archive!
 - I reacted on Angus Mackay's initiative and remade the parameter parser to
   be more getopt compliant. Curl now supports "merged" flags as in 
	curl -lsv ftp.site.com
   Do note that I had to move three short-names of the options. Parameters
   that need an additional string, such as -x, must be stand-alone or the
   last in a merged sequence:
	curl -lsx my-proxy ftp.site.com
   is ok, but using the flags in a different order like '-lxs' would cause
   unexpected results (as the 's' option would be skipped).
 - I've changed the headers in all files that are subject to the MozPL
   license, to look the way they are supposed to when conforming.
 - Made the configure script make the config.h. The former config.h is now
   setup.h.
 - The RESOURCES and TODO files have been added to the archive.

 Angus Mackay <amackay@gus.ml.org> (Nov 5, 1998)
 - Fixed getpass.c and various configure stuff

 Daniel Stenberg (Nov 3, 1998)
 - Use -H/--header for custom HTTP-headers. Lets you pass on your own
   specified headers to the remote server. I wouldn't recommend trying to use
   a header with a defined usage according to standards. Use this flag once
   for every custom header you want to add.
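   A made-up example (the header names and host are invented):
	curl -H "X-Silly-Header: yes" -H "X-Another: indeed" http://www.site.com/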
 - Use -B/--ftp-ascii to force ftp to use ASCII mode when transferring files.
 - Corrected the 'getlinks.pl' script, I accidentally left my silly proxy
   usage in there! Since the introduction of the .curlrc file, it is easier to
   write scripts that use curl since proxies and stuff should be in the
   .curlrc file anyway.
 - Introducing the new -F flag for HTTP POST. It supports multipart/form-data
   which means it is gonna be possible to upload files etc through HTTP POST.
   Shiraz Kanga <skanga@bigfoot.com> asked for the feature and my brother,
   Björn Stenberg <Bjorn.Stenberg@sth.frontec.se> helped me design the user
   interface for this beast.  This feature requires quite some docs,
   since it has turned out not only quite capable, but also complicated! :-)
 - A note here, since I've received mail about it. SSLeay versions prior to
   0.8 will *not* work with curl!
 - Wil Langford <wil@langford.net> reported a bug that occurred since curl
   did not properly use CRLF when issuing ftp commands. I fixed it.
 - Rearranged the order config files are read. .curlrc is now *always* read
   first and before the command line flags. -K config files then act as
   additional config items.
 - Use -q AS THE FIRST OPTION specified to prevent .curlrc from being read.
 - You can now disable a proxy by using -x "". Useful if the .curlrc file
   specifies a proxy and you wanna fetch something without going through
   that.
 - I'm thinking of dropping the -p support. It's really not useful since ports
   could (and should?) be specified as :<port> appended on the host name
   instead, both in URLs and to proxy host names.
 - Martin Staael <martin@netgroup.dk> reports curl -L bugs under Windows NT
   (test with URL http://come.to/scsde). This bug is not present in this
   version anymore.
 - Added support for the weird FTP URL type= thing. You can download a file
   using ASCII transfer by appending ";type=A" to the right of it. Other
   available types are type=D for dir-list (NLST) and type=I for binary
   transfer. I can't say I've ever seen anyone use this kind of URL though!
   :-)
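   For illustration (ftp.site.com is just an example host; quote the URL so
   the shell doesn't eat the ';'):
	curl "ftp://ftp.site.com/file.txt;type=A"     (ASCII transfer)
	curl "ftp://ftp.site.com/pub/;type=D"         (dir-list, NLST)
	curl "ftp://ftp.site.com/file.bin;type=I"     (binary transfer)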
 - Troy Engel <tengel@palladium.net> pointed out a bug in my getenv("HOME")
   usage for win32 systems. I introduce getenv.c to better cope with
   this. Mr Engel helps me with the details around that...
 - A little note to myself and others, I should make the win32-binary built
   with SSL support...
 - r-y-a-n/n-e-l-s-o-n <ryan@inch.com> sent me comments about building curl
   with SSL under FreeBSD. See the Makefile for details. Using the configure
   script, it should work better and automatically now...
 - Cleaned up the port number mess in the source. No longer stores and uses
   the proxy port number separately from the normal port number.
 - 'configure' script working. Confirmed compiles on:
    Host         SSL  Compiler
    SunOS 5.5    no   gcc
    SunOS 5.5.1  yes  gcc
    SunOS 5.6    no   cc  (with gcc, it has the "gcc include files" problem)
    SunOS 4.1.3  no   gcc (without ANSI C headers)
    SunOS 4.1.2  no   gcc (native compiler failed)
    Linux 2.0.18 no   gcc
    Linux 2.0.32 yes  gcc
    Linux 2.0.35 no   gcc (with glibc)
    IRIX 6.2     no   gcc (cc compiles generate a few warnings)
    IRIX 6.4     no   cc  (generated warnings though)
    Win32        no   Borland
    OSF4.0       no   ?

 - Ooops. The 5beta (and 4.10) under win32 failed if the HOME variable wasn't
   set.
 - When using a proxy, curl now guesses and uses the protocol part in cases
   like:
	curl -x proxy:80 www.site.com
   Proxies normally go nuts unless http:// is prepended to the host name, so
   if curl is used like this, it guesses protocol and appends the protocol
   string before passing it to the proxy. It already did this when used
   without proxy.
 - Better port usage with SSL through proxy now. If you specified a different
   https-port when accessing through a proxy, it didn't use that number
   correctly. I also rewrote the code that parses the stuff read from the
   proxy when you wanna connect through it with SSL.
 - Bjorn Reese <breese@imada.ou.dk> helped me work around one of the compiler
   warnings on IRIX native cc compiles.

Version 4.10 (Oct 26, 1998)
 Daniel Stenberg
 - John A. Bristor <jbristor@bellsouth.net> suggested a config file switch,
   and since I've been having that idea kind of in the background for a long
   time I rewrote the parameter parsing function a little and now I introduce
   the -K/--config flag. I also made curl *always* (unless -K is used) try to
   load the .curlrc file for command line parameters. The syntax for the
   config file is the standard command line argument style. Details in 'curl
   -h' or the README.
 - I removed the -k option. Keep-alive isn't really anything anyone would
   want to enable with curl anyway.
 - Martin Staael <Martin@Staael.dk> helped me add the 'irix' target. Now
   "make irix" should build curl successfully on non-gcc SGI machines.
 - Single switches now toggle behaviours. I.e. if you use -v -v, the second
   will switch off the verbose mode the first one enabled. This is so that
   you can disable a default setting that a .curlrc file enables, etc.

Version 4.9 (Oct 7, 1998)
 Daniel Stenberg
 - Martin Staael <Martin@Staael.dk> suggested curl would support cookies.
   I added -b/--cookie to enable free-text cookie data to be passed. There's
   also a little blurb about general cookie stuff in the README/help text.
 - dmh <dmh@jet.es> suggested HTTP resume capabilities. Although you could
   manually get curl to resume HTTP documents, I made the -c resume flag work
   for HTTP too (unless -r is used too, which would be very odd anyway).
 - Added checklinks.pl to the archive. It is a still experimental perl script
   that checks all links of a web page by using curl.
 - Rearranged the archive hierarchy a little. Build the executable in the
   src/ dir from now on!
 - Version 4.9 and hereafter, is no longer released under the GPL license.
   I have now updated the LEGAL file etc and now this is released using the
   Mozilla Public License to avoid the plague known as "the GPL virus". You
   must make the source available if you decide to change and/or redistribute
   curl, but if you decide to use curl within something else you do not need
   to offer the world the source to that too.
 - Curl did not like HTTP servers that sent no headers at all on a GET
   request.  It is a violation of RFC2068 but apparently some servers do
   that anyway.  Thanks to Gordon Beaton <gordon@erix.ericsson.se> for the
   report!
 - -L/--location was added after a suggestion from Martin Staael
   <Martin@Staael.dk>. This makes curl ATTEMPT to follow the Location:
   redirect if one is present in the HTTP headers. If -i or -I is used with
   this flag, you will see headers from all sites the Location: points to. Do
   note that the first server can point to a second that points to a third
   etc. It seems the Location: parameter (said to be an AbsoluteURI in
   RFC2068) isn't always absolute.. :-/ Anyway, I've made curl ATTEMPT to do
   the best it can to deal with the reality.
 - Added getlinks.pl to the archive. getlinks.pl selectively downloads
   files that a web page links to.

Version 4.8.4
 Daniel Stenberg
 - As Julian Romero Nieto <jromero@anaya.es> reported, curl reported the wrong
   version number.
 - As Teemu Yli-Elsila <tylielsi@mail.student.oulu.fi> pointed out,
   the win32 version of 4.8 (and probably all other versions for win32)
   didn't work with binary files since I'm too used to the UNIX style
   fopen() where binary and text don't differ...
 - Ralph Beckmann <rabe@uni-paderborn.de> brought me some changes that lets
   curl compile error and warning free with -Wall -pedantic with
   g++. I also took the opportunity to clean off some unused variables
   and similar.
 - Ralph Beckmann <rabe@uni-paderborn.de> made me aware of a really odd bug,
   now corrected. When curl read a set of headers from an HTTP server split
   over more than one read, and the first read ended with a full line *exactly*
   (i.e. ending with a newline), curl did not behave well.

Version 4.8.3
 Daniel Stenberg
 - I was too quick to release 4.8.2 with too little testing. One of the
   changes is now reverted slightly to the 4.8.1 way since 4.8.2 couldn't
   upload files. I still think both problems corrected in 4.8.2 remain
   corrected.  Reported by Julian Romero Nieto <jromero@anaya.es>.

Version 4.8.2
 Daniel Stenberg
 - Bernhard Iselborn <biselbor@rhrk.uni-kl.de> reported two FTP protocol
   errors curl made. They're now corrected. Both appeared when getting files
   from a MS FTP server! :-)

Version 4.8.1
 Daniel Stenberg
 - Added a last update of the progress meter when the transfer is done. The
   final output on the screen wasn't necessarily the final size transferred,
   which sometimes made it look odd.
 - Thanks to David Long <long@research.bell-labs.com> I got rid of a silly
   bug that happened if an HTTP page had nothing but a header. Apparently
   Solaris deals with negative sizes in fwrite() calls a lot better than
   Linux does... =B-]

Version 4.8
 Daniel Stenberg
 - Continue FTP file transfer. -c is the switch. Note that you need to
   specify a file name if you wanna resume a download (you can't resume a
   download sent to stdout). Resuming upload may be limited by the server
   since curl is then using the non-RFC959 command SIZE to get the size of
   the target file before upload begins (to figure out which offset to
   use). Use -C to specify the offset yourself! -C is handy if you're doing
   the output to something other than a plain file or when you just want to
   get the end of a file.
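   A couple of made-up examples, assuming -C takes the offset as its argument
   as described above and ftp.site.com is an invented host:
	curl -c -o bigfile ftp://ftp.site.com/bigfile     (resume into 'bigfile')
	curl -C 10000 ftp://ftp.site.com/bigfile          (print from offset 10000)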
 - recursiveftpget.pl now features a maximum recursive level argument.

Version 4.7
 Daniel Stenberg
 - Added support to abort a download if the speed is below a certain amount
   (speed-limit) bytes per second for a certain (speed-time) time.
 - Wrote a perl script 'recursiveftpget.pl' to recursively use curl to get a
   whole ftp directory tree. It is meant as an example of how curl can be
   used.  I agree it isn't the wisest thing to do to make a separate new
   connection for each file and directory for this.

Version 4.6
 Daniel Stenberg
 - Added a first attempt to optionally parse the .netrc file for login user
   and password. If used with http, it enables user authentication. -n is
   the new switch.
 - Removed the extra newlines on the default user-agent string.
 - Corrected the missing ftp upload error messages when it failed without the
   verbose flag set. Gary W. Swearingen found it.
 - Now using alarm() to enable second-precision timeout even on the name
   resolving/connecting phase. The timeout is, however, reset after that first
   sequence. (This should be corrected.) Gary W. Swearingen <swear@aa.net>
   reported.
 - Now spells "Unknown" properly, as in "Unknown option 'z'"... :-)
 - Added bug report email address in the README.
 - Added a "current speed" field to the progress meter. It shows the average
   speed the last 5 seconds. The other speed field shows the average speed of
   the entire transfer so far.

Version 4.5.1
 Linas Vepstas
 - SSL through proxy fix
 - Added -A to allow User-Agent: changes

 Daniel Stenberg 
 - Made the -A work when SSL-through-proxy.

Version 4.5
 Linas Vepstas <linas@linas.org>
 - More SSL corrections
 - I've added a port to AIX.
 - Running SSL through a proxy causes a chunk of code to be executed twice.
   One of those blocks needs to be deleted.

 Daniel Stenberg
 - Made -i and -I work again

Version 4.4
 Linas Vepstas <linas@us.ibm.com>
 - -x can now also specify proxyport when used as in 'proxyhost:proxyport'
 - SSL fixes

Version 4.3
 Daniel Stenberg
 - Adjusted to compile under win32 (VisualC++ 5). The -P switch does not
   support network interface names in win32. I couldn't figure out how!

Version 4.2
 Linas Vepstas / Sampo Kellomaki
 - Added SSL / SSLeay support (https://)
 - Added the -T usage for HTTP POST.

 Daniel Stenberg
 - Bugfixed the SSL implementation.
 - Made -P a lot better at using other IP addresses. It now accepts a following
   parameter that can be either
        interface - e.g. "eth0" to specify which interface's IP address you
                    want to use
        IP address - e.g. "192.168.10.1" to specify an exact IP number
        host name - e.g. "my.host.domain" to specify a machine
        "-"       - (any single-letter string) to make it pick the machine's
                    default
 - The Makefile is now ready to compile for solaris, sunos4 and linux right
   out of the box.
 - Better generated version string seen with 'curl -V'

Version 4.1
 Daniel Stenberg
 - The IP number returned by the ftp server as a reply to PASV no longer has
   to DNS resolve. In fact, no IP-number-only addresses have to anymore.
 - Binds better to available port when -P is used.
 - Now LISTs ./ instead of / when used as in ftp://ftp.funet.fi/. The reason
   for this is that exactly that site, ftp.funet.fi, does not allow LIST /
   while LIST ./ is fine. Any objections?

Version 4 (1998-03-20)
 Daniel Stenberg
 - I took another huge step and changed both version number and project name!
   The reason for the new name is that there are just one too many programs
   named urlget already and this program can already do a lot more than merely
   getting URLs, and the reason for the version number is that I did add the
   pretty big change in -P and since I changed name I wanted to start with
   something fresh!
 - The --style flags are working better now.
 - Listing directories with FTP often reported that the file transfer was
   incomplete. Wrong size assumptions were too common for directories, so no
   size comparison will be attempted on them from now on.
 - Implemented the -P flag that lets the ftp control connection issue a PORT
   command instead of the standard PASV.
 - -a for appending FTP uploads works.

***************************************************************************

Version 3.12
 Daniel Stenberg
 - End-of-header tracking still lacked support for \r\n or just \n at the
   end of the last header line.
 Sergio Barresi <sbarresi@imispa.it>
 - Added PROXY authentication.
 Rafael Sagula
 - Fixed some little bugs.

Version 3.11
 Daniel Stenberg
 - The header parsing was still not correct since the 3.2 modification...

Version 3.10
 Daniel Stenberg
 - 3.7 and 3.9 were simultaneously developed and merged into this version.
 - FTP upload did not work correctly since 3.2.

Version 3.9
 Rafael Sagula
 - Added the "-e <url> / --referer <url>" option where we can specify
   the referer page. Obviously, this is necessary only to fool the
   server, but...
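   A made-up example (both URLs are invented; at this point the program was
   still called urlget):
	urlget -e http://www.site.com/index.html http://www.site.com/page.html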

Version 3.7
 Daniel Stenberg
 - Now checks the last error code sent from the ftp server after a file has
   been received or uploaded. Wasn't done previously.
 - When 'urlget <host>' is used without a 'protocol://' first in the host part,
   it now checks for host names starting with ftp or gopher and if so,
   it uses that protocol by default instead of http.

Version 3.6
 Daniel Stenberg
 - A silly mistake caused the POST bug. This has now also been tested to work
   with a proxy.

Version 3.5
 Daniel Stenberg
 - Highly inspired by Rafael Sagula's changes to the 3.1 that added an almost
   functional POST, I applied his changes into this version and made them work.
   (It seems POST requires the Content-Type and Content-Length headers.) It is
   now usable with the -d switch.

Version 3.3 - 3.4
 Skipped to avoid confusion

Version 3.2
 Daniel Stenberg
 - Major rewrite of two crucial parts of this code: upload and download.
   They are both now using a select() switch, that allows much better
   progress meter and time control. 
 - alarm() usage removed completely
 - FTP get can now list directory contents if the path ends with a slash '/'.
   Urlget on an ftp path that doesn't end with a slash means urlget will
   attempt to get it as a file name.
 - FTP directory view supports -l for "list-only" which lists the file names
   only.
 - All operations support -m for max time usage in seconds allowed.
 - FTP upload now allows the size of the uploaded file to be provided, and
   thus it can better check it actually uploaded the whole file. It also
   makes the progress meter for uploads much better!
 - Made the parameter parsing fail in cases like 'urlget -r 900' which
   previously tried to connect to the host named '900'.

Version 3.1
 Kjell Ericson
 - Pointed out how to correct the 3 warnings in win32-compiles.

 Daniel Stenberg
 - Removed all calls to exit().
 - Made the short help text get written to stdout instead of stderr.
 - Made this file instead of keeping these comments in the source.
 - Made two callback hooks, that enable external programs to use urlget()
   easier and to grab the output/offer the input easier.
 - It is evident that Win32-compiles are painful. I watched the output from
   the Borland C++ v5 and it was awful. Just ignore all those warnings.

Version 3.0
 Daniel Stenberg
 - Added FTP upload capabilities. The name urlget gets a bit silly now
   when we can put too... =)
 - Restructured the source quite a lot.
   Changed the urlget() interface. This way, we will survive changes much
   better. New features can come and old can be removed without us needing
   to change the interface. I've written a small explanation in urlget.h
   that explains it.
 - New flags include -t, -T, -O and -h. The -h text is generated by the new
   mkhelp script.

Version 2.9
 Remco van Hooff
 - Added a fix to make it compile smoothly on Amiga using the SAS/C
   compiler.
  
 Daniel Stenberg
 - Believe it or not, but the STUPID Novell web server seems to require
   that the Host: keyword is used, so, well, I use it and (re-introduce) the
   urlget User-Agent:. I still have to check that this Host: usage works with
   proxies... 'Host:' is required for HTTP/1.1 GET according to RFC2068.

Version 2.8
 Rafael Sagula
 - some little modifications

Version 2.7
 Daniel Stenberg
 - Removed the -l option and introduced the -f option instead. Now I'll
   rewrite the former -l kludge in an external script that'll use urlget to
   fetch multipart files like that.
 - '-f' is introduced, it means Fail without output in case of HTTP server
   errors (return code >=300).
 - Added support for -r, ranges. Specify which part of a document you 
   want, and only that part is returned. Only with HTTP/1.1-servers.
 - Split up the source in 3 parts. Now all pure URL functions are in
   urlget.c and stuff that deals with the stand-alone program is in main.c.
 - I took a few minutes and wrote an embryo of a README file to explain
   a few things.

Version 2.6
 Daniel Stenberg
 - Made the -l (loop) thing use the new CONF_FAILONERROR which makes
   urlget() return an error code if non-successful. It also won't output anything
   then. Now finally removed the HTTP 1.0 and error 404 dependencies.
 - Added -I which uses the HEAD request to get the header only from a
   http-server.

Version 2.5
 Rafael Sagula
 - Made the progress meter use HHH:MM:SS instead of only seconds.

Version 2.4
 Daniel Stenberg
 - Added progress meter. It appears when downloading > BUFFER SIZE and
   mute is not selected. I found out that when downloading large files from
   really really slow sites, it is desirable to know the status of the
   download. Do note that some downloads are done without knowing the size, which
   makes the progress meter less thrilling ;) If the output is sent to a tty,
   the progress meter is shut off.
 - Increased buffer size used for reading.
 - Added length checks in the user+passwd parsing.
 - Made it grok user+passwd for HTTP fetches. The trick is to base64
   encode the user+passwd and send an extra header line. Read chapter 11.1 in
   RFC2068 for details. I added it to be used just like the ftp one.  To get an
   http document from a place that requires user and password, use a URL
   like:

        http://user:passwd@www.site.to.leach/doc.html

   I also added the -u flag, since WHEN USING A PROXY YOU CAN'T SPECIFY THE
   USER AND PASSWORD WITH HTTP LIKE THAT. The -u flag works for ftp too, but
   not if used with proxy. To do the same as the above one, you can invoke:

        urlget -u user:passwd http://www.site.to.leach/doc.html

Version 2.3
 Rafael Sagula
 - Added "-o" option (output file)
 - Added URG_HTTP_NOT_FOUND return code.
   (Daniel's note:)
   Perhaps we should detect all kinds of errors and instead of writing that
   custom string for the particular 404-error, use the error text we actually
   get from the server. See further details in RFC2068 (HTTP 1.1
   definition). The current way also relies on an HTTP/1.0 reply, which newer
   servers might not do.
 - Looping mode ("-l" option). It's easier to get various split files.
   (Daniel's note:)
   Use it like 'urlget -l 1 http://from.this.site/file%d.html', which will
   make urlget attempt to fetch all files named file1.html, file2.html etc
   until no more files are found. This is only a modification of the
   STAND_ALONE part, nothing in the urlget() function was modified for this.
 Daniel Stenberg
 - Changed the -h to be -i instead. -h should be reserved for help use.
 - Bjorn Reese indicated that Borland _might_ use '_WIN32' instead of the
   VC++ WIN32 define and therefore I added a little fix for that.

Version 2.2
 Johan Andersson
 - The urlget function didn't set the path to the URL when using a proxy.
 - Fixed bug with IMC proxy. Now using (almost) complete GET command.
  
 Daniel Stenberg
 - Made it compile on Solaris. Had to reorganize the includes a bit.
   (so Win32, Linux, SunOS 4 and Solaris 2 compile fine.)
 - Made Johan's keepalive keyword optional with the -k flag (since it
   makes a lot of urlgets take a lot longer time).
 - Made a '-h' switch in case you want the HTTP-header in the output.

Version 2.1
 Daniel Stenberg and Kjell Ericson
 - Win32-compilable
 - No more global variables
 - Mute option (no output at all to stderr)
 - Full range of return codes from urlget(), which is now written to be a
   function for easy-to-use in [other] programs.
 - Define STAND_ALONE to compile the stand alone urlget program
 - Now compiles with gcc options -ansi -Wall -pedantic ;)

Version 2.0
 - Introducing ftp GET support. The FTP URL type is recognized and used.
 - Renamed the project to 'urlget'.
 - Supports the user+passwd in the FTP URL (otherwise it tries anonymous
   login with a weird email address as password).

Version 1.5
 Daniel Stenberg
 - The skip_header() crap messed it up big-time. By simply removing that
   one we can all of a sudden download anything ;)
 - No longer requires a trailing slash on the URLs.
 - If the given URL isn't prefixed with 'http://', HTTP is assumed and
   given a try!
 - 'void main()' is history.

Version 1.4
 Daniel Stenberg
 - The gopher source used the ppath variable instead of path which could
   lead to disaster.

Version 1.3
 Daniel Stenberg
 - Well, I added a lame text about the time it took to get the data. I also
   fought against Johan to prevent his -f option (to specify a file name
   that should be written instead of stdout)! =)
 - Made it write 'connection refused' for that particular connect()
   problem.
 - Renumbered the version. Let's not make silly 1.0.X versions, this is
   a plain 1.3 instead.

Version 1.2
 Johan Andersson
 - Discovered and fixed the problem with getting binary files. puts() is
   now replaced with fwrite(). (Daniel's note: this also fixed the buffer
   overwrite problem I found in the previous version.)

 Rafael Sagula <sagula@inf.ufrgs.br>
 - Let "-p" before "-x".

 Daniel Stenberg <Daniel.Stenberg@sth.frontec.se>
 - Bugfixed the proxy usage. It should *NOT* use nor strip the port number
   from the URL but simply pass that information to the proxy. This also
   made the user/password fields possible to use in proxy [ftp-] URLs.
   (like in ftp://user:password@ftp.my.site:8021/README)

 Johan Andersson <johan@homemail.com>
 - Implemented HTTP proxy support.
 - Receive byte counter added.

 Bjorn Reese <breese@imada.ou.dk>
 - Implemented URLs (and skipped the old syntax).
 - Output is written to stdout, so to achieve the above example, do:
   httpget http://143.54.10.6/info_logo.gif > test.gif

Version 1.1
 Daniel Stenberg <Daniel.Stenberg@sth.frontec.se>
 - Adjusted it slightly to accept named hosts on the command line. We
   wouldn't wanna use IP numbers for the rest of our lives, would we?

Version 1.0
  Rafael Sagula <sagula@inf.ufrgs.br>
  - Wrote the initial httpget, which started all this!