- Fixed the silly mistake that caused the POST bug. This has now also been
tested to work with a proxy.
Version 3.5
Daniel Stenberg
- Highly inspired by Rafael Sagula's changes to the 3.1 that added an almost
functional POST, I applied his changes into this version and made them work.
(It seems POST requires the Content-Type and Content-Length headers.) It is
now usable with the -d switch.
Version 3.3 - 3.4
Skipped to avoid confusion
Version 3.2
Daniel Stenberg
- Major rewrite of two crucial parts of this code: upload and download.
They are both now using a select() switch, that allows much better
progress meter and time control.
- alarm() usage removed completely
- FTP get can now list directory contents if the path ends with a slash '/'.
Running urlget on an ftp path that doesn't end with a slash means urlget will
attempt to get it as a file name.
- FTP directory view supports -l for "list-only" which lists the file names
only.
- All operations support -m for max time usage in seconds allowed.
- FTP upload now allows the size of the uploaded file to be provided, and
thus it can better check it actually uploaded the whole file. It also
makes the progress meter for uploads much better!
- Made the parameter parsing fail in cases like 'urlget -r 900' which
previously tried to connect to the host named '900'.
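The select()-driven upload/download loop described above can be sketched roughly as follows. This is an illustrative sketch, not the actual urlget source; the function name `transfer`, the buffer size, and the once-a-second wakeup are assumptions made for the example:

```c
#include <stdio.h>
#include <sys/select.h>
#include <sys/time.h>
#include <time.h>
#include <unistd.h>

/* Sketch of a select()-based download loop: data is read as it becomes
   available, a progress line is refreshed at least once a second, and an
   overall max-time limit (as with the -m option) is enforced. */
int transfer(int sockfd, long maxtime_secs)
{
  char buf[4096];
  long total = 0;
  time_t start = time(NULL);

  for(;;) {
    fd_set readfds;
    struct timeval timeout = { 1, 0 };  /* wake up at least once a second */

    FD_ZERO(&readfds);
    FD_SET(sockfd, &readfds);

    int rc = select(sockfd + 1, &readfds, NULL, NULL, &timeout);
    if(rc < 0)
      return -1;                        /* select() error */

    if(rc > 0) {
      ssize_t n = read(sockfd, buf, sizeof(buf));
      if(n <= 0)
        break;                          /* EOF or read error */
      fwrite(buf, 1, (size_t)n, stdout);
      total += n;
    }

    fprintf(stderr, "\r%ld bytes", total);  /* progress meter tick */

    if(maxtime_secs && time(NULL) - start >= maxtime_secs)
      return -2;                        /* -m time limit exceeded */
  }
  return 0;
}
```

Because select() returns after the timeout even when no data arrives, the progress meter and the time check both keep running on a stalled connection, which an alarm()-free blocking read() loop cannot do.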
Version 3.1
Kjell Ericson
- Pointed out how to correct the 3 warnings in win32-compiles.
Daniel Stenberg
- Removed all calls to exit().
- Made the short help text get written to stdout instead of stderr.
- Created this file instead of keeping these comments in the source.
- Made two callback hooks that let external programs use urlget() more
easily and grab the output/offer the input more easily.
- It is evident that Win32-compiles are painful. I watched the output from
the Borland C++ v5 and it was awful. Just ignore all those warnings.
Version 3.0
Daniel Stenberg
- Added FTP upload capabilities. The name urlget gets a bit silly now
when we can put too... =)
- Restructured the source quite a lot.
Changed the urlget() interface. This way, we will survive changes much
better. New features can come and old can be removed without us needing
to change the interface. I've written a small explanation in urlget.h
that explains it.
- New flags include -t, -T, -O and -h. The -h text is generated by the new
mkhelp script.
Version 2.9
Remco van Hooff
- Added a fix to make it compile smoothly on Amiga using the SAS/C
compiler.
Daniel Stenberg
- Believe it or not, but the STUPID Novell web server seems to require
that the Host: keyword is used, so I use it and re-introduce the urlget
User-Agent:. I still have to check that this Host: usage works with
proxies... 'Host:' is required for HTTP/1.1 GET according to RFC2068.
Version 2.8
Rafael Sagula
- some little modifications
Version 2.7
Daniel Stenberg
- Removed the -l option and introduced the -f option instead. Now I'll
rewrite the former -l kludge in an external script that'll use urlget to
fetch multipart files like that.
- '-f' is introduced, it means Fail without output in case of HTTP server
errors (return code >=300).
- Added support for -r, ranges. Specify which part of a document you
want, and only that part is returned. Only with HTTP/1.1-servers.
- Split up the source in 3 parts. Now all pure URL functions are in
urlget.c and stuff that deals with the stand-alone program is in main.c.
- I took a few minutes and wrote an embryo of a README file to explain
a few things.
Version 2.6
Daniel Stenberg
- Made the -l (loop) thing use the new CONF_FAILONERROR which makes
urlget() return error code if non-successful. It also won't output anything
then. Now finally removed the HTTP 1.0 and error 404 dependencies.
- Added -I which uses the HEAD request to get the header only from a
http-server.
Version 2.5
Rafael Sagula
- Made the progress meter use HHH:MM:SS instead of only seconds.
Version 2.4
Daniel Stenberg
- Added progress meter. It appears when downloading > BUFFER SIZE and
mute is not selected. I found out that when downloading large files from
really really slow sites, it is desirable to know the status of the
download. Do note that some downloads are done unaware of the size, which
makes the progress meter less thrilling ;) If the output is sent to a tty,
the progress meter is shut off.
- Increased buffer size used for reading.
- Added length checks in the user+passwd parsing.
- Made it grok user+passwd for HTTP fetches. The trick is to base64
encode the user+passwd and send an extra header line. Read chapter 11.1 in
RFC2068 for details. I added it to be used just like the ftp one. To get a
http document from a place that requires user and password, use an URL
like:
http://user:passwd@www.site.to.leach/doc.html
I also added the -u flag, since WHEN USING A PROXY YOU CAN'T SPECIFY THE
USER AND PASSWORD WITH HTTP LIKE THAT. The -u flag works for ftp too, but
not if used with proxy. To do the same as the above one, you can invoke:
urlget -u user:passwd http://www.site.to.leach/doc.html
Version 2.3
Rafael Sagula
- Added "-o" option (output file)
- Added URG_HTTP_NOT_FOUND return code.
(Daniel's note:)
Perhaps we should detect all kinds of errors and instead of writing that
custom string for the particular 404-error, use the error text we actually
get from the server. See further details in RFC2068 (HTTP 1.1
definition). The current way also relies on an HTTP/1.0 reply, which newer
servers might not do.
- Looping mode ("-l" option). It's easier to get various split files.
(Daniel's note:)
Use it like 'urlget -l 1 http://from.this.site/file%d.html', which will
make urlget attempt to fetch all files named file1.html, file2.html etc
until no more files are found. This is only a modification of the
STAND_ALONE part; nothing in the urlget() function was modified for this.
Daniel Stenberg
- Changed the -h to be -i instead. -h should be reserved for help use.
- Bjorn Reese indicated that Borland _might_ use '_WIN32' instead of the
VC++ WIN32 define and therefore I added a little fix for that.
Version 2.2
Johan Andersson
- The urlget function didn't set the path to url when using proxy.
- Fixed bug with IMC proxy. Now using (almost) complete GET command.
Daniel Stenberg
- Made it compile on Solaris. Had to reorganize the includes a bit.
(so Win32, Linux, SunOS 4 and Solaris 2 compile fine.)
- Made Johan's keepalive keyword optional with the -k flag (since it
makes a lot of urlgets take a lot longer time).
- Made a '-h' switch in case you want the HTTP-header in the output.
Version 2.1
Daniel Stenberg and Kjell Ericson
- Win32-compilable
- No more global variables
- Mute option (no output at all to stderr)
- Full range of return codes from urlget(), which is now written as a
function that is easy to use in [other] programs.
- Define STAND_ALONE to compile the stand alone urlget program
- Now compiles with gcc options -ansi -Wall -pedantic ;)
Version 2.0
- Introducing ftp GET support. The FTP URL type is recognized and used.
- Renamed the project to 'urlget'.
- Supports the user+passwd in the FTP URL (otherwise it tries anonymous
login with a weird email address as password).
Version 1.5
Daniel Stenberg
- The skip_header() crap messed it up big-time. By simply removing that
one we can all of a sudden download anything ;)
- No longer requires a trailing slash on the URLs.
- If the given URL isn't prefixed with 'http://', HTTP is assumed and
given a try!
- 'void main()' is history.
Version 1.4
Daniel Stenberg
- The gopher source used the ppath variable instead of path which could
lead to disaster.
Version 1.3
Daniel Stenberg
- Well, I added a lame text about the time it took to get the data. I also
fought against Johan to prevent his -f option (to specify a file name
that should be written instead of stdout)! =)
- Made it write 'connection refused' for that particular connect()
problem.
- Renumbered the version. Let's not make silly 1.0.X versions, this is
a plain 1.3 instead.
Version 1.2
Johan Andersson
- Discovered and fixed the problem with getting binary files. puts() is
now replaced with fwrite(). (Daniel's note: this also fixed the buffer
overwrite problem I found in the previous version.)
- Bugfixed the proxy usage. It should *NOT* use nor strip the port number
from the URL but simply pass that information to the proxy. This also
made the user/password fields possible to use in proxy [ftp-] URLs.
(like in ftp://user:password@ftp.my.site:8021/README)
- Implemented HTTP proxy support.
- Receive byte counter added.
- Implemented URLs (and skipped the old syntax).
- Output is written to stdout, so to achieve the above example, do:
httpget http://143.54.10.6/info_logo.gif > test.gif
Version 1.1
- Adjusted it slightly to accept named hosts on the command line. We
wouldn't wanna use IP numbers for the rest of our lives, would we?
Version 1.0
- Wrote the initial httpget, which started all this!