                                  _   _ ____  _
                              ___| | | |  _ \| |
                             / __| | | | |_) | |
                            | (__| |_| |  _ <| |___
                             \___|\___/|_| \_\_____|
    
    TODO
    
    
     Ok, this is what I wanna do with Curl. Please tell me what you think, and
     please don't hesitate to contribute and send me patches that improve this
     product! (Yes, you may add things not mentioned here, these are just a
     few teasers...)
    
    
     * Make SSL session ids get used if multiple HTTPS documents from the same
       host are requested.
    
    
     * Improve the command line option parser to accept '-m300' as well as the
       '-m 300' convention, treating an attached value exactly as if it had
       been given as the next, space separated argument.
    
    
     * Make the curl tool support URLs that start with @ that would then mean that
       the following is a plain list with URLs to download. Thus @filename.txt
       reads a list of URLs from a local file. A fancy option would then be to
       support @http://whatever.com that would first load a list and then get the
       URLs mentioned in the list. I figure -O or something would have to be
       implied by such an action.
    
    
     * Make curl work with multiple URLs, even outside of {}-letters. I could
       also imagine an optional fork()ed system that downloads each URL in its
       own process. It should of course have a maximum number of simultaneous
       fork()s.
    
    
     * Improve the regular progress meter when --continue is used. It should be
       noticeable when there's a resume going on.
    
     * Add a command line option that allows the output file to get the same time
       stamp as the remote file. This requires some fiddling on FTP but comes
       almost free for HTTP.
    
    
     * Make the SSL layer option capable of using the Mozilla Security Services as
       an alternative to OpenSSL:
       http://www.mozilla.org/projects/security/pki/nss/
    
    
     * Make sure the low-level interface works. highlevel.c should basically be
       possible to write using that interface. Document the low-level interface.
    
     * Make the easy-interface support multiple file transfers. If they're done
       to the same host, they should use persistent connections or similar.
    
    
     * Add asynchronous name resolving, as this enables full timeout support for
       fork() systems.
    
     * Move non-URL related functions that are used by both the lib and the curl
       application to a separate "portability lib".
    
    
     * Add support for other languages than C.  C++ (rumours have been heard
       about something being worked on in this area) and perl (we have seen the
       first versions of this!) come to mind. Python anyone?
    
     * "Content-Encoding: compress/gzip/zlib"
    
       HTTP 1.1 clearly defines how to get and decode compressed documents, and
       the zlib library is pretty good at decompressing stuff. This work was
       started in October 1999 but halted again since it proved more work than
       we thought. It is still a good idea to implement though.
    
    
     * Authentication: NTLM. It would be cool to support that MS crap called
       NTLM authentication. MS proxies and servers sometimes require it. Since
       that protocol is a proprietary one, it involves reverse engineering and
       network sniffing. This should however be a library-based functionality.
       There are a few different efforts "out there" to make open source HTTP
       clients support this, and it should be possible to take advantage of
       other people's hard work. http://modntlm.sourceforge.net/ is one.
    
    
     * RFC2617 compliance, "Digest Access Authentication"
       A valid test page seems to exist at:
        http://hopf.math.nwu.edu/testpage/digest/
       And some friendly person's server source code is available at
        http://hopf.math.nwu.edu/digestauth/index.html

       Then there's the Apache mod_digest source code too of course.  It seems
       as if Netscape doesn't support this, and not many servers do, even
       though it is a much better authentication method than the more common
       "Basic". Basic sends the password in cleartext over the network, while
       this "Digest" method uses a challenge-response protocol which increases
       security quite a lot.
    
     * Multiple Proxies?
       Is there anyone that actually uses serial-proxies? I mean, send CONNECT to
       the first proxy to connect to the second proxy to which you send CONNECT to
       connect to the remote host (or even more iterations). Is there anyone
       wanting curl to support it? (Not that it would be hard, just confusing...)
    
     * Other proxies
       FTP-style proxies, SOCKS5, whatever kinds of proxies are there?
    
    
     * IPv6 Awareness and support

       Wherever it would fit. Making configure search for v6-versions of a few
       functions and then using them instead is of course the first thing to
       do... RFC 2428 "FTP Extensions for IPv6 and NATs" will be interesting.
       PORT should be replaced with EPRT for IPv6, and EPSV instead of PASV.
    
     * SSL for more protocols, like SSL-FTP...
       (http://search.ietf.org/internet-drafts/draft-murray-auth-ftp-ssl-05.txt)
    
     * HTTP POST resume using Range: