Provided by: tclcurl_7.19.6-3_amd64


       TclCurl - get a URL with FTP, FTPS, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, FILE or
       LDAP syntax.



       curlHandle configure ?options?

       curlHandle perform

       curlHandle getinfo curlinfo_option

       curlHandle cleanup

       curlHandle reset

       curlHandle duhandle

       curlHandle pause

       curlHandle resume

       curl::transfer ?options?


       curl::escape url

       curl::unescape url

       curl::curlConfig option

       curl::versioninfo option

       curl::easystrerror errorCode


       The TclCurl extension gives Tcl programmers access to the libcurl library written by
       Daniel Stenberg. With it you can download URLs, upload them and perform many other neat
       tricks; for more information check


       This procedure must be the first one called; it returns a curlHandle that you need to use
       to invoke TclCurl procedures. The init call initializes curl, and it MUST have a
       corresponding call to cleanup when the operation is completed. You should perform all
       your sequential file transfers using the same curlHandle. This enables TclCurl to use
       persistent connections when possible.
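       As a sketch of that lifecycle (assuming the TclCurl package is installed; the URL and
       file names are placeholders):

```tcl
package require TclCurl

# Create the handle: this must be the first TclCurl call.
set curlHandle [curl::init]

# Configure the transfer, then run it; perform throws on failure.
$curlHandle configure -url "http://example.com" -file "index.html"
if {[catch {$curlHandle perform} result]} {
    puts "Transfer failed: $result"
}

# Reuse the same handle for further sequential transfers to benefit
# from persistent connections, then release it when done.
$curlHandle cleanup
```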


       curlHandle to use.

curlHandle configure ?options?

       configure is called to set the options for the transfer. Most operations in  TclCurl  have
       default actions, and by using the appropriate options you can make them behave differently
       (as documented). All options are set with the option followed by a parameter.

       Note: the options set with this procedure are valid for the forthcoming data transfers
       that are performed when you invoke perform.

       The  options  are  not  reset  between  transfers  (except  where  noted),  so if you want
       subsequent transfers with different options, you must change them between  the  transfers.
       You can optionally reset all options back to the internal default with curlHandle reset.

       curlHandle is the value returned by the curl::init call.


Behaviour options

              Set the parameter to 1 to get the library to display a lot of verbose information
              about its operations. Very useful for libcurl and/or protocol debugging.

              You hardly ever want this set in production use; you will almost always want it
              when you debug or report problems. Another neat option for debugging is -debugproc.

              A 1 tells the extension to include the headers in the body  output.  This  is  only
              relevant for protocols that actually have headers preceding the data (like HTTP).

              A 1 tells the extension to turn on the built-in progress meter. Nowadays it is
              turned off by default.

              A 1 tells TclCurl not to use any functions that install signal handlers or any
              functions  that cause signals to be sent to the process. This option is mainly here
              to allow multi-threaded unix applications to still set/use all timeout options etc,
              without risking getting signals.

Callback options

              Use  it  to set a Tcl procedure that will be invoked by TclCurl as soon as there is
              received data that needs to be saved. The procedure will receive a single parameter
              with the data to be saved.

              NOTE: you will be passed as much data as possible on each invocation, but you
              cannot make any assumptions about the amount. It may be nothing if the file is
              empty or it may be thousands of bytes.

       -file  File in which the transferred data will be saved.

              Sets  a  Tcl  procedure to be called by TclCurl as soon as it needs to read data in
              order to send it to the peer. The procedure has to take one parameter,  which  will
              contain the maximum number of bytes to read. It should return the actual number of
              bytes read, or '0' if you want to stop the transfer.

              File from which the data will be transferred.
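              For instance, a write callback that accumulates the received data in a Tcl
              variable might look like this (a sketch; the option name -writeproc used to
              register the callback is an assumption, and the URL is a placeholder):

```tcl
package require TclCurl

set received ""

# Invoked by TclCurl with each chunk of received data.
proc WriteCallback {data} {
    append ::received $data
}

set curlHandle [curl::init]
# -writeproc is the assumed name of the write-callback option.
$curlHandle configure -url "http://example.com" -writeproc WriteCallback
catch {$curlHandle perform}
$curlHandle cleanup
puts "Got [string length $received] bytes"
```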

              Name of the Tcl procedure that will be invoked by TclCurl at a frequent interval
              during operation (roughly once per second), no matter if data is being transferred
              or not.  Unknown/unused argument values passed to the callback will be set to  zero
              (like  if  you only download data, the upload size will remain 0), the prototype of
              the procedure must be:

              proc ProgressCallback {dltotal dlnow ultotal ulnow}

              For this option to work you have to set the noprogress option to '0'. Setting
              this option to the empty string will restore the original progress meter.
              If you transfer data with the multi interface, this procedure will not be called
              during periods of idleness unless you call the appropriate procedure that performs
              the transfers.
              You can pause and resume a transfer from within this procedure using the pause  and
              resume commands.
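              A minimal progress callback following that prototype could be sketched as below;
              the option name -progressproc used to register it is an assumption, and the URL
              is a placeholder. Remember to set noprogress to 0:

```tcl
package require TclCurl

# Matches the required prototype; unused argument values arrive as zero.
proc ProgressCallback {dltotal dlnow ultotal ulnow} {
    if {$dltotal > 0} {
        puts [format "downloaded %.0f%%" [expr {100.0 * $dlnow / $dltotal}]]
    }
}

set curlHandle [curl::init]
$curlHandle configure -url "http://example.com" -file "page.html" \
    -noprogress 0 -progressproc ProgressCallback
catch {$curlHandle perform}
$curlHandle cleanup
```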

              Pass the file name to be used to write the header part of the received data to.
              The headers are guaranteed to be written one-by-one to this file and only  complete
              lines are written. Parsing headers should be easy enough using this.

              Name of the procedure that will receive the debug data produced by the -verbose
              option. It should match the following prototype:

              debugProc {infoType data}

              where infoType specifies what kind of information it is (0 text, 1 incoming header,
              2  outgoing  header,  3  incoming  data,  4  outgoing  data, 5 incoming SSL data, 6
              outgoing SSL data).
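              A debug procedure matching that prototype, registered together with -verbose,
              might be sketched as follows (the URL is a placeholder):

```tcl
package require TclCurl

# infoType: 0 text, 1 incoming header, 2 outgoing header, 3 incoming
# data, 4 outgoing data, 5/6 SSL data.
proc debugProc {infoType data} {
    switch -- $infoType {
        0 { puts "info: $data" }
        1 { puts "<< $data" }
        2 { puts ">> $data" }
    }
}

set curlHandle [curl::init]
$curlHandle configure -url "http://example.com" -verbose 1 -debugproc debugProc
catch {$curlHandle perform}
$curlHandle cleanup
```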

Error Options

              Pass a variable name where TclCurl may store human-readable error messages. This
              may be more helpful than just the return code from the command.

              Pass  a  file  name  as  parameter. This is the stream to use internally instead of
              stderr when reporting errors.

              A 1 parameter tells the extension to fail silently if the  HTTP  code  returned  is
              equal  or larger than 400. The default action would be to return the page normally,
              ignoring that code.

              This method is not fail-safe and there are occasions where non-successful  response
              codes will slip through, especially when authentication is involved (response codes
              401 and 407).

              You might get  some  amounts  of  headers  transferred  before  this  situation  is
              detected,  like  for  when a "100-continue" is received as a response to a POST/PUT
              and a 401 or 407 is received immediately afterwards.

Network options

       -url   The actual URL to deal with.

              If the given URL lacks the protocol part ("http://" or "ftp://" etc), TclCurl will
              attempt to guess which protocol to use based on the given host name. If the
              protocol of the set URL is not supported, TclCurl will return the unsupported
              protocol error when you call perform. Use curl::versioninfo for detailed info on
              which protocols are supported.

              NOTE: this is the one option required to be set before perform is called.

        -proxy If you need to use an HTTP proxy to access the outside world, set the proxy
               string with this option. To specify the port number in this string, append
               :[port] to the end of the host name. The proxy string may be prefixed with
               [protocol]:// since any such prefix will be ignored.

               When you tell the extension to use an HTTP proxy, TclCurl will transparently
               convert operations to HTTP even if you specify an FTP URL etc. This may have an
               impact on what other features of the library you can use, such as quote and
               similar FTP specifics that will not work unless you tunnel through the HTTP
               proxy. Such tunneling is activated with proxytunnel.

              TclCurl respects the environment variables http_proxy, ftp_proxy, all_proxy etc, if
              any of those are set. The use of this option does however override any possibly set
              environment variables.

              Setting the proxy string to "" (an empty string) will explicitly disable the use of
              a proxy, even if there is an environment variable set for it.

               The proxy host string can be specified the exact same way as the proxy
               environment variables, including the protocol prefix (http://) and embedded
               user + password.

              Use  this  option  to set the proxy port to use unless it is specified in the proxy
              string by -proxy.

               Pass the type of the proxy. Available options are 'http', 'socks4', 'socks4a'
               and 'socks5', with 'http' being the default.

              Set  the  parameter  to  1  to  get the extension to tunnel all non-HTTP operations
              through the given HTTP proxy. Do note that there is a big difference between  using
              a  proxy  and tunneling through it. If you don't know what this means, you probably
              don't want this tunnel option.
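               Putting those proxy options together (a sketch; the proxy host and URL are
               placeholders, and -proxytunnel is the configure-option spelling of the
               proxytunnel setting described above):

```tcl
# Route an FTP transfer through an HTTP proxy; -proxytunnel 1 makes
# the non-HTTP protocol tunnel through the proxy instead of being
# converted to HTTP.
$curlHandle configure -url "ftp://ftp.example.com/file.txt" \
    -proxy "proxy.example.com:3128" -proxytunnel 1
```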

              Pass the interface name to use as outgoing network interface. The name  can  be  an
              interface name, an IP address or a host name.

               This sets the local port number of the socket used for the connection. It can be
               used in combination with -interface, and you are recommended to also use
               -localportrange when this is set. Note that the only valid port numbers are
               1 - 65535.

               This is the number of attempts TclCurl should make to find a working local port
               number. It starts with the given -localport and adds one to the number for each
               retry. Setting this value to 1 or below will make TclCurl only try the exact
               port number. Note that port numbers by nature are a scarce resource that will be
               busy at times, so setting this value too low might cause unnecessary connection
               setup failures.

              Pass the timeout in seconds. Name resolves will be kept in memory for  this  number
              of  seconds.  Set  to '0' to completely disable caching, or '-1' to make the cached
              entries remain forever. By default, TclCurl caches this info for 60 seconds.

               If the value passed is 1, it tells TclCurl to use a global DNS cache that will
               survive between curl handle creations and deletions. This is not thread-safe as
               it uses a global variable.

              WARNING: this option is considered obsolete. Stop using it. Switch  over  to  using
              the share interface instead! See tclcurl_share.

              Pass  your preferred size for the receive buffer in TclCurl. The main point of this
              would be that the write callback gets called more often and  with  smaller  chunks.
              This  is  just  treated  as  a  request,  not an order. You cannot be guaranteed to
              actually get the given size.


              Pass the number specifying what remote port to  connect  to,  instead  of  the  one
              specified in the URL or the default port for the used protocol.


              Pass a number to specify whether the TCP_NODELAY option should be set or cleared (1
              = set, 0 = clear).  The option is cleared by default.  This  will  have  no  effect
              after the connection has been established.

               Setting this option will disable TCP's Nagle algorithm. The purpose of this
               algorithm is to try to minimize the number of small packets on the network
               (where "small packets" means TCP segments less than the Maximum Segment Size
               (MSS) for the network).
              Maximizing the amount of data sent per TCP segment is good because it amortizes the
              overhead  of the send. However, in some cases (most notably telnet or rlogin) small
              segments may need to be sent without delay. This is  less  efficient  than  sending
              larger  amounts  of data at a time, and can contribute to congestion on the network
              if overdone.

              Pass a number specifying the scope_id value to use when connecting  to  IPv6  link-
              local or site-local addresses.

Names and Passwords options

        -netrc A 1 parameter tells the extension to scan your ~/.netrc file to find the user
               name and password for the remote site you are about to access. Do note that
               TclCurl does not verify that the file has the correct properties set (as the
               standard unix ftp client does), and that only machine name, user name and
               password are taken into account (init macros and similar things are not
               supported).

              You can set it to the following values:

                   The use of your ~/.netrc file is optional, and information in the URL is to be
                   preferred. The file will be scanned with the host and user name (to  find  the
                   password only) or with the host only, to find the first user name and password
                    after that machine, whichever information is not specified in the URL.

                   Undefined  values  of  the  option  will  have this effect.

                   The extension will ignore the file and use only the information  in  the  URL.
                   This is the default.

                   This  value  tells the library that use of the file is required, to ignore the
                   information in the URL, and to search the file with the host only.

              Pass a string containing the full path name to the file you want to use  as  .netrc
              file. For the option to work, you have to set the netrc option to required. If this
               option is omitted, and netrc is set, TclCurl will attempt to find a .netrc file
              in the current user's home directory.

              Pass  a  string  as parameter, which should be [username]:[password] to use for the
              connection. Use httpauth to decide authentication method.

              When using NTLM, you can  set  domain  by  prepending  it  to  the  user  name  and
              separating the domain and name with a forward (/) or backward slash (\). Like this:
              "domain/user:password" or "domain\user:password". Some HTTP  servers  (on  Windows)
              support this style even for Basic authentication.

              When  using  HTTP  and  -followlocation,  TclCurl might perform several requests to
              possibly different hosts. TclCurl will only send this user and password information
              to  hosts  using  the  initial  host  name (unless -unrestrictedauth is set), so if
              TclCurl follows locations to other hosts it will not send the user and password  to
              those. This is enforced to prevent accidental information leakage.

              Pass  a  string  as parameter, which should be [username]:[password] to use for the
              connection to the HTTP proxy.

              Set to the authentication method you want, the available ones are:

                    HTTP Basic authentication. This is the default choice, and the only method
                    that is in widespread use and supported virtually everywhere. It sends the
                    user name and password over the network in plain text, easily captured by
                    eavesdroppers.
                    HTTP Digest authentication. Digest authentication is a more secure way to
                    do authentication over public networks than the regular old-fashioned Basic
                    method.
                    HTTP GSS-Negotiate authentication. The GSS-Negotiate method, also known as
                    plain "Negotiate", was designed by Microsoft and is used in their web
                    applications. It is primarily meant as a support for Kerberos5
                    authentication but may also be used along with other authentication
                    methods.

              ntlm HTTP  NTLM  authentication.  A  proprietary  protocol  invented  and  used  by
                   Microsoft.   It  uses a challenge-response and hash concept similar to Digest,
                   to prevent the password from being eavesdropped.

              any  TclCurl will automatically select the one it finds most secure.

                    It may use anything but Basic; TclCurl will automatically select the one it
                    finds most secure.

              Use  it  to tell TclCurl which authentication method(s) you want it to use for your
              proxy authentication. Note that for some methods, this will induce an extra network
              round-trip. Set the actual name and password with the proxyuserpwd option.

              The  methods  are  those  listed above for the httpauth option. As of this writing,
              only Basic and NTLM work.
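               For example, authenticating against both the remote server and a proxy might be
               sketched as below (all host names and credentials are placeholders; -userpwd and
               -proxyuserpwd are the configure-option spellings of the userpwd and proxyuserpwd
               settings described above):

```tcl
$curlHandle configure -url "http://example.com/private" \
    -httpauth digest -userpwd "user:secret" \
    -proxy "proxy.example.com:3128" \
    -proxyauth basic -proxyuserpwd "proxyuser:proxypass"
```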

HTTP options

               Pass a 1 parameter to enable this. When enabled, TclCurl will automatically set
              the Referer: field in requests where it follows a Location: redirect.

              Sets  the  contents  of  the  Accept-Encoding:  header sent in an HTTP request, and
              enables decoding of a response when a Content-Encoding: header is received.   Three
              encodings  are  supported: identity, which does nothing, deflate which requests the
              server to compress its response using the zlib algorithm, and gzip  which  requests
              the  gzip  algorithm.   Use  all  to send an Accept-Encoding: header containing all
              supported encodings.

              This is a request, not an order; the server may or may not do it.  This option must
              be  set  or  else  any  unsolicited encoding done by the server is ignored. See the
              special file lib/README.encoding in libcurl docs for details.

               A 1 tells the library to follow any Location: header that the server sends as
               part of an HTTP header.

               NOTE: this means that the extension will re-send the same request on the new
               location and follow new Location: headers all the way until no more such headers
               are returned. -maxredirs can be used to limit the number of redirects TclCurl
               will follow.
               A 1 parameter tells the extension it can continue to send authentication
               (user+password) when following locations, even when the host name changes. Note
               that this is meaningful only when setting -followlocation.

               Sets the redirection limit. If that many redirections have been followed, the
               next redirect will cause an error. This option only makes sense if the
               -followlocation option is used at the same time. Setting the limit to 0 will
               make libcurl refuse any redirect. Set it to -1 for an infinite number of
               redirects (which is the default).
               A 1 tells TclCurl to respect RFC 2616/10.3.2 and not convert  POST  requests  into
              GET  requests when following a 301 redirection. The non-RFC behaviour is ubiquitous
              in web browsers, so the conversion is done  by  default  to  maintain  consistency.
              However,  a  server  may  require a POST to remain a POST after such a redirection.
              This option is meaningful only when setting -followlocation.
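               Combining the redirect options above (a sketch; the URL is a placeholder, and
               -post301 is an assumed name for the 301-handling option just described):

```tcl
# Follow up to 5 redirects and keep a POST as a POST after a 301
# (-post301 is assumed to be the name of that option).
$curlHandle configure -url "http://example.com/old-page" \
    -followlocation 1 -maxredirs 5 -post301 1
```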

        -put   A 1 parameter tells the extension to use HTTP PUT to transfer a file. The file
               to put must be set with -infile and -infilesize.

              This option is deprecated starting with version 0.12.1, you should use -upload.
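               An upload using the non-deprecated -upload option might be sketched as follows
               (file name and URL are placeholders):

```tcl
# Upload a local file with HTTP PUT; -infile names the source file
# and -infilesize gives its size in bytes.
$curlHandle configure -url "http://example.com/upload/data.bin" \
    -upload 1 -infile "data.bin" -infilesize [file size "data.bin"]
catch {$curlHandle perform}
```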

        -post  A 1 parameter tells the library to do a regular HTTP post. This is a normal
              application/x-www-form-urlencoded kind, which is the most commonly used one by HTML
              forms.  See  the  -postfields  option  for  how  to  specify  the  data to post and
              -postfieldsize about how to set the data size.

               Use the -postfields option to specify what data to post and -postfieldsize to
               set the data size. Optionally, you can provide data to POST using the -readproc
               callback.

               You can override the default POST Content-Type: header by setting your own with
               -httpheader.
              Using  POST  with HTTP 1.1 implies the use of a "Expect: 100-continue" header.  You
              can disable this header with -httpheader as usual.

               If you use POST to an HTTP 1.1 server, you can send data without  knowing  the  size
              before  starting  the POST if you use chunked encoding. You enable this by adding a
              header like "Transfer-Encoding:  chunked"  with  -httpheader.   With  HTTP  1.0  or
              without chunked transfer, you must specify the size in the request.

               When setting post to 1, nobody will automatically be set to 0.

              NOTE: if you have issued a POST request and want to make a HEAD or GET instead, you
              must explicitly pick the new request type using -nobody or -httpget or similar.

              Pass a string as parameter, which should be the full data to post in  a  HTTP  POST
              operation.  You  must  make  sure  that  the data is formatted the way you want the
              server to receive it. TclCurl will not convert or encode  it  for  you.   Most  web
              servers will assume this data to be url-encoded.

              This  is  a  normal  application/x-www-form-urlencoded   kind,  which  is  the most
              commonly used one by HTML forms.

               If you want to do a zero-byte POST, you need to set -postfieldsize explicitly to
               zero, as simply setting -postfields to NULL or "" just effectively disables the
               sending of the specified string. TclCurl will instead assume that the POST data
               will be sent using the read callback!

              Using  POST  with HTTP 1.1 implies the use of a "Expect: 100-continue" header.  You
              can disable this header with -httpheader as usual.
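               A plain url-encoded POST using these options might look like this (a sketch;
               the URL and field data are placeholders):

```tcl
# Regular application/x-www-form-urlencoded POST; the data must
# already be encoded the way the server expects it.
$curlHandle configure -url "http://example.com/form" \
    -post 1 -postfields "name=tclcurl&version=7.19.6"
catch {$curlHandle perform}
```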

               Note: to make multipart/formdata posts (aka rfc1867-posts), check out -httppost.

              If  you  want  to  post data to the server without letting TclCurl do a strlen() to
              measure the data size, this option must be used. Also, when this  option  is  used,
              you  can  post fully binary data which otherwise is likely to fail. If this size is
              set to zero, the library will use strlen() to get the data size.

               Tells TclCurl you want a multipart/formdata HTTP POST to be made, and instructs
               it what data to pass on to the server through a Tcl list.

              This is the only case where the data is reset after a transfer.

              First, there are some basics you need to understand about multipart/formdata posts.
              Each part consists of at least a NAME and a CONTENTS part. If the part is made  for
               file upload, there are also a stored CONTENT-TYPE and a FILENAME. Below, we'll
               discuss what options you use to set these properties in the parts you want to
               add to your post.

               The list must contain a 'name' tag with the name of the section followed by a
               string with the name. There are three tags to indicate the value of the section:
               'value' followed by a string with the data to post, 'file' followed by the name
               of the file to post, and 'contenttype' with the type of the data (text/plain,
               image/jpg, ...). You can also indicate a false file name with 'filename'; this
               is useful in case the server checks if the given file name is valid, for
               example, by testing if it starts with 'c:\' as any real file name does, or if
               you want to include the full path of the file to post. You can also post the
               content of a variable as if it were a file with the options 'bufferName' and
               'buffer', or use 'filecontent' followed by a file name to read that file and use
               its contents as the data to post.
              Should  you  need  to  specify  extra  headers  for  the  form  POST  section,  use
              'contentheader' followed by a list with the headers to post.

              Please see 'httpPost.tcl' and 'httpBufferPost.tcl' for examples.

               If TclCurl can't set the data to post, an error will be returned:

              1    If the memory allocation fails.

              2    If one option is given twice for one form.

              3    If an empty string was given.

              4    If an unknown option was used.

               5    If some form info is not complete (or in error).

              6    If an illegal option is used in an array.

               7    If TclCurl has no HTTP support.
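               A multipart/formdata POST built from such lists might be sketched as below
               (URL, field names and file name are placeholders; repeating -httppost once per
               section follows the pattern of the httpPost.tcl example):

```tcl
# One section posting a literal value and one uploading a file with
# an explicit content type.
$curlHandle configure -url "http://example.com/upload" \
    -httppost [list name "description" value "test upload"] \
    -httppost [list name "attachment" file "report.txt" \
                    contenttype "text/plain"]
catch {$curlHandle perform}
```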

              Pass a string as parameter. It will be used to set the referer header in  the  http
              request sent to the remote server. This can be used to fool servers or scripts. You
              can also set any custom header with -httpheader.

              Pass a string as parameter. It will be used to set the user-agent:  header  in  the
              http  request  sent  to  the  remote  server.  This  can be used to fool servers or
              scripts. You can also set any custom header with -httpheader.

              Pass a list with the HTTP headers to pass to the server in your  request.   If  you
              add a header that is otherwise generated and used by TclCurl internally, your added
              one will be used instead. If you add a header with no contents as in 'Accept:', the
              internally  used header will just get disabled. Thus, using this option you can add
              new headers, replace and remove internal headers.

               The headers included in the list must not be CRLF-terminated, because TclCurl
               adds CRLF after each header item. Failure to comply with this will result in
               strange bugs because the server will most likely ignore part of the headers you
               pass in.
              The first line in a request (containing the method, usually a GET or POST) is not a
              header and cannot be replaced using this  option.  Only  the  lines  following  the
              request-line are headers. Adding this method line in this list of headers will only
              cause your request to send an invalid header.

               NOTE: The most commonly replaced headers have "shortcuts" in the options:
               -cookie, -useragent, and -referer.

              Pass  a  list  of  aliases  to be treated as valid HTTP 200 responses. Some servers
              respond with a custom header response line. For example,  IceCast  servers  respond
              with  "ICY  200 OK". By including this string in your list of aliases, the response
              will be treated as a valid HTTP header line such as "HTTP/1.0 200 OK".

               NOTE: The alias itself is not parsed for any version strings. Before version
               7.16.3, TclCurl used the value set by option httpversion, but starting with
               7.16.3 the protocol is assumed to match HTTP 1.0 when an alias matched.

              Pass a string as parameter.  It will be used to set a cookie in the  http  request.
              The  format  of the string should be '[NAME]=[CONTENTS];'. Where NAME is the cookie
              name and  CONTENTS is what the cookie should contain.

               If you need to set multiple cookies, you need to set them all using a single
               option and thus you need to concatenate them all in one single string. Set
               multiple cookies in one string like this: "name1=content1; name2=content2;" etc.

              Note that this option sets the cookie header explicitly in the outgoing request(s).
              If  multiple  requests  are  done  due  to authentication, followed redirections or
              similar, they will all get this cookie passed on.

              Using this option multiple times will only make  the  latest  string  override  the
              previous ones.
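               For example, sending two cookies in one request (the URL and cookie contents
               are placeholders):

```tcl
# Multiple cookies must be concatenated into a single string.
$curlHandle configure -url "http://example.com" \
    -cookie "name1=content1; name2=content2;"
```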

              Pass  a string as parameter. It should contain the name of your file holding cookie
               data. The cookie data may be in Netscape cookie data format or just regular HTTP-
              style headers dumped to a file.

              Given  an empty or non-existing file, this option will enable cookies for this curl
              handle, making it understand and parse  received  cookies  and  then  use  matching
              cookies in future requests.

              If you use this option multiple times, you add more files to read.

              Pass  a  file  name  in  which  TclCurl will dump all internally known cookies when
              curlHandle cleanup is called. If no cookies are known, no  file  will  be  created.
              Specify "-" to have the cookies written to stdout.

              Using  this  option  also enables cookies for this session, so if you, for example,
              follow a location it will make matching cookies get sent accordingly.

               If the file can't be created or written to, TclCurl will not and cannot report
               an error for this. Using 'verbose' will get a warning displayed, but that is the
               only visible feedback you get about this possibly lethal situation.

               Pass a 1 to mark this as a new cookie "session". It will force TclCurl to ignore
               all cookies it is about to load that are "session cookies" from the previous
               session. By default, TclCurl always stores and loads all cookies, independent of
               whether they are session cookies or not. Session cookies are cookies without an
               expiry date and they are meant to be alive and existing for this "session" only.

              Pass a string with a cookie. The cookie can be either in Netscape / Mozilla  format
              or  just  regular  HTTP-style header (Set-Cookie: ...) format. If the cookie engine
              was not enabled it will be enabled.  Passing a magic string "ALL"  will  erase  all
              known  cookies  while  "FLUSH"  will write all cookies known by TclCurl to the file
              specified by -cookiejar.
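               A sketch of a cookie-aware session (cookies.txt and the URL are placeholders;
               -cookiefile is an assumed name for the read-side cookie-file option described
               above):

```tcl
# Read cookies from a file (an empty or non-existing file still
# enables the cookie engine) and dump all known cookies on cleanup.
$curlHandle configure -url "http://example.com/login" \
    -cookiefile "cookies.txt" -cookiejar "cookies.txt"
catch {$curlHandle perform}
$curlHandle cleanup   ;# cookies are written to cookies.txt here
```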

               If set to 1, forces the HTTP request to get back to GET, usable if POST, PUT or
               a custom request has been used previously with the same handle.

              When setting httpget to 1, nobody will automatically be set to 0.

               Set to one of the values described below; they force TclCurl to use that
               specific HTTP version. It should only be used if you really MUST do that because
               of a silly remote server.

              none We  do not care about what version the library uses. TclCurl will use whatever
                   it thinks fit.

              1.0  Enforce HTTP 1.0 requests.

              1.1  Enforce HTTP 1.1 requests.

       -ignorecontentlength
               Ignore the Content-Length header. This is useful for Apache 1.x (and similar
               servers) which will report incorrect content length for files over 2 gigabytes.
               If this option is used, TclCurl will not be able to accurately report progress,
               and will simply stop the download when the server ends the connection.

       -contentdecoding
               Set to zero to disable content decoding. If set to 1 it is enabled. Note
               however that TclCurl has no default content decoding but requires you to use
               -encoding for that.

       -transferdecoding
               Set to zero to disable transfer decoding, if set to 1 it is enabled (default).
               TclCurl does chunked transfer decoding by default unless this option is set to
               zero.
FTP options

       -ftpport
               Pass a string as parameter. It will be used to get the IP address to use for
               the FTP PORT instruction. The PORT instruction tells the remote server to
               connect to our specified IP address. The string may be a plain IP address, a
               host name, a network interface name (under Unix) or just a '-' to let the
               library use your system's default IP address. Default FTP operations are
               passive and thus do not use PORT.

       -quote Pass a list with the FTP or SFTP commands to pass to the server prior to your
               FTP request. This will be done before any other FTP commands are issued (even
               before the CWD command). If you do not want to transfer any files, set -nobody
               to '1' and -header to '0'.

              Keep  in  mind  the  commands  to  send must be 'raw' ftp commands, for example, to
              create a directory you need to send mkd Test, not mkdir Test.

               Valid SFTP commands are: chgrp, chmod, chown, ln, mkdir, pwd, rename, rm,
               rmdir and symlink.
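
               For instance, a sketch of creating a remote directory with raw FTP commands
               before skipping the actual transfer (host and credentials are placeholders):

```tcl
package require TclCurl

set curlHandle [curl::init]
# Send raw "mkd" / "cwd" commands before the request; -nobody 1 and -header 0
# skip the file transfer itself, as described above.
$curlHandle configure -url "ftp://user:password@ftp.example.com/" \
    -quote [list "mkd Test" "cwd Test"] -nobody 1 -header 0
catch { $curlHandle perform }
$curlHandle cleanup
```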

       -postquote
               Pass a list with the FTP commands to pass to the server after your FTP transfer
               request. If you do not want to transfer any files, set -nobody to '1' and
               -header to
               '0'.
       -prequote
               Pass a list of FTP or SFTP commands to pass to the server after the transfer
               type is set.

       -dirlistonly
               A 1 tells the library to just list the names of files in a directory, instead
               of doing a full directory listing that would include file sizes, dates etc. It
               works with both FTP and SFTP URLs.

               This causes an FTP NLST command to be sent. Beware that some FTP servers list
               only files in their response to NLST; they might not include subdirectories and
               symbolic
               links.
       -append
               A 1 parameter tells the extension to append to the remote file instead of
               overwriting it. This is only useful when uploading to an FTP site.

       -ftpuseeprt
               Set to 1 to tell TclCurl to use the EPRT (and LPRT) command when doing active
               FTP downloads (which is enabled by -ftpport). Using EPRT means that it will
               first attempt to use EPRT and then LPRT before using PORT; if you pass zero to
               this option, it will not try using EPRT or LPRT, only plain PORT.

       -ftpuseepsv
               Set to 1 to tell TclCurl to use the EPSV command when doing passive FTP
               downloads (which it always does by default). Using EPSV means that it will
               first attempt to use EPSV before using PASV, but if you pass a zero to this
               option, it will not try using EPSV, only plain PASV.

       -ftpcreatemissingdirs
               If set to 1, TclCurl will attempt to create any remote directory that it fails
               to CWD into. CWD is the command that changes working directory.

               This setting also applies to SFTP connections. TclCurl will attempt to create
               the remote directory if it cannot obtain a handle to the target location. The
               creation will fail if a file of the same name as the directory to create
               already exists or lack of permissions prevents creation.

       -ftpresponsetimeout
               Causes TclCurl to set a timeout period (in seconds) on the amount of time that
               the server is allowed to take in order to generate a response message for a
               command before the session is considered hung. Note that while TclCurl is
               waiting for a response, this value overrides -timeout. It is recommended that
               if used in conjunction with -timeout, you set it to a value smaller than
               -timeout.

       -ftpalternativetouser
               Pass a string which will be used to authenticate if the usual FTP "USER user"
               and "PASS password" negotiation fails. This is currently only known to be
               required when connecting to Tumbleweed's Secure Transport FTPS server using
               client certificates for authentication.

       -ftpskippasvip
               If set to 1, it instructs TclCurl not to use the IP address the server suggests
               in its 227-response to TclCurl's PASV command when TclCurl connects the data
               connection. Instead TclCurl will reuse the same IP address it already uses for
               the control connection. But it will use the port number from the 227-response.

              This option has no effect if PORT, EPRT or EPSV is used instead of PASV.

       -ftpssl
               You can use ftps:// URLs to explicitly switch on SSL/TLS for the control
               connection and the data connection.

               Alternatively, and what seems to be the recommended way, you can set the option
               to one of these values:

               nope Do not attempt to use SSL.

               try  Try using SSL, proceed anyway otherwise.

               control
                    Use SSL for the control connection or fail with "use ssl failed" (64).

               all  Use SSL for all communication or fail with "use ssl failed" (64).
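
               A minimal sketch of requiring TLS on an otherwise plain ftp:// URL (the host is
               a placeholder, and the example assumes the -file option, which writes the
               fetched data to a local file):

```tcl
package require TclCurl

set curlHandle [curl::init]
# With -ftpssl all, the transfer fails with "use ssl failed" (64)
# if the server cannot do SSL/TLS.
$curlHandle configure -url "ftp://ftp.example.com/file.txt" \
    -ftpssl all -file "file.txt"
catch { $curlHandle perform } errorCode
$curlHandle cleanup
```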


       -ftpsslauth
               Pass TclCurl one of the values from below, to alter how TclCurl issues
               "AUTH TLS" or "AUTH SSL" when FTP over SSL is activated (see -ftpssl).

               You may need this option because of servers like BSDFTPD-SSL, which will not
               work properly when "AUTH SSL" is issued (although the server responds fine and
               everything) but require "AUTH TLS" instead.

               default
                    Allows TclCurl to decide.

               ssl  Try "AUTH SSL" first, and only if that fails try "AUTH TLS".

               tls  Try "AUTH TLS" first, and only if that fails try "AUTH SSL".

       -ftpsslccc
               Set it to make TclCurl use CCC (Clear Command Channel). It shuts down the
               SSL/TLS layer after authenticating. The rest of the control channel
               communication will be unencrypted. This allows NAT routers to follow the FTP
               transaction. Possible values are:

               none Do not attempt to use CCC.

               passive
                    Do not initiate the shutdown, wait for the server to do it. Do not send a
                    reply.

               active
                    Initiate the shutdown and wait for a reply.

       -ftpaccount
               Pass a string (or "" to disable). When an FTP server asks for "account data"
               after user name and password have been provided, this data is sent off using
               the ACCT command.

       -ftpfilemethod
               It allows three values:

               multicwd
                    The default. TclCurl will do a single CWD operation for each path part in
                    the given URL. For deep hierarchies this means very many commands. This is
                    how RFC 1738 says it should be done.

               nocwd
                    No CWD at all is done. TclCurl will do SIZE, RETR, STOR, etc. and give a
                    full path to the server.

               singlecwd
                    Make one CWD with the full target directory and then operate on the file
                    "normally". This is somewhat more standards compliant than 'nocwd' but
                    without the full penalty of 'multicwd'.

Protocol options

       -transfertext
               A 1 tells the extension to use ASCII mode for FTP transfers, instead of the
               default binary transfer. For win32 systems it does not set stdout to binary
               mode. This option can be useful when transferring text data between systems
               with different views on certain characters, such as newlines or similar.

               NOTE: TclCurl does not do a complete ASCII conversion when doing ASCII
               transfers over FTP. This is a known limitation/flaw that nobody has rectified.
               TclCurl simply sets the mode to ASCII and performs a standard transfer.

       -proxytransfermode
               If set to 1, TclCurl sets the transfer mode (binary or ASCII) for FTP transfers
               done via an HTTP proxy, by appending ;type=a or ;type=i to the URL. Without
               this setting, or with it set to 0 (the default), -transfertext has no effect
               when doing FTP via a proxy. Beware that not all proxies support this feature.

       -crlf  Convert unix newlines to CRLF newlines on FTP transfers.

       -range Pass  a  string as parameter, which should contain the specified range you want. It
               should be in the format X-Y, where X or Y may be left out. HTTP transfers also
               support several intervals, separated with commas as in X-Y,N-M. Using this kind
               of multiple intervals will cause the HTTP server to send the response document
               in pieces (using standard MIME separation techniques).

              Ranges only work on HTTP, FTP and FILE transfers.
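
               For example, fetching only the first 500 bytes of a document into a Tcl
               variable (placeholder URL; -bodyvar is described under "Other options"):

```tcl
package require TclCurl

set curlHandle [curl::init]
# "0-499" asks the server for the first 500 bytes only.
$curlHandle configure -url "http://example.com/big.html" -range "0-499" \
    -bodyvar firstChunk
catch { $curlHandle perform }
$curlHandle cleanup
```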

       -resumefrom
               Pass the offset in number of bytes that you want the transfer to start from.
               Set this option to 0 to make the transfer start from the beginning (effectively
               disabling resume).

               For FTP, set this option to -1 to make the transfer start from the end of the
               target file (useful to continue an interrupted upload).

       -customrequest
               Pass a string as parameter. It will be used instead of GET or HEAD when doing
               the HTTP request. This is useful for doing DELETE or other more obscure HTTP
               requests. Do not do this at will, make sure your server supports the command
               first.

               Note that TclCurl will still act as if the keyword it would otherwise have used
               were in effect, and behave accordingly. Thus, changing this to a HEAD when
               TclCurl otherwise would do a GET might cause TclCurl to act funny, and similar.
               To switch to a proper HEAD, use -nobody, to switch to a proper POST, use
               -post or -postfields and so on.
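
               A sketch of an HTTP DELETE request built on this option (hypothetical URL;
               whether the server honours DELETE is up to the server):

```tcl
package require TclCurl

set curlHandle [curl::init]
# Replace the method keyword; everything else behaves as a normal GET.
$curlHandle configure -url "http://example.com/resource/42" \
    -customrequest "DELETE"
catch { $curlHandle perform } errorCode
$curlHandle cleanup
```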

       -filetime
               If you pass a 1, TclCurl will attempt to get the modification date of the
               remote document in this operation. This requires that the remote server sends
               the time or replies to a time querying command. The getinfo procedure with the
               filetime argument can be used after a transfer to extract the received time (if
               any).

       -nobody
               A 1 tells the library not to include the body part in the output. This is only
               relevant for protocols that have separate header and body parts. On HTTP(S)
               servers, this will make TclCurl do a HEAD request.

               To change the request back to GET, you should use -httpget. Change the request
               to POST with -post.

       -infilesize
               When uploading a file to a remote site, this option should be used to tell
               TclCurl what the expected size of the infile is.

               This option is mandatory for uploading using SCP.

       -upload
               A 1 tells the library to prepare for an upload. The -infile and -infilesize
               options are also interesting for uploads. If the protocol is HTTP, uploading
               means using the PUT request unless you tell TclCurl otherwise.

               Using PUT with HTTP 1.1 implies the use of an "Expect: 100-continue" header.
               You can disable this header with -httpheader as usual.

               If you use PUT to an HTTP 1.1 server, you can upload data without knowing the
               size before starting the transfer if you use chunked encoding. You enable this
               by adding a header like "Transfer-Encoding: chunked" with -httpheader. With
               HTTP 1.0 or without chunked transfer, you must specify the size.
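
               An FTP upload might be sketched like this (placeholder host and credentials;
               the example assumes the -infile option names the local file to read from):

```tcl
package require TclCurl

set local "local.bin"
set curlHandle [curl::init]
# Tell TclCurl the expected size up front; this is mandatory for SCP
# and good practice for FTP uploads.
$curlHandle configure -url "ftp://user:password@ftp.example.com/remote.bin" \
    -upload 1 -infile $local -infilesize [file size $local]
catch { $curlHandle perform }
$curlHandle cleanup
```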

       -maxfilesize
               This allows you to specify the maximum size (in bytes) of a file to download.
               If the file requested is larger than this value, the transfer will not start
               and error 'filesize exceeded' (63) will be returned.

              NOTE:  The file size is not always known prior to download, and for such files this
              option has no effect even if the file transfer ends up being larger than this given
              limit. This concerns both FTP and HTTP transfers.

       -timecondition
               This defines how the -timevalue value is treated. You can set this parameter to
               ifmodsince or ifunmodsince. This feature applies to HTTP and FTP.

       -timevalue
               This should be the time in seconds since 1 Jan 1970, and the time will be used
               in a condition as specified with -timecondition.
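
               Combined, the two options give a conditional fetch; for instance (placeholder
               URL):

```tcl
package require TclCurl

set curlHandle [curl::init]
# Only transfer the document if it changed since the given epoch time.
$curlHandle configure -url "http://example.com/index.html" \
    -timecondition ifmodsince -timevalue [clock scan "2009-01-01"] \
    -bodyvar page
catch { $curlHandle perform }
$curlHandle cleanup
```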

Connection options

       -timeout
               Pass the maximum time in seconds that you allow the TclCurl transfer operation
               to take. Do note that normally, name lookups may take a considerable time and
               that limiting the operation to less than a few minutes risks aborting perfectly
               normal operations. This option will cause libcurl to use SIGALRM to enable
               time-outing system calls.

               In unix-like systems, this might cause signals to be used unless -nosignal is
               set.

       -timeoutms
               Like -timeout but takes a number of milliseconds instead. If libcurl is built
               to use the standard system name resolver, that part will still use full-second
               resolution for timeouts.

       -lowspeedlimit
               Pass the speed in bytes per second that the transfer should be below during
               -lowspeedtime seconds for the extension to consider it too slow and abort.

       -lowspeedtime
               Pass the time in seconds that the transfer should be below the -lowspeedlimit
               for the extension to consider it too slow and abort.

       -maxsendspeed
               Pass a speed in bytes per second. If an upload exceeds this speed on cumulative
               average during the transfer, the transfer will pause to keep the average rate
               less than or equal to the parameter value. Defaults to unlimited speed.

       -maxrecvspeed
               Pass a speed in bytes per second. If a download exceeds this speed on
               cumulative average during the transfer, the transfer will pause to keep the
               average rate less than or equal to the parameter value. Defaults to unlimited
               speed.

       -maxconnects
               Sets the persistent connection cache size in all the protocols that support
               persistent connections. The set amount will be the maximum number of
               simultaneous connections that TclCurl may cache in this easy handle. Default is
               5, and there isn't much point in changing this value unless you are perfectly
               aware of how this works and changes TclCurl's behaviour.

               When reaching the maximum limit, TclCurl closes the oldest connection in the
               cache to prevent the number of open connections from increasing.

               Note: if you have already performed transfers with this curl handle, setting a
               smaller maxconnects than before may cause open connections to unnecessarily get
               closed.

               Note that if you add this easy handle to a multi handle, this setting is not
               acknowledged; instead you must configure the multi handle's own maxconnects
               option.

       -connecttimeout
               Maximum time in seconds that you allow the connection to the server to take.
               This only limits the connection phase; once it has connected, this option is of
               no more use. Set to zero to disable the connection timeout (it will then only
               time out on the internal timeouts).

               In unix-like systems, this might cause signals to be used unless -nosignal is
               set.

       -connecttimeoutms
               Like -connecttimeout but takes a number of milliseconds instead. If libcurl is
               built to use the standard system name resolver, that part will still use
               full-second resolution for timeouts.
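
               The connection-phase and whole-transfer timeouts are typically combined; a
               sketch (placeholder URL):

```tcl
package require TclCurl

set curlHandle [curl::init]
# Give up if connecting takes over 10 s, or the whole transfer over 60 s;
# -nosignal avoids SIGALRM-based timeouts, as noted above.
$curlHandle configure -url "http://example.com/slow" \
    -connecttimeout 10 -timeout 60 -nosignal 1
catch { $curlHandle perform }
$curlHandle cleanup
```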

       -ipresolve
               Allows an application to select what kind of IP addresses to use when resolving
               host names. This is only interesting when using host names that resolve to
               addresses using more than one version of IP. The allowed values are:

               whatever
                    Default, resolves addresses to all IP versions that your system allows.

               v4   Resolve to IPv4 addresses.

               v6   Resolve to IPv6 addresses.

SSL and security options

       -sslcert
               Pass a string as parameter. The string should be the file name of your
               certificate. The default format is "PEM" and can be changed with -sslcerttype.

               With NSS this is the nickname of the certificate you wish to authenticate with.

       -sslcerttype
               Pass a string as parameter. The string should be the format of your
               certificate. Supported formats are "PEM" and "DER".

       -sslkey
               Pass a string as parameter. The string should be the file name of your private
               key. The default format is "PEM" and can be changed with -sslkeytype.

       -sslkeytype
               Pass a string as parameter. The string should be the format of your private
               key. Supported formats are "PEM", "DER" and "ENG".

               NOTE: The format "ENG" enables you to load the private key from a crypto
               engine. In this case -sslkey is used as an identifier passed to the engine. You
               have to set the crypto engine with -sslengine. The "DER" format key file
               currently does not work because of a bug in OpenSSL.

       -keypasswd
               Pass a string as parameter. It will be used as the password required to use the
               -sslkey or -sshprivatekeyfile private key.

               You never need a pass phrase to load a certificate but you need one to load
               your private key.

              This option used to be known as -sslkeypasswd and -sslcertpasswd.

       -sslengine
               Pass a string as parameter. It will be used as the identifier for the crypto
               engine you want to use for your private key.

               NOTE: If the crypto device cannot be loaded, an error will be returned.

       -sslenginedefault
               Pass a 1 to set the actual crypto engine as the default for (asymmetric) crypto
               operations.

               NOTE: If the crypto device cannot be set, an error will be returned.

       -sslversion
               Use it to set what version of SSL/TLS to use. The available options are:

               default
                    The default action. This will attempt to figure out the remote SSL
                    protocol version, i.e. either SSLv3 or TLSv1 (but not SSLv2, which became
                    disabled by default with 7.18.1).

               tlsv1
                    Force TLSv1.

               sslv2
                    Force SSLv2.

               sslv3
                    Force SSLv3.

       -sslverifypeer
               This option determines whether TclCurl verifies the authenticity of the peer's
               certificate. A 1 means it verifies; zero means it doesn't. The default is 1.

              When negotiating an SSL connection, the server sends a certificate  indicating  its
              identity.  TclCurl verifies whether the certificate is authentic, i.e. that you can
              trust that the server is who the certificate says it is. This trust is based  on  a
              chain  of  digital  signatures, rooted in certification authority (CA) certificates
              you supply.

              TclCurl uses a default bundle of CA certificates that comes with  libcurl  but  you
              can specify alternate certificates with the -cainfo or the -capath options.

              When  -sslverifypeer  is  nonzero,  and  the  verification  fails to prove that the
              certificate is authentic, the connection  fails.  When  the  option  is  zero,  the
              connection succeeds regardless.

               Authenticating the certificate is not by itself very useful. You typically want
               to ensure that the server, as authentically identified by its certificate, is
               the server you mean to be talking to; use -sslverifyhost to control that.

       -cainfo
               Pass the name of a file holding the certificate to verify the peer with. This
               only makes sense when used in combination with the -sslverifypeer option; if
               that is set to zero, -cainfo need not even indicate an accessible file.

              When  built  against  NSS  this  is the directory that the NSS certificate database
              resides in.

       -issuercert
               Pass a string naming a file holding a CA certificate in PEM format. If the
               option is set, an additional check against the peer certificate is performed to
               verify that the issuer is indeed the one associated with the certificate
               provided by the option. This additional check is useful in multi-level PKI
               where one needs to enforce that the peer certificate is from a specific branch
               of the tree.

              This option makes sense only when  used  in  combination  with  the  -sslverifypeer
              option. Otherwise, the result of the check is not considered as failure.

       -capath
               Pass the directory holding multiple CA certificates to verify the peer with.
               The certificate directory must be prepared using the openssl c_rehash utility.
               This only makes sense when used in combination with the -sslverifypeer option;
               if that is set to zero, -capath need not even indicate an accessible path.

              This option apparently does not work in Windows due to some limitation in openssl.

              This option is OpenSSL-specific and does nothing if libcurl is built to use GnuTLS.

       -crlfile
               Pass a string naming a file with the concatenation of CRLs (in PEM format) to
               use in the certificate validation that occurs during the SSL exchange.

               When libcurl is built to use NSS or GnuTLS, there is no way to influence the
               use of the CRL passed to help in the verification process. When built with
               OpenSSL support, X509_V_FLAG_CRL_CHECK and X509_V_FLAG_CRL_CHECK_ALL are both
               set, requiring a CRL check against all the elements of the certificate chain if
               a CRL file is passed.

               This option makes sense only when used in combination with the -sslverifypeer
               option.

       -randomfile
               Pass a file name. The file will be used to read from to seed the random engine
               for SSL. The more random the specified file is, the more secure the SSL
               connection will be.

       -egdsocket
               Pass a path name to the Entropy Gathering Daemon socket. It will be used to
               seed the random engine for SSL.

       -sslverifyhost
               This option determines whether TclCurl verifies that the server claims to be
               who you want it to be.

               When negotiating an SSL connection, the server sends a certificate indicating
               its identity.
              When -sslverifyhost is set to 2, that certificate must indicate that the server  is
              the server to which you meant to connect, or the connection fails.

              TclCurl  considers  the  server  the  intended  one when the Common Name field or a
              Subject Alternate Name field in the certificate matches the host name in the URL to
              which you told Curl to connect.

              When  set  to  1, the certificate must contain a Common Name field, but it does not
              matter what name it says. (This is not ordinarily a useful setting).

               When the value is 0, the connection succeeds regardless of the names in the
               certificate.

               The default is 2.

               This option controls checking the identity that the server claims. The server
               could be lying. To control lying, see -sslverifypeer.
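
               Putting the peer and host checks together for a strict HTTPS fetch (placeholder
               URL; the CA bundle path is an assumption for the example):

```tcl
package require TclCurl

set curlHandle [curl::init]
# Verify the certificate chain against our CA bundle and require that the
# certificate actually names the host we asked for.
$curlHandle configure -url "https://example.com/" \
    -cainfo "/etc/ssl/certs/ca-certificates.crt" \
    -sslverifypeer 1 -sslverifyhost 2
catch { $curlHandle perform } errorCode
$curlHandle cleanup
```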

       -sslcipherlist
               Pass a string holding the list of ciphers to use for the SSL connection. The
               list must consist of one or more cipher strings separated by colons. Commas or
               spaces are also acceptable separators but colons are normally used; "!", "-"
               and "+" can be used as operators.

               For OpenSSL and GnuTLS valid examples of cipher lists include 'RC4-SHA',
               'SHA1+DES', 'TLSv1' and 'DEFAULT'. The default list is normally set when you
               compile OpenSSL.

              You will find more details about cipher lists on this URL:

              For    NSS    valid   examples   of   cipher   lists   include   'rsa_rc4_128_md5',
              'rsa_aes_128_sha', etc. With NSS you don't add/remove  ciphers.  If  you  use  this
              option then all known ciphers are disabled and only those passed in are enabled.

              You'll find more details about the NSS cipher lists on this URL:

       -sslsessionidcache
               Pass a 0 to disable TclCurl's use of SSL session-ID caching or a 1 to enable
               it. By default all transfers are done using the cache. Note that while nothing
               should ever get hurt by attempting to reuse SSL session-IDs, there seem to be
               broken SSL implementations in the wild that may require you to disable this in
               order for you to succeed.

       -krblevel
               Set the kerberos security level for FTP; this also enables kerberos awareness.
               This is a string: 'clear', 'safe', 'confidential' or 'private'. If the string
               is set but does not match one of these, 'private' will be used. Set the string
               to "" to disable kerberos support for FTP.

SSH options

       -sshauthtypes
               The allowed types are:

               publickey

               password

               host

               keyboard

               any  To let TclCurl pick one.

       -sshhostpublickeymd5
               Pass a string containing 32 hexadecimal digits. The string should be the 128
               bit MD5 checksum of the remote host's public key, and TclCurl will reject the
               connection to the host unless the md5sums match. This option is only for SCP
               and SFTP transfers.

       -sshpublickeyfile
               Pass the file name for your public key. If not used, TclCurl defaults to using
               ~/.ssh/id_dsa.pub.

       -sshprivatekeyfile
               Pass the file name for your private key. If not used, TclCurl defaults to
               using ~/.ssh/id_dsa. If the file is password-protected, set the password with
               -keypasswd.
Other options

       -headervar
               Name of the Tcl array variable where TclCurl will store the headers returned by
               the server.

               When a server sends a chunked encoded transfer, it may contain a trailer. That
               trailer is identical to an HTTP header, and if such a trailer is received it is
               passed to the application in the same way. There are several ways to detect it
               being a trailer and not an ordinary header: 1) it comes after the
               response-body, 2) it comes after the final header line (CR LF), 3) a Trailer:
               header among the response-headers mentions what header to expect in the
               trailer.

       -bodyvar
               Name of the Tcl variable where TclCurl will store the file requested; the file
               may contain text or binary data.
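
               A minimal complete fetch into Tcl variables might look like this (placeholder
               URL):

```tcl
package require TclCurl

set curlHandle [curl::init]
$curlHandle configure -url "http://example.com/" -headervar meta -bodyvar html
if { [catch { $curlHandle perform } errorCode] } {
    puts "transfer failed ($errorCode)"
} else {
    # The headers end up in the 'meta' array, the document in 'html'.
    puts "got [string length $html] bytes"
}
$curlHandle cleanup
```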

       -canceltransvarname
               Name of a Tcl variable, in case you have defined a procedure to call with
               -progressproc; setting this variable to '1' will cancel the transfer.

       -command
               Executes the given command after the transfer is done. Since it only works with
               blocking transfers, it is pretty much useless.

       -share Pass a share handle as a parameter. The share handle must have been created by
               a previous call to curl::shareinit. Setting this option will make this handle
               use the data from the shared handle instead of keeping the data to itself. See
               tclcurl_share for details.

       -newfileperms
               Pass a number as a parameter, containing the value of the permissions that will
               be assigned to newly created files on the remote server. The default value is
               0644, but any valid value can be used. The only protocols that can use this are
               sftp://, scp:// and file://.

       -newdirectoryperms
               Pass a number as a parameter, containing the value of the permissions that will
               be assigned to newly created directories on the remote server. The default
               value is 0755, but any valid value can be used. The only protocols that can use
               this are sftp://, scp:// and file://.


        Some of the options libcurl offers are not supported; I don't think them worth
        supporting in TclCurl, but if you need one of them don't forget to complain.


curlHandle perform

       This  procedure  is  called  after the init and all the configure calls are made, and will
       perform the transfer as described in the options.

       It must be called with the same curlHandle curl::init  call  returned.   You  can  do  any
       amount  of  calls  to  perform while using the same handle. If you intend to transfer more
        than one file, you are even encouraged to do so. TclCurl will then attempt to re-use
        the same connection for the following transfers, thus making the operations faster,
        less CPU intensive and using less network resources. Just note that you will have to
        use configure between the invokes to set options for the following perform.

       You  must  never call this procedure simultaneously from two places using the same handle.
       Let it return first before invoking it another time. If you want parallel  transfers,  you
       must use several curl handles.
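
        Re-using one handle for sequential transfers, with the error code turned into a
        message via curl::easystrerror, might be sketched as follows (placeholder URLs; the
        example assumes the -file option, which writes each document to a local file):

```tcl
package require TclCurl

set curlHandle [curl::init]
foreach url {http://example.com/a.html http://example.com/b.html} {
    # Reconfigure between performs; the connection can be re-used.
    $curlHandle configure -url $url -file [file tail $url]
    if { [catch { $curlHandle perform } errorCode] } {
        puts "$url: [curl::easystrerror $errorCode]"
    }
}
$curlHandle cleanup
```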

               '0' if all went well, non-zero if it didn't. In case of error, if the
               errorbuffer was set with configure there will be a readable error message. The
               error codes are:
       1      Unsupported protocol. This build of TclCurl has no support for this protocol.

        2      Very early initialization code failed. This is likely to be an internal error
               or problem.
       3      URL malformat. The syntax was not correct.

       5      Couldn't resolve proxy. The given proxy host could not be resolved.

       6      Couldn't resolve host. The given remote host was not resolved.

       7      Failed to connect to host or proxy.

       8      FTP weird server reply. The server sent data TclCurl  couldn't  parse.   The  given
              remote server is probably not an OK FTP server.

       9      We  were denied access to the resource given in the URL. For FTP, this occurs while
              trying to change to the remote directory.

       11     FTP weird PASS reply. TclCurl couldn't parse the reply sent to the PASS request.

        13     FTP weird PASV reply, TclCurl couldn't parse the reply sent to the PASV or
               EPSV request.
       14     FTP weird 227 format. TclCurl couldn't parse the 227-line the server sent.

       15     FTP can't get host. Couldn't resolve the host IP we got in the 227-line.

       17     FTP couldn't set type. Couldn't change transfer method to either binary or ascii.

        18     Partial file. Only a part of the file was transferred; this happens when the
               server first reports an expected transfer size and then delivers data that
               doesn't match the given size.

       19     FTP  couldn't  RETR file, we either got a weird reply to a 'RETR' command or a zero
              byte transfer.

       21     Quote error. A custom 'QUOTE' returned error  code  400  or  higher  (for  FTP)  or
              otherwise indicated unsuccessful completion of the command.

       22     HTTP  returned error. This return code only appears if -failonerror is used and the
              HTTP server returns an error code that is 400 or higher.

       23     Write error. TclCurl couldn't write data to a local  filesystem  or  an  error  was
              returned from a write callback.

       25     Upload failed. For FTP, the server typically denied the STOR command. The
              error buffer usually contains the server's explanation.

       26     Read error. There was a problem reading from a local file or an error was  returned
              from the read callback.

       27     Out  of memory. A memory allocation request failed. This should never happen unless
              something weird is going on in your computer.

       28     Operation timeout. The specified time-out period was reached according to the
              conditions.

       30     The FTP PORT command failed. Not all FTP servers support the PORT command;
              try doing the transfer using PASV instead.

       31     FTP couldn't use REST. This command is used for resumed FTP transfers.

       33     Range error. The server doesn't support or accept range requests.

       34     HTTP post error. Internal post-request generation error.

       35     SSL connect error. The SSL handshaking failed; the error buffer may have a
              clue to the reason (certificates, passwords, ...).

       36     FTP  bad  download  resume. Couldn't continue an earlier aborted download, probably
              because you are trying to resume beyond the file size.

       37     A file given with FILE:// couldn't be read. Did you check the permissions?

       38     LDAP cannot bind. LDAP bind operation failed.

       39     LDAP search failed.

       41     A required zlib function was not found.

       42     Aborted by callback. An application told TclCurl to abort the operation.

       43     Internal error. A function was called with a bad parameter.

       45     Interface error. A specified outgoing interface could not be used.

       47     Too many redirects. When following redirects, TclCurl hit the maximum  amount,  set
              your limit with --maxredirs

       48     Unknown TELNET option specified.

       49     A telnet option string was illegally formatted.

       51     The remote peer's SSL certificate or SSH MD5 fingerprint wasn't OK.

       52     The server didn't reply anything, which here is considered an error.

       53     The specified crypto engine wasn't found.

       54     Failed setting the selected SSL crypto engine as default!

       55     Failed sending network data.

       56     Failure with receiving network data.

       58     Problem with the local client certificate.

       59     Couldn't use specified SSL cipher.

       60     Peer certificate cannot be authenticated with known CA certificates.

       61     Unrecognized transfer encoding.

       62     Invalid LDAP URL.

       63     Maximum file size exceeded.

       64     SSL use failed.

       65     Sending the data requires a rewind that failed. Since TclCurl should take care
              of this for you, it means you have found a bug.

       66     Failed to initialize the SSL engine.

       67     Failed to log in. The user name, password, or similar was not accepted.

       68     File not found on TFTP server.

       69     There is a permission problem with the TFTP request.

       70     The remote server has run out of space.

       71     Illegal TFTP operation.

       72     Unknown transfer ID.

       73     TFTP file already exists and will not be overwritten.

       74     No such user on the TFTP server. A well-behaved TFTP server should never
              return this error.

       75     Character conversion failed.

       77     Problem with reading the SSL CA cert (path? access rights?).

       78     Remote file not found.

       79     Error from the SSH layer.

       80     Failed to shut down the SSL connection.

       82     Failed to load the CRL file.

       83     Issuer check failed.

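       As a sketch of how these codes surface in practice, the fragment below runs a
       transfer and reports any failure. The URL and file name are placeholders, and it is
       assumed here that perform raises a Tcl error carrying the numeric code, which catch
       captures:

```tcl
package require TclCurl

set curlHandle [curl::init]
# -errorbuffer names a Tcl variable that will receive a readable message
$curlHandle configure -url "http://example.com/" -file "index.html" \
    -errorbuffer errorMsg

# A failed perform leaves the numeric error code in $code
if {[catch {$curlHandle perform} code]} {
    puts "transfer failed ($code): [curl::easystrerror $code]"
    puts "details: $errorMsg"
}
$curlHandle cleanup
```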
curlHandle getinfo option

       Request internal information from the curl session with this procedure. It is
       intended to be used after a performed transfer, and its results can be relied upon
       only if perform returned 0.

       The following information can be extracted:

              Returns the last used effective URL.

              Returns the last received HTTP or FTP code. This will be zero if no server response
              code  has  been  received. Note that a proxy's CONNECT response should be read with
              httpconnectcode and not this.

              Returns the last received proxy response code to a CONNECT request.

              Returns the remote time of the retrieved document (as the number of seconds
              since 1 Jan 1970 in the GMT/UTC time zone). If you get -1, the time of the
              document is unknown; this can happen for many reasons (the server hides it, or
              doesn't support the command that tells document time, etc.).

              In order for this to work you have to set the -filetime option before the transfer.

              Returns the time, in seconds, it took from the start until the name resolving
              was completed.

              Returns the time, in seconds, it took from the  start  until  the  connect  to  the
              remote host (or proxy) was completed.

              Returns the time, in seconds, it took from the start until the SSL/SSH
              connect/handshake to the remote host was completed. This time is most often
              very near to the PRETRANSFER time, except for cases such as HTTP pipelining,
              where the pretransfer time can be delayed due to waits in line for the
              pipeline and more.

              Returns the time, in seconds, it took from the start until  the  file  transfer  is
              just  about to begin. This includes all pre-transfer commands and negotiations that
              are specific to the particular protocol(s) involved.

              Returns the time, in seconds, it took from the start until the first byte is
              just about to be transferred. This includes the pretransfer time, and also the
              time the server needs to calculate the result.

              Returns the  total  transaction  time,  in  seconds,  for  the  previous  transfer,
              including name resolving, TCP connect etc.

              Returns the URL a redirect would take you to if you enable followlocation.
              This can come in very handy if you think the built-in libcurl redirect logic
              isn't good enough for you but you would still prefer to avoid implementing all
              the magic of figuring out the new URL.

              Returns the total time, in seconds, it took for all redirection steps,
              including name lookup, connect, pretransfer and transfer, before the final
              transaction was started. That is, it returns the complete execution time for
              multiple redirections, and zero if no redirections were needed.

              Returns the total number of redirections that were actually followed.

              Returns  how  many  new  connections  TclCurl had to create to achieve the previous
              transfer (only the successful connects are counted).  Combined  with  redirectcount
              you  are  able  to  know  how  many  times  TclCurl  successfully  reused  existing
              connection(s) or not. See the Connection Options of setopt to see how TclCurl tries
              to make persistent connections to save time.

              Returns  the  IP address of the most recent connection done with this handle.  This
              string may be IPv6 if that's enabled.

              Returns the total number of bytes that were uploaded.

              Returns the total number of bytes that were downloaded. The amount is only for
              the latest transfer and will be reset again for each new transfer.

              Returns the average download speed, measured in bytes/second, for the complete
              download.

              Returns the average upload speed, measured in bytes/second, for the complete
              upload.

              Returns the total size in bytes of all the headers received.

              Returns  the  total  size  of  the  issued  requests.  This is so far only for HTTP
              requests. Note that this may be more than one request if followLocation is true.

              Returns the result of the certification verification that was requested (using  the
              -sslverifypeer option to configure).

              Returns  a  list  of  the  OpenSSL  crypto-engines supported. Note that engines are
              normally implemented in separate dynamic libraries.  Hence  not  all  the  returned
              engines may be available at run-time.

              Returns  the  content-length  of  the  download.  This  is  the value read from the
              Content-Length: field.

              Returns the specified size of the upload.

              Returns the content-type of the downloaded object. This is the value read from  the
              Content-Type: field. If you get an empty string, it  means the server didn't send a
              valid Content-Type header or that the protocol used doesn't support this.

              Returns a list with the authentication method(s) available.

              Returns a list with the authentication method(s) available for your proxy
              authentication.

              Returns the errno value from a connect failure.

              Returns  a  list  of all cookies TclCurl knows (expired ones, too). If there are no
              cookies (cookies for the handle have not been enabled  or  simply  none  have  been
              received) the list will be empty.

              Returns a string holding the entry path: the initial path TclCurl ended up in
              when logging on to the remote FTP server. Returns an empty string if something
              is wrong.

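       As an illustration only, a transfer followed by a few getinfo queries might look as
       follows. The option names used (responsecode, totaltime, contenttype) are
       assumptions patterned on libcurl's CURLINFO names; check your installed version for
       the exact spellings:

```tcl
package require TclCurl

set h [curl::init]
$h configure -url "http://example.com/" -file "index.html"
if {![catch {$h perform}]} {
    # getinfo results are only reliable after perform returned 0
    puts "HTTP code:  [$h getinfo responsecode]"
    puts "total time: [$h getinfo totaltime] s"
    puts "type:       [$h getinfo contenttype]"
}
$h cleanup
```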
curlHandle cleanup

       This  procedure must be the last one to call for a curl session. It is the opposite of the
       curl::init procedure and must  be  called  with  the  same  curlhandle  as  input  as  the
       curl::init  call  returned.   This will effectively close all connections TclCurl has used
       and possibly has kept open until now. Don't call this procedure if you intend to  transfer
       more files.

curlHandle reset

       Re-initializes all options previously set on a specified handle to the default values.

       This puts the handle back into the same state it was in when it was just created with
       curl::init.

       It does not change the following information kept in the  handle:  live  connections,  the
       Session ID cache, the DNS cache, the cookies and shares.

curlHandle duphandle

       This  procedure  will  return  a  new  curl  handle,  a  duplicate,  using all the options
       previously set  in  the  input  curl  handle.   Both  handles  can  subsequently  be  used
       independently  and  they must both be freed with cleanup.  The new handle will not inherit
       any state information, connections, SSL sessions or cookies.

              A new curl handle or an error message if the copy fails.
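       A minimal sketch of duplicating a configured handle (URLs and file names are
       placeholders); note that each handle needs its own cleanup:

```tcl
package require TclCurl

set first [curl::init]
$first configure -url "http://example.com/one" -file "one.html"
catch {$first perform}

# The duplicate inherits the options set above, but none of the
# connections, SSL sessions or cookies of the original handle.
set second [$first duphandle]
$second configure -url "http://example.com/two" -file "two.html"
catch {$second perform}

$first cleanup
$second cleanup
```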

curlHandle pause

       You can use this command from within a progress callback procedure to pause the transfer.

curlHandle resume

       Resumes a transfer paused with curlHandle pause.


curl::transfer ?options?

       In case you do not want to use persistent connections you can use this command. It
       takes the same arguments as curlHandle configure and will init, configure, perform
       and clean up a connection for you.

       You can also get the getinfo information by using -infooption variable pairs; after
       the transfer, the variable will contain the value that would have been returned by
       $curlHandle getinfo option.

              The same error code perform would return.
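       A sketch of such a one-shot transfer; the -responsecode status pair follows the
       -infooption variable convention described above (the exact option name is an
       assumption), and the URL is a placeholder:

```tcl
package require TclCurl

# init, configure, perform and cleanup in a single call; no connection
# is kept alive afterwards.
set rc [curl::transfer -url "http://example.com/" -file "index.html" \
            -responsecode status]
puts "transfer returned $rc, HTTP status $status"
```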


curl::version

       Returns a string with the version number of TclCurl, libcurl and some of its
       important components (like the OpenSSL version).

              The string with the version info.

curl::escape url

       This procedure will convert the given input string to a URL-encoded string and return
       it. All input characters that are not a-z, A-Z or 0-9 will be converted to their "URL
       escaped" version (%NN where NN is a two-digit hexadecimal number).

              The converted string.

curl::unescape url

       This  procedure  will  convert  the given URL encoded input string to a "plain string" and
       return that. All input characters that are URL  encoded  (%XX  where  XX  is  a  two-digit
       hexadecimal number) will be converted to their plain text versions.

              The string unencoded.
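       For example, round-tripping a string through the two procedures:

```tcl
package require TclCurl

set encoded [curl::escape "a string with spaces & symbols"]
puts $encoded                     ;# every reserved byte becomes %NN

set decoded [curl::unescape $encoded]
puts $decoded                     ;# back to the original string
```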

curl::curlConfig option

       Returns some information about how cURL is installed.

              Returns the directory root where cURL is installed.

              Returns a list containing particular main features the installed libcurl was
              built with. The list may include SSL, KRB4 or IPv6; do not assume any
              particular order.

              Outputs version information about the installed libcurl, in numerical mode:
              the version number, in hexadecimal, with 8 bits for each part (major, minor,
              patch), so that libcurl 7.7.4 would appear as 070704 and libcurl 12.13.14
              would appear as 0c0d0e...

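       Decoding that hexadecimal number in Tcl is straightforward. The helper below is a
       sketch (fetching the value via curl::curlConfig, and the -vernum option name, are
       assumptions), shown here on the documented sample 070704:

```tcl
# Split the 8-bits-per-part hex version into major.minor.patch.
proc decodeVernum {hex} {
    scan $hex %x num
    format "%d.%d.%d" \
        [expr {($num >> 16) & 0xff}] \
        [expr {($num >> 8)  & 0xff}] \
        [expr {$num & 0xff}]
}

puts [decodeVernum 070704]   ;# prints 7.7.4
```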
curl::versioninfo option

       Returns information about various run-time features in TclCurl.

       Applications  should  use  this  information to judge if things are possible to do or not,
       instead of using compile-time checks, as dynamic/DLL libraries can be changed  independent
       of applications.

              Returns the version of libcurl we are using.

              Returns the version of libcurl we are using in hexadecimal, with 8 bits for
              each part (major, minor, patch), so that libcurl 7.7.4 would appear as 070704
              and libcurl 12.13.14 would appear as 0c0d0e... Note that the initial zero
              might be omitted.

       -host  Returns a string with the host information as discovered by a configure  script  or
              set by the build environment.

              Returns a list with the features compiled into libcurl, the possible elements are:

                    Libcurl was built with support for asynchronous name lookups, which
                    allows more exact timeouts (even on Windows) and less blocking when
                    using the multi interface.
              CONV Libcurl was built with support for character conversions.

                   Libcurl  was  built  with extra debug capabilities built-in. This is mainly of
                   interest for libcurl hackers.

                   Supports HTTP GSS-Negotiate.

              IDN  Supports IDNA, domain names with international letters.

              IPV6 Supports IPv6.

                   Supports kerberos4 (when using FTP).

                   Libcurl was built with support for large files.

              LIBZ Supports HTTP deflate using libz.

              NTLM Supports HTTP NTLM.

                    Libcurl was built with support for SPNEGO authentication (Simple and
                    Protected GSS-API Negotiation Mechanism, defined in RFC 2478).

              SSL  Supports SSL (HTTPS/FTPS)

              SSPI Libcurl was built with support for SSPI. This is only available on Windows and
                   makes libcurl use Windows-provided functions for NTLM authentication. It  also
                   allows libcurl to use the current user and the current user's password without
                   the app having to pass them on.
       Do not assume any particular order.

              Returns a string with the OpenSSL version used, like OpenSSL/0.9.6b.

              Returns the numerical OpenSSL version value as defined by the OpenSSL project.   If
              libcurl has no SSL support, this is 0.

              Returns a string (there is no numerical version), for example: 1.1.3.

              Lists what particular protocols the installed TclCurl was built to support.  At the
              time of writing, this list may include HTTP, HTTPS, FTP, FTPS, FILE, TELNET,  LDAP,
              DICT.  Do  not  assume  any  particular  order.  The protocols will be listed using
              uppercase. There may be none, one or several protocols in the list.
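       For instance, a run-time check for a protocol before attempting a transfer (the
       -protocols option name is an assumption based on this section):

```tcl
package require TclCurl

# The list uses uppercase protocol names, so compare against "HTTPS".
if {"HTTPS" in [curl::versioninfo -protocols]} {
    puts "this libcurl build supports HTTPS"
} else {
    puts "no HTTPS support; fall back to HTTP"
}
```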

curl::easystrerror errorCode

       This procedure returns a string describing the error code passed in the argument.


       curl, The Art Of HTTP Scripting, RFC 2396