- How to use cURL to send cookies
- The CURL command-line utility
- Following redirects
- Saving output to a file
- Downloading a file only if it has changed
- HTTP authentication
- Receiving and sending cookies
- Receiving and sending headers
- Sending data with POST
- Uploading files with POST
- Working with FTP
- How do I send Cookies using Curl?
- What is Curl?
- What is a Cookie?
- How to set a Cookie with Curl?
- What is cookie-jar?
- How to set cookies for Curl?
- How to use Cookies with Libcurl?
- Generate Code Snippets for Curl Send Cookies Example
- Linux curl command
- Syntax
- Options
How to use cURL to send cookies
You can send a cookie with cURL using the --cookie option:
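A minimal sketch (the cookie name, value and URL are placeholders):

```
# hypothetical cookie and URL
curl --cookie "session=abc123" https://example.com/
```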
The --cookie option has a short form, -b. With these options you can pass either a file containing cookies or a NAME=VALUE pair as a string. If the argument contains an '=' character, it is sent as is; otherwise the string is treated as the name of a file to read cookies from.
Several cookies can be passed at once by separating them with a semicolon, for example:
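Something like this (the cookie names, values and URL are illustrative):

```
curl -b "lang=en; theme=dark" https://example.com/
```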
If you want to save cookies, use the -c option (long form --cookie-jar). The following example writes the cookies received from the host to the file cookiefile:
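A sketch (the URL is a placeholder; the file name cookiefile comes from the text above):

```
curl -c cookiefile https://example.com/
```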
And the next example reads cookies from the file cookiefile and sends them to the host:
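Again with a placeholder URL:

```
curl -b cookiefile https://example.com/
```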
See also the examples in the article "Website parsing: basics, advanced techniques, tricky cases", in the section "cURL and authentication in web forms (sending data with GET and POST)".
You can use the read and write options with the same file at the same time, just as web browsers do:
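For example (placeholder URL):

```
# read cookies from cookiefile and write any updated cookies back to it
curl -b cookiefile -c cookiefile https://example.com/
```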
If you need to create a cookie file for curl by hand, keep in mind that the curl documentation says it uses the old Netscape cookie file format, which differs from the format used by modern web browsers.
The file should look something like this:
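A hypothetical single-cookie file (the columns must be separated by real tab characters):

```
# Netscape HTTP Cookie File (fields must be separated by TAB characters)
example.com	FALSE	/	FALSE	1735689600	lang	en
```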
that is, it contains seven tab-separated fields: domain, tailmatch, path, secure, expires, name, value.
The CURL command-line utility
CURL is a command-line utility for Linux or Windows that supports the following protocols: FTP, FTPS, HTTP, HTTPS, TFTP, SCP, SFTP, Telnet, DICT, LDAP, POP3, IMAP and SMTP. It is well suited for simulating user actions on web pages and for other operations with URLs. CURL support is built into many programming languages and platforms.
First, download the utility itself: go to the Download section of the official curl website. After downloading the archive for your platform (in my case Windows 64-bit), extract it. To be able to work with HTTPS and FTPS, install the curl-ca-bundle.crt security certificate located in the curl/bin folder.
Start the command prompt, change to the curl/bin directory and try to download Google's home page:
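A sketch (Google answers with a redirect, as discussed below):

```
curl http://google.com
```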
The -X option lets you set the HTTP request method instead of the default GET. Other methods include POST, PUT and DELETE, as well as WebDAV methods such as PROPFIND, COPY, MOVE and so on.
Following redirects
Google's server replied that the page google.com has moved (301 Moved Permanently) and that we should now request www.google.com. With the -L option we tell CURL to follow redirects:
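For example:

```
curl -L http://google.com
```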
Saving output to a file
To save the output to a file, use the -o or -O option:
- -o (lowercase o): the result is saved to the file specified on the command line;
- -O (uppercase O): the file name is taken from the URL and used to save the downloaded data.
Save the Google page to the file google.html:
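For example:

```
curl -L http://google.com -o google.html
```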
Save the document gettext.html to the file gettext.html:
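A sketch; the exact URL of the gettext manual here is a placeholder:

```
curl -O https://www.gnu.org/software/gettext/manual/gettext.html
```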
Downloading a file only if it has changed
The -z option lets you download a file only if it was modified after a given time. This works for both FTP and HTTP. For example, the file archive.zip will be downloaded only if it was modified after August 20, 2018:
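A sketch with a placeholder URL:

```
curl -z 20-Aug-18 -O https://example.com/archive.zip
```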
The command below downloads the file archive.zip if it was modified before August 20, 2018:
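Same placeholder URL:

```
# a leading dash before the date means "only if older than this date"
curl -z -20-Aug-18 -O https://example.com/archive.zip
```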
HTTP authentication
The -u option lets you pass a username and password for Basic HTTP Authentication:
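A sketch with made-up credentials and URL:

```
curl -u user:password https://example.com/protected/
```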
Receiving and sending cookies
Websites use cookies to store certain information on the user's side. The server saves cookies on the client (that is, in the browser) by sending headers:
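For instance, a response header with made-up values:

```
Set-Cookie: session=abc123; Path=/
```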
The browser, in turn, sends the received cookies back to the server with every request, also in headers:
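Like this:

```
Cookie: session=abc123
```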
To pass cookies to the server as if they had previously been received from it:
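A sketch (cookie and URL are placeholders):

```
curl -b "session=abc123" https://example.com/
```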
To save the received cookies to a file:
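With a placeholder URL (the file name cookie.txt is shown below):

```
curl -c cookie.txt https://example.com/
```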
You can then send the cookies saved in the file back to the server:
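```
curl -b cookie.txt https://example.com/
```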
The cookie.txt file looks like this:
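Illustrative contents in the Netscape cookie format (tab-separated fields):

```
# Netscape HTTP Cookie File (tab-separated fields)
example.com	FALSE	/	FALSE	0	session	abc123
```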
Receiving and sending headers
By default, the server's response headers are not shown. This can be changed:
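One way is the -i (--include) option (placeholder URL):

```
# -i includes the response headers in the output
curl -i https://example.com/
```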
If you do not need the page content and only the headers are of interest (a HEAD request will be sent):
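Use -I (--head); the URL is a placeholder:

```
# -I sends a HEAD request and prints only the headers
curl -I https://example.com/
```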
You can see which headers CURL sends with its request by using the -v option, which prints more detailed information:
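For example:

```
curl -v https://example.com/
```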
- A line starting with > is a header sent to the server
- A line starting with < is a header received from the server
- A line starting with * is additional information from CURL
And this is how you can send your own header:
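A sketch with a made-up header and URL:

```
curl -H "X-My-Header: test" https://example.com/
```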
Sending data with POST
The command below sends a POST request to the server, just as if a user had filled in an HTML form and clicked the "Submit" button. The data is sent in the application/x-www-form-urlencoded format.
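A sketch with made-up form fields and URL:

```
curl -d "login=alex&password=secret" https://example.com/login.php
```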
The --data option is equivalent to --data-ascii; to send binary data use --data-binary, and to URL-encode form fields use --data-urlencode.
If the value of --data starts with @, the rest of it must be the name of a file with the data (or a dash, in which case the data is read from standard input). An example of reading the data for a POST request from a file:
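A sketch (the file name data.txt is used below; the URL is a placeholder):

```
curl --data @data.txt https://example.com/login.php
```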
The contents of the data.txt file:
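Hypothetical contents, already URL-encoded form fields:

```
login=alex&password=secret
```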
The $_POST array that will contain the data of this request:
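Assuming the hypothetical data.txt above, a server-side script would see roughly:

```
Array
(
    [login] => alex
    [password] => secret
)
```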
An example of URL-encoding data from a file before sending a POST request:
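A sketch; the field name "name" and the URL are assumptions:

```
# the contents of username.txt are URL-encoded and sent as the value of the "name" field
curl --data-urlencode name@username.txt https://example.com/login.php
```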
The contents of the username.txt file:
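Hypothetical contents with characters that need encoding:

```
Alex & Co
```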
The $_POST array that will contain the data of this request:
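With the hypothetical file above, the server would receive roughly (note that --data-urlencode also keeps any trailing newline from the file):

```
Array
(
    [name] => Alex & Co
)
```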
Uploading files with POST
For an HTTP POST request there are two ways of encoding HTML form fields: application/x-www-form-urlencoded and multipart/form-data. The first was designed long ago, when HTML forms did not yet provide a way to upload files.
Over time it became necessary to send files through forms as well, so the W3C reworked the POST request format, which resulted in RFC 1867. A form that lets the user upload a file using multipart/form-data looks roughly like this:
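A minimal illustrative form (the field name is an assumption; upload.php is the script mentioned below):

```
<form action="upload.php" method="POST" enctype="multipart/form-data">
  <input type="file" name="file">
  <input type="submit" value="Upload">
</form>
```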
To send the data of such a form to the server:
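A sketch (file name and URL are placeholders):

```
# -F builds a multipart/form-data request; @ attaches a local file
curl -F "file=@photo.jpg" https://example.com/upload.php
```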
On the server side, the form data is received by a script such as upload.php.
Working with FTP
Download a file from an FTP server:
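A sketch with a made-up server, credentials and path:

```
curl -u ftpuser:ftppass -O ftp://ftp.example.com/pub/readme.txt
```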
If the given FTP path is a directory, the list of files in it is printed by default:
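```
# the trailing slash makes curl treat the path as a directory (placeholder server)
curl -u ftpuser:ftppass ftp://ftp.example.com/pub/
```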
How do I send Cookies using Curl?
What is Curl?
Curl is a command-line tool available for Linux, Windows, and macOS, and a cross-platform library (libcurl) that can be used with almost any application written in nearly any programming language. Curl uses URL syntax to transfer data to and from servers. With Curl, you can upload or download data, submit web forms, and make API requests using over 25 protocols, including HTTP, HTTPS, FTP, and SFTP.
What is a Cookie?
Cookies are small blocks of data sent from a website and stored by a web browser on a user’s computer. Cookies are placed on the device used to access the website (computer, mobile phone, etc.). Several cookies may be placed on the same computer during a session. Browsers send cookies back to the server with each subsequent request, allowing the server to determine if the request came from the same browser or not.
Cookies enable web servers to store state information on a user’s device or track a user’s online activity. The data stored in a cookie is created by the server when the browser connects to it and is tagged with an identifier that is unique to the user and their computer. Cookies are primarily used for session management, user tracking, and personalization.
How to set a Cookie with Curl?
The general form of the Curl command for sending a Cookies request is as follows:
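One common form (the cookie name, value and URL are placeholders):

```
curl -b "name=value" [URL]
```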
What is cookie-jar?
The -c (or --cookie-jar) command-line option specifies the filename where Curl should write all cookies after the operation completes. At the end of the process, Curl writes all cookies from its in-memory cookie store to the specified file. If there are no cookies, Curl will not create the file. If you give a dash "-" as the filename, Curl writes the cookies to standard output.
How to set cookies for Curl?
You can use the -b (or --cookie) command-line switch to pass cookies to Curl.
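For example (the cookie value, file name and URL are illustrative):

```
# pass a cookie string directly
curl -b "session=abc123" https://example.com/
# or read cookies from a previously saved file
curl -b cookies.txt https://example.com/
```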
How to use Cookies with Libcurl?
Libcurl is a free client library that adds the same capabilities to your application as the Curl command-line tool. Libcurl is portable, thread-safe, IPv6 compatible, can be used on many platforms, including Windows and Linux, and has bindings for many popular programming languages, including C++, JavaScript, PHP, Python, and others. Libcurl offers several ways to enable and interact with cookies in your application:
Options | Action |
---|---|
CURLOPT_COOKIE | Provide a cookie header to be sent to the server. |
CURLOPT_COOKIEFILE | Read cookies from the given cookie jar file. |
CURLOPT_COOKIEJAR | Save cookies to the specified cookie jar file. |
CURLOPT_COOKIELIST | Provide details of one cookie. |
CURLINFO_COOKIELIST | Extract cookie information from cookie storage. |
Generate Code Snippets for Curl Send Cookies Example
Convert your Curl Send Cookies request to the PHP, JavaScript/AJAX, Curl/Bash, Python, Java, C#/.NET code snippets using the ReqBin code generator.
Linux curl command
The curl command transfers data to or from a network server, using one of the supported protocols (HTTP, HTTPS, FTP, FTPS, SCP, SFTP, TFTP, DICT, TELNET, LDAP or FILE). It is designed to work without user interaction, so it is ideal for use in a shell script.
The software offers proxy support, user authentication, FTP uploading, HTTP posting, SSL connections, cookies, file transfer resume, metalink, and other features.
Syntax
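In its simplest form, the synopsis is:

```
curl [options] [URL...]
```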
Options
-a, --append | (FTP/SFTP) When used in an FTP upload, this will tell curl to append to the target file instead of overwriting it. If the file doesn’t exist, it is created. |
Note that this option is ignored by some SSH servers, including OpenSSH.
This value can also be set with the -H/--header option.
If this option is set more than once, the last one will be the one that’s used.
Note that using --anyauth is not recommended if you do uploads from stdin since it may require data to be sent twice and then the client must be able to rewind. If the need should arise when uploading from stdin, the upload operation fails.
If no '=' (equals) character is used in the line, it is treated as a file name to use to read previously stored cookie lines from, which should be used in this session if they match. Using this method also activates the "cookie parser" which makes curl record incoming cookies too, which may be handy if you’re using this in combination with the --location option. The file format of the file to read cookies from should be plain HTTP headers or the Netscape/Mozilla cookie file format.
NOTE: the file specified with -b/--cookie is only used as input. No cookies will be stored in the file. To store cookies, use the -c/--cookie-jar option, or you can save the HTTP headers to a file using -D/--dump-header.
If this option is set more than once, the last occurrence will be the option that’s used.
If this option is used twice, the second one disables ASCII usage.
NSS ciphers are done differently than OpenSSL and GnuTLS. The full list of NSS ciphers is in the NSSCipherSuite entry at this URL: https://pagure.io/mod_nss#Directives.
If this option is used several times, the last one overrides the others.
If this option is used several times, the last one will be used.
This command line option activates the cookie engine that makes curl record and use cookies. Another way to activate it is to use the -b/--cookie option.
NOTE: If the cookie jar can’t be created or written to, the whole curl operation won’t fail or even report an error. If -v is specified a warning is displayed, but that is the only visible feedback you get about this possibly fatal situation.
If this option is used several times, the last specified file name will be used.
Use "-C -" to tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.
If this option is used several times, the last one will be used.
To create remote directories when using FTP or SFTP, try --ftp-create-dirs.
If this option is used several times, the last one will be used.
(Added in 7.19.7)
If you start the data with the "@" character, the rest should be a file name to read the data from, or "-" (dash) if you want curl to read the data from stdin. The contents of the file must already be URL-encoded. Multiple files can also be specified. Posting data from a file named 'foobar' would thus be done with "--data @foobar".
-d/--data is the same as --data-ascii. To post data purely binary, use the --data-binary option. To URL-encode the value of a form field you may use --data-urlencode.
If this option is used several times, the ones following the first will append data.
If this option is used several times, the ones following the first will append data.
If you start the data with the character @, the rest should be a filename. Data is posted in a similar manner as --data-ascii does, except that newlines are preserved and conversions are never done.
If this option is used several times, the ones following the first will append data as described in -d, —data.
To be CGI-compliant, the part should begin with a name followed by a separator and a content specification. The part can be passed to curl using one of the following syntaxes:
This makes curl URL-encode the content and pass that on. Just be careful so that the content doesn’t contain any = or @ symbols, as that will then make the syntax match one of the other cases below!
This makes curl URL-encode the content and pass that on. The preceding = symbol is not included in the data.
This makes curl URL-encode the content part and pass that on. Note that the name part is expected to be URL-encoded already.
This makes curl load data from the given file (including any newlines), URL-encode that data and pass it on in the POST.
This makes curl load data from the given file (including any newlines), URL-encode that data and pass it on in the POST. The name part gets an equal sign appended, resulting in name=urlencoded-file-content. Note that the name is expected to be URL-encoded already.
Don’t allow any delegation.
Delegates if and only if the OK-AS-DELEGATE flag is set in the Kerberos service ticket, which is a matter of realm policy.
Unconditionally allow the server to delegate.
If this option is used several times, the following occurrences make no difference.
--eprt can explicitly enable EPRT again and --no-eprt is an alias for --disable-eprt.
Disabling EPRT only changes the active behavior. If you want to switch to passive mode you need to not use -P, --ftp-port or force it with --ftp-pasv.
--epsv can explicitly enable EPSV again and --no-epsv is an alias for --disable-epsv.
Disabling EPSV only changes the passive behavior. If you want to switch to active mode you need to use -P, --ftp-port.
This option is handy to use when you want to store the headers that an HTTP site sends to you. Cookies from the headers could then be read in a second curl invoke using the -b/--cookie option. However, the -c/--cookie-jar option is a better way to store cookies.
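A sketch of that workflow (the file name and URL are placeholders; -D/--dump-header is assumed to be the option described here):

```
# first request: save the response headers (including Set-Cookie) to a file
curl -D headers.txt https://example.com/
# second request: read cookies back from the saved headers
curl -b headers.txt https://example.com/
```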
When used on FTP, the ftp server response lines are considered being "headers" and thus are saved there.
If this option is used several times, the last one is used.
If this option is used several times, the last one will be used.
If curl is built against the NSS SSL library then this option can tell curl the nickname of the certificate to use in the NSS database defined by the environment variable SSL_DIR (or by default /etc/pki/nssdb). If the NSS PEM PKCS#11 module (libnsspem.so) is available then PEM files may be loaded. If you want to use a file from the current directory, please precede it with the "./" prefix, to avoid confusion with a nickname. If the nickname contains ":", it needs to be preceded by "\" so that it is not recognized as a password delimiter. If the nickname contains "\", it needs to be escaped as "\\" so that it is not recognized as an escape character.
(iOS and Mac OS X only) If curl is built against Secure Transport, then the certificate string must match the name of a certificate that’s in the system or user keychain. The private key corresponding to the certificate, and certificate chain (if any), must also be present in the keychain.
If this option is used several times, the last one will be used.
If this option is used several times, the last one will be used.
curl recognizes the environment variable named ‘CURL_CA_BUNDLE‘ if that is set, and uses the given path as a path to a CA cert bundle. This option overrides that variable.
The Windows version of curl automatically looks for a CA certs file named ‘curl-ca-bundle.crt‘, either in the same directory as curl.exe, or in the current working directory, or in any folder along your PATH.
If curl is built against the NSS SSL library, the NSS PEM PKCS#11 module (libnsspem.so) needs to be available for this option to work properly.
If this option is used several times, the last one will be used.
If this option is used several times, the last one will be used.
This method is not fail-safe and there are occasions where non-successful response codes will slip through, especially when authentication is involved (response codes 401 and 407).
If this option is used twice, the second overrides the previous use.
curl does a single CWD operation for each path part in the given URL. For deep hierarchies this means a lot of commands. This is the default but the slowest behavior.
curl does no CWD at all. curl will do SIZE, RETR, STOR, etc. and give a full path to the server for all these commands. This is the fastest behavior.
curl does one CWD with the full target directory and then operates on the file «normally» (like in the multicwd case). This is somewhat more standards-compliant than ‘nocwd‘ but without the full penalty of ‘multicwd‘.
If this option is used several times, the following occurrences make no difference. Undoing an enforced passive really isn’t doable but you must then instead enforce the correct -P, --ftp-port again.
Passive mode means that curl will try the EPSV command first and then PASV, unless --disable-epsv is used.
This option has no effect if PORT, EPRT or EPSV is used instead of PASV.
If this option is used twice, the second will again disable this.
If this option is used twice, the second will again disable this.
curl makes sure that each header you add/replace gets sent with the proper end-of-line marker, therefore do not add that as a part of the header content: do not add newlines or carriage returns; they only mess things up for you.
See also the -A/--user-agent and -e/--referer options.
This option can be used multiple times to add/replace/remove multiple headers.
curl --interface eth0:1 http://www.netscape.com/
If this option is used several times, the last one will be used.
See this online resource for more information: https://curl.haxx.se/docs/sslcerts.html.
If this option is used several times, the last one will be used.
If this option is used several times, the last one is used.
This option requires a library built with kerberos4 or GSSAPI (GSS-Negotiate) support. This is not very common. Use -V, --version to see if your curl supports it.
If this option is used several times, the last one is used.
Specify the filename as '-' to make curl read the file from stdin.
Note that to be able to specify a URL in the config file, you need to specify it using the --url option, and not by writing the URL on its own line. So, it could look similar to this:
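A minimal illustrative config file fragment (the URL and output file name are made up):

```
# --- example config file ---
url = "https://example.com/"
output = "page.html"
```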
Long option names can optionally be given in the config file without the initial double dashes.
When curl is invoked, it always (unless -q is used) checks for a default config file and uses it if found. The default config file is checked for in the following places in this order:
1) curl tries to find the "home dir": It first checks for the CURL_HOME and then the HOME environment variables. Failing that, it uses getpwuid() on unix-like systems (which returns the home dir given the current user in your system). On Windows, it then checks for the APPDATA variable, or as a last resort the '%USERPROFILE%\Application Data'.
2) On Windows, if there is no _curlrc file in the home dir, it checks for one in the same dir the executable curl is placed. On unix-like systems, it will try to load .curlrc from the determined home dir.
This option can be used multiple times to load multiple config files.
If this option is used several times, the last one will be used. If unspecified, the option defaults to 60 seconds.
The given speed is measured in bytes/second, unless a suffix is appended. Appending ‘k‘ or ‘K‘ will count the number as kilobytes, ‘m‘ or ‘M‘ makes it megabytes while ‘g‘ or ‘G‘ makes it gigabytes. Examples: 200K, 3m and 1G.
The given rate is the average speed counted during the entire transfer. It means that curl might use higher transfer speeds in short bursts, but over time it uses no more than the given rate.
If you are also using the -Y/--speed-limit option, that option takes precedence and might cripple the rate-limiting slightly, to help keep the speed-limit logic working.
If this option is used several times, the last one will be used.
This option causes an FTP NLST command to be sent. Some FTP servers list only files in their response to NLST; they do not include subdirectories and symbolic links.
When curl follows a redirect and the request is not a plain GET (for example POST or PUT), it does the following request with a GET if the HTTP response was 301, 302, or 303. If the response code was any other 3xx code, curl re-sends the following request using the same unmodified method.
If this option is used several times, the last given file name is used. (Added in 7.16.1)
NOTE: The file size is not always known before download, and for such files this option has no effect even if the file transfer ends up being larger than this given limit. This concerns both FTP and HTTP transfers.
If this option is used several times, the last one will be used.
(Added in 7.25.0)
(Added in 7.20.0)
(Added in 7.20.0)
Example to use a remote Metalink file:
curl --metalink http://www.example.com/example.metalink
To use a Metalink file in the local file system, use FILE protocol (file://):
curl --metalink file://example.metalink
Please note that if FILE protocol is disabled, there is no way to use a local Metalink file at the time of this writing. Also, note that if --metalink and --include are used together, --include will be ignored. This is because including headers in the response will break the Metalink parser and if the headers are included in the file described in the Metalink file, hash check fails.
(Added in 7.27.0, if built against the libmetalink library.)
A quick and very simple example of how to set up a .netrc to allow curl to ftp to the machine host.domain.com with username ‘myself’ and password ‘secret’ should look similar to:
machine host.domain.com login myself password secret
If this option is used twice, the second will again disable netrc usage.
This option requires that the curl library was built with GSSAPI support. This is not very common. Use -V/--version to see if your version supports GSS-Negotiate.
When using this option, you must also provide a fake -u/--user option to activate the authentication code properly. Sending a '-u :' is enough as the username and password from the -u option aren’t actually used.
If this option is used several times, the following occurrences make no difference.
Note that this is the negated option name documented. You can thus use --keepalive to enforce keepalive.
Note that this is the negated option name documented. You can thus use --sessionid to enforce session-ID caching.
Note that this is the negated option name documented. You can thus use --buffer to enforce the buffering.
This option overrides any use of --netrc as they are mutually exclusive. It also abides by --netrc-optional if specified.
If you want to enable NTLM for your proxy authentication, then use --proxy-ntlm.
This option requires a library built with SSL support. Use -V, --version to see if your curl supports NTLM.
If this option is used several times, only the first one is used.
or use several variables like:
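The curl manual's example for this is along these lines (assuming this passage describes -o/--output used with URL globbing):

```
curl http://{site,host}.host[1-5].com -o "#1_#2"
```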
You may use this option as many times as the number of URLs you have.
See also the --create-dirs option to create the local directories dynamically. Specifying the output as '-' (a single dash) will force the output to be done to stdout.
The remote file name to use for saving is extracted from the given URL, nothing else.
Consequently, the file will be saved in the current working directory. If you want the file saved in a different directory, make sure you change the current working directory before you invoke curl with the -O, --remote-name flag!
You may use this option as many times as the number of URLs you have.
If this option is used several times, the last one will be used.
Permit this protocol in addition to protocols already permitted (this is the default if no modifier is used).
Deny this protocol, removing it from the list of protocols already permitted.
Permit only this protocol (ignoring the list already permitted), though subject to later modification by subsequent entries in the comma separated list.
--proto -ftps uses the default protocols, but disables ftps
--proto -all,https,+http only enables http and https
--proto =http,https also only enables http and https
Unknown protocols produce a warning. This allows scripts to safely rely on being able to disable potentially dangerous protocols, without relying upon support for that protocol being built into curl to avoid an error.
This option can be used multiple times, which is the same as concatenating the protocols into one instance of the option. (Added in 7.20.2)
The only difference between this and the HTTP proxy option (-x, --proxy) is that attempts to use CONNECT through the proxy will specify an HTTP 1.0 protocol instead of the default HTTP 1.1.
If this option is used several times, the last one is used.
i.e. "eth0" to specify which interface’s IP address you want to use (Unix only)
i.e. "192.168.10.1" to specify the exact IP address
i.e. "my.host.domain" to specify the machine
make curl pick the same IP address that is already used for the control connection.
If this option is used several times, the last one is used. Disable the use of PORT with --ftp-pasv. Disable the attempt to use the EPRT command instead of PORT by using --disable-eprt. EPRT is really PORT++.
Starting in 7.19.5, you can append ":[start]-[end]" to the right of the address, to tell curl what TCP port range to use. That means you specify a port range, from a lower to a higher number. A single number works as well, but do note that it increases the risk of failure since the port may not be available.
This option can be used multiple times. When speaking to an FTP server, prefix the command with an asterisk (*) to make curl continue even if the command fails as by default curl stops at first failure.
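For instance, a sketch of a post-transfer command (assuming this passage describes -Q/--quote; the server and file names are placeholders):

```
# download a file, then delete it on the server; the leading dash sends the command after the transfer
curl -O ftp://ftp.example.com/pub/file.zip -Q "-DELE file.zip"
```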
SFTP is a binary protocol. Unlike for FTP, curl interprets SFTP quote commands itself before sending them to the server. File names may be quoted shell-style to embed spaces or special characters. Following is the list of all supported SFTP quote commands:
chgrp group file
The chgrp command sets the group ID of the file named by the file operand to the group ID specified by the group operand. The group operand is a decimal integer group ID.
chmod mode file
The chmod command modifies the file mode bits of the specified file. The mode operand is an octal integer mode number.
chown user file
The chown command sets the owner of the file named by the file operand to the user ID specified by the user operand. The user operand is a decimal integer user ID.
ln source_file target_file
The ln and symlink commands create a symbolic link at the target_file location pointing to the source_file location.
The mkdir command creates the directory named by the directory_name operand.
The pwd command returns the absolute pathname of the current working directory.
rename source target
The rename command renames the file or directory named by the source operand to the destination path named by the target operand.
The rm command removes the file specified by the file operand.
The rmdir command removes the directory entry specified by the directory operand, provided it is empty.
symlink source_file target_file
See ln.
This option can be used many times to add many hostnames to resolve.
(Added in 7.21.3)
0-499 specifies the first 500 bytes;
500-999 specifies the second 500 bytes;
-500 specifies the last 500 bytes;
9500- specifies the bytes from offset 9500 and forward;
0-0,-1 specifies the first and last byte only(*)(H);
500-700,600-799 specifies 300 bytes from offset 500(H);
100-199,500-599 specifies two separate 100 bytes ranges(*)(H).
(*) = NOTE that this causes the server to reply with a multipart response!
Also be aware that many HTTP/1.1 servers do not have this feature enabled, so that when you attempt to get a range, you’ll instead get the whole document.
FTP range downloads only support the simple syntax ‘start-stop‘ (optionally with one of the numbers omitted). It depends on the non-RFC command SIZE.
Only digit characters (0-9) are valid in the ‘start‘ and ‘stop‘ fields of the ‘start-stop‘ range syntax. If a non-digit character is given in the range, the server’s response will be unspecified, depending on the server’s configuration.
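For instance (assuming this passage describes -r/--range; the URL and file name are placeholders):

```
# download only the first 500 bytes of the document
curl -r 0-499 https://example.com/file.bin -o part.bin
```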
If this option is used several times, the last one will be used.
When curl is about to retry a transfer, it first waits one second, and then for all forthcoming retries it doubles the waiting time until it reaches 10 minutes, which then is the delay between the rest of the retries. Using --retry-delay, you disable this exponential backoff algorithm. See also --retry-max-time to limit the total time allowed for retries. (Added in 7.12.3)
If this option is used multiple times, the last occurrence decides the amount.
If this option is used multiple times, the last occurrence decides the amount.
If this option is used multiple times, the last occurrence decides the amount.
This option was formerly known as --ftp-ssl (Added in 7.11.0). That option name can still be used but will be removed in a future version.
This option was formerly known as --ftp-ssl-reqd (added in 7.15.5). That option name can still be used but will be removed in a future version.
This option overrides any previous use of -x, --proxy, as they are mutually exclusive.
Since 7.21.7, this option is superfluous since you can specify a socks4 proxy with -x, --proxy using a socks4:// protocol prefix.
If this option is used several times, the last one is used.
This option overrides any previous use of -x, --proxy, as they are mutually exclusive.
Since 7.21.7, this option is superfluous since you can specify a socks4a proxy with -x, --proxy using a socks4a:// protocol prefix.
If this option is used several times, the last one is used.
This option overrides any previous use of -x, --proxy, as they are mutually exclusive.
Since 7.21.7, this option is superfluous since you can specify a socks5 hostname proxy with -x, --proxy using a socks5h:// protocol prefix.
If this option is used several times, the last one is used. (This option was previously wrongly documented and used as --socks without the number appended.)
This option overrides any previous use of -x, --proxy, as they are mutually exclusive.
Since 7.21.7, this option is superfluous since you can specify a socks5 proxy with -x, --proxy using a socks5:// protocol prefix.
If this option is used several times, the last one is used. (This option was previously wrongly documented and used as --socks without the number appended.)
This option (and --socks4) does not work with IPV6, FTPS or LDAP.
--socks5 proxy-name --socks5-gssapi-service sockd
would use sockd/proxy-name;
--socks5 proxy-name --socks5-gssapi-service sockd/real-name
would use sockd/real-name for cases where the proxy-name does not match the principal name.
If this option is used several times, the last one is used.
If this option is used several times, the last one is used.
(Added in 7.20.0)
(Added in 7.21.6)
TType= Sets the terminal type.
XDISPLOC= Sets the X display location.
NEW_ENV= Sets an environment variable.
Use the file name "-" (a single dash) to use stdin instead of a given file. Alternately, the file name "." (a single period) may be specified instead of "-" to use stdin in non-blocking mode to allow reading server output while stdin is being uploaded.
You can specify one -T for each URL on the command line. Each -T + URL pair specifies what to upload and to where. curl also supports "globbing" of the -T argument, meaning you can upload multiple files to a single URL using the same URL globbing style supported in the URL, like this:
curl -T «
curl -T "img259.png" ftp://ftp.picturemania.com/upload/
This option overrides previous uses of -v, --verbose or --trace-ascii.
If this option is used several times, the last one is used.
This is very similar to --trace, but leaves out the hex part and only shows the ASCII part of the dump. It makes smaller output that might be easier to read for untrained humans.
This option overrides previous uses of -v, --verbose or --trace.
If this option is used several times, the last one is used.
If you give the username (without entering a colon) curl prompts for a password.
If you use an SSPI-enabled curl binary and do NTLM authentication, you can force curl to pick up the username and password from your environment by specifying a single colon with this option: "-u :".
If this option is used several times, the last one is used.
If you use an SSPI-enabled curl binary and do NTLM authentication, you can force curl to pick up the username and password from your environment by specifying a single colon with this option: "-U :".