Provided by: gallery-dl_1.21.1-1_all 

NAME
gallery-dl - download image galleries and collections
SYNOPSIS
gallery-dl [OPTION]... URL...
DESCRIPTION
gallery-dl is a command-line program for downloading image galleries and collections from several
image hosting sites. It is a cross-platform tool with many configuration options and powerful
filenaming capabilities.
OPTIONS
-h, --help
Print this help message and exit
--version
Print program version and exit
-i, --input-file FILE
Download URLs found in FILE ('-' for stdin). More than one --input-file can be specified
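As an illustrative sketch (the file name urls.txt and the URLs are placeholders, not defaults), a batch file can be prepared and passed with --input-file:

```shell
# Write a list of gallery URLs, one per line (file name is arbitrary).
printf '%s\n' \
    'https://example.org/gallery/1' \
    'https://example.org/gallery/2' > urls.txt

# Download every URL in the file; guarded so the sketch is a no-op
# on systems where gallery-dl is not installed.
if command -v gallery-dl >/dev/null; then
    gallery-dl --input-file urls.txt || true
fi
```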
-d, --destination PATH
Target location for file downloads
-D, --directory PATH
Exact location for file downloads
-f, --filename FORMAT
Filename format string for downloaded files ('/O' for "original" filenames)
--cookies FILE
File to load additional cookies from
--proxy URL
Use the specified proxy
--source-address IP
Client-side IP address to bind to
--clear-cache MODULE
Delete cached login sessions, cookies, etc. for MODULE (ALL to delete everything)
-q, --quiet
Activate quiet mode
-v, --verbose
Print various debugging information
-g, --get-urls
Print URLs instead of downloading
-G, --resolve-urls
Print URLs instead of downloading; resolve intermediary URLs
-j, --dump-json
Print JSON information
-s, --simulate
Simulate data extraction; do not download anything
-E, --extractor-info
Print extractor defaults and settings
-K, --list-keywords
Print a list of available keywords and example values for the given URLs
--list-modules
Print a list of available extractor modules
--list-extractors
Print a list of extractor classes with description, (sub)category and example URL
--write-log FILE
Write logging output to FILE
--write-unsupported FILE
Write URLs that are emitted by other extractors but cannot be handled to FILE
--write-pages
Write downloaded intermediary pages to files in the current directory to debug problems
-r, --limit-rate RATE
Maximum download rate (e.g. 500k or 2.5M)
-R, --retries N
Maximum number of retries for failed HTTP requests or -1 for infinite retries (default: 4)
--http-timeout SECONDS
Timeout for HTTP connections (default: 30.0)
--sleep SECONDS
Number of seconds to wait before each download. This can be either a constant value or a range
(e.g. 2.7 or 2.0-3.5)
--sleep-request SECONDS
Number of seconds to wait between HTTP requests during data extraction
--sleep-extractor SECONDS
Number of seconds to wait before starting data extraction for an input URL
--filesize-min SIZE
Do not download files smaller than SIZE (e.g. 500k or 2.5M)
--filesize-max SIZE
Do not download files larger than SIZE (e.g. 500k or 2.5M)
--no-part
Do not use .part files
--no-skip
Do not skip downloads; overwrite existing files
--no-mtime
Do not set file modification times according to Last-Modified HTTP response headers
--no-download
Do not download any files
--no-check-certificate
Disable HTTPS certificate validation
-c, --config FILE
Additional configuration files
-o, --option OPT
Additional '<key>=<value>' option values
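As a hedged illustration of the '<key>=<value>' form ('skip=false' is only one possible key here, chosen to show the syntax; the URL is a placeholder):

```shell
# Override a single configuration value on the command line.
# 'skip=false' is used purely to illustrate the <key>=<value> form;
# the guard keeps the sketch harmless where gallery-dl is absent.
opt='skip=false'
if command -v gallery-dl >/dev/null; then
    gallery-dl -o "$opt" 'https://example.org/gallery/1' || true
fi
```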
--ignore-config
Do not read the default configuration files
-u, --username USER
Username to login with
-p, --password PASS
Password belonging to the given username
--netrc
Enable .netrc authentication data
--download-archive FILE
Record all downloaded files in the archive file and skip downloading any file already in it
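A sketch of how the archive fits into repeated runs (the archive path and URL are arbitrary placeholders): re-running the same command should skip every file already recorded in the archive.

```shell
# Keep a persistent archive so repeated runs skip known files.
# The archive path and URL below are placeholders for illustration.
archive="$HOME/gallery-dl-archive.sqlite3"
if command -v gallery-dl >/dev/null; then
    gallery-dl --download-archive "$archive" \
        'https://example.org/gallery/1' || true
fi
```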
-A, --abort N
Stop current extractor run after N consecutive file downloads were skipped
-T, --terminate N
Stop current and parent extractor runs after N consecutive file downloads were skipped
--range RANGE
Index-range(s) specifying which images to download. For example '5-10' or '1,3-5,10-'
--chapter-range RANGE
Like '--range', but applies to manga-chapters and other delegated URLs
--filter EXPR
Python expression controlling which images to download. Files for which the expression evaluates
to False are ignored. Available keys are the filename-specific ones listed by '-K'. Example:
--filter "image_width >= 1000 and rating in ('s', 'q')"
--chapter-filter EXPR
Like '--filter', but applies to manga-chapters and other delegated URLs
--zip
Store downloaded files in a ZIP archive
--ugoira-conv
Convert Pixiv Ugoira to WebM (requires FFmpeg)
--ugoira-conv-lossless
Convert Pixiv Ugoira to WebM in VP9 lossless mode
--ugoira-conv-copy
Convert Pixiv Ugoira to MKV without re-encoding any frames
--write-metadata
Write metadata to separate JSON files
--write-info-json
Write gallery metadata to an info.json file
--write-tags
Write image tags to separate text files
--mtime-from-date
Set file modification times according to 'date' metadata
--exec CMD
Execute CMD for each downloaded file. Example: --exec 'convert {} {}.png && rm {}'
--exec-after CMD
Execute CMD after all files were downloaded successfully. Example: --exec-after 'cd {} && convert
* ../doc.pdf'
-P, --postprocessor NAME
Activate the specified post processor
EXAMPLES
gallery-dl URL
Download images from URL.
gallery-dl -g -u <username> -p <password> URL
Print direct URLs from a site that requires authentication.
gallery-dl --filter 'type == "ugoira"' --range '2-4' URL
Apply filter and range expressions. This downloads only the second, third, and fourth files
whose type value equals "ugoira".
gallery-dl r:URL
Scan URL for other URLs and invoke gallery-dl on them.
gallery-dl oauth:SITE-NAME
Gain OAuth authentication tokens for deviantart, flickr, reddit, smugmug, and tumblr.
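The options above combine freely; as a hedged sketch (all paths and the URL are placeholders):

```shell
# Combine rate limiting, a download archive, and metadata output
# in a single run. Every path and the URL are placeholders.
url='https://example.org/gallery/1'
if command -v gallery-dl >/dev/null; then
    gallery-dl \
        --limit-rate 500k \
        --download-archive archive.sqlite3 \
        --write-metadata \
        -d ~/Downloads \
        "$url" || true
fi
```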
FILES
/etc/gallery-dl.conf
The system-wide configuration file.
~/.config/gallery-dl/config.json
Per-user configuration file.
~/.gallery-dl.conf
Alternate per-user configuration file.
BUGS
https://github.com/mikf/gallery-dl/issues
AUTHORS
Mike Fährmann <mike_faehrmann@web.de>
and https://github.com/mikf/gallery-dl/graphs/contributors
SEE ALSO
gallery-dl.conf(5)
1.21.1 2022-04-08 GALLERY-DL(1)