
e-hentai-downloader's Introduction

E-Hentai-Downloader

Download E-Hentai archive as zip file 📦

Required Environment

| Browser | GreaseMonkey | Tampermonkey | Violentmonkey |
| --- | --- | --- | --- |
| Firefox (56-) | 3.2 beta2+ | N/A | N/A |
| Firefox (57+) | 4.1 beta5+ | 4.0.5054+ | 2.8.18+ |
| Chrome | N/A | 3.5.3630+ | 2.2.6+ |
| Opera (15+) | N/A | 3.5.3630+ | 2.1.10+ |
| Safari (10.1+) (1) | N/A | 4.3.5421+ | N/A |
| Edge (18-) (2) | N/A | 4.2.5284+ | N/A |
| Edge (79+) | N/A | 4.10.6111+ | 2.12.8+ |
| Maxthon | N/A | N/A | 2.1.10+ |
| Yandex Browser for Android (3) | N/A | 4.2.5291+ | 2.2.6+ |
| Kiwi Browser (3) | N/A | 4.11+ | 2.12.8+ |
| Firefox for Android (68-) (3) | Incompatible | Incompatible | 2.12.8+ |
| Firefox Nightly for Android (85+) (3)(4) | Incompatible | 4.11.6120+ | 2.12.8+ |

(1) You must upgrade your macOS to 10.12.4, which supports the download attribute of the <a> tag.
(2) You must upgrade your Windows 10 to build 14393, which supports Edge extensions.
(3) It's not a good idea to use it on a mobile device with limited RAM, but it can work, so it's up to you.
(4) Firefox Nightly users need to follow these steps to install extensions that are not in the default list.

Install This Script

How To Use

  1. Open E-Hentai Gallery
  2. Find a gallery you're interested in
  3. Click "Download Archive" in E-Hentai Downloader box
  4. Have a cup of coffee ☕
  5. Save the Zip file

E-Hentai Downloader box

Tips:

  • Check "Number Images" to number download images
  • Set "Pages Range" to choose pages you want to download
  • More personalized options can be found on "Settings"

How It Works

This script doesn't download the archive from the E-Hentai archive download page, so it won't spend your GPs or credits. Instead, it fetches all the pages of the gallery and collects the image URLs. The script then uses the GM_xmlhttpRequest API (to allow cross-origin requests) to download the images. After that, it packages them into a Zip file with JSZip and hands it to you with FileSaver.js.
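
For illustration, here is a minimal sketch of that pipeline, not the script's actual code; `imageUrls` and the fixed `.jpg` naming are assumptions for the example:

```javascript
// Minimal sketch: fetch every image cross-origin, zip, then save.
// Assumes `imageUrls` was collected by parsing the gallery pages.
const zip = new JSZip();
let remaining = imageUrls.length;

imageUrls.forEach(function (url, index) {
    GM_xmlhttpRequest({
        method: 'GET',
        url: url,
        responseType: 'arraybuffer',    // raw bytes, not text
        onload: function (res) {
            // store each image under a zero-padded name
            zip.file(String(index + 1).padStart(3, '0') + '.jpg', res.response);
            if (--remaining === 0) {
                // build the archive in RAM and hand it to FileSaver.js
                zip.generateAsync({ type: 'blob' }).then(function (blob) {
                    saveAs(blob, 'gallery.zip');
                });
            }
        }
    });
});
```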

Should Be Noticed

  • If you are using the latest Tampermonkey, or receive a warning like "A userscript wants to access a cross-origin resource" from Tampermonkey, please Allow All or turn off "@connect mode" in the settings page. For more info, see details here
    E-Hentai now uses the hath.network domain to serve images, and it's listed in @connect, so you don't need to set this with the latest Tampermonkey
  • If you see an out-of-memory message on Firefox, or "file not found" on Chrome, see the solution here.
    In short, it's recommended to use Pages Range to keep each zip file under 500 MB, and to enable File System if you're using Chrome; otherwise use other tools, or upgrade your PC with more RAM
  • Violentmonkey doesn't support timeout, final URL or download progress
    The latest Violentmonkey supports these features now
  • Single-thread download mode was removed in 1.18; if you need it, roll back to an old version
    Don't use an old version, though; it doesn't support the current site
  • You can also have a look at E-Hentai Image Viewing Limits
  • Most galleries have torrents available. Downloading the archive via torrent gives you a more stable download experience and bonus content (mostly in cosplay galleries), earns you GP and credits, and reduces the load on E-Hentai's original servers (since torrents are P2P)

Here is some other compatibility information, which is less important.

  • Tampermonkey uses a dirty way to provide the GM_xhr.response content (it converts a String to an ArrayBuffer every time), so the page will freeze for 1~3 seconds or more after each image is downloaded (depending on your device). If you are using Microsoft Edge, you may often see the working tab freeze and claim it's not responding. Just let it be and do nothing. And if you are using Firefox, it's better to use GreaseMonkey in that case
    The freeze problem should be fixed in Tampermonkey 4.12.6125
  • Dolphin Browser (Android) doesn't support blob URLs, so this script probably cannot run in Tampermonkey for Dolphin
  • UC Browser (Android) doesn't support the Blob constructor, so this script probably cannot run in Tampermonkey for UC
  • Opera 12- doesn't support blob URLs, and generating a data URL instead may crash it, so it's not supported
  • TrixIE (for IE) is too old and its GM_xhr cannot handle large content, so it's not supported

Warning And Limitation

Memory Usage

The script stores ALL the data in RAM, not on disk. This increases the memory usage of the current tab's process. So if you don't have enough RAM, or the archive is too large (see the file size limit section), pay attention to your memory usage, or try other download tools.

"Out of memory" problem is the most limitaion of the script (in fact, all the sections of "Warning And Limitation" are about RAM problem, and here is also a specific out of memory tag to label all related issues). If you get an error like out of memory, see solution here. And if you usually have the problem, try other tools.

Browser Developer Tools

To help us debug, the script outputs some logs to the console (F12 -> Console). If you find a bug, you can keep devtools open to view and copy the logs. But note that this may increase memory usage and reduce performance, so only open the console when you want to see the logs.

File Size Limit

(This part is a bit long; you can just read the table)

Different browsers have different maximum file size limits. The table below shows the maximum size each supported browser can handle.

| Browser | Maximum Size |
| --- | --- |
| Chrome 56- | 500 MB |
| Chrome 57+ | 2 GB or (total RAM / 5) |
| Chrome (with File System) | 1 GB / > 2 GB (with 1.33+) |
| Firefox | > 800 MB (depends on your RAM) |
| Opera 15+ | Same as Chrome |
| Edge 18- | ? |
| Edge 79+ | Same as Chrome |
| Safari 10.1+ | ? |
| Maxthon | ? |

For Google Chrome 56-, there was a hard 500 MB limit on Blob Storage for years. That means all the files in the storage cannot be larger than 500 MB in total, and if the storage doesn't have enough free space to hold the next file, it silently returns a fake Blob instance without any error. Also, Chrome 45- didn't implement Blob.close() (and it's deprecated, so no browser supports it now), so we couldn't free used Blobs immediately back then, and could only pray the browser would GC them ASAP (which most of the time it didn't). That's why there is a wiki page to help you work around this.

So to help you save larger files, the script can save the Zip file into the File System, a deprecated HTML5 API that still works on Chrome (as Chrome introduced the proposal in the first place). With this API you can handle larger files, because the file data is written to your disk instead of being stored in Blob Storage, and its limit is big enough (10% of your free disk space, 15 GB at most). But while processing, the files are still kept in RAM, and if the data is too large, Chrome may still fail to handle it. From my tests the limit may be about 1 GB if you only have 8 GB of RAM, but it may also depend on your device. If you have enough RAM, you can download a gallery larger than 2 GB with 1.33+.
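
As a rough sketch of how a Blob can be written through that deprecated, Chrome-only API (the file name and the `blob` variable are assumptions, and error handling is minimal):

```javascript
// Write the generated Blob to disk via the deprecated File System API,
// so the result doesn't have to live in Blob Storage.
window.webkitRequestFileSystem(
    window.TEMPORARY,
    blob.size,                               // quota to request, in bytes
    function (fs) {
        fs.root.getFile('gallery.zip', { create: true }, function (entry) {
            entry.createWriter(function (writer) {
                writer.onwriteend = function () {
                    // the filesystem: URL can then be offered to the user
                    console.log('Saved to', entry.toURL());
                };
                writer.write(blob);          // data goes to disk
            });
        });
    },
    function (err) { console.error('File System error:', err); }
);
```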

Chrome 57+ lifts the 500 MB limit on Blob Storage, so it can handle larger files in RAM, just like with the File System. A quota still exists, but it's larger, based on the limits below:

In-memory quota:

  • 2GB if system is x64 and NOT ChromeOS or Android
  • Total RAM amount / 5 otherwise.

Disk quota:

  • Disk size / 2 if ChromeOS (user partition disk size)
  • Disk size / 20 if Android
  • Disk size / 10 otherwise.

Also, if the disk is almost full, Chrome tries to keep at least (in-memory quota) × 2 of disk space available, and limits the disk quota accordingly.
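
Read literally, the in-memory rule boils down to something like this sketch (the platform flags and `totalRamBytes` are assumptions you'd have to detect yourself):

```javascript
// Approximate in-memory Blob quota per the rules quoted above.
function inMemoryBlobQuota(totalRamBytes, isX64, isChromeOS, isAndroid) {
    const TWO_GB = 2 * 1024 * 1024 * 1024;
    if (isX64 && !isChromeOS && !isAndroid) {
        return TWO_GB;
    }
    return Math.floor(totalRamBytes / 5);
}

// e.g. an 8 GB Android phone gets roughly 1.6 GB in memory
console.log(inMemoryBlobQuota(8 * 2 ** 30, false, false, true));
```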

For Firefox, our previous data from FileSaver.js put the limit at 800 MB. But from our tests, you can save files larger than 800 MB. So we think Firefox's limit depends on your device, as it stores the Blob in RAM: the more RAM you have, the larger the file you can save. However, you should watch your RAM usage, because if Firefox cannot get more RAM to generate the Zip, it'll throw an "out of memory" error. As a rule of thumb, stay under 200 MB if you have 4 GB of RAM, and be careful above 800 MB if you have 8 GB of RAM.

Opera 15+ is a Chromium-based browser, so you can check its Chromium version and map it to the corresponding Chrome version to get your limit. The same rule applies to all other Chromium-based browsers.

Safari 10.1+ finally supports the download attribute on the <a> tag, so the script now works on Safari. We don't have much data about Safari's Blob limit, so if you're on Safari, keep an eye on your RAM usage.

Todo List

See plans and progress here; note that some items may be changed or removed over time.

Report A Bug

You can report a bug or give suggestions at GitHub Issues or GreasyFork Feedback. English and Chinese are both acceptable 😝

English is not my mother tongue, so if you find any mistakes, don't hesitate to let me know =ω=

Sorry, my code is a bit untidy, which may make development harder for you. I'll try to tidy it up in the future 😅

e-hentai-downloader's People

Contributors

adrianiainlam, ccloli, kawaharai, shingenpizza, simon300000, temporaryaccount0x1, xiaokangwang


e-hentai-downloader's Issues

Downloading problem

I don't know why, but among the images being downloaded simultaneously there are always a few stuck at 0 KB/s, and retrying still fails. This happens on both Firefox and Chrome. Orz

Skip image if it errors after the retries (when downloading only one image at a time)

As everyone knows, FF now has an option to keep a tab from dragging you away from the tab you're browsing. When downloading a gallery with EHD one image at a time (for a particular reason), the gallery stopped downloading at image 13 because of a timeout error, and the script brought up the option to fetch and download it again, instead of skipping it and trying to download the rest as it did before.

Showing on Bottom?

I'm using the latest version of the script in the latest version of Chromium, and the script keeps showing at the bottom of the page, after the comments. In Firefox it shows up as usual.

I even tried copying and pasting the Firefox script into Chromium, and it still appears after the comments.

[Feat. Request] add tags to info.txt

This is a feature request: add all the tags applied to a comic being downloaded to "info.txt", so this metadata can be used later.

Download errors prevent the download from finishing

When downloading a large number of images (200+), some images fail to download. I remember older versions would ask whether to re-download them; but now, after a failure, the image is still counted as downloading, even though the right side of its progress bar shows it has failed.

Chrome: 52.0.2739.0 dev-m (64-bit)
Tampermonkey: v4.1.5240
E-Hentai Downloader: 1.21.5
The PC had SS (Shadowsocks) running

Wrong file extension

Sometimes the file is actually a PNG, but it gets saved as .jpg, which causes problems in some image viewers.
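
For reference, a hedged sketch of how the real type could be sniffed from the first bytes instead of trusting the reported name (the signatures are the standard PNG/JPEG/GIF magic numbers; the helper name is hypothetical):

```javascript
// Hypothetical helper: pick a file extension from the image's magic bytes.
function sniffExtension(arrayBuffer) {
    const b = new Uint8Array(arrayBuffer);
    if (b[0] === 0x89 && b[1] === 0x50 && b[2] === 0x4E && b[3] === 0x47) return '.png';
    if (b[0] === 0xFF && b[1] === 0xD8 && b[2] === 0xFF) return '.jpg';
    if (b[0] === 0x47 && b[1] === 0x49 && b[2] === 0x46) return '.gif';
    return null;    // unknown: fall back to the extension the site reported
}
```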

Uploader Comment as zip comment

Could we have the uploader comment as the zip comment? Every now and then, when I'm reading something I downloaded a while ago, I notice it's part of a multi-part series or something like that, and it's a bother to go back to the gallery and search the uploader comment for the reading order. So, could we have that as a zip comment?
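
For what it's worth, JSZip can embed an archive-level comment at generation time, so a sketch of the request could be as small as this (`uploaderComment` would have to be scraped from the gallery page):

```javascript
// JSZip accepts a zip-level comment in the generateAsync options.
zip.generateAsync({ type: 'blob', comment: uploaderComment })
    .then(function (blob) { saveAs(blob, 'gallery.zip'); });
```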

Split Download Large Galleries

Some large galleries do not have a torrent, and currently E-Hentai Downloader cannot handle them: it downloads the images, but the save dialog never appears.

A way to download large galleries in segments would let you spread the download over several days, if you want or need to because of the download limits.

Suggestion: add a time limit on retrying downloads

A gallery can be 99% downloaded with just a few images stuck retrying, taking longer than the download itself. It's a complete waste of time.
Please add a time limit so that stuck images can be skipped.

Feature request: ① convert invalid half-width characters to valid full-width characters; ② edit file names


A filename cannot contain any of the following characters:
\ / : * ? " < > |
Some galleries have these characters in their title or subtitle.
When building a file name, these invalid characters currently become "-". Please add an option to automatically convert them to their full-width equivalents instead.
For example:
https://exhentai.org/g/966727/6258cf46a9/
The original illegal "?" character:
(C90) [オセロアイス (shuz)] おおきいけれどいいですか?
converted to the legal full-width "?" character:
(C90) [オセロアイス (shuz)] おおきいけれどいいですか?

Also, before downloading, please add two fields to the "E-Hentai Downloader | Image Limits: 0/5000" box to confirm the folder name and the zip file name, pre-filled according to the Settings, so the user can tweak the details if needed.
Example:
https://exhentai.org/g/589459/a442012cdd/
Original settings:
folder name: {subtitle}
zip file name: {subtitle}
So they should be:
folder name: (C83) [うすべに屋 (うすべに桜子)] 東方足祭 (東方Project) [**翻訳] [琉璃神社★汉化]
zip file name: (C83) [うすべに屋 (うすべに桜子)] 東方足祭 (東方Project) [**翻訳] [琉璃神社★汉化]
The downloader might want to change them to:
folder name: (C83) [うすべに屋 (よろず)] 東方足祭 (東方Project) [**翻訳]
zip file name: (C83) [うすべに屋 (よろず)] 東方足祭 (東方Project) [**翻訳]
This way there's no need to unpack, rename and repack the archive.
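
A minimal sketch of the requested conversion, relying on the standard mapping of ASCII 0x21-0x7E to the full-width forms U+FF01-U+FF5E:

```javascript
// Replace characters that are illegal in Windows file names with
// their full-width equivalents (ASCII code point + 0xFEE0).
function toFullWidth(name) {
    return name.replace(/[\\/:*?"<>|]/g, function (ch) {
        return String.fromCharCode(ch.charCodeAt(0) + 0xFEE0);
    });
}

console.log(toFullWidth('おおきいけれどいいですか?'));
// -> おおきいけれどいいですか?
```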

English mistakes (grammar, spell, phrase, etc.) report

English is not my native language, so there may be many mistakes in the script, the readme, the wiki, or even this issue. QwQ

If you find any mistakes that feel off to you, please post them here, and I'll correct them in the next version. =w=

BTW, if they are in the wikis, you can edit them yourself (you can also write your own wiki here).

Thanks for your contribution. >w<

Could you make a script that searches other sites for the same doujinshi?

I usually favorite a big batch first and find time to download them later.
But E-Hentai sometimes deletes resources, and then I have to look for them on other sites, such as hitomi.la.
Could you add a button to each entry on the Favorites page to quickly search other sites, or to search Google by the cover image?

A very serious bug

Symptom: I batch-download a pile of zips and extract them with WinRAR using "Extract each archive to separate folder"; the resulting folders are then corrupted and can neither be renamed nor deleted.
The images inside can still be viewed with the built-in Windows viewer and can be moved; only the folders are broken.

A few days ago this bug appeared once, and I thought my hard disk was failing. I installed 360's force-delete tool, but it couldn't delete them either; they could only be removed together with their parent directory.

Then today, I first downloaded one zip and found that my settings had been reset; the downloaded archive was the English package and contained garbled subdirectories.
So for the second book I changed the settings and checked these four options:
Set folder name as: /
Set Zip file name as: {subtitle}
Retry automatically when images download failed
Force download resized image (never download original image)

After batch-downloading a load of books, the problem appeared:
the first book was fine, but for every later one, the folders again could not be deleted or renamed.

I can't find the buggy zip files anymore; I extracted and deleted them too quickly... The one I downloaded again was fine...
My guess is that the problem lies in the folder name.
Please take this bug seriously.

stuck fetching images

I don't know why, but I just noticed today that the script gets stuck fetching images, with no progress at all.
Fetching Image 1: page1.png ...
Fetching Image 2: page2.png ...
Fetching Image 3: page3.png ...
Fetching Image 4: page4.png ...
Fetching Image 5: page5.png ...

then nothing. I don't know what the problem could be. How can I check?

Doesn't work with Safari

The download finished, but the zip was not saved; a blank page popped up, and nothing happened when I clicked the "Not download?" button.

Version: 1.19.9
Safari: Version 9.1 (11601.5.17.1)

Numbering Bug

The numbering always starts from 1, even if you are downloading images 200-400. Shouldn't the numbering start from 200 if you are downloading image 200?

Calculating extra data...

When I finish downloading a fairly large book, it gets stuck at the following and stops:

Generating Zip file...
Calculating extra data...

The browser becomes laggy and doesn't respond for a long time. Downloading small books works fine.

File Size: 130.1 MB
Length: 200 pages

Firefox

Page refreshing problem

The popup shown when refreshing the downloading page seems to be ineffective.


Hi ccloli,
With the version updated on October 9, refreshing the page during a download no longer pops up a confirmation window; the browser jumps straight to the new page, and the previous download has to start over from scratch. TvT

Any way to download using this script even if image limit is reached?

So I'm at 37000+ against a limit of 5000, it needs 76k credits to reset, and I don't know when it will ever go back below the limit;
I have about 8k credits right now. I use this script especially when there are no torrents available.
So yeah, my fault, I downloaded large galleries in a single day, but some of them didn't even finish downloading because of the image limit.

The download box in the middle has disappeared

In the last couple of days I suddenly noticed that the download box in the middle of the page is gone...
E-Hentai Downloader 1.21.3
Chrome version 52.0.2723.2 dev-m (64-bit)

When downloading a large gallery (>= 600 MB)

When the downloader finishes and asks me where to save the zip file, I save it. It downloads the zip file to my computer, but for some reason the archive has no file size, if I remember correctly. I'm pretty sure my RAM can handle gallery sizes of 600 MB and up. I'm using the Chrome browser.

Few Suggestions

Would you consider the following:

  1. Add the numbering option beside or under the "Download Archive" option. Most galleries are numbered, but a few aren't, so in most cases you need a quick way to switch numbering on and off.

  2. Usually the first comment is the "Uploader Comment", which contains useful information about the gallery. Would you consider including it in the info file? I already added it in the example at line 11256. Would you also consider making the info file an HTML file, to allow formatted output including the comment?

  3. Rename the info file so it always sorts to the top. Right now it is at the bottom if you enable numbering, and in a large archive you won't spot it unless you scroll all the way down.

Thank you

Retry if a download keeps pending for too long

Is there an option to consider a download failed after it has been pending for a certain amount of time?

I sometimes encounter the problem that some images fail to fetch (probably the server is down, as they are hosted by other peers).

I have to refresh the page to retry the download, which costs a lot of Image Limits, especially when downloading a gallery consisting of a large number of images.

It seems that this problem is associated with a script error:

Log

Failed to load resource: net::ERR_BLOCKED_BY_CLIENT  // (PURPOSEFULLY)
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT // (PURPOSEFULLY)
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT // (PURPOSEFULLY)
[EHD] E-Hentai Downloader is running.
[EHD] Bugs Report > https://github.com/ccloli/E-Hentai-Downloader/issues | https://greasyfork.org/scripts/10379-e-hentai-downloader/feedback
[EHD] To report a bug, showing all the "[EHD]" logs is wonderful. =w=
[EHD] UserAgent > Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.80 Safari/537.36
[EHD] Script Handler > Tampermonkey
[EHD] GreaseMonkey / Tampermonkey Version > 3.12.58
[EHD] E-Hentai Downloader Version > 1.18.6
[EHD] E-Hentai Downloader Setting > {"thread-count":5,"timeout":30,"number-images":false,"number-real-index":false,"force-resized":false,"never-new-url":false,"never-send-nl":false,"store-in-fs":false}
[EHD] Current URL > http://g.e-hentai.org/g/(^_*)
[EHD] Is Logged In > true
[EHD] Index > 3  | RealIndex > 3  | Name > 02.jpg  | RetryCount > 0  | DownloadedCount > 1  | FetchCount > 5  | FailedCount > 0
(omit)
[EHD] Index > 19  | RealIndex > 19  | Name > 18.jpg  | RetryCount > 0  | DownloadedCount > 22  | FetchCount > 5  | FailedCount > 0
Uncaught TypeError: Cannot read property 'length' of undefined
[EHD] Index > 23  | RealIndex > 23  | Name > 22.jpg  | RetryCount > 0  | DownloadedCount > 23  | FetchCount > 5  | FailedCount > 0
(omit)
[EHD] Index > 29  | RealIndex > 29  | Name > 28.jpg  | RetryCount > 0  | DownloadedCount > 30  | FetchCount > 2  | FailedCount > 0

==Error traceback:

Uncaught TypeError: Cannot read property 'length' of undefined
fetchThread.(anonymous function).GM_xmlhttpRequest.onload @ VM7682:10108
(anonymous function) @ VM7676:59

VM7682:10108:
response: new ArrayBuffer(res.responseText.length),

==Debug Information

WTF?

JSON.stringify(res) is
{"readyState":4,"responseHeaders":"Date: Tue, 05 Jan 2016 13:43:56 GMT\r\nContent-Length: 0\r\nContent-Type: text/plain; charset=utf-8\r\n","finalUrl":"http://g.e-hentai.org/fullimg.php?(^_*)","status":500,"statusText":"Internal Server Error","responseType":"arraybuffer","response_types":{"response":false,"responseText":false,"responseXML":false}}

so res.response is undefined while

!res.response is true!

res.response==undefined is also true.

!(res.response==undefined) is false. (expected behavior?)

I will send you a pull request later if I can solve this myself.
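
For reference, a minimal sketch of the kind of guard that avoids this crash (not necessarily the fix that shipped): check the HTTP status and the presence of a body before reading it; `failedFetching` and `storeImage` are hypothetical handlers.

```javascript
GM_xmlhttpRequest({
    method: 'GET',
    url: imageUrl,
    responseType: 'arraybuffer',
    onload: function (res) {
        // A 500 response carries no body, so res.response is undefined
        // here; bail out to the retry path instead of reading it blindly.
        if (res.status !== 200 || res.response == null) {
            failedFetching();        // hypothetical retry/skip handler
            return;
        }
        storeImage(res.response);    // hypothetical success handler
    }
});
```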

Always offer to continue download after hitting image limit.

If you hit your image limit, a popup from the script asks if you want to spend credits to reset it. If you hit OK, there's a continue button on the status window. If you hit Cancel, there will be no continue button (and it will then ask if you want to save the ZIP).

I tested by telling it I would reset, then just waiting a few hours, then hitting continue. The download did continue and the ZIP was fine.

Is there a reason not to always offer to continue?

About checking for script updates

Pixiv++ looks like a super powerful script, and it even has built-in update checking. About putting the libraries in @require, as mentioned before: every time you release a new version, you could check whether those libraries have been updated.

[Request] Close Tab After Successful Download

Is there a way to add a checkbox in the settings to let the script close the tab if the download was successful?
I, and probably many people, download many things and start doing something else during the downloads. It would be cool if there were an option to close the tab after the download completes.

Image count goes up way too fast

I noticed the image limit gets "consumed" really fast: I downloaded a zip with 31 pics, and the counter went up from 0/5000 to 265/5000.
What is happening?

[Request] Always show download status on tab title.

  • Add a checkbox to always show the download status in the tab title.
  • Change the status text.

Can you please add a checkbox to always show the download status in the tab title?
And an input box to customize the status title (it takes up too much space in the tab title).
I could do this manually, but it would always change back when the script gets updated.

[Request] Add a pause/resume button

Can you please add a pause/resume button to the download status mini screen?

Sometimes, when the gallery is big and more than 30% or 50% has already been downloaded, I remember something more important, so I have to close the tab and download again later. But my image limits were already consumed by the previous partial download. So, a pause/resume button would be appreciated.

(pending)timeout on some page

[EHD] E-Hentai Downloader is running.
VM442:35 [EHD] Bugs Report > https://github.com/ccloli/E-Hentai-Downloader/issues | https://greasyfork.org/scripts/10379-e-hentai-downloader/feedback
VM442:36 [EHD] To report a bug, showing all the "[EHD]" logs is wonderful. =w=
VM442:9703 [EHD] UserAgent > Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36
VM442:9704 [EHD] Script Handler > Tampermonkey
VM442:9705 [EHD] GreaseMonkey / Tampermonkey Version > undefined
VM442:9706 [EHD] E-Hentai Downloader Version > 1.18.3
VM442:9707 [EHD] E-Hentai Downloader Setting > {}
VM442:9708 [EHD] Current URL > http://exhentai.org/g/245439/0dbe807aca/?p=1
VM442:9709 [EHD] Is Logged In > true
VM442:9898 [EHD] Index > 1  | RealIndex > 1  | Name > 00.jpg  | RetryCount > 0  | DownloadedCount > 1  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 3  | RealIndex > 3  | Name > 02.jpg  | RetryCount > 0  | DownloadedCount > 2  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 2  | RealIndex > 2  | Name > 01.jpg  | RetryCount > 0  | DownloadedCount > 3  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 5  | RealIndex > 5  | Name > 04.jpg  | RetryCount > 0  | DownloadedCount > 4  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 6  | RealIndex > 6  | Name > 05.jpg  | RetryCount > 0  | DownloadedCount > 5  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 9  | RealIndex > 9  | Name > 08.jpg  | RetryCount > 0  | DownloadedCount > 6  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 11  | RealIndex > 11  | Name > 10.jpg  | RetryCount > 0  | DownloadedCount > 7  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 12  | RealIndex > 12  | Name > 11.jpg  | RetryCount > 0  | DownloadedCount > 8  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 8  | RealIndex > 8  | Name > 07.jpg  | RetryCount > 0  | DownloadedCount > 9  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 14  | RealIndex > 14  | Name > 13.jpg  | RetryCount > 0  | DownloadedCount > 10  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 7  | RealIndex > 7  | Name > 06.jpg  | RetryCount > 0  | DownloadedCount > 11  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 15  | RealIndex > 15  | Name > 14.jpg  | RetryCount > 0  | DownloadedCount > 12  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 13  | RealIndex > 13  | Name > 12.jpg  | RetryCount > 0  | DownloadedCount > 13  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 16  | RealIndex > 16  | Name > 15.jpg  | RetryCount > 0  | DownloadedCount > 14  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 18  | RealIndex > 18  | Name > 17.jpg  | RetryCount > 0  | DownloadedCount > 15  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 17  | RealIndex > 17  | Name > 16.jpg  | RetryCount > 0  | DownloadedCount > 16  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 21  | RealIndex > 21  | Name > 20.jpg  | RetryCount > 0  | DownloadedCount > 17  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 22  | RealIndex > 22  | Name > 21.jpg  | RetryCount > 0  | DownloadedCount > 18  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 4  | RealIndex > 4  | Name > 03.jpg  | RetryCount > 0  | DownloadedCount > 19  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 23  | RealIndex > 23  | Name > 22.jpg  | RetryCount > 0  | DownloadedCount > 20  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 24  | RealIndex > 24  | Name > 23.jpg  | RetryCount > 0  | DownloadedCount > 21  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 10  | RealIndex > 10  | Name > 09.jpg  | RetryCount > 0  | DownloadedCount > 22  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 27  | RealIndex > 27  | Name > 26.jpg  | RetryCount > 0  | DownloadedCount > 23  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 25  | RealIndex > 25  | Name > 24.jpg  | RetryCount > 0  | DownloadedCount > 24  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 28  | RealIndex > 28  | Name > 27.jpg  | RetryCount > 0  | DownloadedCount > 25  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 30  | RealIndex > 30  | Name > 29.jpg  | RetryCount > 0  | DownloadedCount > 26  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 29  | RealIndex > 29  | Name > 28.jpg  | RetryCount > 0  | DownloadedCount > 27  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 32  | RealIndex > 32  | Name > 31.jpg  | RetryCount > 0  | DownloadedCount > 28  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 31  | RealIndex > 31  | Name > 30.jpg  | RetryCount > 0  | DownloadedCount > 29  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 33  | RealIndex > 33  | Name > 32.jpg  | RetryCount > 0  | DownloadedCount > 30  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 34  | RealIndex > 34  | Name > 33.jpg  | RetryCount > 0  | DownloadedCount > 31  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 36  | RealIndex > 36  | Name > 35.jpg  | RetryCount > 0  | DownloadedCount > 32  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 35  | RealIndex > 35  | Name > 34.jpg  | RetryCount > 0  | DownloadedCount > 33  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 26  | RealIndex > 26  | Name > 25.jpg  | RetryCount > 0  | DownloadedCount > 34  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 38  | RealIndex > 38  | Name > 37.jpg  | RetryCount > 0  | DownloadedCount > 35  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 37  | RealIndex > 37  | Name > 36.jpg  | RetryCount > 0  | DownloadedCount > 36  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 40  | RealIndex > 40  | Name > 39.jpg  | RetryCount > 0  | DownloadedCount > 37  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 39  | RealIndex > 39  | Name > 38.jpg  | RetryCount > 0  | DownloadedCount > 38  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 42  | RealIndex > 42  | Name > 41.jpg  | RetryCount > 0  | DownloadedCount > 39  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 43  | RealIndex > 43  | Name > 42.jpg  | RetryCount > 0  | DownloadedCount > 40  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 44  | RealIndex > 44  | Name > 43.jpg  | RetryCount > 0  | DownloadedCount > 41  | FetchCount > 4  | FailedCount > 0
(program):9898 [EHD] Index > 41  | RealIndex > 41  | Name > 40.jpg  | RetryCount > 0  | DownloadedCount > 42  | FetchCount > 3  | FailedCount > 0
(program):9898 [EHD] Index > 20  | RealIndex > 20  | Name > 19.jpg  | RetryCount > 0  | DownloadedCount > 43  | FetchCount > 2  | FailedCount > 0
VM442:10319 [EHD] #19: Timed Out
VM442:10320 [EHD] #19: RealIndex > 19  | ReadyState > undefined  | Status > undefined  | StatusText > undefined
ResposeHeaders >undefined
VM442:10064 [EHD] Index > 19  | RealIndex > 19  | Name > 18.jpg  | RetryCount > 0  | DownloadedCount > 43  | FetchCount > 1  | FailedCount > 0
failedFetching @ VM442:10064
fetchThread.(anonymous function).GM_xmlhttpRequest.ontimeout @ VM442:10323
(anonymous function) @ VM437:54
setTimeout (async)
backup.safeWindow.(anonymous function) @ VM434:1
k @ VM437:54
(anonymous function) @ VM437:57
f.notifyListeners @ VM437:28
(anonymous function) @ VM437:28
Context.chromeEmu.e.runConnectResponse @ VM437:27
(anonymous function) @ VM2180:2
(anonymous function) @ VM2180:2
copy.exec @ VM437:1
copy.Eventing.b @ VM437:8
copy.Eventing.a.eventHandlerPage @ VM437:12
a.standardEventSource.element.dispatchEvent @ content.js:25
a.fireEvent @ content.js:27
g.sendMessage @ content.js:10
b.onConnectResponse @ content.js:19
(anonymous function) @ content.js:21
EventImpl.dispatchToListener @ VM403 extensions::event_bindings:387
publicClass.(anonymous function) @ VM409 extensions::utils:94
EventImpl.dispatch_ @ VM403 extensions::event_bindings:371
EventImpl.dispatch @ VM403 extensions::event_bindings:393
publicClass.(anonymous function) @ VM409 extensions::utils:94
dispatchOnMessage @ VM402 extensions::messaging:310

Classification according to EH

Since it's easier to do this from the script, I'm writing this issue.

Could you add a sign as a classification, according to EH? Since the script can't sort the downloaded files into folders, it would be waaaaaaaaaaaay easier to do this on my side if the first characters of the archive name were the classification, something like this:

{DO}[Mariana Kaikou Kikaku (Mikami Hokuto)] Uchi, Nandemo Shimasu kara (Guilty Gear XX) [Digital]
{AR}[Ecolonun (Numeko)] Mahou Shoujo Reina
{MA}[Takara Akihito] Koiiro Girls Soutennenshoku [Digital]
{MI}[Brandon Santiago] Erma (Ongoing)
{NO}[Sung San-Young] The Gamer Ch. 72-136 (English) (Ongoing)
{WE}[Area] Between Friends (Ongoing)
{CO}[AmieChan] Rainbow Dash
{GA}[Noesis] Free Friends 2 [Decensored]
{IM}「bluefield」 - Artist

Then, after the downloads, it would be easier to sort them, and only then run a script to erase those initials. The webpage should have the information needed.


:(

Sorting ~1000 folders is hard.

Not Showing

The latest version is showing nothing here.

[Request] Auto retry option

Would it be possible to add an auto-retry checkbox to let the script get new links and try to download the failed images without asking again? If you keep track of your download limit and how many tabs you are downloading in, it's not a problem, even more so if you know what you're doing, since it's not the default anyway.

NS_Out of Memory / Failed to Zip

So I'm trying to download an archive that's way too big... I understand that.

So I downloaded just 30 images. That worked (101 MB).

Then I tried several times to get 100 images. That dies with some "failed to zip" error, and I was running out of RAM.

Restarting with a clean run of FF, I tried to get 50 images. FF used about 300 MB of RAM and failed at the start of the zip process with NS_Out of Memory. FF wasn't using any more RAM past downloading the last pic, the system wasn't out of RAM, and FF was only using 770 MB overall.

I tried both 100 and 50, four times each.

Going back to 30 images per zip, it instantly makes a zip.
Grabbing the next 30, it instantly gets 20 and starts downloading the 50th; that also works.

Those two archives are only 172 MB total, which is 60 images, and at double RAM usage that's only 344 MB, which should be well under the 800 MB FF limit.

The four archives (30, 30, 30, 10) that represent the 100 images originally tried total just 304 MB, which at double RAM is 608 MB, still well under the 800 MB FF limit.

It downloads all the images fine; it's just JSZip that's dying. I would much rather have the script download all 600 images (and save them) and then zip (or RAR/.7z/.ace) them by hand, as downloading 20 little 30-image zip files is annoying.

Script doesn't work on some pages

I noticed I can't download from "http://exhentai.org/g/354095/7148834893/".
It happens on Azasuke Wind Coll. 13, 14, 15 and 16 too. I can download from any other page normally.
I did not hit any limit; it's just that when I click the download "button", nothing happens.
This happens on Win7 x64, Firefox 40.0.3, Firefox 41.0, GreaseMonkey 3.4.1.
Any idea how to solve this?
