
e-hentai-downloader's Issues

Skip an image if it still errors after the retries (when downloading one image at a time)

As everyone knows, Firefox now has an option to stop tabs from pulling you away from the tab you are currently browsing. While downloading a gallery with EHD one image at a time (for particular reasons), the download stopped at image 13 because of a timeout error, and the script brought up a prompt asking whether to fetch and download it again, instead of skipping it and trying to download the rest as it did before.

[Request] Close Tab After Successful Download

Is there a way to add a checkbox in the settings so the script closes the tab if the download was successful?
I, and probably many other people, start a bunch of downloads and then go do something else while they run; it would be nice to have an option to close the tab once the download is complete.

NS_Out of Memory / Failed to Zip

So I'm trying to download an archive that is way too big... I understand that.

So I downloaded just 30 images. That worked (101 MB).

Then I tried several times to get 100 images. That dies with a "failed to zip" error, and I was running out of RAM.

I restarted with a clean run of Firefox and tried to get 50 images. Firefox used about 300 MB of RAM and failed at the start of the zip process with an NS out-of-memory error. Firefox isn't using any more RAM after downloading the last pic, the system isn't out of RAM, and Firefox is only using 770 MB overall.

I tried both 100 and 50 images, four times each.

Going back to 30 images per zip, it instantly makes a zip.
Grabbing the next 30, it instantly gets 20, keeps downloading up to the 50th, and that also works.

Those two archives are only 172 MB total, which is 60 images; even at double RAM usage that is only 344 MB, which should be well under the 800 MB Firefox limit.

The four archives (30, 30, 30, 10) that represent the 100 images originally tried total just 304 MB, which at double RAM is 608 MB and still well under the 800 MB Firefox limit.

It downloads all the images fine; it's just JSZip that's dying. I would much rather have the script download all 600 images (and save them), and then zip (or RAR/.7z/.ace) them by hand, as downloading twenty little 30-image zip files is annoying.
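A minimal sketch of the per-image saving described above, assuming the script runs under Tampermonkey with GM_download granted; the imageList array and its fields are illustrative placeholders, not the script's real data structures:

    // Save each fetched image straight to disk instead of holding the
    // whole gallery in memory for JSZip (needs "@grant GM_download").
    // `imageList` is a hypothetical array of { url, name } entries.
    function saveImagesIndividually(imageList) {
        imageList.forEach(function (image, index) {
            GM_download({
                url: image.url,
                name: String(index + 1).padStart(3, '0') + '-' + image.name,
                onerror: function (err) {
                    console.warn('[EHD] Failed to save ' + image.name, err);
                }
            });
        });
    }

The images would then land in the browser's download folder and could be zipped by hand, which is what this report asks for.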

Numbering Bug

The numbering always starts from 1, even if you are downloading images 200-400. Shouldn't the numbering start from 200 if you are downloading image 200?
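A minimal sketch of the naming this report expects, assuming the script knows each image's real page index in the gallery (the function and variable names are illustrative):

    // Prefix the file with its real gallery page number instead of a
    // counter that always restarts at 1.
    function numberedName(realIndex, totalPages, originalName) {
        var width = String(totalPages).length;               // e.g. 3 for a 400-page gallery
        return String(realIndex).padStart(width, '0') + '_' + originalName;
    }
    // numberedName(200, 400, 'foo.jpg') -> "200_foo.jpg"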

Could a way be added to search for the same gallery on other sites?

I usually favorite a large batch of galleries first and download them at some later time.
But E-Hentai sometimes removes galleries, so I have to look for them on other sites, for example hitomi.la.
Could a button be added next to each gallery on the Favorites page to quickly search those other sites, or to run a Google search on the gallery cover?

English mistakes (grammar, spelling, phrasing, etc.) report

English is not my native language, so there may be many mistakes in the script, the readme, the wiki, or even this issue. QwQ

If you find any mistakes that look strange to you, you can post them here, and I'll correct them in the next version. =w=

BTW, if they are in the wikis, you can edit them yourself (you can also write your own wiki pages here).

Thanks for your contribution. >w<

A download error prevents the download from finishing

When downloading a large number of images (200+), some images fail to download. I remember that older versions would ask whether to re-download them, but now a failed image is still counted as downloading, even though the progress bar on its right shows it has failed.

Chrome: 52.0.2739.0 dev-m (64-bit)
Tampermonkey: v4.1.5240
E-Hentai Downloader: 1.21.5
Shadowsocks (SS) is running on this computer.

(screenshots attached to the original issue)

[Request] Always show the download status in the tab title

  • Add a checkbox to always show the download status in the tab title.
  • Allow changing the status text.

Can you please add a checkbox to always show the download status in the tab title?
And an input box to customize the status text (it takes up too much space in the tab title).
I could change this manually, but it would revert every time the script gets updated.
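A minimal sketch of the kind of title update being requested; the template and progress fields are illustrative assumptions:

    // Keep a compact progress summary in the tab title, e.g. "[42/200] Gallery Name".
    var originalTitle = document.title;
    function updateTabTitle(downloadedCount, totalCount) {
        document.title = '[' + downloadedCount + '/' + totalCount + '] ' + originalTitle;
    }
    // Call updateTabTitle(...) whenever an image finishes, and restore
    // document.title = originalTitle once the download is done.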

[Feat. Requst] add tags to info.txt

This is a feature request to add all the tags applied to the gallery being downloaded to info.txt, so that this metadata can be used later.
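A minimal sketch of how the tags could be collected from the gallery page; the #taglist selector and the .tc class are assumptions about the current page markup, not a confirmed structure:

    // Build "namespace: tag1, tag2" lines from the gallery page's tag table.
    function collectTags() {
        var lines = [];
        document.querySelectorAll('#taglist tr').forEach(function (row) {
            var namespace = row.querySelector('.tc');       // assumed namespace cell
            var tags = Array.prototype.map.call(row.querySelectorAll('a'), function (a) {
                return a.textContent.trim();
            });
            if (namespace && tags.length) {
                lines.push(namespace.textContent.trim() + ' ' + tags.join(', '));
            }
        });
        return lines.join('\n');
    }
    // The result could simply be appended to the info.txt content before zipping.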

Calculating extra data...

After downloading a fairly large gallery, the script gets stuck at the step below and stops; the browser becomes laggy and does not respond for a long time. Downloading small galleries is fine.

(In English: when I finish downloading a large book it stops working, but small books are OK.)

Generating Zip file...
Calculating extra data...

File Size: 130.1 MB
Length: 200 pages

Firefox

[Request] Auto retry option

Would it be possible to add an auto-retry checkbox so the script fetches new links and retries the failed images without asking each time? If you keep an eye on your download limit and on how many tabs you are downloading in, it isn't a problem, especially since it wouldn't be enabled by default anyway.
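A minimal sketch of an automatic retry with a capped attempt count; the wrapper function and the retry limit are illustrative, not the script's actual code:

    // Fetch an image with automatic retries before giving up
    // (needs "@grant GM_xmlhttpRequest").
    function fetchWithAutoRetry(url, maxRetries, onDone, onFail, attempt) {
        attempt = attempt || 0;
        GM_xmlhttpRequest({
            method: 'GET',
            url: url,
            responseType: 'arraybuffer',
            onload: function (res) { onDone(res.response); },
            onerror: retry,
            ontimeout: retry
        });
        function retry() {
            if (attempt < maxRetries) {
                // the real script would also request a fresh image link here
                fetchWithAutoRetry(url, maxRetries, onDone, onFail, attempt + 1);
            } else {
                onFail(url);  // fall back to the existing manual-retry prompt
            }
        }
    }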

Suggestion: add a time limit for retrying downloads

A gallery can be 99% downloaded with only a few images left that refuse to download, stuck in retrying for longer than the download itself took, which is simply a waste of time.
I suggest adding a time limit so that stuck images can be skipped.

Script doesn't work on some pages

I noticed I can't download from "http://exhentai.org/g/354095/7148834893/".
It happens on Azasuke Wind Coll. 13, 14, 15 and 16 too. I can download from any other page normally.
I did not hit any limit; it's just that nothing happens when I click the download "button".
Happens on win7 x64, ffox 40.0.3, ffox 41.0, greasemonkey 3.4.1.
Any idea how to solve this?

Retry if a download keeps pending for too long

Is there an option to consider a download failed after it has been pending for a certain amount of time?

I sometimes encounter the problem that some images fail to fetch (probably because the server is down, as they are hosted by other peers).

I have to refresh the page to retry the download, which costs a lot of Image Limits, especially when downloading a gallery that consists of a large number of images.

It seems this problem is associated with a script error:

Log

Failed to load resource: net::ERR_BLOCKED_BY_CLIENT  // (PURPOSEFULLY)
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT // (PURPOSEFULLY)
Failed to load resource: net::ERR_BLOCKED_BY_CLIENT // (PURPOSEFULLY)
[EHD] E-Hentai Downloader is running.
[EHD] Bugs Report > https://github.com/ccloli/E-Hentai-Downloader/issues | https://greasyfork.org/scripts/10379-e-hentai-downloader/feedback
[EHD] To report a bug, showing all the "[EHD]" logs is wonderful. =w=
[EHD] UserAgent > Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.80 Safari/537.36
[EHD] Script Handler > Tampermonkey
[EHD] GreaseMonkey / Tampermonkey Version > 3.12.58
[EHD] E-Hentai Downloader Version > 1.18.6
[EHD] E-Hentai Downloader Setting > {"thread-count":5,"timeout":30,"number-images":false,"number-real-index":false,"force-resized":false,"never-new-url":false,"never-send-nl":false,"store-in-fs":false}
[EHD] Current URL > http://g.e-hentai.org/g/(^_*)
[EHD] Is Logged In > true
[EHD] Index > 3  | RealIndex > 3  | Name > 02.jpg  | RetryCount > 0  | DownloadedCount > 1  | FetchCount > 5  | FailedCount > 0
(omit)
[EHD] Index > 19  | RealIndex > 19  | Name > 18.jpg  | RetryCount > 0  | DownloadedCount > 22  | FetchCount > 5  | FailedCount > 0
Uncaught TypeError: Cannot read property 'length' of undefined
[EHD] Index > 23  | RealIndex > 23  | Name > 22.jpg  | RetryCount > 0  | DownloadedCount > 23  | FetchCount > 5  | FailedCount > 0
(omit)
[EHD] Index > 29  | RealIndex > 29  | Name > 28.jpg  | RetryCount > 0  | DownloadedCount > 30  | FetchCount > 2  | FailedCount > 0

== Error traceback:

Uncaught TypeError: Cannot read property 'length' of undefinedfetchThread.(anonymous function).GM_xmlhttpRequest.onload @ VM7682:10108
(anonymous function) @ VM7676:59

VM7682:10108:
response: new ArrayBuffer(res.responseText.length),

==Debug Information

WTF?

JSON.stringify(res) is
{"readyState":4,"responseHeaders":"Date: Tue, 05 Jan 2016 13:43:56 GMT\r\nContent-Length: 0\r\nContent-Type: text/plain; charset=utf-8\r\n","finalUrl":"http://g.e-hentai.org/fullimg.php?(^_*)","status":500,"statusText":"Internal Server Error","responseType":"arraybuffer","response_types":{"response":false,"responseText":false,"responseXML":false}}

so res.response is undefined while

!res.response is true!

res.response==undefined is also true.

!(res.response==undefined) is false. (expected behavior?)

I will send you a pull request later if I can solve this myself.
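A minimal sketch of a guard that would avoid the crash above: when the server answers 500 with an empty body, res.response (and res.responseText) can be missing, so the handler should bail out instead of building an ArrayBuffer from res.responseText.length. The wrapper shape and callback names are assumptions, not the script's exact code:

    // Treat a missing body or a non-2xx status as a failed fetch
    // instead of crashing (needs "@grant GM_xmlhttpRequest").
    function fetchImageSafely(url, onSuccess, onFailure) {
        GM_xmlhttpRequest({
            method: 'GET',
            url: url,
            responseType: 'arraybuffer',
            onload: function (res) {
                if (res.status < 200 || res.status >= 300 || res.response == null) {
                    onFailure(res);          // e.g. requeue the image for a retry
                    return;
                }
                onSuccess(res.response);     // already an ArrayBuffer; no responseText needed
            },
            onerror: onFailure,
            ontimeout: onFailure
        });
    }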

Always offer to continue download after hitting image limit.

If you hit your image limit, a popup from the script asks if you want to spend credits to reset it. If you hit OK, there is a Continue button in the status window. If you hit Cancel, there is no Continue button (and it will then ask if you want to save the ZIP).

I tested by telling it I would reset, then just waiting a few hours, then hitting continue. The download did continue and the ZIP was fine.

Is there a reason to not always offer to continue?

Uploader Comment as zip comment

Can we have the uploader comment as the zip comment? Every now and then, when I'm reading something I downloaded a while ago, I notice it's part of a multi-part series or something similar, and it's a bother to look up the gallery again just to search the uploader comment for the reading order. So, could that comment be stored as the zip comment?
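A minimal sketch of attaching an archive-level comment with JSZip; the commentText variable stands in for the uploader comment scraped from the page, and the save step is left to whatever routine the script already uses:

    // JSZip's generateAsync() accepts a `comment` option for the archive.
    var zip = new JSZip();
    // ... zip.file(name, arrayBuffer) for each downloaded image ...
    var commentText = 'Uploader comment text scraped from the gallery page'; // placeholder
    zip.generateAsync({ type: 'blob', comment: commentText }).then(function (blob) {
        // hand `blob` to the existing save routine
    });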

Image count goes up way too fast

I noticed the image limit gets "consumed" really fast: I downloaded a zip with 31 pics, and the counter went up from 0/5000 to 265/5000.
What is happening?

(Pending) Timeout on some pages

[EHD] E-Hentai Downloader is running.
VM442:35 [EHD] Bugs Report > https://github.com/ccloli/E-Hentai-Downloader/issues | https://greasyfork.org/scripts/10379-e-hentai-downloader/feedback
VM442:36 [EHD] To report a bug, showing all the "[EHD]" logs is wonderful. =w=
VM442:9703 [EHD] UserAgent > Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36
VM442:9704 [EHD] Script Handler > Tampermonkey
VM442:9705 [EHD] GreaseMonkey / Tampermonkey Version > undefined
VM442:9706 [EHD] E-Hentai Downloader Version > 1.18.3
VM442:9707 [EHD] E-Hentai Downloader Setting > {}
VM442:9708 [EHD] Current URL > http://exhentai.org/g/245439/0dbe807aca/?p=1
VM442:9709 [EHD] Is Logged In > true
VM442:9898 [EHD] Index > 1  | RealIndex > 1  | Name > 00.jpg  | RetryCount > 0  | DownloadedCount > 1  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 3  | RealIndex > 3  | Name > 02.jpg  | RetryCount > 0  | DownloadedCount > 2  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 2  | RealIndex > 2  | Name > 01.jpg  | RetryCount > 0  | DownloadedCount > 3  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 5  | RealIndex > 5  | Name > 04.jpg  | RetryCount > 0  | DownloadedCount > 4  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 6  | RealIndex > 6  | Name > 05.jpg  | RetryCount > 0  | DownloadedCount > 5  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 9  | RealIndex > 9  | Name > 08.jpg  | RetryCount > 0  | DownloadedCount > 6  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 11  | RealIndex > 11  | Name > 10.jpg  | RetryCount > 0  | DownloadedCount > 7  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 12  | RealIndex > 12  | Name > 11.jpg  | RetryCount > 0  | DownloadedCount > 8  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 8  | RealIndex > 8  | Name > 07.jpg  | RetryCount > 0  | DownloadedCount > 9  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 14  | RealIndex > 14  | Name > 13.jpg  | RetryCount > 0  | DownloadedCount > 10  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 7  | RealIndex > 7  | Name > 06.jpg  | RetryCount > 0  | DownloadedCount > 11  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 15  | RealIndex > 15  | Name > 14.jpg  | RetryCount > 0  | DownloadedCount > 12  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 13  | RealIndex > 13  | Name > 12.jpg  | RetryCount > 0  | DownloadedCount > 13  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 16  | RealIndex > 16  | Name > 15.jpg  | RetryCount > 0  | DownloadedCount > 14  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 18  | RealIndex > 18  | Name > 17.jpg  | RetryCount > 0  | DownloadedCount > 15  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 17  | RealIndex > 17  | Name > 16.jpg  | RetryCount > 0  | DownloadedCount > 16  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 21  | RealIndex > 21  | Name > 20.jpg  | RetryCount > 0  | DownloadedCount > 17  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 22  | RealIndex > 22  | Name > 21.jpg  | RetryCount > 0  | DownloadedCount > 18  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 4  | RealIndex > 4  | Name > 03.jpg  | RetryCount > 0  | DownloadedCount > 19  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 23  | RealIndex > 23  | Name > 22.jpg  | RetryCount > 0  | DownloadedCount > 20  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 24  | RealIndex > 24  | Name > 23.jpg  | RetryCount > 0  | DownloadedCount > 21  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 10  | RealIndex > 10  | Name > 09.jpg  | RetryCount > 0  | DownloadedCount > 22  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 27  | RealIndex > 27  | Name > 26.jpg  | RetryCount > 0  | DownloadedCount > 23  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 25  | RealIndex > 25  | Name > 24.jpg  | RetryCount > 0  | DownloadedCount > 24  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 28  | RealIndex > 28  | Name > 27.jpg  | RetryCount > 0  | DownloadedCount > 25  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 30  | RealIndex > 30  | Name > 29.jpg  | RetryCount > 0  | DownloadedCount > 26  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 29  | RealIndex > 29  | Name > 28.jpg  | RetryCount > 0  | DownloadedCount > 27  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 32  | RealIndex > 32  | Name > 31.jpg  | RetryCount > 0  | DownloadedCount > 28  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 31  | RealIndex > 31  | Name > 30.jpg  | RetryCount > 0  | DownloadedCount > 29  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 33  | RealIndex > 33  | Name > 32.jpg  | RetryCount > 0  | DownloadedCount > 30  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 34  | RealIndex > 34  | Name > 33.jpg  | RetryCount > 0  | DownloadedCount > 31  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 36  | RealIndex > 36  | Name > 35.jpg  | RetryCount > 0  | DownloadedCount > 32  | FetchCount > 5  | FailedCount > 0
VM442:9898 [EHD] Index > 35  | RealIndex > 35  | Name > 34.jpg  | RetryCount > 0  | DownloadedCount > 33  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 26  | RealIndex > 26  | Name > 25.jpg  | RetryCount > 0  | DownloadedCount > 34  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 38  | RealIndex > 38  | Name > 37.jpg  | RetryCount > 0  | DownloadedCount > 35  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 37  | RealIndex > 37  | Name > 36.jpg  | RetryCount > 0  | DownloadedCount > 36  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 40  | RealIndex > 40  | Name > 39.jpg  | RetryCount > 0  | DownloadedCount > 37  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 39  | RealIndex > 39  | Name > 38.jpg  | RetryCount > 0  | DownloadedCount > 38  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 42  | RealIndex > 42  | Name > 41.jpg  | RetryCount > 0  | DownloadedCount > 39  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 43  | RealIndex > 43  | Name > 42.jpg  | RetryCount > 0  | DownloadedCount > 40  | FetchCount > 5  | FailedCount > 0
(program):9898 [EHD] Index > 44  | RealIndex > 44  | Name > 43.jpg  | RetryCount > 0  | DownloadedCount > 41  | FetchCount > 4  | FailedCount > 0
(program):9898 [EHD] Index > 41  | RealIndex > 41  | Name > 40.jpg  | RetryCount > 0  | DownloadedCount > 42  | FetchCount > 3  | FailedCount > 0
(program):9898 [EHD] Index > 20  | RealIndex > 20  | Name > 19.jpg  | RetryCount > 0  | DownloadedCount > 43  | FetchCount > 2  | FailedCount > 0
VM442:10319 [EHD] #19: Timed Out
VM442:10320 [EHD] #19: RealIndex > 19  | ReadyState > undefined  | Status > undefined  | StatusText > undefined
ResposeHeaders >undefined
VM442:10064 [EHD] Index > 19  | RealIndex > 19  | Name > 18.jpg  | RetryCount > 0  | DownloadedCount > 43  | FetchCount > 1  | FailedCount > 0
failedFetching @ VM442:10064
fetchThread.(anonymous function).GM_xmlhttpRequest.ontimeout @ VM442:10323
(anonymous function) @ VM437:54
setTimeout (async)
backup.safeWindow.(anonymous function) @ VM434:1
k @ VM437:54
(anonymous function) @ VM437:57
f.notifyListeners @ VM437:28
(anonymous function) @ VM437:28
Context.chromeEmu.e.runConnectResponse @ VM437:27
(anonymous function) @ VM2180:2
(anonymous function) @ VM2180:2
copy.exec @ VM437:1
copy.Eventing.b @ VM437:8
copy.Eventing.a.eventHandlerPage @ VM437:12
a.standardEventSource.element.dispatchEvent @ content.js:25
a.fireEvent @ content.js:27
g.sendMessage @ content.js:10
b.onConnectResponse @ content.js:19
(anonymous function) @ content.js:21
EventImpl.dispatchToListener @ VM403 extensions::event_bindings:387
publicClass.(anonymous function) @ VM409 extensions::utils:94
EventImpl.dispatch_ @ VM403 extensions::event_bindings:371
EventImpl.dispatch @ VM403 extensions::event_bindings:393
publicClass.(anonymous function) @ VM409 extensions::utils:94
dispatchOnMessage @ VM402 extensions::messaging:310

When downloading a large gallery (>= 600 MB)

When the downloader finishes and asks me where to save the zip file, I save it. It downloads the zip file to my computer, but for some reason the archive has no file size, if I remember correctly. I am pretty sure my RAM can handle gallery sizes of 600 MB and up. I am using the Chrome browser.

The download button in the middle of the page has disappeared

Over the past couple of days I suddenly noticed that the download box in the middle of the page is gone...
E-Hentai Downloader 1.21.3
Chrome version 52.0.2723.2 dev-m (64-bit)

Not Showing

In the latest version, nothing is showing here at all (see the screenshot attached to the original issue).

A very serious bug

Symptoms: I batch-downloaded a big pile of zips and used WinRAR's "Extract each archive to separate folder". The resulting folders are corrupted: they cannot be renamed and cannot be deleted.
The images inside can still be viewed with Windows' built-in viewer and can be moved; only the folders are broken.

A few days ago this bug appeared once and I thought my hard drive had failed. I downloaded 360's force-delete tool, but it couldn't remove the folders either; they could only be cleared by deleting them together with their parent directory.

Then today I downloaded a zip and noticed that my settings had been reset; the downloaded archive used the English naming and contained a subdirectory with a garbled name.
So for the second gallery I changed the settings:

Set folder name as: /
Set Zip file name as: {subtitle}
Retry automatically when images download failed
Force download resized image (never download original image)

I checked these four options.
After downloading a large batch of galleries, the problem appeared:
the first gallery was fine, but for every later gallery the folders again could not be deleted or renamed.

I can't find the broken zip files any more; I deleted them right after extracting... I downloaded one again and that copy was fine...
I suspect the problem is in the folder name.
Please take this bug seriously.

Classification according to EH

Since it's easier to do this from the script, I'm writing this issue.

Could you add a marker for the gallery's classification (category), according to EH? Since the script can't sort the downloaded files into folders, it would be way easier to do this on my side if the first characters of the archive name were the classification, something like this:

{DO}[Mariana Kaikou Kikaku (Mikami Hokuto)] Uchi, Nandemo Shimasu kara (Guilty Gear XX) [Digital]
{AR}[Ecolonun (Numeko)] Mahou Shoujo Reina
{MA}[Takara Akihito] Koiiro Girls Soutennenshoku [Digital]
{MI}[Brandon Santiago] Erma (Ongoing)
{NO}[Sung San-Young] The Gamer Ch. 72-136 (English) (Ongoing)
{WE}[Area] Between Friends (Ongoing)
{CO}[AmieChan] Rainbow Dash
{GA}[Noesis] Free Friends 2 [Decensored]
{IM}「bluefield」 - Artist

Then, after the downloads, it would be easier to sort them, and only then run a script to erase those prefixes (a sketch of this kind of prefixing follows below). The gallery page should already have the information needed.


:(

Sorting ~1000 folders is hard.
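A minimal sketch of that kind of category prefixing; the #gdc selector and the two-letter code table are assumptions, not confirmed page structure:

    // Map the gallery category to a short code and prepend it to the
    // archive name, e.g. "{DO}[Artist] Title".
    var categoryCodes = {
        'Doujinshi': 'DO', 'Manga': 'MA', 'Artist CG': 'AR', 'Game CG': 'GA',
        'Western': 'WE', 'Non-H': 'NO', 'Image Set': 'IM', 'Cosplay': 'CO',
        'Asian Porn': 'AS', 'Misc': 'MI'
    };
    function prefixedZipName(baseName) {
        var categoryNode = document.querySelector('#gdc');        // assumed category element
        var category = categoryNode ? categoryNode.textContent.trim() : '';
        var code = categoryCodes[category];
        return code ? '{' + code + '}' + baseName : baseName;
    }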

Page refreshing problem

The warning popup that used to appear when refreshing the page during a download seems to have stopped working.

Hi ccloli, since the version updated on October 9, refreshing or navigating the page during a download no longer shows the confirmation prompt; the browser jumps straight to the new page, and the previous download has to start over from the beginning. TvT
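A minimal sketch of the usual way to keep that prompt working; the downloadInProgress flag is an illustrative stand-in for the script's own state:

    // Ask for confirmation before the page is unloaded mid-download.
    var downloadInProgress = true; // placeholder for the script's real state flag
    window.addEventListener('beforeunload', function (event) {
        if (downloadInProgress) {
            event.preventDefault();
            event.returnValue = '';    // Chrome requires returnValue to show the prompt
        }
    });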

About checking for script updates

Pixiv++ looks like a really powerful script, and it even comes with built-in update checking. Regarding the earlier idea of putting the libraries into @require: you could check whether those libraries have been updated every time you bump your own version.
(screenshot attached to the original issue)

Wrong file extension

Sometimes the file is actually a PNG, but the script saves it as .jpg, which causes problems in some image viewers.
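A minimal sketch of picking the extension from the response instead of from the URL; the function name and the fallback order are illustrative:

    // Decide the extension from the Content-Type header, falling back to
    // the file's magic bytes, instead of trusting the page's file name.
    function detectExtension(contentType, arrayBuffer) {
        if (/image\/png/i.test(contentType || '')) return '.png';
        if (/image\/gif/i.test(contentType || '')) return '.gif';
        if (/image\/jpe?g/i.test(contentType || '')) return '.jpg';
        var bytes = new Uint8Array(arrayBuffer.slice(0, 4));
        if (bytes[0] === 0x89 && bytes[1] === 0x50) return '.png'; // 0x89 'P' (PNG signature)
        if (bytes[0] === 0x47 && bytes[1] === 0x49) return '.gif'; // 'G' 'I' (GIF signature)
        return '.jpg';
    }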

Downloading problem

I don't know why, but every time several images are downloading at once, a few of them always end up at 0 kB/s, and retrying still fails. The same thing happens in both Firefox and Chrome. Orz

Any way to download using this script even if image limit is reached?

So I'm at 37,000+ against a limit of 5,000, it would take 76k credits to reset, and I don't know when it will ever go back below the limit.
I have about 8k credits right now. I use this script mostly when there are no torrents available.
So yeah, my fault for downloading large galleries in a single day, but some of them didn't even finish downloading because of the image limit.

Showing on Bottom?

I'm using the latest version of the script with the latest version of Chromium, and the script keeps showing at the bottom of the page, after the comments. In Firefox it shows up as usual.
(screenshot 3MPcdz7 attached to the original issue)

I even tried copying the Firefox script over to Chromium, and it still appears after the comments.

stuck fetching images

I don't know why, but I just noticed today that the script gets stuck fetching images, with no progress at all.
Fetching Image 1: page1.png ...
Fetching Image 2: page2.png ...
Fetching Image 3: page3.png ...
Fetching Image 4: page4.png ...
Fetching Image 5: page5.png ...

Then nothing. I don't know what the problem could be. How can I check?

[Request] Add a pause/resume button

Can you please add a pause/resume button to the download status mini screen?

Sometimes, when the gallery is big and more than 30% or 50% has already been downloaded, I remember something more important, so I have to close the tab and download the gallery again later. But my image limits have already been consumed by the previous partial download. So a pause/resume button would be appreciated.
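A minimal sketch of the kind of pause flag the fetch loop could honour; all names here are illustrative:

    // A simple pause switch: the fetch loop checks `paused` before
    // starting the next image, and the resume button restarts the loop.
    var paused = false;
    var queue = [];                        // placeholder for the remaining image list
    function togglePause() {
        paused = !paused;
        if (!paused) fetchNext();          // resume picks up where it left off
    }
    function fetchNext() {
        if (paused || queue.length === 0) return;
        var image = queue.shift();
        // ... fetch `image`, then call fetchNext() again from its onload ...
    }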

Doesn't work with Safari

The download finished, but the zip is not saved: a blank page popped up, and nothing happened when I clicked the "Not download?" button.

Version: 1.19.9
Safari: Version 9.1 (11601.5.17.1)

Split the download of large galleries

Some large galleries have no torrent, and currently E-Hentai Downloader cannot handle them: it downloads the images, but the save dialog never appears.

A way to download large galleries in segments would let you spread a download over several days if the image limit requires it.
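A minimal sketch of splitting a gallery into fixed-size page segments that could be downloaded on different days; the sizes and function name are illustrative:

    // Split a page range into segments, e.g. a 600-page gallery into
    // runs of 100 pages each.
    function splitIntoSegments(totalPages, segmentSize) {
        var segments = [];
        for (var start = 1; start <= totalPages; start += segmentSize) {
            segments.push({ start: start, end: Math.min(start + segmentSize - 1, totalPages) });
        }
        return segments;
    }
    // splitIntoSegments(600, 100) -> [{start: 1, end: 100}, ..., {start: 501, end: 600}]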

Feature request: (1) convert invalid half-width characters to valid full-width ones; (2) allow editing the file name before downloading

A filename cannot contain any of the following characters:
\ / : * ? " < > |
Some galleries have these characters in their title and subtitle.
Instead of turning such characters into "-" when building the file name, could an option be added that automatically converts each invalid character to its full-width equivalent?
For example:
https://exhentai.org/g/966727/6258cf46a9/
Original title with the illegal "?" character:
(C90) [オセロアイス (shuz)] おおきいけれどいいですか?
Converted to the legal full-width "？" character:
(C90) [オセロアイス (shuz)] おおきいけれどいいですか？

Also, before the download starts, could the "E-Hentai Downloader | Image Limits: 0/5000" box
show two confirmation fields for the folder name and the zip file name,
pre-filled according to the settings,
so the user can tweak the details if needed?
For example:
https://exhentai.org/g/589459/a442012cdd/
Current settings:
folder name: {subtitle}
zip file name: {subtitle}
So they would be:
folder name: (C83) [うすべに屋 (うすべに桜子)] 東方足祭 (東方Project) [**翻訳] [琉璃神社★汉化]
zip file name: (C83) [うすべに屋 (うすべに桜子)] 東方足祭 (東方Project) [**翻訳] [琉璃神社★汉化]
The downloader might want to change them to:
folder name: (C83) [うすべに屋 (よろず)] 東方足祭 (東方Project) [**翻訳]
zip file name: (C83) [うすべに屋 (よろず)] 東方足祭 (東方Project) [**翻訳]
That way there is no need to extract, rename and repack afterwards.
Roughly, the idea looks like this:
(mock-up screenshot attached to the original issue)
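A minimal sketch of the requested full-width conversion; the illegal-character set is the Windows one listed above, and the mapping simply shifts each code point into the full-width block:

    // Replace each character that is illegal in Windows file names with
    // its full-width counterpart (code point + 0xFEE0).
    function toFullWidthName(name) {
        return name.replace(/[\\\/:*?"<>|]/g, function (ch) {
            return String.fromCharCode(ch.charCodeAt(0) + 0xFEE0);
        });
    }
    // toFullWidthName('おおきいけれどいいですか?') -> 'おおきいけれどいいですか？'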

Few Suggestions

Would you consider the following:

  1. Add the numbering option beside or under the "Download Archive" option. Most galleries are numbered, but a few aren't, so in most cases you need a quick way to switch numbering on and off.

  2. Usually the first comment is the "Uploader Comment", which contains useful information about the gallery. Would you consider including it in the info file? I already added it in the example at line 11256. Would you also consider making the info file an HTML file, to allow formatting of the output, including the comment?

  3. Rename the info file so it always sorts to the top of the archive. Right now it ends up at the bottom if you enable numbering, and in a large archive you won't spot it unless you scroll all the way down.

Thank you
