Duplicate download checking + fix startFromOffset #183
Conversation
(force-pushed from 97ba4b3 to e16378c)
Good news: the duplicate download checking appears to be working great. Bad news: the startFromOffset changes appear to have broken something else. Where it previously found all ~2200 books in my account with no pagination parameters (just --baseUrl), it now finds only 200 and then stops: "Found 200 books in total".
Can confirm the same: the skipping of existing downloads works, but the offset fix does not. Checking out e16378c alone was a huge improvement, so I wonder if it's worth reverting the offset fix into its own PR.
Thanks for the reports! I
I think I found the culprit of only downloading 200 books. It was a simple fix, so I'll keep it in this PR for now. @jsonbecker / @HamsterExAstris, I would really appreciate it if either of you could pull the latest and retest before I merge.
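The thread doesn't show the actual patch, so purely as a hypothetical illustration of how an offset bug can cap results at one page size: if the pagination loop seeds its cursor from startFromOffset but never advances it (or advances the wrong variable), every iteration refetches the same first page and the client stops at 200 books. A minimal TypeScript sketch; `fetchPage`, `Book`, and the response shape are all invented for the example and are not this project's real API.

```typescript
// Hypothetical pagination loop -- fetchPage(), Book, and pageSize are
// invented for illustration; they are not this project's real code.
interface Book { asin: string; title: string; }

async function fetchPage(baseUrl: string, offset: number, limit: number): Promise<Book[]> {
  const res = await fetch(`${baseUrl}/books?offset=${offset}&limit=${limit}`);
  return (await res.json()) as Book[];
}

async function listAllBooks(baseUrl: string, startFromOffset = 0): Promise<Book[]> {
  const pageSize = 200;
  const books: Book[] = [];
  let offset = startFromOffset;
  for (;;) {
    const page = await fetchPage(baseUrl, offset, pageSize);
    books.push(...page);
    if (page.length < pageSize) break; // short page means we hit the end
    // The bug class described above: forgetting this line (or advancing
    // startFromOffset instead of offset) stalls the loop on page one,
    // so only the first pageSize books are ever found.
    offset += page.length;
  }
  return books;
}
```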
I didn't actually redownload, but it does appear to find the full count of books now (and didn't download them, because I already have them).
…queness across editions (force-pushed from 631a400 to bb32cd4)
Can confirm the current version of the branch fixes the 200-book limit. It looks like the duplicate handling is working too. I didn't think it was at first, but that's because the file names changed to append the ASIN. I think that's a good idea, because it fixes other issues reported when a user has multiple books with the same title, but it might cause some confusion over why duplicates aren't being skipped.
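For anyone puzzled by duplicates "not being skipped": appending the ASIN makes each edition's filename unique, so files saved under an older title-only scheme no longer match the new names. A hypothetical sketch of such a naming scheme; `makeFileName` and the exact format are assumptions, not the project's actual code.

```typescript
// Hypothetical filename scheme -- the real project's format may differ.
// Appending the ASIN keeps two books with the same title from colliding,
// but files saved under an older title-only scheme won't be recognized
// as existing downloads.
function makeFileName(title: string, asin: string, ext = "mobi"): string {
  const safeTitle = title.replace(/[\\/:*?"<>|]/g, "_"); // strip characters invalid in filenames
  return `${safeTitle} (${asin}).${ext}`;
}

console.log(makeFileName("Dune", "B0EXAMPLE1"));
// "Dune (B0EXAMPLE1).mobi"
```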
The latest version works for me. Was able to download 3800 books with only 63 failures. |
Changes:
- Added `--duplicateHandling`, with options for either `skip` or `overwrite`, to control the duplicate download check
- Changed `--totalDownloads` to be `Infinity`
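A rough sketch of what those defaults could look like in a Node/TypeScript CLI. The flag names come from this PR's description; the parsing code itself, including `parseOptions` and the argv shape, is assumed for illustration.

```typescript
// Hypothetical option handling -- flag names are from the PR description,
// everything else is assumed for illustration.
type DuplicateHandling = "skip" | "overwrite";

interface Options {
  duplicateHandling: DuplicateHandling; // controls the duplicate download check
  totalDownloads: number;               // Infinity means "download everything"
}

function parseOptions(argv: Record<string, string | undefined>): Options {
  const duplicateHandling = (argv["duplicateHandling"] ?? "skip") as DuplicateHandling;
  if (duplicateHandling !== "skip" && duplicateHandling !== "overwrite") {
    throw new Error(`--duplicateHandling must be "skip" or "overwrite"`);
  }
  const totalDownloads = argv["totalDownloads"] !== undefined
    ? Number(argv["totalDownloads"])
    : Infinity; // no cap unless the user sets one
  return { duplicateHandling, totalDownloads };
}
```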