I have an idea for an unofficial URL syntax expansion that could also work for other sites with similar functionality:
It's unofficial because this format doesn't work on DA's end, so RipMe has to assume 24 items per page and start from offset 0. I researched the values: starting from offset 0 guarantees getting the gallery contents from the latest upload onwards (assuming the uploads haven't been reordered), and stepping the offset in increments of 24 guarantees getting everything, as long as the gallery contents don't change while the rip is in progress. Just keep incrementing the offset until a page returns fewer than 24 hits; that page is the last one. (The webpage size also drops dramatically when the item count drops.)
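A minimal sketch of that paging loop in Java, under the assumptions above; fetchItemCount() is a hypothetical helper (not RipMe's actual API) that would download a page and count its items:

```java
// Walk the gallery 24 items at a time, starting at offset 0,
// stopping when a page returns fewer than 24 hits.
public class OffsetWalk {
    static final int PAGE_SIZE = 24;

    static void ripAll(String base) {
        int offset = 0;
        while (true) {
            // e.g. http://USERNAME.deviantart.com/gallery/?catpath=/&offset=48
            int hits = fetchItemCount(base + "&offset=" + offset);
            if (hits < PAGE_SIZE) {
                break; // a short page is the last page
            }
            offset += PAGE_SIZE;
        }
    }

    static int fetchItemCount(String url) {
        // placeholder: download the page and count the thumbnails on it
        return 0;
    }
}
```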
Since
http://USERNAME.deviantart.com/gallery/?catpath=/&offset=XX
http://USERNAME.deviantart.com/favourites/?catpath=/&offset=XX
(XX is an integer >=0) are the links used to fetch the entire gallery/favs without regard to folders, the XX could be expanded as follows (a parsing sketch follows the list):
last, to get only the last page
first and/or firstonly, to get only the first page
since# (# is an integer >=0, divisible by 24) to start inclusively from a specific offset and go onwards, incrementing the offset. For example, since10 fails (not a multiple of 24), but since24 starts from offset 24, inclusive, and goes up to whatever page is the last (or the first, in the sense that the last page holds the oldest upload in an unreordered folder) as normal.
sincepage# (# is an integer >=1) to start inclusively from a specific page and go onwards, incrementing the page number. For example, sincepage10 starts from page 10, inclusive, and goes up to whatever page is the last as normal.
before# (# is an integer >=24, divisible by 24) to start from a specific offset minus 24 and go backwards, decrementing the offset. For example, before10 fails, but before96 starts from offset 72, inclusive, and goes down to offset 0.
beforepage# (# is an integer >=2) to start from a specific page minus 1 and go backwards, decrementing the page number. For example, beforepage10 starts from page 9, inclusive, and goes down to page 1 (offset 0).
only# (# is an integer >=0) for taking only that offset.
onlypage# (# is an integer >=1) for taking only that page.
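Here is a rough sketch of how the proposed XX values could be classified. The names (SuffixParser, Mode) are hypothetical, not anything RipMe currently implements:

```java
// Hypothetical classifier for the proposed XX suffix values.
public class SuffixParser {
    enum Mode { PLAIN_OFFSET, LAST, FIRST, SINCE_OFFSET, SINCE_PAGE,
                BEFORE_OFFSET, BEFORE_PAGE, ONLY_OFFSET, ONLY_PAGE }

    static Mode parse(String xx) {
        if (xx.matches("\\d+"))                  return Mode.PLAIN_OFFSET;  // existing numeric offset
        if (xx.equals("last"))                   return Mode.LAST;
        if (xx.equals("first") || xx.equals("firstonly")) return Mode.FIRST;
        // check the longer "…page" forms before their shorter prefixes
        if (xx.matches("sincepage[1-9]\\d*"))    return Mode.SINCE_PAGE;
        if (xx.matches("since\\d+"))             return Mode.SINCE_OFFSET;  // value must be divisible by 24
        if (xx.matches("beforepage\\d+"))        return Mode.BEFORE_PAGE;   // value must be >= 2
        if (xx.matches("before\\d+"))            return Mode.BEFORE_OFFSET; // value must be >= 24, divisible by 24
        if (xx.matches("onlypage[1-9]\\d*"))     return Mode.ONLY_PAGE;
        if (xx.matches("only\\d+"))              return Mode.ONLY_OFFSET;
        throw new IllegalArgumentException("unrecognised offset value: " + xx);
    }
}
```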
How to act:
Given last: go to offset 0, find the last page and use the found link or offset to get just that page.
Given either of the since# forms: go to offset 0, get the list of pages, use the 24-items-per-page assumption to fill in the missing ones from start to end, then start inclusively from the requested offset (or the requested page's offset) and go forward (incrementing).
Given either of the before# forms: go to offset 0, get the list of pages, fill in the missing ones the same way, then start inclusively from one page below the requested offset/page and go backward (decrementing). A rough sketch of this resolution follows.
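A hedged sketch of how the modes could be turned into a start offset and direction, reusing the hypothetical names from the parsing sketch above and assuming lastOffset has already been found by walking from offset 0:

```java
// Hypothetical resolution of a parsed mode into {startOffset, step}.
// step == 0 means rip only that single page; otherwise keep adding step
// until the offset passes lastOffset (going up) or drops below 0 (going down).
public class OffsetResolver {
    static int[] offsetsFor(SuffixParser.Mode mode, int n, int lastOffset) {
        switch (mode) {
            case LAST:          return new int[]{lastOffset, 0};
            case FIRST:         return new int[]{0, 0};
            case SINCE_OFFSET:  return new int[]{n, 24};             // n is already a 24-divisible offset
            case SINCE_PAGE:    return new int[]{(n - 1) * 24, 24};  // page n starts at offset (n-1)*24
            case BEFORE_OFFSET: return new int[]{n - 24, -24};       // one page below the given offset
            case BEFORE_PAGE:   return new int[]{(n - 2) * 24, -24}; // page n-1 starts at offset (n-2)*24
            case ONLY_OFFSET:   return new int[]{n, 0};
            case ONLY_PAGE:     return new int[]{(n - 1) * 24, 0};
            default:            return new int[]{0, 24};             // plain offset: rip everything forward
        }
    }
}
```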