barefootguru Posted October 15, 2017
Hi, I like to keep areas around home in offline lists and up to date, so they're always available for unplanned caching where there's no cell coverage. I currently have 6 PQs with around 800 caches each, covering different geographic areas (some overlap). Every week I delete the 6 old offline lists and download the new PQs.
Wondering if a better workflow would be a one-off combining of the offline lists into a single list, then weekly using Cachly's Update Caches function on all 5,000 caches, plus importing a single PQ into that list to pick up any new caches placed. Thinking this would save quite a bit of time, plus allow me to view all the offline caches at once. I guess I'd have to keep the original PQs around in case I needed to start again in Cachly. Interested in thoughts.
ChrisDen Posted October 15, 2017
I do something similar to what you are suggesting: combine the PQs. The problem I have with that method is how to identify archived caches so they can be deleted. It takes too long to run an update.
Bolling Posted October 15, 2017
47 minutes ago, ChrisDen said:
"I do something similar to what you are suggesting: combine the PQs. The problem I have with that method is how to identify archived caches so they can be deleted. It takes too long to run an update."
Once you import new pocket queries you should be able to delete archived caches with a Date Last Updated filter. This is what I do in GSAK. The filter doesn't appear to work properly in the latest beta. I've reported that issue here: http://www.cach.ly/support/index.php?/topic/882-date-last-updated-filter/
Bolling Posted October 17, 2017
OK, Nic said it should be the "Date Downloaded" filter. So: import PQs to update caches; filter on Date Downloaded, set prior to the date of the PQ import; then either delete the caches in the filter, or update them to confirm they are archived and then filter on archived caches to delete.
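The logic of that filter can be sketched in a few lines. This is only an illustration, not Cachly's or GSAK's actual data model; the `date_downloaded` field, the record layout, and the GC codes are all assumptions made up for the example.

```python
from datetime import date

# Hypothetical cache records -- real apps store far more fields, but only
# the download date matters for this filter.
caches = [
    {"code": "GC11111", "date_downloaded": date(2017, 10, 17)},  # refreshed by latest PQ
    {"code": "GC22222", "date_downloaded": date(2017, 10, 10)},  # stale: not in latest PQ
    {"code": "GC33333", "date_downloaded": date(2017, 10, 17)},
]

def archived_candidates(caches, pq_import_date):
    """Caches not refreshed by the latest PQ import are likely archived,
    since PQs only return active caches."""
    return [c["code"] for c in caches if c["date_downloaded"] < pq_import_date]

print(archived_candidates(caches, date(2017, 10, 17)))  # ['GC22222']
```

The key idea is that a pocket query never returns archived caches, so anything whose download date predates the import stands out as a deletion candidate.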
barefootguru Posted October 17, 2017
Thanks guys, things to think about.
Kyzabra Posted November 21, 2017
On 10/15/2017 at 6:03 PM, barefootguru said:
"Hi, I like to keep areas around home in offline lists and up to date, so they're always available for unplanned caching where there's no cell coverage. I currently have 6 PQs with around 800 caches each, covering different geographic areas (some overlap). Every week I delete the 6 old offline lists and download the new PQs. Wondering if a better workflow would be a one-off combining of the offline lists into a single list, then weekly using Cachly's Update Caches function on all 5,000 caches, plus importing a single PQ into that list to pick up any new caches placed."
I run a database with 12,000 caches, and have just started experimenting with the Dropbox import feature. I'll start by saying I have my PQs done in date order rather than by geographic area, to avoid overlaps and potential misses. What I do is run GSAK and import the PQs into that, then manage my database there. From GSAK, I export the CSV to a Dropbox folder, and from Cachly I import that into a fresh list each week. I then have the ability to download the most recent date PQ (which I run daily) and update this list with the new caches as needed. It gives me a lot of control over the caches I bring in (also allowing for archived caches that are still there), plus it means that I can bring all my finds in (complete with cache notes), so when I get a PAF call I have my notes on me and can help out a fellow cacher.
The downfall with the workflow you suggested is that it doesn't update caches that HAVE been archived and are gone. And the Update Caches feature is VERY SLOW on bulk lists, as it only returns 50 caches at a time then waits 60 seconds (I think).
Nic Hubbard Posted November 21, 2017
3 minutes ago, Kyzabra said:
"And the Update Caches feature is VERY SLOW on bulk lists, as it only returns 50 caches at a time then waits 60 seconds (I think)"
No, it can perform up to 20 requests per 60 seconds, each with 50 caches in the request. Cachly just has to wait for all of those to complete, and if more requests are needed it does have to wait 60 seconds.
Kyzabra Posted November 28, 2017
On 11/22/2017 at 6:57 AM, Nic Hubbard said:
"No, it can perform up to 20 requests per 60 seconds, each with 50 caches in the request. Cachly just has to wait for all of those to complete, and if more requests are needed it does have to wait 60 seconds."
Can you clarify this for me, Nic? Are you saying that it has the potential to return up to 20 groups of 50 caches per 60 seconds? I am not getting this. If I have 10,000 caches, how long "should" this take?
Team DEMP Posted November 28, 2017
4 hours ago, Kyzabra said:
"Are you saying that it has the potential to return up to 20 groups of 50 caches per 60 seconds? I am not getting this. If I have 10,000 caches, how long "should" this take?"
Yes - 20 requests in 60 seconds, with each request covering 50 caches. That means 1,000 caches per minute, and then it needs to pause 1 minute before it can start the next batch. So with 10,000 caches it would be:
Min 1 - 1,000
Min 2 - Pause
Min 3 - 2,000
Min 4 - Pause
Min 5 - 3,000
Min 6 - Pause
...
Min 17 - 9,000
Min 18 - Pause
Min 19 - 10,000
Min 20 - Pause
Min 21 - 11,000
Assuming everything goes as planned.
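The schedule above reduces to a small calculation. A minimal sketch, assuming the limits stated in this thread (a batch of requests per active minute, 50 caches per request, and a full 60-second pause between batches); this is not Cachly's actual code, just the arithmetic.

```python
import math

def update_time_minutes(num_caches, requests_per_minute=20, caches_per_request=50):
    """Estimate wall-clock minutes to update a list: one active minute per
    batch, plus a one-minute pause between consecutive batches."""
    per_batch = requests_per_minute * caches_per_request  # caches per active minute
    batches = math.ceil(num_caches / per_batch)
    return batches + (batches - 1)  # active minutes + pauses between them

print(update_time_minutes(10_000))      # 19 -- matches the schedule above
print(update_time_minutes(10_000, 30))  # 13 -- at 30 requests/minute
```

At 20 requests per minute that gives 19 minutes for 10,000 caches, which matches the minute-by-minute breakdown.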
ChrisDen Posted November 28, 2017
Except you will hit the daily limit before you get to 11,000.
Nic Hubbard Posted November 28, 2017
8 hours ago, Kyzabra said:
"Are you saying that it has the potential to return up to 20 groups of 50 caches per 60 seconds? I am not getting this. If I have 10,000 caches, how long "should" this take?"
Cachly can perform 30 requests to the API per 60 seconds, so you can potentially return 1,500 updated caches every 60 seconds.
Team DEMP Posted November 28, 2017
So updating to 30 requests per minute with 50 caches per request...
Min 1 - 1,500 cache updates
Min 2 - Pause
Min 3 - 3,000
Min 4 - Pause
Min 5 - 4,500
Min 6 - Pause
Min 7 - 6,000
Min 8 - Pause
Min 9 - 7,500
Min 10 - Pause
Min 11 - 9,000
Min 12 - Pause
Min 13 - 10,500 <--- hit your daily limit
Min 14 - Pause
Min 15 - 12,000
Kyzabra Posted November 28, 2017
Not sure what the cause is, but I ran an update today on 5,792 caches, running the update on "Filtered Caches". It ran for 18 minutes before crashing. I have checked the crash folder but can't see a crash log.
rragan Posted November 29, 2017
Are you low on space, without enough room to write the log?
Kyzabra Posted November 29, 2017
1 hour ago, rragan said:
"Are you low on space, without enough room to write the log?"
Nope, plenty of room.