A follow-up on the memory issue: I did another test with a borrowed camera that had free_block_max_size=328k. It began downloading ~300 photos but aborted after saving ~160; I didn't catch any error message.
Without knowing what failed, it's impossible to say. A memory issue seems likely, but of course there could be other bugs too.
I haven't done a lot of testing with large numbers of files.
I have some ideas for improving the memory situation; I'll try to look into it when I get some time. If someone else wants to play with it, here are some of the issues:
1) The listing process gathers information about the files (path, stat information) and collects it into an array. When the array reaches batch_size, the whole thing is serialized and turned into a message. You could monitor free memory (with get_meminfo) and force a batch to be sent early if memory is low.
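A minimal sketch of that early-flush idea. get_meminfo is the CHDK function mentioned above; flush_batch and MIN_FREE are made-up names standing in for whatever the batcher actually uses:

```lua
-- threshold below which we stop accumulating (tune per camera)
local MIN_FREE = 64*1024

-- call after adding each file record to the batch;
-- flush_batch is a stand-in for the existing serialize-and-send step
local function maybe_flush_early(batch)
    local m = get_meminfo()
    if #batch > 0 and m.free_block_max_size < MIN_FREE then
        flush_batch(batch)
    end
end
```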
2) The message is queued in the PTP code until the PC side fetches it. The script keeps running and adding to the queue until it fills up (16 items), so a lot of memory can be tied up by messages waiting in the queue. It would make more sense to send fewer, larger messages, but the message API doesn't make this easy. Some testing would be required to see whether the queue actually fills up under normal usage. If so, it might make sense to force a sleep at some point (or make that a batcher option), or to improve the message API somehow.
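Assuming write_usb_msg takes a timeout and returns false when the message couldn't be queued in time (I believe recent CHDK builds behave this way, but check your version), a crude back-off could look like:

```lua
-- block until the PTP message queue has room, instead of letting
-- unsent batches pile up in script memory
local function send_blocking(msg)
    while not write_usb_msg(msg, 100) do -- wait up to 100ms for queue space
        sleep(10) -- give the PC side a chance to drain the queue
    end
end
```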
3) Although Lua has automatic garbage collection, it's still possible to run out of memory due to uncollected garbage, because the Lua memory system doesn't know how much system memory is available. You can use collectgarbage to query Lua's memory usage or force a collection, and get_meminfo to find out how much system memory is free.
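For example (collectgarbage is standard Lua; get_meminfo is CHDK-specific, and the 32k threshold is just illustrative):

```lua
local lua_kb = collectgarbage("count") -- Lua heap usage, in KB
local m = get_meminfo()                -- system memory info
if m.free_block_max_size < 32*1024 then
    collectgarbage("collect")          -- force a full GC cycle before giving up
end
```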
4) The serialize function currently generates a LOT of garbage. In Lua, each string is a distinct immutable value, so if you build a string piecewise with something like
```lua
while cond do
    foo = foo .. "bar"
end
```
Every intermediate version becomes a Lua string that immediately becomes unreferenced but might not be collected right away. Rewriting serialize to collect all the parts in an array and join them once at the end (table.concat) would probably be a lot more memory friendly.
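The rewrite could follow the usual Lua string-buffer pattern, sketched here with the same hypothetical loop:

```lua
local parts = {}
while cond do
    parts[#parts+1] = "bar" -- each piece is stored once; no intermediate strings
end
foo = table.concat(parts)   -- one concatenation at the end
```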
5) os.listdir lists all the files in a directory into a table, so anything that uses it inevitably consumes more memory as the number of files goes up. Implementing a proper directory iterator function on the camera (like lfs.dir() on the PC side) would solve this.
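To illustrate the difference; process is a placeholder, and dir_iter is the hypothetical camera-side iterator that doesn't exist yet:

```lua
-- current: the whole listing is built in memory at once
local names = os.listdir("A/DCIM/100CANON")
for i = 1, #names do
    process(names[i])
end

-- hypothetical iterator: one entry in memory at a time
for name in dir_iter("A/DCIM/100CANON") do
    process(name)
end
```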
6) While improving efficiency would be good, it would be better still to detect low memory and avoid failing outright.