I've been looking into tackling this, but it is far more complex than I anticipated, and I cannot get my head around a feasible approach. Since the bulk upload works by reading entries in the zip file and acting on each one as it is read, I do not see how I can be sure that a CSV file is the first one read, nor how to deal with its contents in a scalable fashion. Even if I do a first pass to search for the CSV, how do I then store its contents so they can be referenced when adding each new image or video entry? If I put it all in an array, that array could be huge for a zip with thousands of images -- would that not cause memory problems? At this point I am suffering "coder's block" on how to proceed, and may have to leave this as a sponsorship task for Magician Chris to handle.
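To make the idea concrete, here is a minimal sketch of the pre-scan approach I have in mind -- this is not Composr's actual archive handling, just an illustration assuming PHP's ZipArchive is available and a hypothetical filename,title,description column layout:

<?php
// Hypothetical sketch (not Composr API): pre-scan the archive for a CSV
// before any media entries are processed, and build a filename => metadata map.
function scan_archive_for_csv($zip_path)
{
    $map = array(); // filename => array('title' => ..., 'description' => ...)

    $zip = new ZipArchive();
    if ($zip->open($zip_path) !== true) {
        return $map;
    }

    for ($i = 0; $i < $zip->numFiles; $i++) {
        $name = $zip->getNameIndex($i);
        if (strtolower(substr($name, -4)) !== '.csv') {
            continue;
        }

        // Read the CSV via a stream so the file is never held in memory twice
        $stream = $zip->getStream($name);
        if ($stream === false) {
            continue;
        }
        while (($row = fgetcsv($stream)) !== false) {
            if (count($row) >= 3) {
                // Assumed column layout: filename, title, description
                $map[$row[0]] = array('title' => $row[1], 'description' => $row[2]);
            }
        }
        fclose($stream);
        break; // Use the first CSV found
    }
    $zip->close();

    return $map;
}

A map of a few thousand short filename/title/description rows is small by PHP standards, so the memory worry may only bite with genuinely enormous CSVs.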
So far I believe the following files and functions will need modifying:

cms/pages/modules/cms_galleries.php
- function ___gimp -- to read the CSV and store the map somewhere
- function store_from_archive -- to accept an optional title and description and pass them on to simple_add
- function simple_add -- to accept an optional title and description and pass them through to galleries2's image_add and video_add

sources/galleries2.php
- functions image_add and video_add -- to check whether $title and $comment have been given values and use them if so; otherwise fall back to the current method of using EXIF data (see the sketch after this list)

I also expect a new function for seeking out and reading the CSV file will need to be added to cms/pages/modules/cms_galleries.php.
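Here is a rough sketch of the fallback check I mean for image_add/video_add -- the real galleries2.php signatures differ, and the parameter names and EXIF tag names here are illustrative assumptions only:

<?php
// Hypothetical sketch of the fallback logic inside image_add()/video_add().
// $url, $title, $comment and the EXIF tags used are assumptions, not the
// actual Composr galleries2.php API.
function image_add($url, $title = '', $comment = '' /* , ...other params */)
{
    if (($title === '' || $comment === '') && function_exists('exif_read_data')) {
        $exif = @exif_read_data($url); // current behaviour: derive metadata from EXIF
        if ($exif !== false) {
            if ($title === '' && isset($exif['ImageDescription'])) {
                $title = $exif['ImageDescription'];
            }
            if ($comment === '' && isset($exif['UserComment'])) {
                $comment = $exif['UserComment'];
            }
        }
    }

    // ... continue with the existing database insertion using $title/$comment
}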
It would indeed be hard to solve. It's a typical speed vs memory trade-off issue. You could spool it into a temporary file, but you'd need to keep re-reading that. Or you could dump it in the database for efficient querying, but that's way over-complex.
If I were you I'd just document that the file shouldn't be too huge.
Call Composr's disable_php_memory_limit() function before the import starts. Composr will do its best to remove all memory limits, and if that fails -- well, the onus is on users with huge amounts of data to have at least some control over their server settings.
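For reference, the call is as simple as this (assuming it runs inside Composr, where the helper is already loaded):

<?php
// At the top of the import routine, before reading the archive:
disable_php_memory_limit(); // Composr helper; lifts PHP's memory cap where the host permits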
Fair point about CSV size.
So this feature won't apply. Going forward, with the expectations of UI quality we have, it's not realistic to expect users to juggle zip files around just to upload things.