Is it possible to chunk data on the incoming configuration and then split the chunks into tasks? I.e. fetch 1000 products, chunk them into batches of 100 and then have 10 tasks (one task for each chunk)?
Hi @peep_toppy
Welcome to the Alumio forum.
Yes, it is possible to do so in Alumio. Once the data is retrieved, you can use a Value mapper with a mapper called List: Split into chunks.
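For anyone curious what that mapper does conceptually, here is a rough Python sketch (the function name and dummy data are illustrative, not Alumio internals) of splitting a list into fixed-size batches:

```python
# Illustrative sketch of what "List: Split into chunks" does:
# split a list of products into batches of a fixed size.

def split_into_chunks(items, size):
    """Yield consecutive chunks of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

products = [{"sku": f"SKU-{n}"} for n in range(1000)]  # dummy data
chunks = list(split_into_chunks(products, 100))
print(len(chunks))     # 10 chunks
print(len(chunks[0]))  # 100 products in each chunk
```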
Please give it a try and let us know if you have any questions.
Hey @Gugi ,
Thanks for the answer. It chunks the data, but I still receive a single task (containing all the chunks). I want each chunk to be assigned to a separate task.
That is correct. You can then separate the chunks into multiple tasks by using Get branches from a pattern. Please take a look at the screenshot below.
Result:
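Conceptually, Get branches from a pattern promotes each chunk to its own standalone payload, so the route creates one task per chunk. A loose Python analogy (the data shape is illustrative, not how Alumio represents entities internally):

```python
# After chunking, the payload holds a list of chunks.
chunked = {"chunks": [[1, 2], [3, 4], [5, 6]]}

# Branching extracts each chunk as its own payload, so the route
# creates one task per chunk instead of one task for the whole list.
tasks = [{"items": chunk} for chunk in chunked["chunks"]]
print(len(tasks))  # 3 tasks, one per chunk
```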
Hey Gugi,
While the above did have the desired effect (and answers my initial question), I'm still having some difficulties. I'm trying to load a large XML file (~200 MB) with ~8,000 products and I'm getting out-of-memory errors (on UAT). The solution seems to be the incremental load feature, but that creates one task per product, and then I won't be able to batch requests on the outgoing side (which will cause rate limiting / complaints about not using the API efficiently). Is there another way to do this?
Hi Peep,
Correct. The only way to reduce memory consumption when subscribing to a large XML file is to read it incrementally.
Could you please let me know how many items per task you currently configure? You may want to consider first storing each item in a storage. You can then create a separate incoming configuration that calls the Alumio API to retrieve the items page by page and creates a task for each page.
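The suggested pattern can be sketched as follows, assuming ~8,000 stored items and a page size of 100 (both numbers are only examples):

```python
# Hypothetical sketch of the storage-plus-paging pattern: items sit in
# a storage, and a second incoming configuration pulls them back page
# by page, so each page becomes one task (and one batched outgoing
# request) instead of one task per product.
def pages(total_items, page_size):
    """Yield (offset, limit) pairs, one per page/task."""
    for offset in range(0, total_items, page_size):
        yield offset, min(page_size, total_items - offset)

# 8000 stored products in pages of 100 -> 80 tasks
print(len(list(pages(8000, 100))))  # 80
```

This keeps the memory-friendly incremental read on the way in, while still letting you batch requests on the outgoing side.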