Optimizing Resources

Hi! I just wanted to ask about optimizing code so that we can save resources and usage. There's a set of products, around 2k, and each product has 5-10 files that we upload to Shopify. My current logic is something like this, in erpProducts.js (this file is where we fetch all the products first and loop through them):
// products is a response from the ERP API.
for (const product of products) {
  await api.updateProductsMetafiled({ product });
}
updateProductsMetafiled.js (this is where we loop through the files of the product passed from erpProducts.js so we can bulk upload them to Shopify):
for (const file of product.files) {
  processFilesAndBulkUploadItInShopify();
}
We run this every 12 hours to save resources. We also tried fetching only the products that have changed in the last X hours, but changes to files don't update the product's last-modified date, so that approach won't work. Thank you in advance!
Chocci_Milk (5w ago)
Are you making one call to Shopify per product? Have you looked at enqueuing the calls to Shopify? I would also recommend enqueuing actions using the p-map package, which is like a Promise.all where you can set concurrency: https://www.npmjs.com/package/p-map
Benz (OP, 5w ago)
If you're referring to uploading to Shopify, then yes, it's per product. Does enqueuing the calls to Shopify mean the enqueue built into Gadget, or is that something used in Shopify? Is p-map different from Gadget's built-in enqueue?
Chocci_Milk (5w ago)
Yes, when I say enqueuing, I'm talking about our api.enqueue, which marks an action to be run later. p-map and enqueuing are different: p-map is a Promise.all with concurrency, while enqueue is a queue of things that need to run later. The idea of using p-map is to get through the initial action faster and let the slow work (each call to Shopify) run later.
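To make "Promise.all with concurrency" concrete, here is a minimal hand-rolled sketch of what p-map does, using only built-ins; mapWithConcurrency is a hypothetical helper for illustration, not p-map's actual implementation:

```javascript
// Run fn over items, but never more than `concurrency` calls in flight at once.
async function mapWithConcurrency(items, fn, concurrency) {
  const results = new Array(items.length);
  let next = 0;
  // Start up to `concurrency` workers; each pulls the next unprocessed index.
  const workers = Array.from(
    { length: Math.min(concurrency, items.length) },
    async () => {
      while (next < items.length) {
        const i = next++; // safe: JS is single-threaded between awaits
        results[i] = await fn(items[i], i);
      }
    }
  );
  await Promise.all(workers);
  return results;
}
```

Plain Promise.all fires everything at once; the worker-pool pattern above is why a queue of 2k products doesn't hammer the downstream API.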
Benz (OP, 5w ago)
Hmmm, but will doing all of that make any difference in the resources and usage I'm consuming?
Chocci_Milk (5w ago)
It would lower the request time for the initial action. Also, if you're making multiple requests to Shopify, you should always be enqueuing.
Benz (OP, 5w ago)
Okay okay. I was enqueuing before but had some problems with some products not being processed. I'll try adding it back. When using p-map, do I wrap all the code in a function and then pass it to p-map? And by "initial action" you mean erpProducts.js, right?
Chocci_Milk (5w ago)
p-map usage looks something like this:
import pMap from "p-map";

const arr = [{ someStuff: "here" }];

await pMap(arr, async (record) => await api.enqueue(...), {
  concurrency: someNumber,
});
Then, in the action, you would do the work you need to call Shopify. Make sure that your concurrency is lowish in your queue.
Benz (OP, 5w ago)
import pMap from 'p-map';

export async function run({ params, record, api, logger }) {
  const files = record.files || []; // e.g., from ERP

  // Optional: define concurrency limit
  const CONCURRENCY_LIMIT = 3;

  // Async task to run for each file
  const handleFile = async (file) => {
    try {
      // Example: format and upload to Shopify
      const formatted = formatFileForShopify(file);
      await api.shopifyProductFile.create({ formatted });
      logger.info(`Uploaded file: ${file.name}`);
    } catch (error) {
      logger.error(`Failed to upload file ${file.name}: ${error.message}`);
    }
  };

  // Use p-map for concurrency-limited parallel processing
  await pMap(files, handleFile, {
    concurrency: CONCURRENCY_LIMIT,
    stopOnError: false,
  });
}
I asked ChatGPT how it might look. Is this also correct? Sorry, I'm still having a hard time adjusting to Node/JS code, so I'm mostly going by context clues. Are you familiar with tiny-async-pool? I believe they're the same with some trade-offs. Would you still prefer p-map over it?
Chocci_Milk (5w ago)
We internally use p-map (version ^4.0.0), which is why we recommend it, but you can use whatever package you'd like. As for the handleFile function, I think you should instead enqueue the action so that it runs later. If you enqueue, you can run the enqueue calls at a higher concurrency than the lowish limit ChatGPT wrote, since enqueuing itself is fast.
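A minimal sketch of that suggestion: handleFile enqueues a per-file action instead of uploading inline, so the slow Shopify work runs later in the background queue. The api.enqueue(action, params) shape follows the Gadget pattern discussed above, and the uploadProductFile action name is a made-up example:

```javascript
// Returns a handler that enqueues rather than uploads inline.
// `api.enqueue` and `api.uploadProductFile` are assumptions here: swap in
// your real Gadget api object and the action that does the Shopify upload.
function makeHandleFile(api) {
  return async (file) => {
    // Enqueuing is fast, so the p-map loop in the run function finishes
    // quickly; the queued action does the actual upload later.
    await api.enqueue(api.uploadProductFile, { file });
  };
}
```

You would then pass makeHandleFile(api) to pMap in place of the inline handleFile, and move the formatFileForShopify + create call into the uploadProductFile action.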
