I have a customer for whom I collect the inventory items every night. It usually takes around 1h10 to 1h20 to get them all using bulk operations, but since 08/03 it takes between 8 and 10 hours to complete.
I didn’t change anything on my side and the rootObjectCount didn’t change; it’s solely on Shopify’s side.
Weirdly, other bulk operations don’t seem to be much slower.
Here is the request I’m sending:
```graphql
{
  inventoryItems(query: "") {
    edges {
      node {
        id
        countryCodeOfOrigin
        createdAt
        duplicateSkuCount
        harmonizedSystemCode
        inventoryHistoryUrl
        inventoryLevels {
          edges {
            node {
              id
              canDeactivate
              createdAt
              deactivationAlert
              location { id }
              updatedAt
              quantities(names: ["available", "incoming", "on_hand", "committed", "reserved", "damaged", "safety_stock", "quality_control"]) {
                id
                name
                quantity
                updatedAt
              }
            }
          }
        }
        measurement { weight { unit value } }
        locationsCount { count precision }
        provinceCodeOfOrigin
        requiresShipping
        sku
        tracked
        unitCost { amount currencyCode }
        updatedAt
        variant { id }
      }
    }
  }
}
```
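For context, a document query like this is normally wrapped in the `bulkOperationRunQuery` mutation when run as a bulk operation. A minimal sketch, with the field selection trimmed down for brevity:

```graphql
mutation {
  bulkOperationRunQuery(
    query: """
    {
      inventoryItems(query: "") {
        edges { node { id sku updatedAt } }
      }
    }
    """
  ) {
    bulkOperation {
      id
      status
    }
    userErrors {
      field
      message
    }
  }
}
```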
I’m hitting the Shopify 2025-04 API version (I couldn’t make the switch to a later version right now, but that should happen soon enough).
@Noe_Morvillers could you provide an example where the operation took ~1 hour? I can see that the query itself takes a long time to execute, so it’s not specific to bulk operations (you can test the query yourself to verify), and I want to see if this is some sort of performance regression on the query itself.
Also, when you say other bulk operations aren’t slower, do you mean the same query on other shops is fine, or that other queries run via bulk operations are fine?
For example, this operation: gid://shopify/BulkOperation/9496457970049
on 07/03 took 1 hour and 45 minutes and finished with a root object count of 172,354.
This operation: gid://shopify/BulkOperation/9504172441985
on 08/03 took 7 hours and also ended with a root object count of 172,354.
This operation: gid://shopify/BulkOperation/9514477060481
on 09/03 was canceled after 3 hours and had only reached an object count of about 38k at that point.
Other bulk operations on the same shop are fine, and the same bulk operation on other shops is fine too.
My customer just told me he added extra locations on the 7th of March, which increased the total number of objects. But if we can’t use bulk operations for volumes of data that large, what should we do?
We went from around 10 to more than 30 locations, so yeah, this might be the problem.
What options do we have to get this data in an acceptable amount of time?
Actually, I was able to confirm that between the two operations you are downloading significantly more data. The root_object_count may not have changed, but the total number of rows has grown significantly. Comparing the result files from before March 7 and after, the difference in exported data size is about 8.6 GB.
If you upgrade your API version to 2026-01, you can run up to 5 bulk operations concurrently. You can then use query filters to break up the workload and improve throughput by spreading out the work.
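To illustrate the query-filter idea: assuming the `inventoryItems` `query` argument supports a date filter such as `created_at` on your API version (check the API reference for the filters actually available on this connection), the export could be split into ranges and submitted as separate bulk operations. A sketch of one slice:

```graphql
# Sketch: one slice of the export, filtered by created_at.
# The created_at filter is an assumption — verify which search
# filters inventoryItems supports on your API version.
mutation {
  bulkOperationRunQuery(
    query: """
    {
      inventoryItems(query: "created_at:<2024-01-01") {
        edges { node { id sku updatedAt } }
      }
    }
    """
  ) {
    bulkOperation { id status }
    userErrors { field message }
  }
}
```

A complementary operation with `created_at:>=2024-01-01` (or several narrower ranges) would cover the rest, and on 2026-01 up to five such operations can run at the same time.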