I have built a sync pipeline that uses bulk operations and the productSet mutation. For some products, we have metafields that are pretty big, around 800K characters. We ran into a strange issue: when we start the bulk operation with one such product, we get "Unexpected file structure - expected JSONL".
I find that strange, since the limit specified here for a JSON metafield is 2M characters. We are 1.2M away from that.
Even stranger, the limit appears to be per line. From my tests, I was only able to send products that are at most 500K characters per line.
However, that only applies to the bulk sync. If I use the productSet mutation directly with a metafield of 800K characters, it works just fine.
The bulk sync documentation states that there is a limit of 20MB per JSONL file; it does not mention any other size-based limits. Is there any information about this? Are our observations correct? If yes, is there any way to move forward, since we really need the bulk sync?
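For context, this is roughly how we kick off the bulk run once the JSONL is staged. It is a minimal Python sketch using the requests library; the shop domain, access token, API version, staged upload path, and field selections are placeholders, not our exact code:

```python
import requests

SHOP = "your-shop.myshopify.com"   # placeholder shop domain
TOKEN = "shpat_..."                # placeholder Admin API access token
API_VERSION = "2024-07"            # adjust to the API version you target

# The bulk wrapper: each line of the staged JSONL supplies the variables
# ($productSet, $synchronous) for one productSet call.
BULK_MUTATION = """
mutation runBulk($mutation: String!, $stagedUploadPath: String!) {
  bulkOperationRunMutation(mutation: $mutation, stagedUploadPath: $stagedUploadPath) {
    bulkOperation { id status }
    userErrors { field message }
  }
}
"""

PRODUCT_SET_MUTATION = """
mutation call($productSet: ProductSetInput!, $synchronous: Boolean!) {
  productSet(input: $productSet, synchronous: $synchronous) {
    product { id }
    userErrors { field message }
  }
}
"""

resp = requests.post(
    f"https://{SHOP}/admin/api/{API_VERSION}/graphql.json",
    headers={"X-Shopify-Access-Token": TOKEN, "Content-Type": "application/json"},
    json={
        "query": BULK_MUTATION,
        "variables": {
            "mutation": PRODUCT_SET_MUTATION,
            # Path returned by the stagedUpload step (placeholder)
            "stagedUploadPath": "tmp/.../bulk_op_vars.jsonl",
        },
    },
)
print(resp.json())
```

The direct productSet call that does work with the 800K metafield goes to the same graphql.json endpoint, just without the bulk wrapper.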
How to replicate:
- create a JSON file with the following contents:
{
  "synchronous": false,
  "productSet": {
    "title": "Sync bulk limit test",
    "status": "DRAFT",
    "metafields": [
      {
        "key": "limitest",
        "namespace": "limitest",
        "type": "json",
        "value": "{\"name\":\"\"}"
      }
    ]
  }
}
- inside the name JSON node, I usually paste 500K characters, like aaa....
- minify the JSON and put it into a JSONL file (a short script that generates this test file is sketched after these steps)
- upload it using stagedUpload
- start a bulk mutation for the productSet operation.
The bulk mutation does not start, and we get the error: Unexpected file structure - expected JSONL.
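Here is the generation script referenced above, as a minimal Python sketch; the padding size and output file name are arbitrary placeholders:

```python
import json

# Characters pasted into the "name" node, as in the steps above; with the JSON
# wrapper around it, the resulting line goes past ~500K characters and
# reproduces the error for us.
PADDING_SIZE = 500_000

# Variables object for one productSet call; the metafield value itself must be
# a JSON-encoded string, matching the "{\"name\":\"\"}" shape above.
line = {
    "synchronous": False,
    "productSet": {
        "title": "Sync bulk limit test",
        "status": "DRAFT",
        "metafields": [
            {
                "key": "limitest",
                "namespace": "limitest",
                "type": "json",
                "value": json.dumps({"name": "a" * PADDING_SIZE}, separators=(",", ":")),
            }
        ],
    },
}

# One minified JSON object per line = the JSONL we upload via stagedUpload
with open("bulk_op_vars.jsonl", "w") as f:
    f.write(json.dumps(line, separators=(",", ":")) + "\n")
```

Lines that stay safely under roughly 500K characters in total go through fine; once the padding pushes a line past that, the bulk operation rejects the file with the error above.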