The new shopifyqlQuery support is a huge update.
I have a few questions for how to handle larger data sets with the new shopifyqlQuery endpoint:
- Is there a specific limit for the maximum number of rows that can be returned?
- Are there plans to add pagination support?
- Will shopifyqlQuery work with bulk queries?
Thanks,
Tom
Hey Tom. shopifyqlQuery is not supported with bulk queries as it doesn’t implement node.
I’m looking into your other two questions to get clarity for you and will follow up once I have an answer.
Hey @tdavies, following up on your other two questions:
1. Maximum rows: If you don’t specify a LIMIT in your query, it defaults to 1000 rows. You can set a different limit explicitly using the LIMIT keyword.
2. Pagination: This is supported through LIMIT combined with OFFSET.
For context, this works the same way as the LIMIT/OFFSET pattern in the Shopify Analytics query editor. The OFFSET parameter skips a specified number of rows before returning results.
https://help.shopify.com/en/manual/reports-and-analytics/shopify-reports/report-types/shopifyql-editor/shopifyql-syntax#limit
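To make that concrete, here’s a rough sketch of a single shopifyqlQuery call with an explicit LIMIT and OFFSET. The shop domain, API version, access token, and the sample ShopifyQL (the sales table / total_sales column) are placeholders, and the response field selection is illustrative, so double-check it against the shopifyqlQuery reference for your API version:

```python
# Sketch: one shopifyqlQuery call with an explicit LIMIT/OFFSET.
# Shop domain, API version, token, and the sample ShopifyQL are placeholders;
# the response field selection may differ by API version.
import requests

SHOP = "your-store.myshopify.com"   # placeholder shop domain
API_VERSION = "2024-01"             # placeholder API version
ACCESS_TOKEN = "shpat_..."          # placeholder Admin API access token

GRAPHQL_URL = f"https://{SHOP}/admin/api/{API_VERSION}/graphql.json"

# ShopifyQL with an explicit LIMIT/OFFSET (LIMIT defaults to 1000 if omitted).
SHOPIFYQL = """
FROM sales
SHOW total_sales
GROUP BY product_title
SINCE -30d
ORDER BY total_sales DESC
LIMIT 1000 OFFSET 0
"""

# Field selection is illustrative; confirm against the current shopifyqlQuery docs.
GRAPHQL_QUERY = """
query RunShopifyql($q: String!) {
  shopifyqlQuery(query: $q) {
    __typename
    ... on TableResponse {
      tableData {
        columns { name }
        rowData
      }
    }
    parseErrors { code message }
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": GRAPHQL_QUERY, "variables": {"q": SHOPIFYQL}},
    headers={"X-Shopify-Access-Token": ACCESS_TOKEN},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```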
Thanks @KyleG-Shopify! That is great news about the pagination support. I’m 99% sure that documentation didn’t exist when I asked this.
FYI, the Tip in that section is duplicated, if you’re in touch with the documentation team.
@KyleG-Shopify There doesn’t seem to be a maximum value for the LIMIT statement currently.
I tested LIMIT 1000000000 expecting an error from the API, but I didn’t get any errors and it returned everything on our test store:
Even in the Shopify Admin, the help text mentions 1000, but I can change it to 10000 and it returns all 10k records on the page:
Is there a plan to put a maximum on the LIMIT statement in the future? Or is there a recommendation from Shopify Engineering for a maximum LIMIT value to prevent potential issues in the future (such as timeout errors)?
We’ll probably play it safe when we add this feature to our app and just do LIMIT 1000 at a time with OFFSET += LIMIT to paginate, but I’m curious how high we can reasonably set the LIMIT without breaking things in the future.
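For reference, the paging loop we have in mind is roughly the sketch below; fetch_rows is a stand-in for our actual shopifyqlQuery request code, and PAGE_SIZE just mirrors the documented default of 1000:

```python
# Sketch of the LIMIT/OFFSET paging loop described above.
# fetch_rows is a placeholder for the code that sends one shopifyqlQuery
# request and returns the list of rows.
from typing import Callable, List

PAGE_SIZE = 1000  # mirrors the documented default LIMIT


def paginate(fetch_rows: Callable[[str], List[list]], base_query: str) -> List[list]:
    """Page through a ShopifyQL query by appending LIMIT/OFFSET until a short page comes back.

    base_query should not already contain LIMIT or OFFSET clauses.
    """
    all_rows: List[list] = []
    offset = 0
    while True:
        page_query = f"{base_query} LIMIT {PAGE_SIZE} OFFSET {offset}"
        rows = fetch_rows(page_query)  # placeholder client call
        all_rows.extend(rows)
        if len(rows) < PAGE_SIZE:
            break  # a short page means we've reached the end
        offset += PAGE_SIZE  # OFFSET += LIMIT, as described above
    return all_rows
```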
Hi @Jonathan-HA ,
I work on the Analytics team at Shopify.
We’re likely going to add some additional validation on the LIMIT and improve how it’s represented in our rate limiting and query complexity calculations, so I wouldn’t rely on being able to do LIMIT 1000000000 forever.
With that said, I’d love to know more about the use case for wanting to grab that much data in a single request. Based on the screenshot you shared it looks like you’re doing some sort of bulk export?
While the current API allows for large requests (e.g. a large LIMIT) or paging through with LIMIT + OFFSET, it works best for more “interactive” queries where only a few hundred or a few thousand rows are needed at a time, so we can return data to you quickly.
If you’re wanting to do more of a bulk export for data analysis, or syncing into your own data warehouse, I’d love to learn more so we can provide a solution that’s more tailored to that use case. For example, providing a mechanism to bulk export the analytics data in a CSV, Parquet, XML, or JSONL file via a GCS URL. This will execute faster and provide you the ability to read the data however you see fit (e.g. JSONL streaming reads, Parquet loaded into DuckDB/ClickHouse/Dataframe library).
Thanks for any additional context you can share!
Hey there Nick,
Thanks for the response!
We basically just want to be able to replicate the export-to-CSV option in Shopify Admin > Analytics > Reports, where you can export more than 1,000 rows.
Certain reports such as the inventory adjustment history, for example, can be pretty big if the store has a lot of variants.
We have an app that does automated data exports, so using this API gives our users the ability to automate exports of those Analytics Reports. Fewer requests would be ideal so the exports run faster.
The main reason for my question is really just to get an idea of what a “safe” limit value would be that is unlikely to break later due to an API update. We don’t have an issue using pagination with OFFSET, but it would be good to know if there’s a maximum recommended value for the LIMIT keyword (similar to how most REST API endpoints default to limit=50 with a maximum of limit=250). Would you recommend not going over LIMIT 1000 per request in this case, or can we use something like LIMIT 5000 without worrying about it throwing an error in a future API update?