Flow "Update Operation" M2O Truncates Large Payloads (~150 Items) Despite Full Preview

Current Setup & Issue Context:
We have an orders collection with an M2O relation to order_items. Users upload Excel files (~2500 lines) containing order items. A flow processes this:

  1. Extracts/parses Excel into 154 unique order_items
  2. Updates the single order record with the full order_items M2O array (visible complete in the operation preview/logs); a rough sketch of that payload follows below
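
For context, this is roughly the shape of what step 2 hands to the "Update Data" operation — a minimal sketch only; the collection, field names, and row structure below are illustrative, not the actual schema from this project:

```ts
// Illustrative row type for the parsed Excel data (field names assumed)
type OrderItemRow = { sku: string; quantity: number; price: number };

// Pretend these ~154 rows came out of the Excel parsing step
const parsedRows: OrderItemRow[] = [
  { sku: "A-001", quantity: 2, price: 9.5 },
  // ...153 more rows
];

// The nested payload handed to "Update Data" for the single order record,
// conceptually equivalent to: PATCH /items/orders/<order_id>
const updateBody = {
  order_items: parsedRows, // full array, exactly as it appears in the operation preview
};
```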

The Problem:
The “Update Data” operation receives the full 154-item array but silently truncates it depending on payload size/complexity:

  • Rich data (many fields) → ~100 items saved
  • Reduced parameters → All 154 saves successfully
  • No errors, no warnings—just partial persistence

Can someone explain why this is happening? I can’t find any env variable that controls this.

The env parameter you are looking for is: QUERY_LIMIT_DEFAULT (it defaults to 100, which matches the ~100 items you’re seeing persisted).

I’d recommend you don’t change this value, as it’s there to protect the project from accidental crashes caused by large queries. Better to update your flow to process the data in batches, or to use the limit parameter on the operation’s query so the higher limit only applies to that request. A rough batching sketch is below.
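
Here is a minimal batching sketch, assuming the parsed rows are created as order_items records directly (each row carrying its parent order id) instead of one huge nested update on the order. The URL, token, batch size, and the `order` relation field name are assumptions for illustration, not details from the original post:

```ts
// Batch-create order_items via the Directus REST API instead of one giant nested update.
const DIRECTUS_URL = "http://localhost:8055"; // placeholder instance URL
const TOKEN = "my-static-token";              // placeholder access token
const BATCH_SIZE = 100;                       // stays under the default QUERY_LIMIT_DEFAULT

// Split an array into fixed-size chunks
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

async function createOrderItemsInBatches(
  orderId: string,
  rows: Record<string, unknown>[],
): Promise<void> {
  for (const batch of chunk(rows, BATCH_SIZE)) {
    // POST /items/<collection> accepts an array body for batch creation.
    // The parent relation field is assumed to be called `order` here.
    const res = await fetch(`${DIRECTUS_URL}/items/order_items`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${TOKEN}`,
      },
      body: JSON.stringify(batch.map((row) => ({ ...row, order: orderId }))),
    });
    if (!res.ok) {
      throw new Error(`Batch insert failed: ${res.status} ${await res.text()}`);
    }
  }
}
```

The alternative, if you want to keep the single nested update, is setting the limit in the "Update Data" operation’s query options (e.g. a value above 154, or -1 for unlimited), but that removes the safety margin QUERY_LIMIT_DEFAULT is meant to provide, so batching is usually the safer choice.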