The UI and API are timing out because your database server is likely struggling to delete that volume of records in a single request. Your best bet is to go directly to the database:
1. Connect to your DB directly (e.g. through a CLI client, or a GUI like pgAdmin/phpMyAdmin depending on your DB).
2. Drop the table: `DROP TABLE your_collection_name;`
3. Clean up the Directus system tables so Directus doesn't try to reference the now-gone collection:

   ```sql
   DELETE FROM directus_fields WHERE collection = 'your_collection_name';
   DELETE FROM directus_relations WHERE many_collection = 'your_collection_name' OR one_collection = 'your_collection_name';
   DELETE FROM directus_collections WHERE collection = 'your_collection_name';
   DELETE FROM directus_permissions WHERE collection = 'your_collection_name';
   ```

4. Restart Directus afterwards so it picks up the schema change.
If you'd rather keep the collection but just clear out the data, run `TRUNCATE TABLE your_collection_name;` instead of dropping the whole table in step 2, and skip the system-table cleanup in step 3.
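If you want to run the cleanup in one shot, the steps above can be combined into a single transaction so a mistake can be rolled back before anything is committed. This is a sketch in Postgres syntax (Postgres allows transactional DDL; MySQL will implicitly commit on the `DROP TABLE`, so the rollback safety net only applies to Postgres there). `your_collection_name` is a placeholder for your actual collection:

```sql
-- Postgres sketch: run the whole cleanup atomically.
BEGIN;

-- Remove the actual data table.
DROP TABLE your_collection_name;

-- Remove the collection's metadata from the Directus system tables.
DELETE FROM directus_fields      WHERE collection = 'your_collection_name';
DELETE FROM directus_relations   WHERE many_collection = 'your_collection_name'
                                    OR one_collection  = 'your_collection_name';
DELETE FROM directus_collections WHERE collection = 'your_collection_name';
DELETE FROM directus_permissions WHERE collection = 'your_collection_name';

-- Check the row counts reported above look sane, then:
COMMIT;  -- or ROLLBACK; if anything looks wrong
```

Either way, restart Directus afterwards so its schema cache doesn't hold a stale reference to the dropped table.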
If you’re using Directus Cloud, our support team can help you out as well → support@directus.io
As for the flow, I’d recommend adding a condition or limit check to prevent runaway loops in the future. There’s no built-in execution cap on flows currently.