Large JSON Files

Hey guys,

I created an API that adds records to a table from a JSON record list. Everything works well when I send up to 1,000 records in the JSON.

When I send larger files, with 5,000 records for example, the API response time becomes huge, it returns a runtime error, and when I check the database it has saved three times the number of records that were sent in the JSON. Always three times the amount.

How do I resolve this? I need to send files with 15K to 30K records.

Comments

  • Sean Montgomery (Administrator)
    You are dealing with a request timeout.

    Depending on your frontend, it might retry the request after a timeout.

    It is best practice to limit API requests to approximately 30 seconds. Otherwise, you run into issues like what you are mentioning above.

    This means you need to chunk your requests into smaller pieces. 1,000 records sounds like the sweet spot, so for 15,000 records you would send 15 requests.

    If you are looking for bulk import support, we have extremely good CSV and Airtable import support, which handles all of this in the background, meaning no size limits.
  • Fernando Andrade
    Thank you, Sean.

    I just did this
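
The chunking approach Sean describes can be sketched roughly as below. This is a minimal Python illustration, not the actual API client: the `post` callable, batch size, and record shape are all assumptions standing in for whatever your real frontend uses to call the endpoint.

```python
import json

def chunk(records, size=1000):
    """Split a record list into batches of at most `size` records."""
    return [records[i:i + size] for i in range(0, len(records), size)]

def send_in_batches(records, post, size=1000):
    # `post` stands in for whatever function performs the HTTP POST
    # to your API endpoint (hypothetical -- e.g. a requests.post wrapper).
    # Each batch is one request, keeping each call well under ~30 seconds.
    for batch in chunk(records, size):
        post(json.dumps(batch))

# Example: 15,000 records become 15 requests of 1,000 records each.
records = [{"id": i} for i in range(15000)]
sent = []                       # collect payloads instead of really posting
send_in_batches(records, sent.append)
print(len(sent))                # 15 requests
```

Sending batches sequentially also avoids the duplicate-insert problem: if a single huge request times out and the frontend silently retries it, the same payload can be written multiple times, which matches the "always 3 times the records" symptom above.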