Faster way to get a huge JSON dataset into the database


Hey guys, I'm having a hard time getting a large number of rows into the database.

Currently, I get a zipped file from FTP, unzip it, and split it into about 250 JSON files of less than 1 MB each (otherwise I get a 413 error). I then send them to the Xano API to insert the rows into my table; there are 750,000 rows in total.
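For context, the splitting step can be sketched roughly like this. This is a minimal illustration, assuming the rows have already been parsed into a Python list; the endpoint URL and auth header in the commented-out upload loop are placeholders, not a real Xano API:

```python
import json

def chunk_rows(rows, max_bytes=1_000_000):
    """Split rows into lists whose JSON encoding stays under max_bytes
    (mirrors the ~1 MB body limit that triggers the 413 error)."""
    chunks, current, size = [], [], 2  # 2 bytes for the surrounding "[]"
    for row in rows:
        encoded = len(json.dumps(row)) + 2  # +2 covers the ", " separator
        if current and size + encoded > max_bytes:
            chunks.append(current)
            current, size = [], 2
        current.append(row)
        size += encoded
    if current:
        chunks.append(current)
    return chunks

# Hypothetical upload loop -- URL, auth, and endpoint name are placeholders:
# import requests
# for chunk in chunk_rows(all_rows):
#     requests.post("https://<instance>.xano.io/api:XXXX/bulk_insert",
#                   json=chunk,
#                   headers={"Authorization": f"Bearer {token}"})
```

Each chunk serializes to at most `max_bytes`, so every POST body stays under the limit regardless of how row sizes vary.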

The problem with this approach is that it takes a really long time, and the database instance's CPU usage is maxed out the whole time.

Is there a better way to do this in Xano? I appreciate any help in advance.


Best Answer

  • Ray Deck · Trusted Xano Expert ✭✭✭
    Answer ✓

    Have you looked into the “database transaction” block in Xano? Usually these bulk imports are held up waiting for each insert statement to run and commit one by one. The block wraps all the inserts called inside it into a single transaction, which completes much faster.