Scaling Database to 130MM Records + Querying

Hello - I am considering using Xano to host a database of 130MM polygons. I'd like to query the database using "Query All Records" with a Custom Query that checks whether a specific point (lat, lng) falls within one of the polygons, and returns the polygon the point resides in. My question: do you foresee issues working with a database of this size? Will the query be slow because it has to work through this many records?

In summary, I'd like to better understand the performance implications of a database this size with a Custom Query before uploading all of the data.

Thanks for any insight.

Best Answer

  • Sean Montgomery (Administrator)
    @WMcCartney anything of that size is a challenge for any product. Xano supports spatial indexes, so as long as you store your data with the geo primitives and the query uses an index, it will operate fast.
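To make the underlying operation concrete, here is a minimal sketch of the point-in-polygon test such a query performs for each candidate record. This is illustrative pseudocode of the geometry, not Xano's implementation; a spatial index exists precisely so the database can skip this test for the vast majority of the 130MM rows.

```python
# Ray-casting point-in-polygon test (illustrative only):
# cast a horizontal ray from the point and count edge crossings;
# an odd count means the point is inside the polygon.

def point_in_polygon(lng, lat, polygon):
    """polygon is a list of (lng, lat) vertices; returns True if inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the ray's latitude?
        if (y1 > lat) != (y2 > lat):
            # Longitude where the edge crosses that latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lng < x_cross:
                inside = not inside
    return inside

# Unit square centered on the origin
square = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
print(point_in_polygon(0.0, 0.0, square))  # True
print(point_in_polygon(2.0, 0.0, square))  # False
```

Without an index, the database would have to run a test like this against every row; with a spatial index, it first narrows the candidates to polygons whose bounding region contains the point.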

    I would recommend benchmarking it yourself so you can see how performance changes from 10k to 100k to 1M records and beyond. There may be tweaks along the way that need to be made to support that many records. Also, something of that size will sooner or later need the Enterprise plan, so be aware of that. From there you can scale as high as you want.
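As a rough sketch of that benchmarking approach, the script below times a naive linear scan over increasing numbers of synthetic bounding boxes. The record counts, box sizes, and query point are made up for illustration; the point is to see how unindexed lookup cost grows with table size, which is the growth a spatial index is meant to flatten.

```python
# Hypothetical benchmark: time an O(n) containment scan at growing
# record counts. All data here is synthetic and for illustration only.
import random
import time

def make_boxes(n, seed=42):
    """Generate n small random lng/lat bounding boxes."""
    rng = random.Random(seed)
    boxes = []
    for _ in range(n):
        x, y = rng.uniform(-180, 180), rng.uniform(-90, 90)
        boxes.append((x, y, x + 0.1, y + 0.1))
    return boxes

def find_containing(boxes, lng, lat):
    """Linear scan -- the per-query cost a spatial index avoids."""
    for i, (x1, y1, x2, y2) in enumerate(boxes):
        if x1 <= lng <= x2 and y1 <= lat <= y2:
            return i
    return None

for n in (10_000, 100_000, 1_000_000):
    boxes = make_boxes(n)
    start = time.perf_counter()
    find_containing(boxes, 0.0, 0.0)
    elapsed = time.perf_counter() - start
    print(f"{n:>9,} records: {elapsed * 1000:.2f} ms per lookup")
```

Running this shows the scan time scaling roughly linearly with n; repeating the same measurement against your actual Xano table at each size, with and without the spatial index, is the comparison that matters.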