queue or API function delay


Hi community,

I'm building an integration of two apps with Xano. The 1st system calls a Xano API, and then I have some logic inside Xano that does some database manipulation.

It then calls the API of a 2nd system. The 2nd system is the OpenAI API, which has some tough rate limits.

My 1st system might have use cases where it makes 1,000 to 10,000 calls in one minute (and I have no control over that). I need some way to create a queue or to delay the execution of events that exceed the rate limit.

Is it possible to do this in Xano?

Comments

  • Ray Deck (Trusted Xano Expert ✭✭✭)

    A couple of options.

    Assuming this is a post/"please do a thing" request:

    First, if you need to act on these requests at this speed without losing their data, you might have your endpoint just quickly write the request info to a table and then return, minimizing that on-demand load. Then make a background task that works through the data in that table to execute whatever actions are required (see the first sketch at the end of this comment).

    Second, if you mean to rate-limit, you can implement that by running a check at the start of the API endpoint against a table or a Redis key. If you have seen too many requests in a period of time, precondition it out (see the second sketch at the end of this comment).

    If this is a get / "read a thing" request:

    You can implement API caching in the endpoint "Settings" (under the three-dots menu). This will short-circuit requests to deliver them faster without hammering your systems.
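
    Xano builds all of this with no-code function stacks rather than code, but here is a minimal Python sketch of the first idea ("write the request to a table, return fast, process it later"); the jobs table, its columns, and the call_openai placeholder are illustrative names only, not anything Xano-specific.

    ```python
    import json
    import sqlite3
    import time

    # Hypothetical "jobs" table standing in for a Xano database table.
    db = sqlite3.connect("queue.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS jobs "
        "(id INTEGER PRIMARY KEY, payload TEXT, status TEXT DEFAULT 'pending')"
    )

    def endpoint(request_body: dict) -> dict:
        """Fast path: store the request and return immediately."""
        db.execute("INSERT INTO jobs (payload) VALUES (?)", (json.dumps(request_body),))
        db.commit()
        return {"status": "queued"}

    def background_task():
        """Scheduled worker: drains pending jobs at a pace OpenAI can tolerate."""
        rows = db.execute("SELECT id, payload FROM jobs WHERE status = 'pending'").fetchall()
        for job_id, payload in rows:
            call_openai(json.loads(payload))   # the slow, rate-limited work
            db.execute("UPDATE jobs SET status = 'done' WHERE id = ?", (job_id,))
            db.commit()
            time.sleep(1)                      # spread the calls out over time

    def call_openai(payload: dict):
        pass  # placeholder for the actual OpenAI request
    ```

    The key design point is that the endpoint itself does no slow work; in Xano the background_task role is played by a scheduled Background Task.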
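
    And a sketch of the second idea, a fixed-window rate-limit check against a Redis key; the key format, the limit of 500, and the allow_request helper are made up for illustration, and in Xano the final check would be the precondition at the top of the endpoint.

    ```python
    import redis

    r = redis.Redis()    # assumes a reachable Redis instance
    LIMIT = 500          # max requests allowed per window (made-up number)
    WINDOW_SECONDS = 60

    def allow_request(caller_id: str) -> bool:
        """Fixed-window counter: bump a per-window key and compare to the limit."""
        key = f"rate:{caller_id}"
        count = r.incr(key)
        if count == 1:
            r.expire(key, WINDOW_SECONDS)   # start the window on the first hit
        return count <= LIMIT

    # At the start of the endpoint, this plays the role of the precondition:
    # if not allow_request(caller_id): reject with an error instead of continuing.
    ```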

  • remike (Member)

    Hi Ray, thanks for getting back. I was thinking about posting the request to the DB (I'm already doing that) and processing it afterwards, but System 1 expects a meaningful result or an error, so I still need to execute the API call to OpenAI within the request.

    Thanks for pointing me to the Redis key, I'll check it out. I also found that there is a Sleep utility, so I can keep a count of actions in progress and sleep for (count of user requests in progress) * 5 seconds, or something like that (rough sketch below).
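
    For what it's worth, here's a rough Python sketch of what I mean, just to sanity-check the idea (the in_progress counter and the call_openai placeholder are invented for illustration):

    ```python
    import threading
    import time

    in_progress = 0
    lock = threading.Lock()

    def call_openai_with_backoff(payload: dict):
        """Delay each call by 5 s per request already in flight."""
        global in_progress
        with lock:
            ahead_of_me = in_progress
            in_progress += 1
        try:
            time.sleep(ahead_of_me * 5)   # crude spacing; Xano's Sleep utility would play this role
            return call_openai(payload)
        finally:
            with lock:
                in_progress -= 1

    def call_openai(payload: dict):
        pass  # placeholder for the actual OpenAI request
    ```

    One catch is that an in-memory counter like this only exists inside a single process, so in Xano the count of in-progress requests would have to live in a database table or a Redis key instead.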