Running total

I've successfully used an aggregation addon to recursively query the same table. This means that I can get a grand total in the same response as the query response list.

However, I can't get the addon query to include a "<=" timestamp filter, which is what would provide a running total. I'm passing the auth.id from the API call into the "main" query, and then on to the addon query to filter by user, but of course this is a fixed value. For a running total I need to pass or self-reference a variable - that is, each db entry's own timestamp.

Any chance this is possible?!

I do have this currently working in a loop, but as I want to allow users to apply front-end filters, I'm hoping to speed it up a little.

Thanks

Answers

  • Michael Udinski (Administrator)

    Hey @owen - it sounds like you're asking how to add a second parameter to your Addon? You can do this by going to the Addon page (Library > Addons) and then selecting the Addon itself. You'll see you have access to the inputs and the query, so you can add your second input and define the expression in the custom query section to use a <= operator. The Addon wizard within the main query is more of a quick Addon builder and just uses equals, but from the Addons page you have full control!

  • owen (Member)

    Michael

    Thanks for the response. I've been working through the Addon page.

    What I've been trying to do is set additional query filters on the Addon where the filter value is dynamic - similar in function to "$this" in array operations. In my case, I'm returning a chronologically ordered list in my main query, and I want the Addon to aggregate some variables (e.g. sum of distance) over all records up to and including "$this" record. This would give an aggregated running total added to each record returned. With some creativity, by applying ">=" and "<=" filters to the timestamp field, I could also return a moving average.
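
    To show the shape I'm after, here's a rough sketch in TypeScript (not Xano syntax - field names like created_at and distance are just placeholders): each returned record carries an aggregate over everything up to and including its own timestamp.

    ```typescript
    // Placeholder record shape; field names are illustrative only
    interface Activity {
      created_at: number; // unix timestamp
      distance: number;
    }

    // For each record, sum distance over every record whose timestamp is
    // <= this record's own timestamp (the "$this"-style self-reference)
    function withRunningTotal(records: Activity[]) {
      return records.map((rec) => ({
        ...rec,
        running_distance: records
          .filter((r) => r.created_at <= rec.created_at)
          .reduce((sum, r) => sum + r.distance, 0),
      }));
    }
    ```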

    I have only been able to pass a fixed value into the Addon query.

    Thanks

  • Ray Deck (Trusted Xano Expert)

    How large is the data set you are performing these queries on? Intuitively, the structure implied here is really compute-intensive, so moving more of the operation into the database is not going to give you a big performance boost. Depending on how you are doing things in the function stack and the size of the data set you are running these secondary calculations on, there are other ways to get a boost by working in memory - especially if it's a recursive/running-total situation. You can pull your initial list and perform the calculations on it in memory much more cheaply than doing re-runs against the database. If you are finding the function stack is slow, we might look into how we can make it fast in that context.

  • owen (Member)

    Hi Ray

    I'm testing ~220 records per user, which takes about 2 seconds. Live data *could* be 10x that I think.

    My current solution queries the db on each loop iteration to obtain the aggregates, which I suspected was likely to be resource-heavy. I initially had a solution that calculated the aggregates manually through array operations (pushing each new entry to the end of the array and slicing off the start). This was quicker, but the function stack is much messier, and hence harder to edit as I go along. I can revert to that approach if necessary once I have a settled solution, and see how it works out.
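
    For reference, that earlier array-operations version was roughly the following (a TypeScript sketch for illustration only; the real thing is built from array steps in the function stack, and the window size and field are just examples).

    ```typescript
    // Moving average over the last `windowSize` values: push each new value
    // onto the end of the window, drop the oldest off the start, then average.
    function movingAverage(distances: number[], windowSize: number): number[] {
      const window: number[] = [];
      return distances.map((d) => {
        window.push(d);
        if (window.length > windowSize) {
          window.shift(); // slice the oldest entry off the start
        }
        return window.reduce((sum, v) => sum + v, 0) / window.length;
      });
    }
    ```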

    I have given up using Addons for this purpose, although I've discovered the approach does actually work for a count, just not for a sum or mean.

    Thanks

  • Ray Deck (Trusted Xano Expert)

    I suspect you have an O(n^2) approach to the calculations in the function stack, which we could make O(n), a big speed improvement, by reducing it to a single pass.
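
    For example, assuming the ordered list has already been pulled from the database in one query (sketched in TypeScript with an illustrative distance field - not Xano's actual syntax), the running total becomes a single pass that carries the sum forward:

    ```typescript
    // One query for the ordered list, then a single in-memory pass:
    // carry the cumulative sum forward instead of re-aggregating per record.
    function withRunningTotalSinglePass(
      records: { created_at: number; distance: number }[],
    ) {
      let total = 0;
      return records.map((rec) => {
        total += rec.distance;
        return { ...rec, running_distance: total };
      });
    }
    ```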

    But that's performance - for development, I would stick with the big, slow, but easy-to-understand approach, and maybe even keep it until you see this become your bottleneck. And if you want to revisit performance then, you will have data that makes it easy to test!

    When that day comes, this kind of data manipulation problem is the kind of thing we handle routinely at Statechange daily office hours. Consider it if the perf problem is holding you up!