Sync between Dev, Test, and Live

If I have three workspaces (Dev, Test, and Live), what is the best way to sync all the changes made in Dev and move them to Live?

Comments

  • Chris Coleman (Administrator)

    Hi @pepecortez3604 — great question, and a tricky one! We have a feature that enables cross-workspace sync but it is currently only available to our enterprise customers. :( Typically, you'd do all of this in one workspace and use Branches / Branch Merging and test data sources. You're even able to call your new API branches externally as well as run them against your test data sources before you merge the changes into live.

  • firestone8

    Question from a newbie here: how can we transfer dev data into the live data environment (within a single workspace)?

    In our case, we are using the dev data environment to curate the data. We would like to transfer that data into the live environment. Should this be achieved through developing a Task (cron job)?

  • jackb (Member)

    @firestone8 did you find a solution for this?

  • firestone8

    Hi @jackb - not yet. Still looking for an answer/solution!

  • jackb (Member)

    Yeah @firestone8, in theory we can do this with a daily background task with functions to clear the test database, query the live database, and bulk add the live data to test (roughly the per-table pattern sketched below).

    However, the problem is that this is maintenance hell, because you'll have to manually define every single table and map each column in every table, so you'll constantly have to maintain it as the schema changes. New table? Update the background task. New column? Update the task. Etc.

    Don’t think this is realistic.
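
    To make the maintenance cost concrete, here is a minimal sketch of that hard-coded per-table approach written as an external script. Xano background tasks are normally built in the visual function stack rather than in code, and every table name, endpoint path, and field name below is hypothetical:

    ```python
    # Hypothetical sketch of the hard-coded copy: every table and every column
    # is spelled out by hand, so any schema change means editing this task.
    import requests

    BASE = "https://your-instance.xano.io/api:XXXX"   # placeholder API group URL
    HEADERS = {"Authorization": "Bearer <token>"}

    def copy_users():
        # 1. Clear the test table (hypothetical endpoint)
        requests.delete(f"{BASE}/test/users/all", headers=HEADERS)
        # 2. Read everything from live (hypothetical endpoint)
        live_rows = requests.get(f"{BASE}/live/users", headers=HEADERS).json()
        # 3. Re-map every column by hand, then bulk add to test
        payload = [
            {"name": r["name"], "email": r["email"], "role": r["role"]}
            for r in live_rows
        ]
        requests.post(f"{BASE}/test/users/bulk", headers=HEADERS, json=payload)

    def copy_orders():
        # ...the same pattern repeated for every single table...
        pass

    def run_daily():
        copy_users()
        copy_orders()
        # every new table means another function and another call here

    if __name__ == "__main__":
        run_daily()
    ```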

  • firestone8

    Thanks @jackb for the answer. Indeed, this is what we were considering initially, but we won't proceed with it as it generates more technical debt than it solves.

    We will probably be using Xano's dev/live environments as designed, and curate/test data in a separate Azure PostgreSQL database. Thanks anyway!

  • jackb (Member)

    @firestone8 FYI i've been looking into the Metadata API and it seems that there is a much easier way to do this. Documentation: https://docs.xano.com/metadata-api

    You can create a background task, query the tables and content in the live data source, then use loops and some conditions to update/add to the test data. It wouldn't require any maintenance after the initial setup (see the sketch below).
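
    For anyone following along, below is a minimal sketch of that Metadata API pattern as an external script. The exact endpoint paths, the data_source parameter, and the response shapes are assumptions based on the linked documentation, so verify them at https://docs.xano.com/metadata-api before relying on this:

    ```python
    # Sketch: discover tables at runtime via the Metadata API, then copy content
    # from the live data source into the test data source. Endpoint paths,
    # parameter names, and response shapes are assumptions; check the docs.
    import requests

    INSTANCE = "https://your-instance.xano.io"   # placeholder instance URL
    WORKSPACE_ID = 1                             # placeholder workspace ID
    HEADERS = {"Authorization": "Bearer <metadata-api-token>"}
    META = f"{INSTANCE}/api:meta/workspace/{WORKSPACE_ID}"

    def list_tables():
        # Assumed endpoint for listing all tables in the workspace,
        # returning a list of table objects with an "id" field.
        resp = requests.get(f"{META}/table", headers=HEADERS)
        resp.raise_for_status()
        return resp.json()

    def copy_live_to_test(table_id):
        # Read all rows from the live data source (assumed query parameter
        # and assumed flat-list response; a real task would paginate).
        live_rows = requests.get(
            f"{META}/table/{table_id}/content",
            headers=HEADERS,
            params={"data_source": "live"},
        ).json()

        # Clear the test data source, then re-insert the live rows.
        requests.delete(
            f"{META}/table/{table_id}/content",
            headers=HEADERS,
            params={"data_source": "test"},
        )
        for row in live_rows:
            requests.post(
                f"{META}/table/{table_id}/content",
                headers=HEADERS,
                params={"data_source": "test"},
                json=row,
            )

    def run():
        for table in list_tables():
            copy_live_to_test(table["id"])

    if __name__ == "__main__":
        run()
    ```

    Because the table list is discovered at runtime, new tables are picked up automatically, which is what removes the ongoing maintenance.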

  • firestone8

    Very useful @jackb. Thanks for sharing.
    We were not aware of the Metadata API, and believe this will solve our challenge. We'll try it out.

  • aemondis (Member)

    FWIW - I simply used the export functionality for the tables I was interested in, then manually imported them into the test environment from the file. Be warned though - if you have any "list" fields in your DB that are references to another table, you may have to rebuild those from scratch. You would need to build a simple API to repair the references in this case (a sketch of one way to do that is below).

    In my case, I just rebuilt the references by hand since the only data I imported was enumeration-like data used as references elsewhere.
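
    If you hit the broken reference problem described above, one way to repair it is to rebuild the IDs by matching on a stable natural key that survived the export. The endpoints and the "name" / category field names in this sketch are purely illustrative:

    ```python
    # Illustrative repair of a "list of table reference" field after an import.
    # Assumes each referenced record has a unique, human-readable key ("name")
    # that survived the export; all endpoints and field names are hypothetical.
    import requests

    BASE = "https://your-instance.xano.io/api:XXXX"   # placeholder API group URL
    HEADERS = {"Authorization": "Bearer <token>"}

    def rebuild_references():
        # Map natural key -> new ID in the freshly imported lookup table
        categories = requests.get(f"{BASE}/categories", headers=HEADERS).json()
        id_by_name = {c["name"]: c["id"] for c in categories}

        # Rewrite the reference list on each imported record
        items = requests.get(f"{BASE}/items", headers=HEADERS).json()
        for item in items:
            new_ids = [id_by_name[n]
                       for n in item.get("category_names", [])
                       if n in id_by_name]
            requests.patch(f"{BASE}/items/{item['id']}",
                           headers=HEADERS,
                           json={"category_ids": new_ids})

    if __name__ == "__main__":
        rebuild_references()
    ```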

  • jackb (Member)

    I created a background task that moves live data to test each morning, using the Metadata API endpoints to get a list of tables and loop through them. For each one, clear the test data, get the live data, and push the live data to test. Pretty straightforward, and it requires no maintenance.

  • Jeremymona (Member)

    @jackb is there any chance you could share your logic or do a quick video? Alternatively, I'd love to get in touch with you to get help doing it for our app!

  • jackb (Member)

    @Jeremymona here you go: https://www.loom.com/share/4cc9838dbc8d4796bb35509e0f4573f5

    Happy to hop on a live zoom and help you out tomorrow / Monday if need be. Good luck!

  • Jeremymona

    This is fantastic @jackb! Worked perfectly. Thanks heaps mate.