
Deployment automation for vault setup

Hi community and M-Files, 

As we are setting up multiple M-Files environments and have recently run into the fact that admins need to be on the server to import vault configurations, I am wondering if there is any way to automate the deployment of vault configurations (the import functionality in M-Files Admin).

Our idea would be to save the exported packages in Git, pick them up automatically, and upload them to a specific server and vault. Does anyone do anything similar, or how do you automate deployments?

Regards,

Dejan

  • If you mean the vault structure, you could set up replication via cloud storage between the development environment and the production vaults. Note that, for instance, vault applications and their configurations are not part of replication; those you have to install separately in each vault. Cloud storage for replication is provided by M-Files and included in all subscriptions at no additional cost.

    I wouldn't recommend completely automating this structure replication (i.e. expecting any change made to the dev vault to be immediately replicated to prod), as you probably want some control over when and what is deployed. Instead, you could set up a scheduled export job in disabled mode in the dev vault and then run it manually whenever you are ready to deploy changes. On the production vaults, the import job could monitor the cloud storage folder every 5 minutes, for instance.

    More information on cloud storage is in chapter 5.7 of the M-Files Replication and Archiving User's Guide.

    Some best practices related to deploying controlled changes between environments can be found in M-Files Best Practices - Applying Controlled Changes to an M-Files System. It has been written with highly regulated industries in mind, and your deployment process likely doesn't have to be as heavy, but I'm leaving it here since there may be some good tips in it anyway.
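    The import-side "monitor a folder every 5 minutes" idea above can be sketched as a small polling script. This is only an illustration, not M-Files functionality: the hot-folder path, the `.zip` package extension, and the hand-off step are all assumptions you would adapt to however your import job is configured.

    ```python
    import time
    from pathlib import Path

    POLL_INTERVAL_SECONDS = 300  # check the hot folder every 5 minutes

    def find_new_packages(hot_folder: Path, seen: set) -> list:
        """Return packages (.zip, by assumption) not yet processed."""
        new = [p for p in sorted(hot_folder.glob("*.zip")) if p.name not in seen]
        seen.update(p.name for p in new)
        return new

    def watch(hot_folder: Path) -> None:
        """Poll the hot folder and hand new packages to the import step."""
        seen: set = set()
        while True:
            for package in find_new_packages(hot_folder, seen):
                # Placeholder: here you would move the package to wherever
                # the vault's scheduled import job is configured to look,
                # or trigger your own deployment step.
                print(f"New package ready for import: {package.name}")
            time.sleep(POLL_INTERVAL_SECONDS)
    ```

    In practice you would run something like this as a scheduled task or service on the target server, keeping the actual import under the control of the M-Files import job itself.
    
    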

  • Thanks Joonas.

    Those are some great references and ideas. I will do some thinking on how we could implement it in-house. We are not cloud-based; rather, we have multiple environments that are completely independent from each other (network-wise). So, as you said, I would need to move packages from environment to environment and have some hot folders to monitor for some kind of automated deployment, if I understood correctly. The only question I would still have is whether I could trigger the export somehow through an API or some scripts. Then I could get the package, move it somehow through the network, and import it in the other environment.

    You are right, there are certain things I would not move from one environment to another, such as imported AD groups, object types, etc. That would not work well.
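    Since the environments are network-isolated, the packages have to be carried across by whatever channel is allowed (a share, an SFTP drop, removable media, and so on). A minimal sketch of staging a package together with a checksum, so the receiving side can verify the file before importing it, could look like the following; the folder layout and the `.sha256` sidecar naming are assumptions, not anything M-Files provides:

    ```python
    import hashlib
    import shutil
    from pathlib import Path

    def package_sha256(package: Path) -> str:
        """Hash the package so the receiving side can verify the transfer."""
        digest = hashlib.sha256()
        with package.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def stage_for_transfer(package: Path, transfer_dir: Path) -> Path:
        """Copy a package plus a checksum sidecar into the transfer folder."""
        transfer_dir.mkdir(parents=True, exist_ok=True)
        staged = transfer_dir / package.name
        shutil.copy2(package, staged)
        (transfer_dir / (package.name + ".sha256")).write_text(package_sha256(package))
        return staged

    def verify_transfer(staged: Path) -> bool:
        """On the target side, confirm the package matches its checksum."""
        expected = (staged.parent / (staged.name + ".sha256")).read_text().strip()
        return package_sha256(staged) == expected
    ```

    The target-side hot-folder watcher would call `verify_transfer` before handing the package to the import job, so a partially copied file is never imported.
    
    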

  • The cloud storage can also be used for on-prem-to-on-prem replication, if it's okay for you that the packages reside (encrypted) on Azure while they are in transit. It's one way to move packages between on-prem servers.

  • Unfortunately we cannot do that.

    Any ideas on this:

    The only question I would still have is whether I could trigger the export somehow through an API or some scripts.

    Thanks.