
M-Files as storage for frequent large documents

Hi there,

I was approached today with a requirement to store large technical reports containing picture snippets and summaries. These reports are usually between 250 and 500 MB each, and they are in PDF format.
Another point is that 1-10 of these reports could easily be produced per month.
So if I calculate a middle case of 5 documents of 300 MB per month, I arrive at 60 documents per year with a total size of roughly 18 GB.
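For clarity, here is the back-of-envelope calculation for the best, middle and worst case (the 300 MB average is my own assumption within the 250-500 MB range):

```python
# Rough yearly sizing for reports produced 1-10 times per month.
# The rates and the 300 MB average come from the figures above;
# nothing here is measured, it is pure arithmetic.
def yearly_volume(docs_per_month: int, avg_size_mb: int) -> tuple[int, float]:
    docs = docs_per_month * 12
    return docs, docs * avg_size_mb / 1024  # size in GB (binary)

for rate in (1, 5, 10):
    docs, gb = yearly_volume(rate, 300)
    print(f"{rate:>2} docs/month -> {docs:>3} docs/year, ~{gb:.0f} GB/year")

# Output:
#  1 docs/month ->  12 docs/year, ~4 GB/year
#  5 docs/month ->  60 docs/year, ~18 GB/year
# 10 docs/month -> 120 docs/year, ~35 GB/year
```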
I did some load tests with 200-250 MB files over the REST-based API, and uploading was not particularly fast (obviously due to the size of the documents). I am also not yet aware of how complex the metadata for this specific document type will be; that could influence upload speed as well.
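For reference, this is roughly how I timed the uploads. It is a minimal sketch assuming the standard M-Files Web Service (MFWS) flow as I understand it from the documentation (authenticate, then POST the file to temporary storage); the server address, credentials, vault GUID and file name are placeholders, so please verify the endpoints against your own vault:

```python
import os
import time
import requests

BASE = "https://mfiles.example.com/REST"  # placeholder vault address

def authenticate(user: str, pwd: str, vault_guid: str) -> dict:
    # MFWS: POST /server/authenticationtokens returns {"Value": "<token>"},
    # which later requests pass in the X-Authentication header.
    r = requests.post(f"{BASE}/server/authenticationtokens",
                      json={"Username": user, "Password": pwd,
                            "VaultGuid": vault_guid})
    r.raise_for_status()
    return {"X-Authentication": r.json()["Value"]}

def timed_upload(path: str, headers: dict) -> None:
    size_mb = os.path.getsize(path) / (1024 * 1024)
    start = time.monotonic()
    with open(path, "rb") as f:
        # Passing the file handle streams the body instead of loading
        # 250-500 MB into memory; the server keeps it as a temporary file
        # until a later object-creation call attaches it to a document.
        r = requests.post(f"{BASE}/files", data=f, headers=headers)
    r.raise_for_status()
    elapsed = time.monotonic() - start
    print(f"{size_mb:.0f} MB in {elapsed:.1f} s -> {size_mb / elapsed:.1f} MB/s")

if __name__ == "__main__":
    hdrs = authenticate("svc_upload", "secret", "{VAULT-GUID}")  # placeholders
    timed_upload("report_example.pdf", hdrs)
```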
I am not aware of any document size limit in M-Files, but I wonder whether M-Files is a proper solution for such a use case.
I am also worried that the vault will grow quite big and that this could affect the other document types used in the system (50-100 document types are planned).

What are your experiences with large, frequently produced documents? Any input would be appreciated.

Dejan
  • Hi Dejan
    I have a customer who currently has more than 500 zip files sized 100-300 MB, and a few sized around 2 GB, in M-Files. At first we had an issue with too little free disk space on the server to handle the temporary file created while uploading. Once they increased the available space to an adequate level, it has worked just fine. The vault obviously also holds thousands of more commonly sized documents and receives over 100 new documents daily from an automated import of files attached to emails. Those documents are related to other objects, and some metadata is added manually, resulting in around 500 modified objects on a daily basis in an organization of about 30 employees. Most of the documents and other objects have 20-30 properties, and most of those are filled automatically.
    Obviously, you will need to size the server to handle those large documents, just as you would in any other system. And obviously, it takes a little time to upload such files if the connection does not have enough bandwidth. But apart from that, I would not be concerned about the size of the files or the metadata related to them. The danger lies not in the size but in the complexity that builds up when, over the years, you add more automated calculations and each time use the newest features available for that particular task.
    The company mentioned above has used M-Files for at least 6 years, from when they started out as a small firm developing a new product to now being a global player handling sales, procurement, production and service by subcontractors and vendors in many countries. When we first started out we had no idea where this was going to take us, and we have constantly added functionality and new features to the vault as they grew and added new functions to their organization. At this point we are sometimes surprised when a seemingly simple change in the vault structure suddenly creates unexpected consequences in features or functions built years ago. But file size has never been an issue as long as the server had the resources required to handle them.
    BR, Karl