What if you need to migrate tens of millions of documents across more than 100 sites? And what if downtime during the bulk migration must be zero? Then manual migration of documents is not an option at all.
Migrating large amounts of documents and data comes with several challenges. These challenges play a role in every migration, but they grow with the volume of information involved. What are these challenges? Find out below.
- 1. Infrastructure
Usually, the hardware (network and servers) is not sized for the peak load that occurs when migrating large amounts of documents. The throughput required during a migration can be 10,000 to 20,000 times higher than in the “normal” situation, so a solution needs to be found. One option is to ship the hard disks off-site and load their contents into the target system. Although well protected, this method is not preferred: the data, after all, leaves the safe walls of the business premises.
Another method is to spread the migration process over a longer period while the source system keeps running. Just before go-live, the source and target systems are synchronized so that the most recently modified files are transferred to the new environment. This is called a delta migration, a method that is now fully accepted, especially because it is safe and requires no downtime.
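The core idea of a delta pass can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not Xillio's actual tooling: the function name is invented, and change detection here relies only on file modification timestamps (real migration software typically also handles deletions, permissions, versions and conflicts):

```python
import shutil
from pathlib import Path

def delta_migrate(source: Path, target: Path, last_sync: float) -> list[Path]:
    """Copy only the files modified after the last synchronization timestamp.

    This is the 'delta': everything older than last_sync is assumed to have
    been migrated in an earlier pass while the source system kept running.
    """
    copied = []
    for src in source.rglob("*"):
        if src.is_file() and src.stat().st_mtime > last_sync:
            dest = target / src.relative_to(source)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # copy2 preserves metadata such as mtime
            copied.append(src)
    return copied
```

Because each pass only touches files changed since the previous pass, the final synchronization just before go-live handles a small remainder, which is what makes the zero-downtime cutover possible.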
- 2. Content
Another challenge with bulk migration is the content itself. With such large quantities of data, no single stakeholder knows exactly what data is present or how many terabytes it amounts to. After all, there is no way to simply right-click and read off the number of terabytes. A system administrator can tell you how much storage is in use, but translating that into an exact number of files is impossible by hand.
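Building such an inventory is, however, straightforward to automate. A minimal sketch (the function name is illustrative; a real content analysis would also record file types, owners and modification dates) just walks the share and tallies what it finds:

```python
from pathlib import Path

def inventory(root: Path) -> tuple[int, int]:
    """Walk a directory tree and return (file_count, total_bytes).

    Gives the exact numbers that a storage-level view cannot:
    how many files there are, not just how much space they occupy.
    """
    count = total = 0
    for path in root.rglob("*"):
        if path.is_file():
            count += 1
            total += path.stat().st_size
    return count, total
```

On a share with tens of millions of files a single-threaded walk like this takes a long time, which is exactly why inventory and analysis are normally part of the automated migration tooling rather than a manual exercise.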
Managing large amounts of data is complex, and in practice the number of duplicates (both files and folders) and the amount of ROT (redundant, obsolete and trivial) data is very large. These problems also occur with small data sets, but at scale they cannot be solved manually. With a few hundred thousand documents it may still be possible for a small team to carry out a clean-up operation over the summer, but with 50 million documents this becomes never-ending work.
When migrating content to a new system, there is always the desire to start with a clean slate. Naturally, you don't want to carry pollution from the old system into the new environment. It is therefore necessary to clean up through deduplication, classification and restructuring. At these volumes, an automated process for both the clean-up and the final bulk migration is a must!
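Of the three clean-up steps, deduplication is the easiest to automate, because duplicates can be detected by content rather than by name. A minimal sketch, assuming SHA-256 content hashing as the duplicate criterion (the function name is invented for the example; production tooling would stream large files instead of reading them whole):

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: Path) -> dict[str, list[Path]]:
    """Group files by content hash; any group with more than one
    entry is a set of byte-identical duplicates, regardless of
    filename or folder location."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in root.rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

A process like this can flag duplicate candidates across millions of documents unattended; deciding which copy to keep, and how to classify and restructure the rest, is where business rules come in.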
With more than 10 years of experience in content migrations, Xillio has an optimized solution for bulk migrations: from a few thousand to millions of documents, without putting a heavy load on your network, while increasing the quality of your content!