Migrating Huge Volumes of Data for a Leading Managed IT Services Provider


The Client:

Our client is a provider of managed technology services and a product reseller. The company was established in the 1980s and was later acquired by one of the biggest investment management companies in the world. It is headquartered in California and has over 8,000 employees. Our client specializes in Managed Workplace services, including IT solutions, hardware, integration, and support. The company holds key partnerships with technology companies such as HP, IBM, Cisco, and Apple.

The Challenge:

Our client wanted to replicate their ServiceNow data to two other databases – Oracle and Snowflake. The main challenge was migrating huge volumes of data efficiently: some tables contained over a hundred million records. At that scale, sharing all the information at once is not an option. Instead, we had to divide the data into smaller pieces and migrate them separately.

The Solution:

We estimated a four-month time frame for delivering the project. At the outset, we established the project's KPIs, which were mainly focused on dividing the huge data sets into smaller pieces. Fortunately, the client understood what this required and cooperated fully on the proper separation of the data.

The project was executed in stages. The approach was the following: first migrate the data from the past six months together with the incremental changes, and afterward share the rest of the data in smaller pieces – around ten million records per day for a given table. So as a first step, we shared the data generated over the previous six months. Next, we started the so-called incremental loader, which keeps the different databases synchronized. The division of the data was done in cooperation with the client. We relied on the Perspectium DataSync tool for migrating and dividing the data tables.
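The actual migration was driven by the Perspectium DataSync tool, but the two ideas behind it – a chunked backfill and an incremental loader – can be illustrated with a minimal Python sketch. The SQLite connections, the batch size, and the `sys_id` and `sys_updated_on` column names below are assumptions borrowed from ServiceNow naming conventions, not the tool's actual API:

```python
import sqlite3

CHUNK_SIZE = 100_000  # per-batch size; the real project shared roughly ten million records per day

def backfill_in_chunks(src: sqlite3.Connection, dst: sqlite3.Connection,
                       table: str, key: str = "sys_id") -> int:
    """Copy a table in fixed-size chunks, paging by a sortable key column.

    Assumes the key is the table's first column and is text-valued,
    like a ServiceNow sys_id.
    """
    total, last_key = 0, ""  # every text key sorts after the empty string
    while True:
        rows = src.execute(
            f"SELECT * FROM {table} WHERE {key} > ? ORDER BY {key} LIMIT ?",
            (last_key, CHUNK_SIZE),
        ).fetchall()
        if not rows:
            break
        marks = ",".join("?" * len(rows[0]))
        dst.executemany(f"INSERT INTO {table} VALUES ({marks})", rows)
        dst.commit()
        last_key = rows[-1][0]  # the key is assumed to be the first column
        total += len(rows)
    return total

def sync_incremental(src: sqlite3.Connection, dst: sqlite3.Connection,
                     table: str, watermark: str,
                     updated_col: str = "sys_updated_on") -> str:
    """One pass of the incremental loader: upsert rows changed since `watermark`.

    Assumes the key column is declared PRIMARY KEY in the target,
    so INSERT OR REPLACE acts as an upsert.
    """
    rows = src.execute(
        f"SELECT * FROM {table} WHERE {updated_col} > ?", (watermark,)
    ).fetchall()
    if rows:
        marks = ",".join("?" * len(rows[0]))
        dst.executemany(f"INSERT OR REPLACE INTO {table} VALUES ({marks})", rows)
        dst.commit()
    new_mark = src.execute(f"SELECT MAX({updated_col}) FROM {table}").fetchone()[0]
    return new_mark or watermark  # the next pass resumes from here
```

The backfill pages by key rather than by offset, so each batch is a cheap indexed range scan even on tables with hundreds of millions of rows; the incremental pass then only ever touches rows changed since the last watermark.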

To ensure the quality of the process, we needed to carefully monitor each step of the project. In a migration this large, there are almost always corrupted records. When such errors occurred, we had to check manually whether everything was okay. One of the most common problems was missing data in a certain row. When that happened, we checked manually where the problem occurred; once the record was found, it was reshared. For more specific problems, such as a failure in encryption, we investigated the issue, added extra logic to clean the records that had failed to share, and reshared them.
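As a rough illustration of that reconciliation step, here is a minimal Python sketch that flags records present in the source but missing from the target, again assuming SQLite connections and a `sys_id` key column rather than our actual tooling:

```python
import sqlite3

def find_missing_keys(src: sqlite3.Connection, dst: sqlite3.Connection,
                      table: str, key: str = "sys_id") -> set:
    """Return keys present in the source table but absent from the target."""
    src_keys = {r[0] for r in src.execute(f"SELECT {key} FROM {table}")}
    dst_keys = {r[0] for r in dst.execute(f"SELECT {key} FROM {table}")}
    return src_keys - dst_keys  # each flagged key is investigated, then reshared
```

The set difference above is only meant to show the idea; for tables with hundreds of millions of rows, you would compare key ranges or per-chunk checksums in batches rather than materializing every key in memory.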

The Result:

Overall, we migrated around fifty million records per week. The final result was that the client had all the data sets transferred to the two new databases and could easily and affordably use that data for business intelligence purposes such as reporting and analytics.

Data migration is just one of the services in our data portfolio. You can take a detailed look here. You can also book a free call with us to discuss your data challenge.
