Data Migration Tool

This XrmToolBox tool (developed by Rui Sousa) allows you to export data from one Microsoft Dataverse environment in JSON format and import it into another environment, limited to the selected columns and to specific rows matched by a filter XML.

It is important to keep the same GUID for some reference data across multiple environments (especially data referenced in workflows, business rules, JavaScript, etc.).
To migrate such data from one environment to another, you can use this tool to select the table, specify the column/row filter, export the data in JSON format, and import it into the target environment.

Remark: This tool is similar to the Data Transporter tool. Its main advantage is that it can export the data in JSON format, which is useful if the configuration data is checked into source control or if the user who prepares the data is different from the user who imports it into the target environment. However, there is no one-click Transfer action to migrate the data without exporting (importing the last exported file is the closest equivalent). Another advantage is the auto-mapping of Teams by name, which allows the Team lookup value to be migrated even when the Team GUIDs differ between environments; there is, however, no auto-mapping for the default transaction currency. In the current version (2023.4.20.2), the Delete operation appears to be disabled and not yet supported, and the association of N:N tables is not supported either. As with the Data Transporter tool, row filtering is based on the <filter> section of the FetchXML, and the filter is limited to the selected table (rows cannot be filtered based on values in linked tables).
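For reference, the filter XML mentioned above is the <filter> element of a standard FetchXML query. A minimal sketch of such a filter, which would export only active rows whose name starts with "Config" (the column names here are illustrative, not specific to any table), might look like:

```xml
<filter type="and">
  <!-- statecode 0 = Active for most Dataverse tables -->
  <condition attribute="statecode" operator="eq" value="0" />
  <!-- 'like' supports the % wildcard, as in SQL -->
  <condition attribute="name" operator="like" value="Config%" />
</filter>
```

Because the filter is limited to the selected table, conditions on linked tables (link-entity) cannot be used here.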
  • Tool Information
  • Functionalities
    • Export the data in JSON format from the source environment (Fig. 1, Pt. 3)
      • Load the tables of the source environment (Fig. 1, Pt. 1)
      • Filter the list of tables by name (Fig. 1, Pt. 12)
      • Select the table to be exported (Fig. 1, Pt. 13)
      • Select the columns to be exported (Fig. 1, Pt. 14)
        • Hide the invalid columns which do not support create/update (Fig. 1, Pt. 11)
      • Filter the rows to be exported by filter XML (Fig. 1, Pt. 15)
        • Edit the filter XML in FetchXML Builder (Fig. 1, Pt. 16)
      • Connect to the target environment (Fig. 1, Pt. 5)
      • Auto-map the User, Team and Root BU based on the username/name instead of GUID (Fig. 1, Pt. 6)
      • Review the list of auto-mappings and add/remove custom mappings (Fig. 1, Pt. 8) (Fig. 2)
        • Add an attribute mapping to use a column (instead of the GUID) as the primary key for matching records (Fig. 2, Pt. 1)
        • Add a value mapping to transform a GUID that differs between the source and target environments (Fig. 2, Pt. 2)
        • Hide auto mappings to view the custom value mappings (Fig. 2, Pt. 3)
      • Apply and transform the GUID value mappings (Fig. 1, Pt. 7)
        • On Export Stage (the exported data file will contain the GUIDs of records in the target environment)
        • On Import Stage (the GUIDs of records will only be transformed based on the mapping during the import process)
      • Based on the current filter, preview the data to see how many records will be created, updated, or deleted in the target environment (Fig. 1, Pt. 2) (Fig. 3)
      • Export file (Fig. 1, Pt. 17)
        • Export the data from the source environment as JSON file
        • Export the settings (excluded columns, filter XML, etc.) of the selected table
        • Export both data and table settings
    • Import the data file into the target environment (Fig. 1, Pt. 4)
      • Choose the operations (Fig. 1, Pt. 9)
        • Create: The record will be created in the target environment if the record exists in the source environment but not in the target environment
        • Update: The record will be updated in the target environment if the record exists in both the source and target environments
        • Delete: (Not available yet) The record will be deleted in the target environment if the record exists in the target environment but not in the source environment
      • Select the batch size of the operation during the import (Fig. 1, Pt. 10)
      • Import file (Fig. 1, Pt. 18)
        • Select the data JSON file and import it into the target environment
        • Import the table settings file to preset the excluded columns, filter XML, etc.
        • Import the last exported data file
    • Similar Tools
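The Create/Update/Delete semantics above come down to comparing the primary-key GUIDs of the exported rows with those already in the target environment. A minimal sketch of that classification in Python (the list-of-GUIDs input format is a simplified assumption for illustration, not the tool's actual JSON schema):

```python
# Classify records by comparing GUIDs between the source export and the
# target environment, mirroring the tool's Preview counts.
def preview(source_ids, target_ids):
    source, target = set(source_ids), set(target_ids)
    return {
        "create": sorted(source - target),  # in source, not yet in target
        "update": sorted(source & target),  # exists in both environments
        "delete": sorted(target - source),  # only in target (Delete is not yet supported by the tool)
    }

src = ["a1", "b2", "c3"]
tgt = ["b2", "c3", "d4"]
print(preview(src, tgt))
# {'create': ['a1'], 'update': ['b2', 'c3'], 'delete': ['d4']}
```

When an attribute mapping is configured (Fig. 2, Pt. 1), the tool matches on that column instead of the GUID, but the same set logic applies.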
     

    Fig. 1 - Data Migration Tool


    Fig. 2 - Mappings


    Fig. 3 - Preview

    This is part of the #TooLDR series where I test and review the tools for Power Platform. You can check out the information for the other tools on this page.

    Comments

    1. Hello Linn,
  We have the challenge of merging two Dyn365 CE Sales environments with a few different columns. I have looked at the tools you recommend; if I have understood correctly, all tables need to be configured individually in all of them, which is very time-consuming. Unfortunately, I have not yet been able to find any other method.
      Is there a tool or possibility to transfer all data at once? Perhaps, if the columns were previously adjusted.
      There are also files stored in SharePoint via the standard integration. Can these also be transferred?

      Thank you very much.
      Gerald Huber
      mailto: gerald.huber(at)vsb.de

      1. Hi Gerald
        These tools are meant to be used to move the data from a few reference tables from one environment to another.

        For such large repeatable data migration, I recommend using tools like SSIS with Kingswaysoft
        https://www.kingswaysoft.com/solutions/ssis-development-platform-components/microsoft-dataverse-integration

