Last Updated: 2024-09-08

Background

A solution is needed for sharing dataflow definitions, especially between disconnected environments.

Scope of the tutorial

In this tutorial, you will learn how to import an existing dataflow onto your canvas and how to export a dataflow from it.

Learning objectives

Once you've completed this tutorial, you will be able to:

- Import a dataflow onto your canvas from a flow definition JSON file
- Export a Process Group from your canvas into a flow definition JSON file

Prerequisites

- A Datavolo Cloud account with access to a Runtime (you can create one during the tutorial)

Review the use case

Import a dataflow

Import an existing flow definition JSON file onto your canvas.

Export a dataflow

Export a process group from your canvas into a flow definition JSON file.

Download the file

Save https://devcenter.datavolo.io/assets/4tutorials/import-export/Import_Export_Tutorial.json to your local workstation for use later in the tutorial.
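
If you prefer to script this step, here is a minimal Python sketch (standard library only) that saves the file to the current directory:

```python
import urllib.request

# URL of the tutorial flow definition from this page.
URL = "https://devcenter.datavolo.io/assets/4tutorials/import-export/Import_Export_Tutorial.json"

# Download the flow definition to the current working directory.
urllib.request.urlretrieve(URL, "Import_Export_Tutorial.json")
print("Saved Import_Export_Tutorial.json")
```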

Access a Runtime

Log into Datavolo Cloud and access a Runtime (creating one if necessary).

Drag a new Process Group onto the canvas. In the Create Process Group window, click on the upload flow definition icon to the far right of the Name input box.

In the file chooser that opens, locate and select the Import_Export_Tutorial.json file previously saved on your workstation.

In the updated Create Process Group window, prefix the default Name from this flow definition file with "My Version of the " before clicking Add to save it.
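
The import can also be scripted. Recent Apache NiFi releases, which Datavolo Runtimes build on, expose a REST endpoint for uploading a flow definition into a parent Process Group. The sketch below is illustrative only; the Runtime URL, bearer token, client ID, and group name are assumptions you would replace for your environment:

```python
import requests

# Assumptions: NIFI_API points at your Runtime's REST API and TOKEN is a
# valid bearer token for it; both are placeholders here.
NIFI_API = "https://your-runtime.example.com/nifi-api"
TOKEN = "your-bearer-token"
PARENT_PG_ID = "root"  # NiFi accepts "root" as an alias for the top-level group

with open("Import_Export_Tutorial.json", "rb") as f:
    resp = requests.post(
        f"{NIFI_API}/process-groups/{PARENT_PG_ID}/process-groups/upload",
        headers={"Authorization": f"Bearer {TOKEN}"},
        files={"file": ("Import_Export_Tutorial.json", f, "application/json")},
        data={
            # Assumed final name; match whatever you typed in the UI step.
            "groupName": "My Version of the Import Export Tutorial",
            "positionX": "0",
            "positionY": "0",
            "clientId": "import-export-tutorial",
        },
    )
resp.raise_for_status()
print("Imported process group with id:", resp.json().get("id"))
```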

Drill into the new Process Group and validate that the dataflow defined in the JSON file now appears on the canvas.

There is no need to evaluate this dataflow as the goal was simply to learn how to import a flow definition JSON file.
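
If you are curious what the Runtime just consumed, a flow definition is plain JSON whose top-level flowContents object describes the Process Group. Here is a quick sketch to peek at the downloaded file; the field names follow the standard NiFi flow definition format, so treat them as assumptions for files from other sources:

```python
import json

with open("Import_Export_Tutorial.json") as f:
    flow = json.load(f)

# The "flowContents" object holds the versioned process group itself.
contents = flow["flowContents"]
print("Flow name:   ", contents["name"])
print("Processors:  ", len(contents.get("processors", [])))
print("Connections: ", len(contents.get("connections", [])))
print("Child groups:", len(contents.get("processGroups", [])))
```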

Navigate back out of the Process Group you just imported, then right-click on it and select Download Flow Definition.

For this Process Group, you can select either option (with or without external services) since no external services are referenced. Verify that a file was downloaded whose name is based on the modified Name of the Process Group you imported earlier.
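
Exports can be scripted as well. The sketch below uses the REST download endpoint that backs this menu item in recent NiFi releases; the includeReferencedServices query parameter, which mirrors the two menu options, may vary by version, so treat it and the placeholder values as assumptions:

```python
import requests

# Assumptions: same NIFI_API and TOKEN placeholders as in the import sketch;
# PG_ID is the ID of the Process Group to export (shown in its configuration).
NIFI_API = "https://your-runtime.example.com/nifi-api"
TOKEN = "your-bearer-token"
PG_ID = "your-process-group-id"

resp = requests.get(
    f"{NIFI_API}/process-groups/{PG_ID}/download",
    headers={"Authorization": f"Bearer {TOKEN}"},
    # False mirrors "without external services"; this flow references none.
    params={"includeReferencedServices": "false"},
)
resp.raise_for_status()

# Save the exported flow definition next to the script.
with open("exported_flow.json", "wb") as out:
    out.write(resp.content)
print("Wrote exported_flow.json")
```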

That's all there is to it!

Congratulations, you've completed the NiFi dataflow import/export tutorial!

What you learned

- How to import a dataflow onto your canvas from a flow definition JSON file
- How to export a Process Group from your canvas into a flow definition JSON file

What's next?

Check out some of these other tutorials...

Further reading

Reference docs