
How to Transfer/Move a Docker Image to Another System?

  • By Gcore
  • March 19, 2023
  • 3 min read

In an ideal scenario, Docker images are transferred through a Docker registry or through a fully managed provider such as Gcore. You upload an image with the docker push command, and others retrieve it with the docker pull command.
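As a sketch of that registry flow (the registry host registry.example.com, the repository team/myapp, and the tag are illustrative placeholders, not real endpoints):

```shell
# Tag a local image for a registry, push it, then pull it on another host.
# registry.example.com/team/myapp:1.0 is a hypothetical name for illustration.
docker tag myapp:1.0 registry.example.com/team/myapp:1.0
docker push registry.example.com/team/myapp:1.0

# On the target machine:
docker pull registry.example.com/team/myapp:1.0
```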

However, if you need to move an image from one host to another to test it before sending it to the production environment, or you want to share it with a colleague, you can export the image as a .tar file.

Docker supports two methods for saving container images to a single tarball:

  1. docker save – persists an image (not a container)
  2. docker export – persists a container's filesystem (not an image)

Using Docker Save Command:

Saving Docker Image:

First, let's stick to the plan of saving the image only and walk through the docker save command. Assume that you need a Python image based on Alpine, which can be pulled from Docker Hub:

$ docker pull python:2.7.17-alpine3.9
2.7.17-alpine3.9: Pulling from library/python
e7c96db7181b: Already exists
1819f4b92bc2: Already exists
8061b3761cb3: Pull complete
73aebae115de: Pull complete
Digest: sha256:5f6059d78f530c3c59c4842e104ddcfc772a27fb8fac0d900f4d77bcb4621d9b
Status: Downloaded newer image for python:2.7.17-alpine3.9
docker.io/library/python:2.7.17-alpine3.9

Suppose you now want to provide the image to a colleague as a tarball. (Keep in mind that docker save captures the image's layers only; changes made inside a running container are not included unless you first commit them with docker commit.) You can create the tarball with the following command:

$ docker save python:2.7.17-alpine3.9 > /path/to/save/my-python-container.tar

Just make sure that you use the exact image name and the tag during tar creation. In our case, it was python:2.7.17-alpine3.9. You can verify if the above command worked:

$ du -h my-python-container.tar
75M   my-python-container.tar

Now you can send the .tar file to another machine via rsync, scp, or a similar file transfer tool of your choice.
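To cut transfer time, you can also compress the archive as you create it; a minimal sketch, assuming an SSH-reachable target (user@remote-host and the destination path are placeholders):

```shell
# Pipe docker save through gzip to produce a smaller archive
docker save python:2.7.17-alpine3.9 | gzip > my-python-container.tar.gz

# Copy it to the target host over SSH
scp my-python-container.tar.gz user@remote-host:/tmp/
```

A convenient detail: docker load reads gzip-compressed archives directly, so there is no need to decompress the file on the target machine first.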

Loading Docker Image:

Once the target machine has the .tar file, you can load the image into its local image store using the docker load command:

$ docker load < my-python-container.tar

Now, cross-check that the image exists on the target machine by using docker images or docker image list. The result will look something like this:

$ docker image list
REPOSITORY   TAG                IMAGE ID       CREATED       SIZE
python       2.7.17-alpine3.9   3f0e580ded94   2 hours ago   74.9MB
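If the two hosts can reach each other over SSH, the save and load steps can be combined into a single pipeline with no intermediate file (user@remote-host is a placeholder):

```shell
# Stream the image straight into the remote Docker engine
docker save python:2.7.17-alpine3.9 | ssh user@remote-host 'docker load'
```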

Using Docker Export Command:

Exporting Docker Container:

Note: The docker export command does not include the contents of any volume attached to the container. In that case, you need to run additional commands to back up, restore, or migrate the volume separately.
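One common way to back up a volume, sketched here with illustrative names (the mydata volume and the backup path are assumptions), is to mount it read-only in a throwaway container and archive its contents:

```shell
# Back up a named volume by mounting it read-only in a temporary
# container and writing a tarball of its contents to the current directory
docker run --rm \
  -v mydata:/data:ro \
  -v "$PWD":/backup \
  alpine tar -czf /backup/mydata.tar.gz -C /data .
```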

Looking at the docker export method, first we will pull an Alpine image:

$ docker pull alpine
Using default tag: latest
latest: Pulling from library/alpine
e6b0cf9c0882: Pull complete
Digest: sha256:2171658620155679240babee0a7714f6509fae66898db422ad803b951257db78
Status: Downloaded newer image for alpine:latest
docker.io/library/alpine:latest

Now, run the instance in detached mode so that the container isn't destroyed when you exit it.

$ docker run -it --detach --name alpine-t alpine

To get the ID and name of the container we created, use the docker ps command. If the container has stopped for some reason, you can still get its ID and name with docker ps -a:

$ docker ps
CONTAINER ID   IMAGE    COMMAND     CREATED          STATUS         PORTS   NAMES
35f34fabfa84   alpine   "/bin/sh"   14 seconds ago   Up 8 seconds           alpine-t

As we can see, our container ID is 35f34fabfa84 (it will be different for you); you can also use the container name, in our case alpine-t. Now, we can run the docker export command to export the container's filesystem:

$ docker export 35f34fabfa84 > alpine-t.tar

Alternatively, you can use the --output option to do the same, and your .tar file will be ready for transfer.

$ docker export --output="alpine-t.tar" 35f34fabfa84
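As with docker save, the export can also be streamed over SSH and imported on the far side in one step, with no temporary file (user@remote-host is a placeholder):

```shell
# Export the container's filesystem and import it on the remote host directly
docker export alpine-t | ssh user@remote-host 'docker import - alpine-t'
```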

Importing Docker Container:

Now, you can import the .tar file on the target machine by using docker import:

$ docker import alpine-t.tar alpine-t

To verify, you can run the container with --rm (the container is removed as soon as it exits). Because docker import strips image metadata such as the default command, you need to specify a command explicitly:

$ docker run --rm -it --name alpine-test alpine-t:[TAG] /bin/sh

Next, you could also automate the Docker build process with Jenkins.

Explore Gcore Container as a Service
