
Database Migration Service - AWS DMS

Data migration between various platforms. 

Have you ever thought about migrating your production database from one platform to another, only to drop the idea because it felt too risky, or because you could not afford the downtime?

If yes, then pay attention, because that is exactly what we are going to walk through in this article.

A few days back we were trying to migrate our production MySQL RDS instance from AWS to GCP Cloud SQL. We had to migrate the data without downtime, accurately and in real time, and without the help
of any Database Administrator.

After a bit of research and evaluating a few services, we finally settled on AWS DMS
(Database Migration Service) and found it to be a great service for migrating many different kinds of data.

Let’s discuss some important features of AWS DMS: 

  • The source database remains fully operational during the migration.
  • Migrates the database securely, quickly, and accurately.
  • No downtime required; works as a schema converter as well.
  • Supports various types of databases such as MySQL, MongoDB, PostgreSQL, etc.
  • Migrates data in real time and synchronizes ongoing changes.
  • Homogeneous migrations (migrations between the same engine types).
  • Heterogeneous migrations (migrations between different engine types).
  • Compatible with a wide range of database platforms such as RDS, Google Cloud SQL, on-premises databases, etc.
  • Inexpensive (pricing is based on the compute resources used during the migration process).
This is a high-level overview of Data Migration Setup.

Let's perform step by step migration: 

Note: We performed the migration from AWS RDS to GCP Cloud SQL; you can choose the source and destination databases as per your requirements.

Create replication instance:

A replication instance initiates the connection between the source and target databases, transfers the data, and caches any changes that occur on the source database during the initial data load.

Use the fields below to configure the parameters of your new replication instance, including network and security information and encryption details, and select an instance class as per your requirements.
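The same parameters can also be assembled programmatically with boto3 (the AWS SDK for Python). This is a minimal sketch, not the exact console flow; the identifier, instance class, and security group ID below are placeholder assumptions, and the actual API call is left commented out since it requires live AWS credentials.

```python
def replication_instance_params(identifier, instance_class, sg_ids):
    """Assemble the request parameters for dms.create_replication_instance."""
    return {
        "ReplicationInstanceIdentifier": identifier,
        "ReplicationInstanceClass": instance_class,  # e.g. "dms.t3.medium"
        "AllocatedStorage": 50,                      # GB; used for caching changes
        "VpcSecurityGroupIds": sg_ids,
        "PubliclyAccessible": False,
        "MultiAZ": False,
    }

# With credentials configured you would run (placeholder values):
# import boto3
# dms = boto3.client("dms")
# dms.create_replication_instance(
#     **replication_instance_params("mysql-migration", "dms.t3.medium", ["sg-0123456789"])
# )
```

Size the instance class and storage for your workload; the storage is what holds cached changes during the initial load.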

After completing all mandatory fields, click Next, and you will be redirected to the Replication Instances tab. Grab a coffee quickly while the instance is getting ready.

Hope you are ready with your coffee, because the instance is ready now.

Now let's create two endpoints, “Source” and “Target”:

Create Source Endpoint:

Click the “Run test” tab after completing all the fields, and make sure your replication instance's IP is whitelisted in the source database's security group.

Create Target Endpoint:

Click the “Run test” tab again after completing all the fields, and make sure your replication instance's IP is whitelisted in the target database's authorized networks.
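Both endpoints share the same shape, so they can be sketched with one helper. This is a hypothetical example: the hostnames, usernames, and password are placeholders, and the commented-out `create_endpoint` / `test_connection` calls mirror what the console's “Run test” button does.

```python
def endpoint_params(identifier, endpoint_type, host, user, password):
    """Assemble the request parameters for dms.create_endpoint (MySQL engine)."""
    return {
        "EndpointIdentifier": identifier,
        "EndpointType": endpoint_type,  # "source" or "target"
        "EngineName": "mysql",
        "ServerName": host,
        "Port": 3306,
        "Username": user,
        "Password": password,
    }

# Placeholder hosts and credentials for illustration only:
source = endpoint_params("rds-source", "source",
                         "mydb.abc123.us-east-1.rds.amazonaws.com", "admin", "***")
target = endpoint_params("cloudsql-target", "target",
                         "203.0.113.10", "admin", "***")

# dms.create_endpoint(**source)
# dms.create_endpoint(**target)
# The console's "Run test" corresponds to:
# dms.test_connection(ReplicationInstanceArn=instance_arn, EndpointArn=endpoint_arn)
```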

Our replication setup is now ready; next we have to create a "Replication Task" to perform the migration.

Create a “Replication Task” to start replication:

  • Task Name: any name
  • Replication Instance: The instance we’ve created above
  • Source Endpoint: The source database
  • Target Endpoint: The target database
  • Migration Type: We chose “Migrate existing data and replicate ongoing changes” because we needed ongoing changes replicated.

Once all the fields are completed, click “Create task” and you will be redirected to the “Tasks” tab.
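The task fields above map onto the DMS API roughly as follows. This is a sketch under placeholder ARNs; the “Migrate existing data and replicate ongoing changes” option corresponds to the `full-load-and-cdc` migration type, and the table-mapping JSON below is a simple include-everything rule rather than anything specific to our setup.

```python
import json

def replication_task_params(task_id, source_arn, target_arn, instance_arn):
    """Assemble the request parameters for dms.create_replication_task."""
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-all",
            "object-locator": {"schema-name": "%", "table-name": "%"},
            "rule-action": "include",
        }]
    }
    return {
        "ReplicationTaskIdentifier": task_id,
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "ReplicationInstanceArn": instance_arn,
        "MigrationType": "full-load-and-cdc",  # existing data + ongoing changes
        "TableMappings": json.dumps(table_mappings),
    }

# dms.create_replication_task(
#     **replication_task_params("mysql-task", src_arn, tgt_arn, instance_arn)
# )
```

In the console the table mappings are built for you; via the API they must be passed explicitly as a JSON string.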

Verify the task status:

When the task status is "Load complete" and the validation status is "Validated", the migration has been performed successfully.
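The same check can be scripted by polling `describe_replication_tasks` and inspecting the task's statistics. A minimal sketch, assuming the `ReplicationTaskStats.FullLoadProgressPercent` field from the DMS API as the completion signal; the polling call itself is commented out.

```python
def load_complete(task):
    """Given one task dict from describe_replication_tasks, report whether
    the initial full load has finished (100% of tables loaded)."""
    stats = task.get("ReplicationTaskStats", {})
    return stats.get("FullLoadProgressPercent", 0) == 100

# resp = dms.describe_replication_tasks(
#     Filters=[{"Name": "replication-task-id", "Values": ["mysql-task"]}]
# )
# done = load_complete(resp["ReplicationTasks"][0])
```

With the `full-load-and-cdc` migration type, the task keeps running after the full load to replicate ongoing changes, so "load complete" does not mean the task has stopped.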


Migrating the data without any time lag, accurately and in real time, and without the help of any Database Administrator was a really good experience.


