
Using AWS Data Migration Service

Want to share this easily? Check out the Notion page.

By Morgan Lucas (she/her), based on this video by Johnny Chivers

We use data migration services to, well, migrate data. But why would we want to do this?

Perhaps...

  • We're moving our business to the cloud, and need to shift all of that cold storage we have onsite.
  • We want to use it as a backup in case our on-site infrastructure is out of commission.
  • We have information to share with a third party, and instead of granting access to on-site databases, we put a copy on AWS to share.
With that in mind, let's recap what I've done.
  • Created a publicly accessible, password-protected database with Amazon Aurora (PostgreSQL-compatible) to migrate to Amazon DynamoDB
  • Managed the security group's inbound rules to limit access
  • Used the open-source tool HeidiSQL to interact with the database over a TCP/IP session, using the database's specific URL (not shown here for security)
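As a concrete illustration of the inbound-rule step above, here is a minimal sketch of locking PostgreSQL's port down to a single trusted address. The CIDR value and helper name are my own placeholders, not values from the post; the dict follows the `IpPermissions` shape that boto3's EC2 `authorize_security_group_ingress` call accepts.

```python
# Sketch of a restrictive inbound rule for the database's security group:
# PostgreSQL's default port (5432) opened only to a single /32 address.
# The CIDR below is a documentation-range placeholder, not a real value.

def build_inbound_rule(cidr_ip: str, port: int = 5432) -> dict:
    """Return one IpPermissions entry in the shape expected by
    boto3's ec2.authorize_security_group_ingress."""
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{"CidrIp": cidr_ip, "Description": "workstation only"}],
    }

rule = build_inbound_rule("203.0.113.7/32")
```

Keeping the rule to a single `/32` is what makes a "publicly accessible" database tolerable for a demo: the endpoint is public, but only one IP can actually reach the port.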
  • Connected to the Aurora PostgreSQL database
    • Ran queries that deleted and recreated tables populated with new information
  • Created a publicly accessible replication instance to connect to our RDS instance and initiate the database migration
  • Created source and target endpoints, using the existing instance, to migrate the User_Data database to Amazon DynamoDB
  • Created a role for DMS to use (making sure to click the radio button beneath)
  • Successfully connected the target endpoint to the replication instance (the middleman that transfers the data)
  • Set up a database migration task to move data from the Aurora PostgreSQL database to Amazon DynamoDB
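The "deleted and recreated tables" step above can be sketched as a small reset routine. The table name `User_Data` comes from the post, but the columns are invented for illustration, and `reset_table` is a hypothetical helper that runs against any DB-API cursor (such as one from `psycopg2.connect(...).cursor()`).

```python
# Sketch of the delete-and-recreate queries run against Aurora PostgreSQL.
# Columns are illustrative assumptions; only the table name is from the post.

DDL_STATEMENTS = [
    "DROP TABLE IF EXISTS User_Data;",
    """CREATE TABLE User_Data (
        user_id  INTEGER PRIMARY KEY,
        username VARCHAR(50) NOT NULL,
        email    VARCHAR(100)
    );""",
]

def reset_table(cursor) -> None:
    """Execute each DDL statement in order on an open DB-API cursor."""
    for stmt in DDL_STATEMENTS:
        cursor.execute(stmt)
```

In HeidiSQL you would paste these statements into the query tab instead; the point is the same drop-then-create pattern.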
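The migration task at the end of the list needs a table-mappings document telling DMS which tables to move. A minimal sketch for the `User_Data` table follows; the schema name `"public"` is an assumption (it is PostgreSQL's default), and the rule name is mine.

```python
import json

# Build the table-mappings JSON for a DMS replication task: one selection
# rule that includes the User_Data table. Schema "public" is an assumption.
def table_mappings(schema: str, table: str) -> str:
    return json.dumps({
        "rules": [
            {
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": "include-user-data",
                "object-locator": {"schema-name": schema, "table-name": table},
                "rule-action": "include",
            }
        ]
    })

mappings = table_mappings("public", "User_Data")
```

This JSON string is what you would paste into the task's table-mappings editor in the console, or pass as `TableMappings` when creating the task programmatically.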

Find me on Twitter, or my blog. You can Buy Me A Coffee and help me keep writing!
