Migrate from SQL Server or Oracle into Amazon Aurora using AWS Database Migration Service


© 2017, Amazon Web Services, Inc. or its Affiliates. All rights reserved.

Paras Bhuva, Solutions Architect, Amazon Web Services
bhuparas@amazon.com
2/22/2017

Migrate from SQL Server or Oracle into Amazon Aurora using AWS Database Migration Service

@parasbhuva

What to Expect from the Session

Agenda
• Migrating to AWS
• AWS Schema Conversion Tool Overview
• Migration Considerations
• AWS Database Migration Service Overview
• Amazon Aurora Overview
• Demo!
• Best practices – SCT and DMS
• Q&A

Migrating to AWS

Why AWS? Amazon RDS

• Quickly provision databases
• Multiple Availability Zones
• Rapid scaling
• Automated patching
• Easy read replica creation
• High durability
• Point-in-time recovery
• Detailed metrics
• Single-click encryption at rest

How?

• How will my on-premises data migrate to the cloud?
• How can I make it transparent to my users?
• How will on-premises and cloud data interact?
• How can I integrate my data assets within AWS?
• How can I move off of commercial databases?

Migration Options

• Lift and shift: leverage Amazon EC2 and Amazon S3
• Keep the existing DB engine but migrate to Amazon RDS: for example, Oracle on-premises to RDS Oracle
• Migrate the database engine: commercial engine to open source
• Maintenance window: maintenance window duration vs. CDC with zero downtime

Database Migration Process

AWS Schema Conversion Tool Overview

AWS Schema Conversion Tool

Features
• Converts the schema of one database engine to another
• Database Migration Assessment report for choosing the best target engine
• Code browser that highlights places where manual edits are required

The AWS Schema Conversion Tool helps automate many database schema and code conversion tasks when migrating from Oracle and SQL Server to open source database engines.

Convert Tables, Views, and Code

• Sequences
• User-defined types
• Synonyms
• Packages
• Stored procedures
• Functions
• Triggers
• Schemas
• Tables
• Indexes
• Views

Components of the Console

1. Source schema
2. Action items
3. Target schema
4. Schema element details
5. Edit window

Supported Conversions

Pricing, Terms & Conditions

Pricing: Free software license ($0) for active AWS customers with accounts in good standing

Allowed use: Use the Schema Conversion Tool to migrate database schemas to Amazon RDS, Amazon Redshift, or Amazon EC2–based databases. To use the Schema Conversion Tool to migrate schemas to other destinations, contact AWS for special pricing.

Prerequisites

• Create databases (source and target)
• Download the AWS Schema Conversion Tool: http://amzn.to/2b2YE2a
• Download drivers: http://amzn.to/2axE0Hn
• Update global settings

Global Settings – Logging

Global Settings – Drivers

Download Drivers here http://amzn.to/2axE0Hn

Global Settings – Performance and Memory

Global Settings – Assessment Report

A few considerations before you start your DB migration project…

Time Considerations

• Any hard dates?
• Planning time?
• Typically 2–3 weeks
• Several iterations

Database Considerations

• Number of schemas?
• Number of tables?
• Engine-specific types?
• Users/roles/permissions?

Network Considerations

• Access (firewalls, tunnels, VPNs)?
• Which VPC?
• Which security groups?

Requirements Considerations

• Engine selection criteria?
• Which tables need to move?
• Same target for all tables?

Database Migration Phases

Phase | Description | Automation
1 | Assessment | SCT
2 | Database schema conversion | SCT/DMS
3 | Application conversion/remediation | SCT
4 | Scripts conversion | SCT
5 | Integration with third-party applications | -
6 | Data migration | DMS
7 | Functional testing of the entire system | -
8 | Performance tuning | SCT
9 | Integration and deployment | -
10 | Training and knowledge | -
11 | Documentation and version control | -
12 | Post-production support | -

DMS Overview

• Start your first migration in 10 minutes or less

• Keep your apps running during the migration

• Replicate within, to, or from Amazon EC2 or RDS

• Move data to the same or a different database engine

AWS Database Migration Service (AWS DMS)

(Diagram: application users keep working against the source on customer premises while AWS DMS replicates it into AWS over the Internet or a VPN)

• Start a replication instance
• Connect to the source and target databases
• Select tables, schemas, or databases
• Let AWS DMS create tables, load data, and keep them in sync
• Switch applications over to the target at your convenience

Keep your apps running during the migration

(Diagram: AWS DMS, with a Multi-AZ option for high availability, replicates from customer premises or AWS into AWS over the Internet or a VPN)

AWS Database Migration Service pricing

• T2 instances for developing and periodic data migration tasks; C4 instances for large databases and for minimizing migration time
• T2 pricing starts at $0.018 per hour for t2.micro; C4 pricing starts at $0.154 per hour for c4.large
• 50 GB of GP2 storage is included with T2 instances; 100 GB of GP2 storage is included with C4 instances
• Data transfer inbound and within an AZ is free
• Data transfer across AZs starts at $0.01 per GB

Complete pricing details here: https://aws.amazon.com/dms/pricing/
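As a rough illustration using the list prices above (actual prices vary by region): a migration that keeps a single c4.large replication instance busy for 24 hours costs about 24 × $0.154 ≈ $3.70 in instance hours, the included 100 GB of GP2 storage costs nothing extra, and moving 100 GB of data across Availability Zones adds about 100 × $0.01 = $1.00.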

Migration Scenarios and Options

On-Premises Migration Scenarios

• An on-premises database to a database on Amazon RDS DB instance

• An on-premises database to a database on an Amazon EC2 instance

• Migration from an on-premises database to another on-premises database is not supported.

RDS Migration Scenarios

• A database on an Amazon RDS DB instance to an on-premises database

• A database on an Amazon RDS DB instance to a database on an Amazon RDS DB instance

• A database on an Amazon RDS DB instance to a database on an Amazon EC2 instance

EC2 Migration Scenarios

• A database on an Amazon EC2 instance to an on-premises database

• A database on an Amazon EC2 instance to a database on an Amazon EC2 instance

• A database on an Amazon EC2 instance to a database on an Amazon RDS DB instance

DMS Components

• Replication Instances

• Endpoints

• Tasks

Replication Instances

• Performs the work of the migration

• Tasks run on instances

• Can support multiple tasks

• AWS DMS currently supports T2 and C4 instance classes for replication instances 

Public and Private Replication Instances

• A replication instance should have a public IP address if the source or target database is located in a network that is not connected to the replication instance's VPC by using a virtual private network (VPN), AWS Direct Connect, or VPC peering.

• A replication instance should have a private IP address when both the source and target databases are located in the same network that is connected to the replication instance's VPC by using a VPN, AWS Direct Connect, or VPC peering.
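If you prefer to script this step rather than use the console, here is a minimal, hedged sketch using boto3 (the AWS SDK for Python). The identifier, subnet group, and security group are placeholder values, and the MultiAZ / PubliclyAccessible flags correspond to the options just described:

```python
# Sketch: provision a DMS replication instance with boto3 (identifiers below are placeholders).
import boto3

dms = boto3.client("dms", region_name="us-east-1")

response = dms.create_replication_instance(
    ReplicationInstanceIdentifier="sqlserver-to-aurora-demo",   # placeholder name
    ReplicationInstanceClass="dms.c4.large",                    # T2 for dev/periodic, C4 for large migrations
    AllocatedStorage=100,                                       # GB of GP2 storage on the instance
    MultiAZ=True,                                               # high-availability option
    PubliclyAccessible=False,                                   # private IP; reachable via VPN/Direct Connect/peering
    ReplicationSubnetGroupIdentifier="my-dms-subnet-group",     # placeholder subnet group
    VpcSecurityGroupIds=["sg-0123456789abcdef0"],               # placeholder security group
)
print(response["ReplicationInstance"]["ReplicationInstanceArn"])
```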

Sources for AWS Database Migration Service

On-premises and Amazon EC2 instance databases:
• Oracle versions 10.2 and later, 11g, and 12c, for the Enterprise, Standard, Standard One, and Standard Two editions
• Microsoft SQL Server versions 2005, 2008, 2008R2, 2012, and 2014, for the Enterprise, Standard, Workgroup, and Developer editions. The Web and Express editions are not supported.
• MySQL versions 5.5, 5.6, and 5.7
• MariaDB (supported as a MySQL-compatible data source)
• PostgreSQL 9.3 and later
• SAP Adaptive Server Enterprise (ASE) 15.7 and later

Amazon RDS instance databases:
• Oracle versions 11g (versions 11.2.0.3.v1 and later) and 12c, for the Enterprise, Standard, Standard One, and Standard Two editions
• Microsoft SQL Server versions 2008R2, 2012, and 2014, for the Enterprise and Standard editions. Note that change data capture (CDC) operations are not supported. The Web, Workgroup, Developer, and Express editions are not supported.
• MySQL versions 5.5, 5.6, and 5.7
• PostgreSQL 9.4
• MariaDB (supported as a MySQL-compatible data source)
• Amazon Aurora (supported as a MySQL-compatible data source)

Targets for AWS Database Migration Service

On-premises and EC2 instance databases:
• Oracle versions 10g, 11g, and 12c, for the Enterprise, Standard, Standard One, and Standard Two editions
• Microsoft SQL Server versions 2005, 2008, 2008R2, 2012, and 2014, for the Enterprise, Standard, Workgroup, and Developer editions. The Web and Express editions are not supported.
• MySQL versions 5.5, 5.6, and 5.7
• MariaDB (supported as a MySQL-compatible data target)
• PostgreSQL versions 9.3 and later
• SAP Adaptive Server Enterprise (ASE) 15.7 and later

Amazon RDS instance databases and Amazon Redshift:
• Oracle versions 11g (versions 11.2.0.3.v1 and later) and 12c, for the Enterprise, Standard, Standard One, and Standard Two editions
• Microsoft SQL Server versions 2008R2, 2012, and 2014, for the Enterprise, Standard, Workgroup, and Developer editions. The Web and Express editions are not supported.
• MySQL versions 5.5, 5.6, and 5.7
• MariaDB (supported as a MySQL-compatible data target)
• PostgreSQL versions 9.3 and later
• Amazon Aurora (MySQL and PostgreSQL)
• Amazon Redshift

Tasks Overview

• Run on a replication instance

• Contain two and only two endpoints (source and target)

• Different migration methods available

• Specify selection and/or transformation rules

• Can run multiple tasks

Migration Methods

• Migrate existing data

• Migrate existing data and replicate ongoing changes

• Replicate data changes only (the sketch below shows how these methods map to a task's MigrationType)
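Continuing the boto3 sketch from above (hostnames, credentials, and ARNs are placeholders), a task ties one source endpoint and one target endpoint to a replication instance, picks one of the three migration methods through MigrationType, and applies selection rules through TableMappings:

```python
# Sketch: create source/target endpoints and a full-load + CDC task (all values are placeholders).
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

source = dms.create_endpoint(
    EndpointIdentifier="sqlserver-source",
    EndpointType="source",
    EngineName="sqlserver",
    ServerName="onprem-sql.example.com", Port=1433,
    Username="dms_user", Password="***", DatabaseName="sales",
)["Endpoint"]

target = dms.create_endpoint(
    EndpointIdentifier="aurora-target",
    EndpointType="target",
    EngineName="aurora",
    ServerName="aurora-cluster.cluster-xyz.us-east-1.rds.amazonaws.com", Port=3306,
    Username="admin", Password="***",
)["Endpoint"]

# Selection rule: include every table in the dbo schema.
table_mappings = {
    "rules": [{
        "rule-type": "selection", "rule-id": "1", "rule-name": "include-dbo",
        "object-locator": {"schema-name": "dbo", "table-name": "%"},
        "rule-action": "include",
    }]
}

task = dms.create_replication_task(
    ReplicationTaskIdentifier="sales-full-load-and-cdc",
    SourceEndpointArn=source["EndpointArn"],
    TargetEndpointArn=target["EndpointArn"],
    ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:EXAMPLE",  # from the previous step
    MigrationType="full-load-and-cdc",   # or "full-load" / "cdc" for the other two methods
    TableMappings=json.dumps(table_mappings),
)

dms.start_replication_task(
    ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
    StartReplicationTaskType="start-replication",
)
```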

DMS – Change Data Capture (CDC) “No Touch” design

• Reads the recovery log of the source database
• Uses the engine's native change data capture API
• No agent required on the source

Some requirements:
• Oracle: supplemental logging required
• MySQL: full-image, row-level binary logging required
• SQL Server: recovery model set to bulk-logged or full
• PostgreSQL: wal_level = logical; max_replication_slots >= 1; max_wal_senders >= 1; wal_sender_timeout = 0 (a quick verification sketch for PostgreSQL follows below)
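For the PostgreSQL case, these settings can be checked directly on the source before a task is created. A minimal sketch using psycopg2 (the connection details are placeholders; the other engines would be verified with their own client tools):

```python
# Sketch: verify PostgreSQL CDC prerequisites for AWS DMS before creating a task.
# Connection details are placeholders; run against the source database.
import psycopg2

CHECKS = {
    "wal_level":             lambda v: v == "logical",
    "max_replication_slots": lambda v: int(v) >= 1,
    "max_wal_senders":       lambda v: int(v) >= 1,
    "wal_sender_timeout":    lambda v: v == "0",   # SHOW returns "0" when the timeout is disabled
}

conn = psycopg2.connect(host="source-db.example.com", dbname="appdb",
                        user="dms_user", password="...")
with conn, conn.cursor() as cur:
    for name, is_ok in CHECKS.items():
        cur.execute("SHOW " + name)      # parameter names come from the fixed dict above
        value = cur.fetchone()[0]
        print(f"{name} = {value} [{'OK' if is_ok(value) else 'NEEDS CHANGE'}]")
```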

• Changes are captured and applied as units of single committed transactions
• Capture is activated when the load starts
• No changes are applied until the load completes; they are then applied as soon as possible, in near real time

Data copy: Existing data is copied from source tables to tables on the target.

Change data capture and apply: Changes to data on the source are captured while the tables are loaded. Once the load is complete, buffered changes are applied to the target. Additional changes captured on the source are applied to the target until the task is stopped or terminated.

Oracle, SQL Server to Aurora migration (using AWS Database Migration Service and the AWS Schema Conversion Tool)

Assessment report: SCT analyses the source database and provides a report with a recommended target engine and information on automatic and manual conversions

Code Browser and recommendation engine: Highlights places that require manual edits and provides architectural and design guidelines.

(Diagram sequence: source → replication instance → target, with application updates continuing on the source throughout)

1. Start full load
2. While loading data, also capture changes
3. Load complete: apply captured changes
4. Changes reach steady state
5. Cutover: shut down apps and apply remaining changes
6. Flip!

Changes are Transactional and Come From the Logs

(Diagram: transactions t1 and t2 are read from the source logs by the replication instance and applied to the target as committed units)

Multiple Targets
(Diagram: one source replicated through a replication instance to multiple targets)

Multiple Sources
(Diagram: multiple sources replicated through a replication instance to one target)

Multiple Sources and Targets
(Diagram: multiple sources replicated through a replication instance to multiple targets)

You Don't Have to Take Everything
(Diagram: only the selected tables and schemas are replicated from the source to the target)

Homogeneous or Heterogeneous
(Diagram: replication instances migrating SQL Server to MySQL, Oracle to Oracle, and Oracle to Aurora)

Amazon Aurora

Enterprise customer wish list

A database that ….

Stays up, even when components fail ….

Performs consistently at enterprise scale …

Doesn’t need an army of experts to manage …

Doesn’t cost a fortune; no licenses to handle …

Amazon Aurora: enterprise-class database for the cloud

We started with enterprise requirements and worked backward to reimagine relational databases for the cloud…

• Enterprise-class availability and performance
• Delivered as a fully managed service
• No licenses; 1/10 the cost of commercial databases

Perfect fit for the enterprise

Enterprise-class availability
• 6-way replication across 3 Availability Zones
• Failover in less than 30 seconds
• Near-instant crash recovery

Performance and scale
• Up to 500K reads/sec and 100K writes/sec
• 15 low-latency (10 ms) read replicas
• Up to 64 TB database-optimized storage volume

Fully managed service
• Instant provisioning and deployment
• Automated patching and software upgrades
• Backup and point-in-time recovery
• Compute and storage scaling

Aurora customer adoption

• Fastest-growing service in AWS history
• Used by 2/3 of the top 100 AWS customers
• Used by 8 of the top 10 gaming customers

A service-oriented architecture applied to the database

1. Moved the logging and storage layer into a multitenant, scale-out, database-optimized storage service
2. Integrated with other AWS services like Amazon EC2, Amazon VPC, Amazon DynamoDB, Amazon SWF, and Amazon Route 53 for control plane operations
3. Integrated with Amazon S3 for continuous backup with 99.999999999% durability

(Diagram: data plane with SQL, transactions, caching, and a separate logging + storage service backed by Amazon S3; control plane built on Amazon DynamoDB, Amazon SWF, and Amazon Route 53)

Delivered as a managed service

Databases are hard to manage
(Chart: typical DBA time is split roughly 40/25/20/5/5/5 percent across backup and recovery plus data load and unload, performance tuning, scripting and coding, security planning, installing/upgrading/patching/migrating, and documentation/licensing/training)

Hosting your databases on premises: you manage everything
• Power, HVAC, networking
• Rack and stack
• Server maintenance
• OS installation
• OS patches
• DB software installs
• DB software patches
• Database backups
• Scaling
• High availability
• App optimization

Hosting your databases in Amazon EC2: AWS takes care of the facility and hardware (power, HVAC, networking; rack and stack; server maintenance), and you manage
• OS installation
• OS patches
• DB software installs
• DB software patches
• Database backups
• Scaling
• High availability
• App optimization

If you choose a managed DB service: AWS also handles the OS, database software, backups, scaling, and high availability, and you manage
• App optimization

Learning Resources – Amazon Aurora

Service page – https://aws.amazon.com/rds/aurora/

Deep dive video (from re:Invent 2016) here – https://youtu.be/duf5uUsW3TM

Getting started with Aurora whitepaper – https://d0.awsstatic.com/whitepapers/getting-started-with-amazon-aurora.pdf

Performance Benchmark Guide – https://d0.awsstatic.com/product-marketing/Aurora/RDS_Aurora_Performance_Assessment_Benchmarking_v1-2.pdf

More resources found here – https://aws.amazon.com/rds/aurora/resources/

Before we get into the demo:
Step 1: Database Migration Assessment

1. Connect Schema Conversion Tool to source and target databases.

2. Run Assessment Report.

3. Read Executive Summary.

4. Follow detailed instructions.

Demo time!

Best Practices – AWS Schema Conversion Tool

General Memory Management and Performance Options
Configure the AWS Schema Conversion Tool with different memory/performance settings. Increasing memory speeds up the conversion but uses more memory resources on your desktop.

Fast conversion, but large memory consumption – This option optimizes for speed of the conversion, but might require more memory for the object reference cache.

Low memory consumption, but slower conversion – This option minimizes the amount of memory used, but results in a slower conversion. Use this option if your desktop has a limited amount of memory.

Balance speed with memory consumption – This option provides a balance between memory use and conversion speed.

If you are converting large database schemas (for example, a database with 3,500 stored procedures), you can configure the amount of memory available to the AWS Schema Conversion Tool. Details here: http://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_SchemaConversionTool.BestPractices.html

Best Practices – AWS Database Migration Service

• Load multiple tables in parallel (see the task-settings sketch below)
• Remove bottlenecks on the target
• Use multiple tasks
• Optimize change processing
• Determine the optimal size for the replication instance based on:
  • Table size
  • Data manipulation language (DML) activity
  • Transaction size
  • Total size of the migration
  • Number of tasks
• Migrating large binary objects (LOBs)

The complete list of best practices can be found here: http://docs.aws.amazon.com/dms/latest/userguide/CHAP_BestPractices.html
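As one concrete illustration of loading multiple tables in parallel, DMS task settings expose a full-load parallelism setting. A hedged sketch, reusing the boto3 client from the earlier examples (the task ARN and values are placeholders, and the task must be stopped before its settings are modified):

```python
# Sketch: raise full-load parallelism and tune LOB handling via task settings (placeholder values).
import json
import boto3

dms = boto3.client("dms", region_name="us-east-1")

task_settings = {
    "FullLoadSettings": {
        "MaxFullLoadSubTasks": 16,      # number of tables loaded in parallel (default is 8)
        "CommitRate": 10000,            # rows per commit during full load
    },
    "TargetMetadata": {
        "SupportLobs": True,
        "LimitedSizeLobMode": True,     # faster limited-LOB mode
        "LobMaxSize": 32,               # KB; larger LOBs are truncated, so size this to your data
    },
}

# The task must be in a stopped state before modifying its settings.
dms.modify_replication_task(
    ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:EXAMPLE",
    ReplicationTaskSettings=json.dumps(task_settings),
)
```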

Thank you!

Paras Bhuva, Solutions Architect, Amazon Web Services
bhuparas@amazon.com
2/22/2017

@parasbhuva
