Azure Data Factory Updates: Copy data from MySQL, DB2, Sybase, PostgreSQL and Teradata

We are happy to announce that we have added support for new data stores to Azure Data Factory. You can now also connect to MySQL, DB2, Sybase, PostgreSQL and Teradata, and copy data from any of these sources to Azure Blob or Azure SQL Database.

Let's walk through an example of how to move data from MySQL to Azure Blob. First, add a linked service for MySQL:

[Figure 1: Adding a MySQL linked service in the Azure portal]

The linked service definition for MySQL will look like this:

{
    "name": "MyLinkedService_MySQL",
    "properties":
    {
        "type": "OnPremisesMySqlLinkedService",
        "server": "[ServerName]",
        "database": "[DatabaseName]",
        "username": "[UserName]",
        "password": "[Password]",
        "gatewayName": "[GatewayNameUsedToConnectToThisMySQL]",
        "authenticationType": "[Type of authentication: Basic/Windows/Anonymous]"
    }
}

A description of each field can be found here. Please note that to do the same for DB2, Sybase, PostgreSQL or Teradata, you just need to specify a different ‘type’, which can be found here; the remaining fields are the same.
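For instance, a PostgreSQL linked service might look like the sketch below. The ‘OnPremisesPostgreSqlLinkedService’ type name is only an assumption inferred by analogy with the MySQL type above; verify the exact value against the type list linked in the previous paragraph.

{
    "name": "MyLinkedService_PostgreSQL",
    "properties":
    {
        "type": "OnPremisesPostgreSqlLinkedService",
        "server": "[ServerName]",
        "database": "[DatabaseName]",
        "username": "[UserName]",
        "password": "[Password]",
        "gatewayName": "[GatewayNameUsedToConnectToThisPostgreSQL]",
        "authenticationType": "[Type of authentication: Basic/Windows/Anonymous]"
    }
}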

Next, let's define the dataset that we intend to copy over. It will look like this:

{
    "name": "DataSet_MySQL",
    "properties":
    {
        "location":
        {
            "type": "RelationalTableLocation",
            "tableName": "northwind_mysql.orders",
            "linkedServiceName": "MyLinkedService_MySQL"
        },
        "availability":
        {
            "frequency": "Hour",
            "interval": 1,
            "waitOnExternal":
            {
                "retryInterval": "00:01:00",
                "retryTimeout": "00:10:00",
                "maximumRetry": 3
            }
        }
    }
}
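The pipeline in the next step also writes to an output dataset named DataSet_AzureBlob, which this post does not spell out. As a rough sketch (the AzureBlobLocation type, the TextFormat settings and the placeholder names below are assumptions based on the same generation of the Data Factory JSON schema, not copied from this post), it might look like this:

{
    "name": "DataSet_AzureBlob",
    "properties":
    {
        "location":
        {
            "type": "AzureBlobLocation",
            "folderPath": "[ContainerName]/[FolderPath]",
            "format":
            {
                "type": "TextFormat",
                "columnDelimiter": ","
            },
            "linkedServiceName": "[AzureStorageLinkedServiceName]"
        },
        "availability":
        {
            "frequency": "Hour",
            "interval": 1
        }
    }
}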

Now that the datasets are defined, let's use them in a pipeline to copy the data from MySQL to Azure Blob:

{
    "name": "CopyPipeline_MySQL_To_Blob",
    "properties":
    {
        "activities":
        [
            {
                "name": "MySQLToBlobCopyActivity",
                "inputs": [ { "name": "DataSet_MySQL" } ],
                "outputs": [ { "name": "DataSet_AzureBlob" } ],
                "type": "CopyActivity",
                "transformation":
                {
                    "source":
                    {
                        "type": "RelationalSource",
                        "query": "select * from northwind_mysql.orders"
                    },
                    "sink":
                    {
                        "type": "BlobSink",
                        "writeBatchSize": 1000000,
                        "writeBatchTimeout": "01:00:00"
                    }
                },
                "policy":
                {
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst",
                    "retry": 2,
                    "timeout": "01:00:00"
                }
            }
        ]
    }
}

With this quick and simple configuration, you now have a pipeline that copies data from on-premises MySQL to Azure Blob.

[Figure 2: The pipeline copying data from on-premises MySQL to Azure Blob]

The complete copy matrix along with the related information can be found here.

Note: These additional data stores can currently be used only as sources for a copy activity, not as destinations.

We are continuously adding more data stores to Azure Data Factory. If you require one that we don’t currently support, or have any other feedback, visit Azure Data Factory User Voice and/or Forums and let us know your thoughts. We are eager to hear from you!
