
Created | Service | Type | Note
2/8/2019 | Data Factory | New Features | With Mapping Data Flow in ADF, customers can visually design, build, and manage data transformation processes without learning Spark or needing a deep understanding of the underlying distributed infrastructure.
2/8/2019 | Data Factory | New Features | Today, we are excited to announce the release of a set of new ADF connectors that enable more scenarios and possibilities for your analytic workloads. For example, you can now:
- Ingest data from Google Cloud Storage into Azure Data Lake Storage Gen2 and process it using Azure Databricks jointly with data coming from other sources.
- Bring data from any S3-compatible data store that you may consume from third-party data vendors into Azure.
- Copy data from MongoDB and other sources to Azure Cosmos DB's API for MongoDB for application consumption.
- Retrieve data from any RESTful endpoint as an extensible point to reach hundreds of SaaS applications.
12/20/2018 | App Services | New Features | Azure Functions is now integrated with ADF, allowing you to run an Azure function as a step in your data factory pipelines.
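As a rough sketch of what such a pipeline step looks like, the Python snippet below builds the JSON definition of an Azure Function activity as a plain dict. The activity, function, and linked-service names are hypothetical placeholders, and the property names follow the ADF activity schema as I understand it; check the official ADF documentation before relying on them.

```python
import json

# Sketch of an ADF pipeline activity that invokes an Azure Function.
# "MyFunctionLinkedService" and "ProcessBlob" are hypothetical names.
azure_function_activity = {
    "name": "RunProcessingFunction",
    "type": "AzureFunctionActivity",
    "linkedServiceName": {
        "referenceName": "MyFunctionLinkedService",  # Azure Function linked service
        "type": "LinkedServiceReference",
    },
    "typeProperties": {
        "functionName": "ProcessBlob",  # function inside the Function App
        "method": "POST",               # HTTP method used to invoke it
        "body": {"source": "adf"},      # optional request payload
    },
}

print(json.dumps(azure_function_activity, indent=2))
```

The activity slots into a pipeline's `activities` array like any other step, so its output can feed downstream activities via dependency conditions.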
11/30/2018 | Data Factory | New Features | Azure Data Factory now supports service principal and managed service identity (MSI) authentication for Azure Data Lake Storage Gen2 connectors, in addition to Shared Key authentication. You can use these new authentication types when copying data to and from Data Lake Storage Gen2. Learn more about configurations and prerequisites from Service principal authentication and MSI authentication.
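To illustrate, here is a minimal sketch of an ADLS Gen2 linked service definition using service principal authentication, built as a Python dict. All angle-bracketed values are hypothetical placeholders, and the property names reflect the ADF connector schema as I understand it from its documentation.

```python
import json

# Sketch of an ADF linked service for Azure Data Lake Storage Gen2
# with service principal authentication. Placeholder values only.
adls_gen2_linked_service = {
    "name": "AzureDataLakeStorageGen2LinkedService",
    "properties": {
        "type": "AzureBlobFS",  # connector type for ADLS Gen2
        "typeProperties": {
            "url": "https://<accountname>.dfs.core.windows.net",
            "servicePrincipalId": "<application-id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<application-key>",
            },
            "tenant": "<tenant-id>",
        },
    },
}

# For MSI authentication, the service principal fields are omitted and
# the data factory's own managed identity is used instead.
print(json.dumps(adls_gen2_linked_service, indent=2))
```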
9/13/2018 | Data Factory | New Feature | In Azure Data Factory, you can now copy data from Oracle Service Cloud and Google AdWords by using Copy Activity. For more information, see the Oracle Service Cloud connector and Google AdWords connector articles.
8/23/2018 | Data Factory | New Feature | Azure Data Factory now supports service principal and managed service identity (MSI) authentication for Azure Blob storage, in addition to Shared Key and SAS token authentication. You can use these new authentication types, for example, when copying data to or from Blob storage, or when looking up or getting metadata from Blob storage. Learn more from the Azure Storage Azure Active Directory authentication overview and the Azure Data Factory configurations and prerequisites.
8/9/2018 | Data Factory | New Feature | GitHub is a development platform that allows you to host and review code, manage projects, and build software alongside millions of other developers, from open source to business. Azure Data Factory (ADF) is a managed data integration service in Azure that allows you to iteratively build, orchestrate, and monitor your Extract Transform Load (ETL) workflows. You can now integrate your Azure Data Factory with GitHub. The ADF visual authoring integration with GitHub allows you to collaborate with other developers and to source-control and version your data factory assets (pipelines, datasets, linked services, triggers, and more). Simply click ‘Set up Code Repository’ and select ‘GitHub’ from the Repository Type dropdown to get started.
8/6/2018 | Data Factory | New Feature | The Azure Data Factory/Azure Cosmos DB connector is now integrated with the Azure Cosmos DB bulk executor library to provide the best performance. Data Factory now supports writing to Azure Cosmos DB by using UPSERT in addition to INSERT. You can find the configuration in the Data Factory UI both for pipeline activity authoring and for the Copy Data tool wizard. For the Azure Cosmos DB sink, you can choose upsert or insert.
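The same choice can be made in a JSON pipeline definition. Below is a hedged sketch of a Copy Activity whose Cosmos DB sink requests upsert behavior, built as a Python dict. The dataset and activity names are hypothetical, and the sink type and `writeBehavior` property names follow the Cosmos DB connector schema as I understand it.

```python
import json

# Sketch of a Copy Activity writing to Azure Cosmos DB with upsert.
# Dataset and activity names are hypothetical placeholders.
copy_activity = {
    "name": "CopyToCosmosDb",
    "type": "Copy",
    "inputs": [{"referenceName": "SourceDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "CosmosDbDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {"type": "BlobSource"},
        "sink": {
            "type": "DocumentDbCollectionSink",
            "writeBehavior": "upsert",  # the alternative is "insert"
        },
    },
}

print(json.dumps(copy_activity, indent=2))
```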
6/27/2018 | Data Factory | New Feature | Using the code-free ADF UI/app, data engineers and developers can now provision and monitor the Azure-SSIS Integration Runtime (IR), dedicated ADF servers for SSIS package execution.
6/21/2018 | Data Factory | New Feature | Today, we are announcing support for event-based triggers in your Azure Data Factory (ADF) pipelines. A lot of data integration scenarios require data factory customers to trigger pipelines based on events. A typical event could be a file landing in, or getting deleted from, your Azure storage. Now you can simply create an event-based trigger in your data factory pipeline.
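A minimal sketch of such a trigger definition, built as a Python dict, is shown below. The trigger name, pipeline name, container path, and storage-account scope are hypothetical placeholders, and the `BlobEventsTrigger` property names follow the ADF trigger schema as I understand it.

```python
import json

# Sketch of an event-based trigger that fires when a blob is created
# under a given container path. All names are hypothetical placeholders.
event_trigger = {
    "name": "OnNewBlobTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/input-container/blobs/",
            # Microsoft.Storage.BlobDeleted covers the file-deletion case
            "events": ["Microsoft.Storage.BlobCreated"],
            "scope": "<resource-id-of-the-storage-account>",
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "MyPipeline",
                    "type": "PipelineReference",
                }
            }
        ],
    },
}

print(json.dumps(event_trigger, indent=2))
```

When the trigger fires, the referenced pipeline runs, and properties of the triggering blob (such as its path) can be passed to pipeline parameters.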
