Azure Data Factory

  1. Failed to merge '{0}' with existing secret because of {1} mismatch. New {2} = '{3}', existing {4} = '{5}'. Please make sure ...
  2. Failed to merge new Sql connection string with the existing one because of DataSource mismatch. New DataSource = "{0}", existing ...
  3. Failed to merge new Sql connection string with the existing one because of DataSource mismatch. New DataSource = '{0}', existing ...
  4. Failed to merge new Sql connection string with the existing one because of UserID mismatch. New UserID = "{0}", existing ...
  5. Failed to merge new Sql connection string with the existing one because of UserID mismatch. New UserID = '{0}', existing ...
  6. Failed to merge shared access key with existing ServiceBus transport shared access key because of ServiceBusEndpoint mismatch. ...
  7. Failed to merge shared access key with existing ServiceBus transport shared access key because of ServiceBusEndpoint mismatch. ...
  8. Failed to merge shared access key with existing ServiceBus transport shared access key because of ServiceBusSharedAccessKeyName ...
  9. Failed to merge shared access key with existing ServiceBus transport shared access key because of ServiceBusSharedAccessKeyName ...
  10. Failed to retrieve the credential for linked service. If this error persists, please try to remove the linked service and ...
  11. File format '{0}' is not enabled for subscription '{1}'. Please contact Azure Data Factory support team for further assistance. ...
  12. File format '{0}' is not enabled for subscription '{1}'. Please contact Azure Data Factory support team for further assistance: ...
  13. Firewall rules for the selected SQL server do not allow connection to server from Azure services. Change the server settings ...
  14. Five formats are supported: TextFormat, AvroFormat, JsonFormat, OrcFormat and ParquetFormat. If the type is 'TextFormat', ...
  15. Following properties for resource '{1}' have been deprecated: {0}. Please remove these properties from the JSON definition. ...
  16. Four formats are supported: TextFormat, AvroFormat, JsonFormat and OrcFormat. If the type is 'TextFormat', you can specify ...
  17. FTP Server is not supported for subscription '{0}'. Please contact Azure Data Factory support team for further assistance: ...
  18. Host name which can be either UNC name e.g. \\server or localhost for the same machine hosting the gateway - required for ...
  19. Hub '{0}' is not ready. Current status: {1}. If '{0}' is in failed state, you can investigate by getting the object and inspecting ...
  20. Hub Name value is not valid for Pipeline '{0}'. Hub Name must not be specified for Pipelines with Null Activities and StoredProcedure ...
  21. Hub Name value is not valid for Pipeline {0}. Hub Name must not be specified for Pipelines with Null Activities and StoredProcedure ...
  22. Hub {0} is not ready. Current status: {1}. If {0} is in failed state, you can investigate by getting the object and inspecting ...
  23. In copy activity, a relational database (MySQL/DB2/Teradata/PostgreSQL/Sybase) based dataset can only be used as a copy source ...
  24. Invalid .Net activity package found in blob. Ensure that the package files are in the root of the zip file. Assembly Name: ...
  25. Invalid ClusterType in on-demand HDI linked service for Spark activity. DataFactoryName '{0}', PipelineName '{1}', ActivityName ...
  26. Invalid custom activity package found in blob. Ensure that the package files are in the root of the zip file. Assembly Name: ...
  27. Invalid parameter '{0}' provided to copy activity. This could be caused by the mismatch between the information in encrypted ...
  28. is not ready. Current status: {2}. If {0} {1} is in failed state, you can investigate by getting the object and inspecting ...
  29. Job submission failed, the user '{0}' does not have permissions to a subfolder in the /system/ path needed by Data Lake Analytics. ...
  30. Linked service '{0}' has an invalid authorization property value. Please re-authorize the linked service and deploy it again. ...
  31. Linked service '{0}' is not ready. Current status: {1}. If '{0}' is in failed state, you can investigate by getting the object ...
  32. Linked service '{0}' property '{1}' specifies value '{2}', which is not a valid URL. Please correct the URL and upload the ...
  33. Linked service '{0}' specifies authentication type '{1}' which is not supported for linked service type '{2}'. Please see ...
  34. Linked service type '{0}' is not enabled for subscription '{1}'. Please contact Azure Data Factory support team for further ...
  35. Linked service type '{0}' is not enabled for subscription '{1}'. Please contact Azure Data Factory support team for further ...
  36. Linked service {0} is not ready. Current status: {1}. If {0} is in failed state, you can investigate by getting the object ...
  37. Microsoft and Windows are either registered trademarks or trademarks of Microsoft Corporation in the U.S. and/or other countries. ...
  38. Microsoft Azure Data Factory is a cloud-based data integration service that automates the movement and transformation of ...
  39. Name of the gateway that the Data Factory service should use to connect to the on-premises DB2 database - required for credential ...
  40. Name of the gateway that the Data Factory service should use to connect to the on-premises HDFS - required for credential ...
  41. Name of the gateway that the Data Factory service should use to connect to the on-premises MySQL database - required for ...
  42. Name of the gateway that the Data Factory service should use to connect to the on-premises ODBC source - required for credential ...
  43. Name of the gateway that the Data Factory service should use to connect to the on-premises PostgreSQL database - required ...
  44. Name of the gateway that the Data Factory service should use to connect to the on-premises SQL Server database - required ...
  45. Name of the gateway that the Data Factory service should use to connect to the on-premises Sybase database - required for ...
  46. Name of the gateway that the Data Factory service should use to connect to the on-premises Teradata database - required for ...
  47. Name of the linked service that refers to an Azure Data Lake Store. This linked service must be of type: AzureDataLakeStoreLinkedService ...
  48. Name of the linked service that refers to an Azure SQL DW database. This linked service must be of type: AzureSqlDWLinkedService ...
  49. Name of the linked service that refers to an SQL Server Database. This linked service must be of type: OnPremisesSqlServer ...
  50. Name: '{0}' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/en-us/library/dn835027.aspx ...
  51. Name: '{0}' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/library/dn835027.aspx ...
  52. No data gateway is selected. Select an online data gateway in DATA GATEWAY field and reopen CREDENTIALS blade to set your ...
  53. No pipeline produces Dataset '{0}' from {1} to {2}. - Try changing the slice time range to fall within the active period ...
  54. No pipeline produces Dataset '{0}' from {1} to {2}. - Try changing the slice time range to fall within the active period ...
  55. No Pipeline produces Dataset '{0}'. Please deploy a pipeline that produces the Dataset first. If this Dataset is an input ...
  56. No Pipeline produces Dataset '{0}'. Please deploy a pipeline that produces the Dataset first. If this Dataset is an input ...
  57. No pipeline produces Dataset {0} from {1} to {2}. - Try changing the slice time range to fall within the active period of ...
  58. No pipeline produces Dataset {0} from {1} to {2}. - Try changing the slice time range to fall within the active period of ...
  59. No Pipeline produces Dataset {0}. Please deploy a pipeline that produces the Dataset first. If this Dataset is an input to ...
  60. No Pipeline produces Dataset {0}. Please deploy a pipeline that produces the Dataset first. If this Dataset is an input to ...
  61. OData with OAuth is not supported for subscription '{0}'. Please contact Azure Data Factory support team for further assistance. ...
  62. OData with Windows Auth is not supported for subscription '{0}'. Please contact Azure Data Factory support team for further ...
  63. OData with Windows Auth or OAuth is not supported for subscription '{0}'. Please contact Azure Data Factory support team ...
  64. On-premises Cassandra is not supported for subscription '{0}'. Please contact Azure Data Factory support team for further ...
  65. On-premises MongoDB is not supported for subscription '{0}'. Please contact Azure Data Factory support team for further assistance. ...
  66. On-premises Oracle with ODBC driver is not supported for subscription '{0}'. Please contact Azure Data Factory support team ...
  67. Ondemand Spark is not supported for subscription '{0}'. Please contact Azure Data Factory support team for further assistance: ...
  68. One or more IP addresses or host names of the Cassandra server. Specify a comma-separated list of IP addresses and/or host ...
  69. Onpremises Cassandra is not supported for subscription '{0}'. Please contact Azure Data Factory support team for further ...
  70. Operations API access is not allowed for your tenant: {0}. Please contact Azure Data Factory support team directly if you ...
  71. Operations API access is not allowed for your tenant: {0}. Please contact Azure Data Factory support team directly if you ...
  72. Optional Click 'Authorize' to allow this data factory and the activities it runs to access this OData source with your access ...
  73. Package connection string used in activity '{0}' does not contain AccountName and AccountKey. Package connection string must ...
  74. Path to the container and folder in the blob storage. Example: myblobcontainer/myblobfolder/{Year}/{Month}/{Day}. The below ...
  75. Pipeline '{0}' and Linked Service '{1}' referred by Activity '{2}' are not on the same hub. Please ensure that all activities ...
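
Several of the messages above (the format-type validations in entries 14 and 16, and the blob folder path description in entry 74) refer to the JSON dataset definitions that classic Data Factory (v1) consumes. As a minimal sketch — the dataset and linked service names below are hypothetical, and the shape assumes the ADF v1 dataset schema — a blob dataset using TextFormat with the {Year}/{Month}/{Day} partition pattern might look like:

```json
{
  "name": "MyBlobDataset",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "MyBlobStorageLinkedService",
    "typeProperties": {
      "folderPath": "myblobcontainer/myblobfolder/{Year}/{Month}/{Day}",
      "partitionedBy": [
        { "name": "Year",  "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyy" } },
        { "name": "Month", "value": { "type": "DateTime", "date": "SliceStart", "format": "MM" } },
        { "name": "Day",   "value": { "type": "DateTime", "date": "SliceStart", "format": "dd" } }
      ],
      "format": {
        "type": "TextFormat",
        "columnDelimiter": ","
      }
    },
    "availability": {
      "frequency": "Day",
      "interval": 1
    }
  }
}
```

The `partitionedBy` entries supply values for the `{Year}`, `{Month}`, and `{Day}` tokens in `folderPath` from each slice's start time; the `format.type` value is what the "formats are supported" messages validate.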