Azure Data Factory

  1. The linked service '{0}' cannot be converted to a ConnectionInfo object because the conversion is not supported for Asset ...
  2. The name can contain only letters, numbers and hyphens. The first and last characters must be a letter or number. Spaces ...
  3. The properties LinkedServiceReference and SecureString are only supported for string and array of strings type properties. ...
  4. The properties OracleWriterStoredProcedureName and OracleWriterTableType are not supported yet. Please remove them in OracleSink. ...
  5. The property 'allowPolyBase' in SQL Data Warehouse Sink must be set to 'true' when property 'polyBaseSettings' is specified. ... (a sink sketch follows this list)
  6. The property 'ExtendedProperties' is not a valid property for the activity. This property has been deprecated. Please use ...
  7. The property UpdateResourceEndpoint in linked service '{0}' specifies a URL for a new ARM resource that is not supported ...
  8. The property {0} is not a valid property for HDInsightBYOCLinkedService. This property has been deprecated. Please use 'LinkedServiceName' ...
  9. The Recursive property is not allowed for your subscription. Please contact Azure Data Factory team if you would like to ...
  10. The Recursive property is not allowed for your subscription. Please contact Azure Data Factory team if you would like to ...
  11. The request cannot be fulfilled because data factory '{0}' has been temporarily locked. Its pipelines will continue to run; ...
  12. The request cannot be fulfilled because data factory '{0}' has been temporarily placed in readonly mode. Its pipelines will ...
  13. The required properties of the Spark activity are missing. DataFactoryName '{0}', PipelineName '{1}', ActivityName '{2}', RootPath ...
  14. The resource type name '{0}' is invalid and has changed to '{1}'. Please change the routing path of the request accordingly. ...
  15. The resource type name in the request URI was '{0}' but is expected to be '{1}'. Please change the routing path of the request ...
  16. The schema of the input at '{0}' does not match the model at '{1}'. Go to your Azure Machine Learning web service page and ...
  17. The schema of the input at '{0}' does not match the model at '{1}'. Go to your Azure Machine Learning web service page and ...
  18. The slice (start: {0}, end: {1}) failed because these linked services are in failed state: {2}. Please update the linked ...
  19. The slice (start: {0}, end: {1}) failed because these tables are in failed state: {2}. Please update the tables to fix the ...
  20. The slice (start: {0}, end: {1}) failed because these tables {2} and these linked services {3} are in failed state. Please ...
  21. The SliceIdentifierColumnName property is not allowed for your subscription. Please contact Azure Data Factory team if you ...
  22. The SliceIdentifierColumnName property is not allowed for your subscription. Please contact Azure Data Factory team if you ...
  23. The SQL Reader Stored Procedure Parameter of SQL source is not allowed for your subscription. Please contact Azure Data Factory ...
  24. The SQL Reader Stored Procedure Parameter of SQL source is not allowed for your subscription. Please contact Azure Data Factory ...
  25. The SqlWriterCleanupScript property is not allowed for your subscription. Please contact Azure Data Factory team if you would ...
  26. The SqlWriterCleanupScript property is not allowed for your subscription. Please contact Azure Data Factory team if you would ...
  27. The start date-time of the duration in which data processing will occur or the data slices will be processed. Example: 2014-05-01T00:00:00Z ...
  28. The Stored Procedure Activity '{0}' Linked Service is missing. Please specify an SQL Linked Service in the activity definition. ...
  29. The value specified for 'fileLinkedService' property in Hadoop Streaming Activity '{0}' is '{1}', but no Linked Service with ...
  30. The value specified for 'fileLinkedService' property in Hadoop Streaming Activity '{0}' is '{1}', but that Linked Service ...
  31. The value specified for the interval property under availability settings is less than the minimum supported interval of {0} ... (an availability sketch follows this list)
  32. The value specified for the interval property under availability settings is less than the minimum supported interval of ...
  33. This is deprecated. Please instead navigate to 'Author and deploy'. Then use the 'New data store' command or select an existing ...
  34. This Service Bus Namespace will be used to create Queues to manage communication channels with the on-premises Hadoop cluster. ...
  35. This will make the data gateway non-functional immediately until the data gateway is re-registered with the regenerated ...
  36. Three formats are supported: TextFormat, AvroFormat, JsonFormat. If the type is 'TextFormat', you can specify the following ... (a format sketch follows this list)
  37. To use Azure Active Directory based OAuth authentication, please provide an OData resource "url" and specify "authenticationType" ...
  38. To use OAuth authentication type, authorization is required to deploy this linked service. Click "Authorize" to allow this ...
  39. Two formats are supported: TextFormat, AvroFormat. If the type is 'TextFormat', you can specify the following properties. ...
  40. The U-SQL script was not found. Please specify a valid U-SQL script in either the script property or the scriptLinkedService ...
  41. Unable to convert "{0}" to a DateTime value (InnerException = "Unexpected JToken type {1}"). Please use ISO8601 DateTime ... (a date-time/TimeSpan sketch follows this list)
  42. Unable to convert "{0}" to a DateTime value. Please use ISO8601 DateTime format such as "2014-10-01T13:00:00Z" for UTC time, ...
  43. Unable to convert "{0}" to a DateTime value. We recommend using ISO8601 format with a timezone designator such as "2014-10-01T13:00:00Z" ...
  44. Unable to convert "{0}" to a TimeSpan value. Please use a culture-insensitive format such as "00:30:00" (for 30 minutes). ...
  45. Unable to convert '{0}' to a DateTime value (InnerException = "Unexpected JToken type {1}"). Please use ISO8601 DateTime ...
  46. Unable to convert '{0}' to a DateTime value. Please use ISO8601 DateTime format such as "2014-10-01T13:00:00Z" for UTC time, ...
  47. Unable to convert '{0}' to a DateTime value. We recommend using ISO8601 format with a timezone designator such as "2014-10-01T13:00:00Z" ...
  48. Unable to convert '{0}' to a TimeSpan value. Please use a culture-insensitive format such as "00:30:00" (for 30 minutes). ...
  49. Unable to create or update an on-demand HDInsight cluster with {0} cores for subscription {1} because the maximum number ...
  50. Unable to create or update an on-demand HDInsight cluster with {0} cores for subscription {1} because the maximum number ...
  51. Unable to create or update an on-demand HDInsight cluster with {0} cores for subscription {1} because the maximum number ...
  52. Unable to create or update an on-demand HDInsight cluster with {0} cores for subscription {1} because the maximum number ...
  53. Unable to create or update an on-demand HDInsight cluster with {0} cores for subscription {1}. The Azure Data Factory quota ...
  54. Unable to create or update the pipeline '{0}' in the data factory '{1}' because activity '{2}' Concurrency is greater than ...
  55. Unable to create or update the pipeline '{0}' in the data factory '{1}' because activity '{2}' LongRetry is greater than ...
  56. Unable to create or update the pipeline '{0}' in the data factory '{1}' because activity '{2}' Retry is greater than the ...
  57. Unable to create or update the pipeline '{0}' in the data factory '{1}' because it has {2} activities, which is more than ...
  58. Unable to create or update the pipeline: {0} in the data factory: {1} because activity: {2}'s Concurrency is greater than ...
  59. Unable to create or update the pipeline: {0} in the data factory: {1} because activity: {2}'s Concurrency is greater than ...
  60. Unable to create or update the pipeline: {0} in the data factory: {1} because activity: {2}'s LongRetry is greater than the ...
  61. Unable to create or update the pipeline: {0} in the data factory: {1} because activity: {2}'s LongRetry is greater than the ...
  62. Unable to create or update the pipeline: {0} in the data factory: {1} because activity: {2}'s Retry is greater than the maximum ...
  63. Unable to create or update the pipeline: {0} in the data factory: {1} because activity: {2}'s Retry is greater than the maximum ...
  64. Unable to create or update the pipeline: {0} in the data factory: {1} because it has {2} activities, which is more than the ...
  65. Unable to create or update the pipeline: {0} in the data factory: {1} because it has {2} activities, which is more than the ...
  66. Unable to create the data factory '{0}' for the subscription {1} because this subscriber already has {2} data factories, ...
  67. Unable to create the data factory: {0} for the subscription: {1} because this subscriber already has {2} data factories, ...
  68. Unable to create the data factory: {0} for the subscription: {1} because this subscriber already has {2} data factories, ...
  69. Unable to create the Dataset '{0}' in the data factory '{1}' because it already has {2} Datasets, which is the limit per ...
  70. Unable to create the Dataset: {0} in the data factory: {1} because it already has {2} Datasets, which is the limit per data ...
  71. Unable to create the Dataset: {0} in the data factory: {1} because it already has {2} Datasets, which is the limit per data ...
  72. Unable to create the pipeline '{0}' in the data factory '{1}' because it already has {2} pipelines, which is the limit per ...
  73. Unable to create the pipeline: {0} in the data factory: {1} because it already has {2} pipelines, which is the limit per ...
  74. Unable to create the pipeline: {0} in the data factory: {1} because it already has {2} pipelines, which is the limit per ...
  75. Unable to find the Data Lake Analytics account because the DataLakeAnalyticsUri '{0}' could not be resolved. Please change ...
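
Item 5 refers to the copy activity sink for Azure SQL Data Warehouse. Below is a minimal sketch of the constraint, written as a Python dictionary that mirrors the sink JSON; the polyBaseSettings property names shown (rejectType, rejectValue, rejectSampleValue, useTypeDefault) are illustrative and should be checked against the current Data Factory reference.

    import json

    # Copy activity sink fragment for Azure SQL Data Warehouse (item 5):
    # whenever "polyBaseSettings" is present, "allowPolyBase" must be true,
    # otherwise deployment fails with the error above.
    sql_dw_sink = {
        "type": "SqlDWSink",
        "allowPolyBase": True,        # must be true when polyBaseSettings is specified
        "polyBaseSettings": {         # optional PolyBase tuning block (illustrative values)
            "rejectType": "percentage",
            "rejectValue": 10.0,
            "rejectSampleValue": 100,
            "useTypeDefault": True,
        },
    }

    print(json.dumps(sql_dw_sink, indent=2))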
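Items 31 and 32 concern the dataset availability section. A short sketch of the shape being validated follows; the 15-minute figure is only an assumption about a common minimum, since the actual minimum is reported by the error itself in {0}.

    # Dataset availability fragment (items 31 and 32): frequency plus interval
    # define the slice size, and the interval must not fall below the service minimum.
    availability = {
        "availability": {
            "frequency": "Minute",   # Minute, Hour, Day, Week, or Month
            "interval": 15,          # assumed 15-minute slices; smaller values are rejected
        }
    }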
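Items 36 and 39 describe the format element of a dataset definition. A hedged sketch of a TextFormat block follows; the property names shown (columnDelimiter, rowDelimiter, firstRowAsHeader) are common TextFormat options and may not be the full set, while AvroFormat needs only the type.

    import json

    # Dataset "format" fragments (items 36 and 39).
    text_format = {
        "type": "TextFormat",
        "columnDelimiter": ",",      # field separator
        "rowDelimiter": "\n",        # record separator
        "firstRowAsHeader": True,    # skip the header row when reading
    }

    avro_format = {"type": "AvroFormat"}  # no additional properties required

    print(json.dumps({"format": text_format}, indent=2))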
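The conversion errors in items 41 through 48 come down to value formatting: date-times must be ISO 8601 strings with a timezone designator, and durations must be culture-insensitive TimeSpan strings, exactly as the messages suggest. A small sketch follows; the pipeline-level start, end, and policy timeout fields are used purely as illustration.

    import json
    from datetime import datetime, timezone

    # ISO 8601 date-times and an invariant "hh:mm:ss" TimeSpan (items 41-48).
    pipeline_fragment = {
        "start": "2014-10-01T13:00:00Z",   # UTC, ISO 8601 with "Z" designator
        "end": datetime(2014, 10, 2, 13, 0, 0, tzinfo=timezone.utc).isoformat(),
        "policy": {
            "timeout": "00:30:00",         # 30 minutes, culture-insensitive TimeSpan
        },
    }

    print(json.dumps(pipeline_fragment, indent=2))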