Azure Stream Analytics

  1. Name can contain only letters, numbers, periods, hyphens, and underscores. Name must start and end with a letter or number. ...
  2. Name can only contain lowercase letters, numbers, and hyphens, and must begin with a letter or a number. The name can't ...
  3. Note: This output has permanent access to your Data Lake Store account. Access to Data Lake, once granted, does not expire ...
  4. Note: This output has permanent access to your Power BI dashboard. Access to Power BI, once granted, does not expire unless ...
  5. Note: This output has permanent access to your Power BI dashboard. Access to Power BI, once granted, does not expire unless ...
  6. Note: You are granting this output permanent access to your Data Lake Store account. Should you need to revoke this access ...
  7. Note: You are granting this output permanent access to your Power BI dashboard. Should you need to revoke this access in ...
  8. Note: You are granting this output permanent access to your Power BI dashboard. Should you need to revoke this access in ...
  9. Number of parameters used in function '{0}' does not match function definition: expected {1} parameters, {2} were provided. ...
  10. One or more streaming jobs specified were not moved because they were in a running state. A streaming job must be in a stopped ...
  11. One or more streaming jobs specified were not moved because they experienced an unexpected error while being moved to the ...
  12. One or more streaming jobs specified were not moved because they had a different subscription id or resource group than the ...
  13. One or more streaming jobs specified were not moved because they have names that already exist in the target subscription ...
  14. Only a single parameter must be annotated using the ProjectionAction attribute. The following parameters in function {0} ...
  15. Only Unique or Primary Key constraints can be created on computed columns, while Check, Foreign Key, and Not Null constraints ...
  16. Output event contains a column called {0} that is not configured to be the {0}. Please remove the column or update output ...
  17. OutputStartMode and OutputStartTime are read only properties for the API version specified in the client request. Please ...
  18. PATCH of Inputs, Transformation, Functions, or Outputs is not allowed using the Streaming Job level API. Please use the API ...
  19. Power BI output is not yet supported in the Azure Preview portal. Please navigate to the Azure Management portal to use this ...
  20. Preview Azure Machine Learning Function and Scalar Function with Azure Machine Learning Binding are not supported in the ...
  21. Query testing is not yet supported in the Azure Preview portal. Please navigate to the Azure Management portal to use this ...
  22. Queues are messaging entities, similar to Event hubs and topics. They're designed to collect event streams from a number ...
  23. Sampling data is not yet supported in the Azure Preview portal. Please navigate to the Azure Management portal to use this ...
  24. Second parameter of function '{0}' is invalid. Please use valid column name identifiers - use '.' and '[', ']' to delimit ...
  25. Sometimes events don't make it from the client to the event hub as quickly as they should. If you still want to include these ...
  26. Source '{0}' can only be used in temporal predicate using 'datediff' function. Example: SELECT input1.a, input2.b FROM input1 ...
  27. Streaming Analytics job '{0}' is only able to use up to {1} Streaming Units based on the provided query. Please adjust the ...
  28. Streaming Job Start POST request failed due to an OutputStartMode that is unknown for the API version specified in the client request. ...
  29. Streaming units are used as a pool of computation resources available to process the query. To learn more, click the "?" ...
  30. Temporal tables require the lower boundary column name in SYSTEM_TIME PERIOD definition to match the GENERATED ALWAYS AS ...
  31. Temporal tables require the upper boundary column name in SYSTEM_TIME PERIOD definition to match the GENERATED ALWAYS AS ...
  32. The 'targetResourceGroup' property did not match the expected format of "/subscriptions/{targetSubscriptionId}/resourceG ...
  33. The Azure Data Lake Store '{0}' was not found. Please make sure the specified Azure Data Lake Store exists and the user has ...
  34. The Azure Machine Learning data type '{0}' does not map to any Azure Stream Analytics data type. Please check that this is ...
  35. The collection name pattern for the collections to be used. The collection name format can be constructed using the optional ...
  36. The Event Hub is configured to only allow one event receiver at a given time and an existing receiver is already connected ...
  37. The file path used to locate your blobs within the specified container. Within the path, you may choose to specify one or ...
  38. The file path used to locate your blobs within the specified container. Within the path, you may choose to specify one or ...
  39. The file path used to locate your blobs within the specified container. Within the path, you may choose to specify one or ...
  40. The file path used to locate your files within the specified account. Within the path, you may choose to specify one or more ...
  41. The input alias '{0}' specified by the query, has not been defined. Please validate that the alias used in the query matches ...
  42. The job is currently already in the process of updating. Please wait until the current update is finished before issuing ...
  43. The join predicate is not time bounded. JOIN operation between data streams requires specifying max time distances between ...
  44. The name can contain only alphanumeric characters. The name cannot begin with a numeric character. The name is case-insensitive. ...
  45. The name of the output column containing the partition key. The partition key is a unique identifier for the partition within ...
  46. The name of the output column containing the row key. The row key is a unique identifier for an entity within a given partition. ...
  47. The ORDER BY clause is not valid in views, inline functions, derived tables, sub-queries, and common table expressions, unless ...
  48. The output column type is not supported by Stream Analytics in Service Bus Message properties. Column name: '{0}'. Column ...
  49. The script contains syntax that is not supported by the version of SQL Server that is associated with the database project. ...
  50. The selected resource and the Stream Analytics job are located in different regions. You will be billed to move data between ...
  51. The specified locale ID is not valid. Verify that the locale ID is correct and that a corresponding language resource has ...
  52. The specified maximum size limit is greater than the maximum value allowed. The maximum size limit must be less than 16777215 ...
  53. The step {0} cannot have more than one reference to it. To fix, please create separate steps with the same sub-query and different ...
  54. The step {0} cannot have both of its inputs coming from the same sub-query. To fix, please create separate steps with the ...
  55. The Stream Analytics Job contains an incomplete function definition. Please make sure you have defined the inputs and outputs ...
  56. The target subscription/resource group combination specified must be different from the source subscription/resource ...
  57. The underlying definition of the Azure Machine Learning function is in a format that is no longer supported. To learn how ...
  58. The value must be between 6 and 50 characters long. Name can contain only letters, numbers, and hyphens. Name must start ...
  59. There was a problem creating the data model in Power BI. The following information may be helpful in diagnosing the issue. Power ...
  60. There was a problem formatting the document id column as per DocumentDB constraints for DocumentDB db:[{0}], and collection:[{1}]. ...
  61. There was a problem writing to DocumentDB db:[{0}], and collection:[{1}] due to throttling by DocumentDB. Please upgrade collection ...
  62. There was a problem writing to {0} due to throttling by DocumentDB. Please upgrade collection performance tier and tune the ...
  63. There was an error while reading sample input. Please check if the input source is configured correctly and data is in correct ...
  64. To make sure your queries work the way you expect, Stream Analytics needs to know which serialization format (JSON, CSV, ...
  65. Topics are messaging entities, similar to Event hubs and queues. They're designed to collect event streams from a number ...
  66. Unable to access blob storage. Please ensure that account name and key are configured properly. Blob storage error code: ...
  67. Unable to connect to Data Lake Store account. Make sure you are authorized to access the Data Lake Store account and firewall ...
  68. Unable to connect to input source at the moment. Please check if the input source is available and if it has not hit connection ...
  69. Unexpected error occurred while retrieving swagger endpoint definition. Please check if the url provided is valid and that ...
  70. Use 'Messaging' endpoint for messages from devices to the cloud. Use 'Operations Monitoring' endpoint for device telemetry ...
  71. Use Blob storage for ingesting large amounts of unstructured data. Stream Analytics jobs over Blobs will not be temporal ...
  72. Use DocumentDB for storing schema-free JSON data. DocumentDB is optimized for predictable throughput, low latency, and flexible ...
  73. Use Power BI to visualize job output in a real-time dashboard using Power BI. Power BI is only available for organizational ...
  74. Use Service bus Queues for sending data sequentially to one or more competing consumers. Messages are typically received ...
  75. Use Service bus Topics for sending messages to many consumers. Messages are made available to each subscription registered ...
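Items 26 and 43 both concern temporal predicates: a JOIN between data streams must bound how far apart in time matching events may be, which the query language expresses with the DATEDIFF function. A minimal sketch of such a query is below; the input names `input1`/`input2`, the join key `id`, and the 60-second bound are all illustrative assumptions, not taken from the messages above.

```sql
-- Hypothetical streaming join: input names, key, and the
-- 60-second window are assumptions for illustration only.
SELECT input1.a, input2.b
FROM input1
JOIN input2
  ON input1.id = input2.id
  AND DATEDIFF(second, input1, input2) BETWEEN 0 AND 60
```

Without the DATEDIFF condition the engine would have to buffer both streams indefinitely, which is why message 43 rejects joins whose predicate is not time bounded.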
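Items 37–40 describe the path used to locate blobs or files within a container or account. A sketch of what such a path pattern might look like is below; the folder names and the `{date}`/`{time}` tokens are assumptions based on common Stream Analytics path-pattern conventions, since the messages above are truncated before naming them.

```
cluster1/logs/{date}/{time}
```

In this sketch the literal prefix narrows the scan to one folder tree, while each token would be substituted per the configured date and time formats.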