Azure Stream Analytics

  1. A Data Lake Store is a hyper-scale repository for big data analytics workloads. Use this when you want to capture data of ...
  2. A nested INSERT, UPDATE, DELETE, or MERGE statement is not allowed in a SELECT statement that is not the immediate source ...
  3. A nested INSERT, UPDATE, DELETE, or MERGE statement is not allowed inside another nested INSERT, UPDATE, DELETE, or MERGE ...
  4. A potentially transient error occurred while connecting to the Azure Active Directory service. Please try again in a few ...
  5. A Service bus namespace is a container for a set of messaging entities. When you created a new Topic, you also created a Service ...
  6. A Service bus namespace is a container for a set of messaging entities. When you created a new Event hub, you also created ...
  7. A Service bus namespace is a container for a set of messaging entities. When you created a new Queue, you also created a ...
  8. Access to '{0}' was denied. Please make sure the user has at least read and write permissions to the specified Azure Data ...
  9. An account name is required to add a new Data Lake Store output. Please make sure your subscription contains at least 1 Data ...
  10. Azure Stream Analytics does not currently support Azure Machine Learning web services with no inputs. Please use an Azure ...
  11. Azure Stream Analytics is a fully managed, cost effective real-time event processing engine that helps to unlock deep insights ...
  12. Blob {1} may have been skipped. Blob {1} has a lastModifiedTime: {0} and is not newer than last seen blob {3} with lastModified ...
  13. Cannot split the join predicate into two key selectors. Please make sure reference data JOIN statement includes equality ... (see the JOIN sketch after this list)
  14. Choose when the job will start creating output. Keep in mind that the job might need to read input data ahead of time to ...
  15. Choose your internationalization preference. This setting tells the Stream Analytics job how to parse, compare, and sort ...
  16. Column '{0}' cannot be used in SELECT clause when GROUP BY clause is present. Only aggregating expressions and the expressions ... (see the GROUP BY sketch after this list)
  17. Column '{0}' is invalid in the HAVING clause because it is not contained in either an aggregate function or the GROUP BY ...
  18. Column '{0}' is invalid in the select list because it is not contained in either an aggregate function or the GROUP BY clause. ...
  19. Complex table names are not supported: '{0}'. Please do not use '.' inside query step names and inputs or escape the name ...
  20. Could not deserialize the input event as {0}. Some possible reasons: 1) Malformed events 2) Input source configured with ...
  21. Could not successfully send an empty batch to the Azure Function. Please make sure your function app name, function name, ...
  22. {date}/{time}/{partition} placeholder is found in the PathPattern but no DateFormat/TimeFormat/PartitionCount is provided. ... (see the path pattern example after this list)
  23. Each job requires at least one data stream input. Adding reference data input is optional. A data stream is a continuous ...
  24. Encountered error trying to perform operation: {0}. The following information might be useful in diagnosing the issue. Error ...
  25. Event hubs are messaging entities, similar to queues and topics. They're designed to collect event streams from a number ...
  26. Event hubs limit the number of readers within one consumer group (to 5). We recommend using a separate group for each job. ...
  27. Events sometimes arrive out of order after they've made the trip from the input source to your Stream Analytics job. You ...
  28. First parameter '{0}' of function 'GetMetadataPropertyValue' is invalid. Only original input stream names can be used. Common ... (see the metadata example after this list)
  29. For DROP STATISTICS, you must provide both the object (table or view) name and the statistics name, in the form "objectName.statisticsName". ...
  30. Function 'CreateLineString' is not applicable to type '{0}' in expression '{1}'. GeoJSON LineString value is expected with ... (see the geospatial sketch after this list)
  31. Function 'CreateLineString' is not applicable to type '{0}' in expression '{1}'. GeoJSON LineString value is expected with ...
  32. Function 'CreatePoint' is not applicable to type '{0}' in expression '{1}'. Float value is expected. Example "CreatePoint(0.0,0.0)". ...
  33. Function 'CreatePoint' is not applicable to type '{0}' in expression '{1}'. Numeric value is expected. Example "CreatePoint(0.0, ...
  34. Function 'CreatePolygon' is not applicable to type '{0}' in expression '{1}'. GeoJSON Polygon value is expected with at least ...
  35. Function 'CreatePolygon' is not applicable to type '{0}' in expression '{1}'. GeoJSON Polygon value is expected with at least ...
  36. Function '{0}' has invalid script definition. The body of the script must be a function. Example: function() { }. Found '{1}' ... (see the UDF example after this list)
  37. Function '{0}' has invalid script definition. The body of the script must be a function. Example: function() { }. Found '{1}' ...
  38. Function '{0}' is either not supported or not usable in this context. Note that aggregate functions must be used with a 'group ...
  39. Function '{0}' is either not supported or not usable in this context. Note that aggregate functions must be used with a GROUP ...
  40. Function '{0}' is either not supported or not usable in this context. User defined function calls must start with "udf." ...
  41. Function '{0}' is not applicable to type '{1}' in expression '{2}'. Record value in GeoJSON format is expected. Example "{'type':'Point', ...
  42. Function '{0}' is not applicable to type '{1}' in expression '{2}'. Record value in GeoJSON format is expected. Example "{'type':'Point', ...
  43. Function '{0}' is not applicable to type '{1}' in expression '{2}'. Record value in GeoJSON format is expected. Example "{'type':'Point', ...
  44. Function '{0}' is not applicable to type '{1}' in expression '{2}'. Record value in GeoJSON format is expected. Example "{'type':'Point', ...
  45. Function '{0}' returns a value of unsupported type: '{1}'. Only 'Number', 'Date', 'String', 'Object' and 'Array' types are ...
  46. GeoJSON point must contain field "type" with value "Point" and field "coordinates" with 2 float type elements. Values cannot ...
  47. GeoJSON polygon must contain field "type" with value "Polygon" and field "coordinates" with a list of no less than 4 point ...
  48. GeoJSON LineString must contain field "type" with value "LineString" and field "coordinates" with a list of no less than ...
  49. GeoJSON object type not supported: '{0}'. Currently supported types are 'Point' and 'Polygon'. Example: "{'type':'Point', ...
  50. GeoJSON object type not supported: '{0}'. Currently supported types are 'Point' and 'Polygon'. Example: "{'type':'Point', ...
  51. If 'TIMESTAMP BY OVER' and 'PARTITION BY' clauses are used in the query, all query steps and input sources must use the same ... (see the substream sketch after this list)
  52. If an event fails to be written to the output, you have a couple of options. 'Drop', which drops the event that caused the ...
  53. If any of the input sources use TIMESTAMP BY OVER clause, all other input sources must have TIMESTAMP BY OVER clause specifying ...
  54. If events arrive that are outside of the times you chose above, you have a couple of options. Drop deletes the events, and ...
  55. If join sources are partitioned, join predicate must include condition matching partition keys of both sources. Please add ...
  56. If join sources use multiple timelines (per key specified in TIMESTAMP BY OVER clause), join predicate must include condition ...
  57. If TIMESTAMP BY OVER clause is used with partitioned streams, 'PartitionId' must be used as partition key. Please use "PARTITION ...
  58. In an ALTER TABLE REBUILD or ALTER INDEX REBUILD statement, when a partition is specified in a DATA_COMPRESSION clause, PARTITION=ALL ...
  59. Input source with name '{0}' is defined both in the configuration and the 'CREATE TABLE' statement. Please use a single definition. ...
  60. Input source with name '{0}' is used both with and without TIMESTAMP BY clause. Please either remove TIMESTAMP BY clause ...
  61. Input source with name '{0}' uses TIMESTAMP BY clause with different field names. Please change TIMESTAMP BY clause to specify ...
  62. Input source with name '{0}' uses TIMESTAMP BY OVER clause with different OVER field names. Please change TIMESTAMP BY OVER ...
  63. Invalid or no matching collections found with collection pattern '{0}'. Collections must exist with case-sensitive pattern ...
  64. Invalid token is found in the FilePathPrefix property of the output DataLake source. Only {date} and/or {time} are valid. ...
  65. Invalid token is found in the PathPattern property of the input Blob source. Only {date}, {time} and/or {partition} are valid. ...
  66. IoT hub endpoints are messaging entities, similar to Event hubs, queues, and topics. They're designed to collect event streams ...
  67. IoT hubs limit the number of readers within one consumer group (to 5). We recommend using a separate group for each job. ...
  68. Join sources must use the same timeline. Please check keys specified in TIMESTAMP BY OVER clause. Stream {0} keys: {1}. Stream ...
  69. 'Lag' is not supported on operands of type {0} and {1}. The types of value (first argument) and default value (third argument) ...
  70. LastOutputEventTime must be available when OutputStartMode is set to LastOutputEventTime. Please make sure at least one output ...
  71. Line separated specifies that the output will be formatted by having each JSON object separated by a new line. Array specifies ...
  72. Maximum Event Hub receivers exceeded. Only 5 receivers per partition are allowed. Please use a dedicated consumer group for ...
  73. Microsoft and Windows are either registered trademarks or trademarks of Microsoft Corporation in the U.S. and/or other countries. ...
  74. Must pass parameter number {0} and subsequent parameters as '@name = value'. After the form '@name = value' has been used, ...
  75. Name can contain only letters, numbers, periods, hyphens and underscores. Name must start and end with a letter or number. ...
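
Examples

Message 13 requires an equality predicate when joining a stream to reference data. Below is a minimal sketch in Stream Analytics Query Language; the input names (StreamInput, DeviceRef) and columns are hypothetical:

    -- StreamInput is a data stream; DeviceRef is a reference data input.
    -- A reference data JOIN needs an equality condition in the ON clause
    -- (no DATEDIFF condition, unlike stream-to-stream joins).
    SELECT s.DeviceId, r.DeviceName, s.Temperature
    FROM StreamInput s
    JOIN DeviceRef r
        ON s.DeviceId = r.DeviceId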
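
Messages 16-18 state one rule: when GROUP BY is present, every column in the SELECT list or HAVING clause must either appear in the GROUP BY clause or sit inside an aggregate. A sketch assuming a hypothetical Input with DeviceId, Temperature, and EventTime fields:

    -- DeviceId is in GROUP BY; Temperature appears only inside an aggregate.
    -- Streaming aggregations also need a window, here a 5-minute tumbling window.
    SELECT DeviceId, AVG(Temperature) AS AvgTemperature
    FROM Input TIMESTAMP BY EventTime
    GROUP BY DeviceId, TumblingWindow(minute, 5)
    HAVING AVG(Temperature) > 70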
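
Messages 22, 64, and 65 concern path tokens. A Blob input PathPattern that uses all three supported tokens might look like the line below (a hypothetical layout); each token then requires the matching DateFormat, TimeFormat, or PartitionCount setting in the input configuration:

    logs/{date}/{time}/{partition}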
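
Message 28 notes that the first argument of GetMetadataPropertyValue must be an original input stream name, not a query step alias. A sketch assuming an input named Input and the EventId metadata property:

    -- 'Input' is the original input name; an alias defined in a WITH step
    -- would be rejected as the first argument.
    SELECT GetMetadataPropertyValue(Input, 'EventId') AS EventId
    FROM Input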
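
Messages 30-35 and 41-50 cover the geospatial constructors: CreatePoint takes two numeric coordinates, CreateLineString takes at least two points, and CreatePolygon takes a closed ring of at least four points (first and last identical). A sketch with hypothetical coordinates:

    -- CreatePoint(latitude, longitude) builds a GeoJSON Point record.
    SELECT
        CreatePoint(47.6, -122.3) AS SinglePoint,
        CreateLineString(CreatePoint(0.0, 0.0), CreatePoint(1.0, 1.0)) AS Path,
        -- The ring is closed by repeating the first point, hence >= 4 points.
        CreatePolygon(
            CreatePoint(0.0, 0.0), CreatePoint(0.0, 1.0),
            CreatePoint(1.0, 1.0), CreatePoint(0.0, 0.0)) AS Area
    FROM Input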
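
Messages 36-40 cover user defined functions: the JavaScript script body must itself be a function (the messages' own example is "function() { }"), and every call site must carry the "udf." prefix. A sketch assuming a hypothetical JavaScript UDF named toFahrenheit is defined on the job:

    -- User defined function calls must start with "udf.".
    SELECT DeviceId, udf.toFahrenheit(Temperature) AS TemperatureF
    FROM Input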
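
Messages 51-62 describe the substream rules: TIMESTAMP BY must be applied consistently across inputs, and when TIMESTAMP BY ... OVER is used on a partitioned stream, PartitionId must be the partition key. A sketch assuming a hypothetical Input with EventTime, DeviceId, and Reading fields:

    -- Each DeviceId advances on its own timeline; PartitionId is the partition key.
    SELECT DeviceId, AVG(Reading) AS AvgReading
    FROM Input
        TIMESTAMP BY EventTime OVER DeviceId
        PARTITION BY PartitionId
    GROUP BY DeviceId, PartitionId, TumblingWindow(minute, 1)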