Azure Stream Analytics
- A Data Lake Store is a hyper-scale repository for big data analytics workloads. Use this when you want to capture data of ...
- A nested INSERT, UPDATE, DELETE, or MERGE statement is not allowed in a SELECT statement that is not the immediate source ...
- A nested INSERT, UPDATE, DELETE, or MERGE statement is not allowed inside another nested INSERT, UPDATE, DELETE, or MERGE ...
- A potentially transient error occurred while connecting to the Azure Active Directory service. Please try again in a few ...
- A Service bus namespace is a container for a set of messaging entities. When you create a new Topic, you also create a Service ...
- A Service bus namespace is a container for a set of messaging entities. When you created a new Event hub, you also created ...
- A Service bus namespace is a container for a set of messaging entities. When you created a new Queue, you also created a ...
- Access to '{0}' was denied. Please make sure the user has at least read and write permissions to the specified Azure Data ...
- An account name is required to add a new Data Lake Store output. Please make sure your subscription contains at least 1 Data ...
- Azure Stream Analytics does not currently support Azure Machine Learning web services with no inputs. Please use an Azure ...
- Azure Stream Analytics is a fully managed, cost effective real-time event processing engine that helps to unlock deep insights ...
- Blob {1} may have been skipped. Blob {1} has a lastModifiedTime: {0} and is not newer than last seen blob {3} with lastModified ...
- Cannot split the join predicate into two key selectors. Please make sure reference data JOIN statement includes equality ...
- Choose when the job will start creating output. Keep in mind that the job might need to read input data ahead of time to ...
- Choose your internationalization preference. This setting tells the Stream Analytics job how to parse, compare, and sort ...
- Column '{0}' cannot be used in SELECT clause when GROUP BY clause is present. Only aggregating expressions and the expressions ...
- Column '{0}' is invalid in the HAVING clause because it is not contained in either an aggregate function or the GROUP BY ...
- Column '{0}' is invalid in the select list because it is not contained in either an aggregate function or the GROUP BY clause. ...
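The three GROUP BY errors above all enforce the same rule: every column in the SELECT (or HAVING) must either be aggregated or appear in the GROUP BY clause. A minimal sketch of a conforming Stream Analytics query, using hypothetical input, output, and field names (`input1`, `output1`, `DeviceId`, `Temperature`, `EventTime`):

```sql
-- DeviceId appears in GROUP BY; Temperature is only used inside an aggregate.
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature,
    COUNT(*) AS EventCount
INTO output1
FROM input1 TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(minute, 5)
```

Referencing `Temperature` bare in the SELECT list of this query would raise the error above.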
- Complex table names are not supported: '{0}'. Please do not use '.' inside query step names and inputs or escape the name ...
- Could not deserialize the input event as {0}. Some possible reasons: 1) Malformed events 2) Input source configured with ...
- Could not successfully send an empty batch to the Azure Function. Please make sure your function app name, function name, ...
- {date}/{time}/{partition} placeholder is found in the PathPattern but no DateFormat/TimeFormat/PartitionCount is provided. ...
- Each job requires at least one data stream input. Adding reference data input is optional. A data stream is a continuous ...
- Encountered error trying to perform operation: {0}. The following information might be useful in diagnosing the issue. Error ...
- Event hubs are messaging entities, similar to queues and topics. They're designed to collect event streams from a number ...
- Event hubs limit the number of readers within one consumer group (to 5). We recommend using a separate group for each job. ...
- Events sometimes arrive out of order after they've made the trip from the input source to your Stream Analytics job. You ...
- First parameter '{0}' of function 'GetMetadataPropertyValue' is invalid. Only original input stream names can be used. Common ...
- For DROP STATISTICS, you must provide both the object (table or view) name and the statistics name, in the form "objectName.statisticsName". ...
- Function 'CreateLineString' is not applicable to type '{0}' in expression '{1}'. GeoJSON LineString value is expected with ...
- Function 'CreatePoint' is not applicable to type '{0}' in expression '{1}'. Float value is expected. Example "CreatePoint(0.0,0.0)". ...
- Function 'CreatePoint' is not applicable to type '{0}' in expression '{1}'. Numeric value is expected. Example "CreatePoint(0.0, ...
- Function 'CreatePolygon' is not applicable to type '{0}' in expression '{1}'. GeoJSON Polygon value is expected with at least ...
- Function '{0}' has invalid script definition. The body of the script must be a function. Example: function() { }. Found '{1}' ...
- Function '{0}' is either not supported or not usable in this context. Note that aggregate functions must be used with a 'group ...
- Function '{0}' is either not supported or not usable in this context. Note that aggregate functions must be used with a GROUP ...
- Function '{0}' is either not supported or not usable in this context. User defined function calls must start with "udf." ...
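Per the message above, user-defined function calls must be prefixed with `udf.`. A hedged sketch, assuming a JavaScript UDF named `toFahrenheit` has already been added to the job (the function name and fields are hypothetical):

```sql
-- The UDF's script body must itself be a function, e.g.
--   function (celsius) { return celsius * 9 / 5 + 32; }
-- and is invoked from the query with the mandatory 'udf.' prefix.
SELECT
    DeviceId,
    udf.toFahrenheit(Temperature) AS TempF
INTO output1
FROM input1
```

Calling `toFahrenheit(Temperature)` without the `udf.` prefix triggers this error.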
- Function '{0}' is not applicable to type '{1}' in expression '{2}'. Record value in GeoJSON format is expected. Example "{'type':'Point', ...
- Function '{0}' returns a value of unsupported type: '{1}'. Only 'Number', 'Date', 'String', 'Object' and 'Array' types are ...
- GeoJSON point must contain field "type" with value "Point" and field "coordinates" with 2 float type elements. Values cannot ...
- GeoJSON polygon must contain field "type" with value "Polygon" and field "coordinates" with a list of no less than 4 point ...
- GeoJSON LineString must contain field "type" with value "LineString" and field "coordinates" with a list of no less than ...
- GeoJSON object type not supported: '{0}'. Currently supported types are 'Point' and 'Polygon'. Example: "{'type':'Point', ...
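The GeoJSON messages above describe the shapes that the geospatial functions accept: a point is two float coordinates, and a polygon is a ring of at least four points whose last point closes back on the first. A sketch combining them, with hypothetical input and field names:

```sql
-- CreatePoint takes float longitude/latitude; CreatePolygon takes a closed
-- ring of points (the last point must repeat the first).
SELECT
    DeviceId,
    ST_WITHIN(
        CreatePoint(Longitude, Latitude),
        CreatePolygon(
            CreatePoint(0.0, 0.0),
            CreatePoint(0.0, 10.0),
            CreatePoint(10.0, 10.0),
            CreatePoint(0.0, 0.0)   -- closes the ring
        )
    ) AS IsInsideGeofence
INTO output1
FROM input1
```

Passing a non-numeric value to `CreatePoint`, or a ring of fewer than four points to `CreatePolygon`, produces the errors above.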
- If 'TIMESTAMP BY OVER' and 'PARTITION BY' clauses are used in the query, all query steps and input sources must use the same ...
- If an event fails to be written to the output, you have a couple of options. 'Drop' which drops the event that caused the ...
- If any of the input sources use TIMESTAMP BY OVER clause, all other input sources must have TIMESTAMP BY OVER clause specifying ...
- If events arrive that are outside of the times you chose above, you have a couple of options. Drop deletes the events, and ...
- If join sources are partitioned, join predicate must include condition matching partition keys of both sources. Please add ...
- If join sources use multiple timelines (per key specified in TIMESTAMP BY OVER clause), join predicate must include condition ...
- If TIMESTAMP BY OVER clause is used with partitioned streams, 'PartitionId' must be used as partition key. Please use "PARTITION ...
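The TIMESTAMP BY OVER messages above require that every input use the same OVER key and, on partitioned streams, that `PartitionId` be the partition key. A sketch of a query satisfying both constraints (input, output, and field names are hypothetical):

```sql
-- Each device gets its own timeline via OVER DeviceId; the partitioned
-- stream is keyed on PartitionId as required.
SELECT
    DeviceId,
    COUNT(*) AS EventCount
INTO output1
FROM input1
TIMESTAMP BY EventTime OVER DeviceId
PARTITION BY PartitionId
GROUP BY DeviceId, TumblingWindow(minute, 1)
```

Using any key other than `PartitionId` in the PARTITION BY clause, or a different OVER field on a second input, raises the errors above.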
- In an ALTER TABLE REBUILD or ALTER INDEX REBUILD statement, when a partition is specified in a DATA_COMPRESSION clause, PARTITION=ALL ...
- Input source with name '{0}' is defined both in the configuration and the 'CREATE TABLE' statement. Please use a single definition. ...
- Input source with name '{0}' is used both with and without TIMESTAMP BY clause. Please either remove TIMESTAMP BY clause ...
- Input source with name '{0}' uses TIMESTAMP BY clause with different field names. Please change TIMESTAMP BY clause to specify ...
- Input source with name '{0}' uses TIMESTAMP BY OVER clause with different OVER field names. Please change TIMESTAMP BY OVER ...
- Invalid or no matching collections found with collection pattern '{0}'. Collections must exist with case-sensitive pattern ...
- Invalid token is found in the FilePathPrefix property of the output DataLake source. Only {date} and/or {time} are valid. ...
- Invalid token is found in the PathPattern property of the input Blob source. Only {date}, {time} and/or {partition} are valid. ...
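The two token messages above list the only substitutions each property accepts. A hedged configuration sketch (container and prefix names are hypothetical):

```
Blob input PathPattern:        logs/{date}/{time}/{partition}
  DateFormat:                  YYYY/MM/DD
  TimeFormat:                  HH
  PartitionCount:              4

Data Lake output FilePathPrefix:  streaming/{date}/{time}
```

Any other `{...}` token in these properties, or a `{date}`/`{time}`/`{partition}` token without its matching DateFormat/TimeFormat/PartitionCount setting, triggers the errors above.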
- IoT hub endpoints are messaging entities, similar to Event hubs, queues, and topics. They're designed to collect event streams ...
- IoT hubs limit the number of readers within one consumer group (to 5). We recommend using a separate group for each job. ...
- Join sources must use the same timeline. Please check keys specified in TIMESTAMP BY OVER clause. Stream {0} keys: {1}. Stream ...
- 'Lag' is not supported on operands of type {0} and {1}. The types of value (first argument) and default value (third argument) ...
- LastOutputEventTime must be available when OutputStartMode is set to LastOutputEventTime. Please make sure at least one output ...
- Line separated specifies that the output will be formatted by having each JSON object separated by a new line. Array specifies ...
- Maximum Event Hub receivers exceeded. Only 5 receivers per partition are allowed. Please use a dedicated consumer group for ...
- Microsoft and Windows are either registered trademarks or trademarks of Microsoft Corporation in the U.S. and/or other countries. ...
- Must pass parameter number {0} and subsequent parameters as '@name = value'. After the form '@name = value' has been used, ...
- Name can contain only letters, numbers, periods, hyphens and underscores. Name must start and end with a letter or number. ...