SQL Server 2008 R2

  1. The CustomReportItem '{3}' has multiple instances at runtime. All RenderItem instances of a particular CustomReportItem must ...
  2. The CustomReportItem '{3}' has multiple instances at runtime. The {2} extension created an invalid RenderItem instance for ...
  3. The CustomReportItem '{3}' has multiple instances at runtime. The {2} extension created an invalid RenderItem instance for ...
  4. The DAC instance name, version, and description are displayed in SQL Server Management Studio after the DAC has been registered. ...
  5. The data could not be uploaded because the data types for column, "{0}", are not consistent. SAP data type: {1}; Integration ...
  6. The data could not be uploaded because the decimals for column, "{0}", is not consistent. SAP decimals: {1}; Integration ...
  7. The data could not be uploaded because the InfoObject for column, "{0}", is not consistent. SAP InfoObject: {1}; Integration ...
  8. The data could not be uploaded because the length for column, "{0}", is not consistent. SAP length: {1}; Integration Services ...
  9. The data could not be uploaded because the number of columns in SAP and the number of columns in Integration Services are ...
  10. The data definition language (DDL) command cannot be executed at the Subscriber. DDL commands can only be executed at the ...
  11. The data extension used for the {0} '{1}' failed to detect the default collation properties for the connection. Details: {3} ...
  12. The data feed could not be imported because PowerPivot for Excel is busy. Try again after the current operation has been ...
  13. The data file "%1!s!" already exists at the specified location. Cannot overwrite the file as the Overwrite option is set ...
  14. The data file "{0}" already exists at the specified location. Cannot overwrite the file because the overwrite option is set ...
  15. The Data Flow changed the size of a buffer away from the default values. The event text includes the sizes and the reasons. ...
  16. The data flow component provided a duplicate friendly name '{0}' for property '{1}'. Contact the component vendor for more ...
  17. The data flow component provided an empty name when retrieving a friendly name for value '{0}' of property '{2}'. Contact ...
  18. The data flow component provided duplicate value '{0}' when retrieving friendly names for property '{1}'. Contact the component ...
  19. The Data Flow engine scheduler cannot allocate enough memory for the execution structures. The system was low on memory before ...
  20. The Data Flow engine scheduler cannot retrieve object with ID %1!d! from the layout. The Data Flow engine scheduler previously ...
  21. The Data Flow engine scheduler failed to create a thread object because not enough memory is available. This is caused by ...
  22. The Data Flow engine scheduler failed to reduce the execution plan for the pipeline. Set the OptimizedMode property to false. ...
  23. The Data Flow engine scheduler failed to retrieve the execution tree with index %1!d! from the layout. The scheduler received ...
  24. The Data Flow task encapsulates the data flow engine that moves data between sources and destinations providing the facility ...
  25. The Data Flow task failed to create a buffer to call PrimeOutput for output "%3!s!" (%4!d!) on component "%1!s!" (%2!d!). ...
  26. The Data Flow task failed to create a required thread and cannot begin running. This usually occurs when there is an out-of-memory ...
  27. The Data Flow task failed to initialize a required thread and cannot begin execution. The thread previously reported a specific ...
  28. The data in row {0} was not committed. Error Source: {1}. Error Message: {2} Correct the errors and retry or press ESC to ...
  29. The data in the column '{0}' does not have the required content type. In a mining structure that has a Key Time column, any ...
  30. The data in the data file does not conform to the UNIQUE hint specified for the BULK rowset '%1!s!'. The data in the data ...
  31. The Data Mining Extensions (DMX) OPENROWSET statement supports ad hoc queries using external providers. Enable ad hoc data ...
  32. The Data Mining Extensions (DMX) OPENROWSET statement supports ad hoc queries using external providers. Enable ad hoc data ...
  33. The data mining relationship cannot be defined because the designer failed to identify the source dimension for mining model. ...
  34. The data mining relationship cannot be defined because the {0} case dimension is absent in this cube and needs to be added. ...
  35. The Data Mining Training transformation requires at least one input. Connect an output from a source or another transformation ...
  36. The data processing extension used for this data source is not available. It has either been uninstalled, or it is not configured ...
  37. The data processing extension used for this report is not available. It has either been uninstalled, or it is not configured ...
  38. The data refresh job failed to update the PowerPivot workbook because another user modified the file while data refresh was ...
  39. The data source '{0}' uses a custom data processing extension which does not implement IDbConnectionExtension. Therefore, ...
  40. The data source '{0}' uses a managed data provider which does not implement IDbConnectionExtension. Only Windows Integrated ...
  41. The data source is configured to prompt for credentials but no prompt text has been entered. Enter prompt text for the data ...
  42. The data source might not be accessible from the location where the report was saved. The report was saved to "{0}". ...
  43. The data source view does not contain a definition for the '%{table/}' table or view. The Source property may not have been ...
  44. The data source view for the cube does not contain some tables, named calculations, primary keys, or relationships used by ...
  45. The data source '{1}' has both or neither of the following: DataSourceReference and ConnectionProperties. DataSource must ...
  46. The data type "%1!s!" cannot be compared. Comparison of that data type is not supported, so it cannot be sorted or used as ...
  47. The data type "%1!s!" cannot be used with binary operator "%2!s!". The type of one or both of the operands is not supported ...
  48. The data type "%1!s!" cannot be used with unary operator "%2!s!". This operand type is not supported for the operation. To ...
  49. The data type "%1!s!" found on column "%2!s!" is not supported for the %3!s!. This column will be converted to DT_NTEXT. ...
  50. The data type "%1!s!" is invalid for transaction names or savepoint names. Allowed data types are char, varchar, nchar, varchar(max), ...
  51. The data type %1!s! does not exist. Verify the supported data types and mappings by querying msdb.dbo.sysdatatypemappings. ...
  52. The data type %1!s! is invalid for the %2!s! function. Allowed types are: char/varchar, nchar/nvarchar, and binary/varbinary. ...
  53. The data type conversion file, {0}, cannot be located. Therefore, the wizard cannot provide information about the conversion ...
  54. The data type for "%1!s!" is DT_IMAGE, which is not supported. Use DT_TEXT or DT_NTEXT instead and convert the data from, ...
  55. The data type for "%1!s!" is DT_NTEXT, which is not supported with ANSI files. Use DT_TEXT instead and convert the data to ...
  56. The data type for "%1!s!" is DT_TEXT, which is not supported with Unicode files. Use DT_NTEXT instead and convert the data ...
  57. The data type is about to be changed for the column. This will affect how the PowerPivot data is stored and might impact ...
  58. The data type mapping for %1!s! does not exist. Verify the list of available mappings by querying msdb.dbo.sysdatatypemappings. ...
  59. The data type of a foreign key binding (ordinal=%{iOrdinal/}) for the '%{tablecolumn/}' nested table does not match the data ...
  60. The data type of the %{structure/}.%{dmscolumn/} mining structure column must be numeric since it has a continuous content ...
  61. The data type of the '%{measure/}' measure must be the same as its source data type. This is because the aggregate function ...
  62. The data type of the parameter "{0}" is {1}. However, one or more of the values provided for the parameter cannot be converted ...
  63. The data type of the parameter "{0}" is {1}. However, the value provided for the parameter cannot be converted to this type. ...
  64. The data type of the {0} named calculation will be changed, and the following associated relationships will be removed:'{1}' ...
  65. The data type specified for the '%{column/}' column of the '%{structure/}' OLAP mining structure is not compatible with its ...
  66. The data types "%1!s!" and "%2!s!" are incompatible for binary operator "%3!s!". The operand types could not be implicitly ...
  67. The data types "%1!s!" and "%2!s!" are incompatible for the conditional operator. The operand types cannot be implicitly ...
  68. The data types of the operands of the conditional operator were incompatible. The operand types could not be implicitly cast ...
  69. The data types varchar(max), nvarchar(max), varbinary(max), and XML cannot be used in the compute clause by client driver ...
  70. The data viewer displays data flowing through a path during execution and can be configured to display data in numerous formats. ...
  71. The data-tier application cannot be upgraded because the database associated with the data-tier application is not available. ...
  72. The data-tier application was deleted from the instance. Refresh the Object Explorer pane before selecting a data-tier application. ...
  73. The database "%1!s!" does not exist. RESTORE can only create a database when restoring either a full backup or a file backup ...
  74. The database "%1!s!" is already configured for database mirroring on the remote server. Drop database mirroring on the remote ...
  75. The database "%1!s!" is in warm-standby state (set by executing RESTORE WITH STANDBY) and cannot be backed up until the entire ...
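Several of the messages above (for example, 51 and 58) direct you to query msdb.dbo.sysdatatypemappings to review the data type mappings available for heterogeneous replication. A minimal sketch of that lookup, run against the instance's msdb database:

```sql
-- Review the data type mappings SQL Server knows about for
-- heterogeneous (non-SQL Server) replication publishers and subscribers.
-- Run in the context of any database; the table lives in msdb.
SELECT *
FROM msdb.dbo.sysdatatypemappings;
```

The result set describes how each source DBMS type maps to a SQL Server type, which is the information messages 51 and 58 ask you to verify when a mapping is missing or invalid.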