Column length conversion issues
The import data wizard of SQL Server Management Studio is a useful import tool for SQL Server databases. But when you run into errors while importing data, you quickly experience the limitations of this functionality. Data conversion errors are among the most common and time-consuming frustrations of data processing, and among the hardest to find a solution for. In our day-to-day work, we lost a lot of time solving these kinds of errors. That is why we developed the SQL DataTool.
In this article we will show how the SQL DataTool can help you with data load errors related to string conversion issues caused by text lengths and data model problems. When you use the import data wizard, data conversion errors caused by column length are hard to solve. In the first part of this article, we will give an example. In the second part, we will show you how the SQL DataTool can help you with this data conversion error.
Jump directly to the solution for importing files with length data conversion issues.
Explanation of the length data conversion error
Using the SQL Server Management Studio import data wizard is easy. It is quite straightforward, and that is exactly its strength. Its weakness is that complex data imports can cause a lot of trouble; the same is true for large files and undefined data sources. We will first give an example of what can trigger the length data conversion error.
The file that we use to explain the length data conversion error is shown below. This file has one header row that contains the column header, and four rows with data. The file itself is called 'zztest_TOO_LONG.txt' and is a plain text file.

Next we will prepare a table in SQL Server. This table contains only one column, NVARCHARTEST, with a length of 10. Of the data rows (rows 2 to 5), all but the last can be imported without problems: the string "TEST_TOO_LONG" is longer than the 10 characters that we defined for the column.
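As an illustration (not part of the original workflow), the mismatch can be reproduced with a short Python check. The first three row values are placeholders we made up; only "TEST_TOO_LONG" comes from the example file.

```python
# The column is defined as NVARCHAR(10), so any value longer than
# 10 characters will fail the import.
MAX_LENGTH = 10

# Placeholder rows; only the last value is taken from the example file.
rows = ["ROW_1", "ROW_2", "ROW_3", "TEST_TOO_LONG"]

# Collect every value that exceeds the defined column length.
too_long = [value for value in rows if len(value) > MAX_LENGTH]
print(too_long)  # ['TEST_TOO_LONG'] -> 13 characters, exceeds 10
```

Any value that ends up in `too_long` is exactly the kind of record that makes the import data wizard fail.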
When we use the SQL Server Management Studio import data wizard and choose to preview the data to be imported, we see that the header row and data rows are shown correctly.

Next, we need to map the data to be imported to the table that we created previously. We do this using the column mapping functionality in the import data wizard. We select the table into which we want to import the data: the destination table is 'TEST_TOO_LONG' and the column is 'NVARCHARTEST'. As you can see, the expected size (or length) of the contents of the column is 10.

Finishing the import using SQL Server Management Studio will result in an empty table and the following error:
- Executing (Error)
Messages
Information 0x402090de: Data Flow Task 1: The total number of data rows processed for file "C:\Testfiles\zztest_TOO_LONG.txt" is 5.
(SQL Server Import and Export Wizard)
Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "NVARCHARTEST" (10) to column "NVARCHARTEST" (40). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[NVARCHARTEST]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[NVARCHARTEST]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion 0 - 0" (32) failed with error code 0xC0209029 while processing input "Data Conversion Input" (33). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
This error is very common when importing larger datasets. Typical root causes are an unfamiliar source system, or files for which the data types and/or lengths of the columns are unknown. The dataset that we used to explain this data conversion error is very limited, so here the error is quite easy to solve. But most companies work with far more data than the four rows in this example.
Solution for the length data conversion error
Understanding the above error thrown by the SQL Server Management Studio import wizard is not easy. It does not give enough information to solve the error in an easy way. Even worse, when you have fixed the first erroneous record, the next one might pop up in the next import session. We developed the SQL DataTool to help you with these kinds of issues.
The first thing we have to do is define the expected data model. In this case, that is very easy, since the plain text file that we use contains only one column. The type of this column is NVARCHAR with a length of 10. We define this data model on the 'Data model' screen of the SQL DataTool. The manual describes in detail how to define a data model.

Next, we will import the data. When we open the Data model screen, we can enter the details for the import. The image below shows the details for this example.

When the data is imported, the SQL DataTool identifies the errors automatically. A pop-up shows that 3 records have been imported successfully and that one record gives an error. We can find this error in the file with the postfix '_error', in the same folder as the imported plain text file. The error shown is the one below.

The SQL DataTool allows you to import a file into the SQL Server database without the error thrown by the SQL Server Management Studio import wizard. All records are staged into the database, except for the records that throw an error. These erroneous records are written to a new file with the '_error' postfix. This allows you to easily import large files. You can even import the records with errors after you have reviewed them. To solve the issues mentioned in this article, you have two options: either you change the data model to fit the data, or you change the data to fit the defined data model.
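The separation described above can be sketched in a few lines of Python. This is a minimal sketch of the behaviour, not the tool's actual implementation; the data model dictionary, function name, and row values are our own assumptions, with only the column name, length, and the value "TEST_TOO_LONG" taken from the example.

```python
# Assumed one-column data model from the example: NVARCHARTEST, length 10.
data_model = {"NVARCHARTEST": 10}

def split_rows(rows, max_length):
    """Split rows into (stageable, erroneous) based on column length."""
    ok, errors = [], []
    for value in rows:
        # Rows that fit the column are staged; the rest go to the '_error' file.
        (ok if len(value) <= max_length else errors).append(value)
    return ok, errors

# Placeholder row values; only "TEST_TOO_LONG" comes from the example file.
rows = ["ROW_1", "ROW_2", "ROW_3", "TEST_TOO_LONG"]
ok, errors = split_rows(rows, data_model["NVARCHARTEST"])
# ok holds the 3 stageable records; errors holds the record for the '_error' file
```

In this sketch, the `errors` list corresponds to the contents of the '_error' file, while `ok` corresponds to the records staged into the database.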
The best solution is to adjust the data model based on the data to be imported. Since the SQL DataTool separates the erroneous records, you can identify the issues and find the correct lengths of the text columns (NVARCHAR, VARCHAR, CHAR, NCHAR, etc.).
If it is not essential that the text values are staged into the database in full, you can instead adjust the erroneous records in the error file: truncate the texts that exceed the column lengths in your defined data model.
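The second option, trimming the data to fit the model, amounts to a simple truncation. A minimal sketch, assuming the NVARCHAR(10) column from the example (the function name is ours):

```python
# Assumed column length from the example data model: NVARCHAR(10).
MAX_LENGTH = 10

def truncate(value, limit=MAX_LENGTH):
    # Keep only the first `limit` characters; anything beyond is lost.
    return value[:limit]

print(truncate("TEST_TOO_LONG"))  # TEST_TOO_L
```

Note that truncation discards data beyond the limit, so this option is only appropriate when the tail of the text is not needed.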
Whichever option you choose, you will experience an easier and less time-consuming staging process when you use our SQL DataTool.