Chapter 7. Importing and Transforming Data


Transact-SQL-based systems often interact with much older legacy systems. Data imported from such sources is often inefficiently structured and poorly organized. After importing such data, you usually want to transform it into a structure that can be handled efficiently and that will easily support a wide variety of queries.

This chapter introduces concepts and recipes useful for working with imported data tables and their transformations. We show data normalization and other techniques for transforming badly structured data into a form more acceptable for SQL processing. Denormalized data is usually found when importing from nonrelational systems or from files designed for human use. In general, the more human-readable the data is, the less efficiently it can be manipulated using SQL statements.

Data rows can be inserted into SQL tables by using either the INSERT statement or an embedded import mechanism, such as the BULK INSERT statement. Once the rows are in SQL Server, they can be manipulated using standard SQL commands.
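As a sketch of the embedded import mechanism just mentioned, the following loads a comma-delimited legacy extract into a staging table in one pass. The file path, table name, and column layout are assumptions for illustration, not part of the recipes that follow:

```sql
-- Hypothetical staging table for a comma-delimited legacy extract.
-- Columns are typed as VARCHAR so that badly formed source values
-- load cleanly and can be validated afterward.
CREATE TABLE LegacySales (
    SaleDate  VARCHAR(20),
    Product   VARCHAR(50),
    Amount    VARCHAR(20)
)

-- FIELDTERMINATOR and ROWTERMINATOR describe the file's format;
-- FIRSTROW = 2 skips a header line in the source file.
BULK INSERT LegacySales
FROM 'C:\imports\sales.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)
```

Once the rows are staged, ordinary INSERT...SELECT statements can validate, convert, and move them into their permanent tables.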

In this chapter, we try to find a balance between the readability and efficiency of data tables. We discuss general techniques of folding and pivoting that can be used to transform human-readable data into SQL-friendly tables and back again. We show some techniques that you can use to ensure smooth linking between legacy data systems and SQL Server-based systems. We suggest some steps that you can follow to ensure that external data is imported safely, and we offer ways to provide efficient error handling for failed imports.
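As a small taste of the folding technique, consider a human-readable table that carries one column per quarter. The table and column names here are invented for illustration; a UNION ALL query folds the four category columns into one row per (product, quarter) pair, a shape far easier to aggregate and filter with ordinary SQL:

```sql
-- A human-readable, denormalized layout: one column per quarter.
CREATE TABLE QuarterlySales (
    Product VARCHAR(50),
    Q1 MONEY, Q2 MONEY, Q3 MONEY, Q4 MONEY
)

-- Fold the columns into rows; each SELECT contributes one quarter.
SELECT Product, 'Q1' AS Quarter, Q1 AS Amount FROM QuarterlySales
UNION ALL
SELECT Product, 'Q2', Q2 FROM QuarterlySales
UNION ALL
SELECT Product, 'Q3', Q3 FROM QuarterlySales
UNION ALL
SELECT Product, 'Q4', Q4 FROM QuarterlySales
```

The reverse operation, pivoting rows back into columns for human-readable reports, is covered in the recipes in this chapter.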



Transact-SQL Cookbook
ISBN: 1565927567
Year: 2005
Pages: 152
