Problem

You want to export tables or databases as SQL statements to make them easier to import later.

Solution

Use the mysqldump program without the --tab option.

Discussion

As discussed in Section 10.14, mysqldump causes the MySQL server to write tables as text datafiles on the server host when it's invoked with the --tab option. If you omit the --tab option, the server formats the table rows as INSERT statements and returns them to mysqldump, which writes the output on the client host. The output can also include the CREATE TABLE statement for each table. This provides a convenient form of output that you can capture in a file and use later to re-create a table or tables. It's common to use such dump files as backups or for copying tables to another MySQL server. This section discusses how to save dump output in a file; Section 10.16 shows how to send it directly to another server over the network.

To export a table in SQL format to a file, use a command like this:

% mysqldump cookbook states > states.txt

That creates an output file, states.txt, that contains both the CREATE TABLE statement and a set of INSERT statements:

-- MySQL dump 10.10
--
-- Host: localhost    Database: cookbook
-- ------------------------------------------------------
-- Server version       5.0.27-log

--
-- Table structure for table `states`
--

CREATE TABLE `states` (
  `name` varchar(30) NOT NULL,
  `abbrev` char(2) NOT NULL,
  `statehood` date default NULL,
  `pop` bigint(20) default NULL,
  PRIMARY KEY (`abbrev`)
);

--
-- Dumping data for table `states`
--

INSERT INTO `states` VALUES ('Alabama','AL','1819-12-14',4530182);
INSERT INTO `states` VALUES ('Alaska','AK','1959-01-03',655435);
INSERT INTO `states` VALUES ('Arizona','AZ','1912-02-14',5743834);
INSERT INTO `states` VALUES ('Arkansas','AR','1836-06-15',2752629);
INSERT INTO `states` VALUES ('California','CA','1850-09-09',35893799);
INSERT INTO `states` VALUES ('Colorado','CO','1876-08-01',4601403);
...

NOTE
The preceding mysqldump output actually was produced using the --skip-extended-insert option, which causes each row to be written as a separate INSERT statement. If you omit --skip-extended-insert (which is the usual case), mysqldump writes multiple-row INSERT statements. Those are more difficult for people to read, but more efficient for the MySQL server to process.

To dump multiple tables, name them all following the database name argument. To dump an entire database, don't name any tables after the database name. This command dumps all tables in the cookbook database:

% mysqldump cookbook > cookbook.txt

If you want to dump all tables in all databases, invoke mysqldump like this:

% mysqldump --all-databases > dump.txt

In that case, the output file also includes CREATE DATABASE and USE db_name statements at appropriate places so that when you read in the file later, each table is created in the proper database. Other options are available to control the output format.
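For comparison, the multiple-row form mentioned in the note above packs several rows into a single INSERT statement. The following is an illustrative sketch based on the states data shown earlier; the exact number of rows per statement and the line wrapping vary with the mysqldump version and its buffer settings:

```sql
-- Multiple-row ("extended") INSERT: one statement carries many rows,
-- which the server can process faster than one statement per row
INSERT INTO `states` VALUES
('Alabama','AL','1819-12-14',4530182),
('Alaska','AK','1959-01-03',655435),
('Arizona','AZ','1912-02-14',5743834);
```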
Suppose now that you've used mysqldump to create an SQL-format dump file. How do you import the file back into MySQL? One common mistake at this point is to use mysqlimport. After all, it's logical to assume that if mysqldump exports tables, mysqlimport must import them. Right? Sorry, no. That might be logical, but it's not always correct. It's true that if you use the --tab option with mysqldump, you can import the resulting datafiles with mysqlimport. But if you dump an SQL-format file, mysqlimport won't process it properly. Use the mysql program instead. The way you do this depends on what's in the dump file. If you dumped multiple databases using --all-databases, the file contains the appropriate USE db_name statements to select the database to which each table belongs, and you need no database argument on the command line:

% mysql < dump.txt

If you dumped tables from a single database, you'll need to tell mysql which database to import them into:

% mysql db_name < cookbook.txt

Note that with this second import command, it's possible to load the tables into a database different from the one they came from originally. For example, you can use this fact to create copies of a table or tables in a test database to help debug data manipulation statements, without worrying about affecting the original tables.
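As a concrete sketch of that last point, the commands below dump one table and reload the copy elsewhere. The database names are assumptions for illustration: cookbook is the source, and test is an existing scratch database you have permission to write to.

```shell
# Dump the states table from the cookbook database in SQL format...
% mysqldump cookbook states > states.txt

# ...then load the copy into a separate test database, leaving the
# original cookbook.states table untouched
% mysql test < states.txt
```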