Exporting a Database
You can use gs_dump to export data and all object definitions of a database from GaussDB(DWS). You can specify the information to be exported as follows:
- Export full information of a database, including its data and all object definitions.
You can use the exported information to create an identical database containing the same data as the current one.
- Export all object definitions of a database, including the definitions of the database, functions, schemas, tables, indexes, and stored procedures.
You can use the exported object definitions to quickly create a database identical to the current one, but without its data.
- Export data of a database.
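The three modes above map to gs_dump options used later in this article: no mode option exports both data and definitions, -s exports object definitions only, and -a exports data only. The following is a minimal sketch with a hypothetical helper function; the assembled command is only printed, not executed, and the output path is illustrative.

```shell
# Hypothetical helper: map each export mode to the gs_dump option that
# selects it. No option = full export (data + object definitions),
# -s = object definitions only, -a = data only.
dump_flag() {
  case "$1" in
    full)        echo "" ;;
    definitions) echo "-s" ;;
    data)        echo "-a" ;;
  esac
}

# Assemble a command line without executing it (path is illustrative):
echo "gs_dump -p 8000 postgres -F t $(dump_flag definitions) -f /tmp/postgres_def.tar"
```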
Procedure
- Connect to the default database postgres as a database administrator through the database client tool provided by GaussDB(DWS).
For example, use the gsql client to connect to the database by running the following command:
gsql -d postgres -h 192.168.2.30 -U dbadmin -p 8000 -r
Enter your password as prompted.
- Use gs_dump to export the postgres database to a .tar archive.
gs_dump -W Bigdata@123 -U jack -f /home/dbadmin/backup/postgres_backup.tar -p 8000 postgres -F t
Table 1 Common parameters

| Parameter | Description | Example Value |
|---|---|---|
| -U | User name for the database connection. | -U jack |
| -W | Password for the database connection. This parameter is not required for database administrators when the trust authentication policy is used. If this parameter is omitted and you are not a database administrator, you will be prompted to enter the password. | -W Bigdata@123 |
| -f | Path of the file to store the exported data. If this parameter is not specified, the exported data is sent to the standard output. | -f /home/dbadmin/backup/postgres_backup.tar |
| -p | TCP port or local Unix domain socket file extension on which the server is listening for connections. | -p 8000 |
| dbname | Name of the database to be exported. | postgres |
| -F | Format of the exported files: p (plain text), c (custom), d (directory), or t (.tar). | -F t |
For details about other parameters, see gs_dump.
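An exported archive is typically loaded back with gs_restore, the companion restore tool for gs_dump archives. The sketch below only assembles and prints the command rather than running it; the target database name newdb is a hypothetical assumption, and the exact options should be verified against the gs_restore reference.

```shell
# Sketch only: restore the archive exported above into another database.
# "newdb" is a hypothetical target database; the command is printed, not run.
backup=/home/dbadmin/backup/postgres_backup.tar
restore_cmd="gs_restore -U jack -p 8000 -d newdb $backup"
echo "$restore_cmd"
```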
Examples
Example 1: Run gs_dump to export full information of the postgres database to a plain-text .sql file, compressed at level 8.
gs_dump -W Bigdata@123 -f /home/dbadmin/backup/postgres_backup.sql -p 8000 postgres -Z 8 -F p
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: dump database postgres successfully
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: total time: 3793 ms
Example 2: Run gs_dump to export data of the postgres database, excluding object definitions. The exported files are in a custom format.
gs_dump -W Bigdata@123 -f /home/dbadmin/backup/postgres_data_backup.dmp -p 8000 postgres -a -F c
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: dump database postgres successfully
gs_dump[port='8000'][postgres][2017-07-21 15:36:13]: total time: 3793 ms
Example 3: Run gs_dump to export object definitions of the postgres database. The exported files are in .sql format.
--Before the export, the nation table contains data.
select n_nationkey,n_name,n_regionkey from nation limit 3;
n_nationkey | n_name | n_regionkey
-------------+---------------------------+-------------
0 | ALGERIA | 0
3 | CANADA | 1
11 | IRAQ | 4
(3 rows)
gs_dump -W Bigdata@123 -f /home/dbadmin/backup/postgres_def_backup.sql -p 8000 postgres -s -F p
gs_dump[port='8000'][postgres][2017-07-20 15:04:14]: dump database postgres successfully
gs_dump[port='8000'][postgres][2017-07-20 15:04:14]: total time: 472 ms
Example 4: Run gs_dump to export object definitions of the postgres database. The exported files are in plain-text format and are encrypted.
gs_dump -W Bigdata@123 -f /home/dbadmin/backup/postgres_def_backup.sql -p 8000 postgres --with-encryption AES128 --with-key 1234567812345678 -s -F p
gs_dump[port='8000'][postgres][2018-11-14 11:25:18]: dump database postgres successfully
gs_dump[port='8000'][postgres][2018-11-14 11:25:18]: total time: 1161 ms