Using Oracle Data Pump

1. Oracle Data Pump is Oracle's newer, faster, and more flexible data export and import utility.
2. Creating a DIRECTORY
Before using Data Pump, you must create a DIRECTORY object and grant access to it.
For example:
SQL> conn / as sysdba
Connected.
SQL> CREATE DIRECTORY MY_DIR AS '/U01/oracle';

Directory created.

SQL> GRANT READ,WRITE ON DIRECTORY MY_DIR TO TEST;

Grant succeeded.

The TEST user can now run exports and imports against the MY_DIR directory.
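The directory object and its operating-system path can be verified from the data dictionary:

SQL> SELECT directory_name, directory_path FROM dba_directories;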

3. Table export and import
Use the TABLES parameter to list the tables to export or import, and the CONTENT parameter to control what is transferred: ALL (table metadata and data, the default), DATA_ONLY (row data only), or METADATA_ONLY (object definitions only).

expdp Test/oracle DIRECTORY=MY_DIR TABLES=MYOBJECT,OBJ JOB_NAME='Expdp_TEST_JOB' DUMPFILE=expTABLES.dmp
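The same command can be combined with CONTENT; for example, to capture only the table definitions (the dump file name here is illustrative):

expdp Test/oracle DIRECTORY=MY_DIR TABLES=MYOBJECT,OBJ CONTENT=METADATA_ONLY DUMPFILE=expTABLES_meta.dmp

The verification below uses the full dump taken above: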

SQL> conn Test/oracle
Connected.
SQL> select table_name from user_tables;

TABLE_NAME
------------------------------
OBJ
MYOBJECT

SQL> drop table OBJ PURGE;

Table dropped.

SQL> drop table MYOBJECT PURGE;

Table dropped.

SQL> select table_name from user_tables;

no rows selected

impdp Test/oracle DIRECTORY=MY_DIR TABLES=MYOBJECT,OBJ JOB_NAME='Expdp_TEST_JOB' DUMPFILE=expTABLES.dmp

SQL> select table_name from user_tables;

TABLE_NAME
------------------------------
OBJ
MYOBJECT
As the query shows, the tables and their data have been imported back into the database.

4. Schema export and import
Use the SCHEMAS parameter to specify the schemas (and their data) to export or import.
For example:
expdp system/oracle schemas=TEST directory=MY_DIR dumpfile=TEST.dmp  logfile=expdpTEST.log

SQL> conn / as sysdba
Connected.

SQL> drop user TEST cascade;

User dropped.

[oracle@Linux ~]$ impdp system/oracle schemas=TEST directory=MY_DIR dumpfile=TEST.dmp logfile=expdpTEST.log

Import: Release 11.2.0.1.0 - Production on Thu Jul 7 14:24:43 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_01":  system/******** schemas=TEST directory=MY_DIR dumpfile=TEST.dmp logfile=expdpTEST.log
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "TEST"."MYOBJECT"                           511.1 KB    5549 rows
. . imported "TEST"."OBJ"                                511.1 KB    5550 rows
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully completed at 14:24:58

SQL> select username from dba_users;

USERNAME
------------------------------
OUTLN
SYS
SYSTEM
TEST
ADMIN
TEST_B
TEST_A
APPQOSSYS
DBSNMP
DIP
ORACLE_OCM

11 rows selected.

SQL> conn test/oracle
Connected.
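
On import, a schema dump can also be loaded under a different name with REMAP_SCHEMA. A sketch, using a hypothetical target schema TEST_COPY (when the export was taken by a privileged user, Data Pump can create the target user from the dumped metadata):

impdp system/oracle directory=MY_DIR dumpfile=TEST.dmp remap_schema=TEST:TEST_COPY logfile=impdpTEST_COPY.log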

5. Full database export and import

Use the FULL parameter to request a full-database export or import.
expdp system/oracle full=Y directory=MY_DIR dumpfile=DBFull.dmp logfile=expdpDBfull.log
SQL> conn / as sysdba
Connected.

SQL> drop user TEST cascade;

User dropped.

SQL> drop user TEST_A cascade;

User dropped.

SQL> drop user TEST_B cascade;

User dropped.
impdp system/oracle full=Y directory=MY_DIR dumpfile=DBFull.dmp logfile=expdpDBFull.log
SQL> conn / as sysdba
Connected.
SQL> select username from dba_users;

USERNAME
------------------------------
OUTLN
SYS
SYSTEM
TEST
ADMIN
TEST_B
TEST_A
APPQOSSYS
DBSNMP
DIP
ORACLE_OCM

11 rows selected.

6. INCLUDE and EXCLUDE
The INCLUDE and EXCLUDE parameters restrict an export or import to specific objects. INCLUDE names the objects to be processed; EXCLUDE names objects to be skipped while everything else is processed. The two parameters are mutually exclusive.
INCLUDE=object_type[:name_clause] [, ...]
EXCLUDE=object_type[:name_clause] [, ...]
Object types include TABLE, VIEW, PACKAGE, and so on.
name_clause: a SQL expression used to filter object names, such as IN ('DEPT','EMP'), LIKE 'D%', or ='EMP'.

For example:
expdp TEST/oracle DIRECTORY=MY_DIR DUMPFILE=TEST2.DMP schemas=TEST include=TABLE:\"=\'OBJ\'\"

SQL> select table_name from user_tables;

TABLE_NAME
------------------------------
OBJ
MYOBJECT

SQL> drop table obj purge;

Table dropped.
impdp TEST/oracle DIRECTORY=MY_DIR DUMPFILE=TEST2.DMP schemas=TEST include=TABLE:\"=\'OBJ\'\"
SQL>  select table_name from user_tables;

TABLE_NAME
------------------------------
OBJ
MYOBJECT
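
The complementary EXCLUDE form exports everything in the schema except the named table (dump file name illustrative). On Linux the quotes must be escaped as shown, or the parameter can be moved into a PARFILE to avoid the shell escaping entirely:

expdp TEST/oracle DIRECTORY=MY_DIR DUMPFILE=TEST_noobj.DMP schemas=TEST exclude=TABLE:\"=\'OBJ\'\"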


7. Network export and import
The NETWORK_LINK parameter identifies a database link to be used as the source of the export or import.
The user must have the CREATE DATABASE LINK privilege, and both the local and remote users need the EXP_FULL_DATABASE role.

SQL> GRANT CREATE DATABASE LINK TO TEST;

SQL> CREATE DATABASE LINK tlink connect to test identified by oracle using 'Instinct';

Database link created.

expdp Test/oracle tables=TEST.OBJ network_link=tlink directory=MY_DIR dumpfile=obj.dmp logfile=expdpobj.log
impdp Test/oracle tables=TEST.OBJ network_link=tlink directory=MY_DIR logfile=impdpobj.log

Note that a network import loads the data directly over the link, so no dump file is read or written on the import side; the DUMPFILE parameter does not apply there.


8. Miscellaneous
The dump file and log file are written on the database server, not on the client.
Killing the client process does not stop a running Data Pump job. While a job is running, pressing Ctrl+C stops the client's log output and displays the interactive command prompt; type status there to monitor the job:
Export> status
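
A detached client, or a new session, can re-attach to a running job by name with the ATTACH parameter (documented in the help output in section 12) and issue interactive commands there, e.g. for the job created in section 3:

expdp Test/oracle attach=Expdp_TEST_JOB

Export> status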

9. PARALLEL
The PARALLEL parameter sets the number of worker processes used for an export or import.
For example:
expdp Test/oracle schemas=TEST directory=MY_DIR parallel=4 dumpfile=Test3.dmp logfile=expdpTest.log
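
With several workers, each needs its own file to write to, so in practice the dump file is given with the %U substitution variable, which generates a numbered file per worker; a sketch of the adjusted command:

expdp Test/oracle schemas=TEST directory=MY_DIR parallel=4 dumpfile=Test3_%U.dmp logfile=expdpTest.log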

SQL> select  OWNER_NAME,JOB_NAME,OPERATION ,JOB_MODE,STATE,DEGREE from dba_datapump_jobs;

OWNER_NAME JOB_NAME             OPERATION JOB_MODE STATE     DEGREE
---------- -------------------- --------- -------- --------- ------
TEST       SYS_EXPORT_SCHEMA_01 EXPORT    SCHEMA   EXECUTING      4

10. Data Pump API

Data Pump jobs can also be created and controlled from PL/SQL through the DBMS_DATAPUMP package. The following block starts a schema-mode export of the SCOTT schema, writing SCOTT.dmp and SCOTT.log to the TEST_DIR directory object, then detaches, leaving the job to run on the server:

DECLARE
  l_dp_handle  NUMBER;  -- job handle returned by DBMS_DATAPUMP.OPEN
BEGIN
  -- Define a schema-mode export job.
  l_dp_handle := DBMS_DATAPUMP.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'EMP_EXPORT',
    version     => 'LATEST');

  -- Dump file that will receive the data and metadata.
  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'SCOTT.dmp',
    directory => 'TEST_DIR');

  -- Log file for the job's messages.
  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'SCOTT.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Restrict the job to the SCOTT schema.
  DBMS_DATAPUMP.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''SCOTT''');

  -- Start the job, then detach; the job continues on the server.
  DBMS_DATAPUMP.start_job(l_dp_handle);

  DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
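
After the block detaches, the job keeps running on the server; it can be watched through the same dictionary view used in section 9:

SQL> SELECT owner_name, job_name, state FROM dba_datapump_jobs;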


11. External tables

The ORACLE_DATAPUMP access driver can be used with external tables to unload data to, and read data back from, Data Pump dump files.

CREATE TABLE OBJ_XT
  ORGANIZATION EXTERNAL
   (
     TYPE ORACLE_DATAPUMP
     DEFAULT DIRECTORY MY_DIR
     LOCATION ('OBJ_XT.DMP')
   )
   AS SELECT * FROM OBJ;
The data can then be queried as follows:
SELECT * FROM OBJ_XT;
An external table can also be created over an existing dump file by specifying the column list explicitly:
DROP TABLE OBJ_XT;

CREATE TABLE OBJ_XT (
 OWNER              VARCHAR2(30),
 OBJECT_NAME        VARCHAR2(30),
 SUBOBJECT_NAME             VARCHAR2(30),
 OBJECT_ID          NUMBER,
 DATA_OBJECT_ID             NUMBER,
 OBJECT_TYPE                VARCHAR2(19),
 CREATED            DATE,
 LAST_DDL_TIME      DATE,
 TIMESTAMP                  VARCHAR2(19),
 STATUS                     VARCHAR2(7),
 TEMPORARY                  VARCHAR2(1),
 GENERATED                  VARCHAR2(1),
 SECONDARY                  VARCHAR2(1),
 NAMESPACE          NUMBER,
 EDITION_NAME               VARCHAR2(30))
  ORGANIZATION EXTERNAL (
     TYPE ORACLE_DATAPUMP
     DEFAULT DIRECTORY MY_DIR
     LOCATION ('OBJ_XT.DMP')
  );
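
The dump file written by the CREATE TABLE ... AS SELECT above is an ordinary Data Pump file, so it can be copied to another server, exposed there with a column-listing definition like the one above, and loaded into a regular table; a sketch, assuming a hypothetical destination table OBJ_COPY with matching columns:

INSERT INTO OBJ_COPY SELECT * FROM OBJ_XT;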

12. Help
Both utilities print a full parameter reference with HELP=Y.
expdp

expdp help=y

Export: Release 10.1.0.2.0 - Production on Tuesday, 23 March, 2004 8:33

Copyright (c) 2003, Oracle.  All rights reserved.


The Data Pump export utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

   Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:

   Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
   Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
               or TABLES=(T1:P1,T1:P2), if T1 is partitioned table

USERID must be the first parameter on the command line.

Keyword               Description (Default)
------------------------------------------------------------------------------
ATTACH                Attach to existing job, e.g. ATTACH [=job name].
CONTENT               Specifies data to unload where the valid keywords are:
                      (ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY             Directory object to be used for dumpfiles and logfiles.
DUMPFILE              List of destination dump files (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
ESTIMATE_ONLY         Calculate job estimates without performing the export.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FILESIZE              Specify the size of each dumpfile in units of bytes.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Export entire database (N).
HELP                  Display Help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of export job to create.
LOGFILE               Log file name (export.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write logfile (N).
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to export a subset of a table.
SCHEMAS               List of schemas to export (login schema).
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
TABLES                Identifies a list of tables to export - one schema only.
TABLESPACES           Identifies a list of tablespaces to export.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST, or any valid database version.

The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command               Description
------------------------------------------------------------------------------
ADD_FILE              Add dumpfile to dumpfile set.
                      ADD_FILE=dumpfile-name
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS=[interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
impdp

impdp help=y

Import: Release 10.1.0.2.0 - Production on Saturday, 11 September, 2004 17:22

Copyright (c) 2003, Oracle.  All rights reserved.


The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

     Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:

     Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
     Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

USERID must be the first parameter on the command line.

Keyword               Description (Default)
------------------------------------------------------------------------------
ATTACH                Attach to existing job, e.g. ATTACH [=job name].
CONTENT               Specifies data to load where the valid keywords are:
                      (ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY             Directory object to be used for dump, log, and sql files.
DUMPFILE              List of dumpfiles to import from (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Import everything from source (Y).
HELP                  Display help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of import job to create.
LOGFILE               Log file name (import.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write logfile.
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to import a subset of a table.
REMAP_DATAFILE        Redefine datafile references in all DDL statements.
REMAP_SCHEMA          Objects from one schema are loaded into another schema.
REMAP_TABLESPACE      Tablespace object are remapped to another tablespace.
REUSE_DATAFILES       Tablespace will be initialized if it already exists (N).
SCHEMAS               List of schemas to import.
SKIP_UNUSABLE_INDEXES Skip indexes that were set to the Index Unusable state.
SQLFILE               Write all the SQL DDL to a specified file.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
STREAMS_CONFIGURATION Enable the loading of Streams metadata
TABLE_EXISTS_ACTION   Action to take if imported object already exists.
                      Valid keywords: (SKIP), APPEND, REPLACE and TRUNCATE.
TABLES                Identifies a list of tables to import.
TABLESPACES           Identifies a list of tablespaces to import.
TRANSFORM             Metadata transform to apply (Y/N) to specific objects.
                      Valid transform keywords: SEGMENT_ATTRIBUTES and STORAGE.
                      ex. TRANSFORM=SEGMENT_ATTRIBUTES:N:TABLE.
TRANSPORT_DATAFILES   List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be loaded.
                      Only valid in NETWORK_LINK mode import operations.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST, or any valid database version.
                      Only valid for NETWORK_LINK and SQLFILE.

The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command               Description (Default)
------------------------------------------------------------------------------
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
                      START_JOB=SKIP_CURRENT will start the job after skipping
                      any action which was in progress when job was stopped.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS=[interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
