
The New Data Pump

Caleb Small Caleb@caleb.com

Next generation Import / Export


- New features
- Better performance
- Improved security
- Versatile interfaces

Old Import / Export


- Still exists; installed and enabled by default
- Supports features through version 9i
- No new releases
- Required to import pre-10g dump files
- Dump files & scripts are NOT compatible

New Features
- Data sampling & filtering
- Object filtering
- Estimate file size, max file size, multiple files
- Remap datafile, schema, tablespace
- Network mode
- Point-in-time export
- Version conversion
- Column data encryption

Better Performance
- Complete re-write
- Parallel processing
- Parallel data streams to multiple files
- Runs within the instance
- Uses direct path whenever possible
- Restartable
- Tunable (auto-tuning)
- Progress monitoring

Improved Security
- Server based only: no client-side exports!
- Based on directory objects within the database
- Always runs as the oracle process on behalf of the invoking database user

Versatile Interfaces
- Command line
- Parameter file
- Interactive mode
- DBMS_DATAPUMP package
- DB Console (Enterprise Manager)
- External table
- Scheduled job

Data Pump Architecture
Master process
Manages and controls the operation

Worker process(es)
- Responsible for data movement
- One for each degree of parallelism

Master table
- Created in the invoker's schema at job start
- Maintained during job execution
- Dropped after successful completion
- Used to resume a paused/failed job

Control & status queues

Types of Exports
- Table
- Schema
- Tablespace
- Database (full)
- Transportable tablespace (metadata only)
- INCLUDE / EXCLUDE object filters
- QUERY and SAMPLE data filters
- CONTENT = DATA_ONLY | METADATA_ONLY | ALL
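As a sketch, the object and data filters above can be combined in a parameter file (the directory, dump file, schema, and predicate below are illustrative, not from the slides):

```
# myfilters.par -- illustrative names
DIRECTORY=dp_dir
DUMPFILE=scott.dmp
SCHEMAS=scott
INCLUDE=TABLE:"IN ('EMP','DEPT')"
QUERY=emp:"WHERE deptno = 10"
CONTENT=ALL
```

Then run it with something like `expdp scott/tiger parfile=myfilters.par`. Keeping the filters in a parameter file avoids the shell-quoting problems that INCLUDE and QUERY cause on the command line.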

Directory Objects
- Created as a database object
- Requires the CREATE ANY DIRECTORY privilege
- Permissions (read, write) granted on the object to specific user(s)
- Not validated at creation time: existence, syntax, and OS privileges are not checked
- Accessed as user oracle at the OS level
- Default DATA_PUMP_DIR maps to
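A minimal sketch of creating and granting a directory object (the path, object name, and grantee are illustrative):

```sql
-- Run as a privileged user; the OS path is not validated at creation time
CREATE DIRECTORY dp_dir AS '/u01/app/oracle/dpdump';
GRANT READ, WRITE ON DIRECTORY dp_dir TO scott;
```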

Data Access
Direct Path
- Chosen automatically whenever possible
- Reads/writes data blocks directly
- No undo; redo can be turned off
- See the Utilities Guide for exceptions (e.g. active triggers, clustered tables, BFILE columns, etc.)

External Table
- Equivalent to the old conventional path
- Normal SQL and commit processing, slower
- NOT the same as the external table driver for SQL

Monitoring Data Pump


- STATUS parameter
- Detach / re-attach jobs
- Stop / start / kill jobs
- STATUS command (interactive mode)
- Data dictionary views:
  - DBA_DATAPUMP_JOBS
  - DBA_DATAPUMP_SESSIONS
  - V$SESSION_LONGOPS
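A quick sketch of checking running jobs through the dictionary view listed above:

```sql
-- List Data Pump jobs and their current state
SELECT owner_name, job_name, operation, job_mode, state
FROM   dba_datapump_jobs;
```

The JOB_NAME shown here is what you pass to expdp/impdp with ATTACH to re-attach to a running or stopped job.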

Log File

Interactive Mode
- NOT the same as old imp/exp!
- Default is a schema-mode export
- Use command-line arguments or a parameter file
- Logging vs interactive command mode:
  - Default logging mode logs to the terminal
  - Ctrl-C to enter interactive command mode
- The job continues to run even if the client disconnects!
expdp scott/tiger@fred parfile=myjob.par

Interactive Mode Commands


- STATUS: status of the current job
- CONTINUE_CLIENT: resume logging mode
- EXIT_CLIENT: exit the client, leave the job running
- STOP_JOB: stop the current job, do not delete it
- START_JOB: re-start the current job
- KILL_JOB: delete the current job
- PARALLEL: add/remove worker processes
- ADD_FILE: add a dump file during export
- HELP: get help

Obsolete Parameters

Export: BUFFER, COMPRESS, CONSISTENT, DIRECT, RECORD_LENGTH, RESUMABLE, STATISTICS, USERID, VOLSIZE

Import: BUFFER, CHARSET, COMMIT, COMPILE, FILESIZE, RECORD_LENGTH, RESUMABLE, STATISTICS

Others have changed, see the Utilities Guide!

Data Pump API


- Grant EXECUTE on DBMS_DATAPUMP and optionally DBMS_METADATA
- Exec DBMS_DATAPUMP.OPEN
- Define parameters (job type, dump file, etc.)
- Exec DBMS_DATAPUMP.START_JOB
- Optionally monitor, detach, re-attach, stop, start or kill the job
- Can be scheduled as a recurring job
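The steps above can be sketched as a minimal schema-mode export through the API (the job name, dump file, directory, and schema are illustrative):

```sql
-- Minimal DBMS_DATAPUMP export job; assumes a directory object DP_DIR exists
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'SCOTT_EXP');
  DBMS_DATAPUMP.ADD_FILE(handle    => h,
                         filename  => 'scott_api.dmp',
                         directory => 'DP_DIR');
  DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                name   => 'SCHEMA_EXPR',
                                value  => 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);  -- job keeps running after we detach
END;
/
```

Because the job runs in the server, DETACH returns control immediately; you can re-attach later with DBMS_DATAPUMP.ATTACH or from the expdp client.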

External Table (SQL)


- Created as a database object (TABLE)
- ORGANIZATION EXTERNAL
- Uses the ORACLE_DATAPUMP access driver
- Can load and unload data
- Dump file contains row data only
- Metadata stored in the data dictionary
- Not compatible with a regular dump file
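A sketch of unloading a table this way (the table, directory, and file names are illustrative):

```sql
-- Unload: CREATE TABLE ... AS SELECT writes emp_unload.dmp
-- via the ORACLE_DATAPUMP access driver
CREATE TABLE emp_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY dp_dir
    LOCATION ('emp_unload.dmp')
  )
AS SELECT * FROM emp;
```

The resulting file can then be attached as an external table in another database and queried (or loaded) with ordinary SQL.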

Network Mode
- Works across a database link
- Import reads tables from the remote DB and writes directly to tables in the local DB
  - No dump file created
  - Directory object still required for logging
- Export reads tables from the remote DB and writes to a dump file on the local server
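A sketch of a network-mode import (the credentials, directory, database link, and table are illustrative):

```
impdp scott/tiger DIRECTORY=dp_dir NETWORK_LINK=remote_db TABLES=emp LOGFILE=net_imp.log
```

NETWORK_LINK names an existing database link; the rows travel over the link with no intermediate dump file, but DIRECTORY is still needed for the log file.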

SQL File
- Import can generate a SQL file instead of actually performing the import
- Contains the DDL that would have been executed, based on the job parameters
- Passwords are excluded
- No change to the target DB
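For example (file and directory names are illustrative):

```
impdp scott/tiger DIRECTORY=dp_dir DUMPFILE=scott.dmp SQLFILE=scott_ddl.sql
```

This writes the DDL to scott_ddl.sql in the directory object and makes no changes to the target database.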

Required Reading
- Oracle Database New Features Guide
- Oracle Database Utilities
- PL/SQL Packages and Types Reference
- Oracle Data Pump FAQ on OTN

Demonstration
- Directory object basics
- Basic Data Pump unload and load
- Query / sample data filtering
- Re-attaching and monitoring a big job
- The PL/SQL API
- SQL file import
- External table access driver
www.caleb.com/dba Caleb@caleb.com

Tuning Data Pump


- PARALLEL is the only Data Pump specific tuning parameter
  - Set to 2x the number of CPUs
  - Increases resource consumption (memory, CPU, I/O bandwidth); do not overload these resources!
- Individual worker processes can use parallel query
- Use multiple dump files to maximize parallelism:
  - Separate files on separate physical devices/channels
  - Use a wildcard in the filename, e.g. dumpfile%U.dmp
  - Keep dump devices/channels separate from the source tablespace(s)
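Putting those points together, a sketch of a parallel export spread across two directory objects (all names are illustrative):

```
expdp system/manager FULL=y PARALLEL=4 DUMPFILE=dp_dir1:full%U.dmp,dp_dir2:full%U.dmp
```

The %U wildcard expands to a two-digit sequence number, so each of the four worker streams gets its own file, alternating between the two directories (and, ideally, two physical devices).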

Tuning Data Pump


Instance parameters that may affect performance:
- DISK_ASYNCH_IO=TRUE
- DB_BLOCK_CHECKING=FALSE
- DB_BLOCK_CHECKSUM=FALSE
- PROCESSES
- SESSIONS
- PARALLEL_MAX_SERVERS
- SHARED_POOL_SIZE
- UNDO_TABLESPACE
