Recently, I had a customer running a long Data Pump import job. The customer needed to go home and still have a way to monitor the import.
Starting with Oracle 10g, Data Pump provides a useful feature for situations in which you start a job in one location and need to check on it later from another:
Attach and Detach
While an Oracle Data Pump job is running, you can press [Ctrl]-C (or the equivalent on your client) to “detach” from the job. The messages stop coming to your client, but the job keeps running inside the database. Your client is placed in interactive mode (with an Export> prompt). To check on the job, type STATUS at the prompt.
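A detach session might look like this sketch (the Export> prompt commands shown — STATUS, EXIT_CLIENT, CONTINUE_CLIENT — are standard Data Pump interactive-mode commands):

```text
Export> STATUS            -- show the current job's progress and worker state
Export> EXIT_CLIENT       -- leave the client; the job keeps running on the server
```

CONTINUE_CLIENT resumes logging mode, and STOP_JOB/KILL_JOB stop the job itself rather than just the client.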
To attach to an already running job, run expdp system/password attach=<jobname>. If this is an option you think you will need often, it is a good idea to set the JOB_NAME parameter on all your jobs.
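For example, naming the export job up front might look like this (the schema, directory, and dump file names here are illustrative):

```shell
expdp system/mypassword schemas=HR directory=DATA_PUMP_DIR \
      dumpfile=hr.dmp job_name=ExportSchemaHR
```

With an explicit JOB_NAME you always know what to pass to ATTACH, instead of having to look up a system-generated name.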
expdp system/mypassword attach=ExportSchemaHR
To see what Data Pump jobs are running you can use this:
SELECT j.owner_name, j.job_name, j.job_mode, j.state, s.session_type, s.saddr
  FROM dba_datapump_jobs j, dba_datapump_sessions s
 WHERE UPPER(j.job_name) = UPPER(s.job_name);
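To watch how far along a running job is, you can also query v$session_longops, where the Data Pump master process posts its progress; this is a sketch that matches jobs with default system-generated names (SYS_EXPORT_... / SYS_IMPORT_...):

```sql
-- Approximate percent complete for active Data Pump jobs
SELECT sl.opname, sl.sofar, sl.totalwork,
       ROUND(sl.sofar / sl.totalwork * 100, 1) AS pct_done
  FROM v$session_longops sl
 WHERE sl.totalwork > 0
   AND (sl.opname LIKE '%EXPORT%' OR sl.opname LIKE '%IMPORT%');
```

SOFAR and TOTALWORK are reported in megabytes for Data Pump, so this gives a rough size-based estimate, not an exact object count.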
So, for the import, the same query shows the job name, and you attach with:

impdp system/mypassword attach=ExportSchemaHR
More generally, attach and then check the status at the prompt:

impdp user/password attach='job_user.job_name'
Import> STATUS
The job_user.job_name is the name of the job that you would have seen when you started it. The job_user is always the schema you were logged in as when you started the job. The job_name is either the JOB_NAME you specified when you started the job, or a unique name that the Data Pump code generated for it, which would look something like SYS_IMPORT_FULL_01.