Full Version: Import using Data Pump Issue
Oracle DBA Forums > Oracle > Oracle Forum
lmih
Hi all,

I would like to ask something regarding the Import Data Pump utility.

I managed to make a successful export with the Export Data Pump utility, but when I performed the Import Data Pump I saw the following. All the tables are inserted successfully, and then it continues with these messages:

Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS

The strange thing is that I don't see the final completion messages:

Processing object type SCHEMA_EXPORT/JOB
Job "SYSTEM"."SYS_IMPORT_FULL_02" successfully completed at


What might be the cause? Any ideas? How can I make sure the import actually completed?
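A minimal, self-contained way to check is to look for the Data Pump completion line in the log file. In this sketch the log contents are simulated (taken from the messages above), so it runs without a database; in real use `LOG` would point at the file named by impdp's LOGFILE parameter:

```shell
#!/bin/sh
# Simulate an impdp log that stopped at the statistics phase.
LOG=/tmp/check_imp.log
cat > "$LOG" <<'EOF'
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
EOF

# The import is finished only once Data Pump writes its completion line.
if grep -q 'successfully completed' "$LOG"; then
    echo "import finished"
else
    echo "no completion message yet; last log line:"
    tail -1 "$LOG"
fi
```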

Keep in mind that I am using IBM AIX 5.3.

Thank you in advance :-)
SteveC
See where? Output streamed to the command line? Attached or not? Writing to log file or not?
lmih
On the command line where I run the impdp command, the last message that appears is:

Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS

and then it seems to be stuck.
SteveC
Output could be buffered and not written to command line yet. Not using a log file is a poor practice.
lmih
I do use a log file, and in the log file too the last message that appears is the same one:

Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS

That's why it's so strange to me!
burleson
>> The strange case is that i don't see the final completion messages

Your buffer was not flushed!

Create a log file to capture the output, and then grep for "successfully completed" . . .

CODE
# run the export, then check its log for the completion message
expdp system/password dumpfile=JOB_USER.dmp logfile=/tmp/imp.log

# count occurrences of the completion message in the log
check_success=`grep -c 'successfully completed' /tmp/imp.log`

if [ "$check_success" -ne 1 ]
then
   # empty stdin so mailx does not wait for a message body
   /bin/mailx -s "** IMPORT ERROR!**" burleson2@usgov.com < /dev/null
fi



lmih
The successful completion message that doesn't appear (in either the command session or the log file) is missing from the import procedure, not the export. The export works fine!

burleson
Hi,

Glad you got it working!

>> The successfull completion message that doesn't appear (Command Session or Log File ) is not in the export procedure

Just FYI, export logs return "export completed successfully", and you can grep for this in export batch jobs . . .
lmih
I think we have a misunderstanding here. My problem is not that I don't see the export completing with all the necessary messages, but the import.

When I try to import the dmp file from the previous export, I can see the notification messages scrolling by (both in the log file and on the command line), but somehow I never see the final message confirming that the import has finished successfully. I repeat that I am talking about the import using the impdp command, not the export.
dave
trace the session, see what you are waiting on
burleson
>> i don't see the export to be completed with all the necessary messages but the import .

Like I said, it looks like you did not flush the buffer.

Use a technique like this, and spool your log to a flat file:

CODE
# run the import, then check its log for the completion message
impdp system/password dumpfile=JOB_USER.dmp logfile=/tmp/imp.log

# count occurrences of the completion message in the log
check_success=`grep -c 'successfully completed' /tmp/imp.log`

if [ "$check_success" -ne 1 ]
then
   # empty stdin so mailx does not wait for a message body
   /bin/mailx -s "** IMPORT ERROR!**" burleson2@usgov.com < /dev/null
fi
lmih
I created the script you provided, and here is the result:

. . imported "TEMP"."TEST" 0 KB 0 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS

The command line is stuck at the last message, as you can see. I cannot understand why it doesn't immediately flush the successful-import message. It seems to need a lot of time for the final confirmation.
dave
How do you know it's finished? Did you trace the session to see what it is doing?
lmih
The messages in my previous post are taken from the session. Is there an alternative way to trace the session that I am not familiar with? The question is: how much time, on average, is needed before the final successful-import message is flushed? In my case it needs 12 hours, and that is not acceptable. Perhaps the calculation of the table statistics is what takes so much time?
lmih
OK, I think I found the problem. The statistics take far too long, whether we run impdp or a SQL script that analyzes the tables and indexes. What I really want to mention is that in one database instance I have 7 users who each own the same set of objects. The schema is 42GB in size, so 42GB*7=294GB. I will look up the server specifications to get a sense of whether the statistics calculation is justified in taking so long, because once again the analyze procedure takes hours. Please let me know if you have experienced the same issue.

Thank you!!
SteveC
Statistics are imported if the table has them. That does not mean statistics are being gathered. What does an estimate show in terms of work to be performed?

Why not export/import (using dbms_stats, not expdp or impdp) the source statistics instead of gathering new values?
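SteveC's dbms_stats route, sketched. CREATE_STAT_TABLE, EXPORT_SCHEMA_STATS, and IMPORT_SCHEMA_STATS are standard DBMS_STATS procedures; the schema name TEMP comes from the log above, while the statistics table name MY_STATS is made up. The steps are only printed here, since running them needs a live database:

```shell
#!/bin/sh
# Print the SQL*Plus steps for moving optimizer statistics with DBMS_STATS
# instead of letting impdp import (or re-gather) them.
STEPS='-- on the source database:
EXEC DBMS_STATS.CREATE_STAT_TABLE(ownname => '\''TEMP'\'', stattab => '\''MY_STATS'\'');
EXEC DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => '\''TEMP'\'', stattab => '\''MY_STATS'\'');

-- copy the MY_STATS table to the target database, then:
EXEC DBMS_STATS.IMPORT_SCHEMA_STATS(ownname => '\''TEMP'\'', stattab => '\''MY_STATS'\'');'
echo "$STEPS"
```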
burleson
>> in one database instance i have 7 users that owns the same database

No!

A schema always has one, and only one owner. In a sense, a schema IS A user . . .

You may mean that you have seven schema owners, but what does that have to do with your problem?

You need to get up to speed on Oracle fundamentals, pronto:

http://www.rampant-books.com/book_2005_1_e...e_jumpstart.htm

As to your issue, perhaps you are being impatient and the job is not done yet.

Did you query v$session_longops?

http://www.dba-oracle.com/longops.htm


If you want, turn off the import of statistics and re-analyze later . . .
lmih
Thank you for your help and for the information. I know that a schema is a user; I wanted to clarify that I have 7 users who each own the same objects and the same amount of data.

Now I am trying to use Data Pump to speed up the traditional import/export procedure. Currently the traditional import and export require 1 hour. Using Data Pump, the export needs only 5 minutes, but the import (including statistics) needs 12 hours!!! This way, Data Pump makes my life harder. So I think it is better to import without the statistics and run the analyze (tables, indexes) later on.

Data Pump export is supposed to be more efficient, yet as far as I can see it responds slowly: 12 hours????
SteveC
You said you have 7 schemas which are basically copies of one another, at 42GB each, and the export/import takes one hour? One after another, or are you running multiple export/import sessions?
lmih
1 hour is needed for one schema user. I don't run multiple sessions, only a single one :-)
burleson
>> I don't run multiple sessions only a single one :-)

Why not?

Have you tested the server stress?

On an SMP machine you may be able to nohup several imports at the same time, provided that the data files reside on a separate disk spindle . . .
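The pattern burleson describes might look like the sketch below. The real impdp invocations are shown only as comments (with illustrative parameters such as remap_schema, since a single dump file would be remapped into each target schema), and sleeps stand in for them so the control flow itself can be run anywhere:

```shell
#!/bin/sh
# Launch several imports in the background, then wait for all of them.
for schema in USER1 USER2 USER3; do
    # real use (illustrative parameters):
    # nohup impdp system/password remap_schema=TEMP:$schema \
    #       dumpfile=JOB_USER.dmp logfile=imp_$schema.log > /dev/null 2>&1 &
    sleep 1 &    # stand-in for one import job
done
wait             # block until every background job has finished
status=$?
echo "all imports done"
```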
lmih
Since I have only one specific dmp file, how would I be able to run multiple imports? What I currently do is restore the dmp file for only one user; there is no need to restore the dmp file into multiple users simultaneously.


Also, regarding testing the server stress: any advice on how I could perform that?
burleson
>> i wanted to clarify that i have 7 users that owns the same objects

And I wanted to clarify that you don't! It's IMPOSSIBLE!

*************************************************************
>> Please let me know if you have experienced the same issue.

I never import statistics, so I would not know.

I import, and add indexes, constraints and CBO stats after . . . .

Plus, if you are on 10g, the import should kick in a re-analyze automatically . . .
lmih
I mean 7 schema users, sorry for the mistyping!

So I guess the final conclusion regarding the Import/Export Data Pump utility is to restore everything except STATISTICS, for efficiency's sake, and subsequently run the analyze package.
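That conclusion, sketched as commands. EXCLUDE=STATISTICS is a standard impdp parameter and GATHER_SCHEMA_STATS a standard DBMS_STATS procedure; the file and schema names are illustrative. The commands are printed rather than executed, since they need a live Oracle instance:

```shell
#!/bin/sh
# Import everything except statistics, then gather statistics afterwards.
CMDS='impdp system/password dumpfile=JOB_USER.dmp logfile=imp.log EXCLUDE=STATISTICS

-- later, in SQL*Plus:
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => '\''TEMP'\'', cascade => TRUE);'
echo "$CMDS"
```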

I want to thank you all for the creative and productive discussion we had.