The error log was as follows:
IMP-00020: long column too large for column buffer size (7)

The corresponding entry in the Oracle error-message reference reads roughly like this:

IMP-00020: long column too large for column buffer size (num)
Cause: The column buffer is too small.
This usually occurs when importing LONG data.
Action: Increase the insert buffer size 10,000 bytes at a time (for example)
up to 66,000 or greater. Use this step-by-step approach because a buffer size
that is too large may cause a similar problem.
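The step-by-step buffer increase suggested above maps to the `BUFFER` parameter of the `imp` utility. A minimal sketch (credentials, dump file name, and table name are placeholders, not from the original post):

```shell
# Placeholder connect string, file, and table; BUFFER is in bytes.
imp scott/tiger FILE=exp.dmp TABLES=t LOG=imp.log BUFFER=30000

# If IMP-00020 persists, retry, raising BUFFER in roughly
# 10,000-byte steps as the manual suggests:
imp scott/tiger FILE=exp.dmp TABLES=t LOG=imp.log BUFFER=40000
imp scott/tiger FILE=exp.dmp TABLES=t LOG=imp.log BUFFER=50000
```

As the story below shows, in this particular case no buffer size helps, because the problem is bad metadata, not an undersized buffer.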
I tried adjusting the buffer size several times, but the same error kept coming back, so I suspected the export dump file was corrupted. I re-ran the export, and imp still failed. Out of options, I had to go search Metalink (access to Metalink is painfully slow!). After quite a wait the results finally came up, and it turned out I had hit Bug 2417643!
Note:286597.1 Oracle9i: IMP-20 on Import of Table with TIMESTAMP Column that was Created via Database Link

CAUSE
-----
If you create a table with the CREATE TABLE ... AS SELECT (CTAS) syntax, and the original table is located on a remote database that you access via database link, and the table has a column with the TIMESTAMP data type, then the newly created table has an incorrect precision in the data dictionary. ... If you export this table, there is a mismatch in the metadata syntax of the table in the export dumpfile. As a result, import will fail because import expects a field of 7 bytes (used to store the value of a timestamp without any precision), but actually encounters values of 11 bytes (used to store timestamps with a precision). As a result, an error is produced:

IMP-00020: long column too large for column buffer size (7)

FIX
---
This defect has been fixed in Oracle10g. This means that if the table is created in Oracle10g via a database link and exported afterwards, then the import of that table with a TIMESTAMP column will not result in an IMP-20 error.

In Oracle8i and Oracle9i, use the workaround by correcting the precision of the TIMESTAMP column in the data dictionary. Example:

In short, the fix is to correct the precision of the TIMESTAMP column on the table that was created via CTAS over a database link, then re-run the export / import.
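The note's own example is truncated above, so here is only a hedged sketch of the same idea, not the note's exact commands. Table name T, column name TS, and the connect string are all hypothetical. Declaring an explicit precision makes the dictionary metadata match the 11-byte values actually stored, so a fresh export/import no longer trips IMP-00020:

```shell
# Hypothetical names throughout; run against the source database
# before re-exporting.
sqlplus scott/tiger <<'EOF'
-- Give the column an explicit precision so the data dictionary
-- records TIMESTAMP(6) instead of the bogus no-precision entry.
ALTER TABLE t MODIFY (ts TIMESTAMP(6));

-- Verify what the dictionary now records for the column.
SELECT column_name, data_type
  FROM user_tab_columns
 WHERE table_name = 'T';
EOF
```

After this, repeat the exp / imp cycle; the dump file metadata and the column values should now agree.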
--EOF--
From the "ITPUB blog", link: http://blog.itpub.net/72624/viewspace-606979/. Please credit the source when reposting; otherwise legal liability may be pursued.