views: 350
answers: 2

I'm getting an ORA-01438 error message in production.

Unfortunately this comes from a purchased application, and its support process is not exactly fast.

This happens when data is being copied from one table to another. Both tables are supposed to have the same column types and lengths. So far I have reviewed some of them by running the following:

select distinct( length( column_name ) ) from source_table

Then I compare the result with the declared length of column_name in the target table, but it's taking me a lot of time.
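In other words, for each column I'm effectively doing something like this (source_table, target_table and column_name are just placeholders; user_tab_columns is Oracle's data dictionary view of column definitions):

-- longest actual value in the source column
select max( length( column_name ) ) from source_table;

-- declared length of the same column in the target table
-- (use all_tab_columns instead if the table belongs to another schema)
select data_length
  from user_tab_columns
 where table_name = 'TARGET_TABLE'
   and column_name = 'COLUMN_NAME';

(For NUMBER columns I suppose the limit to compare against would be data_precision rather than data_length.)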

Is there a better way to perform this check?

I want to identify which column contains data whose length goes beyond the limit defined in the target table.

I'm working with:

Oracle9i Enterprise Edition Release 9.2.0.7.0 - Production
With the Partitioning, OLAP and Oracle Data Mining options
JServer Release 9.2.0.7.0 - Production
A: 

A brute force method for debugging this:

You could write a script that runs a FOR loop and inserts row by row from Table A into Table B, outputting the row ID or some other pertinent data to the console. Once you have identified the bad row, you could update an existing row column by column with the data from the bad row until you find the culprit.
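Something like the following PL/SQL block would do it. This is only a rough sketch: source_table and target_table stand in for Tables A and B, it assumes both tables have their columns in the same order, and you would run it with SET SERVEROUTPUT ON so the ROWIDs are printed.

declare
  value_too_large exception;
  pragma exception_init( value_too_large, -1438 );  -- ORA-01438
begin
  for r in ( select rowid as rid from source_table ) loop
    begin
      -- copy one source row at a time
      insert into target_table
        select * from source_table where rowid = r.rid;
    exception
      when value_too_large then
        dbms_output.put_line( 'offending rowid: ' || r.rid );
    end;
  end loop;
  rollback;  -- this was only a hunt for bad rows, so leave target_table as it was
end;
/

Once a bad ROWID has been printed, the same single-row SELECT can be reused to try the update column by column, as described above.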

akf
I would have to create a third table C, because I don't want to half-migrate A into B and leave it in an inconsistent state... hmm, I wonder how long this would take. Table A has around 100 columns and contains about 9k records ...
OscarRyz
Yeah, that sounds cumbersome. However, if you are working on Table A from scratch, you could truncate it once you are done with your hunt. Good Luck!
akf
A: 

Try the link below:

ORA-01438: value larger than specified precision allows for this column

scripter
Thanks for the link, scripter, but I actually already knew what the problem was. What I'm trying to figure out is WHICH column is the problematic one, not really WHY it is happening.
OscarRyz