+1  Q: 

CAST and IsNumeric

Why would the following query return "Error converting data type varchar to bigint"? Doesn't IsNumeric make the CAST safe? I've tried every numeric datatype in the cast and get the same "Error converting..." error. I don't believe the size of the resulting number is a problem because overflow is a different error.

The interesting thing is, in Management Studio, the results actually show up in the results pane for a split second before the error comes back.

SELECT CAST(myVarcharColumn AS bigint)
FROM myTable
WHERE IsNumeric(myVarcharColumn) = 1 AND myVarcharColumn IS NOT NULL
GROUP BY myVarcharColumn

Any thoughts?

+1  A: 

Try this and see if you still get an error...

SELECT CAST(CASE WHEN IsNumeric(myVarcharColumn) = 0 THEN 0
            ELSE myVarcharColumn END AS bigint)
FROM myTable
WHERE IsNumeric(myVarcharColumn) = 1 AND myVarcharColumn IS NOT NULL
GROUP BY myVarcharColumn
Kevin Fairchild
A: 

Try wrapping it in a case:

select CASE WHEN IsNumeric(mycolumn) = 1 THEN CAST(mycolumn as bigint) END
FROM stack_table
WHERE IsNumeric(mycolumn) = 1
GROUP BY mycolumn
Dalin Seivewright
Any particularly decent reason why my answer was downvoted? (Besides the where clause that I edited in)
Dalin Seivewright
A: 

I came across this blog post that might help. I've not run into this issue before, so I'm not sure if it'll help you in this instance:

http://dotmad.blogspot.com/2007/02/cannot-call-methods-on-bigint-error.html

BenAlabaster
+5  A: 

IsNumeric returns 1 if the varchar value can be converted to ANY number type. This includes int, bigint, decimal, numeric, real & float.

Scientific notation could be causing you a problem. For example:

Declare @Temp Table(Data VarChar(20))

Insert Into @Temp Values(NULL)
Insert Into @Temp Values('1')
Insert Into @Temp Values('1e4')
Insert Into @Temp Values('Not a number')

Select Cast(Data as bigint)
From   @Temp
Where  IsNumeric(Data) = 1 And Data Is Not NULL

There is a trick you can use with IsNumeric so that it returns 0 for numbers in scientific notation: append 'e0' to the value before testing it. Appending '.0e0' applies a similar trick that also rejects decimal values.

IsNumeric(YourColumn + 'e0')

IsNumeric(YourColumn + '.0e0')
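
For example, something like this should show the difference (the sample values and the table variable name here are arbitrary):

Declare @Samples Table(Data VarChar(20))

Insert Into @Samples Values('123')
Insert Into @Samples Values('1e4')
Insert Into @Samples Values('1.5')

Select Data,
       IsNumeric(Data)          As PlainCheck,      -- 1 for all three values
       IsNumeric(Data + 'e0')   As NoScientific,    -- 0 for '1e4'
       IsNumeric(Data + '.0e0') As WholeNumbersOnly -- 0 for '1e4' and '1.5'
From   @Samples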

Try it out.

SELECT CAST(myVarcharColumn AS bigint)
FROM myTable
WHERE IsNumeric(myVarcharColumn + '.0e0') = 1 AND myVarcharColumn IS NOT NULL
GROUP BY myVarcharColumn
G Mastros
This worked. It now returns 1 less record than without casting. So there must have been one record with scientific notation. Thanks!
MarkB
Btw, excellent example for re-creating the error.
MarkB
A: 

According to BOL, ISNUMERIC returns 1 when the input expression evaluates to a valid numeric data type; otherwise it returns 0.

Valid numeric data types include the following:

  • int
  • numeric
  • bigint
  • money
  • smallint
  • smallmoney
  • tinyint
  • float
  • decimal
  • real

So, as others pointed out, you will have some data that passes the ISNUMERIC test but fails on casting to bigint.
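
For example, each of these passes the ISNUMERIC test, yet none of them can be cast directly to bigint (illustrative values only):

SELECT ISNUMERIC('$100')   -- 1 (money)
SELECT ISNUMERIC('1e4')    -- 1 (float, scientific notation)
SELECT ISNUMERIC('1.5')    -- 1 (decimal)

SELECT CAST('$100' AS bigint)  -- fails: Error converting data type varchar to bigint
SELECT CAST('1e4' AS bigint)   -- fails
SELECT CAST('1.5' AS bigint)   -- fails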

kristof
+1  A: 

The best solution would be to stop storing integers in a varchar column. Clearly there is a data issue: some of the data is interpretable as numeric but cannot be cast as such. You need to find the record(s) causing the problem and fix them, if the data can and should be fixed. Depending on what you are storing and why it is a varchar to begin with, you may need to fix the query instead of the data. But even that will be easier if you first find the records that are blowing up your current query.

How to do that is the issue. It is relatively easy to use CHARINDEX to search for a decimal point in the data and see if you have decimals (other than .0, which would convert). You could also look for any record containing e or $ or any other character that could be interpreted as numeric according to the sources already given. If you don't have a lot of records, a quick visual scan of the data will probably find it, especially if you sort on that field first.
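
Something along these lines could flag the suspect rows (reusing the column and table names from the question; adjust for your schema):

SELECT myVarcharColumn
FROM myTable
WHERE IsNumeric(myVarcharColumn) = 1
  AND (CHARINDEX('.', myVarcharColumn) > 0   -- decimal points
    OR CHARINDEX('e', myVarcharColumn) > 0   -- scientific notation
    OR CHARINDEX('$', myVarcharColumn) > 0)  -- currency symbols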

Sometimes when I've been stuck finding the bad data that is blowing up a query, I've put the data into a temp table and then tried processing it in batches (a process of elimination) until I find the rows it blows up on. Start with the first 1000 (don't forget to use ORDER BY, or you won't get the same batches back when you delete the good records; 1000 is only a best guess, and if you have millions of records, start with a larger number). If the batch passes, delete those 1000 records and select the next batch. Once it fails, select a smaller batch. Once you are down to a number that can easily be visually scanned, you will find the problem. I've been able to find problem records fairly quickly this way when I have millions of records and a weird error that none of the queries I've tried (which are basically guesses as to what might be wrong) have found.
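
A rough sketch of that approach, again borrowing the names from the question (the batch size is a guess; tune it to your row count):

-- Copy the candidate rows into a temp table so batches can be deleted as they pass
SELECT myVarcharColumn
INTO #work
FROM myTable
WHERE IsNumeric(myVarcharColumn) = 1 AND myVarcharColumn IS NOT NULL

-- Step 1: try casting the first batch (ORDER BY keeps the batches stable)
SELECT CAST(myVarcharColumn AS bigint)
FROM (SELECT TOP 1000 myVarcharColumn FROM #work ORDER BY myVarcharColumn) AS batch

-- Step 2: if the batch casts cleanly, delete it and repeat step 1;
-- if it fails, shrink the TOP value until the bad rows are few enough to eyeball
DELETE FROM #work
WHERE myVarcharColumn IN (SELECT TOP 1000 myVarcharColumn FROM #work ORDER BY myVarcharColumn)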

HLGEM