Hey everyone,

I'm having an issue loading a ~50 MB file into a database. Before anyone asks why I'm doing this: I need to add this feature for legacy purposes. The column is set up as type Image, which unfortunately means I can't load the data in chunks and concatenate them (Sybase doesn't allow that for Image columns).

I'm currently building my parameter from the DbProviderFactory using DbType.Binary. I've tried other types and nothing helps.
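Roughly, the loading code looks like this. The table name, file path, and connection string are placeholders, and I'm assuming the provider is registered under the usual Sybase.Data.AseClient invariant name:

    using System.Data;
    using System.Data.Common;
    using System.IO;

    // Sketch of the update, not the exact production code.
    byte[] data = File.ReadAllBytes(@"C:\temp\bigfile.bin");   // placeholder path

    DbProviderFactory factory = DbProviderFactories.GetFactory("Sybase.Data.AseClient");
    using (DbConnection conn = factory.CreateConnection())
    {
        conn.ConnectionString = "...";   // placeholder
        conn.Open();

        using (DbCommand cmd = conn.CreateCommand())
        {
            // legacy_docs(doc_id int, doc image) is a stand-in for the real table
            cmd.CommandText = "UPDATE legacy_docs SET doc = @doc WHERE doc_id = @id";

            DbParameter doc = cmd.CreateParameter();
            doc.ParameterName = "@doc";
            doc.DbType = DbType.Binary;
            doc.Size = data.Length;      // sized to the whole blob
            doc.Value = data;
            cmd.Parameters.Add(doc);

            DbParameter id = cmd.CreateParameter();
            id.ParameterName = "@id";
            id.DbType = DbType.Int32;
            id.Value = 1;
            cmd.Parameters.Add(id);

            // The entire ~50 MB goes through in a single batch, which is
            // presumably what is blowing out the procedure cache.
            cmd.ExecuteNonQuery();
        }
    }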

I get the following error:

There is not enough procedure cache to run this procedure, trigger, or SQL batch. Retry later, or ask your SA to reconfigure ASE with more procedure cache.

The procedure cache on the server is set to 100 MB.

Other Info:

  1. It works with ~35 MB files.
  2. The old code (written in PowerBuilder) uses UPDATEBLOB, and it works there.

Any suggestions?

A: 

100 MB of procedure cache is pretty small, especially when you have a single procedure that you know will take over 50 MB. That is the total procedure cache for the whole database instance, not per session or per procedure.

I would suggest you make your procedure cache bigger.
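You can check and raise it with sp_configure. Assuming the default 2 KB pages, 524288 pages works out to 1 GB:

    exec sp_configure 'procedure cache size'          -- show the current setting
    exec sp_configure 'procedure cache size', 524288  -- roughly 1 GB in 2 KB pages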

Todd Pierce
Thanks for the response, Todd. Unfortunately, when I asked the DBA, he said 100 MB is a lot of memory for procedure cache (as much as some of our 500+ user sites). Running "exec sp_configure 'procedure cache size'" I get a Run Value of 50,000, in 2 KB memory pages, i.e. about 100 MB. His argument is that it all works fine from the original PowerBuilder tool using the UpdateBlob command, so he sees no reason to change anything.
Nick
I am a DBA and I run ours at about 1 GB.
Todd Pierce
Thanks, Todd. I will have to have a chat with my DBA. Wish me luck!
Nick