I have a table that will potentially see a high number of inserts per second, and I'm trying to choose the type of its primary key. For illustrative purposes, let's say it's a users table. I'm trying to choose between GUID and BIGINT as the primary key, and ultimately as the UserID across the app. If I use a GUID, I save a trip to the database to generate a new ID, but a GUID is not "user-friendly" and it's not possible to partition the table by this ID (which I'm planning to do).

Using BIGINT is much more convenient, but generating it is a problem. I can't use IDENTITY (there is a reason for that), so my only choice is to have a helper table that contains the last used ID, and then call this stored proc:
create proc GetNewID @ID BIGINT OUTPUT
as
begin
    set nocount on;
    -- bump the counter and capture the new value in one atomic update
    update HelperIDTable set @ID = id = id + 1
end
to get the new ID. But this helper table is an obvious bottleneck, and I'm concerned about how many updates per second it can handle.
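One mitigation I've been considering is reserving IDs in blocks rather than one at a time, so each caller only touches the helper table once per N inserts. A rough sketch of what I have in mind (the @BlockSize parameter and the single-row layout of HelperIDTable are just my assumptions):

create proc GetNewIDBlock @BlockSize int, @FirstID bigint output, @LastID bigint output
as
begin
    set nocount on;
    -- advance the counter by a whole block in one atomic update and capture the new value;
    -- the caller then owns the range @FirstID .. @LastID
    update HelperIDTable set @LastID = id = id + @BlockSize
    set @FirstID = @LastID - @BlockSize + 1
end

The app would cache the reserved range in memory and hand IDs out from there, so the helper table only sees one update per @BlockSize inserts instead of one per row. Would that be a sane approach?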
I really like the idea of using BIGINT as the PK, but the bottleneck concerns me. Is there a way to roughly estimate how many IDs this approach could produce per second? I realize it depends heavily on hardware, but are there any physical limitations, and what order of magnitude are we looking at? Hundreds per second? Thousands?
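To get a ballpark on my own hardware, I suppose I could just hammer the proc from a single connection and time it; a crude sketch (it ignores lock contention from concurrent callers, so the real number under load would be lower):

-- call the proc 100,000 times from one connection and see how long it takes
declare @ID bigint
declare @i int
declare @start datetime

set @i = 0
set @start = getdate()

while @i < 100000
begin
    exec GetNewID @ID output
    set @i = @i + 1
end

select datediff(ms, @start, getdate()) as ElapsedMs,
       100000 * 1000.0 / datediff(ms, @start, getdate()) as CallsPerSecond

But that still wouldn't tell me how it behaves with many concurrent writers, which is what really worries me.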
Any ideas on how to approach the problem are highly appreciated! This problem hasn't let me sleep for many nights now!
Thanks! Andrey