It is usually better to pre-allocate a fixed starting size and, when you run out of space, reallocate based on a growth factor.
Depending on your needs, the starting size and growth factor can be determined from typical usage of the data you are dealing with, or the API for your data type can let the caller specify both as part of an initialization/creation call.
The starting size should be a number based on typical usage. Typical usage is the important factor because it helps you pick a size so that you: A) don't waste space by choosing a starting size that is too large, and B) don't choose a starting size so small that many reallocations are needed before the typical size is reached.
Typical size, of course, is a magic number. One way to determine it is to run tests against various data sets and collect stats on starting size, number of reallocations, and min/max memory usage for your data. You can average the results to get a usable typical starting size.
As for the growth factor, x1.5 or x2 is common. This, too, is something you can gauge from test stats, as with the starting size.
Another thing to consider is that you'll need to be careful about managing references into dynamically resizable data, because realloc() may relocate the data in memory. If you stored the address of the first element of the array, that address may be invalid after a call to realloc(). One way to manage this is an API wrapper around your data type that hands out indexes instead of memory addresses and resolves an index to the element's current address when needed.