I have a psychological tic which makes me reluctant to use large libraries (like GLib or Boost) in lower-level languages like C and C++. In my mind, I think:
Well, this library has thousands of man-hours put into it, and it's been created by people who know a lot more about the language than I ever will. Its authors and fans say that the library is fast and reliable, the functionality looks really useful, and it would certainly stop me from (badly) reinventing wheels.
But damn it, I'm never going to use every function in that library. It's too big and it's probably become bloated over the years; it's another ball and chain my program needs to drag around.
The Torvalds rant (controversial though it is) doesn't exactly put my mind at ease either.
Is there any basis to my thinking, or am I merely being unreasonable and/or ignorant? Even if I only use one or two features of a large library, does linking against it incur a runtime performance overhead?
I'm sure the answer also depends on the specific library, but I'm generally interested in whether large libraries, at a technical level, inherently introduce inefficiencies.
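To make the question concrete, here's a minimal sketch of the kind of program I have in mind: it uses exactly one GLib feature (a hash table) but links against the whole library. The build line assumes pkg-config can find glib-2.0 on your system.

```c
/* hello_glib.c -- uses a single GLib feature (GHashTable) but links
 * against all of GLib.
 *
 * Build (assuming pkg-config knows about glib-2.0):
 *   gcc hello_glib.c $(pkg-config --cflags --libs glib-2.0) -o hello_glib
 */
#include <stdio.h>
#include <glib.h>

int main(void)
{
    /* The only piece of GLib this program actually uses. */
    GHashTable *table = g_hash_table_new(g_str_hash, g_str_equal);

    g_hash_table_insert(table, "answer", GINT_TO_POINTER(42));
    printf("answer = %d\n",
           GPOINTER_TO_INT(g_hash_table_lookup(table, "answer")));

    g_hash_table_destroy(table);
    return 0;
}
```

What I want to know is whether a program like this pays any runtime cost for linking against the whole of GLib, beyond the hash table it actually uses.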
I'm tired of obsessing and muttering and worrying about this, when I don't have the technical knowledge to know if I'm right or not.
Please put me out of my misery!