Does using large libraries inherently make code slower?

Solution 1:

Even if I only use one or two features of a large library, by linking to that library am I going to incur runtime performance overheads?

In general, no.

If the library in question isn't built as position-independent code, the dynamic linker has to perform relocations on it when it is loaded, which adds a one-time start-up cost. That usually happens as part of the program's start-up, and there is no run-time performance effect beyond it.
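
To make that distinction concrete, here is a minimal POSIX sketch (assuming g++ on a glibc system; the library name libm.so.6 and the use of dlopen are illustrative, not something the answer prescribes) that separates the one-time load/relocation cost from the per-call cost:

```cpp
// Build: g++ load_cost.cpp -ldl
// Loads a shared library explicitly so the one-time relocation/resolution
// cost (inside dlopen) can be timed separately from an ordinary call.
#include <chrono>
#include <cstdio>
#include <dlfcn.h>

int main() {
    using clock = std::chrono::steady_clock;

    auto t0 = clock::now();
    // RTLD_NOW forces all symbol resolution up front; this is the start-up cost.
    void *lib = dlopen("libm.so.6", RTLD_NOW);   // glibc's math library (assumed name)
    auto t1 = clock::now();
    if (!lib) { std::puts(dlerror()); return 1; }

    using cos_fn = double (*)(double);
    auto my_cos = reinterpret_cast<cos_fn>(dlsym(lib, "cos"));
    if (!my_cos) { std::puts("cos not found"); return 1; }

    auto t2 = clock::now();
    double r = my_cos(0.5);   // after loading, this is just an indirect call
    auto t3 = clock::now();

    std::printf("load: %lld us, call: %lld ns, cos(0.5) = %f\n",
        (long long)std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count(),
        (long long)std::chrono::duration_cast<std::chrono::nanoseconds>(t3 - t2).count(),
        r);
    dlclose(lib);
    return 0;
}
```

The same pattern holds for libraries linked implicitly: the loader pays the relocation cost once, before main() runs, and each subsequent call costs no more than any other call.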

Linkers are also good at removing "dead code" from statically-linked libraries at build time: a static library is just an archive of object files, and the linker only pulls in the objects that resolve symbols you actually reference. So any static libraries you use add minimal size overhead, and run-time performance doesn't enter into it at all.
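
As a rough illustration (the file and function names here are made up, but -ffunction-sections and --gc-sections are real GCC/Clang and GNU ld options), you can watch the linker discard an unreferenced function from a static archive:

```cpp
// big.cpp: stand-in for a large static library (hypothetical)
int used()   { return 1; }
int unused() { return 2; }   // never referenced by the program

// main.cpp:
int used();
int main() { return used(); }

// Build steps (GCC/Clang with GNU ld):
//   g++ -c -ffunction-sections big.cpp   # put each function in its own section
//   ar rcs libbig.a big.o                # package it as a static library
//   g++ main.cpp -L. -lbig -Wl,--gc-sections -o app
// Running `nm app` shows (the mangled) used() but not unused(): the linker
// garbage-collected the section containing the function nobody called.
```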

Frankly, you're worrying about the wrong things.

Solution 2:

I can't comment on GLib, but keep in mind that a lot of the code in Boost is header-only, and given the C++ principle that you only pay for what you use, the libraries are quite efficient. Several Boost libraries do require you to link against them (Regex and Filesystem come to mind), but those are separate libraries: with Boost you don't link against one large monolithic library, only against the smaller components you actually use.
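
As a quick illustration (assuming Boost's headers are installed; lexical_cast is one of the header-only pieces), the following compiles with no -lboost_* flag at all, whereas a program using Boost.Regex would additionally need to link -lboost_regex:

```cpp
// Build: g++ demo.cpp   (no Boost library to link against)
#include <boost/lexical_cast.hpp>
#include <iostream>
#include <string>

int main() {
    // Header-only: the entire implementation is compiled into this translation unit.
    int n = boost::lexical_cast<int>("42");
    std::string s = boost::lexical_cast<std::string>(3.14);
    std::cout << n << " " << s << "\n";
    return 0;
}
```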

Of course, the other question is: what is the alternative? Do you want to implement the functionality that is in Boost yourself whenever you need it? Given that a lot of very competent people have worked on this code and made sure it works across a multitude of compilers while remaining efficient, that is not exactly a simple undertaking. Plus you'd be reinventing the wheel, at least to some extent. IMHO you can spend that time more productively.