Conversation

In university, during our C++ year, we had a teacher who insisted we always use statically linked libraries (we were on Windows). I remember asking him: "Why not use dynamic libraries instead?" and he said: "Because that way you can embed everything in the final .exe and you don't have to worry about shipping third-party libraries alongside it."
And I replied:
"But you are still distributing them, just bundled inside the .exe code. Wouldn't that also require you to include the licenses of those libraries? Also I believe LGPL libraries require your code to be independent from the libraries."

And he finally said: "Sure, I mean, you can play by the GPL's rules, or you can just find MIT-licensed libraries that don't bug you with legal details."

I haven't done any static linking since then, for three reasons:

1. It's a pain to set up, or at least it was on Windows.
2. I want to respect licenses.
3. I use Linux, bro; like, 90% of the libraries I use come pre-installed lmao.

PS: I've also never bothered figuring out how to statically link on Linux. I know you use .a files instead of .so files, but since almost everything on Linux ships as shared libraries anyway, I don't see the point.
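(For anyone else who hasn't bothered either, here's a minimal sketch of what it looks like; the `greet` library and all file names are made up for illustration:)

```cpp
// greet.cpp -- toy static library source (hypothetical example)
#include <string>

std::string greet() { return "hello from a static lib"; }
```

```cpp
// main.cpp -- uses the function packed into libgreet.a
#include <iostream>
#include <string>

std::string greet();  // resolved at link time from libgreet.a

int main() { std::cout << greet() << "\n"; }

// Build and link (run in the same directory):
//   g++ -c greet.cpp -o greet.o
//   ar rcs libgreet.a greet.o        # pack the .o into a static archive
//   g++ main.cpp -L. -lgreet -o app  # no libgreet.so around, so this links statically
// Afterwards `ldd ./app` still lists libstdc++ and friends, but no
// libgreet: its object code was copied straight into the executable.
```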

Any replies are welcome.

@meluzzy You're not wrong; it's just that static linking is preferred in a lot of deployment scenarios. On Windows specifically, DLL Hell is still a thing, although side-by-side assemblies now aim to solve it (in a pretty convoluted way, IMO). To some extent, handling bugs that arise from different library versions across Linux distros is even worse; IIRC that's one reason why Go (which runs on a bazillion Google servers) links statically by default. Also, for ad-hoc tasks like debugging, it's much better to drop in a single-file utility that just works than to mess with the system's configuration through a package manager or installer.

There are surely more pro and con arguments; the point is that we have different ways of linking because use cases differ, and both methods have their place.
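To make the "single-file util" point concrete, here's a minimal sketch on Linux (assuming static versions of libc/libstdc++ are installed, which not every distro ships by default):

```cpp
// probe.cpp -- trivial stand-in for an ad-hoc debugging tool
#include <iostream>

int main() {
    std::cout << "running with zero shared-library dependencies\n";
}

// Build it fully statically:
//   g++ -static probe.cpp -o probe
//   ldd ./probe   # prints "not a dynamic executable"
// The single resulting file can be copied to another machine and run
// as-is, with no package manager involved on either end.
```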