Help building and linking shared libs on Linux

Hey guys, I've been experimenting a bit with hotloading shared libs on Linux lately.

However, once I had many functions to link manually, I decided to try linking the whole lib at compile time, the same way we do with SDL for example.

From what I've seen looking around, there are a couple of options. One is copying the lib into /usr/lib, naming it something like libmy_lib.so, and linking it with -lmy_lib; that works. But I would like to keep the lib inside my project's folder, and having it in /usr/lib also means my build script needs sudo, which is a pain.
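Just to spell out what I mean by the /usr/lib route, it is roughly this (libmy_lib.so being a made-up name):

# copy the lib system-wide (needs sudo) and refresh the dynamic linker cache
sudo cp lib/libmy_lib.so /usr/lib/
sudo ldconfig
# now -lmy_lib resolves to /usr/lib/libmy_lib.so
g++ main.cpp -lmy_lib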

I struggled a bit to build it with the lib kept locally - it just wouldn't find it. After some more research I found out about the "soname". I don't quite understand it yet, but I got it to work with something like this:

echo "Compiling renderer..."
g++ opengl_renderer.cpp -fPIC -Wl,-soname,libgl_render.so -ggdb -shared -o lib/libgl_render.so

echo "Compiling engine..."
g++ main.cpp $warn -ldl -ggdb -lGL -lSDL2 -L ~/ProjectFolder/lib -lgl_render

The thing is, though, that to run this it seems I need to set LD_LIBRARY_PATH, which is unfortunate...
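For the record, with my folder layout from above that means launching it like this (a.out because I didn't pass -o):

LD_LIBRARY_PATH=~/ProjectFolder/lib ./a.out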

So my question is, for any folks who have messed with this in the past: is there a better option I'm missing? Or is it maybe best to actually put the lib in /usr/lib after all? Anyhow, thanks in advance!

Reduce dynamic dependencies if you want stability
I honestly just gave up on third-party dynamic linking on all platforms, because the operating systems kept changing the rules for how dynamic libraries are accessed. It could take one to five years before a game had to be rewritten from scratch or abandoned. The graphics engine I used, and even core parts of the operating system, eventually became deprecated as the decades passed. The more abstractions you use, the more potentially deprecated libraries they will call behind your back. OpenGL will soon be an abstraction on top of Vulkan drivers, because it is a pain to write efficient GPU drivers for an old API where changes to the context can happen at any time. Now I only call the bare minimum of the operating system directly from the executable, and it just works. My oldest programs, written with minimal dependencies for Windows 3.1 before libraries became a complex cobweb of dynamic dependencies, still run without issues on modern computers.

X11 instead of SDL
I would recommend learning the X11 protocol, because it comes pre-installed on most Linux systems and is not really harder to use than SDL. The little additional boilerplate for network communication is nothing compared to the mess I had to deal with in SDL, which tried to redefine main (causing linker problems), failed to install on some computers due to heavy dependencies (99% of which were things I didn't even use), and often crashed on Linux (by assuming that I had a CRT display from the 1990s).

Writing this little module was all I needed to get X11 working with software rendering in full-screen, and all dependency problems became a thing of the past. Now I understand which systems it would not work on and can clearly specify the minimum requirements (true-color and override-redirect).
https://github.com/Dawoodoz/DFPSR...urce/windowManagers/X11Window.cpp
Thanks for the thoughts and the actual X11 code! That's awesome. I had tried to mess around with X11 a bit, but ended up finding Handmade Penguin, which convinced me to stick with SDL for a while. That said, I always intended to move to X11 in the future; maybe I will be doing that sooner rather than later.

About dynamic libraries, I understand what you are saying about avoiding them. My question though is: should I not write my own libraries then? For example, if I want to write a library that handles rendering through OpenGL, which is what I used as an example in my previous post, are you saying this very idea is bad in itself? If I understand correctly, you are saying that operating systems change the way they handle dynamically linked libraries, so even if the library itself doesn't change, a program that uses it will break because of OS changes?

If that's the case, instead of building a dynamic library, would it be better to make header-only libraries like the stb libraries? What I'm thinking about here is how to separate out a chunk of code that does a specific job, so it can be "included" in several projects.

I'm quite new to programming so I apologize if I'm asking stupid questions, thanks!
There is nothing wrong with using shared libraries on Linux. How they work is well defined, and they behave predictably.

To link with a shared library you don't need to copy it into /usr/lib. That is just the default folder where the linker looks for libraries - just like MSVC on Windows looks in the Windows SDK folder (somewhere in Program Files).

As you already wrote, you can always override that on the command line with the -L argument:
gcc main.cpp -Lpath/to/folder -lsharedlib


Or even simpler, just pass the path to the shared lib directly:
gcc main.cpp path/to/folder/sharedlib.so


There's no need to worry about -soname unless you want to build a distribution package and ship it with a stable ABI.
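If you want to see what soname (if any) a library carries, readelf from binutils shows it, something like:

readelf -d libgl_render.so | grep SONAME   # prints the embedded soname, if any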

To make the executable find your library next to itself, you should use $ORIGIN in the rpath linker argument - this makes the dynamic linker look for the .so file next to the executable. By default it only looks in standard locations like /usr/lib; this is a difference from Windows.

So basically do this, and it should work:
g++ opengl_renderer.cpp -fPIC -ggdb -shared -o libgl_render.so
g++ main.cpp -Wl,-rpath,\$ORIGIN -ldl -ggdb -lGL -lSDL2 libgl_render.so -o main.exe


Make sure main.exe and libgl_render.so are next to each other. You can also place libgl_render.so in a different folder; then append that folder to $ORIGIN (for example -Wl,-rpath,\$ORIGIN/lib).
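To double-check that the rpath got embedded and that the library resolves, something like this should do it:

readelf -d main.exe | grep -E 'RPATH|RUNPATH'   # should print $ORIGIN
ldd main.exe                                    # libgl_render.so should resolve to your local copy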

I see! Thank you, that is indeed much better!