If I want to run an application in Linux without using a package manager, is my only option to compile it from source?

I have 2 side projects I'd like to complete at work:

  1. Enable wiki users to authenticate against the corporate Active Directory (LDAP) server.
  2. Setup a code review tool for my dev team.

Here's the problem. The Linux server where the tools will reside has a version of PHP that doesn't have LDAP compiled in and the Apache Server is missing mod_proxy_html for some reason.

If I were running on Windows, I would download the relevant modules, drop them in the ext or modules directory, bounce the server then go on about my business. With Linux, however, it seems like my only option is to recompile PHP (for the LDAP libraries) or to compile the mod_proxy_html module and all of its dependencies myself.

Now, I know what you're probably thinking: "Why don't you just use a package manager to install the modules?" That's a fair question.

  1. The server can only access the public internet via a proxy with a list of whitelisted sites. (It's an intranet server, after all.)
  2. The Apache and PHP running on the server came from a LAMP package. They aren't controlled by yum, RPM, or any other three-letter acronym.

I was able to compile PHP from scratch, but not without a good bit of Yak Shaving. I had to download and compile 4 or 5 dependencies before I could get PHP compiled with the LDAP libraries, and even then, make test basically said, "Hey buddy. I know you went through a lot of trouble to compile this, but it's looking kinda janky. Good luck though!"
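For what it's worth, the build boiled down to something like this. (The prefixes and library paths below are illustrative, not the exact ones on my server.)

```shell
# Illustrative sketch of building PHP with LDAP support from source.
# The --prefix and library paths are placeholders; substitute your own.
./configure \
    --prefix=/opt/php \
    --with-apxs2=/opt/apache/bin/apxs \
    --with-ldap=/usr/local/openldap      # requires OpenLDAP headers/libs built first
make
make test     # may report failures even on a mostly-working build
make install
```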

So my question is, why is this necessary? Why can't I just download a precompiled (statically linked?) version of the libraries/modules I want, put them somewhere Apache and PHP can see them, restart Apache, and go on my merry way?


The reason you don't find the Windows model of software distribution nearly as prevalent is binary compatibility and library versioning. It is far easier to design a build process that compiles the software on any number of platforms (by autodetecting the necessary build tools) than it is to ship prebuilt binaries for X Unix derivatives times Y processor architectures.

What many companies do in this scenario is run a package distribution server on their own network that the restricted servers are allowed to access. That server is either allowed to refresh its own repository, or acts as a caching proxy, serving the local copy of a package once it has been downloaded. It also becomes the natural central distribution point if you end up building your own custom packages.
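As a rough sketch, on an RPM-based distribution an internal repository like that can be set up along these lines. (Host names and paths are made up for illustration; createrepo and a web server are assumed to be available.)

```shell
# --- On the mirror host: collect packages and generate repo metadata ---
mkdir -p /var/www/html/repo
cp /path/to/approved/packages/*.rpm /var/www/html/repo/
createrepo /var/www/html/repo

# --- On each restricted server: point yum at the internal mirror ---
cat > /etc/yum.repos.d/internal.repo <<'EOF'
[internal]
name=Internal mirror
baseurl=http://mirror.example.corp/repo
enabled=1
gpgcheck=0
EOF
```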

Unfortunately, this isn't going to be as helpful for you, because you installed a third-party "application stack" instead of the vendor-supplied packages. These stacks are usually designed to make life easier for enthusiasts who just want to get something up and running without investing much effort in making the components play nicely with each other, but at the enterprise level you have to carefully consider how they will fit into your infrastructure's software lifecycle.


Whatever Linux distribution you are using most likely provides sources for the packages it offers, and most likely it provides prebuilt packages for a LAMP stack too.

The way to go here is to get the package's source, modify it to your needs, and recompile it.

Do this on a client PC and upload the resulting package to the server, which removes the need to download anything from the internet on the server itself.
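On an RPM-based distribution, that workflow looks roughly like this. (Package and path names are illustrative; yumdownloader comes from the yum-utils package, and the build directory depends on your rpmbuild configuration.)

```shell
# --- On a networked client PC ---
yumdownloader --source php        # fetch the distribution's .src.rpm
rpm -ivh php-*.src.rpm            # unpack into the rpmbuild tree (often ~/rpmbuild)

# Edit the spec file to add the option you need, e.g. --with-ldap
# under the %configure/%build section, then rebuild:
rpmbuild -ba ~/rpmbuild/SPECS/php.spec

# --- Copy the resulting binary RPM to the server and install it ---
rpm -Uvh php-*.rpm
```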

If you choose to install third-party, vendor-provided binaries instead, you should still get their sources and modify those.

You don't find precompiled libs/modules for PHP on the web because there is little need for them: using (source) packages made for your distribution is usually more convenient.