3rd Party Libraries and Version Control.


    • 3rd Party Libraries and Version Control.

      I was wondering if, in the professional field, there is a standard for which 3rd-party libraries you should commit and which ones you shouldn't (not thinking in terms of licensing).

      I see that the source base for the book has a pre-compiled zlib library committed, but there is a separate .zip file you need to download to get the rest of the 3rd-party sources and pre-compiled libraries. What logic tells you whether you should commit pre-compiled libraries or not?

      This is one of those problems that has been bugging me: committing pre-compiled libraries seems like a waste of space on the Subversion server, and yet it makes setting up the project extremely easy if the majority of the libraries are already compiled and configured correctly on the server.
    • In a professional setting, you almost always commit it all. Hard drive space is an insignificant part of your budget, especially when you consider how many tens or even hundreds of gigs are used up by Maya files, PSDs, etc. Code & libs tend to take up an incredibly small percentage of the overall repository.

      We chose to make a 3rd-party zip because we are limited in the amount of hard drive space & bandwidth we have. In my personal projects, I often have the assets live in a zip file and not checked into source control for the same reason.

      -Rez
    • Oh yeah - commit everything.
      Mr.Mike
      Author, Programmer, Brewer, Patriot
    • Eh, this came up as a question as I was looking through the source yesterday. Why did you put the zlib library in the Libs folder, but keep the rest of the precompiled libraries (boost, LuaPlus, etc.) in the 3rdParty/<project>/libs folder?

      Another question that came up in my head when reading the book was how to deploy multiplatform projects. In the book you say to have "win32", "mac", "linux", etc. folders which hold your binaries and shared libraries. Then we put all of our assets in the data folder, a level up from our platform folder.

      However, once I deploy, I want the data folder to end up in the same directory as my executable and shared libraries. How would I make this possible with the provided "bin" folder structure for multiplatform projects?

      Thanks.
    • Originally posted by Lordcorm
      Eh, this came up as a question as I was looking through the source yesterday. Why did you put the zlib library in the Libs folder, but keep the rest of the precompiled libraries (boost, LuaPlus, etc.) in the 3rdParty/<project>/libs folder?

      Hi,

      my guess is that there is no special reason.


      Another question that came up in my head when reading the book was how to deploy multiplatform projects. In the book you say to have "win32", "mac", "linux", etc. folders which hold your binaries and shared libraries. Then we put all of our assets in the data folder, a level up from our platform folder.

      However, once I deploy, I want the data folder to end up in the same directory as my executable and shared libraries. How would I make this possible with the provided "bin" folder structure for multiplatform projects?

      Thanks.

      Put a (symbolic) link into your bin folder so that all platform folders can access the data by using ../data (or ..\data).

      Note that symbolic links under Windows (or NTFS, to be precise) are called junctions, and Windows does not officially promote them. You can find a tool to use them here: codeproject.com/KB/winsdk/junctionpoints.aspx
      It's an implemented feature, just not documented :(
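      The link trick above can be sketched like this for the Unix/macOS side (the folder names `Game`, `bin/linux`, and `data` are illustrative stand-ins for the book's layout, not the actual repo paths):

      ```shell
      # Recreate a minimal version of the book's bin layout.
      mkdir -p Game/bin/linux Game/data
      echo "hello assets" > Game/data/test.txt

      # Give the platform folder a "data" link pointing back at the shared
      # assets; the target is relative to the link's own location.
      ln -s ../../data Game/bin/linux/data

      # An executable running from Game/bin/linux can now open ./data/...
      cat Game/bin/linux/data/test.txt   # prints "hello assets"

      # On Windows/NTFS, the rough equivalent is a junction, created from
      # inside Game\bin\win32 (needs cmd.exe, shown here only as a comment):
      #   mklink /J data ..\..\data
      ```

      With one link per platform folder, deployment is just copying the platform folder plus a real data directory next to the executable.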
    • I'm sorry for necroing this thread (kinda), but I'm having trouble implementing this.

      When it comes to HUGE libraries like Qt and Boost, what is the smartest thing to do? Having small 3rd-party libraries in the 3rd-party src directory, and having them compile the first time I build the project, is not that big of a deal. But Qt and Boost take HOURS (up to 5 hours combined) to compile, and that's on a dual 3.0GHz processor.

      Is there a fine line between what should be committed and what shouldn't? I'm choosing to stay away from precompiled libraries for the specific reason that my project needs to support multiple platforms and compilers.

      Should I commit precompiled libraries, but have people who want to compile for unsupported platforms and compilers check out the 3rd-party libraries from a separate repository?

      What is the overall thought process that goes into deciding whether something is necessary or worth being precompiled?

      I had another idea where I would commit the sources for the 3rd-party libraries into a separate repository called "<project name>-env". It would contain all the libraries needed to build the project, with default configuration, and a "bootstrap.sh". bootstrap.sh would build and install all of the 3rd-party libraries, but this idea is only valid up until someone changes the install paths (installing Boost to C:\Boost\Boost2 instead of C:\Boost). I would rather not have people editing the dependency paths in their makefiles, because chances are they will accidentally commit parts of them they shouldn't.
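      A minimal sketch of what that bootstrap.sh could look like, assuming autoconf-style `./configure && make` packages in a `3rdparty-src/` folder (Boost would actually need its own bjam invocation, so treat this as the shape of the idea, not a working Boost build):

      ```shell
      #!/bin/sh
      # Hypothetical bootstrap.sh for a "<project>-env" repo: build every
      # bundled 3rd-party library into ONE fixed prefix, so the dependency
      # paths baked into the project makefiles never vary between machines.
      set -e                               # abort on the first failed build

      PREFIX="$(pwd)/3rdparty-install"     # the one path the makefiles rely on
      mkdir -p "$PREFIX"

      # Each library lives in its own source folder under 3rdparty-src/.
      for dir in 3rdparty-src/*/; do
          if [ -x "$dir/configure" ]; then # autoconf-style packages only
              ( cd "$dir" &&
                ./configure --prefix="$PREFIX" &&
                make &&
                make install )
          fi
      done

      echo "3rd-party libraries installed under $PREFIX"
      ```

      Because the prefix is derived from the checkout location rather than typed by hand, nobody has a reason to edit install paths in the first place.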

      Lol, I hope that made sense.
      - Lordcorm
    • Every solution is different depending on your needs. In most professional environments I've worked in, we commit all the source code for everything, but we usually just link the game against the static libs for things that don't need to change (like zlib, boost, lua, etc). For these utilities, we usually create a separate sln file that will allow us to rebuild the complete source if we want, including all targets. That way we can easily rebuild whatever we need in case a new platform comes along that we need to support. The game sln file only compiles what has the potential to change and links everything else.
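      That split can be sketched end to end; here a stub library stands in for zlib, and all names and paths (`stublib`, `3rdParty/stublib`) are made up for illustration:

      ```shell
      # Step 1: the separate "rebuild everything" solution. Build the
      # 3rd-party source once and commit the resulting static lib + header.
      mkdir -p 3rdParty/stublib/lib 3rdParty/stublib/include
      cat > stublib.c <<'EOF'
      int stub_answer(void) { return 42; }
      EOF
      cc -c stublib.c -o stublib.o
      ar rcs 3rdParty/stublib/lib/libstublib.a stublib.o
      echo 'int stub_answer(void);' > 3rdParty/stublib/include/stublib.h

      # Step 2: the everyday game build. It only compiles game code and
      # links the prebuilt lib; nobody recompiles the 3rd-party source
      # unless a new platform or compiler forces a rebuild.
      cat > main.c <<'EOF'
      #include <stdio.h>
      #include "stublib.h"
      int main(void) { printf("%d\n", stub_answer()); return 0; }
      EOF
      cc main.c -I3rdParty/stublib/include \
         3rdParty/stublib/lib/libstublib.a -o game
      ./game   # prints 42
      ```

      The same idea maps onto Visual Studio: one sln that rebuilds all 3rd-party targets when needed, and the game sln that just links the committed .lib files.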

      Make sense?

      -Rez