This section describes typical developer workflows for developing a C++ or web software module which is involved in dependency relationships, and how Artifactory, Gradle, and the Holy Gradle plugins have been used in these workflows in practice.
The main intended audience consists of developers and technical leads in your team or organisation, and in the teams which supply your dependencies or build on your software. This document assumes you have read the Overview.
Concepts and Terminology
The following are some important terms from the Glossary which you will need to understand to read this document.
Configurations
Extending Configurations
Some information about configurations is explained in the Gradle Module Concepts section of the Overview. Additionally, a configuration can be defined as extending from another, which means two things. First, the set of artifacts for a configuration includes those of any configurations it extends from. Second, a module which declares a dependency on a configuration will also, in terms of transitive dependencies, pick up the dependencies of configurations which that one extends from.
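If you were declaring such a relationship by hand (rather than having a Pre-defined Configuration Set Type create it for you), a minimal sketch in Gradle syntax might look like the following; the configuration names match the example below.

configurations {
    import_common
    import_Win32_Release.extendsFrom import_common
    import_x64_Release.extendsFrom import_common
}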
For example, a module a using the Pre-defined Configuration Set Types would divide its dependencies for compile-time use into import_common, import_Win32_Release, and import_x64_Release, with import_Win32_Release and import_x64_Release extending from import_common.

That means that any other module which declares a dependency on a's import_x64_Release will pick up

- the artifacts from import_x64_Release in a;
- the artifacts from import_common in a;
- any transitive dependencies from import_x64_Release in a; and
- any transitive dependencies from import_common in a.
The following diagram highlights these in blue.
Private configurations
A private configuration is one which is used when building a module, but is not intended for other modules to use. For example, dependencies on build tools might be mapped from a private configuration, or dependencies on parts of a module (such as C++ headers) which are to be encapsulated by the depending module.
Private configurations are declared in Gradle by setting the property visible = false on a configuration, and appear in the ivy.xml file marked with the attribute visibility="private".
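As a minimal hand-written sketch (the configuration name build is just an example), such a declaration might look like this.

configurations {
    build {
        visible = false // private: used while building this module, not intended for consumers
    }
}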
These configurations can still be used by other modules, but they are excluded from some default
usage.
The following diagram highlights in blue the configurations which will be fetched when all
configurations of the other
module are fetched. Private configurations are shown with dashed
lines.
Workflows
Initial Set-up
This section describes how to set up a project so that you can fetch its packed dependencies, build it, and publish it to an Artifactory repository. You don’t have to do all these steps immediately when you start with a project but understanding them will be helpful.
Fetch Source
First you must create or check out a working copy of the source for the module you want to publish.
The Gradle name
part of the module is fixed to be the name of the Gradle project, which itself
defaults to the name of the folder containing the build.gradle
file. So, if you intend to really
publish the module (as opposed to just test publishing), you must put the source in an
appropriately-named folder, or override the module name as described under publishPackages.
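For example, if you want to publish under a specific name regardless of the folder name, a settings.gradle file next to build.gradle can contain a single line like the following (the name here is just a placeholder).

rootProject.name = "MyModule"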
If you are creating a new project, get a Gradle wrapper and commit it to source control, as described in Setting up a Gradle Wrapper.
Determine Outputs (Downstream Dependencies)
Although it might seem counter-intuitive, the first thing to start with is determining the outputs of your module, in terms of how down-stream modules will want to consume them. This determines the set of configurations your module will have, and you need to know that to decide what relationships to publish between your module’s configurations, and the configurations it uses from its dependencies.
You can skip some of this section if you are building a module which is not used by other modules (and maybe will not be published to Artifactory) — for example, if you are building an application. In that case you can normally

- use a single private configuration (for example, build) for all your dependencies (see the sketch after this list);
- not have a packageArtifacts block; and
- skip ahead to Determine Inputs (Upstream Dependencies).
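A minimal sketch of that single-configuration pattern follows; the tool's coordinates and its runtime configuration name are made-up placeholders.

configurations {
    build { visible = false } // one private configuration for all of this application's dependencies
}

packedDependencies {
    "../SomeTool" {
        dependency "com.example-corp:SomeTool:1.0.0"
        configuration "build->runtime" // map our private configuration to the dependency's configuration
    }
}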
Determine Configurations
In the world of Java and related languages, the set of artifacts is much simpler (namely, a small number of JAR files), so Gradle provides a set of pre-defined configurations: compile, test, runtime, etc.
Use of Gradle with C++ was limited when the holygradle
plugins were created, so no single
standard set of configurations exists. However, from version 7.7.0
there are several Pre-defined Configuration Set Types which will
set up patterns of configurations which have proven useful in practice.
configurationSets {
main { type configurationSetTypes.DLL }
}
Note that configurations should be declared near the top of the build.gradle
file, after any
plugin declarations, because they will be referenced in several other places.
Determine Artifacts
Deciding the set of configurations appropriate for your module is likely to be done at the same time as deciding which files should be included in the ".zip" file for each configuration. In Gradle, the files for each configuration are specified using a number of Ant-style patterns. These are case-sensitive and similar to Unix-style "glob" patterns, with the addition that "**" means "any number of directory levels".

A Gradle file containing the above configurations might also have a block like this, by convention near the bottom. This code uses a loop over a list, plus interpolation of expressions into strings (using ${…}) to avoid repetition. This takes advantage of the fact that configurations, like all Groovy properties, can be referenced as literals or as strings.
packageArtifacts {
import_common {
include "src/**/*.h"
include "src/**/*.idl"
}
doc {
from "doc/output" // Take files from here ...
to "doc" // ... and put them here in the ZIP.
include "*.chm"
}
build {
include "copy_outputs_to.bat"
}
configurationSets.main.typeAsDefault.axes["Configuration"].each { conf ->
"import_x64_${conf}" {
include "output/my_module_${conf}64.lib"
include "output/MIDL/**/${conf}/**/*"
}
"runtime_x64_${conf}" {
include "output/my_module_${conf}64.dll"
}
"debugging_x64_${conf}" {
include "output/my_module_${conf}64.pdb"
}
}
}
Determine Inputs (Upstream Dependencies)
As well as describing the module your Gradle project will produce, you need to describe which other modules it needs for building, testing, running, debugging, and so on.
Declare Dependency Repositories
Before declaring dependencies on pre-packed modules (or at least, before attempting to fetch them), you must specify where to find them. You can specify one or more repositories, though you may find that an Artifactory-backed repository is in fact a virtual repository which combines several other repositories. For example, your Artifactory instance might have a libs-release virtual repo which includes both libs-release-local (official releases of in-house modules) and externals-release-local (in-house packagings of third-party modules).
Following is an example repository declaration.
repositories.ivy {
credentials {
username my.username("Artifactory")
password my.password("Artifactory")
}
url project.holyGradleRepositoryBase + teamDependenciesRepo
}
Here the "Artifactory" string identifies a set of credentials cached in the Windows Credential Manager using the my-credentials-plugin. This avoids the need to store passwords in build files. For more information, see the page for that plugin.
The property project.holyGradleRepositoryBase
is added by the Holy Gradle and gives the
base URL of the Artifactory service from which the plugins were fetched. Using this in your
build file means it can be used at any site which has an Artifactory server with the same
repositories defined. The identifier teamDependenciesRepo
is resolved to a Gradle "project
property", which is defined in a gradle.properties
file in the same folder.
teamDependenciesRepo=team-integration
Determine Module Dependencies
Your build.gradle
file must declare dependencies on modules (other libraries or tools) which your
project needs to be built and tested, or which users of your module (including you) will need to
compile against it, run it, and/or debug it.
For in-house dependencies produced by your own company, you should generally ask the relevant team
to find out whether it is published in a local, private Artifactory instance. If not, you will have
to publish it yourself, as if you were Packaging Third-Party Dependencies. For libraries and
tools from external sources, you will have to create a build.gradle
file which will package up an
appropriate distribution of that tool and publish it to Artifactory.
For each such module, you add a "packed dependency", specifying

- the relative location at which a link to the unpacked module should be created;
- the group, name, and version of the module; and
- a mapping from configurations of your module, to configurations of the dependency module.

The first two parts are simple, as shown in the following example.
packedDependencies {
// Other dependencies omitted ...
"../ImageLib" {
dependency "com.example-corp.example-team:ImageLib:13.1.3"
// Configuration information goes here ...
}
// ...
}
The dependency mapping will normally be conceptually straightforward, though it may be complicated if the configurations published by your dependencies have different names, or fail to match yours exactly. If you can use one of the Pre-defined Configuration Set Types it will make your build script much simpler. See configurationSetTypes and configurationSets for more information.
Tip
| Declaring the dependency mapping correctly is important: if you get it wrong, you may find that your project builds and runs, but down-stream modules which try to use yours fail. For this reason, we recommend setting up an auto-build job for a module which attempts to use various configurations of your own module, including building against LIBs/headers, and running with DLLs/EXEs. |
Following is an example of some of the dependencies based on a real-world build.gradle
file.
- Boost is a build-time dependency, and this module exposes some Boost types in its interfaces, so the compile-time imports from Boost need to be re-exported.
- HTML Help Workshop is only used at (documentation) build time.
- ImageLib is an in-house, static library and this project doesn’t expose any ImageLib types. However, ImageLib has a runtime dependency (on the Intel C++ compiler runtime), so this project has to declare that its runtime configurations depend on that. In this case, ImageLib doesn’t separate its runtime dependencies into Debug and Release, which may either be an oversight, or because there is only one version of the Intel runtime DLLs. ImageLib also does not use the pre-defined configuration names, so more complex mapping is needed.
packedDependencies {
"../Boost" {
dependency "org.boost:Boost:1.49.0_1"
configurationSet configurations.build, configurationSetTypes.DLL_64, export: true
}
// ...
"../HTML Help Workshop" {
dependency "com.microsoft:HTML_Help_Workshop:4.74.8702.0"
configuration "build->full"
}
// ...
"../ImageLib" {
dependency "com.example-corp.example-team:ImageLib:13.1.3"
configurationSets.main.typeAsDefault.axes["Configuration"].each { conf ->
configuration "build->compileVc10X64${conf}"
configuration "runtime_x64_${conf}->runtimeVc10X64${conf}"
configuration "runtime_x64_${conf}->externals"
configuration "debugging_x64_${conf}->pdbVc10X64${conf}"
}
}
// ...
}
Determine Installed Prerequisites
In our experience so far, most executable tools required for building can be packaged in stand-alone
folders and uploaded to Artifactory, rather than having to be installed on each developer’s machine.
This makes it easier to build the project for the first time on both developer and autobuild
machines. However, some things need to be installed, and for these you may wish to create a
"prerequisite". This is a mechanism from the intrepid
plugin, which lets you specify arbitrary
Groovy code to execute, to check that the tool is installed, and provide install instructions if
not. So far this has been used for Visual Studio, other Microsoft SDKs, and to check the OS
version.
Here is an example prerequisite.
prerequisites {
// Don't check "DirectX" immediately, only if there are "buildSomething" tasks (see
// "tasks.matching {...}" below).
specify("DirectX", { checker ->
def help = "You need the October 2006 DirectX SDK."
def dxdir = checker.readEnvironment("DXSDK_DIR")
if (dxdir == null) {
checker.fail "Please ensure the DXSDK_DIR environment variable is set correctly. " +
help
} else {
def dxErrFilePath = dxdir + /Utilities\Bin\x64\DXErr.exe/
def expectedDxVersion = "9.15.779.0000"
def dxVersion = checker.readFileVersion(dxErrFilePath)
if (dxVersion == null) {
checker.fail "Failed to read the file version for '${dxErrFilePath}'. " + help
} else if (dxVersion != expectedDxVersion) {
checker.fail "The file version of '${dxErrFilePath}' was '${dxVersion}' " +
"but expected '${expectedDxVersion}'. " + help
}
}
})
}
// Any 'build' tasks for this project depend on DirectX
afterEvaluate {
allprojects { proj ->
proj.tasks.matching { it.name.startsWith("build") }.each {
it.dependsOn prerequisites.getTask("DirectX")
}
}
}
Fetch Dependencies
At this point, you should be able to have Gradle fetch, unpack, and link all your packed (pre-built) dependencies. To do this, run the following at the Windows Command Prompt.
gw fetchAllDependencies
or
gw fAD
Note
| If this is the first time you have fetched dependencies from a particular Artifactory
server, or you have deleted your password from the Windows Credential Manager, add the --no-daemon
option to the command line, immediately after gw . For more information, see the documentation for
the my-credentials-plugin. |
If your project’s direct dependencies have further dependencies of their own, they will be fetched,
unpacked, and linked as well. The location of the links for these indirect dependencies is
determined by the modules which depend on them, using a relativePath
attribute in their ivy.xml
file, added by intrepid
.
Sharing Dependencies with Disconnected Sites
If you are developing a project together with a team at another site, and that site does not have an Artifactory server, you can still share dependencies with them. This means both sites can work on the same source code, and be sure they are using the same versions of dependencies.
The Holy Gradle can automatically download all dependencies, plus a copy of the Holy Gradle,
into a local_artifacts
folder or ZIP file. This can be shared by FTP, for example. A new
version of the ZIP file is only needed when dependencies change — see Changing Dependencies.
When a developer at a disconnected site gets a copy of these files, they should put the unzipped local_artifacts folder in the same place as the project’s build.gradle. (The folder can also be in any folder above the project root, to share it between multiple projects.) The developer can then run gw fAD and all dependencies will be unpacked to the Gradle cache, with links added at the correct locations in the project folder. From version 7.5.0 of the Holy Gradle, the local_artifacts folder contains a build_info folder with a note of which source code repository and revision was used to generate it. This means that the team which receives it can check that they have the correct version of the ZIP file.
To make a local_artifacts
ZIP, just run
gw zipDependencies
Or, to put dependencies in a folder but not ZIP them, run
gw collectDependencies
If you want to check the result, you can follow these steps.
- Check out a separate working copy of the source code.
- Unzip the local_artifacts ZIP (or copy the local_artifacts folder) into that new working copy.
- Run gw -o -g tmp_guh fAD.
  - The -o argument means that Gradle will not use the network (work offline). This tests that all dependencies are available in local_artifacts, without fetching from your Artifactory server.
  - The -g argument means Gradle will use the folder ./tmp_guh instead of your usual Gradle user home for unpacking. This tests that all dependencies are available in local_artifacts, without using files already downloaded to the Gradle cache on your PC.
Check Build
At this point you should be able to build your Visual Studio solution, provided the Include and Library Paths it uses match the directory links for the dependencies. You may find Visual Studio’s Property Sheets useful for this, as they allow common settings to be included in multiple projects.
Note that Gradle creates a directory called build
for its own use, and the Holy Gradle
creates one called packages
as a temporary storage area when creating packages. If you
want to override the build
folder name, put the following at the top level of your build.gradle
file.
project.buildDir = "someOtherBuildDir"
It is not currently possible to override the packages
folder location.
You don’t have to use Gradle to build your solution, but the devenv
plugin can help with this,
though it supports only
- Visual Studio 10;
- two Visual Studio Configuration names, "Release" and "Debug" (though any number of Platform values); and
- one solution per Gradle project.
The following snippet will create tasks buildRelease
and buildDebug
, which both build the
default platform, x64
.
DevEnv.solutionFile "example-library.sln"
For multiple platforms, use the following syntax.
DevEnv {
solutionFile "example-library.sln"
platform "x64", "Win32"
}
Check Packaging
Next, check the packaging of your build output. The intrepid
plugin arranges for Gradle to
package one artifact (ZIP file) per entry in the packageArtifacts
block (discussed above under
Determine Artifacts), plus an extra artifact called buildScript
which just contains the
build.gradle
file.
To check the packaging, run the following command.
gw packageEverything
or
gw packEv
You should find a number of ZIP files (and mostly-empty directories) in the packages
sub-directory
of your project’s folder. If the contents don’t look as you expect, you’ll need to modify either or
both of your solution’s output settings, and the packageArtifacts
blocks.
Tip
| If your build.gradle file depends on any other files, you should probably add them to the
buildScript package by adding a block for it within packageArtifacts . |
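For example, if your build.gradle reads a helper script, a block like the following (the file name is hypothetical) would add it to the buildScript artifact alongside build.gradle.

packageArtifacts {
    buildScript {
        include "version-helpers.gradle" // hypothetical extra file used by build.gradle
    }
}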
Each ZIP file also contains a build_info
sub-directory holding information about
- the location and version of the source of the module (assuming the project is under source control);
- the auto-build job which produced it; and
- miscellaneous other environment and software version info.
It’s also a good idea to check that the ivy.xml
file looks correct. You can generate just that
file by running this command.
gw generateIvyModuleDescriptor
or
gw gIMD
You can find the ivy.xml
file in the build/publications/ivy
sub-directory of your project’s
folder.
Check Publishing
Finally, you are ready to publish your packaged build output. First, you must add information to
the build.gradle
file to tell Gradle where to publish to, including appropriate credentials.
group = "com.example-corp.example-team"
version = System.getenv("NEXT_VERSION_NUMBER") ?: Project.DEFAULT_VERSION
// ...
publishPackages {
repositories.ivy {
credentials {
username my.username("Artifactory")
password my.password("Artifactory")
}
url project.holyGradleRepositoryBase + teamPublishRepo
}
}
The group
value is used for the group
part of the module in Artifactory. Using a value which
is specific to your team has a couple of advantages.
- It makes it clear which team created and may be expected to support the module.
- Artifactory allows deploy permissions to be set on a pattern-matching basis, so the server admin can make sure that only your team can publish packages in that location.
The name, as described under Fetch Source, defaults to the name of the folder containing the build.gradle file, but can be overridden in settings.gradle.
The version can be set in several ways, as described in the documentation for the publishPackages block. Note that it can be any string, not necessarily a number, though it’s a good idea to avoid the hyphen (-) character, as it is used as a separator in regexp-based parsing by some related tools.
The URL for publishing may not be the same as the one for retrieving dependencies, as used in the
repositories.ivy
block at the top level of the file (see Declare Dependency Repositories).
In a typical Artifactory setup, you will be publishing to a "local" repository (one hosted on that
Artifactory server) but may be fetching dependencies from a "virtual" repository (a single URL which
combines artifacts from any number of local or remote repositories).
Note
| A common arrangement is to initially publish auto-builds to a team-private "integration" repository, then move them to a more public repository for release. See Making a Release, Promotion / Republishing, and Repository Clean-up for more details. |
Once you are happy with the contents of this block, run the following command to publish.
gw publish
Tip
| You can use a file: URL as the publish destination, to test publishing into a local folder instead of a real Artifactory repository. If you do this, you should either (a) give each publication a new version number; or (b) delete the contents of the unpack cache each time. Otherwise Gradle will see your new publication with the same version, but the Holy Gradle will not unpack it, because it will have recorded the fact that the same-named ZIP files have already been unpacked into the relevant unpack cache folder. |
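A sketch of such a test publish destination, using a placeholder local path, might look like this.

publishPackages {
    repositories.ivy {
        // Publish to a local folder for testing; no credentials are needed for a file: URL.
        url "file:///C:/temp/test-ivy-repo"
    }
}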
The publish
task depends on the packageEverything
and generateIvyModuleDescriptor
tasks, so
normally you only need to run this one command after your solution is built.
Republishing
The intrepid
plugin also has (currently undocumented) features to support promoting multiple
related modules between repositories, as described in Promotion / Republishing. You may want
to add a republish
section to your build.gradle
now, so that you can use those features later.
See that section for more information.
Typical Workflow
This section describes the most common developer workflow: you are not planning to change the
project dependencies, but some other developer may have done so. When you pull in their changes,
the build.gradle
file may be updated, in which case you need to update your local copy of
dependencies before you can build.
Update Source Code
First you will update your local copy of the source code in whatever way is normal for your project. For example, if you are using Mercurial, this is probably just by running hg pull and hg update.
However, if a dependency has been removed from the build.gradle
file by the new changes, then the
intrepid
plugin will no longer know about it, and so will leave a stale link when you use it to
update dependencies. Therefore, you should run
gw deleteLinks
or
gw delL
before updating your working copy. With Mercurial, you can automate this by adding a pre-update
hook for the repository to your %USERPROFILE%\mercurial.ini
, as follows.
[hooks]
preupdate.deleteLinks = gw deleteLinks
You could make this a global hook if all your repositories use Gradle; or, if only some do, but they all use the Holy Gradle plugins, you could guard it with an if, as follows.
[hooks]
preupdate.deleteLinks = if exist gw.bat gw deleteLinks
Fetch Dependencies
To update dependencies, run gw fAD
, as when fetching them for the first time (see
Fetch Dependencies).
You can automate this under Mercurial with a post-update hook, as follows.
[hooks]
update.fetchDeps = if exist gw.bat gw fetchAllDependencies
However, this may not be suitable for a multi-project workspace, because typically only the root project will contain gw.bat. In that case, either run these commands manually in the root project, or automate them with a batch file or similar.
Check Build
At this point, assuming there were no errors in the previous step, you should be able to build. If you have problems, try comparing the state of your project folder with that of other developers, particularly the person who made the change, and with any auto-build systems.
Check Publishing
Assuming you don’t change dependencies as part of your own work, there’s no reason to re-test packaging and publishing.
Changing Dependencies
After your project has been adapted to Gradle and the Holy Gradle plugins (or created as such from scratch), you are likely to need to update dependency versions as development continues. This section covers how to do that.
Update Dependencies
First, run the following command.
gw deleteLinks
This ensures that all existing links are removed, before you change the description of the
packed dependencies. Once you have changed the packedDependencies
block, Gradle will not know to
delete the links for any dependencies which were removed. Of course, if you forget to do this at
this point, you can delete them manually.
Warning
| If you forget to do it before you commit your changes, you may fail to update the Include and Library Paths in your Visual Studio projects. You will not notice this because the build still works for you, because the links still exist. However, it may fail for other developers if they use the Mercurial hooks suggested in the Typical Workflow section. This is another good reason to use an auto-build. |
You can edit the packedDependencies block to do any of the following:

- update the versions of existing dependencies (see the example below);
- add new dependencies;
- remove dependencies no longer required; or
- change the configuration mappings for dependencies.
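For example, updating the version of an existing dependency only changes the version part of its coordinates; the new version number below is hypothetical.

packedDependencies {
    "../ImageLib" {
        // Version updated from 13.1.3 to a (hypothetical) newer release.
        dependency "com.example-corp.example-team:ImageLib:13.2.0"
        // ... configuration mappings unchanged ...
    }
}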
Fetch Dependencies
Run gw fAD
again. This will download and unpack any new dependencies, and create links
for them. This will not remove links for dependencies which were removed — see above for how to
do that. Also it will not remove the ZIP files and their unpacked files from your Gradle cache — for that, see the section on Gradle Cache Clean-up.
It is possible that some of your new dependency versions will be incompatible. They may conflict directly, for example, if you define dependencies on A:1.2 and B:2.0, but B:2.0 depends on A:2.0. More common is that they will conflict indirectly, because they both depend on some other library at different versions. In this case, you will get an error like the following.
FAILURE: Build failed with an exception.
* What went wrong:
Could not resolve all dependencies for configuration ':everything'.
> A conflict was found between the following modules:
- com.example-corp.teamA:base-lib:1.2.8
- com.example-corp.teamA:base-lib:1.2.4
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
To understand where the conflict is, run
gw dependencies
This will output an ASCII-art tree diagram showing your dependencies. Within this, look for lines where a module is shown with two versions separated by "->". (A (*) in this diagram means that the sub-dependencies of that module have already appeared earlier in the diagram.) The following example shows a multi-project build of several modules produced by some group teamC.
someConfigurationName
+--- com.example-corp.teamC:test-app:unspecified
| +--- com.example-corp.teamC:comm-lib:unspecified
| | +--- com.example-corp.teamA:base-lib:1.2.8
| | | \--- org.boost:boost:1.54.0_9
| | \--- org.boost:boost:1.54.0_9
| \--- com.example-corp.teamC:framework:unspecified
| +--- com.example-corp.teamC:comm-lib:unspecified (*)
| +--- com.example-corp.teamA:utils:1.2.1
| | +--- com.example-corp.teamA:base-lib:1.2.4 -> 1.2.8 (*)
| | \--- org.boost:boost:1.54.0_9
| \--- org.boost:boost:1.54.0_9
\--- com.example-corp.teamC:test-helper:unspecified
\--- org.boost:boost:1.54.0_9
The problem is that teamC’s comm-lib module depends on version 1.2.8 of teamA’s base-lib, but teamC’s framework module depends on teamA’s utils at version 1.2.1 and that then depends on a different version of base-lib. There are several ways to resolve this.
- Downgrade the dependency from comm-lib on base-lib to 1.2.4.
- Upgrade the dependency from framework on utils to a later version which depends on base-lib at version 1.2.8. This may require work from teamA.
- Remove one of the dependencies, on base-lib or utils.
- Split the dependencies into different configurations. Gradle resolves the dependency graph of each configuration separately. This may still lead to version conflicts later in your build process if, for example, you try to copy DLLs and EXEs from both configurations into the same folder. This can be the correct solution if you want to use different versions in different parts of your project, for example, in test code and in production code.
- Add dependenciesSettings.defaultFailOnVersionConflict = false to your build.gradle. This will ignore all conflicts and automatically choose the "latest" version.

Warning
| This is not recommended, for several reasons. You may hide other version conflicts which happen in future. The later version of the module may not be compile-time- or runtime-compatible with the earlier version and may fail in unexpected ways. Lastly, Gradle 1.4 picks the "latest" version by string (lexicographical) comparison, so "1.2.11" comes before "1.2.3". |
Check Build
After fetching the new dependencies, it’s a good idea to do a clean build, to make sure old build products aren’t still based on old versions of dependencies.
Check Packaging
Once your build works correctly, you should re-check packaging, as described in Check Packaging.
Check Publishing
When you are happy with the packaged outputs and the ivy.xml
file, you may want to re-check
publishing, possibly using a file:
URL as described in Check Publishing. However, unless
you or the Artifactory administrators have changed the destination repository setup in some way, or
authentication details have changed, there should be no need to do this.
New Check-Out Of Existing Workspace
If you want to check out and build second or further copies of a repo on the same PC, Gradle and
the Holy Gradle plugins will save you some time. Check out the source and run gw fAD
as usual.
You should find that dependencies don’t need to be downloaded or unpacked, and the only time spent
is to check versions and create links.
Packaging Third-Party Dependencies
Sometimes you may want to use a third-party library or build tool as a dependency, but find that it is not already available as an intrepid-style packed dependency in any suitable Artifactory repository. This section describes how you can package it up yourself. You can also download an example build script.
Using Source Versus Repackaging A Distribution
Often, third-party software is distributed in archives of pre-built binaries. In other cases, you may need to modify the source of a third-party component, or build the source because pre-built binaries are not available from the originating organisation. The following sub-sections describe the differences in these two cases.
Packaging Third-Party Modules From Binaries
Generally you will not be in a position to modify the distribution of the third-party software to
include a build.gradle
file. Instead, you can create a "wrapper" folder, which will contain
- a build.gradle file which you write; and
- a folder containing the third-party files.
You may want to put this wrapper folder, without the third-party files, under source control.
However, intrepid
will package the build.gradle
file along with the rest of the module when it
is published, so this isn’t strictly necessary. If you need any other scripts etc. to automate the
packaging, you can include them as well by adding them to a buildScript
entry in the
packageArtifacts
block, as described under Check Packaging.
It’s a good idea to put clear instructions for the use of this build.gradle
file in comments near
the top of the file, for when it becomes necessary to publish a new version. In particular, the
packageArtifacts
block will contain assumptions about the paths to various third-party files, so
your instructions should state where to put them, whether they need to be unzipped first, etc.
Bear in mind that the name of the published module will default to the name of the wrapper folder, so you should add a line to settings.gradle which sets it to a specific name, as described under publishPackages.
Packaging Third-Party Modules From Source
If you are building the source exactly as it is provided, or it’s a library whose source you use directly (e.g., JavaScript), then you use a wrapper as described in Packaging Third-Party Modules From Binaries. If you have modified and re-built the source of a third-party library to produce non-standard binaries, then you should have the modified source under source control, and follow the usual Initial Set-up steps.
See also the Publish Third-Party Dependency Module section for guidelines on naming and version numbering.
Package Third-Party Dependency Files
At this point, you should add configurationSets and packageArtifacts blocks to your script, as in the usual Initial Set-up case. You won’t need to add a packedDependencies block or a top-level repositories.ivy block, and you probably won’t need to specify any prerequisites, unless they’re needed just for the packaging of this third-party software.
Although the third-party software will probably be available to you as one collection of files, you should still think about how to divide them into configurations for publishing, so that your project and any other users of this module don’t have to pull in any more files than they need.
If you are using a wrapper approach, the files you include may by default not appear at the desired level in the output ZIP files. For example, if you have a directory structure like this
ThirdPartyLib\ <1>
ThirdPartyLib_bin_1.4\ <2>
bin\
tplib.dll
include\
tplib.h
lib\
tplib.lib
<1> The wrapper folder.
<2> The root folder of the unzipped original binary distribution.
then the following packageArtifacts
block
packageArtifacts {
String distroot = "ThirdPartyLib_bin_1.4"
import_common {
from distroot
include "**/*.h"
}
import_x64_Release {
from distroot
include "**/*.lib"
}
runtime_x64_Release {
from distroot
include "**/*.dll"
}
}
will produce ".zip
" files with the following structure.
ThirdPartyLib-import_common-1.4_1.zip\
ThirdPartyLib_bin_1.4\
include\
tplib.h
ThirdPartyLib-import_x64_Release-1.4_1.zip\
ThirdPartyLib_bin_1.4\
lib\
tplib.lib
ThirdPartyLib-runtime_x64_Release-1.4_1.zip\
ThirdPartyLib_bin_1.4\
bin\
tplib.dll
To avoid the extra "ThirdPartyLib_bin_1.4
" folder level, use the to
method as well. The string
argument to this is the target directory in the created ZIP file. So, a packageArtifacts
block
like this
packageArtifacts {
String distroot = "ThirdPartyLib_bin_1.4"
import_common {
from distroot
to "."
include "**/*.h"
}
import_x64_Release {
from distroot
to "."
include "**/*.lib"
}
runtime_x64_Release {
from distroot
to "."
include "**/*.dll"
}
}
will produce ZIP files with the desired structure:
ThirdPartyLib-import_common-1.4_1.zip\
include\
tplib.h
ThirdPartyLib-import_x64_Release-1.4_1.zip\
lib\
tplib.lib
ThirdPartyLib-runtime_x64_Release-1.4_1.zip\
bin\
tplib.dll
Add Third-Party License Information
Most software is subject to one or more licenses, and you can optionally record this information in
the ivy.xml
file. This may help release engineers, or other developers who want to re-use your
package, to decide whether it meets any licensing constraints they must comply with. You can also
add a general short description, ideally providing the URL where the original software can be found.
Neither Gradle nor the Holy Gradle plugins contain any explicit support for this, but it’s easy to add, as follows.
publishing {
publications.ivy.descriptor.withXml {
Node infoNode = asNode().info[0]
// License info comes before description.
// Schema at http://ant.apache.org/ivy/schemas/ivy.xsd
// Doc at http://ant.apache.org/ivy/history/2.2.0/ivyfile.html
Node licenseNode_public_domain = infoNode.appendNode("license")
licenseNode_public_domain.@name = "Public Domain"
licenseNode_public_domain.@url = "http://jsoncpp.sourceforge.net/LICENSE"
Node licenseNode_mit = infoNode.appendNode("license")
licenseNode_mit.@name = "MIT"
licenseNode_mit.@url = "http://jsoncpp.sourceforge.net/LICENSE"
String description = """
This is a pre-built artifact package for json cpp with the "Multi Threaded DLL" compiler option
enabled.
The source for this customised package is placed at <https://hgserv1.example-corp.com/scm/hg/jsoncpp>.
The original source of jsoncpp0.6.0-rc2 can be retrieved from
<http://sourceforge.net/projects/jsoncpp/files/jsoncpp/0.6.0-rc2/jsoncpp-src-0.6.0-rc2.tar.gz/download>."""
Node descriptionNode = infoNode.appendNode("description", description)
descriptionNode.@homepage = "https://hgserv1.example-corp.com/scm/hg/jsoncpp"
}
}
Notice that you can add information for multiple licenses, and that Groovy uses triple-quoted strings for multi-line literals.
Publish Third-Party Dependency Module
Once you’ve tested the packaging and are happy with the results, you can add a publishPackages block to your script. This will need a repositories.ivy block inside it to describe where to publish to, even though you don’t need one at the top level of the script to describe where to get dependencies from, because your project won’t have any. As before you need to provide a group, name, and version; following are some guidelines on choosing these.
If you are packaging the software directly, either from binary or when built from source without changes, use a group which identifies the originating organisation; typically, the reverse domain name, e.g. "org.boost". If you have made any changes, it may be better to use some string which identifies your own organisation and/or team. If you are in fact publishing an in-house dependency from some other team, because the team which produces it doesn’t publish to Artifactory, they might be happy for you to use their team name, or might prefer you use your own, if they don’t want people to assume that it’s an officially supported release.
Similarly for the name
, use the original if you are packaging things unchanged, otherwise modify
it to reflect the reason for or effect of the change. For example, you might publish jsoncpp
as
jsoncpp_md
if you compiled it for multi-threaded use.
The version should be the original version number, followed by an underscore (_), followed by a version number or string of your own. This identifies both the base version for your changes, and the version of your packaging. Even if you made no changes, it’s a good idea to add your own version part because you might find that your packaging was incorrect, and have to re-publish the same version of the original software with a different version of your build.gradle script. You may not be able to overwrite the incorrect version if you have already published it to a public repository. Or, you may not want to if you have already created useful builds which depend on it, and you want to be sure you can reproduce them later.
So, you might end up with the following: a settings.gradle
with just one line:
rootProject.name = "jsoncpp_md"
and a build.gradle
containing the following:
group = "com.example-corp"
version = "0.6.0-rc2_1"
// ...
publishPackages {
repositories.ivy {
credentials {
username my.username("Artifactory")
password my.password("Artifactory")
}
url (project.hasProperty('publishUrl')
? project.property('publishUrl')
: 'http://PLEASE_SUPPLY_publishUrl_PROPERTY')
}
}
It is helpful to publish first to your own team’s integration repository, so that you can run a build using this new module, to check that the packaging works correctly for you. You can do this by running it with the command-line argument "-PpublishUrl=http://artifactory.example-corp.com/artifactory/teamA-integration-local/". Once you are happy with the packaging, you can either promote it to a more public repository from within Artifactory (see Promotion / Republishing), or run the script again with another repo URL, for example, "-PpublishUrl=http://artifactory.example-corp.com/artifactory/externals-release-local/".
Making a Release
The previous sections describe how to get the dependencies for a project, build it, and publish the resulting module to Artifactory. In most cases, you need to do a bit more than that to release some software.
The main things to consider are

- what needs to be released;
- what restrictions there are on your release (such as licensing, or export laws);
- how you identify it later;
- where the released things will be stored;
- who should do the release;
- how your customers will get the release.
It is a good idea to test your release process completely, including having your customer check a sample release.
What to Release
As discussed under Determine Outputs (Downstream Dependencies), the packageArtifacts block controls which files are included in the ZIP files (package artifacts) of your module.
You also need to consider whether any of your dependencies need to be released, and whether there are any files not in Artifactory which are part of your release process.
Publishing Dependencies
It is common during development within an organisation that one team will use pre-release or "release candidate" versions of modules from other teams. It is also possible that you will be Packaging Third-Party Dependencies (libraries or build tools) to use in your project. If you publish your project in some repository which your customers can access, but some of your dependencies are in other repositories which are not accessible to them, then they may not be able to fetch and use your module.
You may also have dependencies which are used to build your project, but are not needed by teams who use your module. In that case, you may still want to publish those dependencies (maybe to a different repository) so that they are stored permanently, to make sure that you can re-build your release from source in future.
License and Export Restrictions
If some of your dependencies are licensed commercial software, you may not have the right to
redistribute them, or to export them to other countries. If your customers have their own license
to get the software, you can give them a copy of the build.gradle
which you used to publish the
module, and they can publish it into their own private repository. Every module published using the
Holy Gradle has a buildScript
ZIP file which contains the script.
Release Systems Outside Artifactory
You may have other things which are not part of your project but are part of your release process — for example, a release notes document. You can either store them outside of Artifactory and
keep a note of the matching ID(s) of your release (see below), or you can copy them into your
project before making the release, and publish them to Artifactory. The Holy Gradle already does
this for source code: it doesn’t include all the source but it creates a build_info
folder in each
ZIP file, and that has files describing the source repository and version it was published from.
Release Notes
Release notes are an interesting case because you may want to send out updated release notes after your release, for example, if bugs are found. One approach could be to create a separate module for the release notes, which has a dependency on your original module. Suppose your original module "com.example-corp:my-lib" is published at version "1.3.1"; then you could have multiple versions of your release notes module, each using the date and time: "1.3.1-YYYYMMDD-HHMMSS". Then someone can always get the latest version by defining a dependency on "com.example-corp:my-lib_release-notes:+".
Identifying the Release
Your project has a module version ID when you are running Gradle and when you publish to Artifactory. See Check Publishing for how to set this with the Holy Gradle.
Customers will need this module version ID plus the name of (or URL for) an Artifactory virtual repository where they can get it. See How Customers Get the Release below.
Where to Put the Release
To publish to Artifactory you must choose a local repository to store your release. When choosing, you should consider
- who needs to be able to get your release;
- who should not be able to get it — for example, if it is a beta, or confidential; and
- how long it should be kept for.
The repo to use depends on the policies for your server — see the Support page to find out who to ask, and tell them your choices for the points above. A common approach is promotion: your automated build publishes to a repo which keeps temporary builds, only for your team; when you are ready to release, you copy (promote) a build to a repo where others can get it. See Promotion / Republishing for information on how to do this.
Warning
| Regarding restricting access to modules, note that someone who has access may get a module from you and pass it on to someone else who should not have access. Artifactory can only provide a basic level of control, to prevent accidental sharing. You must make sure your customers understand any restrictions, and that they also tell new members of their teams. |
Who Should Do the Release
The first point here is that the user who runs Gradle must have "Deploy" permission for the relevant repository in Artifactory. Discuss this with your team lead and/or your local Support contact.
The other important issue is if you have dependencies which also need to be released, and they were published by another team: who should release (promote) those modules? If you can fetch the dependencies, then it is possible for you to promote them, but you should check the release policies in your organisation, and usually contact the team to discuss it.
How Customers Get the Release
Through Artifactory
To publish to Artifactory, just run gw publish
as described under Check Publishing.
If your customers have access to an Artifactory server, then they can fetch your module into their project with the Holy Gradle, using the packedDependencies block as usual. You must tell them
- your module’s version ID (group/name/version); and
- the virtual repository which they should fetch it from.
If that repository is not already in the repositories
section of their build.gradle
, they must
add it directly to that file, or ask their server administrator to add it to one of the virtual
repositories which is already used in their build script.
If the team is using a different Artifactory server — for example, at a different site — then the server administrator can set up a remote repository on that server to cache files from a virtual repository on your server. Ideally, the server admin will use a name which matches your virtual repository name (or create a virtual repository which matches it and includes the remote repo). If not, that admin must tell the other team which repository name to use. (The other team could access your server directly but that may be slower and use a lot of network capacity.)
Without Artifactory
The Holy Gradle has a feature to let teams which use Artifactory easily release to
teams which do not. This is basically the same as Sharing Dependencies with Disconnected Sites,
but for teams which are using your module, not teams which are developing your module.
The difference in this section is that you will use the Holy Gradle to automatically create a
separate build.gradle
which uses your project, and then test that.
First, add a new configuration and a new entry to your packageArtifacts
section, as follows.
configurations {
preBuiltArtifacts
}
group = "com.example-corp.teamA"
version = System.getenv("MY_MODULE_VERSION") ?: Project.DEFAULT_VERSION
packageArtifacts {
preBuiltArtifacts {
include "gw.bat", "gradle/**", "gradle.properties"
includeBuildScript {
addPackedDependency "${project.group}:${project.name}:${project.version}"
}
}
}
publishPackages {
// ...
}
Tip
| For the above to work, the group
and version
must be set before the packageArtifacts
block. |
Now publish your module to Artifactory as normal. You will see a ZIP file with a name containing
the string "preBuiltArtifacts
". Download and unzip it to a new folder, which must not have the
same name as your module. Then run
gw collectDependencies
This will download your module and all its dependencies into a folder called local_artifacts. Then ZIP up the following folder and files.
- local_artifacts/
- gradle/
- gw.bat
- settings.gradle
- build.gradle
- gradle.properties
Send this file to your customer. They can unzip it and run gw fAD
to unpack your module and all
its dependencies. You can and should test this for yourself before you send it, using
gw -o -g tmp_guh fAD
as described in Sharing Dependencies with Disconnected Sites.
(You can use the zipDependencies task to do this, if you also configure it to add the other files and folders above apart from local_artifacts/. To see how to do this, read the Gradle documentation on the Zip task.)
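As a rough sketch, assuming zipDependencies is an ordinary Gradle Zip task which exists by the time your build script is evaluated (otherwise wrap this in afterEvaluate), the extra files could be added like this.

zipDependencies {
    // Add the wrapper files alongside the local_artifacts folder in the generated ZIP.
    from "gw.bat", "settings.gradle", "build.gradle", "gradle.properties"
    from("gradle") { into "gradle" }
}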
Promotion / Republishing
A common arrangement is to publish auto-builds to an "integration" repository, which has limited read access. Once a given build is determined to be release quality, this can be moved (promoted) to another Artifactory repository with more general read access.
For individual modules, this can be done via the Artifactory web interface. In the "Tree Browser" under the "Artifacts" tab, locate the version of the module you want to promote. Right-click on it and select "Copy", or left-click on it then click "Copy" in the panel on the right. Select the target repo and click "Copy" (or try "Dry Run" first if you like). Copying is very quick and cheap in Artifactory, because it just creates new entries to the same files, in its internal database.
If your module has new dependencies, and they were published to your team’s integration repo, you
will need to copy those as well. You can simplify this by using "republishing" support in the
intrepid
plugin.
Adding Republishing
Add the following to your project’s build.gradle
file.
configurations {
// ... existing configurations, plus:
republishing
}
packageArtifacts {
// ... existing packages, plus:
republishing {
to "republishing"
include "gradle/**", "gw.bat"
include "build.gradle"
// include "gradle.properties" // ... if appropriate for your project.
}
}
republish {
to (project.hasProperty('republishTo')
? project.property('republishTo')
: 'http://artifactory.example-corp.com/artifactory/libs-release-local/')
}
Republishing depends on the "everything" configuration, which is deprecated. If you wish to use republishing in your project, you will also need to add the following line to your gradle.properties file.
createEverythingConfiguration=true
Using Republishing
You can then download the published package with a name like "some_module-republishing-1.2.3.zip", unzip it, and run
gw checkPackedDependencies
to check that all modules used by your module are present in the target repo. This will print out a human-readable list of all transitive dependencies, indicating which are present and which not. For the ones which are not, you can copy them manually in Artifactory.
If you want to override the default target repo in the republish
block, because you are
publishing to a different repo, you can do the following.
gw -PrepublishTo=http://artifactory.example-corp.com/artifactory/some-other-repo/ cPD
Note
| You can, of course, run gw cPD in your project directory normally. However, making an
explicit "republishing " package means you can be sure that the build script contains exactly the
versions of your module’s dependencies that are recorded in the ivy.xml for that published
version. |
Using a Meta-Package
Some teams may publish meta-packages to provide an easy way to get just some of the files from a module. Normally you should just add a module as a packed dependency in your own project. A meta-package is a "prepared" project in a ZIP file package. Example uses for a meta-package include the following.
- Download just the runtime binaries for an application module, so a test engineer can run tests.
- Fetch the exact source and dependencies for a released version of a module, to rebuild it.

To use a meta-package:

- unzip it to a location of your choice;
- open a command prompt at the unzipped location;
- run gw fetchAllDependencies (or gw fAD).
Repository Clean-up
Module versions in integration (auto-build) repositories may be deleted over time, whereas public releases are generally kept "forever", since other projects may depend on them, and it may be necessary in future to reproduce old builds of those projects. It is a good idea to set up some kind of automated process to delete old versions of modules in integration repos. The artifactory-manager-plugin can be used to automate this, including adding exceptions to keep specific builds.
Gradle Cache Clean-up
Gradle keeps a per-user cache of various files it downloads (versions of gradle, dependencies) and
metadata about those files; the intrepid
plugin unzips packed dependencies to a sub-folder of that
cache as well. The cache by default is located at %USERPROFILE%\.gradle
, but you can override
this by setting the environment variable GRADLE_USER_HOME
(or, temporarily, with the -g
command-line option).
Gradle never deletes anything from this directory, and neither it nor the Holy Gradle plugins provide anything to do this. It’s safe to delete the whole cache as long as Gradle isn’t running. It’s also probably safe to write some kind of script to, e.g., delete all files older than some number of days, but again, make sure you do this only when Gradle isn’t running.
You can also run gw.bat
with the --refresh-dependencies
argument to force it to re-download and
re-unpack dependencies, if you think there might be some problem with dependencies, but you don’t
want to delete your whole cache.