Delivering your dependency build outputs back to the stream

Several months back I started on an effort to create an Ant task that would deliver the outputs of a dependency build back into the SCM. I ran a couple of quick tests to make sure the underlying support was there by zimporting a hello world application, zloading it back out, and confirming it could still run. Full success. Then I coded up a quick prototype Ant task to create and run an IShareOperation to confirm that I could share the build outputs using the available Java API rather than zimport. Again, full success. Great! I delivered the good news that we would be able to provide this sample Ant task.

Unfortunately the devil (many devils, in fact) was in the details, as I discovered recently when I finally sat down again to properly implement this Ant task. This story really doesn’t have a happy ending, but it is worth sharing given everything I’ve learned while trying to implement this sample.

The first thing I realized was that, for this task to really be useful, I would need not just to store the outputs in the SCM, but also to create and assign data set definitions to the folders the outputs would be stored in. Otherwise, you wouldn’t be able to load the outputs back out to MVS. It quickly became apparent that (a) I would basically be copy-pasting the entire zimport tool and (b) I would need to use undocumented APIs to get this done. So, I threw away the idea of creating an Ant task and decided that I would invoke zimport from an Ant task instead, and that I would configure a post-build script in my dependency build definition to call this task. I would also use the SCM command line deliver operation to deliver from the zimport workspace to the stream. I used the Using the SCM Command Line Interface in builds article as a starting point.

With this new approach in mind, I started coding my post-build Ant script. It was then that I started seeing failures when I tried to zimport programs of any meaningful size. It turns out I’d found a new bug: Hash mismatch TRE when zimporting load modules (244317). Unfortunately, as of this posting there is no fix for this bug, so the solution I’m presenting is not currently a working option. You can, however, test out this sample on very small applications in the meantime, which is what I did to continue my work.

The next thing I realized was that in order to deliver the change sets created by zimport, I would need to add a comment or associate a work item in order to pass the commonly used “Descriptive Change Sets” precondition on deliver. Unfortunately (there’s that word again!), zimport does not tell you what change sets it created, or provide you with a way to annotate them. So, this would require some additional scripting and use of the SCM CLI. My sample adds comments; I leave it as an exercise for you, dear reader, to associate a work item (and potentially use curl to create a new one) should you so desire.

An important piece of the sample was to show how to figure out which outputs needed to be delivered. So, as part of my initial Ant task, I had created a utility that takes the dependency build report as input and generates a zimport mapping file. I figured that piece at least was salvageable, even if my main solution was not going to be an Ant task. I discovered something interesting during my testing, though: the build report is actually not created until AFTER the post-build script executes. Rats! So, I had to change my approach from using a post-build Ant script to adding a Post Build Command Line to my dependency build configuration and implementing everything as a shell script. I then converted my Java utility to some JavaScript that generates my zimport mapping file.
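For reference, the mapping file that utility emits uses `P:` records to map each data set member to a zComponent project and folder, and `C:` records to map each project to an RTC component. A hypothetical fragment, with invented project and component names and mirroring what parseBuildReport.js (shown later in this post) would print with the `Output`/`CompOutput` suffixes, might look like:

```
P:MORTGAGE.LOAD.EPSCMORT=MortgageAppOutput:MORTGAGE.LOAD
P:MORTGAGE.DBRM.EPSCMORT=MortgageAppOutput:MORTGAGE.DBRM
C:MortgageAppOutput=MortgageCompOutput
```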

There is much more to this painful saga, but I will spare you the details and share my solution here. Remember that as usual this sample is JUST a sample to get you started. It is not guaranteed or supported in any way. It has been only minimally tested. You will also see quickly that I am not an expert at shell scripting, Perl, or JavaScript. My code is not sexy by any stretch of the imagination, nor is it robust. But it works (at least for me!) and is hopefully enough to get you well on your way to your own full solution.

First, here is the main script zimport_and_deliver.sh that you will invoke from your dependency build:

#!/bin/sh
#set environment variables required by SCM tools
export JAVA_HOME=/var/java16_64/J6.0_64
export SCM_WORK=/u/jazz40

#specify locations of jrunscript, perl, and javascript used in this script
#some of these could be passed in as build properties set on the build engine
JRUNSCRIPT="/var/java16_64/J6.0_64/bin/jrunscript"
SCM="/u/ryehle/rtcv401/usr/lpp/jazz/v4.0.1/scmtools/eclipse/scm --non-interactive"
LSCM="/u/ryehle/rtcv401/usr/lpp/jazz/v4.0.1/scmtools/eclipse/lscm --non-interactive"
PERL=/usr/lpp/perl/bin/perl
PARSE_SCRIPT="/u/ryehle/parseBuildReport.js"

#function to clean up temporary files from previous run
#function to clean up temporary files from previous run
remove_temp_file() {
	if [ -f "$1" ]
	then
	    echo "Deleting $1"
	    rm "$1"
	fi
}

echo "Running $0 as $(whoami)"

#specify port for scm daemon. this could also be passed in,
#or let it auto assign and parse out the value
PORT=15869

#gather the arguments and echo for log
REPOSITORYADDRESS=$1
HLQ=$2
FETCH=$3
SYSDEFPROJECTAREA=$4
BLDWKSP=$5
LABEL=$6
PERSONAL_BUILD=$7
echo "Repository address[$REPOSITORYADDRESS] HLQ[$HLQ] Fetch directory[$FETCH]
System definition project area[$SYSDEFPROJECTAREA] Build workspace UUID[$BLDWKSP]
Build label[$LABEL] Personal build[$PERSONAL_BUILD]"

#do not store outputs if this is a personal build
if [ "$PERSONAL_BUILD" = "personalBuild" ]
then
	echo "Personal build...exiting."
	exit 0
fi

#delete any existing temporary files
MAPPING=$FETCH/outputMappingFile.txt
FLOW=$FETCH/flow.tmp
CS_LIST=$FETCH/cslist.tmp
CS_UPDATE=$FETCH/csupdate.sh
SCM_DAEMON=$FETCH/daemon.tmp
remove_temp_file $MAPPING
remove_temp_file $FLOW
remove_temp_file $CS_LIST
remove_temp_file $CS_UPDATE
remove_temp_file $SCM_DAEMON

#create the zimport mapping file from the build report
echo "$(date) Creating the zimport mapping file from the build report"
$JRUNSCRIPT $PARSE_SCRIPT $FETCH/buildReport.xml $HLQ Output CompOutput > $MAPPING

#check to see if there is anything to be zimported
if [ ! -s $MAPPING ]
then
	echo "Nothing to zimport...exiting."
	exit 0
fi

#start the scm daemon process in the background and wait for it
echo "$(date) Starting the SCM daemon at port $PORT"
$SCM daemon start --port $PORT --description "zimport and deliver" >  $SCM_DAEMON 2>&1 &
while true
do
	[ -f $SCM_DAEMON -a -s $SCM_DAEMON ] && break
	sleep 2
done
cat $SCM_DAEMON
$PERL -e 'while (<>) { exit 0 if /^Port: /; } exit 1;' $SCM_DAEMON
if [ $? -eq 1 ]
then
	echo "$(date) The SCM daemon failed to start...exiting."
	exit 1;
fi

#display the running daemons
$LSCM ls daemon

#calculate the stream being built and where outputs will be stored. this could be hardcoded to save time.
echo "$(date) Calculating stream being built: $LSCM list flowtargets $BLDWKSP -r $REPOSITORYADDRESS"
$LSCM list flowtargets $BLDWKSP -r $REPOSITORYADDRESS > $FLOW
cat $FLOW
PERL_SCRIPT="while (<>) {
	if (/\"(.*)\".*\(current\)/) { print qq(\$1); exit 0; }
}
die(qq(Could not find current flow for build workspace));"
STREAM=`$PERL -e "$PERL_SCRIPT" $FLOW`

#create the zimport workspace
ZIMP_WKSP=zimport_$(date +%Y%m%d-%H%M%S)
echo "$(date) Creating workspace for zimport named $ZIMP_WKSP flowing to $STREAM"
$LSCM create workspace -r $REPOSITORYADDRESS -s "$STREAM" $ZIMP_WKSP

#perform the zimport
echo "$(date) Starting zimport"
$SCM zimport --binary -r $REPOSITORYADDRESS --hlq $HLQ --mapfile "$MAPPING" --projectarea "$SYSDEFPROJECTAREA" --workspace $ZIMP_WKSP

#gather list of change sets created from zimport and add a comment
#note: does not annotate new components
echo "$(date) Adding comment to generated change sets"
$LSCM compare workspace $ZIMP_WKSP stream "$STREAM" -r $REPOSITORYADDRESS -p c -f o > $CS_LIST
cat $CS_LIST
PERL_SCRIPT="if (/    \((\d+)\)/)
	{
	print qq($LSCM changeset comment \$1 \\\"Change set created by zimport from build $LABEL\\\" -r $REPOSITORYADDRESS \n);
	}"
$PERL -ne "$PERL_SCRIPT" < $CS_LIST > $CS_UPDATE
chmod 777 $CS_UPDATE
$CS_UPDATE

#we can deliver everything since we just created the workspace. otherwise we could have delivered the individual change sets.
echo "$(date) Delivering the changes"
$LSCM deliver -s $ZIMP_WKSP -r $REPOSITORYADDRESS

#delete the zimport workspace
echo "$(date) Deleting the zimport workspace $ZIMP_WKSP"
$LSCM workspace delete $ZIMP_WKSP -r $REPOSITORYADDRESS

#stop the daemon
echo "$(date) Stopping the daemon at port $PORT"
$SCM daemon stop --port $PORT

echo "$(date) Done"

And here is the parseBuildReport.js JavaScript that takes a build report and generates a zimport mapping file:

Array.prototype.contains = function(object) {
	var i = this.length;
	while (i--) {
		if (this[i] === object) {
			return true;
		}
	}
	return false;
}

var doc = new XMLDocument(arguments[0]);
var hlq = String(arguments[1]);
var output_project_suffix = arguments[2];
var output_component_suffix = arguments[3];
var componentList = doc.getElementsByTagName('bf:component');
var outputArray = [];
for ( var i = 0; i < componentList.length; i++) {
	var component = componentList.item(i);
	var componentName = component.getAttribute('bf:name');
	var projectList = component.getElementsByTagName('bf:project');
	for ( var j = 0; j < projectList.length; j++) {
		var project = projectList.item(j);
		var projectName = project.getAttribute('bf:name');
		var fileList = project.getElementsByTagName('bf:file');
		for (var k = 0; k < fileList.length; k++) {
			var file = fileList.item(k);
			var reason = file.getAttribute('bf:reason');
			if (reason != 0) {
				var outputList = file.getElementsByTagName('outputs:file');
				for (var l = 0; l < outputList.length; l++) {
					var output = outputList.item(l);
					var member = output.getElementsByTagName('outputs:buildFile').item(0).getTextContent();
					var dataset = output.getElementsByTagName('outputs:buildPath').item(0).getTextContent();
					var outputModel = new OutputModel(dataset, member, componentName, projectName);
					outputArray.push(outputModel);
				}
			}
		}
	}
}
var projectsArray = [];
var componentsArray = [];
for ( var i = 0; i < outputArray.length; i++) {
	var output = outputArray[i];
	//println(output.dataset + output.member + output.component + output.project);
	var member = output.dataset.substr(hlq.length + 1) + "." + output.member;
	var project = output.project + output_project_suffix + ":" + output.dataset.substr(hlq.length + 1);
	println("P:" + member + "=" + project);

	//stash the zComponent project and component.. we only want one entry per project
	if (!projectsArray.contains(output.project)) {
		projectsArray.push(output.project);
		componentsArray.push(output.component);
	}
}
for (var i = 0; i < projectsArray.length; i++) {
	println("C:" + projectsArray[i] + output_project_suffix + "=" + componentsArray[i] + output_component_suffix);
}

function OutputModel (dataset,member,component,project) {
	this.dataset = dataset;
	this.member = member;
	this.component = component;
	this.project = project;
}

To try this out, you will need to:

  1. Store the two sample scripts above on your build machine.
  2. Add a Post Build Command Line to your dependency build definition. Specify a command to invoke the zimport_and_deliver.sh script and pass in all of the required parameters. I specify the following command: zimport_and_deliver.sh ${repositoryAddress} ${team.enterprise.scm.resourcePrefix} ${team.enterprise.scm.fetchDestination} "Common Build Admin Project" ${teamz.scm.workspaceUUID} ${buildLabel} ${personalBuild}
  3. On the build machine, logged in as the user under whom the build will run, run the “scm login” command to cache your Jazz credentials. E.g., scm login -r https://your_host:9443/ccm -u builder -P fun2test -c. This allows you to run scm commands from the zimport_and_deliver.sh script without hardcoding a password. By default, the credentials are cached in ~/.jazz-scm. Unfortunately, the scm command line does not allow you to specify a password file as you can for the build agent. Note that the --non-interactive flag passed to the SCM CLI ensures the CLI does not prompt for a password and hang the build.

Now you should be able to run your dependency build and see that your outputs are stored back in the source stream. This sample script creates and leaves behind several temporary files for easier reading and debugging. You could certainly refactor the script to not use the temporary files once your solution is fully implemented and tested.

Some additional things to note about this sample:

  1. Notice in the shell script that we start an SCM daemon at the beginning and stop it at the end. We use “lscm” rather than “scm” to leverage this started daemon and reduce the overhead time of these commands. You could consider hard coding some things that don’t need to be calculated to save additional time, such as the current stream with which the build workspace flows. Note also that the zimport subcommand is not supported from lscm, so we use scm for that command. The open enhancement to support zimport from lscm can be seen here.
  2. This script checks the ${personalBuild} build property and exits if it is true. Outputs should likely only be stored in the SCM if they were generated by a team build.
  3. This sample zimports all outputs as binary. You will need to expand the sample if you want to import generated source as text.
  4. This sample uses a convention to create new zComponent projects and RTC components to store the outputs. We do not store the outputs in the .zOSbin folder of the source zComponent projects because there is no way to load files in that folder back out to MVS. We also would not want to run the risk of developers accidentally loading the outputs to their sandbox, nor would we want to potentially cause issues with the dependency build by intermingling the source and outputs.
  5. This sample requires RTC V4.0.1 for the scm list flowtargets command, and for a fix that allows you to specify a workspace for zimport.
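One fragile spot worth calling out is the daemon-start wait loop in zimport_and_deliver.sh, which spins forever if the daemon never writes its output file. A bounded variant could replace it; the function name and timeout below are my own, not part of the original sample:

```shell
# Wait until a file exists and is non-empty, giving up after a number of
# attempts so a failed daemon start cannot hang the build indefinitely.
wait_for_file() {
	file=$1
	tries=${2:-30}   # default: up to 30 attempts, one second apart
	while [ "$tries" -gt 0 ]; do
		[ -s "$file" ] && return 0
		sleep 1
		tries=$((tries - 1))
	done
	return 1
}
```

The main script would then call something like `wait_for_file $SCM_DAEMON 30 || exit 1` in place of the open-ended `while true` loop.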

Hopefully this sample is useful to you in some capacity, even without a working zimport. Feel free to comment back with your suggested improvements. Lastly, I would be remiss if I did not say THANK YOU to the many folks who helped me stumble through my lack of SCM CLI and scripting skills (Eric, Nicolas, John…) for this post.

This entry was posted in Enterprise Extensions, Rational Team Concert, System z.

2 Responses to Delivering your dependency build outputs back to the stream

  1. Pingback: Welcome to 2013…. new Jazz news and resources « Dan Toczala's Blog

  2. Tim Hahn says:

    Nice write-up Robin. I found it interesting though that you had to use a combination of shell scripting, great knowledge of the scm command-line, and a javascript program to get this all to work. It seems like it was quite an effort.
