Opstree SHOA Part 1: Build & Release

At Opstree we have started a new initiative called SHOA, Saturday Hands On Activity. Under this program we pick up a concept, tool or technology and do a hands-on activity. At the end of the day, whatever we do is followed by a blog or a series of blogs covering what we understood during the day.
 
Since this is the first Hands On Activity, we are starting with Build & Release.

What we intend to do 

Set up Build & Release for the project under the git repository https://github.com/OpsTree/ContinuousIntegration.

What all we will be doing to achieve it

  • Finalize an SCM tool that we are going to use: Puppet/Chef/Ansible.
  • Automated setup of Jenkins using SCM tool.
  • Automated setup of Nexus/Artifactory/Archiva using SCM tool.
  • Automated setup of Sonar using SCM tool.
  • Dev Environment setup using SCM tool: Since this is a web app project, our Dev environment will have Nginx & Tomcat.
  • QA Environment setup using SCM tool: Since this is a web app project, our QA environment will have Nginx & Tomcat.
  • Creation of various build jobs
    • Code Stability Job.
    • Code Quality Job.
    • Code Coverage Job.
    • Functional Test Job on dev environment.
  • Creation of release Job.
  • Creation of deployment jobs to deploy on the Dev & QA environments.
This activity is open to the public as well, so if you have any suggestions or you want to attend, you are most welcome.

Tip: Setting up Git-Jenkins integration on a Windows box

If you have ever tried setting up Git as the version control system in a Jenkins installation on a Windows box, you would have faced an error message saying the SSH key is not available.

The reason behind this issue is that when you use Git over the SSH protocol, it tries to use your private key to perform Git operations, and the location it expects for that key is the .ssh folder in the user's home directory. To fix this issue, create a HOME environment variable pointing to the directory that contains your .ssh folder, then restart Jenkins; it should now work fine.

Automation tips and tricks

As promised, I’m back with a summary of the cool stuff that I’ve done with my team in the Build & Release domain to help us deal with day-to-day problems in an efficient & effective way. As I said, this month was about creating tools/utilities that sound very simple, but their overall impact on the productivity & agility of the build & release teams and tech verticals was awesome :).

Automated deployment of Artifacts: If you have ever worked with a set of Maven-based projects that are interdependent on each other, one of the major problems you will face in such a setup is keeping the latest dependencies in your local system. Here I’m assuming two things: you are using a Maven repo to host the artifacts, and the dependencies are SNAPSHOT dependencies if active development is going on in those dependencies as well. The manual way of making sure that the Maven repo always has the latest SNAPSHOT version is that every time somebody makes a change in the code-base, he/she manually deploys that artifact to the Maven repo. What we have done is create, for each & every project, a Jenkins job that checks whether code has been checked in for a specific component and, if so, deploys that component’s SNAPSHOT version to the Maven repo. The impact of these utility jobs was huge: developers no longer have to focus on deploying their code to the Maven repo, and keeping track of who last committed the code is no longer needed either.
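To give a rough idea, such a job can boil down to a shell build step that rebuilds the component and redeploys its SNAPSHOT artifact whenever SCM polling detects a new commit; this is only a sketch, and it assumes the target repository is already defined in the component’s pom.xml under <distributionManagement>:

```bash
#!/bin/bash
# Hypothetical Jenkins shell build step: triggered by SCM polling, it
# rebuilds the component and pushes its SNAPSHOT artifact to the Maven repo.
set -e

# Build and run the tests first; fail the job fast if anything is broken.
mvn clean verify

# Deploy the SNAPSHOT artifact to the repository configured in
# <distributionManagement> of the component's pom.xml.
mvn deploy -DskipTests
```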

Log Parser Utility: We have made further improvements in our event-based log analyzer utility. Now we also have a simple log parser utility through which we can parse the logs of a specific component & segregate them as per ERROR/WARN/INFO. Most importantly, it is integrated with Jenkins, so you can go to Jenkins and select a component whose log needs to be analyzed; once the analysis is finished, the logs are segregated as per our configuration (in our case ERROR/WARN/INFO). After that, these categories are shown in the left bar along with all their instances, and the user can click on those links to jump to the exact location where that information is present in the logs.
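The core segregation step can be sketched in a few lines of shell; the log file argument and the hard-coded categories below are just placeholders, whereas the actual utility reads them from configuration:

```bash
#!/bin/bash
# Hypothetical sketch of the segregation step: split a component's log into
# one file per severity so the Jenkins report can link into each category.
LOG_FILE="${1:?usage: $0 <log-file>}"
OUT_DIR="parsed-logs"
mkdir -p "$OUT_DIR"

for LEVEL in ERROR WARN INFO; do
    # -n keeps the original line numbers so a link can point at the exact spot.
    grep -n "$LEVEL" "$LOG_FILE" > "$OUT_DIR/${LEVEL}.log" || true
    echo "$LEVEL: $(wc -l < "$OUT_DIR/${LEVEL}.log") occurrences"
done
```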

Auto Code Merge: As I have already mentioned, we have a team of around 100+ developers & a sprint cycle of 10 days, and two sprints overlap each other for 5 days, i.e. the first 5 days are for development, after which a code freeze is enforced, and the next 5 days are for bug fixing. This means that at any particular point in time there are 3 parallel branches on which work is in progress: one branch which is currently deployed in production, a second branch on which testing is happening, and a third branch on which active development is happening. You can easily imagine that merging these branches is a task in itself. What we have done is create an automated code merge utility that tries to merge branches in a pre-defined sequence; if an automatic merge is successful, the merge proceeds to the next set of branches, otherwise a mail is sent to the respective developers whose files are in conflict.
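In essence, the utility walks the branch sequence and stops at the first conflict. A simplified sketch is below; the branch names and notification address are placeholders, and the real utility mails the developers who own the conflicting files rather than a single alias:

```bash
#!/bin/bash
# Hypothetical sketch of the auto-merge flow: merge branches in a pre-defined
# sequence and notify developers when a merge cannot be completed cleanly.
set -e
SEQUENCE=("production" "testing" "development")   # pre-defined merge order
git fetch origin

for ((i = 0; i < ${#SEQUENCE[@]} - 1; i++)); do
    SRC="${SEQUENCE[$i]}"
    DST="${SEQUENCE[$((i + 1))]}"
    git checkout "$DST"
    git pull origin "$DST"
    if git merge --no-edit "origin/$SRC"; then
        git push origin "$DST"            # merge was clean, move to the next pair
    else
        # List the conflicting files, mail them out, and stop the run.
        git diff --name-only --diff-filter=U | \
            mail -s "Auto merge $SRC -> $DST failed" dev-team@example.com
        git merge --abort
        exit 1
    fi
done
```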

Hope you will get motivated by this set of utilities & come up with new suggestions or points of improvement.

Initial thoughts for a patch framework for a Java-based web project

Although this blog was not in the pipeline for the month of February, I got a requirement to build a patch framework for a Java-based web project, so along with building this framework I thought of writing this blog as well, so that I can get ideas from other people too.

First of all I will talk about what can be patched using this patch framework; mainly there are three kinds of resources that can be patched:

  • Class Files
  • JSPs
  • Static resources such as images, CSS, JS …

I’m thinking of adding a few other features to this patch framework as well:

  • Sequence of patches should be maintained: Since we have a big team of around 80 developers working on the same code-base, there may be a scenario where two or more patches need to be applied to a target system. There is also a fair chance that those patches have to be executed in a sequence, or you could say there could be dependencies among the patches.
  • Validation while applying patches: One validation that I can think of is that the resources to be patched will be either new or existing ones, & in the case of existing resources the system should verify that the location at which the resources are being patched already has those resources.
  • Rollback: The patch framework should have rollback capability (see the sketch after the tooling list below).

I’m planning to build this patch framework using:

  • Shell scripting: for the programming aspect
  • Git: as the version control system for storing patches
  • Jenkins: to provide a front-end for applying patches
  • MySQL: not so sure about it yet, but I have to store some information such as which patches have been applied, the sequence of patches…. I could use the file system as well for storing this information
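Purely as a starting point, and assuming each patch is a directory in the Git repository containing the files to apply plus a manifest listing their target paths (all names and paths below are hypothetical), the apply step could look something like this:

```bash
#!/bin/bash
# Hypothetical patch-apply sketch: applies one patch directory to a target
# webapp, validating existing resources and keeping a backup for rollback.
# Assumed patch layout: the files to copy plus a "manifest" file with lines
# of the form "<new|existing> <src-path-in-patch> <dst-path-in-webapp>".
set -e
PATCH_DIR="${1:?usage: $0 <patch-dir> <webapp-root>}"
WEBAPP_ROOT="${2:?usage: $0 <patch-dir> <webapp-root>}"
BACKUP_DIR="/var/patches/backup/$(basename "$PATCH_DIR")-$(date +%Y%m%d%H%M%S)"
APPLIED_LOG="/var/patches/applied.log"   # records the sequence of applied patches

# Validation pass: resources flagged as "existing" must already be present
# at the target location before we overwrite them.
while read -r TYPE SRC DST; do
    if [ "$TYPE" = "existing" ] && [ ! -f "$WEBAPP_ROOT/$DST" ]; then
        echo "Validation failed: $DST not found in target" >&2
        exit 1
    fi
done < "$PATCH_DIR/manifest"

# Apply pass: back up anything we replace (for rollback), then copy the
# class files, JSPs and static resources into place.
mkdir -p "$BACKUP_DIR"
while read -r TYPE SRC DST; do
    if [ -f "$WEBAPP_ROOT/$DST" ]; then
        mkdir -p "$BACKUP_DIR/$(dirname "$DST")"
        cp "$WEBAPP_ROOT/$DST" "$BACKUP_DIR/$DST"
    fi
    mkdir -p "$WEBAPP_ROOT/$(dirname "$DST")"
    cp "$PATCH_DIR/$SRC" "$WEBAPP_ROOT/$DST"
done < "$PATCH_DIR/manifest"

# Record the patch so the sequence of applied patches is maintained.
echo "$(date -Iseconds) $(basename "$PATCH_DIR")" >> "$APPLIED_LOG"
```

Rollback would then be the reverse of the apply pass: copy the files from the backup directory back into the webapp and remove the corresponding entry from the applied-patches log.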

Let me know your thoughts about this framework or any other features that you can think of.

Automation tips and tricks January 2013

I’m starting a new blog series in which I’ll be talking about the various cool things or automations that I, along with my team, have done in a month, and my plans for the next month.

Talking about January 2013, I’ve done the following things:

1.) Streamlining of environments: The big step in streamlining the environments was to change the owner of the application from the root user to the tomcat user & to make the ports of all the applications consistent across environments, i.e. dev, qa, pt & staging. This will help me in my long-term goal of introducing a server configuration tool, most preferably Puppet.
2.) Log Analyzer Utility: One of the major challenges that teams face is getting real-time notifications of any exceptions that occur in the server logs. To overcome this problem we have written a log analyzer utility that scans a log file backed by a meta file; this meta file has the information about who should be notified for an exception. This utility is written in shell script and integrated with the Jenkins CI server so that we can schedule its execution as per convenience; currently Jenkins executes this utility every 15 minutes.
3.) System monitor: Of late we were facing the challenge of servers running out of disk space, & only when the whole system went down were we able to figure out that the issue was a disk space outage caused by huge log files. To overcome this problem we have built a small shell utility that scans a couple of folders recursively and provides a list of the top 10 files whose size is greater than a specified threshold. In our case we have set this threshold to 1 GB; all these variables, such as the folders to scan, the regular expression of the files to consider, and the threshold value, can also be provided as input to this utility. A minimal sketch of this utility is shown below.
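Just to illustrate the idea, a stripped-down version of such a monitor could look like the following; the default folders, file pattern and threshold are example values only:

```bash
#!/bin/bash
# Hypothetical sketch of the disk-space monitor: list the 10 largest files
# above a size threshold under the given folders.
FOLDERS="${1:-/var/log /opt/tomcat/logs}"   # folders to scan recursively
PATTERN="${2:-*.log}"                       # file name pattern to consider
THRESHOLD="${3:-+1G}"                       # find(1) size threshold, e.g. +1G

# Find matching files above the threshold, sort by size, keep the top 10,
# and print the size in GB next to the path.
find $FOLDERS -type f -name "$PATTERN" -size "$THRESHOLD" -printf '%s %p\n' 2>/dev/null \
    | sort -rn \
    | head -10 \
    | awk '{ path = substr($0, index($0, " ") + 1);
             printf "%6.2f GB  %s\n", $1 / (1024 * 1024 * 1024), path }'
```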

This is what we achieved in the month of January 2013. Although these utilities may seem obvious and simple, the effect they have on the productivity of the team is considerable.

Now for the plans for the month of February 2013. I usually choose those things which we are doing manually; this month we will be working on the following things:
1.) A utility which can perform automated merges where possible
2.) A utility that can automatically upload artifacts to a central server (Artifactory in our case)
3.) Integration of common Git operations with Jenkins