Najam Sikander Awan

Covering the web stack along with Microsoft technologies

Continuous Integration & Deployment for .Net Projects TL;DR


I have been involved in the development of small and large web applications for quite some time now. While using the latest tools and technologies for development, I always felt my workflow was from ancient times. Recently I became obsessed with automation and decided to improve my development workflow. I knew I needed to do something about the build process, testing the build, deploying the latest build, and notifying team members that a new build is available.

I started looking around, playing with tools from different vendors, and finally shortlisted a few tools to start with. The following tools made the list:

  • PowerShell
  • Slack
  • GitLab
  • TeamCity
  • Octopus Deploy

If you are thinking that's a lot to get started with automation, then I recommend learning PowerShell if you are using Windows and .NET technologies. With PowerShell alone you can do most of the things I am about to share with you.

Sprint Killings

We are using Scrum for development with a two-week sprint cycle to get things out. Short sprints have their benefits, but they also kill if you are doing testing and deployment manually, and the pain reaches maximum intensity if you need to push code to multiple servers (Dev, QA, Staging, Production).

We have been using some tooling for writing code and communicating with each other that was not modern, and it hurt us after a while. The following list contains a few common tools in use by our local industry:

  • TFS and SVN - for version control
  • VyChat & Skype - for communication
  • Zero unit tests (well, a few devs were submitting their forms to check the output and called that unit testing. I am not joking.)
  • Zero integration tests (again, a manual process followed by devs and QA)
  • Deployment using FileZilla (yes, this is the holy grail of deployment)

A few team members had major issues with SVN and TFS as source control, especially when they were not using Windows. I have been using Git for some time now and love it, so I recommended Git and knew we couldn't move ahead without installing a local Git server. After some research I found GitLab, which has a free option available. We are running GitLab on an Ubuntu 12.04 box and so far it's working smoothly.

Co-workers who had taken beatings from SVN fell in love with Git, and we all thought it was the only change we needed in our workflow/tooling. Well, that was a mistake.

Problem 1: Who Moved The Code?

“Is it OK to pull the latest code from the remote branch?” GitLab was giving us information about what changed in the latest commit and by whom, but it couldn't tell us whether the code was free from compile-time errors. That created fear among the devs, and they came up with a solution called “Push-Pa”.

Manual Solution: Push-Pa

To solve the above problem, our team agreed to follow a process which dictates that before pushing your code to version control you pull the code first and merge it with your local changes. Then compile the merged code, test it to check that every feature still works, and only then push it to the remote branch.

Compiling code is no rocket science, but it was painful to do it manually for each commit. The team started to feel the burden but kept doing it; in fact, they named this push-pull exercise “Push-Pa”.

Real Solution: TeamCity

I had known about TeamCity for a while and had used it with TFS source control for a big Web Forms based web application. At that time I used it to get notifications on whether the latest commit compiled without errors.

Since we are now using GitLab for our latest projects, the first task at hand was to integrate TeamCity with the GitLab server to monitor the latest commits in a specific branch.

Problem 2: Deployment Nightmare

Our deployment process was from the 80's or 90's, if FTP existed back then. The first step was to pull the latest code from source control. Then manually make changes for the server environment in config files and other environment-specific code (if you are not using config transforms and environment checks). Once that's done, create a zip file and upload it to the server via FTP.

On the server we need to unzip the latest code and create a backup of the running website. Then overwrite the files and test to see if the deployment works. If it fails, restore the backup; if it works, pray to God and give thanks for the blessings.

Solution: Octopus Deploy

There are many solutions out there to handle deployments, some bundled with integration features. In a few talks related to DevOps I came across Octopus Deploy. It felt nice, easy, and powerful, and the best part was that it targets .NET devs with the slogan “Automated deployment for .NET”.

Our team is currently testing Octopus Deploy, and it's linked with our TeamCity server. It takes the package that TeamCity compiles and transforms the code and settings for our deployment servers. Once the code is adjusted for the web servers, it is uploaded and deployed automatically. A backup for each deployment is managed by Octopus Deploy, so you can roll back to a previous release anytime you want.

Configuring and integrating the Octopus server with TeamCity and GitLab was a little hard, but I got a minimal setup working in a fair amount of time. Now it's much easier to deploy code, and we can develop our project faster and push small changes/fixes frequently while keeping our sanity intact.

Problem 3: Are We There Yet?

Achieving continuous deployment was fun and I felt good about it, but one problem was still sticking around: team members asking each other whether the latest code had been deployed. I could have set up email notifications on TeamCity and Octopus, but I knew different people value email differently. Some will wait for it and read it to get the latest updates; others will apply a filter so these kinds of emails skip their inbox.

In my view, having these updates inside the team chat room was the best option, but it's hard to integrate notifications with VyChat or Skype. This is why my first mission was to convert each member to Slack and make sure everyone is using it. (Still working on that front.)

Solution: Slack

Slack offers integration points and a REST API. Using the API, our Octopus server pushes updates about deployments, such as whether a deployment succeeded or failed. After the deployment notification, another notification about the same release updates the team chat room with the status of the smoke tests (PowerShell scripts).
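
For reference, here is a minimal sketch of pushing a message to Slack from PowerShell through an incoming webhook; the webhook URL, channel, and message text below are placeholders, not our actual configuration.

#post a deployment notification to Slack via an incoming webhook (placeholder URL and channel)
$webhookUrl = "https://hooks.slack.com/services/XXXX/YYYY/ZZZZ"
$payload = @{
    channel  = "#deployments"
    username = "octopus-bot"
    text     = "Release 1.0.42 deployed to Dev successfully."
}

#Slack expects a JSON body, so serialize the payload before posting
Invoke-RestMethod -Uri $webhookUrl -Method Post -Body (ConvertTo-Json $payload) -ContentType "application/json"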

Problem 4: Smoke Testing

The web application we are currently developing has many pages of different natures: public pages, secure pages, redirect pages, and error pages.

One time we got a complaint that one or two public pages were not working for users. Another time, pages that were supposed to be secure and ask for user credentials were open to everyone. Even though we have a QA department that takes proper time to test, these issues could still creep back in.

Solution: PowerShell

To avoid future complaints in this area, I developed a PowerShell script that has a list of pages (public, secure, 404, 500), makes requests to those pages, and pushes notifications to Slack according to the responses. That PowerShell script is executed by Octopus on the remote server after each deployment.
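
My actual script isn't shown in this post, but the idea looks roughly like the following sketch; the URLs and expected status codes are hypothetical examples, and the Slack call would reuse the webhook pattern shown earlier.

#post-deployment smoke test: each entry maps a URL to the status code we expect (hypothetical values)
$pages = @(
    @{ Url = "http://example.com/";        Expected = 200 }   #public page
    @{ Url = "http://example.com/admin";   Expected = 401 }   #secure page should demand credentials
    @{ Url = "http://example.com/missing"; Expected = 404 }   #known missing page
)

$failures = @()
foreach ($page in $pages) {
    try {
        $response = Invoke-WebRequest -Uri $page.Url -UseBasicParsing -ErrorAction Stop
        $status = [int]$response.StatusCode
    } catch [System.Net.WebException] {
        #non-2xx responses throw; read the status code from the exception response if there is one
        $status = if ($_.Exception.Response) { [int]$_.Exception.Response.StatusCode } else { 0 }
    }
    if ($status -ne $page.Expected) {
        $failures += "$($page.Url) returned $status, expected $($page.Expected)"
    }
}

#build the message for Slack and send it with Invoke-RestMethod as shown earlier
if ($failures.Count -eq 0) { $message = "Smoke tests passed for the latest deployment." }
else { $message = "Smoke tests FAILED: " + ($failures -join "; ") }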


We have made modifications to our local IT infrastructure: we have some new servers running GitLab, TeamCity, and Octopus Deploy. GitLab is configured on an Ubuntu 12.04 box; for the rest of the tools we are using the Windows platform.

Download the TeamCity and Octopus Deploy setup files from their respective websites.

Installing TeamCity

Installation is fairly simple, and for the database I wanted to use Microsoft SQL Server. The TeamCity installation wizard guides you to download the SQL JDBC driver. Unzip the file, look for the 4.0 driver .jar file, and copy it to the lib folder of the TeamCity installation directory.
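
As a tiny illustration, the copy can be done from PowerShell; both paths below are hypothetical and depend on where you unzipped the driver and installed TeamCity.

#hypothetical paths: the unzipped driver location and the TeamCity installation folder will differ on your machine
Copy-Item "C:\Downloads\sqljdbc\enu\sqljdbc4.jar" "C:\TeamCity\lib\"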

Now go back to the TeamCity installation wizard and test whether the driver is loaded. If there are no errors, you can specify the connection details for your SQL Server database.

After the database setup, TeamCity will prompt you to accept the terms and conditions and then ask you to create an admin account.

Creating a Build: GitLab

The first thing is to create a new project; next, create a build configuration for the new project. Since the source is on GitLab, you need to check the version control settings inside TeamCity. Make sure your settings are similar to mine.

You can tell from the screenshot that our TeamCity build will monitor the dev branch of our project, and as soon as a new commit arrives it will be picked up and processed.

Now we need to configure the build steps. Right now I have only two: the first downloads the code of the latest commit and compiles it using the method I have chosen; the second packages the compiled code and passes it on to Octopus Deploy. You can see my build step configuration in the following screenshot.

If you don't see the OctoPack section in your TeamCity installation yet, you need to install the OctoPack plugin. Installation is dead simple: copy the plugin into the TeamCity plugins folder (C:\ProgramData\JetBrains\TeamCity\plugins), restart the TeamCity Windows service, and refresh your browser. You will then have an OctoPack section like the one in the screenshot.
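
A quick sketch of those two steps in PowerShell; the plugin file name is a placeholder, and the service name is an assumption (on default installs the TeamCity server service is usually called TeamCity), so adjust both for your setup.

#copy the downloaded plugin package into the TeamCity plugins folder (file name is hypothetical)
Copy-Item "C:\Downloads\octopus-teamcity-plugin.zip" "C:\ProgramData\JetBrains\TeamCity\plugins\"

#restart the TeamCity Windows service; "TeamCity" is the usual default service name
Restart-Service -Name "TeamCity"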

Installing Octopus Deploy

Installation is dead simple. Once you have installed the Octopus Deploy server, you can run a Tentacle setup on any deployment server (Staging, Testing, Production).

Let’s configure our project inside Octopus Deploy now. Create a new project and name it.

Create a new environment and make sure you add your Tentacle or deployment server to that environment. I have one environment called Development, but later on I will add more, like Staging and Production.

To integrate Octopus Deploy with TeamCity, we need to create an API key. API keys are tied to an Octopus Deploy user, so you can create a key by visiting Users, picking a user, and following the link to create a new key.

We need to tell Octopus how to download the TeamCity output. For that, go back to TeamCity and configure a NuGet feed. Go to the TeamCity administration area; on the left sidebar, under the Integrations heading, you will see a link called NuGet; click on it for configuration.

Now you need to add that feed inside Octopus Deploy, which you can do from the Octopus Deploy dashboard. Simply click on the Library link in the top navigation bar, and you will see an “External Feeds” link on the left sidebar. The External Feeds page lets us add the TeamCity feed and test it. For testing, click on the test link; on the new page you will see a search field and a search button. Click the search button and you will get results from TeamCity.

With all the pieces of the puzzle in place, we can now define the deployment process for our project. Octopus Deploy breaks the process into steps that you need to define and configure.

We start with our first step, called Deploy, and as its name suggests, it deploys the latest build from TeamCity to whichever environment we have picked.

Slack Notifications

By this time you should have a pipeline connecting your GitLab server to your deployment server using TeamCity and Octopus Deploy. It's time to inform the team that a new build has been deployed. To configure Slack with Octopus Deploy, visit the Octopus Deploy dashboard again and click on Library. On the left sidebar, click the “Step templates” link, and then in the top menu you will see another link, “Import step template”. Octopus Deploy has a community library, and from there I imported the Slack integration script.

Now you can add a new step to the process defined earlier. Follow the screenshots to match my configuration.

Smoke Testing Deployment

Remember the problem I mentioned earlier in this post about pages slipping through QA, which I solved by writing a PowerShell script? Well, running that test script manually was not good enough. I configured my process to run the PowerShell script that tests the pages and announces the test result in Slack. Follow the screenshots for my configuration.

Test Run

I have recorded a screencast showing everything connected and working; I hope it will give you a better understanding.

What’s Next

  • I need to add more environments to Octopus Deploy, especially Production, so I can promote a build from the testing server to the live server without FTP and have support for rolling back.
  • Need more PowerShell scripts to clean up build files older than a month to save disk space (see the sketch after this list).
  • Need to integrate Twitter to post tweets about server status and post readings to Slack. Monitoring bandwidth, hard disk space, and some other events comes in handy.
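
For the cleanup item above, a minimal sketch, assuming all build artifacts land in one folder (the path and the 30-day cutoff are placeholders):

#delete build artifacts older than 30 days; the folder path is a placeholder
$buildFolder = "D:\Builds"
$cutoff = (Get-Date).AddDays(-30)

Get-ChildItem $buildFolder -Recurse -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force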

LCL 2014: Third Annual Conference Islamabad


The LCL 2014 third annual conference was held on 15th November 2014 at the Jinnah Convention Center. It was a full-day event targeting students, entrepreneurs, and industry on a large scale. Due to the marketing hype, the organizers, and the speakers list, most people set their expectations very high, including me.

I was interested in this event for a couple of reasons: for starters, networking; interaction with students for CodeBar; and last but not least, learning something new and useful. My fear was spending a full day and achieving none of the goals I had set for this event, at the cost of development time required by CodeBar. Let me break down my day at this event and point out what went wrong, what worked, and the result.

Event Day

The Jumpstart Pakistan team and I gathered at our meeting place, picked up the equipment for our promised stall, and started our journey to the Convention Center.


We wanted to reach on time to set up our promised stall, but after spending time moving up and down we decided not to put up any stall. There were no power sources for our equipment, the tables that were arranged after a long wait were not aligned, and one fell over with a laptop on it. While we were figuring out what to do next, the organizers were figuring out when to start. For unknown reasons the event started late, and this affected the execution of the event.


There was no WiFi available to participants; our team's WiFi was shared with a limited number of participants, including the LCL social media team.

Power sockets for charging

In the absence of power sources to charge equipment, my laptop died. A few people were roaming around trying to find somewhere to charge their devices, all while the organizers were telling everyone to use #LCL2014 on all social media (Twitter, Facebook, Instagram).


I must say a large audience might like this kind of speech-themed conference, but to me it was not exciting. The speeches were just for motivation, not focused on specifics. Interaction was down to zero due to time limitations. A few speeches were made to win the crowd, while a few that felt dry to many actually had some value. One such speech was greatly disturbed by the arrival of Mr. Asad Umer. One more thing about the speeches: I have listened to many of them and they now feel redundant to me. I have heard some great and wise personalities like Mr. Asad Umer and Rozee guru Mr. Monis, but now, instead of Engro or Rozee, I want to know what other successes they have had; what small things they created that became big and made a name of their own.


LCL 2014 had some big names that gained lots of attention, but it's hard for beginners or students to connect with them to seek help or guidance. I personally feel there are many local entrepreneurs who are more reachable for me and many other people, and they should be on stage at large events.


I was looking for water after lunch but couldn't find any on my own. That was a really big downer.


I have been to the “Convention Center” before, and I think it's a really bad place to arrange any event or conference for tech people.

For tech events, Internet and a power supply are must-haves, and you can't make a positive impact without them, so always have a dedicated connection for the audience and digital content publishers.

It was an effort, and I know it's almost impossible to organize and execute a large event without making someone really unhappy. I hope the LCL team keeps doing good work and rectifies the issues one by one for their future events.

Thanks for reading.

Automating Ftp Uploads With Powershell


In my last post I shared how you can archive a folder via PowerShell and Pscx. In my normal backup process I used to upload zip files using an FTP client like FileZilla, but now I am trying PowerShell to perform this task.

Please feel free to test the following code and let me know your thoughts and improvements.

#upload file(s) to an ftp server
function FtpUpload($file, $ftphost, $ftpuser, $ftppass){
   $Dir = Split-Path $file -Parent
   $filename = $file.Replace("$Dir\", "")

   #ftp client with credentials
   $webclient = New-Object System.Net.WebClient
   $webclient.Credentials = New-Object System.Net.NetworkCredential($ftpuser, $ftppass)

   #upload every file in $Dir matching the name/pattern
   foreach($item in (dir $Dir $filename)){
       "Uploading $item..."
       #$ftphost should be a full ftp url with a trailing slash, e.g. ftp://host/folder/
       $filepath = $ftphost + $item.Name
       $uri = New-Object System.Uri($filepath)
       $webclient.UploadFile($uri, $item.FullName)
   }
}

#example: upload all zip files from the folder (adjust the path and ftp details)
FtpUpload E:\projects\AspNet\MVC\*.zip ftpHost ftpUser ftpPassword

I have shared this script on GitHub; adjust it according to your needs and share your results.

Please feel free to suggest improvements, and if you have any questions, use the comments area.

Thanks for your time.


Archive Folders With Powershell


When your web application is running on a web server and you don't have an automatic backup service available from your hosting provider, taking manual backups is a really frustrating job.

I have the Pscx module installed on my PC, and it has a nice little command called Write-Zip, so I decided to write a custom PowerShell function that creates zip archives for me. You can use the Windows Task Scheduler to invoke any PowerShell script, so you can take advantage of this to have automatic daily or monthly backups (see the scheduling sketch after the script).

#requires the Pscx module (for Write-Zip): Import-Module Pscx
#$target is the folder you want to zip
#$destination is the path where you want to create the zip file
#$outFile is the file name of the generated zip file
function CreateZip($target, $destination, $outFile){

    #today's date, used in the default file name
    $date = Get-Date -Format yyyy_dd_MM

    #if no output file name is given, build one from the target folder name
    if ($outFile -eq $null -or $outFile -eq ''){
        #e.g. $target = "E:\projects\AspNet\MVC\newFolder" gives newFolder_<date>.zip
        $a = Split-Path $target -Parent
        $outFile = $target.Replace("$a\", "") + "_$date" + ".zip"
    }

    #if no destination is given, create the zip file in the target folder
    if ($destination -eq $null -or $destination -eq ''){
        $destinationPath = Join-Path $target $outFile
    } else {
        $destinationPath = Join-Path $destination $outFile
    }

    #suppress the pipeline output so only the path is returned
    Write-Zip $target -IncludeEmptyDirectories -OutputPath $destinationPath | Out-Null
    return $destinationPath
}

#this command zips the folder E:\projects\AspNet\MVC\newFolder into a zip file under E:\Projects\AspNet\
$bakfile = CreateZip E:\projects\AspNet\MVC\newFolder E:\Projects\AspNet\
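
To make this run automatically, here is a minimal sketch of registering the script with the Task Scheduler from PowerShell; the script path, task name, and time are placeholders, and the ScheduledTasks cmdlets require Windows 8 / Server 2012 or later (schtasks.exe works on older systems).

#register a daily task that runs the backup script at 2 AM; the script path is a placeholder
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-File C:\scripts\backup.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "NightlyBackup" -Action $action -Trigger $trigger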

I have shared this script on GitHub; adjust it according to your needs and share your results.

Please feel free to suggest improvements, and if you have any questions, use the comments area.

Thanks for your time.


Sending Emails With Powershell


I am managing different servers, and each of them has different responsibilities.

I am learning PowerShell and trying to build Lego blocks that will eventually become recipes for automating tasks on different servers. One important block is the email report; if you don't have SMTP configured on the server, you can use the following PowerShell script with Gmail's SMTP details.

I will open source on GitHub many of the small PowerShell scripts that you can use to build something cool.

#sender, recipient, and message details (fill in your own values)
$EmailFrom = ""
$EmailTo = ""
$Subject = "emailing powershell report"
$Body = "this should work on remote server as well. test email"

#SMTP server, e.g. smtp.gmail.com when using a Gmail account
$SMTPServer = ""

#port 587 with SSL is what Gmail's SMTP endpoint expects
$SMTPClient = New-Object Net.Mail.SmtpClient($SmtpServer, 587)
$SMTPClient.EnableSsl = $true
$SMTPClient.Credentials = New-Object System.Net.NetworkCredential("gmailUser", "gmailPassword")
$SMTPClient.Send($EmailFrom, $EmailTo, $Subject, $Body)

Hope this helps someone. Feel free to improve this script and then share it with me; you can file bugs on GitHub.

Thanks for your time.


Open Letter to PTCL - Desi Version


Following is the email I sent to PTCL. It's the unplugged, raw version, so don't mind the mistakes.

Since I shifted to a new house (the date corresponds with the address change; you can check it in your records), I have been facing DSL connection issues along with line disturbance.

I have tried all mediums to get help towards a permanent solution but had no luck; that involves talking to agents, exchange staff, emailing the billing team, and the PTCL Twitter account. Every time the situation reaches the worst point, where I can take no more, they promise me some fix that works for 5 days at best.

I have a number of issues: I was charged for the address change months ago but am still not getting a printed bill. I mentioned this to a call center agent who promised it would be fixed, but as you know PTCL better than I do, you can guess the result.

  • The line is noisy and causes data loss even when there is little or no noise; a simple ping command can show you that. But the “technical” staff who mostly visit me don't know about ping; they just open a website and say, “see, the website works at this very instant, it's working, we can't do anything else.” The lineman is also the king of the UAE, I guess; even PTCL staff don't have the power to call him when they need him. I am not joking; one of the PTCL DSL department guys once told me, “Sir, I am tired of calling again and again; that is why I am standing outside your house. Now you yourself should ask someone at PTCL for the lineman; I can't do anything.”

  • DSL stability: the speed is slow most of the time. Nowadays, if the phone rings, the net disconnects; if someone picks up the receiver, the net disconnects; if you are talking on the phone, the net disconnects.

  • Call center: every time the same questions, the same promises, but no result. Yesterday, 24 Jan 201, I talked to your agent between 9:00pm and 10:00pm. It was a long call and I wanted him to terminate my services, but he said 24 hours were required and that he was personally involved with my complaint. This morning when I got out of bed, I checked an SMS from PTCL saying my complaint no 93 had been closed yesterday around 10:00pm.

Filled with anger, I called your call center again. The female agent asked me what the issue was, as if there were no complaint history in front of her. I asked for a manager, but that manager is not for the common people of Pakistan, so after her promises, which were nothing new, she said to talk to a supervisor, which I refused, and asked her to transfer my call to the billing department.

I do believe the same person picked up my call as the billing representative, and when I told her to remove my DSL and phone line from billing, she started making the same promises. When I explained the whole history to her, she said that any time I face an issue with the line or DSL, it can be adjusted in the billing. Imagine my surprise, given my complaint history in which I faced a dead phone line for 15 days and was never told of this by any call center agent. So this is one new problem I discovered, called “theft”. Anyway, she said that before the day ends my problem will be fixed, that she will take this matter up with her seniors, the army chief, and the prime minister of Pakistan, and that I will not face this kind of issue again.

So, based on the above, if you have read everything I wrote after wasting 30 more minutes of my life with the PTCL tag on it, kindly remove me from billing (landline and DSL) if my issue is not fixed by the end of the day.

I will pay the latest bill I have already printed, and after that I will not pay anything in case you are still keeping me in your billing records.



MongoDB Setup on Windows 8


I am very much into learning new tools and burning hours of my life trying to figure them out. To track what's new on the development and technology horizon, I keep my eyes on sessions, conferences, blogs, and screencasts.

In the past few days I wanted to explore NoSQL options, two in particular: RavenDB and MongoDB. I successfully completed the environment setup for RavenDB but haven't used it in any project yet. On the other hand, I have built a new website for my current employer using MVC4 and MongoDB. I have learned a lot, and in this blog post I will share a few commands to set up MongoDB and deploy it on a remote server.

You can download the MongoDB zip file from the official downloads page; I picked the 64-bit version.

Installing MongoDB as a Windows service

To install MongoDB, open a command prompt and navigate to the bin folder; in my case it was D:\mongodb\bin.

Then run the following command to install it:

D:\mongodb\bin>mongod --dbpath=D:\mongodb --logpath=D:\mongodb\log.txt --install

Then you can go to Windows services, look for the MongoDB service, and simply start it. If you face any error during installation, make sure you launch the command prompt as administrator and that the log path used in the above command exists on your hard drive.
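
You can also start it from PowerShell instead of the services console; a small sketch, assuming the default service name MongoDB that mongod --install registers:

#start the service and confirm it is running; "MongoDB" is the default service name
Start-Service -Name "MongoDB"
Get-Service -Name "MongoDB"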

If for some reason you want to uninstall the MongoDB service, run the following command in the command prompt after stopping the service from system services.

D:\mongodb\bin>mongod --remove

GUI to manage MongoDB

You can interact with MongoDB databases and collections using the command prompt, and it will make you a master of the little queries for small, frequent tasks. In my case I wanted a simple GUI to perform simple tasks, and I came across RoboMongo.

Installation is simple. Once the tool is installed and you have your MongoDB service up and running, click on the create button shown in the following screen.

Robo Mongo Connection Screen

Proceed with the defaults; for the address I have localhost, using port 27017. After the connection is created, you can click on the connect button and it will show you the following screen listing your databases and collections.

Robo Mongo Showing Databases and Collections


After completing v1.0, I was asked to deploy our new website on a remote server. The deployment was not a big deal, but to deploy the local MongoDB database onto the remote server I had to read about two commands: mongodump and mongorestore.

First, to create a backup of my local MongoDB database, I ran the following command:

mongodump --db DBName

The above command will generate a folder called DBName under MongoDB_Installation_Folder/bin/dump; inside the generated folder you will find .bson files for all the collections your database contains. I uploaded this folder to the remote server.

Now, to restore MongoDB on the remote server, I used the mongorestore command. The syntax is simple; you need to supply foldername/dbname to restore:

mongorestore dump/DBName

Once the restore completes successfully, you can check your database on the remote server using RoboMongo.

I hope this will help someone starting with MongoDB. Thank you for reading my blog.

Empower Your Outlook With Powershell


I needed to generate an email log that contains every email from our clients; these are normally stored in their own separate folders. One way was to copy and paste the email headers and bodies into a Word document. The second was to use PowerShell and get all the emails using a script.

After playing a little with PowerShell and googling a few of its basics, I completed my task with PowerShell.

The first function is used to select a folder under your Outlook inbox; in my case it's Jun 2013, which sits under EmailLog, which is under my inbox. I save the result of this function into a variable, which is then passed into a second function for HTML creation.

The second function takes the email objects and wraps HTML markup around them before saving them into an HTML file.
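
The original snippets aren't reproduced here, but a minimal sketch of that two-function approach, using the Outlook COM object, could look like the following; the folder names and the output path are placeholders, and Outlook must be installed on the machine.

#grab every mail item from a folder nested under the inbox, e.g. Inbox\EmailLog\Jun 2013
function Get-OutlookFolderItems($parentFolderName, $folderName) {
    $outlook   = New-Object -ComObject Outlook.Application
    $namespace = $outlook.GetNamespace("MAPI")
    $inbox     = $namespace.GetDefaultFolder(6)          #6 = olFolderInbox
    $folder    = $inbox.Folders.Item($parentFolderName).Folders.Item($folderName)
    return $folder.Items
}

#wrap the emails in simple HTML markup and save them to a file
function Export-EmailsToHtml($emails, $outFile) {
    $sections = foreach ($mail in $emails) {
        "<div><h3>$($mail.Subject)</h3>" +
        "<p>From: $($mail.SenderName) on $($mail.ReceivedTime)</p>" +
        "<pre>$($mail.Body)</pre></div>"
    }
    "<html><body>$sections</body></html>" | Out-File $outFile
}

$emails = Get-OutlookFolderItems "EmailLog" "Jun 2013"
Export-EmailsToHtml $emails "C:\temp\email-log.html"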

Screen Calipers


I have used this tool for my web projects where clients require pixel-perfect HTML for all browsers. Since this tool has helped me a lot in the past, I thought I would share it with everyone.

screen calipers

Testing jQuery Code via YUI Test


The term TDD is not new to me; I have heard it quite often but never took time to go into the details until the last few days. I have been checking out stuff from the YUI team and like their YUI Test framework for now, but in the future I might switch to QUnit (the jQuery testing framework) as I am a big fan of jQuery and use it daily.

There are different versions of YUI, like 2.0 and 3.0, and a standalone version of YUI Test that has no dependency on the YUI framework, according to a YUI Theater video.

So, for learning purposes, I wrote a few lines of jQuery to set the color of a paragraph on mouse hover. I wrote a test using YUI; in the test I simulate the mouse hover event and then check the paragraph's CSS color property using jQuery :).

You can view the code at