Wednesday 18 December 2013

Search the whole SVN repository for a given filename

The SVN repository at work is huge, and I don't have the disk space to check out the whole thing, branches and all, on my small (but very fast) laptop SSD. But I needed to search the whole repo for a file; the following command lines can help.

Windows

svn list -R https://subversion-repo/subfolder | findstr filename

Nix

svn list -R file:///subversion-repo/subfolder | grep filename

These commands don't look through the history but will find things at the current HEAD of the repository.

If you want to look at a particular point in time you can specify the revision thus:

svn list -r 1234 -R https://subversion-repo/subfolder | findstr filename

where 1234 is the revision to search through.

If you want to search the entire history you could script the search to look through every revision from 1 to n, list the files that match at each revision, then remove duplicates to get a single list. You could get even fancier by recording the revision at which a file first appeared and the revision at which it was deleted. I have no requirement to do this right now, but it sounds like an interesting little project to try.
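If you fancy trying it, a brute-force node sketch might look like the following (untested against a real repo; it assumes svn is on the PATH, the repo URL and filename are placeholders, and it runs one svn list per revision, so it will be slow on a big repository):

var execSync = require('child_process').execSync;

var repo = 'https://subversion-repo/subfolder'; // placeholder URL
var filename = 'filename';                      // placeholder search term
var head = 1234;                                // highest revision; take it from svn info

var firstSeen = {}; // path -> first revision at which the path matched

for (var rev = 1; rev <= head; rev++) {
  var listing;
  try {
    listing = execSync('svn list -R -r ' + rev + ' ' + repo, { encoding: 'utf8' });
  } catch (e) {
    continue; // the subfolder may not exist at this revision
  }
  listing.split('\n').forEach(function (path) {
    if (path.indexOf(filename) !== -1 && !(path in firstSeen)) {
      firstSeen[path] = rev;
    }
  });
}

Object.keys(firstSeen).forEach(function (path) {
  console.log('r' + firstSeen[path] + '\t' + path);
});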

If you want to search for text within files, I find searching the diffs useful. Just pipe the output of the following into a file and search that in your favourite editor (Sublime Text :-)

svn log -r1234:HEAD --diff https://subversion-repo/subfolder

This can be rather verbose, but with a bit of tweaking and targeting of the repo/folder you can get some accurate results when searching for text in history.

Sunday 15 December 2013

Personal Backup strategies

It's been on my mind of late that I don't have a very good backup strategy in place for my own things at home. I've got many gigs of photos, code, documents and videos that are locally backed up, but all over the place and not very consistently, and then there is gmail and the 4 gig of emails in there. So I'm doing something about it.

The solution is:

Dropbox

I use dropbox for cloud sync and storage. This is not backup. I use it to get easy access to files from anywhere, but if I accidentally delete or change something, the change is propagated straight to dropbox, so (unless you have Packrat) it's quite hard to undo the change or get to an older version.

I keep a local copy of all the dropbox files on my home server.

gmail

A weekly download of all gmail to local machines using gmvault. The official guide to setting up
is here, but Scott Hanselman did a great write-up of how to do this here.

This boils down to two commands. The first for the initial sync, the second for incremental backups on top of the same folder structure.
gmvault sync youremail@gmail.com -d D:\foldertosaveto
gmvault sync -t quick youremail@gmail.com -d D:\foldertosaveto

Output of my initial run. Yes, it took a while...
================================================================
Sync operation performed in 2h 36m 35s.
Number of reconnections: 70.
Number of emails quarantined: 0.
Number of emails that could not be fetched: 0.
Number of emails that were returned empty by gmail: 0
================================================================

Scheduled job

I have set up a scheduled job (in Windows Task Scheduler) which runs a script every Friday that backs up the week's email to my hard disk. The script is just a simple .bat file with the following contents:
gmvault sync -t quick youremail@gmail.com -d D:\foldertosaveto
You will need to make sure that gmvault is on your path if you do it this way. Setting up scheduled jobs is easy too; there are loads of online tutorials, here is one for Windows 8.
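If you prefer the command line to the Task Scheduler UI, something like this should create an equivalent weekly task (the task name, script path and start time here are placeholders, so adjust to taste):

schtasks /create /tn "GmailBackup" /tr "D:\scripts\gmailbackup.bat" /sc weekly /d FRI /st 18:00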

Amazon glacier

  1. Sign up for Amazon Glacier; you will need your credit card for this (first you need to sign up for an Amazon AWS account)
  2. Once logged in, create a key pair (an Access Key ID and Secret Access Key) and save them to your machine.
  3. Go to the Glacier console and create a vault for each type of backup you are planning to do. I've created two for now: one for my photos and one for my mail backups. I might create another for music later.
  4. Ensure you have chosen the data centre closest to you for the vaults. Both of mine are in EU Ireland.

Cloudberry online backup

I use cloudberry online backup to do the heavy lifting of actually sending all my files up to Amazon:
http://www.cloudberrylab.com/amazon-glacier-storage-backup.aspx#amazonglacier. It's great: you just set up some backup plans and a schedule, and cloudberry does the rest. It's not free, but it's really quite cheap given what it does and how well it does it.
  1. Install the cloudberry online backup desktop version (download from: http://www.cloudberrylab.com/amazon-s3-cloud-desktop-backup.aspx )
  2. Add a Glacier cloud storage account (File -> Amazon Glacier)
  3. Follow the wizard - it's really easy
  4. Go to the backup plans tab and create a new plan, or use a predefined plan
  5. For my gmail backup I created a new plan
  6. Click the backup wizard (backup files). Again, a really easy wizard to follow: select the Glacier account/vault, the files to back up, and the schedule. So easy.

Costs

I'm storing 150 gig in Amazon Glacier, and that costs me £1.50 per month; I can store as much as I like, practically unlimited storage. Be careful though, because it costs a lot more to get data out. But that's OK, right? This is emergency backup. You might be able to get your files back from dropbox, local backup etc.; Glacier is the long-term emergency backup we all need.

Summary

The whole point was to get all the files that I care about into cheap storage with multiple redundant backup locations, so if/when I lose some data I can get it back. Dropbox provides an easy way to get files back, but it's not a total solution; Amazon provides the cheap offsite secure backup that I want for my 150+ gig of data.

Comments from https://news.ycombinator.com/item?id=6927659

* by drdaeman

Isn't Glacier overpriced, compared to other personal backup solutions?

Say, I have a mere 2TiB of historical data (various junk I made or collected over the last ten years or so). Storing that with Amazon is $20/mo, and if I want to look at those photos from 2008 I have to wait several hours, just to find that I misremembered where they were stored and pulled out the wrong files. And unless it happened that I uploaded a good amount of data on that exact day, I'll have to pay for downloads.

Other offers for unlimited storage are Cyphertite at $10/mo, Crashplan at $6/mo, Carbonite at $100/yr, AltDrive at $4.5/mo and so on. While they're probably not-so-unlimited (they don't say that, but I guess one won't have much luck storing a petabyte), less respectable than Amazon, and most services lack an API and require the use of not-so-trusty proprietary software that has to be sandboxed properly, Glacier doesn't look like a good deal to me unless we're talking about backing up either quite big data (like tens of terabytes) or relatively small amounts of data (less than 500GiB).

Disclaimer: I have no affiliation with any of the companies mentioned above. It just happens that I'm currently fleeing from Bitcasa (they suck hard) and looking at various options so as not to maintain a self-hosted NAS.

* by tfe

The difference is that I trust Amazon far more than those other companies you mentioned. If they go out of business or even change their "unlimited" policy, you're exposed until you can get your 2TB re-uploaded to another provider. It's a pain and a risk I'm unwilling to take. I know Amazon isn't going to suddenly try to dump me as a customer.

* by damianstanger

Yes, all good points. I have a relatively small data set (< 200GiB) and so my costs with Glacier are less than $2 per month :-)


* by hengheng

I am using Glacier to store a backup of most of my personal data. This includes my home directory, the most relevant photos I have taken as jpegs, my gmvault backup, and that's about it. I do not copy over any movies, music, raw photos or software; this is my last line of defense, so it only needs to cover the essentials. I am under 1€ per month this way, and the backup gets refreshed only every other month or so.

I do have a local server that stores a Windows backup image of my whole laptop, a second hard disk in that server to store a copy of the server, and an external hard disk with a Windows backup at my parents' that gets a refresh every time I am over there. All backups are truecrypt images for good measure, and I have tested recovery. Amazon stores a split truecrypt archive. Recovery cost about 20€ and took a day.

So yes, Glacier is great as a personal backup if you make it part of a larger strategy. To me, this is disaster recovery, and a small price to pay for this kind of insurance on important files and memories.

Sunday 1 December 2013

Script your build and deployment of android cordova apps with powershell

We are developing a new version of our customer-facing solution, across web, iOS and Android, using Cordova (PhoneGap).

I'm a big proponent of build automation, and the classical (recommended?) way of using Eclipse to build and manage the code base was getting me down, so I decided to write some scripts to build and deploy the app to a device or an emulator, or prepare it for release. I also wanted any developer to be able to check out the code and run the scripts to build the app.

I wrote the scripts in PowerShell (sorry!) with some batch files to make the various functions easy to run (I'm developing on Windows 8, by the way).

You can find the scripts here: https://github.com/DamianStanger/AndroidBuildScripts

So how does it all work?

Firstly, it goes without saying that you need your dev environment set up for Android development on the command line with Cordova: http://cordova.apache.org/docs/en/edge/guide_cli_index.md.html#The%20Command-Line%20Interface

As you will know (if you do Cordova development), when you use the command line tools to create a Cordova app, the generated folder is where all your source code is placed; inside it is the www folder where you keep your .js and .html files. The problem is that your source code then sits alongside the automatically generated Cordova files, which is not ideal. So I've created my own source folder where all the code you edit is kept; PowerShell then copies these files to the correct places.

The development process

In a PowerShell console (or DOS cmd if you prefer):

build.bat
emulate.bat or install.bat

Run either emulate.bat or install.bat, depending on whether you are using a real device or not.

That's it. Now, you might notice that build.bat can take a while to run because it sets everything up from scratch, so I created a shortcut that only copies your changes across.

quickCopy.bat
emulate.bat or install.bat

This is all good for general day-to-day dev, but eventually you will want to test a production build on a real device. For this, use the following commands:

release.bat
installRelease.bat

To make this work you must have only a device plugged into USB, or only an emulator turned on (please use Genymotion; it's so much faster than the standard emulator).
The release process signs and aligns your apk for you :-) so when you are ready you just send the apk you have tested to the play store.

The scripts

Here I'm going to show selected lines of code from build.ps1.

For building the app in debug and getting it onto your emulator or phone:

function create {
  cordova create app-cordova-android com.myapp.app myapp
  ...
  cordova platform add android
  ...
  cordova plugin add org.apache.cordova.device
}

function build {
  cordova build android
}

function emulate {
  cordova emulate android -d
}

function installDebug {
  cordova run android -d
}

To build the releasable apk and get it onto your phone, use the following:

function release {
  cordova build android --release
}

function sign {
  jarsigner -verbose -sigalg SHA1withRSA -digestalg SHA1 -keystore ..\appstore\android-keystore\myapp -keypass myappKeyPassword -storepass myappStorePassword -signedjar .\platforms\android\bin\myapp-release-signed.apk .\platforms\android\bin\myapp-release-unsigned.apk myapp
  zipalign -f -v 4 .\platforms\android\bin\myapp-release-signed.apk .\platforms\android\bin\myapp-release-signed-aligned.apk
}

function installRelease {
  adb uninstall com.myapp.app
  adb install .\appstore\APKs\myapp-release-signed-aligned.apk
}

Release versioning

When doing a release to the play store you need to make sure the version numbers are incremented each time; for this I added a helper which updates all the relevant places in the source for you.

Just run:

setVersion 102 1.0.2

This will change all the files that need changing in order to properly put a new version of the app onto the play store.
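For reference, the two numbers correspond to the two version attributes Android reads from AndroidManifest.xml: versionCode (an integer that must increase with every Play Store upload) and versionName (the human-readable version). So after running the helper you would expect to see something like this in the manifest (the other attributes are elided here):

<manifest ... android:versionCode="102" android:versionName="1.0.2">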

Upload to Play Store error
Upload failed
You uploaded a debuggable APK. For security reasons you need to disable debugging before it can be published in Google Play.

Make sure your manifest is set thus:

<application android:debuggable="false" android:hardwareAccelerated="true" android:icon="@drawable/icon" android:label="@string/app_name">


Sunday 17 November 2013

Debug your android applications by capturing/monitoring their http traffic using wireshark

I've always wondered what my phone is telling the outside world, and recently I had the need to actually find out, as I'm developing an Android app for work at the moment. I needed to see what was going over the wire because I was getting some strange problems and could not debug the traffic on the production server.

Setup

Download and install Wireshark: https://wireshark.org/

Disable wifi and mobile data on the phone.

Connect your phone to your laptop/desktop via USB.

Enable internet pass-through. Basically you want your phone's internet traffic to come through the USB cable and out via your computer's network card, so that when you run a Wireshark capture the traffic passes through Wireshark.

Set up a capture filter so that you only capture the data going to and from your phone, not data initiated by the computer itself. I pick the option to 'create a capture with detailed options' and set a capture filter, for example 'host 192.168.15.129', where 192.168.15.129 is the IP address of the phone.

Additionally (or alternatively) you can filter by IP address after capture when viewing the results, with "ip.src==192.168.15.129 or ip.dst==192.168.15.129", where 192.168.15.129 is the IP address of your phone. Or filter by protocol; you probably care about HTTP traffic, so filter on this by entering "http" in the filter box.
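If the phone is chatty you can also combine host and port in the capture filter, for example 'host 192.168.15.129 and tcp port 80' (standard BPF capture-filter syntax) to keep just the web traffic; adjust the port if your API isn't on 80.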

Results

You can get information overload with Wireshark; it takes some getting used to, but if you dig you can find everything you need. Look for the requests you care about by scanning down the Info column and clicking a row. This presents all the packet details, where you can dig as deep as you like into the request.

I use the Hypertext Transfer Protocol section as it's the level of detail I care about. From here you can see the URL and the headers, as well as a link to the packet that contains the response. Simply perfect.

Sunday 27 October 2013

jquery promises wrapped in javascript closures, oh my..

I recently had a question from one of my fellow devs regarding a problem where the values in a loop were not what they expected. This was down to the deferred execution of the success function after a promise had been resolved.

The following code has been simplified for this explanation:

function (userArray) {
  var i;

  for (i = 0; i < userArray.length; i++) {
    var userDto = userArray[i];
    var user = new UserModel();
    user.displayName = userDto.displayName;
    var promise = service.getJSON(userDto.policies);

    promise.then(function(policyDtos){
      system.log("user.displayname : " + user.displayname);
      convertAndStore(user, policyDtos);
    });
  }
}

Input:
userArray = ['bob', 'sue', 'jon']; //* see footer
Output:
user.displayName : jon
user.displayName : jon
user.displayName : jon

service.getJSON returns a jQuery promise; the function actually makes a service call to an external API and so can take some time to resolve. Notice how the variables userDto and user are declared within the loop. This is not best practice in JavaScript, because var declarations are hoisted up to the containing function (next to var i). To a non-JavaScript expert it looks like fresh variables will be created on each iteration, as they would be in C#; in fact there is only one copy each of i, user and userDto, so the values are overwritten on every iteration.
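The same trap in miniature (a stripped-down example of my own, not from the code base in question):

for (var i = 0; i < 3; i++) {
  setTimeout(function () {
    console.log(i); // prints 3, 3, 3 -- all three callbacks share the single hoisted i
  }, 0);
}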

This is the fixed function using closures.

function (userArray) {
  var i,userDto, promise;

  for (i = 0; i < userArray.length; i++) {
    userDto = userArray[i];
    promise = service.getJSON(userDto.policies);
 
    (function(capturedDisplayName) {
      var user = new UserModel();
      user.displayName = capturedDisplayName;

      promise.then(function(policyDtos){
        system.log("user.displayname : " + user.displayname);
        convertAndStore(user, policyDtos);
      });
    }) (userDto.displayName);
 
  }
}

Input:
userArray = ['bob', 'sue', 'jon']; //** see footer
Output:
user.displayName : bob
user.displayName : sue
user.displayName : jon

The introduced immediately-invoked function creates a capture around the value of userDto.displayName, bound to the parameter capturedDisplayName. Now there is a copy of this value for every iteration of the loop, allowing you to use it after the promise has resolved.
You may wonder why the variable promise works as intended, given the problems with user and userDto. That is because .then is attached to the promise object within the same iteration that created it; when the next iteration overwrites the promise variable, only the reference changes, and the object it used to point at is untouched and keeps its callback.
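The same point in miniature (again an illustrative snippet of my own, using a jQuery deferred):

var deferred = $.Deferred();
var promise = deferred.promise();
promise.then(function () { console.log("still fires"); });
promise = null;     // reassigning the variable does not detach the handler
deferred.resolve(); // logs "still fires"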


//* In reality these are more complex objects than simple strings; I'm keeping this simple for readability.
//** The names have been changed to protect the innocent.

Sunday 20 October 2013

HTC One and Vodafone. Removing the Kikin search service

If you're like me then you will hate how the phone manufacturers install all sorts of things on your phone for you (to be fair, I think it's Vodafone, not HTC), and I know it's not as bad as the Galaxy (I've several friends with Samsung phones), but recently my phone started popping up a search service every time I selected (long pressed) a word to copy and paste. Really, really annoying.

Anyway, the Kikin service (http://www.kikin.com/) was really winding me up: software I didn't ask for, didn't want and couldn't remove, but which did get in the way every time I wanted to simply select a piece of text.

So how do you remove the Kikin service?

OK, first the bad news: you can't, well, not without rooting your phone. But you can disable it so it no longer bothers you by popping up all the time.

Settings -> Apps -> All -> Kikin

Sadly you will see no uninstall option, but if you turn notifications off, force stop, then disable, you are done :-) Now the long-press text selection no longer does a web search automatically.
I'm not certain, but since I've done this I have not had Kikin auto-update on me; or it might just be happening behind the scenes.

Vodafone, HTC, et al. Please, please stop installing useless guff on to our phones. Just because you can doesn't mean you should.

Friday 27 September 2013

Problems connecting to a live DB in an MVC 4 web app using EF5.0

I had some real problems this week deploying my latest site live. Everything was running fine locally, or so I thought, connecting to a local SQL Server 2012 DB through Entity Framework 5.0.

On dev I was running through IIS Express on port 41171.

When I deployed to live, the web server would spin and spin and then eventually produce an ASP.NET error saying [Win32Exception (0x80004005): The system cannot find the file specified]

[SqlException (0x80131904): A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)]
...

My first instinct was that SQL Server was not set up right at the hosting provider's end; I was using the correct connection string, after all. But after some back and forth with the hosting provider (1and1) I came to the realisation that the SQL instance (SQL Server 2012) was probably set up right.
I finally removed this section from the web.config that Entity Framework adds in:


<section name="entityFramework" type="System.Data.Entity.Internal.ConfigFile.EntityFrameworkSection, EntityFramework, Version=5.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" requirePermission="false" />
...
...
<entityFramework>
  <defaultConnectionFactory type="System.Data.Entity.Infrastructure.SqlConnectionFactory, EntityFramework">
    <parameters>
      <parameter value="Data Source=.; Integrated Security=True; MultipleActiveResultSets=True" />
    </parameters>
  </defaultConnectionFactory>
</entityFramework>

What does this magic do?
From what I gather, this tells Entity Framework to auto-magically connect to a database by convention when the named connection string isn't found. I managed to ascertain that my code was actually trying to connect to a local SQL Express DB. So I took it out.

After removing that section I got this error message (again after ages of trying to connect):

Server Error in '/' Application.
A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

[SqlException (0x80131904): A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)]
System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) +5296071
...
...

AND now dev was also broken...

At this point you may be directed by Google/Bing/DuckDuckGo et al. to https://blogs.msdn.com/b/sql_protocols/archive/2007/05/13/sql-network-interfaces-error-26-error-locating-server-instance-specified.aspx
but it's not the problem you are looking for... No, you don't need to allow UDP port 1434 or anything like that.

My local connection string was
<add name="applicationName" connectionString="server=localhost; database=localdbname;Integrated Security = true;MultipleActiveResultSets=True" providerName="System.Data.SqlClient" />
My connection string on live was
<add name="applicationName" connectionString="Server=db123456789.db.1and1.com,1433;Database=123456789;User Id=mydbuser;Password=mydbuserpassword;" providerName="System.Data.SqlClient" />
All as provided by the host.

Why was it not connecting? I almost put that config section back, but after debugging I found that inside the application context that was driving Entity Framework the connection string was not there; again it was trying to use SQL Express.

Then I stumbled upon it. My context was configured thus:


public class theappContext : DbContext
{
    public theappContext() : base("databasename")
    {
    }
...

and in Application_Start in global.asax:
Database.SetInitializer<theappContext>(null);
It turns out you need the connection string's name to be the same as the string passed to base("databasename"). In my instance I needed to set it to databasename, not applicationName as I had previously.

So finally these are my connection strings

On dev either:
<add name="databasename" connectionString="server=localhost; database=localdbname;Integrated Security = true;MultipleActiveResultSets=True" providerName="System.Data.SqlClient" />
<add name="databasename" connectionString="server=localhost; database=localdbname;User Id=mydbuser;Password=mydbuserpassword" providerName="System.Data.SqlClient" />

On live:
<add name="databasename" connectionString="Server=db123456789.db.1and1.com,1433;Database=123456789;User Id=mydbuser;Password=mydbuserpassword;" providerName="System.Data.SqlClient" />

And that's what fixed it: simply changing the name of the connection string. Entity Framework was being super clever and making up for bad connection strings locally, which meant that when deploying live I had no hope. Why all this auto-magic?

Tuesday 13 August 2013

Using jasmine to test JQuery

I've been doing lots of JavaScript work of late (mainly in a Durandal SPA app) and wanted to test the following function that uses jQuery.

function update () {
    errorVisibleFlag(false);
    $("input.delete:checkbox").each(function() {
        if ($(this).is(":checked")) {
            var idToDelete = $(this).attr("id"),
                status = $(this).attr("status");
            dataService.delete(idToDelete)
                .fail(function(){errorVisibleFlag(true)});
        }
    });
    get();
}

This function is part of a Knockout view model within a Durandal app.

I came up with the following Jasmine test, which uses spies to verify that the calls were working and some DOM manipulation to give jQuery something to bind to. I've found it works really well.

it('When update is called, delete is called with the correct parameters', function(){
  var inputCheckbox = '<input type="checkbox" class="delete" id="bc87d270-95bc-49bc-9cac-3e903d09590b" status="Pending">',
      inputCheckboxId = "#bc87d270-95bc-49bc-9cac-3e903d09590b";
  $("body").append(inputCheckbox);

  var spyDelete = spyOn(vm.dataService, "delete").andCallFake(getData);
  var spyGet = spyOn(vm.dataService, "get").andCallFake(getData);

  vm.update();

  expect(spyDelete).toHaveBeenCalledWith("bc87d270-95bc-49bc-9cac-3e903d09590b");
  expect(spyGet).toHaveBeenCalled();

  $(inputCheckboxId).remove();
});

Note: the view model vm is injected into the test using require.js, and the function dataService.delete returns a jQuery promise.

Monday 17 June 2013

Life, Death and Education. A couple of inspiring/thought provoking podcasts.

I don't only listen to technical podcasts. In fact there are three podcasts in particular that I listen to a lot: Radiolab, LSE (London School of Economics), and Freakonomics Radio. I usually post technical things on my blog, but this time I thought I'd write about a couple of episodes I have listened to lately: thought provoking, inspirational, amazing, emotional.

How doctors want to die (hint: it's very different from the general public)

Radiolab: The bitter end - http://www.radiolab.org/blogs/radiolab-blog/2013/jan/15/bitter-end/
Thought provoking. A difficult subject, death, but one that I think everyone should talk about. This is a really good episode about how doctors would like to die, contrasting that with the general public and the expectations the two groups have. I would encourage everyone to listen to this, and to get their loved ones to as well; it's great for sparking that difficult conversation.
Extract: We turn to doctors to save our lives -- to heal us, repair us, and keep us healthy. But when it comes to the critical question of what to do when death is at hand, there seems to be a gap between what we want doctors to do for us, and what doctors want done for themselves.

Khan Academy - Reimagining Education

LSE podcasts: Khan Academy - Reimagining Education - http://www2.lse.ac.uk/publicEvents/events/2013/04/20130410t1830vOT.aspx
Inspirational. Remember school? You either loved it or hated it. This podcast contains a great discussion on the education of the future, with some really good thinking on mastery and how education could be better. I've got a little girl now, so we are thinking about her education; I hope the schools she goes to are as forward looking as these guys. I agree 100% that progression through mastery is definitely the way, rather than the old-school approach of a set pace for all.
Extract: Salman Khan tells the inspiring story of how the Khan Academy came to be and shares his thoughts on what education could (should?) be like in the future.

When does life begin? When does it end?

Radiolab: 23 weeks 6 days - http://www.radiolab.org/2013/apr/30/
Emotional. A difficult podcast to listen to, as it not only talks about life and death but relates them back to real people, in particular the story of one family and one very premature baby. As I've said, I've got a little girl now, so I related to this story in a big way. It did make me cry! Very moving. But it does have a happy ending.
Extract: When Kelley Benham and her husband Tom French finally got pregnant, after many attempts and a good deal of technological help, everything was perfect. Until it wasn't. Their story raises questions that, until recently, no parent had to face… and that are still nearly impossible to answer.

Friday 7 June 2013

Quick multi cursor support in sublime text 2

In a previous post I listed all the podcasts I listen to. Whilst creating that list I had to do a little bit of text manipulation, and I used some of the powerful multi-cursor support in Sublime Text to do it.
I wanted all my links to be hyperlinks instead of just text.

Below is the screencast of how I went from a long list of web addresses:
<h2>.net</h2>
http://www.hm.com/
http://www.dnr.com/
http://c9.msdn.com/
...

to a long list of hyperlinks:
<h2>.net</h2>
<a href="http://www.hm.com/">http://www.hm.com/</a>
<a href="http://www.dnr.com/">http://www.dnr.com/</a>
<a href="http://c9.msdn.com/">http://c9.msdn.com/</a>
...

The process

First do an incremental find (Ctrl+i)
Then select all matches (Alt+Enter)
Copy all selections into the clipboard (Ctrl+c)
Press End
Type the end tag
Press Home
Type the start tag
Paste the links again
Close the start tag

Sublime text 2, get to know it.

Sublime is really powerful; this is just a simple demo of a simple real-world use. I've used it to process hundreds of thousands of lines pulled from databases, selecting complex regular expression matches and then using sort lines and unique lines to get a really good feel for the data.

I used APowersoft screen recorder to record the screencast and Key Jedi to display the keystrokes as I went.

Thursday 6 June 2013

My Collection Of Podcasts For Developers

[update "2015-11-03"]I've just posted a new version of this list to developer-podcasts-v2[update]

I'm always on the lookout for more development-focused podcasts and thought I'd better share my current crop, as I know others are also looking for more podcasts to accompany them on their daily commute.

My interests

Some background on me: Primarily a .net developer with an interest in agile, javascript, ruby, design and devops. So this list derives from these interests.

.Net

http://www.hanselminutes.com/
http://www.dotnetrocks.com/
http://deepfriedbytes.com/
http://channel9.msdn.com/

General Development

http://www.se-radio.net/
http://herdingcode.com/
http://techcast.chariotsolutions.com/
http://devnology.nl/en/podcast
http://thisdeveloperslife.com/

Javascript

http://nodeup.com/

Ruby

http://ruby5.envylabs.com/

DevOps and IT

http://www.runasradio.com/
http://foodfight.libsyn.com/
http://devopscafe.org/

Business

http://www.startupsfortherestofus.com/

Design and Development

I found this resource last month; it contains lots of other great podcasts on design and development: http://www.smashingmagazine.com/2013/04/19/podcasts-for-designers-developers/?utm_source=feedly


Friday 24 May 2013

OO sessions. Learning from failed interviews.

OO Sessions

Earlier this week I did a session with a candidate looking for a job with ThoughtWorks. It was a little strange as she had already failed our coding test, but she was otherwise a really strong candidate. RM asked if she would like to have a session with a couple of us here so we could point her in the right direction (coding wise), and then she may re-apply in six months or so.

She had 3 years' experience in .net, mainly writing Windows Forms applications, and in her eyes had produced some good object-oriented code for her submission.

Coding challenges

We (ThoughtWorks) give candidates coding challenges to complete before interview so that we can more accurately assess programming skill. From the spec she had pulled out a good domain model for the problem, consisting of 4 or 5 classes which reflected the simple domain quite well. There were attributes on these classes to represent the fields and collections, and it was all quite well tested. But... this was very much a Microsoft approach.

View logic, code behind

All the logic lived in the view of the Windows Forms app. The code-behind class had responsibility for everything, from reading files to processing, validation, sorting and output formatting. None of the methods on this class were tested; in fact none were testable, as everything depended on everything else and you had to spin up the app to run it. Maybe that's why there were no tests.

Learnings

What struck me, though, was how intelligent the candidate was. It was great to see, after an hour and a half discussing her code submission, how she now views OOP: how behaviour is part of the domain, how utility classes should be part of the domain, and how to unit test things properly. It was great to know we had imparted knowledge to her and that she was excited by what she had learned.

I place a lot of the blame on Microsoft (for producing simplistic code examples for simple problems, which people then copy), but more so on her peers, senior technical folk and technical leaders. These are the people you learn your craft from, and they should be able to develop better solutions than the simple code samples you see all too often in online tutorials. Had they mentored her better she would not just be finding out how to do this stuff after 3 years.

I do hope she reapplies; I liked her, and agree with RM that she has lots to offer, she just needs to learn a little more OO coding first. It's great that we (TW) are given the opportunity to teach like this. She was not joining the company; she had already failed the interview process. She may join in the future, but that's not the point. The point is we did this for free, giving something back, and it was great. And I hope everyone I meet who lacks a bit of knowledge or skill is as open as she was to different ideas and different ways of doing things. I hope I am.

Monday 13 May 2013

My experiences with Mac OS - Developer mono-culture

Mac love

I talk to a lot of developers and read a lot of blogs, and as I sit here in the office I see that the vast majority of developers use Macs. I just got up and looked around: 28 Macs, 7 Dells. Everyone raves about Mac, Mac, Mac.

Me and my Mac

I used a Mac for two years, pretty much the whole of 2010 and 2011; I've had a Dell from 2012 till now, 2013.

I've been a long-time Windows user, with bouts of Ubuntu and more recently Mint. Three and a half years ago I got the opportunity to have a Mac as my company laptop. I'd never used a Mac before, so I got one, and hated it. Yes, I'm a dev... it was the best decision to try it though.

It was good that I was forced to use it at home for two years (it was my main home machine); I did give it a good chance. When work wanted to replace it I got the option of a Dell or another Mac; I chose the Dell. There was a degree of incredulity around that decision, mainly from fellow devs.

What's up with me?

Whilst I agree the hardware itself was very nice and very well made (although the hard disk did pack in after 2 months; nothing's perfect), I never fell for the OS. There were far too many little annoyances; I never got on with it. Little things like Finder and interacting with the file system, the way the windows were managed, the lack of a taskbar where you can see your open windows, the way hidden windows can be hard to get back. I'm sure people will say these are non-issues and that you can do this and that to sort them out, but to me it just got in the way of being productive. Then there is Windows on Boot Camp: yes you can do it, but it's a very sub-optimal experience, from speed issues to keymaps. I do lots of .net, often with SQL Server, and it's just too slow. So you boot into Windows, which is better, but then the keymaps are a pain. You might as well have a Windows machine if you are doing Windows development, since you aren't using OSX anyway.

So why was it the best decision for me to try a Mac? Well, everyone always raved about them, and never having used one much I was wondering what I was missing. Now that I've used one for two years, I'm happy going back to Microsoft, with a Mint VM when I need a nix OS.

Back to Windows

So I'm now on Windows 8, on a Dell Latitude, Core i7, 8 gig, and find it a great experience. I'm into node and ruby and C#, all of which run quickly. Now and then things don't work, but it's really few and far between. The most recent thing was integrating Cucumber with PhantomJS; they don't play well together on Windows at the moment.

Developer mono-culture

I just wanted to get another point of view out there. Sometimes I feel outnumbered as a Windows user in certain circles, and it can be hard to speak out against Mac or anything else that developers are supposed to love. I don't think mono-culture is healthy, whether that is operating systems, IDEs or habits. And I think Microsoft has come a very long way of late. Both communities should interact more, with less of the put-downs. We are all writing code; the rest is personal preference.

Applications I use regularly

  • dexpot, nexis file, conemu, launchy, beyond compare
  • visual studio 2012, webstorm, rubymine, sublime text
  • chrome, firefox
  • word, excel
  • SQLServer, MongoDB

Friday 10 May 2013

A SPA seed - Javascript stack with node and angular.

Single Page Application built on node.js and angularJS


I've been looking at creating a SPA with a full JavaScript stack, so I decided to pull together a seed based on node and angular, with jshint to lint all the .js files, mocha to run the node tests, karma to run the browser-based angular tests, and cucumber for BDD (full-stack testing/acceptance tests).

I did this because I could not find any examples of how to pull together angular and node in the same project, along with testing of everything. This is a good start, but until I use it in anger I won't really know if I've got it right; when I do, I will try to update it.

https://github.com/DamianStanger/NodejsAngularSPASeed

Details

Node

node.js, npm, angular, karma, Mocha, phantom.js, jshint, jshintRunner

Ruby (1.9.2)

Ruby is for Cucumber, which is used for the full-stack acceptance testing:
ruby 1.9.2, devKit, bundler, cucumber, capybara

Next

The readme.md file gives details on how to get it all running and get the tests working. Then just clone this repo and use it as a starting point for your next node SPA app.

Wednesday 8 May 2013

Developing node modules, npm and git. How to publish npm packages from a windows machine whilst preserving unix-style line endings

I recently had a problem whilst publishing a new node.js module I'd written, which is designed to run jshint recursively against a number of directories and/or files.

I develop on Windows, so the line endings are dos-based (CRLF), but the file that runs your app, stored in the bin folder, needs unix line endings (LF) for it to run on Mac or nix systems.

I save code in git and github with unix line endings turned on in the repository, but my working directory has dos line endings. So when I publish using 'npm publish', the file in my bin folder is published with dos line endings.

This means that when you do an npm install -g on a Mac or Linux box you get the error:
env: node\r: No such file or directory

To fix this I wrote a little batch script that I use to publish new versions. It's really simple and changes the line endings just before publishing.

dos2unix --d2u bin\jshintRunner
npm publish


It uses dos2unix to change the line endings (I'm running Windows 8). I hope this helps someone else out with a similar issue.
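An alternative I haven't used here, if you'd rather git did the work: a .gitattributes rule can force LF line endings in the working directory for just that one file, which removes the need for a conversion step at publish time.

bin/jshintRunner text eol=lf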

Link to the project on github : https://github.com/DamianStanger/jshintRunner and on npm https://npmjs.org/package/jshintrunner

Friday 26 April 2013

A node application to count the lines within files

Node with Mocha, Should and Sinon to count file lines

I've recently had the requirement to count lines of source code in 3 or 4 different code bases, including a couple of single-page web apps written in JavaScript with angularjs and karma, a couple of Java server-side services, and an acceptance test suite, again written in Java with selenium.

I wanted to compare the code bases and look into the ratios of test code to production code, so we as a team could get a collective feel for the entire code base; to me this was a very valuable exercise.
I decided to write a command line app in node and JavaScript, mainly because at the moment I'm trying to boost my JavaScript knowledge and I'm really interested in node. This command line app would count the lines of code in a code base. There is nothing better than a real requirement to spur you into action.
You can find the source code here: https://github.com/DamianStanger/lineCounter
I'm quite pleased with how it has turned out, but as is the way with every piece of software, I ran out of budget (free time) before completion. I would like to enhance it further when I can find the time.

Enhancements:


  • Return a JSON string with file and line counts for every directory in the codebase. This output could then be pushed into a d3 app to visualise the source code and the relative sizes; that would be cool.
  • Ability to customise the ignored files and directories.
  • Ability to hook into TeamCity; this would need a new output reporter so we could track the lines of code over time.

Learnings

  • I started off using Karma and Jasmine for running the tests, but found they were difficult to get to play well with the node modules I created, so I switched to Mocha (http://visionmedia.github.io/mocha/). Glad I did, because I love it. I especially like the BDD-style tests I can write, with many nested describes to set the test context. I'm not sure how I'm going to cope going back to the flat structure of nUnit.
  • I started to use Should (https://npmjs.org/package/should) as the preferred mechanism for asserting. The fluent interface is really appealing; it's very similar to one I've been using in .net for a while now.
  • I've needed to do a bit of mocking in this project, and for this I found Sinon (http://sinonjs.org/). Very powerful and flexible; it's been capable of meeting all my stubbing and mocking needs up to now. Bit of a learning curve, but it's all good.

Wednesday 2 January 2013

node.js an introduction and tutorial to javascript on the server

Presentation

I gave a presentation on node.js on my first day back from the Christmas holidays. It was fun, plenty of people in the room, all eager to learn the basics of node.

Firstly an overview of what node is, what it's good for, and an introduction to the event-driven architecture behind node. I did the classic fast food example and then a coding session.

The live coding session consisted of an introduction to the node REPL, some basic examples of javascript on the server, followed by a simple web server and some performance testing with apache bench (https://httpd.apache.org/docs/2.2/programs/ab.html).

To finish up I demoed a reverse proxy written in one line of node. It's amazing how much power this has, and even more amazing that you can write something like that in such a concise manner while it's still understandable (read: maintainable).

I thought it went really well. I don't give many presentations, but when I do I like them to be good, relevant, interesting and entertaining (as far as a technical subject can be). Of course I was nervous, especially because I was videoing it. I wanted to actually see what I was like; it's the best way to improve: fast feedback, reflection and improvement.

Live coding is always dangerous, but it all went remarkably well. I did have a small hiccup in that I could not connect to the wireless network, but that only impacted one of my examples; I weathered that storm.

Video


So yes, I videoed it and have uploaded it to youtube: http://youtu.be/vGBk8EB-Yz0. Check it out; I'd be really interested to hear your feedback on my presentation style and content.



Attached below are the code examples from the talk so you can test them out if you like.

Enjoy.

Code demo

REPL

1+2;
var add = function(a,b){return a+b};
add(3,4);

process
process.pid
process.env.Path

fs.readFile('foo.txt', 'utf8', function (err, data) {
  console.log(data);
});

foo bar code

setTimeout(function(){
  console.log("foo");
}, 2000);
console.log("bar");

setInterval(function(){
  console.log("bang!");
}, 1000);

hello world web server

var http = require('http');
var server = http.createServer(function (req, res) {
  res.end('Hello World\n');
});
server.listen(1337, '127.0.0.1');
console.log('Server running at http://127.0.0.1:1337/');

curl http://localhost:1337/
curl -i http://localhost:1337/

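// inside the createServer handler, swap res.end for this to simulate a slow response (comment added for clarity)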
res.write('hello');
setTimeout(function() {
  res.end('World\n');
}, 6000);

The following needs apache bench installed and on your path
ab -n 10 -c 10 http://127.0.0.1:1337/

New file: express.js

var express = require('express');
var app = express();
app.listen(8080);
app.get('/', function(req, res){
  console.log("get");
  res.send("Welcome to node!");
});

node package manager

npm install express

app.get('/foo/', function(req, res){
  console.log("getfoo");
  res.send("Welcome to foo!");
});

A one-line reverse proxy...

Using request, a module that simplifies making web requests:

var http = require("http"),
    request = require("request");

http.createServer(function (req, res) {
  console.log(req.url);
  req.pipe(request("http://www.xperthr.com/" + req.url)).pipe(res);
}).listen(1337);
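To try it out (assuming the dependency is installed with npm install request), hit the proxy with curl as before:

curl -i http://localhost:1337/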