• Enforcing coding rules in your team with JSCS

    28 January 2014

    Enforcing coding rules in a team is hard, and most of the time it ends with a very long document lost on the company wiki that nobody reads. You can also check the coding rules during code reviews and ask for the appropriate corrections, but that can be very painful and time consuming.

    In my opinion, the only solution is to rely on tools that check the code and warn developers when they do not follow the rules. This can range from a simple warning to completely preventing them from pushing code; it is up to you to choose how strict you want to be.

    My tool of choice for this is JSCS (JavaScript Code Style checker). We'll see how to integrate it into your team's tools and your workflow.

    Installing JSCS

    JSCS is a simple npm module to install globally:

    npm install jscs -g

    You now have the JSCS tool available globally.

    To check your code, JSCS needs a configuration file named .jscs.json at the root of your project. This file contains all the rules your code must follow.

    As an example, you can check the jQuery .jscs.json configuration file below:

    {
        "requireCurlyBraces": [ "if", "else", "for", "while", "do" ],
        "requireSpaceAfterKeywords": [ "if", "else", "for", "while", "do", "switch", "return" ],
        "requireSpacesInFunctionExpression": {
            "beforeOpeningCurlyBrace": true
        },
        "disallowSpacesInFunctionExpression": {
            "beforeOpeningRoundBrace": true
        },
        "requireMultipleVarDecl": true,
        "requireSpacesInsideObjectBrackets": "all",
        "requireSpacesInsideArrayBrackets": "all",
        "disallowLeftStickedOperators": [ "?", "-", "/", "*", "=", "==", "===", "!=", "!==", ">", ">=", "<", "<=" ],
        "disallowRightStickedOperators": [ "?", "/", "*", ":", "=", "==", "===", "!=", "!==", ">", ">=", "<", "<="],
        "requireSpaceBeforeBinaryOperators": ["+", "-", "/", "*", "=", "==", "===", "!=", "!=="],
        "disallowSpaceAfterPrefixUnaryOperators": ["++", "--", "+", "-"],
        "disallowSpaceBeforePostfixUnaryOperators": ["++", "--"],
        "requireRightStickedOperators": [ "!" ],
        "requireLeftStickedOperators": [ "," ],
        "disallowKeywords": [ "with" ],
        "disallowMultipleLineBreaks": true,
        "disallowKeywordsOnNewLine": [ "else" ],
        "requireLineFeedAtFileEnd": true,
        "disallowSpaceAfterObjectKeys": true,
        "validateLineBreaks": "LF"
    }

    These rules enforce curly braces after if statements, disallow the use of with, enforce Unix line endings, and so on. You can find the list of rules and their meaning in the JSCS Readme.
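
    As an illustration, here is a hypothetical snippet (not from the jQuery code base) showing the kind of code these rules would reject, and a version that passes:

    // Rejected: no space after "if", no curly braces around the body
    if(ready) start();

    // Accepted by the rules above
    if (ready) {
      start();
    }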

    When you have finished playing with the rules, put your .jscs.json configuration file at the root of your project.

    You can already check your code by typing this command at the root of your project:

    jscs path/to/my/script.js

    or

    jscs .

    to check the whole project (. is the current directory).

    You should get something like this:

    JSCS from the command line

    Great! You can now check your code with JSCS! But for now this is not very convenient.

    Integrating JSCS in Sublime Text

    For this you will need Sublime Text 3 with Package Control.

    First you will need to install SublimeLinter using Package Control (follow the SublimeLinter documentation if you are not sure about the procedure).

    Then, the same way you installed SublimeLinter, search for the SublimeLinter-jscs package in Package Control and install it.

    And... you're done! Now if you open your project's JavaScript files in Sublime Text, your errors should be highlighted by SublimeLinter.

    Highlighting coding style errors with SublimeLinter-jscs

    If you are not using Sublime Text as your text editor (shame on you!), JSCS also has plugins for Vim, Brackets and LightTable.

    If you are using none of those, keep reading ;)

    Adding JSCS as a Grunt task

    The Sublime Text plugin is great, but if some of your teammates do not use an editor with a JSCS plugin and keep pushing unchecked code, all your hard work will be in vain.

    Enter Grunt.

    If you are already using Grunt, it is just one more task to add to your task list.

    Just install the grunt-jscs-checker npm package:

    npm install grunt-jscs-checker --save-dev

    And add a jscs entry in your Gruntfile.js:

    jscs: {
      src: "path/to/my/*.js"
    }

    Read the Readme for more options.
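
    For reference, here is a minimal Gruntfile.js sketch showing where this entry goes (the jscs configuration lives inside grunt.initConfig, and the path is just an example):

    module.exports = function(grunt) {
      grunt.initConfig({
        jscs: {
          src: "path/to/my/*.js"
        }
      });

      // make the "jscs" task available to Grunt
      grunt.loadNpmTasks('grunt-jscs-checker');
    };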

    Then run the task to check the whole project:

    JSCS from Grunt

    Not very different from the command line, but we can now configure Grunt to run it with other tasks, like your tests:

    grunt.registerTask('test', ['jscs', 'jasmine']);

    This way you check your project's coding style alongside your tests.

    (If you are using Gulp instead of Grunt, you can use the gulp-jscs task.)
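
    For example, a minimal gulpfile.js could look like this (a sketch assuming gulp-jscs picks up your JSCS configuration file; the glob is just an example):

    var gulp = require('gulp');
    var jscs = require('gulp-jscs');

    // check the project files against the JSCS rules
    gulp.task('jscs', function() {
      return gulp.src('path/to/my/*.js')
        .pipe(jscs());
    });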

    Adding JSCS as a Git pre-commit hook

    OK, now we need to be sure JSCS runs before every commit. One way to do this is to use grunt-githooks, which easily binds your Grunt tasks to Git hooks.

    Like before, install the grunt-githooks npm package:

    npm install grunt-githooks --save-dev

    And add a githooks entry in your Gruntfile.js (inside grunt.initConfig, next to the jscs entry):

    githooks: {
      all: {
        'pre-commit': 'test'
      }
    }

    Run grunt githooks to bind the tasks and you're done.

    From now on, every time you commit, the test task will run, and if it fails (because of a coding style error or a failing test) the commit will be aborted.

    JSCS as a Git pre-commit hook

  • Choosing and making quality npm modules

    11 January 2014

    Since its creation, Node.js has gained more and more attention, and its module repository (npm) is growing very rapidly.

    But with nearly 55,000 modules created by the community, it can be very difficult to make the right choice for your own projects.

    How do you know if it is safe to use this tiny module on your billion-pageviews-a-day project?

    Short answer: You can't.

    Long answer: You really can't, but in my opinion there are two main criteria to consider:

    • Quality
    • Support

    I made a list of tools and practices to help you evaluate npm modules. It can help you choose your modules as a user, but also improve your code quality as a module author.

    Quality

    Tests

    Running tests with Mocha

    According to Nodechecker, more than half of the npm modules do not have any tests. That's sad :(

    The presence of a complete test suite is very important: it helps you see whether the module's functionality was fully tested by the author. Combined with a continuous integration service like Travis CI, it also lets you easily check the module's status at each commit.

    Tests are a very powerful tool to prevent regressions between releases and they must not be neglected.
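
    For example, a minimal Mocha test for a hypothetical add module could look like this (a sketch using Node's built-in assert; the module name and path are made up):

    var assert = require('assert');
    var add = require('../lib/add'); // hypothetical module under test

    describe('add', function() {
      it('adds two numbers', function() {
        assert.equal(add(1, 2), 3);
      });
    });

    With a "test": "mocha" entry in the package.json scripts, anybody can then run the suite with npm test.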

    For Users:

    • Check the module Readme for a "badge" like this: Build status for travis-web. Most online services provide similar images to display the current status of the project.
    • Look for a test folder at the project root. The standard command to run the tests of an npm module is npm test.

    For Authors:

    Code coverage

    Code coverage report with Blanket.js

    Tests are great, but to be effective they must cover as many cases as possible in your module.

    A good way to ensure good coverage of your code base is to use an automated tool that determines what percentage of your code is covered by your tests and which conditions were not tested.

    Tools like Blanket.js can analyze your code and output a report in HTML (or another format) that highlights the uncovered lines.

    You can also send this report to a service like Coveralls to keep a history of your code coverage and be warned if a commit decreases the percentage of covered code.

    For Users:

    • Check the module readme for a Coveralls "badge" like this: Coverage status for gitlabhq

    For Authors:

    Documentation

    An undocumented module is useless for everyone except its author. What is the point of open-sourcing something if you do not explain to users what it does and how to use it?

    The minimal documentation is the Readme at the root of the project; it should at least contain:

    • the name of the module
    • a description of what it does
    • how to install it
    • how to use it (an example)
    • the license

    That's it. Some other sections can be nice to have too:

    • how to run tests (important if it is different from the simple npm test)
    • how to contribute
    • where to get support
    • how to contact the authors

    Most npm modules are very simple, so their documentation fits in the Readme. For bigger modules you can have a doc or documentation folder at the root of the project, a wiki in the GitHub repository, or a dedicated website.

    Proper versioning

    The Node.js userland is used to the Semantic Versioning specification, so your modules must follow it.

    The specification can be summarized like this:

    Given a version number MAJOR.MINOR.PATCH, increment the:

    • MAJOR version when you make incompatible API changes,
    • MINOR version when you add functionality in a backwards-compatible manner, and
    • PATCH version when you make backwards-compatible bug fixes.

    Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format.

    So upgrading from version 2.4.5 to 2.6.4 should be painless (backwards compatibility is preserved, only some patches and some new features). Breaking changes must only be introduced with a new MAJOR version.

    Proper versioning is critical because it allows you to seamlessly upgrade your dependencies without any risk for your application.
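
    To illustrate, here is a small sketch using the semver npm module (mentioned below) to check what a caret range accepts:

    var semver = require('semver');

    // "^2.4.5" accepts any backwards-compatible release in the 2.x line
    semver.satisfies('2.6.4', '^2.4.5'); // true: minor and patch upgrades are safe
    semver.satisfies('3.0.0', '^2.4.5'); // false: a new MAJOR version may break things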

    For Users:

    • next-update: tests whether a module's dependencies can be updated to newer/latest versions without breaking the tests.

    For Authors:

    • semver: The Semantic Versioning 2.0.0 specification

    Code quality

    Report with CodeClimate

    Quality is a combination of many criteria: consistency, readability, maintainability, etc. It can be hard to evaluate, but some tools can help you in this task.

    For example, tools like JSHint can help you spot potential problems in your code very early (unused variables, missing var keywords, functions created inside loops, etc.), and JSCS can enforce coding rules in your team to keep the code consistent.
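
    As an illustration, here is a hypothetical snippet with the kind of problems JSHint can spot (with its undef and unused options enabled):

    function count(items) {
      total = 0;                   // missing "var": "total" leaks as a global
      var unused = 'never read';   // unused variable
      for (var i = 0; i < items.length; i++) {
        total += items[i];
      }
      return total;
    }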

    There are also tools to check your code's maintainability, like Plato or CodeClimate; they can generate very useful reports to spot quality problems in your project.

    For Users:

    • Check the module readme for a CodeClimate "badge" like this: CodeClimate status for jekyll
    • Look for a .jshintrc file (JSHint) and/or a .jscs.json file (JSCS) at the project root.

    For Authors:

    But these tools cannot do everything, and the best tool is still regular code reviews with your peers.

    Support

    Support is all about the future of the module. Unfixed bugs or incompatibility with future Node.js versions can make a module broken and useless very quickly. It is hard to guess whether a module will be well supported by its author(s), but there are some signs that can give us indications.

    Sadly, there is no magic tool for now.

    Repository activity

    Most npm modules are hosted on GitHub, and you can get a nice overview of the activity on the Pulse page. You can also check the date of the last commit, the number of open issues, the number of open pull requests, etc.

    But an inactive project does not necessarily mean an abandoned project; maybe the author considers the module "done" because it fits their needs and there are no open issues or pull requests.

    Authors:

    You can also check the author's name. It can be a trusted company (Nodejitsu, Walmart Labs, Yahoo, etc.), a trusted developer (John Resig, Isaac Z. Schlueter, Mikeal Rogers, etc.) or a total stranger. In any case, how the company or developer supported their past projects will indicate whether they are used to maintaining their open-source projects.

    Popularity

    Popularity is not an indicator of a module's good health, but it can indicate that a lot of people already trust this module and its authors. Chances are that if there is a problem with support while the module is still widely used, it will be forked and/or maintained by other contributors (cross your fingers here).

    I hope these tools and pieces of advice will help you improve the quality and solidity of your projects. I mentioned most of the tools I use on my personal and professional projects; if you know other useful tools, please post them in the comments and I'll update the list.

  • Installing nginx 1.4.0 with SPDY support

    03 May 2013

    Since version 1.4.0, nginx supports draft 2 of the SPDY protocol.

    But the SPDY module is not enabled by default, and to enable it you need to recompile nginx entirely.

    Here's a simple guide to do that.

    First, be sure to have all the needed packages (they can vary depending on your system; for information, I'm doing this installation on Ubuntu 12.10):

    sudo apt-get update
    sudo apt-get install make gcc libpcre3-dev libssl-dev

    Then download the source code of the latest stable version of nginx from their repository: http://hg.nginx.org/nginx/tags

    Currently the latest stable version is 1.4.0 (revision 7809529022b8):

    wget http://hg.nginx.org/nginx/archive/7809529022b8.tar.gz

    Decompress it:

    tar -xzf 7809529022b8.tar.gz
    cd nginx-7809529022b8

    Launch the configure command:

    ./auto/configure --prefix=/etc/nginx --sbin-path=/usr/sbin/nginx --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error.log --http-log-path=/var/log/nginx/access.log --pid-path=/var/run/nginx.pid --lock-path=/var/run/nginx.lock --http-client-body-temp-path=/var/cache/nginx/client_temp --http-proxy-temp-path=/var/cache/nginx/proxy_temp --http-fastcgi-temp-path=/var/cache/nginx/fastcgi_temp --http-uwsgi-temp-path=/var/cache/nginx/uwsgi_temp --http-scgi-temp-path=/var/cache/nginx/scgi_temp --user=nginx --group=nginx --with-http_ssl_module --with-http_realip_module --with-http_addition_module --with-http_sub_module --with-http_dav_module --with-http_flv_module --with-http_mp4_module --with-http_gunzip_module --with-http_gzip_static_module --with-http_random_index_module --with-http_secure_link_module --with-http_stub_status_module --with-mail --with-mail_ssl_module --with-file-aio --with-ipv6 --with-http_spdy_module

    (See the --with-http_spdy_module parameter at the end of the command)

    (If you already have nginx installed and want to know with which parameters it was compiled you can run the command nginx -V)

    Then if everything works fine, launch make and make install:

    make
    sudo make install

    Update your nginx configuration file:

    server {
      listen 443 ssl spdy;
    
      ssl_certificate mycertificate.crt;
      ssl_certificate_key mycertificate.key;
    
      # ... my server configuration ...
    
    }

    Finally, (re)start nginx:

    sudo service nginx restart

    And voilà! Your website should now run with SPDY.

  • Quick tip - Readable ternary operation

    28 April 2013

    You are certainly used to ternary operations in JavaScript, you know, those things you frequently use as a shortcut instead of an if statement:

    var sky = summer ? 'blue' : 'grey';

    Short, readable. Good.

    But what if we have more complex conditions?

    var sky = spring ? 'lightblue' : summer ? 'blue' : fall ? 'lightgrey' : 'grey';

    Still short, but it starts to be much less readable and difficult to understand at first sight.

    So, we can indent our code to keep it readable.

    var sky =
        spring ? 'lightblue' :
        summer ? 'blue'      :
        fall   ? 'lightgrey' :
        'grey'
    ;

    Haaaa, much better. We can easily see our conditions and values here while keeping the code quite short.

    As a general rule:

    var [variable] =
        [condition 1] ? [value 1] :
        [condition 2] ? [value 2] :
        [fallback value]
    ;

    But do you really need such a complex ternary operation? ;)

  • Chrome Logger for Node.js

    24 April 2013

    Chrome Logger is a Chrome extension that allows you to display your server-side debugging messages in the Chrome console.

    So I made a simple implementation for Node.js to help you debug your Node.js applications directly in Chrome.

    You can find node-chromelogger on GitHub.

    Just run

    npm install chromelogger

    like any other npm package.

    Then, use it in your application:

    var chromelogger = require('chromelogger');
    var http = require('http');
    
    var server = http.createServer();
    
    server.on('request', chromelogger.middleware);
    
    server.on('request', function(req, res) {
      res.chrome.log('Message from Node.js %s', process.version);
      res.end('Hello World');
    });
    
    server.listen(7357);

    Node Chrome Logger provides several logging methods on the ServerResponse (res) object:

    • res.chrome.log
    • res.chrome.warn
    • res.chrome.error
    • res.chrome.info
    • res.chrome.group
    • res.chrome.groupEnd
    • res.chrome.groupCollapse

    These methods match the Console API of the Chrome Developer Tools.

    You can also use it as an ExpressJS middleware if you want (see the Readme for more information).

    The main limitation is that it uses HTTP headers to send the information to the client, so once you have started to send the body you can't send more debug messages, and trying to do so will trigger an error.

    If you need a more advanced Node.js debugging tool, I suggest you use the great Node Inspector package.