As a backend developer I used to treat JavaScript as a toy language. Frontend programming was just a little extra. The “Real Work” was done on the backend.

That has changed, mainly thanks to single-page applications. Nowadays JavaScript and CoffeeScript are among the most important languages, and frontend programming is as important as backend.

Recently I’ve dived into frontend development and I don’t regret it at all. CoffeeScript and the Gameboxed Engine are the most brilliant things since Rails. But while I enjoy CoffeeScript, it still relies on JavaScript’s types. And that is a pain.

Adding arrays

  [1,2,3] + [3] => "1,2,33"

Here comes the type coercion. Every JavaScript developer will explain why adding two arrays results in a string, but this behaviour is simply ridiculous. Even raising an exception would be a better choice than returning a string. The newest ECMAScript provides a lot of interesting features, yet it doesn’t address this simple issue. Is it expected behaviour? Definitely not. Should it be fixed? Definitely yes.
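What actually happens under the hood: the + operator coerces each array to a primitive via toString(), then concatenates the two strings. A minimal sketch:

```javascript
// The + operator coerces both operands to primitives; for arrays that
// means calling toString(), then the two strings are concatenated.
var left = [1, 2, 3].toString();  // "1,2,3"
var right = [3].toString();       // "3"
var sum = left + right;           // "1,2,33" — same as [1,2,3] + [3]
```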

At least there is a sane solution.

  [1,2,3].concat([3]) => [1,2,3,3]

Subtracting arrays

  [1,2,3] - [3] => NaN

Funny thing. Adding arrays returns a string; subtracting them returns NaN. Again, JavaScript devs will explain why, and may even try to convince you that it’s perfectly fine behaviour. It’s not.

Is there a function to subtract arrays? Sorry, but no. You either have to implement your own or use underscore.js, sugar.js or something similar.
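Rolling your own is straightforward. A minimal sketch (the name `difference` is mine, not a built-in):

```javascript
// A hand-rolled array difference: keep every element of `a`
// that does not appear in `b`. (A sketch, not a library.)
function difference(a, b) {
  return a.filter(function (x) {
    return b.indexOf(x) === -1;
  });
}

difference([1, 2, 3], [3]); // => [1, 2]
```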

Let’s take a look at underscore.js

  _.difference([1,2,3], [3]) => [1,2]

Not bad. However, it looks worse when you chain more methods.

  _.first(_.difference([1,2,3], [3])) => 1

Sugar.js looks better:

  [1,2,3].subtract([3]).first() => 1

However, sugar.js monkey patches the native types, which may be considered harmful. Couldn’t this just be a core feature?
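To see why it may be considered harmful, here is roughly what extending a native type looks like (an illustrative sketch, not sugar.js’s actual code):

```javascript
// Roughly what sugar.js-style extension does (illustrative only).
Array.prototype.subtract = function (other) {
  return this.filter(function (x) {
    return other.indexOf(x) === -1;
  });
};

// The danger: the new property is enumerable, so every for..in loop
// over any array in the page now sees "subtract" as a key.
var seen = [];
for (var key in [1, 2]) seen.push(key);
// seen => ["0", "1", "subtract"]
```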

Y2K problem

  new Date().getYear() => 112

Why is the result 112? The ECMAScript specification states that Date#getYear returns the current year minus 1900.

  2012 - 1900 => 112

It was fine until 2000. Now, twelve years after the millennium, this method should be either fixed or removed.

Surprisingly, only IE fixed it. All the other browsers still ship with that bug.

Solution? Use Date#getFullYear. Or moment.js.

  new Date().getFullYear() => 2012
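If legacy code is stuck on getYear, a defensive wrapper can normalise both behaviours. A sketch (the helper name is mine):

```javascript
// Normalise getYear() across engines (a sketch): most engines return
// year - 1900, while IE returns the full year for dates from 2000 on.
function fullYear(date) {
  var y = date.getYear();
  return y < 1000 ? y + 1900 : y;
}

fullYear(new Date()); // same as new Date().getFullYear()
```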

Hash and object as a key

  class Point
    constructor: (@x, @y) ->

  pointA = new Point(0, 0)
  pointB = new Point(100, 200)
  hash = {}
  hash[pointA] = 10
  hash[pointB] = 50

hash[pointA] should return 10, shouldn’t it?

  hash[pointA] => 50

Why 50? Let’s inspect the hash.

  Object.keys(hash) => ["[object Object]"]

The keys are always converted to strings. Under the hood JavaScript calls pointA.toString() and pointB.toString(), which in both cases return “[object Object]”, so the second assignment overwrites the first. You could override Point#toString to return a unique identifier, but that doesn’t seem semantically correct.
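For illustration, here is that override in plain JavaScript (semantically dubious, as noted: toString now doubles as a hash key):

```javascript
// Workaround sketch: give each Point a unique string representation
// so distinct points no longer collapse to "[object Object]".
function Point(x, y) {
  this.x = x;
  this.y = y;
}
Point.prototype.toString = function () {
  return "Point(" + this.x + "," + this.y + ")";
};

var hash = {};
hash[new Point(0, 0)] = 10;     // stored under key "Point(0,0)"
hash[new Point(100, 200)] = 50; // stored under key "Point(100,200)"
hash[new Point(0, 0)];          // => 10
```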

jQuery map vs each

This quirk has nothing to do with JavaScript itself. However, it reveals an interesting inconsistency. This is the signature of jQuery.map:

    jQuery.map( array, callback(element, index) )

The callback gets element as a first argument and index as second.

What about jQuery.each?

    jQuery.each( array, function(index, element) )

For some reason the arguments are reversed: index first, element second.
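Minimal stand-in implementations (mine, not jQuery’s) make the mismatch concrete:

```javascript
// Stand-ins mimicking the two jQuery signatures (illustrative only).
function map(array, callback) {        // callback(element, index)
  var out = [];
  for (var i = 0; i < array.length; i++) {
    out.push(callback(array[i], i));
  }
  return out;
}

function each(array, callback) {       // callback(index, element)
  for (var i = 0; i < array.length; i++) {
    callback(i, array[i]);
  }
}

map(["a", "b"], function (el, i) { return el + i; }); // => ["a0", "b1"]
each(["a", "b"], function (i, el) { /* index comes first here */ });
```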


A new language. Syntax like CoffeeScript’s. A standard library like Ruby’s. All existing libraries, plugins and components still usable without modification. Possible?