Felix Rieseberg

ECMAScript 6: A Quick Intro to the Future of JavaScript

By this point, you have probably heard of ECMAScript 6, the upcoming version of the standard that defines JavaScript. If you haven't: I'm basically talking about the next version of JavaScript. The spec, codenamed 'Harmony', includes a whole bunch of exciting changes intended to make the language more flexible and powerful. So let's take a quick look at the most important features!

Arrow Functions

Arrow functions are, as the name suggests, functions defined using the arrow syntax (parameters => expression). Compared to traditional JavaScript functions, arrow functions come with three important differences:

1) Dramatically shorter syntax. If your arrow function consists of a single expression, you can even omit the curly braces and the return - making arrow functions perfect for short, one-line functions. The following two code blocks are identical:

people.map(function (person) { return person.age; });  
people.map(person => person.age);  

2) The value of this is lexical, meaning that it is determined by where the function is defined, not by where it's being called. If you ever wrote something along the lines of var self = this, you'll be happy to hear that this isn't changed by arrow functions. It also can't be rebound - call, apply, and bind have no effect on an arrow function's this. Again, consider the example below:

// ES5
var greeter = {
    name: 'John Doe',
    greet: function (people) {
        var self = this;
        people.forEach(function (person) {
            console.log(self.name + ' greets ' + person);
        });
    }
};

// ES6
var greeter = {
    name: 'John Doe',
    greet: function (people) {
        people.forEach(person => console.log(this.name + ' greets ' + person));
    }
};

3) Always anonymous: Arrow functions can't be used as constructors - calling them with the new keyword throws a TypeError.
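A quick sketch of that constraint - the short form works as expected, but new throws:

```javascript
// Arrow functions have no [[Construct]] - using new with them throws.
var greet = name => 'Hello, ' + name;

console.log(greet('Tim')); // "Hello, Tim"

try {
    new greet('Tim');
} catch (e) {
    console.log(e instanceof TypeError); // true
}
```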

The let and const statements

Rejoice, developers! The let statement allows us to declare a variable for the current block only, whereas var has always declared variables either globally or locally for the whole function. Otherwise, it behaves like the var statement.

function variables_es5() {  
    var a = 'John';
    if (true) {
        var a = 'Tim'; // var is function-scoped, so this overwrites a
    }
    console.log(a); // Logs 'Tim'!
}

function variables_es6() {  
    let a = 'John';
    if (true) {
        let a = 'Tim'; // let is block-scoped, so this is a new variable
    }
    console.log(a); // Logs 'John'!
}

Also new: constant variables declared with the const statement.

const name = 'John';  
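Note that const prevents reassignment, not mutation - a const object or array can still be changed. A quick sketch:

```javascript
const name = 'John';

try {
    name = 'Tim'; // reassigning a constant throws
} catch (e) {
    console.log(e instanceof TypeError); // true
}

const friends = ['Rachel'];
friends.push('Monica'); // mutating the array itself is fine
console.log(friends.length); // 2
```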


Modules

One of the most commonly cited issues with JavaScript is that modular design is difficult - various methods and best practices have emerged over time, but ES6 will finally include a proper module system. Modules have two core components. First, dependencies - if you're building a widget with jQuery, your module might depend on jQuery. Second, exports - jQuery itself exports all its juicy features in $. This probably feels familiar - Node.js uses a similar system right now.

Modules come in the form of files - a JavaScript module is essentially a JavaScript file that has one or more defined exports. Here's a simple example showing how to export and how to consume. Assume two files:

// people.js
var people = ['John', 'Tim', 'Helen', 'Katherine', 'Cary'];  
export default people;  

As you can see, we create an array and export it as the default export - meaning that the export isn't named; it's simply the module's main export. In another file, we can now import it:

// greetings.js
import people from './people';

var greetings = {  
    greetPeople: function () {
        people.forEach(function (person) {
            console.log('Hi ' + person + '!');
        });
    }
};

greetings.greetPeople(); // "Hi John!", "Hi Tim!", ...

Named Exports

The bigger the library, the more complex your exports might become. To keep things simple and manageable, you can name your exports.

var people = ['John', 'Tim', 'Helen', 'Katherine', 'Cary'];  
var friends = ['Rachel', 'Monica', 'Phoebe', 'Joey', 'Chandler', 'Ross'];

export { people, friends };  

Importing named exports then works exactly the same way:

import { people, friends } from './people';


Classes

Let's be honest - classes may have been missing from JavaScript, but all of us have used patterns to work around that fact. The ES6 implementation of classes essentially standardizes the object-oriented patterns many of us have been using for a while now. ES6 classes feature prototype-based inheritance, super calls, instance and static methods, as well as constructors. If you're used to classes in any other major language, you'll feel right at home.

class Person extends Actor {  
    constructor(options) {
        super(options); // must run before this can be used
        this.firstName = options.firstName;
        this.middleNames = options.middleNames || [];
        this.lastName = options.lastName;
    }

    fullname() {
        var middle = this.middleNames.join(' ');
        return this.firstName + ' ' + middle + ' ' + this.lastName;
    }

    act() {
        console.log('Now acting: ' + this.fullname());
    }

    static johndoe() {
        return new Person({ firstName: 'John', lastName: 'Doe' });
    }
}

Looking at the above example, you'll find many things that seem familiar. Classes are defined using the class keyword and may extend another class. Members of the parent class can be called using the super keyword. A constructor, invoked upon calling new Person(options), is defined with the constructor keyword. Even though the introduction of classes is one of the more dramatic changes, the concepts are all in all pretty boring and don't introduce anything unknown to developers.
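To make the super call concrete, here's a minimal, self-contained sketch - the Actor base class below is hypothetical, just enough to extend:

```javascript
class Actor {
    act() {
        return 'acting';
    }
}

class Person extends Actor {
    constructor(name) {
        super(); // must be called before using this in a derived class
        this.name = name;
    }

    act() {
        // super gives us access to the parent class implementation
        return this.name + ' is ' + super.act();
    }
}

var john = new Person('John');
console.log(john.act()); // "John is acting"
```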


Generators

Generators are a bit complicated, but an important new feature - we're dealing with a new type of function here. Specifically, generators make the development of iterators more powerful. The basic assumption about functions in JS is that they always run to completion before any other code runs. Generators change that - you can run code, pause it, and continue running it at a later time. To pause the execution of a generator, you use the new yield keyword. If that doesn't sound like a dramatic change, consider this: calling a generator doesn't immediately execute its body - instead, it returns an iterator object that steps through the generator's code. Let's look at some code:

function *myGenerator() {  
    yield 1;
    yield 2;
    yield 3;
}

var generator = myGenerator();  
generator.next(); // { value: 1, done: false }
generator.next(); // { value: 2, done: false }
generator.next(); // { value: 3, done: false }

Let's talk syntax - a generator function is denoted by a * before its name. You can also write function* name, but I personally prefer the variant above. As you can see, we step through the generator by calling next() on its iterator object - each call returns an object holding the yielded value and a done flag. There's a lot more magic in generators - if you want to check it out, I recommend Kyle Simpson's fantastic series on ES6 generators.
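One bit of that magic worth a quick sketch: next() can also pass a value back into the generator, where it becomes the result of the paused yield expression:

```javascript
function *adder() {
    var x = yield 'give me x'; // pauses here; resumes when next(value) is called
    var y = yield 'give me y';
    return x + y;
}

var it = adder();
console.log(it.next());  // { value: 'give me x', done: false }
console.log(it.next(1)); // x becomes 1; { value: 'give me y', done: false }
console.log(it.next(2)); // y becomes 2; { value: 3, done: true }
```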

Template Strings

The new template strings are string literals with embedded expressions - essentially placeholders inside a string. Instead of single or double quotes, template strings are declared with backticks, with the embedded expressions marked with ${ }. Consider the example below:

var name = 'John';  
var age = 29;  
return `My name is ${ name }, in a year I will be ${ age + 1 } years old`;  

If a template string is preceded by a function (the tag), we're looking at a so-called tagged template string. The tag function is called with the literal pieces of the string passed as an array, followed by each evaluated expression as its own parameter. The idea here is that a tag function can generate results based on a processed template string - something that'll be extremely useful in the many cases in which JS is used to generate content or escape HTML.

var log = function (literals, ...values) {  
    console.log(literals[0]); // "Hello, "
    console.log(values[0]);   // "John"
    return "Logged!";
};

var name = "John";  
var result = log`Hello, ${name}!`;  
return result; // "Logged!"  
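As a sketch of the HTML-escaping use case mentioned above (safeHtml is a made-up tag, not part of the standard):

```javascript
// A tag that escapes interpolated values while keeping the literal parts as-is.
var safeHtml = function (literals, ...values) {
    var result = literals[0];
    values.forEach(function (value, i) {
        result += String(value)
            .replace(/&/g, '&amp;')
            .replace(/</g, '&lt;')
            .replace(/>/g, '&gt;');
        result += literals[i + 1];
    });
    return result;
};

var userInput = '<script>alert(1)</script>';
console.log(safeHtml`<p>${userInput}</p>`);
// "<p>&lt;script&gt;alert(1)&lt;/script&gt;</p>"
```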

New Collections: Map, Set, WeakMap, WeakSet

JavaScript is getting more collections, specifically Map and Set as well as weak versions of both. A Map is a simple key/value store created with new Map([]), where the passed array contains key/value pairs. A Set, on the other hand, is a collection of unique values of any kind (including object references). Uniqueness makes the Set slightly tricky: values are compared with a different algorithm than the === operator (roughly, === with NaN considered equal to itself). For the details, check major references like the MDN or MSDN.
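A quick sketch of both collections in action:

```javascript
// Map: key/value pairs, seeded from an array of [key, value] arrays
var ages = new Map([['John', 29], ['Tim', 32]]);
ages.set('Helen', 25);
console.log(ages.get('John')); // 29
console.log(ages.size);        // 3

// Set: unique values only
var names = new Set(['John', 'Tim', 'John']); // the duplicate is dropped
names.add('Tim');                             // already present, no effect
console.log(names.size);        // 2
console.log(names.has('John')); // true
```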

Let's talk about the weak versions. The WeakMap holds key/value pairs too, but isn't enumerable. The references to key objects are held "weakly", meaning that they do not prevent garbage collection if there are no other references to the object. The WeakSet also only holds weak references, and additionally only allows objects (not arbitrary values of any kind). Again, the collection doesn't hold strong references to its elements, allowing for garbage collection.


Promises

You might be surprised to see promises here - after all, you have probably used them before. The major change: promises will finally become a native feature of JavaScript. A quick refresher: promises are objects describing a deferred, asynchronous computation - a placeholder for a value that isn't available yet, with then() callbacks that fire once it is. There isn't really anything new in the spec - looking at the code below, you shouldn't see anything surprising if you've used jQuery or other promise libraries before. It's great to see, though, that we'll finally be able to just call new Promise(). If this concept is new to you, I recommend checking out the Promises/A+ spec.

new Promise(function(resolve, reject) { … });  
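A minimal sketch of creating and consuming a native promise:

```javascript
// Wrap setTimeout in a promise that resolves after the given delay
var wait = function (ms) {
    return new Promise(function (resolve, reject) {
        setTimeout(function () {
            resolve(ms + 'ms passed');
        }, ms);
    });
};

wait(10).then(function (message) {
    console.log(message); // "10ms passed"
});
```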

Is this it? Using ES6 "Harmony" today

ES6 is not quite complete yet - the features mentioned here are merely the ones most ready for prime time, and more are in the works. Major browsers and engines are currently implementing the new standard, but if you want to be on the safe side, you should transpile your ES6 code to compatible ES5 code - for instance using the excellent and popular project 6to5.

ES6 and Node.js

If you dislike the idea of a tool changing your code through transpilation (and there are good reasons to), you have choices. The unstable Node.js 0.11 branch (as well as the latest 0.12 versions) supports a --harmony flag, enabling experimental support for many ES6 features. It's pretty buggy though, given that the flag enables all ES6 features - even the ones that are still heavily "work in progress". More fine-tuned flags are available, but it doesn't feel quite ready for production just yet. An alternative is the Node community's enfant terrible, io.js - there, the ES6 features considered stable by Google's V8 engine are enabled by default, making io.js a good choice for ES6-keen Node developers.

ES6 and Browsers

Well, transpiling is still the best option, but native browser support is getting there quickly. This compatibility table tracks browser developers' current progress integrating ES6 - as you can see, IE's technical preview is leading (let's admit it, that's surprising), with Chrome and Firefox close behind. As usual, it's probably mobile browsers that will have you using polyfills and transpilers for quite some time to come.

Should I use it?

While the majority of production use cases won't require ES6 just yet, I strongly recommend playing with it. If you're starting a sophisticated project that you will still be working on in 2016, I recommend going with ES6 today and transpiling until widespread support is available - 2015 is certainly the year of JavaScript "Harmony".