Hemanth.HM

A Computer Polyglot, CLI + WEB ♥'r.

What's New in Node V10?


Node v10 has a good set of new features; below are a few of them that I liked.

FS promisified:

The fs/promises API is an experimental promisified version of the fs functions.

const fsp = require('fs/promises');
const stat = async (dir) => fsp.stat(dir);
stat(".")
.then(console.log, console.error)
.catch(console.error);

console.table:

console.table([{ a: 1, b: 'Y' }, { a: 'Z', b: 2 }]);
// ┌─────────┬─────┬─────┐
// │ (index) │  a  │  b  │
// ├─────────┼─────┼─────┤
// │    0    │  1  │ 'Y' │
// │    1    │ 'Z' │  2  │
// └─────────┴─────┴─────┘

top level await in REPL:

Starting the REPL with the --experimental-repl-await flag enables top-level await, so you need not wrap it in an async function.

$ node --experimental-repl-await

> const fsp = require('fs/promises');
> await fsp.stat(".");

pipeline for streams:

const fs = require('fs');
const util = require('util');
const stream = require('stream');
const zlib = require('zlib');
const pipeline = util.promisify(stream.pipeline);

async function run() {
  await pipeline(
    fs.createReadStream('archive.tar'),
    zlib.createGzip(),
    fs.createWriteStream('archive.tar.gz')
  );
  console.log('Pipeline succeeded');
}

run().catch(console.error);

async generators:

const answer = async function* () { yield 42; };

const iterator = answer();
iterator[Symbol.asyncIterator]() === iterator; // true
iterator.next().then(step => {
  step.done;  // false
  step.value; // 42
});

for-await-of-loops:

async function foo() {
    const Iters = [
        Promise.resolve('foo'),
        Promise.resolve('bar'),
    ];
    for await (const it of Iters) {
        console.log(it);
    }
}
foo();

Optional catch binding:

function meow() {
  try {
    throw new Error();
  } catch {
    return true;
  }
  return false;
}

RegExp Unicode Property Escapes:

(/\p{Script=Greek}/u).test('π'); // true

util.types.is[…]:

No more type-check deps!

> Object.keys(util.types)
[ 'isExternal',
  'isDate',
  'isArgumentsObject',
  'isBooleanObject',
  'isNumberObject',
  'isStringObject',
  'isSymbolObject',
  'isNativeError',
  'isRegExp',
  'isAsyncFunction',
  'isGeneratorFunction',
  'isGeneratorObject',
  'isPromise',
  'isMap',
  'isSet',
  'isMapIterator',
  'isSetIterator',
  'isWeakMap',
  'isWeakSet',
  'isArrayBuffer',
  'isDataView',
  'isSharedArrayBuffer',
  'isProxy',
  'isWebAssemblyCompiledModule',
  'isModuleNamespaceObject',
  'isAnyArrayBuffer',
  'isArrayBufferView',
  'isTypedArray',
  'isUint8Array',
  'isUint8ClampedArray',
  'isUint16Array',
  'isUint32Array',
  'isInt8Array',
  'isInt16Array',
  'isInt32Array',
  'isFloat32Array',
  'isFloat64Array',
  'isBigInt64Array',
  'isBigUint64Array' ]
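A few of these in action (a quick sketch; the values are picked arbitrarily):

const util = require('util');

util.types.isPromise(Promise.resolve(42));   // true
util.types.isDate(new Date());               // true
util.types.isAsyncFunction(async () => {});  // true
util.types.isRegExp(/meow/);                 // true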

Redux Async Actions


The most common question I hear after an intro to Redux is: "How do I fetch some data in actions?"

Most people hit the roadblock: "Actions must be plain objects. Use custom middleware for async actions." That is because actions are meant to be plain JavaScript objects and must have a type property that indicates the type of action being performed.

Let us see a quick example making an API request, say to this xkcd comic API.

There is no community consensus on handling async actions, and there are many libs out there that make handling them easier, but in the example below we shall take the vanilla approach.

Let us start with an initial state that looks like:

const initialState = {
  loading: false,
  error: false,
  comic: null
}

A reducer which handles the fetching, fetched and failed states of the action:

const reducer = (state = initialState, action) => {
  switch (action.type) {
    case 'FETCHING_COMIC':
      return {
        ...state,
        loading: true
      }
    case 'FETCH_COMIC_SUCCESS':
      return {
        ...state,
        loading: false,
        comic: action.comic
      }
    case 'FETCH_COMIC_FAILED':
      return {
        ...state,
        loading: false,
        error: action.error
      }
    default:
      return state
  }
}

A store and dispatches based on the flow:

const store = Redux.createStore(reducer);

store.dispatch({
  type: 'FETCHING_COMIC'
})

fetch('https://xkcd-imgs.herokuapp.com/')
  .then(response => response.json())
  .then(comic => {
    store.dispatch({
      type: 'FETCH_COMIC_SUCCESS',
      comic
    })
  })
  .catch(error => store.dispatch({
    type: 'FETCH_COMIC_FAILED',
    error
  }))

Some mandatory render method (not React this time ;))

const render = function(state) {
  let xkcd = document.querySelector('#xkcd');
  xkcd.src = state.comic.url;
  xkcd.alt = state.comic.title;
};
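To wire the render up to store changes, one way (a small sketch, separate from the embedded working code below) is to subscribe it to the store and guard against the initial null comic:

store.subscribe(() => {
  const state = store.getState();
  // Render only once a comic has actually arrived.
  if (state.comic) {
    render(state);
  }
});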

Working code:

Some interesting discussions:

P.S: Thanks to <GreenJello> on the quick review.

Node Logger Appcrash


Lately /me came across an app crash whose logs indicated that the issue was in the logger itself; the error read: "URIError: malformed URI sequence"

Digging through the debug logs, I reached a point in the logger where there was a log of an object which looked like:

querystring.unescape(querystring.stringify(obj))

Where querystring is node's inbuilt querystring lib.

Test cases were all green for this particular method; I tried to break it with different types of inputs and wasn't able to break it in one shot.

Digging further into the node source for querystring, there was this sweet line throw new URIError('URI malformed'); which was exactly what I was looking for, and the comment there also read // Surrogate pair. That was a big clue, so I played around a bit with unicode surrogate pairs, and after a while I was able to reproduce the error!

So a lone half of a surrogate pair, anything in the range \uD800 to \uDFFF on its own, would break querystring.stringify:

> const data = {meow: '\uDFFF'};

> querystring.stringify(data)
URIError: URI malformed
    at QueryString.escape (querystring.js:154:13)
    at Object.QueryString.stringify.QueryString.encode (querystring.js:210:24)
    at repl:1:13
    at realRunInThisContextScript (vm.js:22:35)
    at sigintHandlersWrap (vm.js:98:12)
    at ContextifyScript.Script.runInThisContext (vm.js:24:12)
    at REPLServer.defaultEval (repl.js:346:29)
    at bound (domain.js:280:14)
    at REPLServer.runBound [as eval] (domain.js:293:12)
    at REPLServer.onLine (repl.js:544:10)

P.S: I also did a twitter thread (pop quiz on JS) for the fun of it!

Rethinking Async in Javascript


This post is more like a drama script. I would love to see the below conversation as a one-act play on a stage!

Master: Can you write a function to read the contents of a file?

Apprentice: hmm.. that's very easy!

function read(filename){
  return fs.readFileSync(filename, 'utf8');
}

Master: Okies....say the file is like 10GB in size?

Apprentice:

function read(filename, callback){
  fs.readFile(filename, 'utf8', function (err, res){
    if (err) {
     return callback(err);
    }
    callback(null,res);
  });
}

Master: OK, not bad...now process that file.

Apprentice:


function process(file){
  /* Some processing stuff */
}
function read(filename, callback){
  fs.readFile(filename, 'utf8', function (err, res){
    if (err) {
     return callback(err);
    }
    try {
      callback(null, process(res));
    } catch (ex) {
      callback(ex);
    }
  });
}

function read(filename, callback){
  fs.readFile(filename, 'utf8', function (err, res){
    if (err) {
     return callback(err);
    }
    try {
      res = process(res);
    } catch (ex) {
      return callback(ex);
    }
    callback(null, res);
  });
}

Master: Takes a gasp and talks about

how do we avoid callback hell?

  • Name your functions.

  • Keep your code shallow.

  • Modularize!

  • Binding this
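For instance, the first two points in practice, a minimal sketch with named, shallow callbacks instead of anonymous nesting:

const fs = require('fs');

function onError(err) {
  console.error('failed:', err.message);
}

// Named and shallow: each step is a top-level function, not a nested closure.
function onRead(err, data) {
  if (err) return onError(err);
  console.log('read %d bytes', data.length);
}

fs.readFile('package.json', onRead);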


Apprentice: Makes a sad face.

Master: When you know...then you know!


Let us make a promise!

  • fulfilled

  • rejected

  • pending

  • settled


var promise = new Promise(function(resolve, reject) {
  // Some async...
  if (true /* Allz well */) {
    resolve("It worked!");
  }
  else {
    reject(Error("It did not work :'("));
  }
});
promise.then(function(result) {
  console.log(result); // "It worked!"
}, function(err) {
  console.log(err); // Error: "It did not work :'("
});

async1().then(function() {
  return async2();
}).then(function() {
  return async3();
}).catch(function(err) {
  return asyncHeal1();
}).then(function() {
  return asyncHeal4();
}, function(err) {
  return asyncHeal2();
}).catch(function(err) {
  console.log("Ignore them");
}).then(function() {
  console.log("I'm done!");
});

Let us talk about async map:

async.map(['file1','file2','file3'], fs.stat, function(err, results){
    // results is now an array of stats for each file
});

async.filter(['file1','file2','file3'], fs.exists, function(results){
    // results now equals an array of the existing files
});

async.parallel([
    function(){ ... },
    function(){ ... }
], callback);

async.series([
    function(){ ... },
    function(){ ... }
]);

A: Happ(Y)ily ever after?


M: No! We have some more issues:

Streams are broken, callbacks are not great to work with, errors are vague, tooling is not great, community convention is sort of there..


  • you may get duplicate callbacks
  • you may not get a callback at all (lost in limbo)
  • you may get out-of-band errors
  • emitters may get multiple “error” events
  • missing “error” events sends everything to hell
  • often unsure what requires “error” handlers
  • “error” handlers are very verbose
  • callbacks suck

Master: Let us talk about generators:


function *Counter(){
 let n = 0;
 while(1<2) {
   yield n;
   n = n + 1;
 }
}
let CountIter = Counter();

CountIter.next();
// Would result in { value: 0, done: false }

// Again 
CountIter.next();
//Would result in { value: 1, done: false }

function *fibonacci() {
    let [prev, curr] = [0, 1];
    for (;;) {
        [prev, curr] = [curr, prev + curr];
        yield curr;
    }
}
for (const fib of fibonacci()) {
    if (fib === 42)
        break;
    console.log(fib);
}

function *powPuff() {
  return Math.pow((yield "x"), (yield "y"));
}
let puff = powPuff()

puff.next();

puff.next(2);

puff.next(3); // Guess ;)

function* menu(){
  while (true){
    var val = yield null;
    console.log('I ate:', val);
  }
}


let meEat = menu();

meEat.next();

meEat.next("Poori");

meEat.next("Pizza");

meEat.throw(new Error("Burp!"));


function* menu(){
  while (true){
    try{
      var val = yield null;
      console.log('I ate: ', val);
    }catch(e){
      console.log('Good, now pay the bill :P');
    }
  }
}
meEat.throw(new Error("Burp!"));

M: We can delegate!

var inorder = function* inorder(node) {
  if (node) {
    yield* inorder(node.left);
    yield node.label;
    yield* inorder(node.right);
  }
}

A: Confused

M: Deeper you must go! Hmmm


function *theAnswer() {
  yield 42;
}

var ans = theAnswer();

ans.next();

A: MORE CONFUSED

M: So....


function theAnswer() {
  var state = 0;
  return {
    next: function() {
      switch (state) {
        case 0:
          state = 1;
          return {
            value: 42, // The yielded value.
            done: false
          };
        case 1:
          return {
            value: undefined,
            done: true
          };
      }
    }
  };
}

M: But, this has not yet solved the initial issue!


M: Let us assume a function named run ~_~


function run(genFunc){
 /*
   _ __ ___ __ _ __ _(_) ___ 
 | '_ ` _ \ / _` |/ _` | |/ __|
 | | | | | | (_| | (_| | | (__ 
 |_| |_| |_|\__,_|\__, |_|\___|
                  |___/        
 */
}
run(function *(){
  var data = yield read('package.json');
  var result = yield process(data);
  console.log(data);
  console.log(result);
});


var fs = require('fs');

function run(fn) {
  var gen = fn();

  function next(err, res) {
    var ret = gen.next(res);
    if (ret.done) return;
    ret.value(next);
  }

  next();
}

A: WOW!

M: HMM, let us talk about THUNKS!

A: Thunks??!


let timeoutThunk = (ms) => (cb) => setTimeout(cb,ms)


function readFile(path) {
    return function(callback) {
        fs.readFile(path, callback);
    };
}

Instead of :

readFile(path, function(err, result) { ... });

We now have:

readFile(path)(function(err, result) { ... });

So that:

var data = yield read('package.json');

Master: Baaazinga!


M: More use cases! Generator-based flow control.

A: Very eager.


$ npm install thunkify

Turn a regular node function into one which returns a thunk!

var thunkify = require('thunkify');
var fs = require('fs');

var read = thunkify(fs.readFile);

read('package.json', 'utf8')(function(err, str){

});

$ npm install co

Write non-blocking code in a nice-ish way!

var co = require('co');
var thunkify = require('thunkify');
var request = require('request');
var get = thunkify(request.get);
co(function *(){
  try {
    var res = yield get('http://badhost.invalid');
    console.log(res);
  } catch(e) {
    console.log(e.code) // ENOTFOUND
  }
})();

var urls = [/* Huge list */];

// sequential

co(function *(){
  for (var i = 0; i < urls.length; i++) {
    var url = urls[i];
    var res = yield get(url);
    console.log('%s -> %s', url, res[0].statusCode);
  }
})()

// parallel

co(function *(){
  var reqs = urls.map(function(url){
    return get(url);
  });

  var codes = (yield reqs).map(function(r){ return r.statusCode });

  console.log(codes);
})()

M: Interesting? What more?


$ npm install co-sleep
var sleep = require('co-sleep');
var co = require('co');

co(function *() {
  var now = Date.now();
  // wait for 1000 ms
  yield sleep(1000);
  expect(Date.now() - now).to.not.be.below(1000);
})();

$ npm install co-ssh
var ssh = require('co-ssh');

var c = ssh({
  host: 'n.n.n.n',
  user: 'myuser',
  key: read(process.env.HOME + '/.ssh/some.pem')
});

yield c.connect();
yield c.exec('foo');
yield c.exec('bar');
yield c.exec('baz');

var monk = require('monk');
var wrap = require('co-monk'); // co-monk!
var db = monk('localhost/test');

var users = wrap(db.get('users'));
yield users.remove({});

yield users.insert({ name: 'Hemanth', species: 'Cat' });
// Par||el!
yield [
  users.insert({ name: 'Tom', species: 'Cat' }),
  users.insert({ name: 'Jerry', species: 'Rat' }),
  users.insert({ name: 'Goffy', species: 'Dog' })
];

$ npm install suspend
var fs = require('fs'),
    suspend = require('suspend'),
    resume = suspend.resume;

suspend(function*() {
    var data = yield fs.readFile(__filename, 'utf8', resume());
    console.log(data);
})();
var readFile = require('thunkify')(require('fs').readFile);

suspend(function*() {
    var package = JSON.parse(yield readFile('package.json', 'utf8'));
    console.log(package.name);
})();

A: Is that all? Can I go home now?

M: No!?


M: Koa FTW?!


$ npm install koa
var koa = require('koa');
var app = koa();

// logger

app.use(function *(next){
  var start = new Date;
  yield next;
  var ms = new Date - start;
  console.log('%s %s - %s', this.method, this.url, ms);
});

// response

app.use(function *(){
  this.body = 'Hello World';
});

app.listen(3000);

A: Something for the client?


M: Hmmm, good question, we have Task.js

generators + promises = tasks

<script type="application/javascript" src="task.js"></script>

<!-- 'yield' and 'let' keywords require version opt-in -->
<script type="application/javascript;version=1.8">
function hello() {
    let { spawn, sleep } = task;
    spawn(function() { // Firefox does not yet use the function* syntax
        alert("Hello...");
        yield sleep(1000);
        alert("...world!");
    });
}
</script>

M: Sweet and simple!

spawn(function*() {
    try {
        var [foo, bar] = yield join(read("foo.json"),
                                    read("bar.json")).timeout(1000);
        render(foo);
        render(bar);
    } catch (e) {
        console.log("read failed: " + e);
    }
});

var foo, bar;
var tid = setTimeout(function() { failure(new Error("timed out")) }, 1000);

var xhr1 = makeXHR("foo.json",
                   function(txt) { foo = txt; success() },
                   function(err) { failure() });
var xhr2 = makeXHR("bar.json",
                   function(txt) { bar = txt; success() },
                   function(e) { failure(e) });

function success() {
    if (typeof foo === "string" && typeof bar === "string") {
        clearTimeout(tid);
        xhr1 = xhr2 = null;
        render(foo);
        render(bar);
    }
}

function failure(e) {
    xhr1 && xhr1.abort();
    xhr1 = null;
    xhr2 && xhr2.abort();
    xhr2 = null;
    console.log("read failed: " + e);
}

A: Thank you master I feel enlightened!

M: Are you sure?

A: Hmmm....

M: This is just the beginning!


M: Let me talk about async-await

async function <name>?<argumentlist><body>

=>

function <name>?<argumentlist>{ return spawn(function*() <body>); }

Example of animating elements with Promise:

function chainAnimationsPromise(elem, animations) {
    var ret = null;
    var p = currentPromise;
    animations.forEach(function(anim){
      p = p.then(function(val) {
            ret = val;
            return anim(elem);
        });
    });

    return p.catch(function(e) {
        /* ignore and keep going */
    }).then(function() {
        return ret;
    });
}

Same example with task.js:

function chainAnimationsGenerator(elem, animations) {
    return spawn(function*() {
        var ret = null;
        try {
            for(var anim of animations) {
                ret = yield anim(elem);
            }
        } catch(e) { /* ignore and keep going */ }
        return ret;
    });
}

Same example with async/await:

async function chainAnimationsAsync(elem, animations) {
    var ret = null;
    try {
        for(var anim of animations) {
            ret = await anim(elem);
        }
    } catch(e) { /* ignore and keep going */ }
    return ret;
}

Another example from the draft:

async function getData() {
  var items = await fetchAsync('http://example.com/users');
  return await* items.map(async(item) => {
    return {
      title: item.title,
      img: (await fetchAsync(item.userDataUrl)).img
    }
  });
}

M: Now let us do a performance review.

A: Runs away!!


Hope you liked the play! You might also like reading Are you async yet? post.

Succeed With Service Workers


A brief introduction to service workers:

Web applications are alive only if the network is reachable; if you are not connected to the network, you end up seeing an error page. This has been the major drawback of web content delivery when compared to other technology stacks.

The service worker comes to the rescue in this scenario. It provides a Web Worker context which can be started by the runtime when navigations are about to occur and consulted when navigations to that location happen; network requests are dispatched to this worker, and it can override the default network stack behavior.

Conceptually, the worker sits between the network and the document renderer, allowing it to provide content for documents, even while offline!

With the previous effort to provide offline support, HTML5 Application Cache, we have also experienced that several attributes of the design contribute to unrecoverable errors. As a result, the major design principle of the service worker is 'error recoverability'. Service workers are started and kept alive by their relationship to events, not documents; this behaviour is highly influenced by Shared Workers and by Event Pages in the Chromium extensions model.

Service workers give an overall benefit of an excellent user experience, as they provide offline support out of the box: instead of showing a 'You are offline' message, one can give a smoother offline-first experience. Thanks to caching, one can provide a faster experience with less bandwidth consumption; combined with the power of 'Add to homescreen', 'Push Notifications' and more, it becomes easy to create truly progressive applications on par with, and often better than, native apps!

Service workers may be started by user agents without an attached document and may be killed by the user agent at nearly any time.

Service worker definition from the spec would be, “Service workers are generic, event-driven, time-limited script contexts that run at an origin.“

In simple terms, we can consider it a Shared Worker that can start, process events, and die without ever handling messages from documents, and that may be started and killed many times a second.

These special powers make service workers a good candidate for a range of runtime services that may outlive the context of a particular document, such as handling push notifications, background data synchronization, responding to resource requests from other origins, or receiving centralized updates to expensive-to-calculate data.

Mind of a Service Worker:

Here are some attributes of what a service worker thinks and acts like:

  • It executes in the registering service worker client's origin.
  • Has a state, which is one of parsed, installing, installed, activating, activated, and redundant.
  • It has a script URL.
  • It has an associated containing service worker registration, which contains itself.
  • Has an associated id (a UUID), which uniquely identifies itself during the lifetime of its containing service worker registration.
  • Lifecycle events being install and activate.
  • Functional events including fetch.
  • Has a script resource map which is a List of the Record {[[key]], [[value]]} where [[key]] is a URL and [[value]] is a script resource.
  • Has a skip waiting flag. Unless stated otherwise it is unset.
  • Has an imported scripts updated flag. It is also initially unset.

Interface definition:

[Exposed=(Window,Worker)]
interface ServiceWorker : EventTarget {
  readonly attribute USVString scriptURL;
  readonly attribute ServiceWorkerState state;
  readonly attribute DOMString id;
  void postMessage(any message, optional sequence<Transferable> transfer);
  // event
  attribute EventHandler onstatechange;
};
ServiceWorker implements AbstractWorker;
enum ServiceWorkerState {
  "installing",
  "installed",
  "activating",
  "activated",
  "redundant"
};

Up and running with service workers:

Before we get up and running with service workers, there are few important things to keep in mind. A service worker runs in a worker context, has no DOM access, is non-blocking, fully async and hence APIs such as synchronous XHR and localStorage can't be used inside a service worker. Service workers only run over HTTPS, for security reasons.

Service workers have no access to the DOM, but can access:

  • The navigator object
  • The location object (read-only)
  • setTimeout()/clearTimeout() and setInterval()/clearInterval()
  • The Application Cache
  • Importing external scripts using the importScripts() method
  • Other service workers.

Service workers are restricted to HTTPS for a few reasons:

  • Better to protect end users from man-in-the-middle attacks
  • Do good by encouraging HTTPS adoption
  • Existing "playground" services (e.g. github.io) now work with HTTPS
  • HTTPS is coming across much more of the web quickly
  • Devtools can loosen the restriction for development (file://, localhost, etc.)

Here is how one would register for a ServiceWorker:

if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/pwa/sw.js', {
      scope: '/pwa/'
    }).then(reg => console.log('Yes!', reg))
      .catch(err => console.log('No :(', err));
}

In this example, /pwa/sw.js is the location of the ServiceWorker script, and it controls pages whose URL begins with /pwa/. Note that the scope is optional and defaults to the path the ServiceWorker script sits in (so /pwa/ here). .register returns a promise which resolves or rejects depending on whether the ServiceWorker was registered or not. P.S: pwa is your progressive web app ;)

It’s important to note that the page where the service worker is registered must have been served securely, i.e. over HTTPS without any certificate errors, and the script must be on the same origin as the page unless you are using importScripts.

Life cycle:

On .register the worker script goes through three stages:

  • Download
  • Install
  • Activate

You can use the events to interact with install and activate phases:

self.addEventListener('install', function(event) {
  event.waitUntil(
    doAllTheWork()
  );
});
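For example, a common thing to do inside waitUntil during install is pre-caching the app shell; a sketch, where the cache name and asset list are placeholders of your choosing:

self.addEventListener('install', function(event) {
  event.waitUntil(
    // 'pwa-shell-v1' and the asset list below are placeholders.
    caches.open('pwa-shell-v1').then(function(cache) {
      return cache.addAll([
        '/pwa/',
        '/pwa/styles.css',
        '/pwa/app.js'
      ]);
    })
  );
});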

event.waitUntil basically extends the installation process; once it is done, the activate event is triggered:

self.addEventListener('activate', function(event) {
  // Have fun!
});

So finally we are all set to control the page?

Well, not yet.

The document we called .register from isn't being controlled yet, as the ServiceWorker wasn't there for the first load; a document picks a ServiceWorker to be its controller when it navigates. If you refresh the document, it'll be under the ServiceWorker's control, and the worker will be set as navigator.serviceWorker.controller.

Friends with the Network:

There is this event called fetch that will be fired when:

There is a navigation within the ServiceWorker's scope, and for almost all the requests triggered by the pages registered to that ServiceWorker.

P.S: Requests being: the page itself, the JS, CSS, images, XHR, beacons et al.

Exceptions being: iframes & <object>s (they pick their own ServiceWorker), and requests triggered within a ServiceWorker (no scope for inception).

Listening for fetch events:

self.addEventListener('fetch', function(event) {
  console.log(event.request);
});

The request object has information about the URL, method & headers.

But the real power is in hijacking the response and sending a different response!

self.addEventListener('fetch', function(event) {
  event.respondWith(new Response("Hello from a different world!"));
});

The Response object comes from the Fetch spec, and respondWith also accepts a promise that resolves to a Response, which is helpful if the response is coming from a remote URL.
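For instance, a cache-first sketch (assuming something was cached during install, as in the placeholder snippet earlier) that falls back to the network:

self.addEventListener('fetch', function(event) {
  event.respondWith(
    // Serve from the cache when we have a match, otherwise hit the network.
    caches.match(event.request).then(function(cached) {
      return cached || fetch(event.request);
    })
  );
});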

Hope this was useful; there is much more to be added to this article, but it stayed in my drafts for a very loooong time, so I had to publish it.

Thanks to Shwetank from Opera for quick reviews.

URL Navigation From Service Worker


Given that the service worker has only read-only access to navigator.location, it is not possible to change the location from within the service worker itself, but with the WindowClient.navigate API it is easy to trigger a navigation!

The navigate() method of the WindowClient interface loads a specified URL into a controlled client page then returns a Promise that resolves to the existing WindowClient.

Say you want to navigate to a particular URL (/meow.html in this case ;)) after the service worker is activated; you could do something like:

self.addEventListener('activate', event => {
  event.waitUntil(self.clients.claim().then(() => {
    /*
      returns a Promise for a list of service worker clients.
      type could be: window, worker, sharedworker or all.
    */
    return self.clients.matchAll({type: 'window'});
  }).then(clients => {
    return clients.map(client => {
      // `WindowClient.navigate()` is not yet supported in all browsers.
      if ('navigate' in client) {
        return client.navigate('meow.html');
      }
    });
  }));
});

Hope it's clear now how a service worker can navigate the clients it controls to a given URL.

Happy navigating! ;)

Update: Directly from the spec:

The navigate() method must run these steps or their equivalent:

  • Let url be the result of parsing url with entry settings object's API base URL.

  • If url is failure, return a promise rejected with a TypeError.

  • If url is about:blank, return a promise rejected with a TypeError.

  • If the context object's associated service worker client's active worker is not the incumbent settings object's global object's service worker, return a promise rejected with a TypeError.

  • Let promise be a new promise.

  • Run these steps in parallel:

    • Let browsingContext be the context object's associated service worker client's global object's browsing context.

    • If browsingContext has discarded its Document, reject promise with a TypeError and abort these steps.

    • Queue a task to run the following substeps on the context object's associated service worker client's responsible event loop using the user interaction task source:

      • Navigate browsingContext to url with replacement enabled and exceptions enabled. The source browsing context must be browsingContext.

      • If the navigation throws an exception, reject promise with that exception and abort these steps.

      • If the origin is not the same as the service worker's origin, then:

        • Resolve promise with null.

        • Abort these steps.

      • Let client be the result of running Capture Window Client algorithm, or its equivalent, with browsingContext as the argument.

      • Resolve promise with client.

  • Return promise.

Webpack: Bundle Unimported Assets


If you are reading this post, I assume that you have enough awareness of webpack, so I shall dive directly into the code part of it.

Scenario:

Assume that there are a few Less files that you need to bundle along with webpack's output, but they are not required or imported in any of your source/target files. Below is a simple trick in the webpack config to bundle such files:

const ExtractSass = {
    // `glob_entries` expands the glob into an entry object (one entry per matched file)
    // and ExtractTextPlugin comes from extract-text-webpack-plugin; both are assumed
    // to be required above this config.
    entry: glob_entries('./public/styles/external/*.less'),
    output: {
        filename: './.build/junk/[name].[chunkhash].js',
    },
    module: {
        loaders: [
            {
                test: /\.less$/,
                loader: ExtractTextPlugin.extract('isomorphic-style-loader', 'css-loader?modules&localIdentName=[name]_[local]', 'less-loader')
            },
        ]
    },
    plugins: [
        new ExtractTextPlugin('./.build/css/[name].css')
    ]
}

The entry attribute in a webpack config basically handles a string, an array, or an object (sketched right after this list):

  • If a string is passed it's resolved to a module which is loaded upon startup.

  • If an array is passed all modules are loaded upon startup and the last one is exported.

  • If an object is passed multiple entry bundles are created. The key is the chunk name. The value can be a string or an array.
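A small sketch of the three forms, with made-up paths:

// String: a single entry, chunk named "main".
const asString = { entry: './src/index.js' };

// Array: all modules are loaded on startup, the last one is exported.
const asArray = { entry: ['./src/polyfills.js', './src/index.js'] };

// Object: one bundle per key, the key becomes the chunk name.
const asObject = {
  entry: {
    app: './src/index.js',
    admin: './src/admin.js'
  }
};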

The config above has the following dependencies:

/public/styles/external/*.less holds the files that are not required in any of the src files but need to be bundled; the trick basically is to create a JS file for each of those Less files and then extract them into their respective CSS files under the ./.build/css/ path.

Sample output:

Child
    Hash: c1d6d83b3be5f8bb84ab
    Version: webpack 1.13.1
    Time: 3738ms

     Asset                                          Size        Chunks            Chunk Names
    ./.build/junk/Header.2fcd80b30e542a8a90e2.js    1.48 kB      1  [emitted]     Header
    ./.build/junk/Info.349ad333e96a6d35313f.js      1.47 kB      2  [emitted]     Info
    ./.build/junk/Section.a33bf0dc98964f4593b9.js   1.52 kB      4  [emitted]     Section

          ./.build/css/Header.css                   77 bytes     1  [emitted]     Header
          ./.build/css/Info.css                     96 bytes     2  [emitted]     Info
          ./.build/css/Section.css                  493 bytes    3  [emitted]     Section
        + 77 hidden modules
    Child extract-text-webpack-plugin:
            + 2 hidden modules
    Child extract-text-webpack-plugin:
            + 2 hidden modules
    Child extract-text-webpack-plugin:
            + 2 hidden modules
    Child extract-text-webpack-plugin:
            + 2 hidden modules
    Child extract-text-webpack-plugin:
            + 2 hidden modules

This might be a very rare scenario, or a workaround for some server-side rendering stuff; nonetheless, it's not all that hacky and looks neat, right?

Path Resolver With JavaScript Proxies


So this morning, when I was lurking on the #javascript channel on freenode, there was this question:

merpnderp> If I have var foo = {x:{y:{z:bar}}}; And I have var keys = ['x','y','z']; How would I make something like foo[[keys.join('.')]] = 'baz'; work?

@doodadjs and I were trying to solve this using [].reduce, and then there was this silly thought of extending the same code with proxies.

So, here is the code:

var handler = {
    get: function(target, path){
        return path.split(".").reduce((o, k) => o && (k in o) ? o[k] : undefined, target);
    }
};

var obj = new Proxy({}, handler);

Now one could do something like:

obj['a'] = {b: {c : 1}}

and then obj['a.b.c'] would be equivalent to obj['a']['b']['c'], and not to worry, non-existing keys would just return undefined, as in:

obj['a.x.c'] would return undefined.
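The question that kicked this off was actually about assignment; a set trap along the same lines (my sketch, not from the original snippet) could be added to the handler too:

var deepSetHandler = {
    set: function(target, path, value) {
        var keys = path.split('.');
        var last = keys.pop();
        // Walk (and lazily create) the intermediate objects, then assign the leaf.
        var leaf = keys.reduce((o, k) => (o[k] = o[k] || {}), target);
        leaf[last] = value;
        return true; // tell the Proxy the assignment succeeded
    }
};

var deepObj = new Proxy({}, deepSetHandler);
deepObj['x.y.z'] = 'baz'; // deepObj.x.y.z === 'baz'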

P.S: Haven't checked the perfs yet, but this was just for the fun of it!

ServiceWorker Communication via MessageChannel


Two-way communication between a ServiceWorker and the main thread is easily possible with the MessageChannel API.

Let the code do the talking:

In your app.js have a simple sendMessage method that looks like:

function sendMessage(message) {
  return new Promise((resolve, reject) => {
    const messageChannel = new MessageChannel();
    messageChannel.port1.onmessage = function(event) {
      resolve(`Received a direct message from the ServiceWorker: ${event.data}`);
    };
    navigator.serviceWorker.controller.postMessage(message, [messageChannel.port2])
  });
}

And in your sw.js, receive and respond to the message:

self.addEventListener('message', function(event) {
  console.log(`Received a message from main thread: ${event.data}`);
  event.ports[0].postMessage(`Roger that! - "${event.data}"`);
});

Now, from your app.js, if you want to send a message to the service worker and get a response from it, you would:

var log = console.log.bind(console);
var error = console.error.bind(console);
sendMessage('hello').then(log, error).catch(error);

This would log something like:

Received a message from main thread: hello
Received a direct message from the ServiceWorker: Roger that! - "hello"

String.prototype.replace You Might Have Missed!


If you ask for an example of String.prototype.replace in JavaScript, the most common response would be:

var str = 'foobar';
var replaced = str.replace('bar', 'baz');

or

var str = 'The quick brown fox had a great Xmas';
var replace = str.replace(/Xmas/i, 'Christmas!');

So it's mostly the search-and-replace feature that most of them (P.S: by most of them I mean those I have come across on IRC) would talk about, without noticing that the signature of the replace method is:

replace(regexp|substr, newSubStr|function[, flags])

From the signature, the focus of this post will be on the function param; it is invoked to create the new substring and receives the below parameters:

  • match: The matched substring.

  • p1,p2..pn: The nth parenthesized submatch string.

  • offset: The offset of the matched substring.

  • string: The entire string which is being processed.

Let us see few example application of the same:

A simple example of increasing your protein and fat intake ;)

> function replacer(match) { return 2 * match }
> '10 gms of protein and 5gms of fat.'.replace(/[0-9]+/g, replacer)
'20 gms of protein and 10gms of fat.'

The below snippet replaces a Fahrenheit degree with its equivalent Celsius degree; for an input of 212F, the function returns 100C. Nothing big about this, but notice that the replace call's second argument is a function convert, which receives the parameters specified above and returns a string.

// from mdn

function f2c(temperature) {
  function convert(match, p1, offset, string) {
    return ((p1 - 32) * 5/9) + 'C';
  }
  var s = String(temperature);
  var test = /(-?\d+(?:\.\d*)?)F\b/g;
  return s.replace(test, convert);
}

This feature is most useful when you would otherwise run a replace operation in a loop; using a replacer function like this, one can totally avoid the loop.

Say the input is a string that looks like:

let str = '___.___.___.__._...__';

Where ___. is a high signal, __. is a low signal, and the rest is noise; you can filter them easily, without an explicit loop, using a replacer function like:

let str = '___.___.___.__._...__';
let res = [];
str.replace(/(___.)|(__.)/g, function(match, p1, p2) {
  if (p1) { res.push({ high: true, length: p1.length }); }
  if (p2) { res.push({ high: false, length: 1 }); }
});

console.log(res);

/*
[ { high: true, length: 4 },
  { high: true, length: 4 },
  { high: true, length: 4 },
  { high: false, length: 1 } ]
*/

Well, it has been a decent amount of time since I wrote about some fundamentals of JavaScript; hope this was useful for you. Feel free to share a few of your experiences with replacer functions.

Copyright © 2021 - Hemanth.HM - Powered by Octopress