Pure Hot Reload

2 March 2020

Developing front-end applications requires constant switching between the IDE (or a simple text editor) and the browser window. Additionally, when the code is updated, the developer needs to refresh the page, which takes a little bit of time. Although this time span can be pretty short (e.g., 1-2 seconds), if the process is repeated over and over again, it adds up to hours by the end of the work week (e.g., 2s per refresh × 50 refreshes per hour × 8 hours a day × 5 days = 4000s ≈ 66 min ≈ 1 hour). It's much better to free up this time and use it more productively, for example, to add a few extra tests or examples to the documentation.

hot reload in javascript

This is why the concept of hot reload was invented: to watch for changes in the source code, and send the browser a message that an update is needed. But HR is more than that: it's not just about reloading the window, it's about substituting existing code (functions, classes) with new code SEAMLESSLY, for maximum productivity and efficiency. This is not just a "nice" feature to have, it's a functional requirement for Preact and React components that can have STATE: if we just refreshed a component, the state would be lost and we'd have to restart our testing process from the beginning (e.g., select a destination from the list, then select a car, to get to the state when the "continue" button is showing). Hot reload allows us to preserve components' state and helps us develop applications quickly.

The Idio web server belongs to a new generation of tooling that takes maximum advantage of language features. The ES6 standard was completed in 2015, and everything that we need for web development is already available to us without any need to transpile language features, making tools such as Webpack and Babel redundant for development. And for the build step, our Node.JS development company is using Closure Compiler. The requirements are thus to transpile JSX, serve packages as modules, and enable hot reload. All of these features are provided by Idio, which aims to implement effective solutions using the bare minimum of additional code, and just lets browsers do the work. In this article, I'll explain how pure hot reload can work, that is, using JavaScript itself instead of getting vendor-locked into complex packages like Webpack.

section break

HMR 101

jQuery first appeared in 2006, and JavaScript for the web was of course quite common by then. But it was in 2012-2015 that the web really underwent its industrial revolution: Flash was being abandoned, multiple Open Source UI frameworks such as React and Vue emerged and started to gain popularity, and build tools began to fill their market niche. It was also time for hot reload to appear.

React Hot Reload

Like many React developers, I came to know Hot Module Reload from the documentation in 2017, by which time it was pretty stable. But the concept of hot module reload was first discussed by Bruce Hauman and Dan Abramov back in 2015. At the time, it was a pretty ground-breaking idea, welcomed with enthusiasm by many developers, as it's really an essential feature of a development environment that preserves your creative flow.

dan abramov talk

Dan's solution is to record the methods of a class separately and replace them in instances with proxies. When a change is detected, new methods are sent to the client, and the proxies will now execute the new code, so the actual instance and its state stay untouched. In other words, it's about wrapping methods and functions with another meta-function that is able to perform substitutions at runtime.
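The principle can be simulated in a few lines. The sketch below is my own minimal illustration, not react-hot-loader's actual code: prototype methods are replaced with wrappers that delegate to a registry of the latest implementations, so a hot update only swaps the registry entries while instances, and their state, survive.

```javascript
// registry of the CURRENT implementation of each class's methods
const registry = {}

// wrap every prototype method so it delegates to the registry
const proxyMethods = (id, Class) => {
  registry[id] = {}
  for (const name of Object.getOwnPropertyNames(Class.prototype)) {
    if (name == 'constructor') continue
    registry[id][name] = Class.prototype[name] // remember the real method
    Class.prototype[name] = function (...args) {
      // always execute the LATEST version of the method
      return registry[id][name].apply(this, args)
    }
  }
}

// a hot update only swaps implementations; instances are untouched
const update = (id, NewClass) => {
  for (const name of Object.keys(registry[id]))
    registry[id][name] = NewClass.prototype[name]
}

class Counter {
  constructor() { this.n = 0 }
  inc() { this.n += 1 }
  label() { return `count: ${this.n}` }
}
proxyMethods('Counter', Counter)

const c = new Counter()
c.inc()
console.log(c.label()) // count: 1

// an "edited" version of the class arrives from the dev server
class Counter2 {
  inc() { this.n += 2 }
  label() { return `n = ${this.n}` }
}
update('Counter', Counter2)
console.log(c.label()) // n = 1 — new code, old state
```

Note how the instance keeps its counter value while both of its methods now run the updated code.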

To enable the reload for React, we need to import the hot method and wrap the default export with it:

import { Component } from 'react'
import { hot } from 'react-hot-loader'

class App extends Component {
  render() {}
}

export default hot(module)(App)

When calling the hot method for the first time, its code will register a proxy for the class.

// hot will call:
//   'RHL' + moduleId

// the call chain: register => createClassProxy =>
// defineProxyMethods => fakeBasePrototype

/* Each method is going to be wrapped now */
function methodWrapperFactory(wrapperName, realMethod) {
  return copyMethodDescriptors(function wrappedMethod() {
    for (var _len2 = arguments.length,
      rest = Array(_len2),
      _key2 = 0; _key2 < _len2; _key2++
    ) {
      rest[_key2] = arguments[_key2];
    }

    return realMethod.apply(this, rest);
  }, realMethod);
}

A proxy means that the methods of the prototype were wrapped by a function that will call the updated methods when they arrive. This is why, when you step into a method of a class while hot reload is activated, you'll not get to it straight away, but will end up in the wrappedMethod instead, as shown in the video below.

using the debugger

The wrapper and other features make up react-hot-loader, while it's Webpack that is responsible for establishing client-server communication and providing an API for updates of MODULES (everything that's exported), but not classes (refreshes to which are handled in a more refined way by React's hot loader).

Webpack Modules

Whereas react-hot-loader can be seen as the "software" reloading mechanism, the modular system of Webpack can then be thought of as the underlying "hardware". Its internal logic is very easy to understand... NOT!

depack modular system
© Tobias Koppers

What you see is a complex build tool that implements its own modular system. In short, it analyses source code, builds dependency trees, and puts everything together (it can also tree-shake unused code). It will always wrap source code into its own, making it a black box for everyone who doesn't get their hands on its internals. The total code can then be served to the browser in chunks, but it's all tied to Webpack's own modules.

/* index.js */
const a = lib()

// becomes:
const a = Object(_lib__WEBPACK_IMPORTED_MODULE_3__["lib"])();

/* lib.js */
export const lib = () => {
  return 'example'
}

// becomes
"use strict";
/* harmony export (binding) */ __webpack_require__.d(__webpack_exports__, "lib", function() { return lib; });
const lib = () => {
  return 'example';
};
Both files are concatenated into a single chunk. The code above shows that the linchpin of Webpack is __webpack_require__, which ties together all modules throughout the bundle. It's quite similar to browserify, which also makes use of the CommonJS require method (via its prelude) to make dependencies aware of each other.

But then again, when debugging, to step into an exported function from a module, you will have to make 4 clicks instead of 1:

  • step in webpack module getter (4)
  • step in return from module getter (3)
  • step in object wrapper (2)
  • land in function call (1)

Your debugging experience suffers, and you have to set breakpoints manually in your functions in order to step into them without this inconvenience. When source maps are enabled, you will be landing in random places in the code, like so:

debugging with source maps

To understand what's going on behind the scenes, we have to disable source maps and check where we really are in the program. As I described above, your import will be wrapped in a getter of Webpack's module system, which in turn will be wrapped in an Object converter.

debugging without source maps

So not only is Webpack bulky, it also directly works against our developer experience. Those seconds we might have saved on hot reload, we're now losing when debugging our front-end applications.

Price of Stack

From this quick research we can draw the conclusion that our web interfaces today are based on a build platform, such as Webpack, and on a framework, such as React. Both of them imply an additional cost: in terms of dependencies, and of complexity introduced into the system (e.g., clicking 3 extra times to step into a function).

We've looked at the complexity derived from the implementation of the module wiring, so let's quickly discuss the technical cost in terms of 3rd-party dependencies. Webpack has become pretty much the default for front-end web development, but did you know that you're pulling in 342 additional dependencies during its installation:

webpack dependencies graph

Your node_modules will be 20MB after that, but it seems like nobody cares about that sort of thing anymore. Everyone is so used to this sad reality that it has just become normal. Webpack is probably OK for larger projects, but if you want to create a very simple JS app, you basically need a nuclear power plant, and this is repeated for each new project. Webpack is good software that has helped put together many websites, but it doesn't have to be the standard, and many people have been experimenting with other bundlers like Rollup, Parcel, and Closure Compiler.

section break

New Age Solutions

Open Source supposedly saves us from "vendor lock-in" — but does it really? Instead of corporate vendor lock-in to proprietary software, we now have Open Source vendor lock-in to popular projects like Webpack. This happened because the web was still maturing over the last 5 years.

From User To Developer

However, now we can explore another way: in 2020, we're moving on to new-age tools that provide a completely transparent experience, thanks to the adoption of modules by browsers. They will allow you to stop being a USER of 3rd-party technology, and become a DEVELOPER who takes her profession into her own hands.

Today, web developers can finally focus on utilising the language and worry less about bundling. We still need to build for older browsers, which is fine, but the development process can go directly through the browser, bypassing the custom modular systems of build tools that were required before browsers supported modules. This is how nice and important ES6 is, the benefits of which we are reaping today.


Although we've been talking about build tools up to this point, I'd like you to stop thinking in terms of the platform they provide from now on: with the new age in mind, we always focus on the principle and not the technology.

Frictionless Development

To explore the principle behind frictionless front-end development, let me introduce Idio. Idio is a professional web server which has been compiled with Closure, and has only 2 dependencies. It consists of an ES6-rewritten Koa with a collection of middleware, but it also allows you to develop JavaScript for the browser. To enable the web development process, it ships with our in-house FrontEnd middleware — one of the modules-era nimble tools that does the following:

1. You specify the directory with your frontend code, e.g., frontend.

For example, you can have the following source in frontend/index.js:

import { dep } from 'my-dep'
import local from './local'

const el = document.querySelector('#id')
dep(el) // execute dependency
local(el) // execute local

When the browser requests this JS module via a script tag, <script type="module" src="frontend"></script>, FrontEnd will intercept the request and look up my-dep in your node_modules to find its package.json. From there, the module field will be read, and the served file will actually contain the absolute paths required by the browser. Local dependencies can also be served, which would be impossible under normal circumstances when the extension (.js) is not present, but the middleware can handle that.

import { dep } from '/node_modules/my-dep/src/index.js'
import local from './local'

const el = document.querySelector('#id')
dep(el) // execute dependency
local(el) // execute local

Now the browser can import the dependency natively. The middleware will return any JS file under the /node_modules path, enabling direct dependency serving, without having to wrap anything in a vendor modular system. This simple step drastically reduces the complexity of the tool-chain, as it's the browser that's doing most of the work from this point onwards. On-the-fly modifications of paths are made with the simplest RegEx that picks up import statements.
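As an illustration, a naive version of such a rewrite fits in a few lines. This is a simplified sketch of the technique, not Idio's actual code: the hypothetical resolve function stands in for reading the module field out of the dependency's package.json.

```javascript
// rewrite bare import specifiers ('my-dep') into browser-loadable paths,
// leaving relative imports ('./local') untouched
const renameImports = (source, resolve) =>
  source.replace(
    /(import\s[^'"]*?from\s*)['"]([^./'"][^'"]*)['"]/g,
    (match, pre, name) => `${pre}'/node_modules/${name}/${resolve(name)}'`,
  )

// hypothetical resolver: in reality, the `module` field of the
// dependency's package.json is read here
const resolve = () => 'src/index.js'

console.log(renameImports("import { dep } from 'my-dep'", resolve))
// import { dep } from '/node_modules/my-dep/src/index.js'
console.log(renameImports("import local from './local'", resolve))
// import local from './local'
```

The negated character class `[^./'"]` at the start of the specifier is what distinguishes package names from relative paths, so local files pass through unchanged.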

2. You write JSX code in your frontend.

JSX is positively THE BEST templating syntax for web development, and there's no doubt about that. Although it was initially developed for React, its roots actually go much deeper than that, as MXML (Flash applications) used to have a similar structure.

flex data bindings
© Jeffry Houser 2012

JSX is a revival of MXML and an essential part of programming for the web. There's an interesting alternative to JSX called lit-html, but I see no reason to use template literals when writing tags. It's simply a hack to eliminate JSX transpilation, not a replacement for JSX, which is the daddy of modern web computing.

// JSX
import { render, Component } from 'preact'

class App extends Component {
  constructor() {
    super()
    this.state = { name: 'world' }
  }
  render({ message }) {
    return (<div>
      Hello, <span>{this.state.name}</span>
    </div>)
  }
}

render(<App message="Idio"/>, document.querySelector('.app'))

An example of using lit-html-style templates (here via the hybrids library) is the following:

import { define, html } from 'https://unpkg.com/hybrids@4.1.3/src/index.js'

const HybridsApp = {
  appName: 'Welcome to Hybrids!',

  pageLoaded: '<span>No page loaded, yet.</span>',

  page: ({ pageLoaded }) => pageLoaded,

  render: ({ appName, page }) => html`
    <p>Example of an unbundled Hybrids app with dynamic imports.</p>
    <button onclick="${load}" page="./pages/Welcome.mjs">
      Welcome Page
    </button>
    <button onclick="${load}" page="./pages/Other.mjs">
      Other Page
    </button>
  `,
}

© F1LT3R 2020, hybrids-dynamic-load.

The advantage of this over JSX is that it doesn't require transpilation and can be run natively within a browser. The disadvantage is that there's no syntax highlighting (unless you use a plugin), and it's not as fluent as JSX, which doesn't require calling the html function to start a template block.

You might be thinking that to transpile JSX you need Babel, but the fact is that you don't. You don't need to build ASTs, and you don't need to install the additional 250 dependencies that come with Babel, another "established" standard of coding. FrontEnd uses a JSX transpiler powered by regular expressions. It includes a hack to find where the initial tag (<) opens by evaluating the jsx file using the vm module of Node.JS. The tag is detected by listening for an error:

  <div className={className}>

SyntaxError: Unexpected token <

This evaluation is needed to prevent false positives, e.g., if (a<example && b). After detecting the location of the angle bracket, we are able to parse the text until the closing tag is found, unless it was self-closing. Properties are also extracted using regular expressions on strings. Now, you might think that it's very simplistic, but this stuff works, and works really well. It supports comments and pretty much everything you want from JSX, except that you can't use > in properties, e.g.,

const Next = () => (<a href="#">Next Page</a>)

const Component = ({ data = [] }) => {
  return (<div> {
    data.map((i, j) => {
      if (j > 10) return <Next />
      return i
    })
  } </div>)
}
But methods like that simply need to be taken out of the tag scope. You're probably not going to replace Webpack with Idio at your corporate insurer job, but if you're working on fun projects, there's no reason why you can't use this. Except that there are no source maps yet, which will be added later this year, but because the code produced looks absolutely natural, it's easy to debug anyway (unless you've got the super-power setup where your VS Code debugger hooks up to a running Chrome RDP process).

// JSX - compiled code
import { h, render, Component } from 'preact'

class App extends Component {
  constructor() {
    super()
    this.state = { name: 'world' }
  }
  render({ message }) {
    return (h('div',{},
      `Hello, `,h('span',{},this.state.name),
    ))
  }
}

render(h(App,{message:"Idio"}), document.querySelector('.app'))

I've created many widgets, front-end apps and back-end servers with SSR powered by this transpiler. There haven't been any problems. Maybe it will throw errors on some very advanced edge cases (that has never happened for me, although your usage might be something I've never tested), but simplifying code will always help. Its simplicity is the key.

In short, with the FrontEnd middleware from Idio, I was able to bypass the build step for the development stage, which is the most prevalent one in any web UI life cycle. By testing modern code right in a modern browser, I can reduce the friction between my actual ES6 modules (which have been a standard since 2015) and the browser. Transforms on JSX are done in real time but don't introduce any additional modular complexity.

Developer Freedom

Personally, I really wanted to breathe freely when working on each new project, and not be brought down by the anxiety that filled me up when:

  • MY node_modules were being taken over;
  • MY code was wrapped in somebody else's system; and
  • I lost all control.

There's nothing happier than being able to direct the development process yourself, being able to say, "I'm the boss here, not you, dependencies and bundlers!".

control photo with birds

The FrontEnd solution might not be approved in a corporate setting — which only confirms the fact that they've been locked into their "Open Source" software — but for JS enthusiasts and hackers, it's perfect. I'm not offering an alternative that you'll be stuck with: you can see that developing front-end is really easy in 2020, as all you need is to serve JavaScript modules with a few on-the-fly modifications. I've not invented a new system, I've only patched up those things that were missing:

  1. rename imported package names into paths: 'dep' -> '/node_modules/dep/src/module.js';
  2. serve packages and other JS files even without extensions ('./lib');
  3. transpile JSX on the fly via an MVP transpiler;
  4. [read below] implement hot reload for ES6 exports.
frontend middleware

These 4 things enable a clean development environment with minimum technical debt. With browsers supporting modules, we can leave build tools for building, and develop code natively. All the features that we might want are implemented by Chrome and iOS Safari for us to test — happy days. It's true that we'll also need to combine our frontend code into a bundle, but we'll discuss the build step at the end. For now, let's have a look at hot reload, the final requirement for the front-end, and its simplest form of implementation.

section break


In light of the above, we can engineer our requirements based on the fact that browsers will load ECMA modules. We DON'T need to come up with our own modular system. What we want the hot reload to achieve is to replace existing exports with their updated versions as files are saved on the computer.


There are 2 main areas that we need to focus on:

1. Updating functions, such as
export const fn = () => { console.log('hello world') }
2. Updating classes, e.g.,
export class MyClass { constructor() {} }

We also assume there's some entry code that renders our JS on the page, and we can execute it again upon refresh. However, there's a big difference between the 2 scenarios above: whereas functions are always executed independently of their state (unless they reference some file-scoped variable), instances of classes are created only once and need to preserve their state. If we update some code for a class whose instance has already been constructed, unlike with a pure function, we can't just "call" it again.

Let's try to solve the problem step by step. The main question is then, how to update the source code of a function?

Proxies Approach

The initial idea is to intercept requests to JS modules, find each export that the module has, record it in window._exports[{FILE_NAME}][{EXPORT_NAME}], and re-export a proxy method that calls the real export by path. Upon updates, we'll change the _exports object to point to the new version, but the proxy stays the same.

proxies approach
  • When the file is requested for the first time, the _exports object will be populated for it to contain pointers to its exports. The actual exports are overridden with their proxies that call them via the global _exports object.
  • When a file is refreshed, the _exports is updated to point to new versions of exports. Since it was proxies that were initially exported from the file, only they are called throughout the front-end code and not actual exports.

The draft implementation is as follows:

import { readFileSync } from 'fs'
import { join } from 'path'

// the directory with the front-end code
const FRONTEND = 'frontend'

/** @type {import('koa').Middleware} */
export default (ctx) => {
  ctx.type = 'application/javascript'
  // read the source code of the file
  const src = readFileSync(join(FRONTEND, ctx.path), 'utf8')
  // patch the source code (assume a `test` method was exported)
  ctx.body = `${src.replace('export', '      ')}
window._exports = window._exports || {}
window._exports['${ctx.path}'] = window._exports['${ctx.path}'] || {}
window._exports['${ctx.path}']['test'] = test
const testProxy = (...args) => window._exports['${ctx.path}']['test'](...args)
export { testProxy as test }
`
}

We read the source file and patch it on the fly to remove the actual export statements and wrap methods (hard-coded as test in this example) in a proxy, which is then exported. When this file is requested again by an adapter that listens to WebSocket updates, window._exports[ctx.path]['test'] will be updated to the new function. Because initially we exported a proxy that calls the current version of the method by its path, all code that executes the proxy will now call the new version.
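To see the mechanics in isolation, here's a simulation of what the patched module does once it reaches the browser, using a plain local object in place of window._exports:

```javascript
// the registry holds the CURRENT version of each export,
// while the rest of the app only ever holds the proxy
const _exports = { '/app.js': {} }

// version 1 of the module's `test` export
_exports['/app.js']['test'] = () => 'v1'
// the proxy that is exported in place of the real function
const test = (...args) => _exports['/app.js']['test'](...args)

console.log(test()) // v1

// a hot update re-evaluates the module body: only the record is swapped
_exports['/app.js']['test'] = () => 'v2'
console.log(test()) // v2 — same proxy, new code
```

The proxy reference never changes, which is what keeps every existing import of test valid across updates.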

Let me just reiterate the following:

Once we export a function, it's always the first version that's going to live throughout the whole front-end code. This function is what we call a proxy.

The proxies method is the only direct solution to updating the source code of functions, because there isn't anything like function.source = () => { return 'new source' }. It's not perfect, as we introduce a quite complex mechanism, and we'll also need to click twice to step into methods, which is preferable to avoid. Is there a better solution?

Bindings Intro

To dig deeper, we need to understand an essential property of ES6 modules: we always export a binding and not the actual object or primitive value. It's quite different from traditional CommonJS modules, such as are used in Node.JS. Consider the illustration below:

// lib.js
export let a = 10
export const changeA = () => {
  a = a + 1
}

Here, we export 2 BINDINGS from the lib file: a binding of the variable a, and a binding of the variable changeA. In the case of changeA, the binding is constant, which means that this variable can't bind to anything other than its original assignment (or slot in memory, in other words).

// index.js
import { a, changeA } from './lib.js'

console.log(a) // 10
changeA()
console.log(a) // 11

When we import those BINDINGS and execute the changeA function, despite the fact that the value of a is primitive, it changes to 11, because in the original file we updated the value of the binding. But this only works because we declared a with let. As you know, we can't assign over consts:

// lib.js
export const a = 10
export const changeA = () => {
  a = a + 1
  // a++ is also illegal
}

If we change the binding to be constant like above, we won't be able to make changes to it.

// index.js
import { a, changeA } from './lib.js'

console.log(a) // 10
changeA() // throws: Assignment to constant variable.

This is a very short intro, but just for the sake of completeness: you obviously can export objects and then reassign their properties. It's common sense.

export const a = {}
export const changeA = () => {
  a.b = 'hello world'
}

How does this help us with hot reload?

Exploiting Bindings

When creating proxies, we took notice of the fact that whatever we export originally persists through the rest of the code. It's the same with bindings: whatever we export first is always referenced in all places it was imported, even if we change its value. Therefore, proxies themselves are just a kind of binding, which means we can eliminate them! You can probably see where I'm going with this.
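The direction, then, is this: if the module exports its functions as writable let bindings, the reload mechanism can simply reassign them, and every importer picks up the new version automatically. Here's a sketch of the principle, with the module scope modeled as a closure (an analogy only, since the illustration can't express real ES6 live bindings; the getter plays the role the module record plays in the browser):

```javascript
// model of a module's scope: `fn` is a writable (let) binding that
// importers always read through the module record
const createModule = () => {
  let fn = () => 'v1'              // export let fn = () => 'v1'
  return {
    get fn() { return fn },        // what `import { fn }` observes
    hotUpdate(next) { fn = next }, // what the reload adapter would call
  }
}

const mod = createModule()
const importer = () => mod.fn() // some other file calling the export

console.log(importer()) // v1

// the file is saved: reassign the binding — no proxies involved
mod.hotUpdate(() => 'v2')
console.log(importer()) // v2
```

No wrapper function stands between the importer and the implementation here, which is exactly what eliminating the proxies buys us.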

section break