Solving Presence in Rails: Pusher vs. Node Service

There you are, sipping a mocha, writing another Rails app. It has users, and you’d love it if those users could interact and have some deep meaningful realtime connection. What do you reach for? Pusher? Generally that’s my first go-to for realtime stuff! Pusher is awesome. It’s a software-as-a-service platform for adding realtime awesomeness to your app. As you would expect, there are libraries in every flavor so you can use Pusher from whatever crazy setup you’ve got.

In the past, I’ve dropped the pusher gem into Rails apps and wham! Now I’m pushin’ realtime updates to my users from the server. If you’re curious how easy it is to get started with Pusher + Rails, this is it:

If you want to follow along at home, you can check out the 736e3b861705 branch of the code_racer repo.

Add the `pusher_rails` gem to the Gemfile, then copy and paste the initializer code they give you when you create an app on their site.

# Gemfile
gem 'pusher_rails'

# config/initializers/pusher.rb
require 'pusher'
Pusher.url = "http://#{ some key they give you }:#{ some secret they give you }@api.pusherapp.com/apps/#{ some app id they give you }"
Pusher.logger = Rails.logger

Initialize an instance of a `Pusher` object in javascript somewhere.
(This is also how you might set up logging.)

// app/assets/javascripts/application.js
Pusher.log = function (message) {
  if (window.console && window.console.log) {
    window.console.log(message);
  }
};

var pusher = new Pusher('some key they give you');

That’s it. You’re now up and running and can subscribe to events that are being pushed to the client.

Wow, that was easy. What’s the catch? Where does it fall down? Great question! (Disclaimer: you can get more out of Pusher if you pay $$; I’m interested in squeezing as much out of the free service as possible.) This is great if the client is just listening for updates, but as soon as you need clients to emit events to each other, or emit events back to the server, Pusher wants you to pay. (Thanks @Phil Leggetter! The only limitation on the free account is the number of connections.) I completely understand; it seems like a valid business model. That said, it’s surprisingly easy to get more realtime mileage if you extract that logic into a service.

Extract into a service you say…

To check out the Node service that I extracted, take a gander at the repo (the juicy stuff is in app.js).

Node.js is a great platform for realtime communication and has many tools surrounding it [see socket.io and peer.js]. IMO, it’s got a beautiful evented architecture and great tools for managing tiny requests and streams efficiently.

Following the socket.io docs was a great start to getting things running locally. Moving the client side logic to the javascript served from my Rails app and replacing all the client side references to localhost with my heroku domain running the node app was enough to get going.

So what to change in Rails? I was able to strip out every single reference to Pusher, including the gem and initializer :), and then add a few lines requiring the socket.io client library.

For me, the trick to getting everything to play nicely was setting up the connection from the client, then, as soon as the first Rails page loaded, emitting a `register` event and storing a hash of users by socket id in node. Essentially syncing the sessions.

One beauty of using socket.io is that you get presence (who’s online) just by storing this hash of users.

How did I arrive at this solution? Why move away from Pusher and into Node? What was the smell/thing to look for that pushed me to make this huge change? Another great question! My presence implementation started to feel, um, hella hacky. Let’s look at some code:

At some point I decided that users should be able to see who else is online (presence). I thought of a few ways to accomplish this, the first of which was to emit a `hello` event to all other people in the channel when the page loads. (I think you can do this with paid Pusher; I’m essentially building a toy, so that wasn’t an option.)

Okay, option 2: I’ll send an xhr request when the page loads and post to a Rails endpoint (I called it /api/online_users). This was a pretty cool, but fragile, solution. Here are a couple commits with most of the code: w1zeman1p/code_racer/commit/f5e6abee69

The gist is that on document ready, send a POST; on beforeunload, send a DELETE. Then have all clients bind to a channel called `presence`, and when an OnlineUser resource is created, trigger that event and notify all users.

// on document ready
$.ajax({
  url: '/api/online_user',
  type: 'POST',
  data: window.CURRENT_RACER
});

// cleanup stuff
function cleanup() {
  return $.ajax({
    url: '/api/online_user',
    type: 'DELETE'
  });
}

$(window).on('beforeunload', function () {
  var x = cleanup();
  return x;
});

// bind all users to presence channel, and listen for add_user
CodeRacer.pusher = new Pusher(key);
CodeRacer.presence = CodeRacer.pusher.subscribe('presence');
CodeRacer.presence.bind('add_user', function (data) {
  console.log('User coming online:', data);
});

From the Rails side, one option is to store all these users that are online in the SQL database. I didn’t go down that path for fear that talking to the SQL db would be too slow (I didn’t do any perf testing here, might be worth a try).

I tried using the Rails cache; in production I used memcachier. This worked pretty well, until some users’ `beforeunload` DELETE never fired and they ended up sticking around. More code?

# app/controllers/api/online_users_controller.rb
before_action :get_users
after_action :set_users

def create
  @users << user_hash unless @users.include?(user_hash)
  Pusher['presence'].trigger('add_user', user_hash)
  render json: @users
end

def user_hash
  { nickname: current_user.nickname }
end

def get_users
  @users ||= Rails.cache.read('users') || []
end

def set_users
  Rails.cache.write('users', @users.to_a)
end
# ...

Kinda hacky? Yeah, I thought so too. It all depends on the `beforeunload` event firing just right and actually completing the DELETE request perfectly to remove the user from the cache. I suppose I could poll… ew. gross. No thanks.

Option 3! Replace the online user resource completely with a node service. This was the winner. No Rails controller (talk about skinny ;)), no Rails cache, no $.ajax requests, all socket.io.

Here’s a jumpstart for getting some node code running and doing presence with `register` and `online_users` events. The idea here is that we’ll emit a `register` event from the client when the page loads, and we’ll listen for an `online_users` event for batch updates about who’s online (this could probably be more efficient if we just listened for add and remove rather than batch updating?).

// Node application running in a separate instance than the Rails server.
// app.js
var http = require('http'),
  static = require('node-static'),
  file = new static.Server('./public'),
  _ = require('lodash');

var server = http.createServer(function (req, res) {
  req.addListener('end', function () {
    file.serve(req, res);
  }).resume();
});

// process.env.PORT is for heroku :)
server.listen(process.env.PORT || 8000);
var io = require('socket.io')(server);
var users = {};

io.on('connection', function (socket) {
  socket.on('register', function (data) {
    users[socket.id] = data;
    io.sockets.emit('online_users', _.values(users));
  });

  socket.on('disconnect', function () {
    delete users[socket.id];
    io.sockets.emit('online_users', _.values(users));
  });
});
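As an aside on the batch-vs-diff question from earlier: if broadcasting the full list ever felt heavy, one option is to diff two snapshots of the `users` hash and emit only what changed. A minimal sketch (the `presenceDiff` helper is hypothetical, not part of the app):

```javascript
// Hypothetical helper: given two snapshots of the users hash
// (socket id -> user data), figure out who joined and who left.
function presenceDiff(before, after) {
  var added = Object.keys(after).filter(function (id) {
    return !(id in before);
  });
  var removed = Object.keys(before).filter(function (id) {
    return !(id in after);
  });
  return { added: added, removed: removed };
}

// You could then emit 'add_user' for each id in added,
// and 'remove_user' for each id in removed, instead of 'online_users'.
```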

Now that we’ve fired up a node app and we’re listening for connections, let’s see the code we’ll need from the client.

// app/assets/javascripts/application.js
var socket = io('https://your-node-app.herokuapp.com'); // placeholder: the domain running your node app
socket.on('online_users', function (data) {
  console.log('online users: ', data);
});

// on document ready
socket.emit('register', window.CURRENT_RACER);

Where can we go from here? What incredible powers does this give us? Peer to peer! A feature I’d love to add is voice/video of the racers, so you can see that look of focus and determination as they type as fast as possible :). Seems like a pretty reasonable feature to add with peer.js using WebRTC.


App Landing Page for Ionic app

We’ve all been to sites that explain features of an app and entice you to visit the market place and download. If you’ve developed an app and are starting to build out a landing page, you might have googled “app landing page” and found some themes. These are flashy designs that display the features of the app and expect the creator to drop in app screenshots that may slide around or transition as you move between features.

Something great about apps built with ionic framework is that you can demo the nearly full (minus device features) app right on your marketing page (not sure this is cool, legally). I’ve done that for a couple ionic apps: Pushbit and Insider AI, and want to show you how.

These two pages are both being served from Rails apps, but you shouldn’t have any difficulty getting things working from your own server.

The key here is to run the ionic app in an iframe.

In Rails, there is a /public directory that contains static html. In /public I’ll create a directory called app and copy the contents of www to app.
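That copy step is just a couple of commands. A sketch, assuming the Ionic project sits next to the Rails app (both paths below are hypothetical stand-ins, and the stand-in `index.html` is only there so the commands run on their own):

```shell
# Hypothetical layout: Ionic project at ./ionic-app, Rails app at ./rails-app.
mkdir -p ionic-app/www                            # stand-in for the Ionic project
echo '<html>app</html>' > ionic-app/www/index.html  # stand-in for the real build output

# The actual step: mirror www/ into Rails' public/app/ so it's served statically.
mkdir -p rails-app/public/app
cp -R ionic-app/www/. rails-app/public/app/
```

After this, /app/index.html is served by Rails as a static file, no controller involved.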

In the html for the page, I’ll put the image of the device as the background, then an iframe pointing to the app’s index.html:

<div id="phone">
  <img src="/assets/iphone6.png">
  <iframe src="/app/index.html" frameBorder="0" class="screens"></iframe>
</div>

Getting started: Ionic + ES6

For some context: When working on front end stuff, I’ve always written pure javascript, with the exception of a few one-off CoffeeScript fixes. I’m super stoked about ES6 and have been converting many of my js-heavy projects to use some of the ES6 syntax. Most recently I’ve been working on an app called Pushbit for competing with friends on weekly pushup count. It’s written using the ionic framework, I’ve built much of it using some cool new ES6 syntax, and I’d love to share my build process.

Here’s how I got started (follow along on github: w1zeman1p/es6demo)

ionic start es6demo
cd es6demo

We’ll need to transpile our es6 code into es5 code for it to work on all current browsers including the phones. I’ve chosen to create a directory called `jssrc` at the root level of the app to store all my javascript code. The transpiled file will end up in www/js.

mkdir jssrc
mv www/js/* jssrc/

Luckily, ionic comes with a nice gulpfile that we can add to. I like gulp, and in case you enjoy a different flavor of build system you should check out Addy Osmani’s list of es6 tools. The idea here is that we want some npm modules to read our es6 code and output a single concatenated es5 javascript file. We’ll use 4 new npm modules, which you might need to install using these commands:

npm install gulp-traceur
npm install gulp-sourcemaps
npm install gulp-watch
npm install gulp-concat

Here’s a snippet to get you started:

var gulp = require('gulp'),
  concat = require('gulp-concat'),
  sourcemaps = require('gulp-sourcemaps'),
  traceur = require('gulp-traceur'),
  watch = require('gulp-watch');

var paths = {
  scripts: ['./jssrc/**/*.js']
};

gulp.task('scripts', function () {
  return gulp.src(paths.scripts)
    .pipe(sourcemaps.init())
    .pipe(traceur())
    .pipe(concat('all.js'))
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('./www/js'));
});

gulp.task('default', ['scripts']);

gulp.task('watch', function () {
  gulp.watch(paths.scripts, ['scripts']);
});

Lets go through this and talk about each piece.

First we need to require the modules we’ll use for transpiling. I’ve tried gulp-traceur and gulp-6to5, which are both gulp packages that overlay Traceur and 6to5 respectively. Traceur is a project out of Google and seems to have the most traction at the moment. It was also the tool that worked best for me. gulp-sourcemaps is used to build a sourcemap file that can be used by the browser during debugging to show you the original code, rather than the transpiled es5. gulp-watch is handy for re-running your gulp task whenever your es6 files change.

sourcemaps = require('gulp-sourcemaps'),
traceur = require('gulp-traceur'),

I’ve added `scripts: [‘./jssrc/**/*.js’]` to the given `paths` variable.

The `scripts` task will be used to convert our scripts. First it reads the files in `paths.scripts`, then initializes sourcemaps, then runs the files through traceur, then concatenates them into a file called `all.js`, then drops that file into www/js.

gulp.task('scripts', function () {
  return gulp.src(paths.scripts)
    .pipe(sourcemaps.init())
    .pipe(traceur())
    .pipe(concat('all.js'))
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('./www/js'));
});

Now that our gulpfile is all good to go, we can run `gulp` and `gulp watch`, which will start listening for changes in our javascript files.

gulp
gulp watch

Let’s check out the www/index.html file and make sure we’re including the new transpiled `all.js` file.

There is an autogenerated section that looks like this:

    <!-- your app's js -->
    <script src="js/app.js"></script>
    <script src="js/controllers.js"></script>
    <script src="js/services.js"></script>

Update that to reference `all.js`.

    <!-- your app's js -->
    <script src="js/all.js"></script>

We also need to add in a few special files from the Traceur project that will allow us to run traceur-compiled js files in the browser.

Add these scripts to your index.html head:

  <script src=""></script>
  <script src=""></script>

Let’s fire up the server and see the result.

ionic serve

See the default app in the browser? Cool!

Let’s get to writing some ES6 to test it out. Open up jssrc/app.js. First I like to go through and convert all anonymous function arguments to the `() => {}` syntax. Something like this:

// ... OLD
angular.module('starter', ['ionic', 'starter.controllers', 'starter.services'])
  .run(function($ionicPlatform) {
    $ionicPlatform.ready(function() {
      // ...
    });
  });

// ... NEW!
angular.module('starter', ['ionic', 'starter.controllers', 'starter.services'])
  .run(($ionicPlatform) => {
    $ionicPlatform.ready(() => {
      // ...
    });
  });

Refresh the page and all should be well.

Let’s try something a little more interesting. Add the following snippet at the bottom of `jssrc/app.js`:

class Snowman {
  constructor(name) {
    this.name = name;
  }

  sayHi() {
    console.log("Hi! I'm " + this.name + " and I like warm hugs.");
  }
}

var olaf = new Snowman('Olaf');
olaf.sayHi();

Open the dev console (cmd+opt+i), then refresh and observe the message from the instance of our es6 class.
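Classes and arrows aren’t the only things Traceur handles. Two other bits of ES6 worth trying in jssrc/app.js are template strings and default parameters. A small sketch (the `greet` function here is just for illustration, not part of the starter app):

```javascript
// Template strings (backticks + ${}) and a default parameter value,
// both transpiled to es5 by the same gulp pipeline.
function greet(name = 'stranger') {
  return `Hi, ${name}! Welcome to the race.`;
}

console.log(greet());        // → Hi, stranger! Welcome to the race.
console.log(greet('Olaf'));  // → Hi, Olaf! Welcome to the race.
```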

That should get you started. In a following post I’ll show you how I built out my angular model layer with influences from my backbone.js experience.


Suck at reading? Watch or listen instead

One of my biggest weaknesses is reading speed. I must read something a few times before extracting meaning. Recently, I’ve embraced this and know that if I want to learn something I should just find the same content in a different medium.

While in school, I compensated for my lack of reading comprehension by paying extremely close attention during lectures. I avoided any classes that required heavy reading and took as many STEM classes as possible. In college, I claimed to hate subjects like History, Humanities and English. Looking back I know it was mostly just fear of failing.

People often ask if I’ve read certain books and my answer is usually “I don’t know how to read.” I’ll explain that rather than reading a book I’ll spend some time finding a video screencast, podcast or audiobook that covers the same subject. Today, educational tech content is so available it’s awesome.

Wait, aren’t there way more books than podcasts and videos? Great question! There is certainly more written content than audio/video content about web development. That said, it’s unreasonable to assume that someone would read all the books. It’s also fair to say there is a lot of overlap in many of the books. I’ve been following my listen/watch focused learning strategy for about 4 years now and I’ve never run out of content (and I listen to podcasts at 1.5x).

Great resources for watching web development content:

Great resources for listening to web development content:
