Today’s Readings

Apparently as developers we suck at taking full advantage of “touch keyboard” possibilities, but doing better doesn’t seem so hard!

Sticking with that line for a sec, this next article provides a couple of very simple techniques for vastly improving the usability and accessibility of form elements.

And this one provides a nifty CSS-only trick for indicating form element validation via pseudo-elements. Understandably, he’s not all that crazy about his final markup, with the extraneous span, but be sure to check out his update at the end of the article.

And speaking of Jeremy, this next author addresses Jeremy’s question about Service Worker cache handling.

And speaking of Service Workers, do you know how to debug Service Workers in Chrome DevTools?

This isn’t news to anyone, but we should all be made to listen to presentations like this over and over again, until we do something about it…

The overweight web: Average web page size is up 15% in 2014
The Website Obesity Crisis

SVG Partial Blur and iOS Style Translucency. I don’t see a practical application for this, but it looks so cool I had to mention it. Maybe as a modal overlay’s background layer?

I love learning about new, exciting presenters and thinkers, so thanks for posting the video of James’ presentation Building a Better Web Browser (his talk actually begins at about 1:49).

Gorgeous, insightful, and entertaining application of canvas rendering SVG maps that dynamically pan and animate as you read and scroll through a story. Fun!

Universal constant: sequels suck. Right? Maybe not always, as this next article brings us 12 Little-Known CSS Facts (The Sequel).

It is exciting to see front-end technologies taking a bigger foothold in the server-side world, as MVCs continue to push their way into “back-end” areas, such as setting up data persistence and sessions when using React!

And speaking of React, HTML into React components automates the process of converting static HTML into React components…

And speaking of automating React, OverReact lets you wireframe React components and download the starter files. What is this world coming to, that we have a WYSIWYG for something like React??? :-)

And speaking of automation, RoboJS dynamically loads JS depending on how the DOM is manipulated:

“Add a node to the DOM and a JS will be loaded! Remove a node and the JS will be disposed!!” No framework dependencies, and less than 6kb.

Really fascinating look under the hood of WordPress: Useful Tips To Get Started With WordPress Hooks.

And finally, this…

A pastor, a priest and a rabbi walk into a bar… All three hurt their head…

Happy reading,
Atg

Converting WordPress to Web App: Adding a Build Process

Part three of my Converting WordPress to Web App series, as we convert this site from a standard WP website to a cache-enabled, offline-first, performance-optimized, installable Web App, that takes advantage of Build and Deployment processes, and sits safely revisioned in a Version Control repo.

The steps that I plan to take are listed below; each will become a link as I complete and publish that part of the series.

  1. Series Intro
  2. Adding Version Control
  3. Adding a Build Process (this post)
  4. Adding a Deployment Process
  5. Adding Caching and Offline Support
  6. Adding “Add to home screen” functionality

And now, on to…

Adding a Build Process

Be forewarned: This is a lengthy one, digging into a lot of stuff, so get your coffee and make sure you’re in a comfy chair…

While a Build Process can be really powerful, this is where the standard WP set-up starts to quiver… Any long-time WP developer is used to the easy 5-minute installation, manual CSS, JS, and PHP edits, and FTPing changes to the web. This is the “standard” WP process.

But when a Build Process is introduced (and soon a Deployment Process), those manual edits and uploads can cause repo conflicts, and will cause lost changes. For example, if one team member manually edits a CSS file and FTPs it to the server, and another team member triggers a Build Process, a CSS preprocessor might compile a new CSS file, overwriting the first team member’s changes.

The new approach will be to make edits only in the preprocessor file (referred to as the “source” file), then let the Build Process create the CSS file (referred to as the “distribution” file). (The upcoming Deployment Process will then push that CSS file to the server.)

While this change may sound more cumbersome (and it is, but only initially!), there are way too many benefits from a Build Process to not use one. We just need to change our ways a wee bit. And change is supposed to be a good thing, right?? ;-)

Preparing for a Build Process

Before getting started, a little planning is needed. I mentioned “source” and “distribution” files above. The division is basically “before the Build Process” and “after the Build Process”, or more accurately, “used by the Build Process” and “created by the Build Process”. A deeper way to think about it is “doesn’t need to be deployed” and “does need to be deployed”.

Source files consist of any files that will get some work done to them or will only be used to create other files; these are the files that the Build Process will “process”, and thus do not need to be deployed to a server. A common practice is to put such files into a directory named src. The Build Process will then save the resulting files into a distribution directory, commonly named dist. The dist directory is what will be deployed to the server during the upcoming Deployment Process.

This means, to get ready, a little reorganizing is needed: any Less or Sass files, unminified files, individual SVG icons, etc., should go into the src directory, and WP Core, Theme and Plugin files should go into the dist directory.

At this point the repo’s root directory should look something like this abbreviated example:

/dist
  /wp-admin
  /wp-content
  /wp-includes
  .htaccess
  favicon.ico
  ...
  wp-config.php
  ...
/src
  /icons
  /scripts
  /styles
.gitattributes
.gitignore
...

Finally, Building a Build Process!

Although I am already familiar with Grunt, and Grunt is still an awesome, powerful tool, the new kid on the block is Gulp, and there are a lot of reasons to go with Gulp, so that’s what I chose for this project.

There are a ton of things Gulp can do during a Build Process, and every project’s Build Process should be at least slightly unique. Here are the tasks I want to perform during this project’s Build:

  • Concatenate and minify SVG icons.

    As I am not much of a designer, I went hunting for “free SVG social icons” and was floored at how hard it was to find good quality sets that had the six icons I need! Finally, at the end of the second page of my Google Search (I know, who knew there was more than one page of Google Search results??) I came across iconmonstr. All I can say is AWESOME!

    Then I went straight to CSS Tricks to find out how to use SVG icons…

    I found three articles that I recommend reading if this is new to you:

    1. SVG symbol a Good Choice for Icons. Don’t pay too much attention, just skim to get the idea, then move on to…
    2. SVG use with External Reference, Take 2. Basically, you can use an external SVG sprite, and it’s totally awesome, except it doesn’t work in any IE, and is just now being fixed in Edge. There are a few options listed for implementing the actual icons, and I am going with a PHP @include to add the svg block at the top of each page, because I only have a few icons, and my icons are simple, single colors. You may want another option if you have more icons or your icons are more complex.
    3. Icon System with SVG Sprites. Putting all that together into an “icon system” via a Build Process. (And since this article is focused on Grunt, I went hunting and found gulp-svgstore, which does the same thing, and even references this article.)

    In addition to concatenating the individual icons into a single sprite file, I also wanted to minify them to remove all the garbage typically found in exported SVG files. I chose to go with gulp-svgmin, mainly because it uses SVGO, which has impressed people such as Jake, and who is going to argue with Jake?

    gulp-svgmin also ensures each symbol in the sprite gets a unique ID, so the icons can be properly referenced. (A rough sketch of this icons Task appears after this list.)

  • Automate prefixing and cleaning CSS. Using mixins within preprocessors is nice, but they require continued maintenance (such as when a prefix is no longer needed), so I chose to automate even that by using Autoprefixer. While I don’t need a pre-processor, the benefits of this post-processor are way too huge to pass up… Not only does it add prefixes based on my custom browser-matrix, but it also removes the ones I no longer need. Such wow! The above article is all about Grunt, but it works with Gulp too; the “official” Gulp-version appears to be PostCSS/Autoprefixer.
  • Concatenate and/or minify CSS and JS files. I chose gulp-concat for file concatenation, then I chose gulp-minify-css for my CSS minification and gulp-uglify for my JS minification. No fancy reasons, these are just where I ended up.
  • Determine my site’s critical CSS. If you are not yet familiar with this concept, I first read about it from the Google Developers team, with regards to their PageSpeed Module, but you can find a ton of writings about it online. The following are some good inspiration, a couple specifically with regards to WordPress:

    There are two main critical CSS methods you will see used: Addy Osmani’s Critical and The Filament Group’s Critical CSS. As I compared the configuration and set-up of both, Addy’s simply looked much cleaner to me, so that’s what I went with.

    These three articles really got me set-up:

    1. To set up the Gulp Task, all I really needed was Critical’s GitHub page. The examples and in-page documentation got nearly everything working (with the caveat regarding base that is noted in the code below). The next step is to put this minified CSS into pages for first-time users.
    2. For this, I started with Ryan’s article, as he demonstrates how to get this code into WP templates (scroll to the bottom, just before the Wrapping up section).
    3. That said, Ryan’s version does mean that the critical CSS gets pushed into every page, for every download; this is a bit wasteful. To resolve that issue, I borrowed again from Jeremy’s article, where he takes advantage of a browser cookie to determine whether or not the user already has the full CSS file cached.
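Before moving on to the critical CSS code, here is a rough sketch of what the icons Task described above could look like. To be clear, this is not my exact Task, just a minimal example assuming gulp, gulp-svgmin and gulp-svgstore are installed; the paths and file names are illustrative:

// sketch: minify individual SVG icons and concatenate them into one sprite
var path = require( 'path' );
var gulp = require( 'gulp' );
var svgmin = require( 'gulp-svgmin' );
var svgstore = require( 'gulp-svgstore' );

gulp.task( 'icons', function() {
    return gulp.src( 'src/icons/*.svg' )
        // minify each icon via SVGO, prefixing its internal IDs with the file name
        // (this is where Node's path module comes in) so nothing collides
        // once everything is merged into a single sprite
        .pipe( svgmin( function( file ) {
            var prefix = path.basename( file.relative, path.extname( file.relative ) );
            return {
                plugins: [{
                    cleanupIDs: {
                        prefix: prefix + '-',
                        minify: true
                    }
                }]
            };
        }))
        // concatenate all of the icons into one <symbol> sprite
        .pipe( svgstore() )
        .pipe( gulp.dest( 'dist/wp-content/themes/atg/icons' ) );
});

Each symbol in the resulting sprite is named after its source file, so once the sprite is PHP-included into the page (as described in item 2 above), a hypothetical twitter.svg icon could be referenced with something like <svg><use xlink:href="#twitter"></use></svg>.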

A couple quick code blocks, because these required some custom work…

Here is my critical CSS Task:

// generate critical CSS
// requires (if not already at the top of the Gulpfile)
var gulp = require( 'gulp' );
var critical = require( 'critical' );

gulp.task( 'styles-critical', function() {

    // prevent Node from balking at self-signed ssl cert
    process.env.NODE_TLS_REJECT_UNAUTHORIZED = 0;

    // run critical css
    critical.generate({
        /* note: cannot use 'base:' or will break remote 'src:' */

        // we want css, not html
        inline: false,

        // css source file
        css: 'src/styles/style.css',

        // css destination file
        dest: 'dist/wp-content/themes/atg/critical-min.css',

        // page to use for picking critical
        src: 'https://aarontgrogg.dreamhosters.com/',

        // make sure the output is minified
        minify: true,

        // pick multiple dimensions for top nav
        dimensions: [{
            height: 500,
            width: 300
        }, {
            height: 600,
            width: 480
        }, {
            height: 800,
            width: 600
        }, {
            height: 940,
            width: 1280
        }, {
            height: 1000,
            width: 1300
        }, {
            height: 1200,
            width: 1800
        }, {
            height: 1200,
            width: 2300
        }]
    });
});

Note the first line inside of the Task’s callback function, process.env.NODE_TLS_REJECT_UNAUTHORIZED = 0;. Node doesn’t like self-signed SSL certificates, which is what I use for my local and stage servers; this tells it to shut up and do its job. Never use this setting on a production Node server; it disables TLS certificate verification entirely!

And this is what I added to my functions.php:

// utility function to dynamically create cache-buster, based on file's last modified date
// adapted from: http://www.particletree.com/notebook/automatically-version-your-css-and-javascript-files/
if ( !function_exists( 'atg_create_cache_buster' ) ) {
  function atg_create_cache_buster( $url ){
    return filemtime( $url );
  }
}

// utility function to dynamically add cache-buster to file name
// adapted from: http://www.particletree.com/notebook/automatically-version-your-css-and-javascript-files/
if ( !function_exists( 'atg_add_cache_buster' ) ) {
  function atg_add_cache_buster( $url, $buster ){
    $path = pathinfo( $url );
    $ver = '.' . $buster . '.';
    return $path['dirname'] . '/' . str_replace( '.', $ver, $path['basename'] );
  }
}

// add the critical CSS in the <head>
if ( ! function_exists( 'atg_add_css' ) ) :
  function atg_add_css() {

    // name of css file
    $cssfile = '/style-min.css';

    // file path for the css file
    $csspath = get_stylesheet_directory() . $cssfile;

    // get cache-buster
    $cachebuster = (string) atg_create_cache_buster( $csspath );

    // url for the css file
    $cssurl = atg_add_cache_buster( get_stylesheet_directory_uri() . $cssfile, $cachebuster );

    // check if they need the critical CSS
    if ( isset( $_COOKIE['atg-csscached'] ) && $_COOKIE['atg-csscached'] == $cachebuster ) {

      // if they have the cookie, then they have the CSS file cached, so simply enqueue it
        wp_enqueue_style( 'atg-style', $cssurl );

    } else {

      // write the critical CSS into the page
      echo '<style>';
        include( get_stylesheet_directory() . '/critical-min.css' );
      echo '</style>'.PHP_EOL;

      // add loadCSS to the page; note the PHP variables mixed in for the cookie setting
      echo "<script>!function(e,t){'use strict';function s(s){function n(){var t,s;for(s=0;s-1&&(t=!0);t?r.media='all':e.setTimeout(n)}var r=t.createElement('link'),i=t.getElementsByTagName('script')[0],a=t.styleSheets;return r.rel='stylesheet',r.href=s,r.media='only x',i.parentNode.insertBefore(r,i),n(),r}s('".$cssurl."'),t.cookie='atg-csscached=".$cachebuster.";expires=\"".date("D, j M Y h:i:s e", strtotime("+1 week"))."\";path=/'}(this,this.document);</script>".PHP_EOL;

      // add the full CSS file inside of a noscript, just in case
      echo '<noscript><link rel="stylesheet" href="'.$cssurl.'"></noscript>'.PHP_EOL;
    }

  } // atg_add_css
endif; // function_exists

The first couple functions, credited in the comments, create a dynamic cache-buster based on the last-modified date of the file itself; sort of a self-defining cache-buster; very clever! Then I pretty much follow Jeremy’s example exactly, with minor modifications to make it all WordPressy. Note the need for get_stylesheet_directory() when dealing with the file’s path (for checking the last-modified date, and for the include), and get_stylesheet_directory_uri() to get the file’s URL (for wp_enqueue_style and loadCSS).
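To make that a bit more concrete, here is a hypothetical run of the two helpers above; the file path, URL and timestamp are made up purely for illustration:

// example only: the path, URL and timestamp are hypothetical
$buster = atg_create_cache_buster( '/var/www/dist/wp-content/themes/atg/style-min.css' );
// $buster is the file's last-modified time, e.g. 1452672000

echo atg_add_cache_buster( 'https://example.com/wp-content/themes/atg/style-min.css', $buster );
// prints: https://example.com/wp-content/themes/atg/style-min.1452672000.css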

Also, a quick note regarding loadCSS’ cookie expiration date: it is really far in the future (Tue, 19 Jan 2038 03:14:07 GMT, to be exact). But just because that cookie exists does not necessarily mean that the CSS file is still in the browser’s cache; there are a lot of issues (sorry, “challenges”) with browser cache. Users can clear it or it can fill quickly with the MBs of assets that we push through the tubes. And when a browser’s cache is full, older files get kicked out. Though the worst-case scenario is that the browser gets a normal CSS link, this kind of defeats the purpose of this exercise…

So, all that just to say that I’ve shortened my cookie date considerably, to one week from the user’s visit. I feel like this is a safe estimate. In my worst-case scenario, a user might get the inline critical CSS even though they already have the full CSS file in cache, but for now, I think that trade-off makes more sense.

Next, I needed to add a few new lines into my .htaccess file to strip the cache-buster and deliver the actual file (so a request for, e.g., style-min.1452672000.css will be served the actual style-min.css):

# remove cache-buster
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteBase / 
  RewriteCond %{REQUEST_FILENAME} !-f
  RewriteCond %{REQUEST_FILENAME} !-d
  RewriteRule ^(.+)\.(.+)\.(js|css)$ $1.$3 [L]
</IfModule>

And finally, in my header.php, in place of the link to my CSS file, I add the following function call:

<?php atg_add_css(); ?>

That wraps up the critical CSS stuff!

Additional plugins used

There were also a few additional plugins that I either needed or chose to use:

  • First and foremost was gulp-load-plugins, because it means maintaining only a single list of Gulp plugins (in the package.json file), and it makes referring to the plugins much easier: they are all namespaced inside the plugins object I create, and their names have all been normalized. (A rough sketch of how several of these plugins fit together follows this list.)
  • I didn’t want to be doing all the concatenating and minifying with each and every Build Process, because most files will not change between Builds. So I added gulp-changed (for my CSS and JS files) and gulp-changed-in-place (for the third-party CSS and JS files). Both create hashes of the files they have processed, and check for updates to those files before processing them again. If any of the files have changed since the last Build Process, they will be processed again, but otherwise they will be ignored.
  • gulp-svgmin requires the use of Node.js’s path module in order to create a unique ID for each individual symbol in the concatenated sprite. This one threw me for a few minutes, because after you spend a few hours in Gulpland, it’s easy to think that everything is Gulp, but this one is simply Node.js…
  • Live Reload. gulp-livereload is a developer’s wet dream, when developing locally! Monitoring your local environment files for updates, then automatically triggering things like another Build Process, then a browser refresh, sure saves a lot of tabbing or clicking around when you’re trying to iron something out.
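
To show how these pieces fit together, here is a rough, simplified sketch (not my actual Gulpfile), assuming gulp, gulp-load-plugins, gulp-concat, gulp-minify-css, gulp-postcss, autoprefixer and gulp-livereload are all installed; the paths and file names are illustrative:

// sketch: a styles Task plus a watch Task wired up for LiveReload
var gulp = require( 'gulp' );
var autoprefixer = require( 'autoprefixer' );

// every "gulp-*" plugin in package.json becomes a camelCased property of plugins
var plugins = require( 'gulp-load-plugins' )();

gulp.task( 'styles', function() {
    return gulp.src( 'src/styles/*.css' )
        // (in the real Build, gulp-changed / gulp-changed-in-place sit in front of
        // steps like these so untouched files are not reprocessed on every run)
        .pipe( plugins.concat( 'style-min.css' ) )          // gulp-concat
        .pipe( plugins.postcss( [ autoprefixer() ] ) )      // gulp-postcss + Autoprefixer
        .pipe( plugins.minifyCss() )                        // gulp-minify-css
        .pipe( gulp.dest( 'dist/wp-content/themes/atg' ) )
        .pipe( plugins.livereload() );                      // tell LiveReload the CSS changed
});

// watch source files and rebuild (plus reload the browser) on change
gulp.task( 'watch', function() {
    plugins.livereload.listen();
    gulp.watch( 'src/styles/*.css', [ 'styles' ] );
});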

If you’re looking for more info on getting started with Gulp, this author offers a nice Gulp starter file, with explanations for everything he uses (some of which you may not want, and can easily remove).

Repo Files versus Deployment Files

Another concept to discuss at this point is the difference between files that should, and should not, be included in the repo: Repos should not contain any files that are created by the Build Process.

This might seem odd, because that means a repo will not contain any concatenated and minified CSS and JS files, and it will not contain any optimized and concatenated SVG files…

So after using that fancy Build Process to create all those files, and you’re not going to commit them?

Correct. The reason for this is that repos have a really difficult time diffing such files: concatenated and minified output produces huge, noisy diffs and frequent merge conflicts. Also, since they are going to be rebuilt every time we perform a Build Process, there is really no reason to include them. Our source files are the important ones to revision and share between team members; with them, our generated files can easily be recreated at any time.

Additionally, adding Gulp to this project added a slew of new files to the repo’s directory, sitting inside a new root-level node_modules directory. Not only do these constitute a tremendous number of files, but they are also something that can be installed on a computer after the repo is cloned. And, the files that are installed tend to be specific to an operating system. Keeping such files out of the repo means that Windows users can clone the same repo as Mac or Linux users, and npm install will install whatever files their operating system needs, and not any that they don’t.

Updating the .gitignore File

So to keep these unwanted files out of the repo, I added a few lines to the .gitignore file:

node_modules
dist/wp-content/themes/atg/critical-min.css
dist/wp-content/themes/atg/style-min.css
dist/wp-content/themes/atg/icons/icons.svg

Status Thus Far

You can see where I am thus far by checking out the repo on GitHub, though this will be a working repo, meaning it will change as this series progresses. So if you are reading this much later than it was written, it may not look the way that it seems like it should; just keep pushing through and eventually it will!

Next Up

This was by far the hardest step in this process, because there is a lot to pull together, nearly everything had a bunch of options, and a lot of it was kind of specific to this project…

Next time we will be Adding a Deployment Process, and there are a few items you may want to have ready if you want to “play along at home”:

  1. A Deployment Process is basically the act of moving files from your localhost or repo to a web server. There are many ways to move files (the most common being FTP), but I will be using a service called DeployBot. It basically monitors your repo, and, when it sees changes (from a push), it can automatically push those changes to your Stage server for testing and review. DeployBot also offers a one-button deployment to your live server when you are ready, and provides a rollback feature, as well as their own Build and ignore options. So, have a read around, and if you’re interested in this option, create an account and browse around, but don’t start “attaching” anything yet…
  2. The deployment process will require access to your repo, so make sure you have those login credentials handy, and if you want your deployment process to be able to automatically deploy to your Stage server, have one set up and have those FTP credentials handy as well.

That should be it!

Until then, happy Web Apping,
Atg

Today’s Readings

With the huge news that responsive images are now part of the WP Core functionality, Smashing Magazine takes us on a nice little walk through the what and how of this incredibly powerful new feature!

And as long as we’re talking about responsive images, let’s take it several steps further, by adding lazy-loading, adaptive JPEG compression, and switching images to the BPG image format! Incredible article, great suggestions, and impressive improvements… I had never even heard of BPG images!

And as long as we’re shaving off bytes and bytes, why not also dive into Chroma-Subsampling? (`nother one I had never heard of… sheesh…)

Service Workers get a lot of hype with regards to making your site work offline, which is awesome! But there is a lot more to it, and this next article walks us through a bunch of other possibilities!

And then there’s this:

As a proof of concept I have been able to intercept fetch requests from the page and serve them using an ExpressJS running inside a ServiceWorker.
express-service: ExpressJS server running inside ServiceWorker

You have got to be kidding me… Freaking awesome!

This next author gives us a nice run-down on the state of JS in his JavaScript: 2015 in Review.

flex-grow is weird. Or is it? Oh, it is. And so is all of Flexbox. I think that some of the attribute names are horribly inconsistent and unintuitive, but I also paid no attention during the Draft process, so I have no right to complain. Just need to shove them all into my head and move on!

We took a hacker to a café and, in 20 minutes, he knew where everyone else was born, what schools they attended, and the last five things they googled.
Here’s Why Public Wifi is a Public Health Hazard

I mean, we all know it, right? Nevertheless, it is quite shocking to read how easy it is to get so much…

Keeping on that spooky theme, What’s Ahead for Your Data in 2016? I suddenly want to unplug and crawl under a rock…

Trouble reading ES6 examples?
Copy & paste them into the Babel REPL to see how Babel transforms them into ES5.
Eric Elliott on Twitter

I tend to give single-page apps a bit of shit, because I hate how they are usually all JS-or-nothing. Well, this next article is here to help us with Reimagining Single-Page Applications With Progressive Enhancement.

And finally, if you’re still using Sublime Text (I am), this next article offers up a list of power-user tips for Sublime Text. Making the machines do more work is always my preferred approach… :-)

Happy reading,
Atg

How to add Critical CSS to a WordPress site

Like most front-end developers, I want my sites to load faster than fast, ideally before the user has even thought about going to my site… :-)

And like probably most front-end developers, I know about the techniques I can use (CSS in the head, JS at end of the body, sprite my icons, optimize my images, set cache headers for my assets, yadda-yadda-yadda…), and I do incorporate the ones that are “easy”, but some are just… harder. Like putting the critical (“above-the-fold”) CSS in-page. Especially when you have a WordPress site…

Well, I have finally decided to take on the task of adding critical CSS to the Netbiscuits company website, and I thought I would document the process as I go. So, sit back, have a nice leisurely read, and enjoy my pain and frustration… :-)

TL;DR, get to it…

Planning

When you are talking about WordPress, everything works around the tried-and-true, well-known procedure:

  1. The Famous 5-Minute Install
  2. Pick a Theme
  3. Pick a few Plugins, if you want
  4. Manually edit CSS, JS and PHP files, if you need
  5. And FTP them to your web host’s server

Well, in a process like that, the typical methods you see for adding critical CSS to a site don’t easily work. Those methods typically involve a Grunt or Gulp process, which requires a Node.js installation, and you would probably also use a Build Process and a Deployment Process as well. That’s a lot of automation, and none of it fits into those nice, neat little steps listed above…

So, as I see it, these are your options for adding the critical CSS to a WP site:

  • Manual. Hunting and pecking to find what you need. This is going to be laborious and a nightmare to maintain, but it will also give you the most efficient CSS.
  • Browser extension. There are a few out there, but none that I saw were configurable, and none created CSS that worked well for me.
  • Task runner. If you already have a task runner in place, this is pretty easy. If not, it’s a bit more work, but in the end you have an automated solution that you can easily maintain.

While the idea of manually crafting the critical CSS is attractive because of the lean, “perfect” CSS I could craft, such manual work pains me to think about… Especially maintaining it…

And as I said above, no browser extension reliably created anything I liked.

So using a task runner was my choice.

Research

I first had to decide which critical CSS methods I wanted to consider. I quickly boiled things down to two big names (which in my mind means greater crowdsourcing and better long-term upkeep). I decided to look into:

  1. criticalcss, by The Filament Group, and
  2. critical, by Addy Osmani

And since I already use, and want to continue using, Grunt for this project, those two URLs get transmutated into:

  1. grunt-criticalcss, still by The Filament Group, and
  2. grunt-critical (ported from Addy’s version)

Decision

There is not much difference between Addy’s and The Filament Group’s versions, so I was initially stymied. I like how Addy’s version handled multiple source and destination files, while sharing a single options object, like:

critical: {
  dist: {
    options: {
      base: './',
      dimensions: [{
        width: 1300,
        height: 900
       },
       {
        width: 500,
        height: 900
      }]
    },
    files: [
      {src: ['index.html'], dest: 'dist/index.html'},
      {src: ['blog.html'], dest: 'dist/blog.html'},
      {src: ['about.html'], dest: 'dist/about.html'},
      {src: ['contact.html'], dest: 'dist/contact.html'}
    ]
  }
}

Or even using wildcards, like:

critical: {
  dist: {
    options: {
      base: './',
      dimensions: [{
        width: 1300,
        height: 900
      },
      {
        width: 500,
        height: 900
      }],
      src: '*.html',
      dest:  'dist/'
    }
  }
}

Both code samples blatantly copied-and-pasted directly from Understanding Critical CSS

Whereas The Filament Group’s version requires a separate object for each source that should be examined, requiring the options object to be replicated for each source, like:

criticalcss: {
  home: {
    options:  {
      outputfile : 'css/critical/critical-home.css',
      filename : 'all.css',
      url : 'http://fgwebsite.local'
    }
  },
  services: {
    options:  {
      outputfile : 'css/critical/critical-services.css',
      filename : 'all.css',
      url : 'http://fgwebsite.local/services/'
    }
  },
  about: {
  ...

Code sample blatantly copied-and-pasted directly from How we make RWD sites load fast as heck

I suppose this configuration could actually be good, if your options were different for some sources, but it also feels like a DRY violation…

However, since all of the tutorials that I liked use The Filament Group’s version, I decided to go with The Filament Group’s grunt-criticalcss.

Setting-up Grunt

Before we can even start we need to install Grunt and that means we have to install Node.js.

Now we can start customizing things for critical CSS! And as I mentioned, I will be borrowing bits and pieces from several of the Additional Resources I listed above.

Install criticalcss

To start we need to add The Filament Group’s grunt-criticalcss to our package.json file (you will see it hiding amongst the other Grunt plugins I use for this project):

{
  "name": "netbiscuits-theme",
  "version": "0.1.1",
  "private": true,
  "devDependencies": {
    "grunt": "~0.4.2",
    "grunt-bump": "^0.5.0",
    "grunt-contrib-concat": "~0.5.1",
    "grunt-contrib-less": "~1.0.0",
    "grunt-contrib-uglify": "~0.7.0",
    "grunt-contrib-watch": "~0.6.1",
    "grunt-criticalcss": "^0.6.0",
    "grunt-grunticon": "~1.4.0",
    "grunt-svgmin": "~2.0.0"
  }
}

Then we run npm install again to make sure we have everything installed.

Configure criticalcss

Next, I worked my way through Jeremy Keith’s article, but since I have multiple Templates, each requires its own critical CSS file. This is where I jumped to Joe Watkins’ article: in my Gruntfile.js, I added a new Task (criticalcss) to the grunt.initConfig, and then added a new set of options for each template ('knowledge-base', 'pricing-plan', etc.):

grunt.initConfig({
  ...
  criticalcss: {
    'knowledge-base' : {
      options:  {
        outputfile : 'dist/wp-content/themes/netbiscuits/css/critical/knowledge-base.css',
        filename : 'dist/wp-content/themes/netbiscuits/css/knowledge-base.min.css',
        url : 'http://netbiscuits.local/knowledge-base/netbiscuits-analytics-marketers-guide/'
      }
    },
    'pricing-plan' : {
      options:  {
        outputfile : 'dist/wp-content/themes/netbiscuits/css/critical/pricing-plan.css',
        filename : 'dist/wp-content/themes/netbiscuits/css/pricing-plan.min.css',
        url : 'http://netbiscuits.local/pricing-plan/'
      }
    },
    'global': {
      options:  {
        outputfile : 'dist/wp-content/themes/netbiscuits/css/critical/global.css',
        filename : 'dist/wp-content/themes/netbiscuits/css/global.min.css',
        url : 'http://netbiscuits.local/'
      }
    },
    ...
  },
  ...
});

In the above, for each Template, we are telling criticalcss to fetch the existing CSS (filename), compare it to a page (url), and save to a new file (outputfile) only the CSS that is visible above the fold.

Then we load the Task along with all of our other Grunt Tasks:

...
grunt.loadNpmTasks('grunt-criticalcss');
...

And register it, along with all of our other Grunt Tasks:

...
grunt.registerTask('default', [... 'criticalcss', ...]);
...

Run criticalcss

Now, if we run Grunt via a command line (grunt), or just this Task specifically (grunt criticalcss), we should see new files in the dist/wp-content/themes/netbiscuits/css/critical directory.

Adding Critical CSS to the Templates

Now that we have our critical CSS files, it’s time to make use of them and add them to our WP templates! Continuing our dance between Jeremy Keith’s and Joe Watkins’ processes…

From Jeremy’s article, I got the basis for using a browser cookie to determine if the browser has already downloaded the actual CSS file, and thus whether to add the critical CSS to the page or simply a link to that CSS file. The presence of this cookie, however, could prevent a user’s browser from downloading an updated CSS file, so Jeremy also adds a cache-buster to be able to force the browser to download the updated CSS file.

And from Joe’s article, I got the basis for adding multi-template support.

In my functions.php, I added the following (note that I manually define a cache-buster; ideally this is automated, but for this tutorial I’m hard-coding):

// define cache-buster: YYYYMMDD[MMSS]
define( 'NB_SITE_VERSION', '20160113' );

// create global variable to receive stylesheet references from the templates
$nb_stylesheet_queue = array();

// receives css file name and pushes it into the above array
if ( ! function_exists( 'nb_enqueue_stylesheet' ) ) :
  function nb_enqueue_stylesheet( $style ) {

    // get global array
    global $nb_stylesheet_queue;

    // push $style into array
    $nb_stylesheet_queue[] = $style;

  } // nb_enqueue_stylesheet
endif; // function_exists

// enqueue global CSS
nb_enqueue_stylesheet( 'global' );

// add either critical css or link to css file
if ( ! function_exists( 'nb_add_css_to_page' ) ) :
  function nb_add_css_to_page() {

    // get global array
    global $nb_stylesheet_queue;

    // loop through all enqueued stylesheets
    foreach( $nb_stylesheet_queue as $style) {

      // get the full css file URL
      $fullstyle = THEME_DIRECTORY . 'css/' . $style . '.' . NB_SITE_VERSION . '.min.css';

      // check if the user has the full CSS file
      // (the cookie name here must match the one set by loadCSS below)
      if ( isset( $_COOKIE['nb_css'] ) && $_COOKIE['nb_css'] === NB_SITE_VERSION ) {

        // if they have the cookie, then they have the CSS file cached, so simply enqueue it
        // (include $style in the handle so each stylesheet in the queue gets its own handle)
        wp_enqueue_style( 'nb-css-' . $style, $fullstyle );

      } else {

        // if not, write the critical CSS into the page
        echo '<style>';
          include( TEMPLATEPATH . '/css/critical/' . $style . '.' . NB_SITE_VERSION . '.min.css' );
        echo '</style>'.PHP_EOL;

        // add a minified-version of loadCSS; note the PHP variables mixed in there
        echo "<script>!function(e,t){'use strict';function s(s){function n(){var t,s;for(s=0;s-1&&(t=!0);t?r.media='all':e.setTimeout(n)}var r=t.createElement('link'),i=t.getElementsByTagName('script')[0],a=t.styleSheets;return r.rel='stylesheet',r.href=s,r.media='only x',i.parentNode.insertBefore(r,i),n(),r}s('".$fullstyle."'),t.cookie='nb_css=".NB_SITE_VERSION.";expires=\"Tue, 19 Jan 2038 03:14:07 GMT\";path=/'}(this,this.document);</script>".PHP_EOL;

        // and add a noscript-wrapped CSS link for non-JS users
        echo '<noscript><link rel="stylesheet" href="'.$fullstyle.'"></noscript>'.PHP_EOL;

      }

    } // foreach

  } // nb_add_css_to_page

endif; // function_exists

In my page template files, I added something like this before calling get_header();:

nb_enqueue_stylesheet( 'knowledge-base' );

In my header.php, inside the head, I added this:

nb_add_css_to_page();

Lastly, in my .htaccess, I added a small rewrite to deal with the cache-buster (this is explained quite well in Jeremy’s article, which is also where I copied this code from; it also conveniently deals with any JS files that use the cache-buster):

RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.(\d+)\.(js|css)$ $1.$3 [L]

In the functions.php code above, note that I use THEME_DIRECTORY for the URL of the CSS file (used for $fullstyle), but TEMPLATEPATH for the file path of the CSS file (used for the PHP include).

Additionally, I make use of The Filament Group’s loadCSS to asynchronously append the full CSS file for the browser to cache, and set a cookie that this has been done (which includes two PHP variables).

Lastly, a link to the full CSS file is also included inside of a noscript tag, just in case there is no JS.

How this all Works

The first time a visitor comes to the site…

  1. the PHP checks for the cookie and sees none…
  2. so it includes the critical CSS file contents into the page…
  3. then the loadCSS dynamically appends the full CSS file via an async link
  4. and adds a browser cookie to indicate that the full CSS file should now be in the browser’s cache…
  5. then the page continues loading normally…
  6. and the user benefits from having the critical CSS in-page.

And on the next page load…

  1. the PHP checks for the cookie and finds one…
  2. so it enqueues a link to the full CSS file(s)…
  3. then the page continues loading normally…
  4. and the user benefits from having the full CSS file cached by the browser.

And in any case, if the user does not have JS for whatever reason, the full CSS file is fetched via the noscript block.

Summary

Altering the standard WP set-up is never an easy undertaking. In my attempt to add support for inline critical CSS to a WP site, I made use of The Filament Group’s CriticalCSS Grunt task, and borrowed and altered code from several other developers’ articles. I also made sure that users who have visited the site before, and have the site’s full CSS in their browser’s cache, do not get the critical CSS inlined again, and that I am able to force users to download a new version of the CSS when it has been updated.

The process wasn’t always easy, and it seldom is, but isn’t that part of the fun?

In the end, however you choose (or need) to get your site’s critical CSS, your users will greatly appreciate any efforts you take to make your site load more quickly. Whether they know it or not! :-)

Happy CSSing,
Atg

Converting WordPress to Web App: Adding Version Control

Part two of my Converting WordPress to Web App series, as we convert this site from a standard WP website to a cache-enabled, offline-first, performance-optimized, installable Web App, that takes advantage of Build and Deployment processes, and sits safely revisioned in a Version Control repo.

The steps that I plan to take are listed below; each will become a link as I complete and publish that part of the series.

  1. Series Intro
  2. Adding Version Control (this post)
  3. Adding a Build Process
  4. Adding a Deployment Process
  5. Adding Caching and Offline Support
  6. Adding “Add to home screen” functionality

And now, on to…

Adding Version Control

At the moment this website does not use a repo. And while using a repo for a WP site is not exactly a challenge, it is an important part of this process.

GitHub provides a really nice, easy-to-follow Getting your project on GitHub guide. Use the root directory of your local WordPress installation (this is the directory where the wp-admin and wp-includes directories, and the wp-config.php file, can be found) as the repo to push to GitHub. We will be changing some things as we get into the next couple steps, but for now this will get us started.

Following through that GitHub guide, you should be able to make a trivial change or two to your files, commit, pull and push them, then log in to GitHub and see your changes reflected. It’s a pretty cool feeling if you’ve never done it before, so play and have fun.

On a typical project, you will want to commit often, and push when you have something that is ready for the team, or is ready “to be saved”. I highly recommend getting to know Git better, digging into Branches, Rebasing, and more, to be able to take advantage of the real power of Git-based repos.

Status Thus Far

As I said above, this installment was not exactly a challenge, but any project worth the time to create, should be worth the time to secure in a repo. And when you are working on a team project, a repo is vital, allowing everyone to contribute (mostly) without fear of overwriting one-another’s work.

Next Up

Things will really start escalating in the next installment when I will be Adding a Build Process, so there are a few items you may want to have ready if you want to “play along at home”:

  1. Be familiar with the concepts of a Build Process / Task Runner. You don’t have to be an expert, but you might want to read up on, and at least understand, the “What” and the “Why”.
  2. Pick a Task Runner. I will be using Gulp, but you can also use Grunt, Broccoli, Node, or maybe even your text editor itself! As with the repo selection, you should be able to use whatever you choose.

Until then, happy Web Apping,
Atg