Reading List

The most recent articles from a list of feeds I subscribe to.

Building Skeleton Screens with CSS

This article was originally published on CSS-Tricks.com.

Designing loading states on the web is often overlooked or dismissed as an afterthought. Performance is not only a developer's responsibility - building an experience that works with slow connections can be a design challenge as well.

While developers need to pay attention to things like minification or caching, designers have to think about how the UI will look and behave while it is in a “loading” or “offline” state.

The Illusion of Speed

Perceived performance is a measure of how fast something feels to the user. The idea is that users are more patient and will perceive a system as faster if they know what’s going on and can anticipate content before it’s actually there. It’s largely about managing expectations and keeping the user informed.

For a web app, this concept might include displaying “mockups” of text, images or other content elements - called skeleton screens 💀. You can find these in the wild, used by companies like Facebook, Google, Slack and others:

Slack desktop app using skeleton screens while loading
Holy moly to you too, Slack.

An Example

Say you are building a web app. It’s a travel-advice kind of thing where people can share their trips and recommend places, so your main piece of content might look something like this:

card UI of a travel blog post

You can take that card and reduce it down to its basic visual shapes, the skeleton of the UI component.

skeleton version of the same card, outlined in gray rectangles

Whenever someone requests new content from the server, you can immediately start showing the skeleton, while data is being loaded in the background. Once the content is ready, simply swap the skeleton for the actual card. This can be done with plain vanilla Javascript, or using a library like React.
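The swap could be as simple as rendering the card markup from the response and injecting it. Here’s a rough sketch with made-up field names (none of these come from the article):

```javascript
// Sketch: build the card markup from fetched data (field names are hypothetical),
// so it can be swapped into the skeleton element once the request resolves.
function renderCard(trip) {
  return `
    <img class="card__image" src="${trip.image}" alt="">
    <h2 class="card__title">${trip.title}</h2>
    <p class="card__author">${trip.author}</p>
  `;
}

// in the browser, you'd then replace the skeleton's contents:
// fetch('/api/trips/1')
//   .then(res => res.json())
//   .then(trip => { card.innerHTML = renderCard(trip); });
```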

Now you could use an image to display the skeleton, but that would introduce an additional request and data overhead. We’re already loading stuff here, so it’s not a great idea to wait for another image to load first. Plus it’s not responsive, and if we ever decided to adjust some of the content card’s styling, we would have to duplicate the changes to the skeleton image so they’d match again. 😒 Meh.

A better solution is to create the whole thing with just CSS. No extra requests, minimal overhead, not even any additional markup. And we can build it in a way that makes changing the design later much easier.

Drawing Skeletons in CSS

First, we need to draw the basic shapes that will make up the card skeleton. We can do this by adding multiple gradients to the background-image property. By default, linear gradients run from top to bottom, transitioning between their color stops. If we define just one color stop and leave the rest transparent, we can draw shapes.

Keep in mind that multiple background-images are stacked on top of each other here, so the order is important. The last gradient definition will be in the back, the first at the front.

.skeleton {
  background-repeat: no-repeat;
  background-image: 
    /* layer 2: avatar */
    /* white circle with 16px radius */
    radial-gradient(circle 16px, white 99%, transparent 0),
    /* layer 1: title */
    /* white rectangle with 40px height */
    linear-gradient(white 40px, transparent 0),
    /* layer 0: card bg */
    /* gray rectangle that covers whole element */
    linear-gradient(gray 100%, transparent 0);
}

These shapes stretch to fill the entire space, just like regular block-level elements. If we want to change that, we’ll have to define explicit dimensions for them. The value pairs in background-size set the width and height of each layer, keeping the same order we used in background-image:

.skeleton {
  background-size:
    32px 32px,  /* avatar */
    200px 40px, /* title */
    100% 100%;  /* card bg */
}

The last step is to position the elements on the card. This works much like position: absolute, with the value pairs representing offsets from the left and top edges. We can, for example, simulate a padding of 24px for the avatar and title, to match the look of the real content card.

.skeleton {
  background-position:
    24px 24px,  /* avatar */
    24px 200px, /* title */
    0 0;        /* card bg */
}

Break it up with Custom Properties

This works well in a simple example - but if we want to build something just a little more complex, the CSS quickly gets messy and very hard to read. If another developer was handed that code, they would have no idea where all those magic numbers are coming from. Maintaining it would surely suck.

Thankfully, we can now use custom CSS properties to write the skeleton styles in a much more concise, developer-friendly way - and even take the relationship between different values into account:

.skeleton {
  /*
    define as separate properties
  */
  --card-height: 340px;
  --card-padding: 24px;
  --card-skeleton: linear-gradient(gray var(--card-height), transparent 0);
  
  --title-height: 32px;
  --title-width: 200px;
  --title-position: var(--card-padding) 180px;
  --title-skeleton: linear-gradient(white var(--title-height), transparent 0);

  --avatar-size: 32px;
  --avatar-position: var(--card-padding) var(--card-padding);
  --avatar-skeleton: radial-gradient(
    circle calc(var(--avatar-size) / 2), 
    white 99%, 
    transparent 0
  );

  /* 
    now we can break the background up 
    into individual shapes 
  */
  background-image: 
    var(--avatar-skeleton),
    var(--title-skeleton),
    var(--card-skeleton);

  background-size:
    var(--avatar-size) var(--avatar-size),
    var(--title-width) var(--title-height),
    100% 100%;

  background-position:
    var(--avatar-position),
    var(--title-position),
    0 0;
}

Not only is this a lot more readable, it’s also way easier to change some of the values later on.
Plus we can use some of the variables (think --avatar-size, --card-padding, etc.) to define the styles for the actual card and always keep it in sync with the skeleton version.
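For instance, the real card’s styles could consume the same variables. A quick sketch (the class names here are assumed, not from the article):

```css
.card {
  height: var(--card-height);
  padding: var(--card-padding);
}

.card .avatar {
  width: var(--avatar-size);
  height: var(--avatar-size);
}
```

Change `--card-padding` once, and both the skeleton and the real card move together.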

Adding a media query to adjust parts of the skeleton at different breakpoints is now also quite simple:

@media screen and (min-width: 47em){
  :root {
    --card-padding: 32px;
    --card-height: 360px;
  }
}

Caveat: Browser support for custom properties is good, but not at 100%. Basically all modern browsers have support, with IE/Edge a bit late to the party. For this specific use case, though, it would be easy to add a fallback using Sass variables.

Add Animation

To make this even better, we can animate our skeleton and make it look more like a loading indicator. All we need to do is put a new gradient on the top layer and then animate its position with @keyframes.
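One common way to do this (a sketch of the idea, not the exact CodePen code): add a soft white “shine” gradient as the topmost layer and sweep its background-position from left to right in a loop:

```css
.skeleton {
  background-image:
    /* topmost layer: the moving highlight */
    linear-gradient(
      90deg,
      rgba(255, 255, 255, 0) 0,
      rgba(255, 255, 255, 0.8) 50%,
      rgba(255, 255, 255, 0) 100%
    ),
    var(--avatar-skeleton),
    var(--title-skeleton),
    var(--card-skeleton);

  background-size:
    50% 100%,
    var(--avatar-size) var(--avatar-size),
    var(--title-width) var(--title-height),
    100% 100%;

  background-position:
    -150% 0, /* start the shine off-screen left */
    var(--avatar-position),
    var(--title-position),
    0 0;

  background-repeat: no-repeat;
  animation: loading 1.5s infinite;
}

@keyframes loading {
  to {
    /* only the first layer moves; the others stay put */
    background-position:
      350% 0,
      var(--avatar-position),
      var(--title-position),
      0 0;
  }
}
```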

Here’s a full example of how the finished skeleton card could look:

Skeleton Loading Card by Max Böck (@mxbck) on CodePen.

💡 Pro Tip: You can use the :empty selector and a pseudo element to draw the skeleton, so it only applies to empty card elements. Once the content is injected, the skeleton screen will automatically disappear.
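A sketch of that tip (selectors assumed): draw the skeleton on a pseudo element of cards that have no children yet.

```css
/* only empty cards get the skeleton */
.card:empty::after {
  content: '';
  display: block;
  width: 100%;
  height: var(--card-height);
  /* the gradient layers from the .skeleton example would go here */
  background-image: var(--card-skeleton);
  background-repeat: no-repeat;
}
```

As soon as real content is injected, the card is no longer `:empty` and the pseudo element vanishes on its own.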

More on Designing for Performance

For a closer look at designing for perceived performance, check out these links:

Offline-Friendly Forms

Forms on the web don't usually play nice with bad connections. If you try to submit a form while offline, you'll most likely just lose your input. Here's how we might fix that.

TL;DR: Here’s the CodePen Demo of this post.

With the introduction of Service Workers, developers are now able to supply experiences on the web that will work even without an internet connection. While it’s relatively easy to cache static resources, things like forms that require server interaction are harder to optimize. It is possible to provide a somewhat useful offline fallback though.

First, we have to set up a new class for our offline-friendly forms. We’ll save a few properties of the <form> element and then attach a function to fire on submit:

class OfflineForm {
  // setup the instance.
  constructor(form) {
    this.id = form.id;
    this.action = form.action;
    this.data = {};
    
    form.addEventListener('submit', e => this.handleSubmit(e));
  }
}

In the submit handler, we can include a simple connectivity check using the navigator.onLine property. Browser support for it is great across the board, and it’s trivial to implement.

⚠️ There is, however, a possibility of false positives, as the property can only detect whether the client is connected to a network, not whether there’s actual internet access. A false value, on the other hand, can be trusted to mean “offline” with relative certainty. So it’s best to check for that, instead of the other way around.

If a user is currently offline, we’ll hold off submitting the form for now and instead store the data locally.

handleSubmit(e) {
  e.preventDefault();
  // parse form inputs into data object.
  this.getFormData();
  
  if (!navigator.onLine) {
    // user is offline, store data on device.
    this.storeData();
  } else {
    // user is online, send data via ajax.
    this.sendData();
  }
}
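The getFormData() call above isn’t shown in the post; here’s one way it might work, assuming the constructor also keeps a reference to the form element as this.form. The parsing logic is factored into a plain helper:

```javascript
// Sketch: turn an iterable of [name, value] pairs into a plain data object.
// In the class you'd feed it new FormData(this.form).entries().
function entriesToObject(entries) {
  const data = {};
  for (const [key, value] of entries) {
    data[key] = value;
  }
  return data;
}

// hypothetical method on OfflineForm:
// getFormData() {
//   this.data = entriesToObject(new FormData(this.form).entries());
// }
```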

Storing the Form Input

There are a few different options for storing arbitrary data on the user’s device. Depending on your data, you could use sessionStorage if you don’t want the local copy to persist beyond the current browser session. For our example, let’s go with localStorage.

We can timestamp the form data, put it into a new object and then save it using localStorage.setItem. This method takes two arguments: a key (the form id) and a value (the JSON string of our data).

storeData() {
  // check if localStorage is available.
  if (typeof Storage !== 'undefined') {
    const entry = {
      time: new Date().getTime(),
      data: this.data,
    };
    // save data as JSON string.
    localStorage.setItem(this.id, JSON.stringify(entry));
    return true;
  }
  return false;
}

Hint: You can check the storage in Chrome’s devtools under the “Application” tab. If everything went as planned, you should see something like this:

chrome devtools showing the localstorage contents

It’s also a good idea to inform the user of what happened, so they know that their data wasn’t just lost.
We could extend the handleSubmit function to display some kind of feedback message.

feedback message explaining the offline state
How thoughtful of you, form!

Checking for Saved Data

Once the user comes back online, we want to check if there are any stored submissions. We can listen to the online event to catch connection changes, and to the load event in case the page is refreshed:

constructor(form){
  ...
  window.addEventListener('online', () => this.checkStorage());
  window.addEventListener('load', () => this.checkStorage());
}

When these events fire, we’ll simply look for an entry in the storage matching our form’s id. Depending on what type of data the form represents, we can also add an “expiry date” check that will only allow submissions below a certain age. This might be useful if we only want to optimize for temporary connectivity problems, and prevent users from accidentally submitting data they entered two months ago.

checkStorage() {
  if (typeof Storage !== 'undefined') {
    // check if we have saved data in localStorage.
    const item = localStorage.getItem(this.id);
    const entry = item && JSON.parse(item);

    if (entry) {
      // discard submissions older than one day. (optional)
      const now = new Date().getTime();
      const day = 24 * 60 * 60 * 1000;
      if (now - day > entry.time) {
        localStorage.removeItem(this.id);
        return;
      }

      // we have valid form data, try to submit it.
      this.data = entry.data;
      this.sendData();
    }
  }
}

The last step would be to remove the data from localStorage once we have successfully sent it, to avoid multiple submissions. Assuming an ajax form, we can do this as soon as we get a successful response back from the server. We can simply use the storage object’s removeItem() method here.

sendData() {
  // send ajax request to server
  axios.post(this.action, this.data)
    .then((response) => {
      if (response.status === 200) {
        // remove stored data on success
        localStorage.removeItem(this.id);
      }
    })
    .catch((error) => {
      console.warn(error);
    });
}

If you don’t want to use ajax to send your form submission, another solution would be to repopulate the form fields with the stored data, then call form.submit() or have the user press the button themselves.
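A sketch of that repopulation approach (the fields shape here is assumed to mirror form.elements, i.e. inputs keyed by name):

```javascript
// Sketch: write stored values back into matching fields, keyed by input name.
// Unknown keys in the data are simply ignored.
function repopulateFields(fields, data) {
  Object.keys(data).forEach((name) => {
    if (fields[name]) {
      fields[name].value = data[name];
    }
  });
  return fields;
}

// afterwards: form.submit(), or let the user hit the button themselves.
```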

☝️ Note: I’ve omitted some other parts like form validation and security tokens in this demo to keep it short, obviously these would have to be implemented in a real production-ready thing. Dealing with sensitive data is another issue here, as you should not store stuff like passwords or credit card data unencrypted locally.

If you’re interested, check out the full example on CodePen:

Offline Form by Max Böck on CodePen.

You're Offline

A truly responsive website should adapt to all kinds of situations. Besides different viewport sizes, there are other factors to consider. A change in connectivity is one of them.

Earlier this week, I was sitting in a train on my way to speak at a local meetup. InterCity trains in Austria all have WIFI now, so I was doing some last-minute work on my slides online. Train WIFI being what it is though, the network wasn’t exactly reliable. The connection kept dropping every time we went through a tunnel or too many passengers were logged on.

This is quite a common scenario. People are on the move, network coverage can be poor, internet connections fail. Luckily, we can prepare our websites for this and make them more resilient by building them offline-first.

Offline support is awesome; however, your users might not be aware of these capabilities - and they shouldn’t have to be. In some cases they might not even know that they’ve gone offline. That’s why it’s important to communicate what’s going on.

Chances are not every part of your site will work offline. Certain things may not be cached, others may require server interaction. This is fine of course, but the interface should reflect that. Just like a responsive layout adapts to changes in viewport size, your offline-optimized site should adapt to changes in connectivity.

Checking for Offline

The key ingredients here are the online and offline events, along with the navigator.onLine property. By combining them, we can check for network changes and react accordingly.

Here’s an example of a simple connectivity check:

let isOffline = false;
window.addEventListener('load', checkConnectivity);

// when the page has finished loading,
// listen for future changes in connection
function checkConnectivity() {
  updateStatus();
  window.addEventListener('online', updateStatus);
  window.addEventListener('offline', updateStatus);
}

// check if we're online, set a class on <html> if not
function updateStatus() {
  if (typeof navigator.onLine !== 'undefined'){
    isOffline = !navigator.onLine;
    document.documentElement.classList.toggle('is-offline', isOffline);
    ...
  }
}

⚠️ Note: With the online event, there’s a slight possibility of false positives: A user might be connected to a network (which is interpreted as being online), but something higher up might block actual internet access. The offline event is a bit more reliable, in the sense that an “offline” user can be expected NOT to have access.

Get Notified

Now we want to display some kind of notification to offline users, so they know what’s going on. This can be done in a number of ways; however, I would recommend using aria-live regions to make it accessible and have screen readers announce the connection change as well.

Using such a notification bar is pretty straightforward. First, define an element to display messages on your page:

<!-- notification container -->
<div 
  class="notification" 
  id="notification" 
  aria-live="assertive" 
  aria-relevant="text" 
  hidden
></div>

The aria-live attribute tells screen readers to announce changes to this element. “assertive” means it will interrupt whatever is currently being announced and prioritize the new message. The aria-relevant attribute tells it to listen for changes in the text content of the element.

You can extend the handler function from before to populate the notification area whenever you detect that a user has gone offline:

function updateStatus() {
  ...
  const notification = document.querySelector('#notification');
  if (isOffline) {
    notification.textContent = 'You appear to be offline right now.';
    notification.removeAttribute('hidden');
  } else {
    notification.textContent = '';
    notification.setAttribute('hidden', '');
  }
}

This is a very simple implementation - you can of course always get a bit fancier with an animated notification bar (or “toast message”). There are also some nice pre-made components for this.

If you’re reading this on my site, you can see a version of these notifications in action if you simply switch off your WIFI for a second.
Go ahead, I’ll wait.

If you’re somewhere else or your browser doesn’t support service worker / offline events, here’s how this could look:

Telling the User what’s available

Notifications are a good start, but it would be even nicer if we could give the user some visual indication of which parts they can actually use offline, and which not.

To do this, we can loop over all the links on page load and check their href against the cache. If they point to a cached resource (e.g. will work offline), they get a special class.

const links = document.querySelectorAll('a[href]');
Array.from(links).forEach((link) => {
  caches.match(link.href, { ignoreSearch: true }).then((response) => {
    if (response) {
      link.classList.add('is-cached');
    }
  });
});

Once the offline event fires, we toggle a class on the body and visually disable all links that aren’t cached. This should only apply to URLs, so we can ignore tel:, mailto: and anchor links.

.is-offline {
  /* disable all links to uncached pages */
  a:not(.is-cached) {
    cursor:not-allowed;
    pointer-events: none;
    opacity:.5;
  }
  /* ignore anchors, email and phone links */
  a[href^="#"],
  a[href^="mailto"],
  a[href^="tel"] {
    cursor:auto;
    pointer-events: auto;
    opacity:1;
  }
}

Offline Forms

Another way we might use this is to prevent users from filling out forms. Most forms pass data to the server and require a connection to work, so they won’t be very useful when offline.

What’s worse is that users might not know there is a problem until it’s too late: imagine filling out a lengthy form and finally hitting the submit button, only to find a network connection error page and all your inputs gone. That’s frustrating.

/* Disable Forms when offline */
.is-offline form {
  position:relative;
  opacity:.65;
  cursor:not-allowed;
  pointer-events:none;
  
  &::after {
    content: 'Sorry, you\'re offline.';
    display:block;
    position:absolute;
    top:50%;
    left:50%;
    transform:translate(-50%, -50%);
    color:#FFFFFF;
    background-color:#2D2D2D;
    padding:1rem 2rem;
  }
}
a disabled form with the words 'sorry, youre offline' in a box on top
No contact forms in offline country.

That effectively disables every form on the page, indicating that this functionality is currently not available. Depending on what your form does, you might also consider applying these styles just to the submit button - that way a user could pre-fill the form (possibly even have it validated in JS), and then submit it once they come back online.

If you’re doing this, remember to suppress “submit on enter” as well, and make sure the user knows why submitting won’t work at the moment.
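One way to do that (a sketch, names hypothetical): intercept submit events at the document level and cancel them while offline - this catches button clicks and enter-key submits alike:

```javascript
// Sketch: cancel native form submits (button click or enter key) while offline.
// The decision is factored into a plain function so the logic stands on its own.
function blockOfflineSubmit(event, offline) {
  if (offline) {
    event.preventDefault();
    return true; // submission blocked
  }
  return false;
}

// browser wiring (guarded so the sketch also runs outside a browser):
if (typeof document !== 'undefined') {
  document.addEventListener('submit', (e) => {
    blockOfflineSubmit(e, !navigator.onLine);
  });
}
```

Pair this with a visible hint, like the overlay styles above, so users know why nothing happens.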

UPDATE: I found a better way to handle this - by storing form submissions in localStorage and then checking for them once the connection comes back online. Read about it in “Offline-Friendly Forms”.

Further Reading

How to turn your website into a PWA

A Progressive Web App, or PWA, uses modern web capabilities to deliver an app-like user experience. Any website can be a PWA - here's how to do it.

The "add to homescreen" prompt in a PWA

Turning a basic website into a PWA is not that hard and has a lot of real benefits, so I want to take a look at the three main steps necessary to achieve just that.

But first, let me address some common misconceptions:

1) Your thing does not have to be an “Application” to be a PWA.

A Progressive Web App can easily be a blog, a marketing site, a shop or a collection of cat memes. At its core, a PWA is just a way to optimize your code for better, faster delivery. You can - and should - take advantage of these new possibilities, regardless of your content.

Side note: the term “Application” in PWA is heavily debated, since some people feel it communicates the wrong idea. IMHO, it’s just a name - and these days it’s hard to define the difference between websites and “web apps” anyway.

2) Your thing does not have to be a Javascript-powered single page app.

Again, if you’re not running a cutting edge React-Redux SPA, that’s no reason to shy away from using this technology. My own site is just a bunch of static HTML based on Jekyll, and it’s still a perfectly valid Progressive Web App. If you run something on the web, it can benefit.

3) PWAs are not specifically made for Google or Android.

The beauty of it is that PWAs offer the best of both worlds - deep linking and URLs from the www, offline access, push notifications and more from native apps - while still staying completely platform-independent. No app stores, no separate iOS / Android codebases, just the web.

4) PWAs are ready and safe to use today.

Yup, the “P” stands for progressive, meaning everything about it can be viewed as an extra layer of enhancement. If an older browser does not support it, it will not break; it just falls back to the default: a regular website.

OK, So why should I do this?

Turning your website into a PWA offers some serious advantages:

  • A faster, more secure user experience
  • A better Google ranking
  • Better usability
  • Better performance
  • Offline access

Even if you don’t expect your users to “install” your PWA (e.g. place a shortcut on their home screen),
there is still a lot to be gained by making the switch. In fact, all of the steps necessary to make a PWA will actively improve your website and are widely considered best practice.

Step 1: The Manifest.

A manifest is just a JSON file that describes all the meta data of your PWA. Things like the name, language and icon of your app go in there. This information will tell browsers how to display your app when it’s saved as a shortcut. It looks something like this:

{
  "lang": "en",
  "dir": "ltr",
  "name": "This is my awesome PWA",
  "short_name": "myPWA",
  "icons": [
    {
      "src": "\/assets\/images\/touch\/android-chrome-192x192.png",
      "sizes": "192x192",
      "type": "image\/png"
    }
  ],
  "theme_color": "#1a1a1a",
  "background_color": "#1a1a1a",
  "start_url": "/",
  "display": "standalone",
  "orientation": "natural"
}

This is usually called “manifest.json”, and linked to from the <head> of your site:

<link rel="manifest" href="manifest.json">

Tip: You don't have to write that file yourself. There are different icon sizes for different systems, and getting everything right can be tedious. Instead, just make one 500x500 image of your app icon (probably your logo), and head over to Real Favicon Generator. They render all common sizes, provide meta tags and generate a manifest file for you. It's awesome.

Step 2: Go HTTPS.

Progressive Web Apps need to be served over a secure connection, so the HTTPS protocol is the way to go. HTTPS encrypts the data users send to your server and prevents intruders from tampering with their connection. Google also heavily favors sites on HTTPS these days, ranking them higher than their non-secure competitors.

To switch to HTTPS, you will need an SSL certificate from a trusted authority. How to get them depends on your hosting situation, but generally there are two common ways to do it:

👉 If you operate your own server or have root access to one, check out LetsEncrypt. It’s a free, open and straightforward certificate authority that allows anyone to start using HTTPS. It’s quite easy to set up and is just as trusted as other authorities.

👉 If you’re on shared hosting, a lot of providers unfortunately won’t allow you the level of control you need to use LetsEncrypt. Instead, they usually offer SSL certificates for a monthly or annual fee. If you’re unsure how to get a cert, contact your hosting provider.

After you’ve obtained a certificate, you may need to make some adjustments to your code so that all resources are fetched over a secure connection. For more information about the process, read this detailed guide from keyCDN or check out Chris Coyier’s article if you want to migrate a WordPress site.

If everything goes as planned, you’ll be rewarded with a nice green lock icon next to your URL:

HTTPS lock icon

Step 3: The Service Worker.

This is where the magic happens. A Service Worker is essentially a piece of Javascript that acts as a middleman between browser and host. It automatically installs itself in supported browsers, can intercept requests made to your site, and respond to them in different ways.

You can set up a new SW by simply creating a Javascript file at the root directory of your project. Let’s call it sw.js. The contents of that file depend on what you want to achieve - we’ll get to that in a second.

To let the browser know we intend to use this file as a Service Worker, we need to register it first. In your site’s main script, include a function like this:

function registerServiceWorker() {
  // register sw script in supporting browsers
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('sw.js', { scope: '/' }).then(() => {
      console.log('Service Worker registered successfully.');
    }).catch(error => {
      console.log('Service Worker registration failed:', error);
    });
  }
}

The scope parameter defines which requests the SW should be able to intercept. It’s a relative path to the domain root. For example, if you were to set this to /articles, you could control requests to yourdomain.com/articles/my-post but not to yourdomain.com/contact.

Offline is the new black

There are a number of cool things you can do with Service Workers. One of them is the ability to cache your content, store it locally, and thus make it available when the user is offline. Even for users who are online, this has a huge impact on page loading time, since requests can bypass the network completely and assets are instantly available.

Unlike with traditional browser caching, you can define a list of resources to cache when the worker is installed - so a user does not have to navigate to a page first for it to be cached. Here’s how that might look:

// sw.js
self.addEventListener('install', e => {
  e.waitUntil(
    // after the service worker is installed,
    // open a new cache
    caches.open('my-pwa-cache').then(cache => {
      // add all URLs of resources we want to cache
      return cache.addAll([
        '/',
        '/index.html',
        '/about.html',
        '/images/doggo.jpg',
        '/styles/main.min.css',
        '/scripts/main.min.js',
      ]);
    })
  );
});
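The install handler above only fills the cache; to actually answer requests from it when the user is offline, the worker also needs a fetch handler. Here’s a minimal cache-first sketch (not part of the original snippet):

```javascript
// Cache-first strategy: answer from the cache when possible, otherwise hit the network.
// The lookup is a plain function with cacheMatch/fetchFn injected, so the strategy
// itself has no hard dependency on worker globals.
function cacheFirst(request, cacheMatch, fetchFn) {
  return cacheMatch(request).then((cached) => cached || fetchFn(request));
}

// wire it up inside sw.js (guarded so the sketch also parses outside a worker):
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('fetch', (e) => {
    e.respondWith(
      cacheFirst(e.request, (req) => caches.match(req), (req) => fetch(req))
    );
  });
}
```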

Tip: If you want to get started with offline-first quickly, I'd highly recommend using sw-precache. This is a tool made by the folks at Google that integrates with your existing Gulp or Grunt build process to generate the service worker file for you.

You can simply pass it a list of files and it will automatically track all changes, and keep your Service Worker cache up to date. Because sw-precache integrates into your site’s build process, you can use wildcards to precache all of the resources that match a specific pattern, like so:

import gulp from 'gulp';
import path from 'path';
import swPrecache from 'sw-precache';

const rootDir = 'dist'; // your build output folder

gulp.task('generate-service-worker', callback => {
  swPrecache.write(path.join(rootDir, 'sw.js'), {
    staticFileGlobs: [
      // track and cache all files that match this pattern
      rootDir + '/**/*.{js,html,css,png,jpg,gif}',
    ],
    stripPrefix: rootDir
  }, callback);
});

Run this task in your build, and you’ll never have to worry about cache invalidation again!
For smaller, mostly static sites, you can have it precache every image, HTML, JavaScript, and CSS file. For sites with lots of dynamic content, or many large images that aren’t always needed, precaching a “skeleton” subset of your site often makes the most sense.

PS: For a deeper look into the subject of offline support, be sure to check out “The Offline Cookbook” by Jake Archibald.

Testing your PWA

The Chrome Lighthouse Extension is a testing tool to check Progressive Web Apps for their Performance, Accessibility and compliance with the PWA spec.

It tests your site in different viewports and network speeds, measures time to first paint and other performance factors, and gives valuable advice for areas that still need improvement. It’s a really good benchmark for websites in general.

Google Lighthouse Report showing audits for PWA, Performance, Accessibility and Best Practices
Lighthouse report for mxb.at

You can either install the Lighthouse extension in the Chrome Web Store or use Chrome Canary, where it is included in the Devtools’ Audit tab by default.

Further Reading

Hopefully that gave you a quick overview on how to get started with PWAs. If you want to dive deeper, here are some good places to learn more:

Bottle Slider Wiggle Effect

I built this product slider as part of a wine shop I was working on in 2015, and since it's also featured in a case study here on my site, I had a couple of people asking me how the animation was done.

Well, it’s really quite simple – so here’s a quick rundown on how to make the bottles dance. You can see the actual live thing in action on one of the product pages here. Grab some Grüner Veltliner while you’re at it.

The Slider

Markup is pretty straightforward, just your standard slider structure: a parent div and a ul with some list items. The real production version obviously has a little bit more going on, what with that fancy ratings popover and all. But for now, this should do the job:

<div class="slider">
  <ul class="slider__content">

    <li class="slider__item">
      <a href="link/to/product">
        <img src="image_of_bottle.jpg" alt="" />
      </a>
    </li>

    <li class="slider__item">...</li>
    <li class="slider__item">...</li>

  </ul>
</div>

To make this into a slider widget, you will need some CSS and a bit of Javascript. I used a jQuery plugin called Flexslider for this one, and I like it a lot. But almost any other slider would work too. The only important part for this effect is a callback function that gets triggered before each sliding transition.

Flexslider does exactly that with its before method. You pass it the $slider variable (the parent element), and then apply a class to it that later controls the animation state. After the animation has finished, we need to remove that class again. My wiggle lasts about a second, so I put in a setTimeout for that duration (plus a little more for good measure).

$slider.flexslider({
  //animation: 'slide',
  //selector: '.slider__item',
  //animationLoop: false,
  //slideshow: false,
  before: function($slider){
    $slider.addClass('slider--shaking');
    window.setTimeout(function(){
      $slider.removeClass('slider--shaking');
    }, 1200);
  }
});

The Animation

Next up is the actual CSS keyframe animation that makes the bottles swing from side to side. Mine looks like this:

@keyframes wiggle {
  25%  { transform: rotate3d(0, 0, 1, 6deg)  }
  50%  { transform: rotate3d(0, 0, 1, -4deg) }
  75%  { transform: rotate3d(0, 0, 1, 2deg)  }
  100% { transform: rotate3d(0, 0, 1, 0deg)  }
}

We tilt the items first right, then left, then right again, losing momentum in each turn to simulate the inertia a real bottle would have.
The rotate3d is there to force hardware acceleration, which makes for smoother animation performance. Also, be sure to include vendor prefixes for the transform - or, if you’re lazy like me, let Autoprefixer do that for you.

The last step is to apply the keyframe animation to your slider every time it gets triggered.
Two things are important here:

  1. Define the transform-origin for each object. This will be the fixed point that anchors the animation; it corresponds to the center of gravity in the real world. For my bottles, that would be center bottom.

  2. 💡 Pro Tip: to make it seem more realistic, apply a little delay to every other bottle, so they don’t all wiggle in unison. A small offset in timing does the trick.

.slider--shaking .slider__item {
  //disable hover effects while transitioning
  pointer-events:none;

  //set up the wiggle animation
  animation-name: wiggle;
  animation-duration: 1s;
  animation-fill-mode: both;

  //set the 'fixed point' of the animation
  transform-origin:bottom center;
}

.slider--shaking .slider__item:nth-child(2n) {
  //slightly offset every other item
  animation-delay:.1s;
}

Aaand you’re done! Not much to it, but makes for a nice touch and a cool thing to show off. 🍾