I discovered this technique while building a little iOS game recommendation page; here is the prototype: hitme.railsplayground.net/hecate/ The basic functionality of the page is to slide left or right, exposing more game recommendations as the user swipes through the list and pulling more to display based on their ratings.

I was getting a little cocky with the ease of use and awesomeness of jQuery templating and started throwing in all kinds of unnecessary silliness like CSS3 cube transitions and an HTML5 loading indicator, and then I found the straw that broke the browser’s back: applying -webkit-transition: left 0.3s to an element containing over 400 child elements resulted in that element failing to render at all. This doesn’t happen on desktop browsers, but I was able to reliably reproduce it on an iPhone 3GS. The relatively small number of elements needed to totally cripple display makes this a very powerful weapon in your arsenal. Go forth and destroy!
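If you want to reproduce the carnage yourself, here is a minimal sketch of the setup, assuming jQuery is loaded as in the examples later in the post; the IDs, markup, and child count are made up for illustration, but the transition and the 400+ children are the ingredients described above:

<style type="text/css">
#carousel {
    position: absolute;
    left: 0;
    /* the transition that stops the element rendering on an iPhone 3GS */
    -webkit-transition: left 0.3s;
}
</style>

<div id="carousel"></div>

<script type="text/javascript">
// stuff the panel with well over 400 children
var html = '';
for (var i = 0; i < 450; i++) {
    html += '<div>game recommendation ' + i + '</div>';
}
$('#carousel').html(html);

// slide the panel as a swipe would; on the 3GS the element simply fails to render
$('#carousel').css('left', '-320px');
</script>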

You’ve probably heard of using a setTimeout call with a time value of 0 in order to mimic multi-threading in JavaScript (there’s a sketch of that pattern after the list below). This is not a good idea. If we really want to achieve the bliss of unresponsiveness we must beware of this type of optimization. What are the browser limits anyway? How much pain do I have to cause before my browser screams for mercy? Thanks to a recent talk by slicknet we know that these browsers will whimper when they hit these limits during script execution:

  • IE – 5 million statements
  • Firefox – 10 seconds
  • Safari – 5 seconds
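For reference, here is a minimal sketch of that setTimeout(0) chunking pattern; the function name and chunk size are made up for illustration. It processes a long list in small pieces and yields back to the browser between pieces, which is exactly how you avoid tripping the limits above:

// process a big array in small chunks, yielding to the browser between
// chunks so the UI stays responsive and no script limit is ever hit
function processInChunks(items, chunkSize, handleItem) {
    var index = 0;
    function doChunk() {
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            handleItem(items[index]);
        }
        if (index < items.length) {
            // a 0ms timeout lets the browser paint and handle input first
            setTimeout(doChunk, 0);
        }
    }
    doChunk();
}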
The following code from John Resig’s recent post about Twitter performance will reduce your function calls by a factor of roughly 8, since the scroll event fires about every 30ms while the check only runs every 250ms. Doing away with such nonsense can provide a much livelier, more exciting, and more dangerous application.
var didScroll = false;
$(window).scroll(function() {
    didScroll = true;
});

setInterval(function() {
    if ( didScroll ) {
        didScroll = false;
        // Check your page position and then
        // Load in more results
    }
}, 250);

go crazy with animations


If you are going to do some animating you might as well animate everything all the time. Here is an example of what not to do: a sedate and boring way to navigate relationships on My IGN. We can make this much better with way more animation. Try and click on the leaf nodes now, sucker.
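In that spirit, a minimal sketch of the animate-everything-all-the-time approach, assuming jQuery is on the page as in the later examples (the selector, offsets, and timings are made up):

// animate every element on the page, all the time, whether asked to or not
function wiggleEverything() {
    $('body *').each(function () {
        $(this)
            .animate({ opacity: 0.5, marginLeft: '+=10px' }, 400)
            .animate({ opacity: 1, marginLeft: '-=10px' }, 400);
    });
}

// keep the effect queues topped up so nothing ever sits still
setInterval(wiggleEverything, 900);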

When trying to piss people off with web development it is always nice to fall back on the tried and true method of shoving as much stuff down the pipe as you can. People use many methods to achieve this goal.

  1. Use a bigass JavaScript library. See this pretty little list of tiny JavaScript libraries, all of which are under 5K. Do not use these. Small JavaScript libraries are for pussies. Instead, check out the full commercial version of Ext JS, which weighs in at a respectable 1.1MB minified.
  2. Use larger-than-necessary images. Even though there are free services which dynamically resize images, devious web developers will avoid these at all costs and gleefully display heavy 400×400 graphics as tiny thumbnails, relying on the user’s browser to do the resizing.
  3. Never attempt to use local storage. This is especially important when developing websites for mobile devices. The puny caching ability of phones, maxing out at around 2MB, can easily be exceeded, forcing redundant requests for all your page components. (A sketch of the sort of local storage caching you should be avoiding follows this list.)
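For the record, here is a minimal sketch of the kind of local storage caching a properly devious developer refuses to write; the key name and the /recommendations.json endpoint are made up for illustration, and jQuery is assumed as elsewhere in the post:

// cache an expensive JSON payload in localStorage so repeat visits
// skip the network entirely (exactly what we are told never to do)
function getRecommendations(callback) {
    var cached = localStorage.getItem('recommendations');
    if (cached) {
        callback(JSON.parse(cached));
        return;
    }
    $.getJSON('/recommendations.json', function (data) {
        localStorage.setItem('recommendations', JSON.stringify(data));
        callback(data);
    });
}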

You are probably thinking: “What is all this nonsense about web page performance? I’ve got jQuery and a million plugins and I’m ready to do some damage.”

Excellent! Everybody knows that modifying the DOM is expensive, but hey, we’re making fancy shit. It is not immediately obvious, but interrogating and changing elements in a read, read, write, write order triggers only a single reflow, while read, write, read, write triggers two, because the browser has to flush its queue and perform all pending DOM-affecting work before each read. Thanks to Stoyan for explaining this and helping us to write horrifically performing JavaScript like the following (which could be way faster if the reads were grouped; a sketch of that grouped version follows the example).

<!DOCTYPE html>
<html>
<head>
    <title>Boxes</title>

    <script type="text/javascript" src="jquery.js"></script>
<style type="text/css">
.dancer {
    height:20px;width:50px;border:solid 1px;
    float:left;margin:3px;padding:3px;
}
</style>
</head>

<body>

<?php
// spit out 624 boxes for the script below to abuse
for ($i = 0; $i < 624; $i++) {
    echo '<p id="item'.$i.'" class="dancer">dance</p>';
}
?>

<script type="text/javascript">
// deliberately interleave reads and writes: every css("width") read forces
// the browser to flush the queued height changes and reflow the page
for (var i = 0; i < 100; i++) {
    var next = i + 1;
    console.log($('#item' + i).css("width"));    // read: flushes queued layout work
    $('#item' + i).css("height", "50px");        // write: queued
    console.log($('#item' + i).css("width"));    // read: flushes queued layout work
    $('#item' + i).css("height", "20px");        // write: queued
    console.log($('#item' + next).css("width")); // read: flushes queued layout work
    $('#item' + next).css("height", "50px");     // write: queued
    console.log($('#item' + next).css("width")); // read: flushes queued layout work
    $('#item' + next).css("height", "20px");     // write: queued
}
</script>

</body>
</html>
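For comparison, here is a minimal sketch of the grouped version mentioned above: all the reads happen first, then all the writes, so the browser can batch the layout work instead of reflowing on every read. The item IDs match the example above; the rest is made up for illustration.

<script type="text/javascript">
// read phase: collect every width first; with no writes queued,
// these reads do not keep forcing the browser to reflow
var widths = [];
for (var i = 0; i <= 100; i++) {
    widths[i] = $('#item' + i).css("width");
}
console.log(widths);

// write phase: batch all the height changes so the browser can
// coalesce them into a single reflow when it next paints
for (var j = 0; j <= 100; j++) {
    $('#item' + j).css("height", "50px");
}
</script>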

Your boss probably runs YSlow, so yeah, you have to obey Souders and minimize everything, but you can still fuck shit up by executing mad JavaScript before the browser gets the chance to start painting actual content.

Any JavaScript that affects the DOM and is placed above your actual HTML content will be executed before the browser attempts to display anything, which gives us plenty of opportunity to delay delivery of what the user actually wants. This scenario is often seen in the wild when companies add a cornucopia of third-party scripts to their pages. Is your favorite website using Optimizely and Omniture to help serve you better? Use dynaTrace to see how many hundreds of milliseconds these tools were able to delay all rendering when their scripts are placed in the head, as the vendors recommend.

Let’s delay rendering with some filthy code:

// drop this in a script block above your content: the browser cannot paint
// anything until the loop and its 100,000 console.log calls have finished
for (var i = 0; i < 100000; i++) {
    console.log('writing line ' + i);
    if (i === 99999) {
        document.write('whats up?');
    }
}
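To make the placement explicit, here is a minimal sketch of a page that runs that loop from a script in the head; the title and paragraph are made up for illustration, and nothing in the body paints until the loop completes:

<!DOCTYPE html>
<html>
<head>
    <title>Please hold</title>
    <script type="text/javascript">
    // executed during parsing, before any content exists to paint
    for (var i = 0; i < 100000; i++) {
        console.log('writing line ' + i);
    }
    </script>
</head>
<body>
    <p>This only shows up after the console has been thoroughly abused.</p>
</body>
</html>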