Since the creation of the Internet, average file sizes have grown steadily. What started out as kilobytes has progressed to megabytes (yes, plural), and our files are still growing.
While this phenomenon isn’t disconcerting at first glance, its impact on performance and maintainability can be severe. Add in aging devices, bandwidth restrictions, or simply slow connections, and we have a much bigger problem.
Thankfully, we have control over not only our file sizes, but also how our pages are rendered in the browser. This sort of control gives web developers like ourselves a chance to help ease this problem, and optimize our code for better performance in the process.
Why bother?
I completely understand the lack of interest when most internet connections in the US are fairly fast these days. I mean, if everything works fine already, why bother?
Performance and optimization are about more than how quickly we can download content. There are also quite a few SEO and UX benefits to be had by taking the time to look at our code. Not to mention, smaller files lower our bandwidth costs as hosts and reduce bandwidth usage (think ISP/cellular data caps) on the user level as well.
Thinking modular is the first step
Modular code typically adds bloat in the form of options we never use. Here, we want to think modular in terms of combining as many common pieces of our code as possible. If we can combine two CSS classes into one and use less code to achieve the same result, we should.
Modularity isn’t as important when it comes to basic HTML and CSS, but when you get into the more complex world of JavaScript, having too much bloat can hurt you — especially on mobile.
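For example, two classes that share the same declarations can usually collapse into one. A minimal sketch (class names and values here are hypothetical):

```css
/* Before: two near-identical classes, maintained separately */
.btn-save   { padding: 10px 20px; border-radius: 4px; background: #2a7ae2; }
.btn-submit { padding: 10px 20px; border-radius: 4px; background: #2a7ae2; }

/* After: one shared class does the same job with less code */
.btn { padding: 10px 20px; border-radius: 4px; background: #2a7ae2; }
```

If the two buttons ever need to differ, a small modifier class on top of `.btn` still beats duplicating every declaration.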
Minimize HTTP and dependency requests
Dependency requests are by far the biggest factor in slowing down most page loads. Each additional request adds bloat and another layer of complexity to the downloading and parsing process. It’s easy to forget that images called from your stylesheet count as requests too, so be sure to limit those and use alternative optimization methods such as sprites or SVG when possible.
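A sketch of the sprite technique, assuming a hypothetical icons.png sheet: one file serves every icon in a single request, and background-position selects which icon is shown.

```css
/* One sprite sheet replaces many separate image requests. */
.icon {
  background: url(icons.png) no-repeat;
  width: 16px;
  height: 16px;
  display: inline-block;
}

/* Each icon is just an offset into the same image. */
.icon-search { background-position: 0 0; }
.icon-close  { background-position: -16px 0; }
.icon-menu   { background-position: -32px 0; }
```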
While we’re on the topic of external dependencies: if your website is large enough to require a few dozen requests at minimum, it may be time to consider a CDN. Distributing your content through a CDN won’t decrease file sizes or load times as much as removing extra HTTP requests altogether, but it can at least take slow server connections out of the equation.
Production vs. development environment code bases
There should be a stark difference between your development and production code bases. This step alone can sometimes yield the largest decrease in file sizes across the board.
It’s typical today to see developers refer to their “production” or “development” environment, especially on large-scale projects, but the distinction is useful on smaller projects too. The largest difference between the two environments comes down to image compression and the minification/compression of code. In the end, we want our production environment to be as lean and fast as possible, while our development environment should be the same minus the image/code compression.
Using the built-in tools like Photoshop’s “Save for web” compression can be a good starting point for images. There is a plethora of knowledge to be explored elsewhere as well with conversations on image formats, compression algorithms, quality control, and best practices.
For code, the best use of compression usually depends on the language you’re working with. It’s also debatable whether minification helps or hurts other people trying to understand your code, but that’s a conversation for another time. When it comes to plain HTML and CSS, I use tools like Google’s htmlcompressor and the YUI Compressor for CSS.
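As a rough illustration of what these minifiers do under the hood, here is a deliberately naive sketch in JavaScript. Real tools such as YUI Compressor handle comments inside strings, calc() expressions, and many other edge cases far more carefully, so treat this as a teaching aid rather than a production minifier.

```javascript
// Naive CSS minification sketch: strips comments, collapses whitespace,
// and drops redundant punctuation spacing. Illustration only.
function naiveMinifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, "")   // strip /* comments */
    .replace(/\s+/g, " ")               // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, "$1")  // drop spaces around punctuation
    .replace(/;}/g, "}")                // drop trailing semicolons
    .trim();
}

const source = `
/* button styles */
.btn {
  color: #fff;
  padding: 10px 20px;
}
`;

console.log(naiveMinifyCss(source));
// .btn{color:#fff;padding:10px 20px}
```

Even this toy version shaves a meaningful percentage off a typical stylesheet, which is exactly why minification belongs in the production build and nowhere near your readable development source.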
Write smarter, more readable code
Sometimes the very code we’re writing is the slowest link in the chain. Inefficient CSS or bloated JavaScript can hurt loading times more than you might think. This Mozilla post goes into great detail about the importance of writing lean CSS selectors and explains how browsers evaluate them. In short, writing out the exact path down a chain of selectors is much less efficient than using the smallest uniquely identifiable selector instead. Both direct the styling to the same element; the latter simply gets the job done much, much faster.
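For instance (selectors hypothetical), both rules below can style the same link, but browsers match selectors right to left, so the long chain forces a check of every ancestor for each candidate element:

```css
/* Less efficient: every <a> on the page is tested against
   the full ancestor chain, right to left. */
#nav ul li a.menu-link { color: #333; }

/* Faster: one uniquely identifiable class reaches the
   same element with a single check. */
.menu-link { color: #333; }
```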
JavaScript can be even worse than poorly written CSS, and in many cases it’s easily overlooked. How many times have you pasted an external JS library into your project without really looking at the source itself? Typekit is a telling example: when their servers stall, a page using their fonts can be brought to its knees, adding 30 seconds or even minutes of extra load time.
Thankfully, such events are rare, but it’s still good practice to load JavaScript last when possible, as Google Analytics does. Doing so lets the browser parse the files in the head (CSS, other requests, etc.) and display the markup before JavaScript begins to slow things down.
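A minimal sketch of that ordering, with a hypothetical analytics script: placing it at the end of the body, or marking it async or defer, lets the markup render before the script runs.

```html
<body>
  <p>Page content renders first…</p>

  <!-- Hypothetical analytics snippet, loaded last so a slow or
       stalled script server can't block the initial render. -->
  <script src="analytics.js" async></script>
</body>
```

The async attribute downloads the script without blocking parsing; defer goes a step further and also delays execution until the document has been parsed.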
Keep HTML very simple
In keeping with our goal to write leaner CSS selectors and keep bloat to a minimum, writing efficient HTML should also be a priority.
CSS resets often target all common elements and enforce “reset” styling on them. So even if you aren’t targeting that extra div, it’s likely still slowing things down: at a minimum, its padding and margin have to be reset. An extra div or two won’t really hurt anything, though; only when you end up with dozens of them do things get out of hand. And with the new elements introduced in the HTML5 spec, we have much more flexibility in this area.
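As an illustration (markup hypothetical), a reset-burdened stack of anonymous divs can often flatten into semantic HTML5 elements:

```html
<!-- Before: nested divs the CSS reset still has to touch -->
<div class="header">
  <div class="nav">
    <div class="item"><a href="/">Home</a></div>
  </div>
</div>

<!-- After: HTML5 elements carry the same meaning with less nesting -->
<header>
  <nav><a href="/">Home</a></nav>
</header>
```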
Google likes it when we write cleaner code
Google has made it a priority to whip the internet collectively into shape. To occupy prominent positions in its search results, pages must now pay critical attention to how they’re rendered: calling too many external resources, serving absurdly large images, or even shipping poorly written JavaScript can pull a site down in the rankings.
Thankfully though, this is all with good intention, as the requirements for a good search ranking are built around good development practices. Google also offers a very in-depth guide to optimizing different aspects of your site for better SEO, which happens to promote fantastic development practices at the same time.
Conclusion
When optimizing our code, we have to think not only about file sizes but also about how the code will be read, whether by browsers or by other humans. Mobile use should also be taken into consideration, with many service providers enforcing very constraining data caps these days.
So while all this optimization takes extra time, it’s certainly a worthwhile endeavor: it offers better performance in the browser and on mobile, promotes better development practices, and can even earn your content a higher rank on search engines like Google.
Next time you prepare to launch, throw your images into a compression engine… You may be surprised how many megabytes it can shave off!
Featured image, modular speed image via Shutterstock.