About Google Pagespeed Insights and Xara
By default, websites generated by Xara Web Designer or Xara Designer Pro are not server optimized. Quite correctly, in order to be universally compatible, they make no attempt to enable compression or use specific .htaccess settings, for example. The images themselves are also not fully compressed and include metadata that is unnecessary for display purposes. In general, Xara is optimized for speed, and squeezing every last bit of compression from an image is a time-consuming process, so again this is understandable. On the plus side, the mobile-ready themes are generally scored as mobile friendly by Google, so there is little we need to do on that front. Since page speed is increasingly being taken into account in Google ranking, it makes sense to optimize it as much as possible, and of course to make the most efficient use of resources.
There are various elements to page speed according to Google, and there are other factors rated highly by other page-speed analysis websites, but we are going to concentrate on Google's criteria for obvious reasons!
Also, this article is based on a standard LAMP setup (Linux, Apache, MySQL, PHP) and a cPanel-based hosting account, although much of it can be reused on other setups.
Google Pagespeed Insights Ranking Criteria
For an overview of the aspects that Google uses to rank your website's page speed, please see the following article, but I cover most of them as I go through.
Optimizing a Xara Website for Google
So let's go! We are going to use default themes with no modifications as our test cases, on a shared server with caching switched off. You will need FTP access to your website (and be familiar with its use) and to have made a backup before starting! You will also need a plain text or code editor (Notepad, for example); don't use a word processor or rich text editor, as it may generate additional hidden code.
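If you also have shell (SSH) access, a quick server-side snapshot is a convenient form of backup. A minimal sketch; SITE_DIR is an assumption, so point it at your real web root (often ~/public_html on cPanel hosting):

```shell
#!/bin/sh
# Snapshot the whole site directory before changing anything.
# SITE_DIR is an assumption - point it at your real web root
# (often ~/public_html on cPanel hosting).
SITE_DIR="${SITE_DIR:-$HOME/public_html}"
BACKUP="site-backup-$(date +%Y%m%d).tar.gz"
if [ -d "$SITE_DIR" ]; then
    tar -czf "$BACKUP" -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
    echo "Backed up $SITE_DIR to $BACKUP"
else
    echo "No $SITE_DIR found - set SITE_DIR to your web root" >&2
fi
```

Restoring is then a single `tar -xzf` of the archive if anything goes wrong.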
Hopefully, the suggestions listed here will apply to most themes but there may be specifics out of the scope of this article.
We are using the Google Pagespeed Insights website to generate scores, although extra functionality can be achieved through API access.
I am doing two themes in parallel, Black Noir and Applab.
Here are the lousy default scores with no optimization. (I switched off some optimizations that are normally enabled by default on my hosting, partly to better show the improvement at each stage and partly because certain optimizations can conflict with each other. Your initial results may be better or worse than these!)
BLACK NOIR – 44/100 Mobile, 64/100 Desktop.
APPLAB – 38/100 Mobile, 54/100 Desktop.
5 out of the 10 criteria are flagged as problems. The 5 that pass at this stage are:-
Avoid Landing page redirects
Self-explanatory: you are simply visiting a page, so no redirects should occur. If you add a www. redirect or https in the future, ensure you update all incoming links and sitemaps to the full new URLs to avoid redirects as much as possible.
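If you do later move to https and/or www., the cleanest approach is a single 301 that goes straight to the final URL in one hop, rather than chaining http to https to www. A hedged .htaccess sketch (example.com is a placeholder for your own domain; requires mod_rewrite):

```apacheconf
# Single-hop redirect: any non-canonical request goes straight to
# https://www.example.com/... in one 301, avoiding redirect chains.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
</IfModule>
```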
Prioritize Visible Content
Fortunately on the themes I am going to use this is taken care of. Watch out in future for poorly designed themes that break this or other rules.
Reduce Server Response Time.
By default, Google expects a page to start producing output within 0.2 seconds, and a good host should achieve this. As there are no database calls to be made and the pages generated by Xara are pre-generated HTML, this should never be a problem; if it is a problem on your hosting for a simple, basic website, you might want to consider changing your hosting, as this will only become more likely to be an issue as you add more to your website.
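You can check your own time-to-first-byte with curl. A sketch; the file:// default is only a stand-in so it runs without network access, so replace the URL with one of your own pages:

```shell
# Quick check of time-to-first-byte; Google wants the server to start
# responding within 0.2 seconds. Replace URL with your own page - the
# file:// default is only a stand-in so the sketch runs without network.
URL="${URL:-file:///etc/passwd}"
curl -s -o /dev/null -w 'TTFB: %{time_starttransfer}s\n' "$URL"
```

Run it a few times against your homepage; anything consistently over 0.2s will be flagged.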
Minify HTML.
I was actually surprised that Google didn't complain about this. Looking at the HTML code, at first glance there are a lot of line breaks and it is spaced out quite well, although there is very little indentation. On inspection with GTmetrix, it reckons the HTML can only be minified by 3%, so Google was sensible enough not to complain. Well done Xara for keeping it readable yet optimized! In fact, this warning usually goes away once we enable compression anyway.
Minify Javascript.
Another well done, Xara! It looks like the JavaScript libraries that are included must already be minified. If other themes include libraries that are not minified, look at the Minify CSS section below, as the same method can be used to rectify this.
The issues we need to try and fix.
So, working through the 5 issues that are flagged as problems.
The top priority to fix is Enable Compression, so let's add a few lines to .htaccess and see how it improves things. As Xara does not generate a .htaccess file by default, you may need to create it. If one is already there, make a backup of it first and add this code towards the bottom. There are numerous variants of this code, but this one seems to allow for old browsers and, most importantly, excludes (already compressed) images. Source: http://stackoverflow.com/questions/2835818/how-do-i-enable-mod-deflate-for-php-files
## SWITCH COMPRESSION ON
<IfModule mod_deflate.c>
SetOutputFilter DEFLATE
<IfModule mod_setenvif.c>
# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE masquerades as Netscape, but it is fine
# BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
# NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
# the above regex won't work. You can use the following
# workaround to get the desired effect:
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
# Don't compress images
SetEnvIfNoCase Request_URI .(?:gif|jpe?g|png)$ no-gzip dont-vary
</IfModule>
<IfModule mod_headers.c>
# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary
</IfModule>
</IfModule>
## END SWITCH COMPRESSION ON
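You can get a rough feel for what DEFLATE will save by gzipping a page locally and comparing sizes. A sketch (it creates a small sample page if run outside your site directory, so point it at your real index.htm instead):

```shell
#!/bin/sh
# Rough estimate of what mod_deflate will save on a page:
# compare the raw size with the gzipped size.
# Creates a small sample page if index.htm is absent, so the
# sketch runs anywhere - use your real page for a true figure.
[ -f index.htm ] || printf '<html><body>%s</body></html>' \
    "$(head -c 2000 /dev/zero | tr '\0' 'a')" > index.htm
RAW=$(wc -c < index.htm)
GZ=$(gzip -c index.htm | wc -c)
echo "raw: $RAW bytes, gzipped: $GZ bytes"
```

To confirm compression is actually live on the server, request a page with `curl -s -I -H 'Accept-Encoding: gzip' http://yoursite/` and look for a Content-Encoding: gzip header in the response.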
Here are the new results:
BLACK NOIR – 49/100 Mobile, 71/100 Desktop.
APPLAB – 42/100 Mobile, 63/100 Desktop.
That's a modest improvement already, and the Desktop warning is now orange instead of red on both themes!
So we are down to 4 issues remaining.
Next on the list is Leverage Browser Caching, so let's add a few more lines to .htaccess (there are various versions of these lists, but I have tweaked this one to satisfy Google; you can adjust it to your own preference).
## LEVERAGE BROWSER CACHING
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access 1 week"
ExpiresByType image/jpeg "access 1 week"
ExpiresByType image/gif "access 1 week"
ExpiresByType image/png "access 1 week"
ExpiresByType text/css "access 1 month"
ExpiresByType application/pdf "access 1 month"
ExpiresByType application/x-javascript "access 1 month"
ExpiresByType application/javascript "access 1 month"
ExpiresByType application/x-shockwave-flash "access 1 month"
ExpiresByType image/x-icon "access 1 month"
ExpiresDefault "access 1 week"
</IfModule>
## END LEVERAGE BROWSER CACHING
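mod_expires also emits an equivalent Cache-Control: max-age header automatically. If your host lacks mod_expires but has mod_headers, an alternative sketch is to set Cache-Control explicitly (the values below mirror the ones above: 604800 seconds is 1 week, 2592000 is 30 days; adjust to taste):

```apacheconf
# Alternative: explicit Cache-Control for static assets via mod_headers.
<IfModule mod_headers.c>
<FilesMatch "\.(jpe?g|gif|png|ico)$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>
<FilesMatch "\.(css|js|pdf)$">
Header set Cache-Control "max-age=2592000, public"
</FilesMatch>
</IfModule>
```

Use one approach or the other, not both, to avoid conflicting headers.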
Here are the scores after that optimization
BLACK NOIR – 67/100 Mobile, 86/100 Desktop.
APPLAB – 58/100 Mobile, 74/100 Desktop.
WOW – that's a huge improvement, particularly for the Black Noir theme! Mobile is now orange and Desktop is green!
Next we'll deal with Minify CSS. When we click the Show How To Fix link underneath, for the Black Noir theme it reveals that the highslide.css file could be optimized by 22%. On the Applab theme it shows ani.css could be optimized by 11% (even though this file is also in the Black Noir theme, Google does not flag it there). There are online and offline minifiers for this type of thing, but Google makes this easy and does it for you. Look for the link:
Download optimized image, JavaScript, and CSS resources for this page.
This gives a zip file containing several optimized resources for your convenience. Open the zip file and look in the css folder. We are going to substitute the minified versions for the originals in the index_html_files folder. Take a backup of the existing CSS files for your own peace of mind in case of problems, then extract the minified versions and upload them to the index_html_files directory.
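If you have shell access, those backups are a one-liner. A sketch, run from your site root (the mkdir only exists so the sketch runs anywhere; it is a no-op on a real site):

```shell
#!/bin/sh
# Keep a .bak copy of every stylesheet in index_html_files before
# overwriting it with the minified version from Google's zip.
# Run from your site root; mkdir is a no-op on a real site.
mkdir -p index_html_files
for f in index_html_files/*.css; do
    [ -f "$f" ] && cp -- "$f" "$f.bak"
done
echo "CSS backups done"
```

Restoring any broken stylesheet is then just copying the .bak file back over the minified one.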
Let's see what difference that has made:
BLACK NOIR – 67/100 Mobile, 86/100 Desktop.
APPLAB – 58/100 Mobile, 74/100 Desktop.
None at all, it seems! Never mind, there are only 2 things it's complaining about now.
So now we will deal with Optimize Images
- UPDATE – As of 2017, Google now defaults to 'lossy' image optimization, whereas previously it was 'lossless'. This means that some image quality will be lost during the optimization process, so if you choose to use the Google-generated optimized images, they will have slightly less quality than the originals. See my new article for more information.
This step is a bit more involved. There are several ways it can be approached; I will detail 2 methods here. First off, back up your images just in case. Note that optimization strips meta information from the files, so if you need that retained, you should omit this step or investigate an option that retains it.
First option, replace your images with the optimized versions given by google!
So we start by replacing our images with the optimized ones that Google just provided. In the zip of optimized files we got the CSS from, there is also a folder called image containing optimized versions of our images, all with their original filenames. We simply need to upload these files into the index_html_files directory, as with the CSS. Note that Google only gives a limited number of files per pass, so you will need to repeat this process several times.
In testing, I hit a point where Google kept flagging the same files even though I had replaced them! For example, on Black Noir, Google still complained about the 14@2x.jpg image even though I had just uploaded it! After some head scratching, I realized that Google changes the @ to an _ in the filenames, so after extracting the files you need to change each _ that Google introduced back to an @ symbol where that is what the filename should be.
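That renaming can be scripted. A sketch that assumes the @2x retina naming convention Xara uses, so files like 14_2x.jpg from Google's zip become 14@2x.jpg again (adjust the pattern if your filenames use @ elsewhere):

```shell
#!/bin/sh
# Google's zip renames '@' to '_', so 14@2x.jpg comes back as 14_2x.jpg.
# Rename such files back before uploading. Assumes the @2x retina
# naming convention Xara uses - adjust the pattern for other cases.
for f in *_2x.jpg *_2x.png; do
    [ -f "$f" ] && mv -- "$f" "$(printf '%s' "$f" | sed 's/_2x\./@2x./')"
done
echo "renames done"
```

Run it in the extracted image folder before uploading the files back to index_html_files.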
Anyway, this option does get there eventually, but the following option is quicker for many files.
Second Option, install jpegoptim and optipng (or similar) and manually optimize!
This option involves either installing these two applications (or others of your choice) on your local machine, or preferably on your server subject to permissions of your hosting. If you install them on your local machine, you will need to download all of the jpg and png files to your local machine, optimize them then re-upload them to your hosting.
So you need to go into the directory where the files are and use the following command line commands:-
jpegoptim --strip-all *.jpg
That optimizes all the JPEG files using its best normal compression method. You can then also use:-
jpegoptim --strip-all --all-progressive *.jpg
and that goes through them all again, this time trying the progressive compression method to see if it is more efficient for each image; if so, it recompresses with that method, and if not, it leaves the file as it was.
Then for PNG files, this uses default settings:
optipng *.png
or, to squeeze out every last byte (be warned, this may take ages!):
optipng -o7 *.png
The second command takes much longer and is very processor intensive, but can sometimes yield an extra few percent of compression on larger images, for those who are fanatical about such things.
If the files you optimized were not on the server, upload the optimized versions to the server now.
It still complained to me that a few files were not fully optimized. I hadn't fully investigated this at the time; it seemed that Google tolerates a small amount of imperceptible image degradation, whereas the command-line tools recommended above do not introduce any.
- Please see my new article – Google Pagespeed Image Optimization Updated 2017
So I just used the technique from Option 1 above, replacing the remaining files with the Google-optimized ones, noting the quirk of @ being changed to _ in the filenames Google produces and renaming them accordingly.
Finally, the image warning disappeared. Now let's see how much that has improved things:-
BLACK NOIR – 68/100 Mobile, 88/100 Desktop.
APPLAB – 68/100 Mobile, 88/100 Desktop.
Well, a modest improvement for Black Noir, but Applab has finally caught up. I suspect the difference between the two was always that Applab has more images, or that its images were less optimized.
So that just leaves one aspect being reported by Google:
Eliminate render-blocking JavaScript and CSS in above-the-fold content
Now, I have looked into this aspect, but it starts getting very complicated! I have gained some improvement by inlining some of the CSS and async-loading some of the JavaScript, but doing this wrong can break functionality on your website or make it display incorrectly. As these inclusions vary so much from theme to theme, I am not able to post anything that can be used in a generic way on other themes. Also, every time a site changes, even slightly, the files get regenerated, so any amendments made would need to be repeated. Really, this is something better addressed by Xara themselves, as it is much easier to get right at the point of generating the code than by coming along afterwards and trying to reorganize it.
So that's where I will leave it. Take from it what you will, apply it to your Xara or other websites (but take a backup first; I accept no responsibility for breaking your website etc.) and enjoy the reduced load on your server resources, quicker loading times for your website visitors and a better ranking on Google because your site has been optimized to its liking!
Additional:
It is really only recommended to perform these optimizations when a site is complete or near completion.
Enabling the cache, for example, means images load from your browser cache, so if you then change an image on the website, you will still see the old (cached) image (new visitors will get the newest uploaded image). You can force your browser to refresh everything with F5 or Ctrl-F5.
To prevent the optimized images and CSS files from being overwritten each time the site is published, ensure you select Fast Update (Changed files only) in the FTP settings in Xara. It will then only upload elements that have changed since the site was last published.
If you upload manually to your server, only update files that are new or have changed since last time.
If something does change, or new elements are added, you will need to repeat the relevant section for optimal results.
The .htaccess changes should be preserved, as Xara does not touch this file.