Optimizing Laravel app for 100/100 PageSpeed score

These are some of my notes on how to optimize a website (specifically one built with Laravel) to score 100/100 in Google PageSpeed Insights. The tests keep changing, always pushing toward better page performance and user experience; sites I optimized to a score of 100 four months ago now score lower. I'm going to update this document as I learn how to reach that 100 goal.

The whole process is split into smaller parts:

  • images
  • scripts
  • styles
  • server speed

[Screenshot: 100/100 PageSpeed score]

Images

If you already have images on your website (static ones, part of the design), you should optimize them if you haven't done so yet. Different image types need different tools to optimize them. These are command line tools you can use on Linux or OS X, and they are also useful for the PHP package described later, which can leverage them to optimize images when users upload new content.

Images influence the Prioritize Visible Content test, as described in this discussion. Basically, specify your image width and height in the image tag.
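For example (the path and dimensions here are only placeholders):

<img src="/images/header.jpg" width="1920" height="600" alt="Header image">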

Tools

  • pngcrush — installation and usage are described on this link and this one

  • jpegoptim source code

    yum install -y libjpeg-turbo-devel
    
    ./configure
    make
    make strip
    make install
    
  • gifsicle

    curl -OL <download_link>
    tar -xzf <download_filename>
    cd <extracted_directory>
    ./configure
    make
    make check
    make install
    

There are other good tools with better compression, like mozjpeg, but I haven't tested them.

Usage

The following commands are what I used to optimize photos already uploaded to my site. This is lossless compression.

For png photos:

find . -type f -name "*.png" -printf '%p\n' -exec pngcrush -ow -q -reduce -brute "{}" \;

... and jpg photos:

find . -type f \( -name "*.jpg" -o -name "*.jpeg" \) -exec jpegoptim --strip-all --all-progressive "{}" \;

Or, to trim the size down a little more, you can use the -m<quality> (--max=<quality>) option. A quality of 85 is good enough.

find . -type f \( -name "*.jpg" -o -name "*.jpeg" \) -exec jpegoptim --strip-all --all-progressive -m85 "{}" \;

This is actually necessary to achieve a good score in the Google PageSpeed test.

Using with framework

I'm working with the Laravel PHP framework, and there is a nice package that can leverage those tools to compress images before storing them, for the case where you allow users (maintainers) to upload new photos. It is laravel-image-optimizer.
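A rough sketch of how the package can be used — here I simply call its facade on a freshly uploaded file (the field name and storage path are placeholders; check the package README for the exact API and the middleware it also provides):

use Spatie\LaravelImageOptimizer\Facades\ImageOptimizer;

// store the uploaded photo, then optimize it in place
$path = $request->file('photo')->store('uploads');
ImageOptimizer::optimize(storage_path('app/' . $path));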

Other tools

One thing these image optimization tools will not handle is downscaling, so if you insert an image from, for example, Unsplash, you will end up with an unnecessarily large resolution.

To do that, use packages such as Intervention Image or just plain ImageMagick commands.

mogrify -resize 1920x\> ./uploads/uipazkzjv64-jonathan-pielmayer.jpg
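If you would rather do the same downscaling from PHP, here is a sketch using the Intervention Image package ($path is a placeholder; the constraints keep the aspect ratio and prevent upscaling smaller images):

use Intervention\Image\Facades\Image;

Image::make($path)
    ->resize(1920, null, function ($constraint) {
        $constraint->aspectRatio(); // keep proportions
        $constraint->upsize();      // never enlarge smaller images
    })
    ->save($path, 85); // overwrite, re-encoded at quality 85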

Also, these pictures usually come at maximum quality. To trim the size down significantly while keeping the quality decent, use the -m parameter of the jpegoptim command. I learned that -m85 is good enough quality for my needs, and Google is satisfied with it.

Amazon S3

If you push images to an Amazon S3 bucket, you'll have a problem with expiration headers not being set. To solve that, you need to set up your upload process like this:

// $img is an Intervention Image instance; 5184000 seconds = 60 days
\Storage::drive('s3')
    ->put(
        $path,
        $img->stream()->getContents(),
        [
            'visibility' => 'public',
            'CacheControl' => 'max-age=5184000',
            'ContentType' => $img->mime()
        ]
    );

If you have many images already uploaded, you can add the headers to them with the Amazon CLI tool. I cannot remember the exact command, but if I remember correctly it is something like this answer.
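The general idea with the AWS CLI is to copy the objects onto themselves while replacing their metadata; a sketch (bucket name and prefix are placeholders) might look like this:

aws s3 cp s3://my-bucket/uploads/ s3://my-bucket/uploads/ --recursive \
    --metadata-directive REPLACE --cache-control "max-age=5184000" --acl public-read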

Scripts

With Laravel you have Elixir or Mix included, a great tool for merging and minifying scripts.

The problems are with external scripts: how do you handle the Google Analytics script or the Mailchimp script? They suggest loading the scripts from their servers so you always have an up-to-date version, but those files do not have proper cache control headers, and Google will not forgive you even for its own analytics.js file.

We could just merge them with our internal scripts. So first we need a way to download them, and then make sure we fetch a fresh version at least once a day.

This way we will have one file with internal and external scripts, refreshed every day.
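A sketch of that daily refresh using Laravel's scheduler — the target path resources/assets/js/vendor/analytics.js is only an example, and the downloaded file is then picked up by the regular Elixir/Mix build:

// app/Console/Kernel.php (excerpt)
protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        $contents = file_get_contents('https://www.google-analytics.com/analytics.js');

        if ($contents !== false) {
            file_put_contents(resource_path('assets/js/vendor/analytics.js'), $contents);
        }
    })->daily();
}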

Make sure to add the async attribute to your script element:

<script src="..." async></script>

If you had this script tag in the header, there is a danger that you have an inline script somewhere that does "some small thing". Such snippets will fail if they rely on libraries like jQuery, because those are now loaded asynchronously and may not be available yet.

Styles

Concatenate and minify these too. Extract the critical styles that are necessary for the initial render of the page. This was the hardest part for me; I tested some of the available tools, but ended up writing my own script.
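Whatever tool produces the critical CSS, the idea is to inline it in the layout's head and load the full stylesheet without blocking the render. A minimal Blade sketch, assuming the critical rules end up in public/css/critical.css and you use the mix() helper (elixir() works the same way):

{{-- resources/views/layouts/app.blade.php (excerpt) --}}
<style>{!! file_get_contents(public_path('css/critical.css')) !!}</style>
<link rel="stylesheet" href="{{ mix('css/app.css') }}" media="print" onload="this.media='all'">

The media="print" / onload swap is just one common trick for loading the full stylesheet asynchronously; any equivalent technique will do.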

Server Speed

Apache Config

I'm using the Apache HTTP server for my apps, and there are a few things you can do to improve performance. A good resource for example configuration is the HTML5 Boilerplate source code, from which you can pick what you like and move it into your Apache config or .htaccess file.
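For illustration, a small excerpt of the kind of rules you can pick up there, assuming mod_expires and mod_deflate are enabled (the exact types and lifetimes are up to you):

<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
</IfModule>

<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>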

Application optimization

Optimizing response time is usually "manual" work: check for slow queries or external services your app is using (external API calls, sending emails). Caching is one more thing you can do, but make sure the app already has decent speed before introducing it, since a cache can mask problems which would otherwise be obvious.
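A quick way to spot slow queries during development is Laravel's query log — a debugging sketch, not something to leave in production code:

use Illuminate\Support\Facades\DB;

DB::enableQueryLog();

// ... run the code path you want to inspect ...

// each entry contains the SQL, its bindings and the execution time in milliseconds
dd(DB::getQueryLog());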

Minifying output HTML

This will actually slow down the initial response time, but later, when the cache is used, it will produce a smaller response.

With Laravel, just create a middleware which will minify the HTML output.

This is an example of a simple minify middleware. Of course it can be done much better; this one only collapses the whitespace between HTML elements.

<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Response;

class MinifyHtml
{
    /**
     * Handle an incoming request.
     *
     * @param  \Illuminate\Http\Request $request
     * @param  \Closure $next
     * @return mixed
     */
    public function handle($request, Closure $next)
    {
        /** @var Response $response */
        $response = $next($request);

        $contentType = $response->headers->get('Content-Type');
        if (strpos($contentType, 'text/html') !== false) {
            $response->setContent($this->minify($response->getContent()));
        }

        return $response;
    }

    public function minify($input)
    {
        $search = [
            '/\>\s+/s',  // strip whitespaces after tags, except space
            '/\s+</s',  // strip whitespaces before tags, except space
        ];

        $replace = [
            '> ',
            ' <',
        ];

        return preg_replace($search, $replace, $input);
    }
}
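For the middleware to actually run it has to be registered; one way is to add it to the global middleware stack in app/Http/Kernel.php:

// app/Http/Kernel.php (excerpt)
protected $middleware = [
    // ...
    \App\Http\Middleware\MinifyHtml::class,
];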

Laravel speed optimization with caching

The package I like the most is Spatie's responsecache. However, there are some tricky bits you need to watch out for. For a start, you need to review the default cache profile and adjust it to your needs. The author considered this, so profiles are easy to swap.

A profile defines exactly which requests and which responses should be cached, and you need to be careful about this to avoid unintended effects. I shaped my cache profile mostly by trial and error.

It depends on the case, but if most of your users are not signed in when they use the site, you can simply decide not to cache responses for signed-in users. In my case those were admins, and caching their pages would have been harder to handle.

All routes are cached by default

One more thing to note is that all routes are cached by default. I had a case where I didn't want that, and there is a way to turn off caching for a particular route with a middleware. It all depends on your needs.
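For example, the package ships a middleware for exactly this; assuming it is registered as a route middleware named doNotCacheResponse in app/Http/Kernel.php, a route can opt out like this (route and controller names are placeholders):

// routes/web.php
Route::get('/dashboard', 'DashboardController@index')
    ->middleware('doNotCacheResponse');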

CSRF tokens

When your forms contain CSRF input fields and caching is turned on, every user will get the same CSRF token. My solution was to have a script which calls the server after the page has loaded and fills in the CSRF token values. That way the page can be cached with no problems.
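A sketch of that approach — the route path is my example, not the only way to do it:

// routes/web.php -- returns a fresh token for the current session
Route::get('/csrf-token', function () {
    return response()->json(['token' => csrf_token()]);
});

A small script then fetches this endpoint after page load and replaces the value of every hidden _token input (and the csrf-token meta tag, if you use one for AJAX requests).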

Changing content

If you have an admin panel where a user can change the content of the site, you need to clear the cache when that happens. The way to do that is to listen for those changes. In my case, non-admin users couldn't change any model in Laravel, so I just listened for all model changes and cleared the cache.

// e.g. in the boot() method of a service provider
use Illuminate\Support\Facades\Event;
use Spatie\ResponseCache\ResponseCache;
// ...
Event::listen(['eloquent.saved: *', 'eloquent.created: *', 'eloquent.deleted: *'], function () {
    $this->app[ResponseCache::class]->flush();
});

Another way is to build a trait that you can apply only to the models whose changes should clear the cache, as sketched below.
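A sketch of such a trait — the name and namespace are just examples; Laravel automatically calls the boot<TraitName> method when a model using the trait boots:

namespace App;

use Spatie\ResponseCache\ResponseCache;

trait ClearsResponseCache
{
    public static function bootClearsResponseCache()
    {
        // flush the whole response cache whenever a model using this trait changes
        static::saved(function () {
            app(ResponseCache::class)->flush();
        });

        static::deleted(function () {
            app(ResponseCache::class)->flush();
        });
    }
}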

Cache profile

namespace App;

use Illuminate\Http\Request;
use Spatie\ResponseCache\CacheProfiles\BaseCacheProfile;
use Spatie\ResponseCache\CacheProfiles\CacheProfile;
use Symfony\Component\HttpFoundation\Response;

class LyraCacheProfile extends BaseCacheProfile implements CacheProfile
{
    public function shouldCacheRequest(Request $request)
    {
        // if signed in, do not cache
        if (auth()->check()) {
            return false;
        }

        if ($this->isRunningInConsole()) {
            return false;
        }

        return $request->isMethod('get');
    }

    public function shouldCacheResponse(Response $response)
    {
        return $response->isSuccessful() && !$response->isRedirection();
    }

    public function cacheNameSuffix(Request $request)
    {
        if (auth()->check()) {
            return auth()->user()->id;
        }

        // handling pages with flashed session data
        return implode(',', array_only(session()->all(), session()->get('flash.old', [])));
    }
}

Using Redis as a cache driver

There can be a problem if you have many pages with different parameter combinations and you are storing the cache in files: every combination will be cached, and you may run out of space. A better solution in this case is to use Redis as the cache driver, and configure Redis itself to behave like a cache by capping its memory and evicting old keys (in redis.conf):

maxmemory 350mb
maxmemory-policy allkeys-lru
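On the Laravel side you then point the cache at that Redis instance, for example in .env:

CACHE_DRIVER=redis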

This is knowledge I gathered from previous projects, and these notes were written after I had completed them. I'm looking forward to working on another project where I can go through these notes, clean everything up and extend them with more code examples. If you have any suggestions or ideas, or find mistakes, let me know.

Author

I plan to write more articles about common Laravel components. If you are interested, let's stay in touch.