My Suggestions to Speed up Testing with Laravel and PHPUnit

22 August 2017

Over the last year at work we have been working on a new project, building our application on Laravel 5.*. I got to lead the project, so as the leader I wanted to set a good standard for how the application should be built. This meant I encouraged a Test Driven Development (TDD) approach and enforced 100% code coverage. To ensure this I set up a Continuous Integration process that not only runs all tests but also performs a code coverage analysis, and it won't deploy the code to AWS until there is 100% code coverage.

The project has definitely been a learning process, and sometimes a struggle, as a couple of team members had never done any automated testing at all. But over time everyone started to understand the process and see the value in it.

As you can guess, after a year of coding the test suite has grown to be pretty large. We currently have over 700 tests that make over 3,000 assertions. As we have added to the application, the testing time has gone up quite a bit, so every once in a while I try to optimize the tests to see if I can get them to run a little faster, as it saves a lot of time. This is a list of the optimizations and updates I have made to the way Laravel 5 is set up to test with PHPUnit by default.

Just for context, we haven't moved any tests to Dusk yet, so all of our functional tests still use laravel/browser-kit-testing to simulate actual clicks and user interaction on the site.

Use SQLite with the :memory: option

In my opinion, the first thing everyone should do is make sure their tests use SQLite with the :memory: option. This means the application creates a blank database purely in memory, not in a file, and sets up the database from scratch for every test.

I like this option for two reasons. First, it is fast! Right out of the gate this will give you a significant improvement over using a test MySQL database or even a file-based SQLite database. Second, it ensures that each test is independent of the others and no data is carried over: after each test the database is released from memory and has to be recreated.
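For reference, here is a minimal sketch of how this can be wired up. The connection name sqlite_testing is my own, for illustration only:

// config/database.php — a dedicated in-memory connection for tests
'connections' => [
    // ... your other connections ...
    'sqlite_testing' => [
        'driver'   => 'sqlite',
        'database' => ':memory:',
        'prefix'   => '',
    ],
],

Then in phpunit.xml you can add <env name="DB_CONNECTION" value="sqlite_testing"/> so that test runs never touch your real database.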

This option was great for us to get started, and our tests were super fast. However, as our code base grew, we noticed that while our normal tests were still plenty fast, our code coverage runs were getting slower and slower.

File-based SQLite with direct file manipulation

After doing some searching and finding some blogs about optimizing testing, I found suggestions for different options. One option was to create a build process that generates an empty SQLite file from the migrations and uses it to set up the database each time.

While I liked the idea, I saw one major problem with it: developers will forget to rerun the rebuild script after they add a migration, and tests will fail. So I wanted to come up with a more automated method.

Here is the method I came up with. I set a static variable on the parent TestCase class that all the tests extend. The static variable simply stores a boolean for whether the database setup has been run. If it hasn't, the setup runs once and flips the static to true so it doesn't run again during that test run.

The database setup completes a few tasks to get things ready for testing. First, it finds the SQLite file I have specified for testing in my config/database.php, mine is simply database/testing.sqlite, and makes sure it is an empty file. It then uses artisan to run the migrations. Finally, it reads the contents of the file and stores them in another static variable.

After the setup has run once, each subsequent test does something else instead of running migrations: it simply empties the SQLite file again and writes the contents stored in the static variable back to the file.
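Here is a minimal sketch of that setup in the parent TestCase; the property and method names ($dbSetup, $dbContents, setUpDatabase) are my own for illustration, not the exact class from the project:

// In the parent TestCase that every test extends
protected static $dbSetup = false;   // has the database been migrated yet?
protected static $dbContents;        // contents of the freshly migrated file

protected function setUpDatabase()
{
    $file = database_path('testing.sqlite');

    if (! self::$dbSetup) {
        // First test only: empty the file, migrate, and cache the result.
        file_put_contents($file, '');
        $this->artisan('migrate');
        self::$dbContents = file_get_contents($file);
        self::$dbSetup = true;
        return;
    }

    // Every later test: restore the migrated database with a single file write.
    file_put_contents($file, self::$dbContents);
}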

Pretty simple process, and while I didn't see significant gains during normal testing, I saw huge improvements during code coverage testing. I was happy and thought all was good in the world. That is, until one day I was talking with a developer on the team and he told me his tests were taking hours to run. I remotely connected to his computer and couldn't figure out why tests were taking so long only on his machine. After some digging I realized he didn't have an SSD, so all the file manipulation was actually way slower on his computer. I also realized that while I had an alias set up to turn on Xdebug only when code coverage was running, just having the extension loaded in your php.ini at all slows things down. That leads to the next two improvements.

Don’t have Xdebug in your php.ini at all

I had added Xdebug to my php.ini just as the documentation describes and made sure that by default the profiler was disabled. So when I wanted to run my code coverage tests I simply ran something like this:

phpunit -dxdebug.profiler_enable=On --coverage-html ./storage/logs/phpunit

This worked great, until I realized that even with the profiler disabled, just having the extension loaded in php.ini made tests run way slower. I didn't want to keep enabling and disabling it in the ini file, so instead I tried to load Xdebug with a -d option the same way. I found out you cannot load an extension that way with phpunit, as PHP has already been loaded by the time phpunit tries to set that value. So I removed it from my php.ini completely and created an alias to run this:

php -dzend_extension="php_xdebug-2.5.0-7.1-vc14-x86_64.dll" -dxdebug.profiler_enable=On "C:\pathto\phpunit" --coverage-html ./storage/logs/phpunit

You can enable an extension when you start PHP directly, so I have PHP load with the extension and then call phpunit. This worked great; now Xdebug was only loaded during code coverage. But I still had to address the file write speed issue for machines without an SSD.

See the Update section below for an update to this.

Check to see if code coverage is running

Since code coverage runs are slow already, it was still faster to use the file option, even without an SSD. I believe this is because, while code coverage is running, Xdebug is literally watching and recording every line of code that executes. So when you run a migration before every test, there is a ton of Laravel core code running just for artisan to work, for migrations to run, etc. So I realized: let's use :memory: while running tests normally and use files while running code coverage. But how could I automate that?

The first thing my testing class does is check whether Xdebug's profiler is enabled. This is a simple check, and I store the result in a static boolean on the class. It looks something like this:

self::$textDbEnabled = (ini_get('xdebug.profiler_enable') == 1);

Now before each test runs, it simply checks whether textDbEnabled is true: if it is, it uses the file method; if not, it uses :memory: and runs the migrations for every test. Problem solved!
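A sketch of what that branch might look like in the parent TestCase's setUp(); the connection name and helper name are assumptions carried over from the sketches above:

protected function setUp()
{
    parent::setUp();

    if (self::$textDbEnabled) {
        // Code coverage run: use the file-restore method described earlier.
        $this->setUpDatabase();
    } else {
        // Normal run: fresh in-memory database, migrated for this one test.
        config(['database.default' => 'sqlite_testing']);
        $this->artisan('migrate');
    }
}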

This worked great for a few months. However, as our number of tests grew from about 400 to 700, it got to the point where I felt I was wasting too much time waiting for tests to run, even during normal testing. Even if I picked specific files to run instead of the whole test suite, it seemed to be taking longer and longer. I decided to figure out why.

The conclusion I came to is that as our application has grown, so has our number of migration files. If artisan migrate runs before every test, that is a lot of work to repeat, so every migration file we add literally slows down each individual test.

Using a combination of files and :memory:

So how could I get some speed back? My thought was to somehow convert the migrations into a simple .sql file containing the structure of the database after all migrations have run. The first thing I checked was whether I could export or dump an SQLite database loaded into :memory:. If there is an option or way of doing this, I surely couldn't find it, so I kept digging.

I did find an option to export the schema of a file-based SQLite database. The one caveat is that you have to have the actual sqlite3 executable available. I found some instructions for installing it on my computer, as there is no Windows installer for it. Then you can simply run a command that looks like this:

sqlite3 testing.sqlite .schema > schema.sql

Now that I knew I could dump the schema, I was ready to try a new process. On the very first test, use the same code as above to create a file-based SQLite database (empty the file and run the migrations). Next, use exec() to create an SQL file containing the schema:

exec("cd database && sqlite3 testing.sqlite .schema > schema.sql");

Get the contents of that schema.sql file and store them in a static variable. Then on each test after that, switch to using :memory: and simply replay the SQL with raw database statements.
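Put together, the first-test bootstrap might look roughly like this. $dbStructure matches the snippet below; $schemaDumped is my own name for illustration:

if (! self::$schemaDumped) {
    // First test only: build a file database, migrate, and dump its schema.
    file_put_contents(database_path('testing.sqlite'), '');
    $this->artisan('migrate');
    exec('cd database && sqlite3 testing.sqlite .schema > schema.sql');
    self::$dbStructure = file_get_contents(database_path('schema.sql'));
    self::$schemaDumped = true;
}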

I haven’t perfected it yet, for some reason I can’t run the entire database creation in a single command but for now I do something like this:

// Replay the dumped schema, one statement per line (skipping blank lines)
$lines = explode("\n", self::$dbStructure);
foreach ($lines as $line) {
    if (trim($line) === '') {
        continue;
    }
    DB::statement($line);
}

Since the migrations no longer run on each test, testing got way faster, even without an SSD, because file manipulation only happens during the initial test setup. I have seen test times go down by over 50%, which means the tests are running at least 100% faster.

Conclusion and future plans

I’ve learned that building a large application with a large test suite can be tough in multiple ways. Not only is it tough to get people to write tests (including yourself), but it is hard to have a stable and fast testing process. As an application grows so does your testing process.

To make future improvements, I have also thought about consolidating a lot of our old migrations. I've never really read anything about consolidating migrations over time, so I'll have to research what the best practices are and see if it is a good idea. The one issue I foresee is that each database has a migrations table that stores which migrations have run; if you start changing or deleting migration files, I don't know how Laravel will handle the files being gone. If you have experience with this, please leave a comment and let me know how you have dealt with it in the past.

Update

FlyLo11 on reddit was nice enough to point out that in my alias to turn on Xdebug I was setting xdebug.profiler_enable=On, when there is a separate option, xdebug.coverage_enable=On, for code coverage. I have only tried this on a single test file so far, but it cut the run time to 33% of the original. So I would highly encourage you to update to use this option.
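With that change, the alias from earlier would presumably look like this:

php -dzend_extension="php_xdebug-2.5.0-7.1-vc14-x86_64.dll" -dxdebug.coverage_enable=On "C:\pathto\phpunit" --coverage-html ./storage/logs/phpunit

If your testing class detects coverage runs by reading xdebug.profiler_enable, that check would presumably need to read xdebug.coverage_enable instead.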


4 thoughts on “My Suggestions to Speed up Testing with Laravel and PHPUnit”

  • Dees on February 14, 2018

    Would you be so kind as to share your Tests\TestCase.php file?

    • pitchinnate on February 21, 2018

      Yes, I will need to strip some stuff out, but I will try to post it today.

  • Dees on March 20, 2018

    I tried to recreate this as a package. Let me know what you think: https://github.com/dees040/festing

    Feedback is welcome!
