Strategies toward fast(er) tests

Fast as Usain Bolt (Photo credit: Wikipedia)

I talked some months ago (time flies when you’re having fun) about our dabbling in modularization for our Sybil Rails application.

We have made progress, and gotten somewhat faster, while still looking for better ways. We have already uncovered some of them along the way, and we share them below with their use cases.

Why we want fast tests (and you should too)

 

First of all, once you’ve tasted subsecond tests, it is very hard to go back. Trust me on this one, especially in languages such as Ruby where, without a lot of static checks, your tests are the first and foremost tool you use to validate whether what you’re working on actually works.

The key point is the length of the test-code cycle, as you’ll want to repeat it as quickly as possible. This allows me to validate what I’m doing, step by step. It is not a reason to forgo the initial design, just a good way to get feedback that you are progressing nicely along the road.

If your test runs in 5 minutes, maybe you’ll run it once an hour. If it runs in one minute, you’ll probably run it several times an hour. If it’s sub-second (not for the whole suite, but at least for the test you are working on), you’ll probably run it with every save (maybe I should run my test in my save action. Does anyone do that, using Guard for example?).
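As an aside, Guard can get pretty close to that “run on every save” workflow. A minimal sketch, assuming the guard and guard-minitest gems and a conventional test/ layout (the watch patterns are purely illustrative):

# Guardfile - a minimal sketch using guard-minitest (paths are illustrative)
guard :minitest do
  # Re-run a test file when it changes
  watch(%r{^test/(.*)_test\.rb$})
  # Re-run the matching test when a lib file changes
  watch(%r{^lib/(.*)\.rb$}) { |m| "test/#{m[1]}_test.rb" }
end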

MiniTest is just fine

In other words: MiniTest is just fine (whether you prefer the “unit” style or the “spec” style) in quite a large number of cases, so if your application is not purely composed of Rails objects, use MiniTest every place you can. Rails is, for us, a quite complete framework for designing web interfaces on top of a database. A (good) part of our application fits this description, a (good) other part does not (mainly code analysis).

The basic rule is: do not depend on Rails when it does not make sense, i.e. when the code:

  • does not interact with any kind of web request

  • is not linked to database access

The best way to make this work may be (as we did) to extract the non-web application code into external gems (refer to my previous post for some pointers on that). Maybe it is my Java past, but I’ve always preferred to separate my business logic from my web application (having the latter depend on the former, of course). I see this as a “good design practice” (separation of concerns) more than something language related.
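To give an idea, the skeleton of such a gem is little more than a gemspec and a lib/ directory. The sketch below is hypothetical (the summary and file list are made up, this is not our actual gemspec), but the point is that the business logic gem declares no dependency on Rails:

# sybil_analyzer.gemspec - a hypothetical sketch, not our actual gemspec
Gem::Specification.new do |s|
  s.name    = "sybil_analyzer"
  s.version = "0.0.1"
  s.summary = "Code analysis logic, extracted from the Rails application"
  s.files   = Dir["lib/**/*.rb"]
  # Only test tooling here: no rails, no activerecord
  s.add_development_dependency "minitest"
end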

rails (Photo credit: icopythat)

Everything in this gem (and/or in lib files under the Rails app, even if I’m no fan of that way of working) can and should be tested with MiniTest. Once you get into the habit of having subsecond tests, it is difficult to go back to any other situation.
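Such a test is then plain Ruby: require minitest, require the gem, assert. The class and method below (SybilAnalyzer::Analyzer and #loc) are made-up names, only the shape of the test matters:

# test/analyzer_test.rb - a minimal sketch with hypothetical names
require "minitest/autorun"
require "sybil_analyzer"

class AnalyzerTest < MiniTest::Unit::TestCase  # Minitest::Test on MiniTest 5+
  def test_counts_lines_of_code
    analyzer = SybilAnalyzer::Analyzer.new("def foo\nend\n")
    assert_equal 2, analyzer.loc  # no Rails, no database: runs in milliseconds
  end
end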

A quick mention of Corey Haines’ “Fast Rails Tests” here (basic version: to make his tests faster, Corey extracted most of the business logic to external modules – read what he writes / watch his videos, because I’m not doing him justice here). While we have an objective of having fast tests, it is not the only goal we have, and (for example) we do not want to tamper too much with our design to make it testable (even if, in most situations, writing really “unit” tests pushes you toward the Single Responsibility Principle – a good thing). That means that the methods we have on the model are staying on the model. So, on to the next part!

Business logic does not require a database

I had some tests to write on parts of the model (as I’m quite clearly in the “rich models, lean controllers” camp), typically revolving around business logic (I’m here to test my code, not ActiveRecord, which I trust to do its job and get my objects into the database).

After some reflection about extracting my business logic (to a delegate object and/or a module, for example), I decided not to, for the reason stated above: those methods belong to my model, so I won’t put them anywhere else. Still wanting faster tests, we found a rather nice strategy in Avdi Grimm’s Objects on Rails, which boils down to: use/require only the parts of Rails that you really need. For instance, do not inherit from ActiveRecord if you do not need to.

Among the things you may not need, ActiveRecord’s actual database access is a notable one. If you are testing some business method on your object, you do not need to pull it out of or push it into the database, so why pay the price? NullDB comes in handy in this situation, effectively “nullifying” most of the ActiveRecord database adapter behavior, while keeping what is necessary (like generating the fields on your object based on the schema). For the rest, it is mostly about going back to the typical Ruby situation of requiring the classes you need (as you no longer depend on Rails’ “automagical” inclusion of everything needed).
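Here is a minimal sketch of what this can look like with the activerecord-nulldb-adapter gem; the model, its columns and the business method are illustrative, and the schema path assumes a standard Rails layout:

# test/models/project_test.rb - a sketch assuming the activerecord-nulldb-adapter gem
require "minitest/autorun"
require "active_record"

# The nulldb adapter fakes the database, but still builds the attributes from the schema
ActiveRecord::Base.establish_connection(adapter: :nulldb, schema: "db/schema.rb")

require_relative "../../app/models/project"  # hypothetical model, still a plain ActiveRecord class

class ProjectTest < MiniTest::Unit::TestCase
  def test_full_name_combines_owner_and_name
    project = Project.new(owner: "8thcolor", name: "sybil")
    assert_equal "8thcolor/sybil", project.full_name  # pure business logic, no database hit
  end
end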

A nice side effect of this work was to become more aware of all the parts of Rails we are using, and why.

Using Rails tests to test Rails controllers is fine

I’ve mostly talked about the cost of Rails tests until now, but let’s not forget the benefits: for controller or even model testing, Rails does offer a lot of simplicity, allowing you to write tests easily, even for view testing (UI testing is a problem in every language and framework I know). The objective should not be to “never pay the price” but rather to “pay the price when you reap the benefit”.

From my point of view, controller tests are nearly integration tests anyway (I won’t touch the integration versus acceptance debate with a ten-foot pole, but it is a good read): most of the backend, heavyweight logic should have been tested long before, so it is really the interaction that is tested here, a situation where I’m okay with (or let’s say, less burdened by) some waiting time.
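A minimal sketch of such a controller test, using the standard Rails helpers (the ProjectsController and the assigned variable are illustrative):

# test/functional/projects_controller_test.rb - a sketch with an illustrative controller
require "test_helper"

class ProjectsControllerTest < ActionController::TestCase
  test "index responds successfully" do
    get :index                         # goes through routing and the full controller stack
    assert_response :success
    assert_not_nil assigns(:projects)  # hypothetical instance variable set by the action
  end
end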

Our real solution will probably be to soon have a Continuous Integration server able to run the full suite after each push to our GitHub repository.

Specific use cases

We have some tests that connect to external services and APIs over HTTP. For these, using WebMock to stub HTTP requests is really nice (most of our tests revolve around verifying our functionality, not the availability of the service), as is VCR (which, as its name suggests, can record and replay any interaction; one of the nice aspects is the possibility to connect to the external service “for real” and record the interaction to be replayed afterward, or to let VCR cassettes expire at some point, to ensure some contact with the external world).
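A minimal sketch of both approaches (the URLs, cassette name and assertions are illustrative): WebMock stubs a request outright, while VCR records the first real interaction into a cassette and replays it afterward:

# test/external_service_test.rb - a sketch using WebMock and VCR together
require "minitest/autorun"
require "webmock/minitest"
require "vcr"
require "net/http"
require "uri"

VCR.configure do |c|
  c.cassette_library_dir = "test/fixtures/vcr_cassettes"
  c.hook_into :webmock
end

class ExternalServiceTest < MiniTest::Unit::TestCase
  def test_stubbed_request
    # Pure WebMock: no network involved at all
    stub_request(:get, "https://api.example.com/status").to_return(body: '{"ok":true}')
    response = Net::HTTP.get(URI("https://api.example.com/status"))
    assert_match(/ok/, response)
  end

  def test_recorded_request
    # VCR: hits the real API on the first run, replays the recorded cassette afterward
    VCR.use_cassette("github_repository") do
      response = Net::HTTP.get(URI("https://api.github.com/repos/rails/rails"))
      assert_match(/rails/, response)
    end
  end
end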

We had some slowness due to file IO (like reading source code). The solution in this case is mostly to reduce the sample used to what is really necessary (a good idea, and not only for performance reasons), and/or to use StringIO to stay fully “in memory” (or to mount a RAM drive for the duration of the test, which achieves the same effect).
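StringIO ships with Ruby and behaves like a File, so any code that reads source from an IO object can be fed an in-memory sample directly; a short sketch with an illustrative sample:

require "stringio"

# An in-memory "file" containing just the sample we need, instead of reading from disk
source = StringIO.new("def hello\n  puts 'hello'\nend\n")

source.each_line do |line|
  puts line  # same IO interface as a real File, but no disk access
end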

Use the power of Zeus

 

Zeus power (Photo credit: Wikipedia)

Now, there is still a way to make those controller tests much faster: using the Zeus server. Zeus does some preloading magic on your Rails application, allowing subsequent Rails commands to run much faster (back to sub-second in some situations). While nice when using the Rails server or console, it is really nice for testing (starting a test requires loading the Rails environment, and you should start your tests much more often than your application). Our experience with Zeus has been good, but it really shines when you are updating the test code a lot, with minimal impact on the actual server code (which requires some reloading in Zeus). As we are working to get back to a comfortable level of testing, using Zeus did quicken the process.

Integration tests are valid

Not everything can be tested in a subsecond timeframe. If you have real integration tests that really simulate end-to-end user interaction with your application, you’ll need database operations, controller requests and responses, and so on. This is fine: integration tests are not there to provide you with instant gratification on what you are currently working on, but to validate on a regular basis that your main flows are still working as expected. They can be run later, or even better, by a Continuous Integration tool.

Are we fast yet?

Not yet. But the situation is improving nicely, making us much more confident about working on Sybil.

sybil analyzer
Finished tests in 22.465900s, 4.2731 tests/s, 9.8371 assertions/s.
96 tests, 221 assertions, 0 failures, 0 errors, 0 skips
Coverage report generated for Unit Tests to /home/mestachs/projects/8thcolor-gems/sybil_analyzer/coverage. 668 / 843 LOC (79.24%) covered.

sybil integration_spec
pass: 48,  fail: 0,  error: 0
total: 48 tests with 92 assertions in 21.665818723 seconds
Coverage report generated for RSpec to /home/mestachs/projects/sybil_master/coverage. 1326 / 1888 LOC (70.23%) covered.

sybil model spec
Finished tests in 0.524240s, 41.9655 tests/s, 89.6537 assertions/s.
22 tests, 47 assertions, 0 failures, 0 errors, 0 skips
Coverage report generated for RSpec to /home/mestachs/projects/sybil_master/coverage. 350 / 382 LOC (91.62%) covered.

That was a quick overview of various strategies we use. We’ll most probably come back to some of them in more detail in a future post.

Thanks to Avdi Grimm’s Objects on Rails for showing us some ways, Corey Haines for talking about fast tests, Bundler for making the extraction of gems possible, and Burke Libbey’s Zeus for being there when we could not find anything better.
