Hi-tech Highway

Josh Litvin




The Difference Between Unit Testing and Integration Testing

This article discusses how to prevent your unit tests from turning into integration tests.

This article was written by Gil Zilberfeld, Product Manager, Typemock.

First things first:

What is a unit test?

A unit test is:

  • Repeatable: You can rerun the same test as many times as you want.
  • Consistent: Every time you run it, you get the same result. (Using threads, for example, can produce inconsistent results.)
  • In memory: It has no "hard" dependencies on anything not in memory, such as the file system, databases, or the network.
  • Fast: It should take less than half a second to run.
  • Checking a single concern or "use case" in the system: Checking more than one makes it harder to understand what or where the problem is when a test fails.
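To make the five criteria concrete, here is a minimal sketch of a test that meets all of them. The article's examples are .NET-oriented, but the idea is language-neutral, so Python is used here for illustration; `apply_discount` is a hypothetical function, not code from the article.

```python
# Hypothetical piece of system logic under test.
def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100.0, 2)

def test_ten_percent_discount():
    # Repeatable, consistent, purely in-memory, fast, and it checks
    # exactly one concern: the discount arithmetic.
    assert apply_discount(200.0, 10) == 180.0

test_ten_percent_discount()
```

Nothing in the test touches a disk, a clock, or the network, so it gives the same answer on every run, on every machine.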

By breaking any of these guidelines, you increase the chance that developers will stop trusting the test results (because of repeated false failures), or will stop running the tests at all because they are too slow.

What is an integration test?
An integration test is any test that can't be described as a unit test. The different kinds of integration tests (performance tests, system tests, acceptance tests etc.) are not the subject of this piece. Our main concern here is just differentiating unit tests from everything else.

An integration test might:

  • Use system-dependent values that change dynamically (such as DateTime.Now or Environment.MachineName)
  • Create objects over which it has little control (such as threads or random number generators)
  • Reach out to external systems or local machine dependencies (from calling web services to using local configuration files)
  • Test multiple things in the course of one test case (database integrity, configuration, protocols, and system logic, all in one go)

Taking those bullets into account, we can say that integration tests will most likely:

  • Run much more slowly than unit tests
  • Be much harder or more time-consuming to set up and run in full
  • Be harder to collect clear results from
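The first bullet in the list above, system-dependent values such as DateTime.Now, is worth a small sketch. Shown here in Python for illustration (the article's example is .NET), `greeting` is a hypothetical function whose output depends on the wall clock; accepting the time as a parameter keeps it unit-testable, whereas reading the clock inside the function would make every test of it time-dependent, i.e., an integration-style test.

```python
import datetime

# Hypothetical function whose behavior depends on the current time.
def greeting(now=None):
    # Taking the time as a parameter keeps the logic unit-testable;
    # calling datetime.datetime.now() unconditionally inside would make
    # any test of this function depend on when it happens to run.
    now = now or datetime.datetime.now()
    return "Good morning" if now.hour < 12 else "Good afternoon"

# A unit test can pin the dependency to a fixed, repeatable value:
assert greeting(datetime.datetime(2024, 1, 1, 9, 0)) == "Good morning"
assert greeting(datetime.datetime(2024, 1, 1, 15, 0)) == "Good afternoon"
```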

When do integration tests make sense?
Integration tests can make a lot of sense in most systems in two particular situations:

  • Written as an additional layer of regression testing beyond unit tests
  • In legacy code situations, written before unit tests or refactoring can be done

Additional regression test layer
Integration tests can serve as an extra "smoke test" to run when the full system needs to be tested, proving, for example, that deployment of all the system components works correctly. In those kinds of tests, you test many small things in one large test case.
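The "many small things in one large test case" pattern can be sketched as follows. This is a toy in Python (the names and components are hypothetical, not from the article): one smoke-style case that touches the file system, checks configuration parsing, and then exercises a bit of logic, where a real smoke test would hit the deployed system's actual config, database, and services.

```python
import json
import os
import tempfile

def load_config(path):
    """Hypothetical config loader for the sketch."""
    with open(path) as f:
        return json.load(f)

def smoke_test():
    # 1. A config file can be written and read back (file-system check).
    path = os.path.join(tempfile.mkdtemp(), "app.json")
    with open(path, "w") as f:
        json.dump({"retries": 3}, f)
    config = load_config(path)
    # 2. The parsed configuration has the expected shape.
    assert config["retries"] == 3
    # 3. A piece of system logic behaves correctly with that configuration.
    assert list(range(config["retries"])) == [0, 1, 2]
    return "ok"

smoke_test()
```

Note how a failure anywhere in the chain fails the whole case, which is exactly why such tests are a regression layer on top of unit tests rather than a replacement for them.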

Integration tests on legacy code
Michael Feathers, in his book "Working Effectively with Legacy Code," defined legacy code as "code that does not have tests." It's usually hard or impossible to write unit tests for such code, as it is mostly untestable. Tools like Typemock Isolator help relieve that problem by making it possible to fake anything, and integration tests can serve as an additional smoke test, one level above the unit tests, to make sure you didn't break the integration between system components.
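The "fake anything" idea can be illustrated with a small sketch. Typemock Isolator targets .NET; here Python's standard `unittest.mock` plays an analogous role, and `fetch_status` is a hypothetical legacy function, not code from the article.

```python
from unittest import mock
import urllib.request

# Hypothetical legacy function with a hard dependency on the network;
# calling it for real would make any test of it an integration test.
def fetch_status(url):
    with urllib.request.urlopen(url) as resp:
        return resp.status

# Faking the dependency (unittest.mock here plays the role that a tool
# like Typemock Isolator plays for .NET legacy code):
def test_fetch_status_without_network():
    with mock.patch("urllib.request.urlopen") as fake_open:
        fake_open.return_value.__enter__.return_value.status = 200
        return fetch_status("http://example.com")  # no real HTTP call

assert test_fetch_status_without_network() == 200
```

Once dependencies like this can be faked, the legacy code can be put under a unit-test safety net before any refactoring begins.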

Summary
By being able to distinguish unit tests from non-unit tests, we can make sure they are separated in our projects, giving our developers a "safe green zone" (as described in Roy Osherove's book "The Art of Unit Testing") that contains only unit tests, which developers can run at any time and whose results they can always trust.

•   •   •

With over 15 years of experience in software development, Gil has worked on a wide range of aspects of the field, from coding to team management and process implementation. Gil presents, blogs, and talks about unit testing, and encourages developers, from beginners to the experienced, to adopt unit testing as a core practice in their projects. Gil writes a personal blog (http://www.gilzilberfeld.com) and contributes to the Typemock blog: http://blog.typemock.com/
