Using PostgreSQL.

I'm trying to set up a proper test suite for an API. So far it works, but the tests run directly against the main database, so afterwards I have to remove everything the tests created or revert everything they edited.

I know it's a bad thing to do (because I can forget to revert a change, or mess up the sequences). So I would like to create a test database with the same structure and base data, and then delete it afterwards. Is this the right approach in this case?

And if I want to do it this way, how should I do it? Is there a way in Node.js to execute an SQL script? I tried with a shell script, but so far it has been a complete mess with permissions, so I figured it would be easier to do it from Node.js directly.

I'm using Mocha for my tests.

3 Comments
  • Are these tests for API endpoints or for your models? Commented Dec 19, 2013 at 20:02
  • For API endpoints. I test every route, try every edge case I can think of, and try both good and bad requests. These requests often create rows in the database, which are then used by the next tests. I'm not sure I'm doing this right, though. Commented Dec 19, 2013 at 20:11
  • If you want to keep your tests tight, I would suggest mocking your data rather than setting up another point of failure (DB connections, schema issues, etc.) which really isn't in the scope of testing API endpoints. When you connect to a REAL database, your tests turn from unit tests into integration tests. Both have their place, but you shouldn't mix them. Commented Dec 19, 2013 at 20:13

1 Answer


I would suggest a separate test database. It can be small, and you will want to know exactly what data is in it (so you can test against it!). A base dataset that covers all your business rules can be exported as a SQL file (or via whatever export method you prefer).
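
Since the question asks whether Node.js can execute an SQL script: it can. One possible sketch (not the only way) uses the node-postgres (pg) module to read the dump with fs and send it as a single simple query; a non-parameterized query may contain several statements, so the whole script goes in one call. The file name testdata.sql and the credentials below are placeholders only:

var fs = require( 'fs' );
var pg = require( 'pg' );

function loadTestData( done ) {
  // read the exported base dataset as one big string
  var script = fs.readFileSync( 'testdata.sql', 'utf8' );
  var client = new pg.Client( 'postgres://testuser:secret@127.0.0.1/testdb' );

  client.connect( function ( err ) {
    if ( err ) { return done( err ); }
    // a simple (non-parameterized) query can carry the whole script
    client.query( script, function ( err ) {
      client.end();
      done( err );
    } );
  } );
}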

Typically your application will have a single connection to the database, and your test framework will have some way to run a method before the tests start. That is where you point the connection at the test DB. Your database access objects (DAOs), scripts, or methods then pick up that connection in some way: as a method parameter, through a require statement, etc.
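
With Mocha (which the question mentions), root-level before and after hooks are the natural place for this wiring. A minimal sketch, where setupTestDb and restoreTestDb are hypothetical helpers standing in for whatever your connection and reset code actually looks like:

// test/setup.js -- hypothetical helper module and names, shown only to
// illustrate where the test-DB wiring plugs into Mocha
var db = require( './helpers/db' );

before( function ( done ) {
  // point the app's connection at the test DB and load the base dataset
  db.setupTestDb( done );
} );

after( function ( done ) {
  // restore the test DB to its default state once every suite has run
  db.restoreTestDb( done );
} );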

As an example, I'm using the knex module to connect to the DB and build queries. I initialize and reference my single DB connection as specified in their docs.

// app startup: initialize the one shared connection (production settings)
var Knex = require( 'knex' );
Knex.knex = Knex.initialize( {
  client : 'mysql',
  connection : {
    host : 'my.domain.com',
    user : 'dbrole',
    password : 'password',
    database : 'productiondb',
    charset : 'utf8'
  }
} );

My DAOs get the connection like this:

var knex = require('knex').knex;
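
For instance, a small DAO module might look something like the sketch below; the users table, its email column, and the two-argument where (implicit equality) are illustrative only, so adjust to your schema and knex version:

// dao/users.js -- a hypothetical DAO; table and column names are made up
var knex = require( 'knex' ).knex;   // picks up whichever DB was initialized

exports.findByEmail = function ( email ) {
  return knex( 'users' ).where( 'email', email ).select();
};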

Now in my unit tests, before the test suite runs, I can point that same connection at the test DB:

// test setup: the same initialization, but aimed at the test DB
var Knex = require( 'knex' );
Knex.knex = Knex.initialize( {
  client : 'mysql',
  connection : {
    host : '127.0.0.1',
    user : 'root',
    password : 'root',
    database : 'testdb',
    charset : 'utf8'
  }
} );

And there you have it! The exact same code runs in test and in production, and your production DB is decoupled from your tests. This pattern works with a lot of frameworks, so you'll have to adapt it (and clean up after your tests if they junk up the test DB, perhaps by restoring it to its default state once all tests are complete).

Edit: By the way, knex works with PostgreSQL and is a fun way to build queries in pure Node.js. It can also execute raw SQL.
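
For example, the "restore to default" step mentioned above could be a raw TRUNCATE between suites, which also resets the sequences the question worries about. A rough sketch, assuming the knex instance from the snippets above and a knex version that exposes raw queries as knex.raw returning a promise (older releases name this call slightly differently), with placeholder table names:

var knex = require( 'knex' ).knex;

function resetTestDb() {
  // empty the example tables and reset their sequences in one statement
  return knex.raw( 'TRUNCATE TABLE users, orders RESTART IDENTITY CASCADE' );
}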

1 Comment

For Node.js, I recommend taking a look at BookshelfJS (bookshelfjs.org), which is a data-mapper layer on top of knex.
