
❓ Why use pg-mem instead of an instance of postgres in Docker?

As stated by SoInsightful:

  • Requires minimal installation. I don't need to download and maintain Docker, tamper with Dockerfile/docker-compose.yml files, avoid name or port conflicts, monitor logging output, or start/restart/stop containers. Even if all of that can be done relatively easily, there are still pain points.

  • It's easy to keep everything clean. There are no environment variables, ports, massive memory allocations, file reads/writes, or mile-long Docker logs. There's zero risk of ever overwriting any data by accident. You don't need to incessantly run migration scripts, and repopulating mock databases takes much less work. There is no process taking up disk space or memory in the background when you close down the project.

  • It is fast. While in-memory queries should be considerably faster, I don't actually think this is the main reason. Rather, you could get your mock database up and running from an uninstalled folder in literal seconds, and close it down in a millisecond. That seems very nice to me.

The obvious possible drawback:

The library will, of course, not be a perfect representation of a real Postgres database. But I could absolutely see it being good enough for all testing purposes.
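
To give an idea of how little setup that implies, here is a minimal sketch using pg-mem's newDb() and the db.public query helpers shown later in this FAQ (the table and data are made up for illustration):

import { newDb } from 'pg-mem';

// an isolated in-memory database, created in milliseconds
const db = newDb();

// create a table and query it: no server, no container, no file on disk
db.public.none(`create table users (id serial primary key, name text)`);
db.public.none(`insert into users (name) values ('Alice')`);
const rows = db.public.many(`select * from users`);
console.log(rows); // => [ { id: 1, name: 'Alice' } ]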

❓ What if I need an extension, like uuid-ossp?

pg-mem does not implement any existing postgres extension (at the time of writing): bundling them all would make the library too heavy, given that few people use them.

That said, it's pretty easy to implement them yourself. For instance, if you use the uuid_generate_v4() function of uuid-ossp, you could register an implementation like this:

  import { v4 } from 'uuid';
  import { DataType } from 'pg-mem';

  db.registerExtension('uuid-ossp', (schema) => {
    schema.registerFunction({
      name: 'uuid_generate_v4',
      returns: DataType.uuid,
      implementation: v4,
    });
  });

In which case, the following SQL statement will work as expected:

create extension "uuid-ossp";
select uuid_generate_v4();
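
For reference, the whole flow could look like the sketch below (the 'as id' alias is only there to make the result easy to read):

import { newDb, DataType } from 'pg-mem';
import { v4 } from 'uuid';

const db = newDb();

// register a mock 'uuid-ossp' extension before the schema requires it
db.registerExtension('uuid-ossp', (schema) => {
  schema.registerFunction({
    name: 'uuid_generate_v4',
    returns: DataType.uuid,
    implementation: v4,
  });
});

// 'create extension' now resolves to the mock registered above
db.public.none(`create extension "uuid-ossp"`);
const [row] = db.public.many(`select uuid_generate_v4() as id`);
console.log(row.id); // => a random v4 uuid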

❓ How do I import my production schema into pg-mem?

You could just dump the schema like this:

pg_dump --schema-only --no-owner --no-acl --disable-triggers --no-comments --no-publications --no-security-labels --no-subscriptions --no-tablespaces --host HOST --username USER --password DBNAME > dump.sql

Replace HOST, USER and DBNAME with your own values. This will generate a dump.sql file that should be executable in pg-mem.

⚠ If the dumped SQL is not supported by pg-mem, file an issue!

Then, loading it is pretty straightforward:

import fs from 'fs';
import { newDb } from 'pg-mem';

const db = newDb();

// create the schema from the dump
db.public.none(fs.readFileSync('dump.sql', 'utf8'));

// then, create a backup (insert any data that is common to all tests before this point if required)
const backup = db.backup();

Then use it like this in your tests:

// restore 'db' to its original state
backup.restore();
// => 'db' is ready to use
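
In a test suite, this typically goes into a beforeEach hook. A sketch assuming Jest (or any framework with a similar hook) and a hypothetical users table coming from the dumped schema:

// 'db' and 'backup' are the ones created above
beforeEach(() => {
  // every test starts from the freshly imported schema
  backup.restore();
});

it('creates a user', () => {
  db.public.none(`insert into users (name) values ('Alice')`);
  const users = db.public.many(`select * from users`);
  expect(users).toHaveLength(1);
});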

⚠ If your DB uses extensions, you'll have to mock them (see the corresponding question in this FAQ).
