Compactr

Schema-based serialization made easy!

What is this and why does it matter? [From the horse's mouth]

Protocol Buffers are awesome. Having schemas to deflate and inflate data while maintaining some kind of validation is a great concept. Compactr's goal is to build on that idea to better suit Node development and reduce repetition by letting you define schemas for your data directly in JavaScript. For example, if you already have a DB schema for a model, you could reuse it directly as a Compactr schema.
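To make the core idea concrete, here is a toy illustration of schema-driven serialization in plain Node. This is not Compactr's actual wire format or API, just a sketch of the concept: one schema object drives both validation and the encode/decode round trip.

```javascript
// A plain object doubles as the schema (hypothetical, illustration only)
const schema = {
  id: { type: 'number' },
  name: { type: 'string' }
};

// Validate against the schema, then serialize to a Buffer
function encode(schema, data) {
  for (const key of Object.keys(schema)) {
    if (typeof data[key] !== schema[key].type) {
      throw new TypeError(`${key} must be a ${schema[key].type}`);
    }
  }
  return Buffer.from(JSON.stringify(data));
}

// Deserialize a Buffer back into an object
function decode(buffer) {
  return JSON.parse(buffer.toString());
}

const decoded = decode(encode(schema, { id: 1, name: 'Ada' }));
console.log(decoded.name); // prints "Ada"
```

A real schema serializer like Compactr emits a compact binary layout rather than JSON, but the shape of the workflow (define schema, write, read) is the same.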

Get it: npm install --save compactr

Sample usage:

const Compactr = require('compactr');
const Schema = Compactr.schema({
    bool: { type: 'boolean' },
    num: { type: 'number' },
    str: { type: 'string' },
    arr: { type: 'array', items: { type: 'string' } },
    obj: { type: 'object', schema: { sub: { type: 'string' } } }
});

// Encode a payload, then decode it back
Schema.read(Schema.write({
    bool: true,
    num: 23.23,
    str: 'hello world',
    arr: ['a', 'b', 'c'],
    obj: {
        sub: 'way'
    }
}).array());
const Compactr = require('compactr');

// Defining a schema
const userSchema = Compactr.schema({
    id: { type: 'number' },
    name: { type: 'string' }
});

// Encoding
userSchema.write({ id: 123, name: 'John' });

// Get the schema header bytes (for partial loads)
const header = userSchema.headerBytes();

// Get the partial load bytes
const partial = userSchema.contentBytes();

// Get the full header + content bytes
const buffer = userSchema.bytes();

// Decoding (full)
const content = userSchema.decode(buffer);

// Decoding (partial)
const partialContent = userSchema.decode(header, partial);
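The header/content split above is what enables partial loads: when many payloads share one schema, the header can be sent or cached once, and each subsequent message ships only the content bytes. The following toy sketch shows the idea; it is not Compactr's real binary layout, and `writeParts`/`readParts` are hypothetical names for illustration.

```javascript
// Header describes which keys are present; content holds only the values
function writeParts(keys, data) {
  const header = Buffer.from(JSON.stringify(keys));
  const content = Buffer.from(JSON.stringify(keys.map((k) => data[k])));
  return { header, content };
}

// Rebuild the object by zipping keys (from the header) with values (from the content)
function readParts(header, content) {
  const keys = JSON.parse(header.toString());
  const values = JSON.parse(content.toString());
  return Object.fromEntries(keys.map((k, i) => [k, values[i]]));
}

const keys = ['id', 'name'];
const { header, content } = writeParts(keys, { id: 123, name: 'John' });

// The header can be cached and transmitted once; later messages send only content
const restored = readParts(header, content);
console.log(restored); // { id: 123, name: 'John' }
```

In a binary format the savings are larger still, since the content can omit key names entirely and pack values according to their declared types.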

