

One is a browser-side application cache. It guarantees entity uniqueness across the entire cache.

Each entity tracked for uniqueness must have a unique id. There is precisely ONE distinct entity in the cache for each unique id. Entities that do not have a unique id are still cached but not tracked for uniqueness.


```
npm install one --save
```

```javascript
import * as One from 'one';

// get a hold of an instance
let one = One.getCache();

// or with debugging options turned on
one = One.getCache(true);

// One.getCache() is a singleton, so calling it again anywhere
// returns the same instance
```

Or simply put one.min.js on your page to access the One global variable from anywhere. In this case the instance is created for you and you can access it directly (no need to call getCache()).



A word about JavaScript version compatibility

ES6 introduced Maps, which are supposed to be up to 3 times more efficient than using Object for storing and retrieving keyed items. One uses Maps by default. However, for older browsers that do not support the standard you will need to compile the library with babel-polyfill. The caveat is that it bumps up the size of the minified lib from ~16kb to ~100kb.
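The Map-vs-Object trade-off can be sketched in plain JavaScript (this illustrates the general technique, not One's internals):

```javascript
// A keyed entity store backed by a Map: keys can be any type,
// and size tracking and iteration come for free, unlike with a plain Object.
const entities = new Map();

entities.set(1, { uid: 1, name: "first" });
entities.set(2, { uid: 2, name: "second" });

// Reads are a single keyed lookup
const first = entities.get(1);

// Map tracks its own size; with an Object you would need
// Object.keys(obj).length, which allocates an array on each call.
console.log(entities.size); // 2
console.log(first.name); // "first"
```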

To compile with polyfill:

  1. Uncomment this line in ```index.js```:

```
//import 'babel-polyfill';
```

  2. Run this command in a terminal from the One folder:

```
npm run prepub
```

That's it. The compiled versions are added to the dist folder.

There are three significant operation types to be aware of:
* **put / get / evict** - items go into the cache with a ```put``` operation and come out with a ```get``` call. Use ```evict``` to force items out of the cache.
* **queue** - fast input that bypasses uniqueness tracking
* **time travel** - ```undo()``` and ```redo()``` to go back and forth in time
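The time-travel idea can be illustrated with a minimal history stack in plain JavaScript (a conceptual sketch, not One's actual implementation):

```javascript
// Minimal undo/redo over immutable snapshots: each put pushes a new
// frozen state; undo/redo just move an index along the history array.
class History {
  constructor() {
    this.states = [];
    this.index = -1;
  }
  put(state) {
    // discard any redo states beyond the current index
    this.states = this.states.slice(0, this.index + 1);
    this.states.push(Object.freeze(state));
    this.index++;
  }
  undo() {
    if (this.index > 0) this.index--;
    return this.states[this.index];
  }
  redo() {
    if (this.index < this.states.length - 1) this.index++;
    return this.states[this.index];
  }
}

const history = new History();
history.put({ uid: 1, text: "v1" });
history.put({ uid: 1, text: "v2" });

console.log(history.undo().text); // "v1"
console.log(history.redo().text); // "v2"
```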

Some code:

```javascript
let item1 = {uid: 1}
let item2 = {uid: 2, ref: item1}

// puts all items that have a uid separately in the cache
One.put(item2)

One.get(item1) === undefined // false (item1 was added via item2)
item1 === One.get(item1) // true (same object)
item2.ref === One.get(1) // true
```


All data is immutable. Once an item enters the cache it is frozen and cannot change. This enables quick identity checks against immutable entities (e.g. React's identity check).

```javascript
let item = {uid: 1}
Object.isFrozen(item) // false

One.put(item)
Object.isFrozen(item) // true

let result = One.get(item)
result === item // true
```
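The payoff of frozen entities is that change detection collapses to a reference comparison. A plain-JavaScript sketch of the pattern (independent of One's API):

```javascript
// With immutable entities, "did this change?" is a reference check,
// not a deep comparison -- the same trick React uses for fast re-renders.
const v1 = Object.freeze({ uid: 1, text: "hello" });

// An edit always produces a new object rather than mutating in place
const v2 = Object.freeze({ ...v1, text: "edited" });

function hasChanged(prev, next) {
  return prev !== next; // O(1), no deep walk needed
}

console.log(hasChanged(v1, v1)); // false
console.log(hasChanged(v1, v2)); // true
```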

If you later want to edit the object you can get an editable copy from the cache. This gives you a separate clone of the object that is editable:

```javascript
let item = {uid: 1}
One.put(item)

let editable = One.getEdit(1) // or One.getEdit(item)
Object.isFrozen(editable) // false
item === editable // false

editable.text = "test"
One.put(editable)

let edited = One.get(1)
edited.text === "test" // true
Object.isFrozen(edited) // true
```

Editing an item changes all its instances in the cache:

```javascript
let item = {uid: 1}
let item2 = {uid: 2, child: item}
One.put(item2) // caches item2 and its child item

One.get(1) === item // true
One.get(2) === item2 // true

// Let's do some editing
let editable = One.getEdit(1)
editable.text = "test"
One.put(editable) // also updates item2's reference to item

let result = One.get(2)
console.log(JSON.stringify(result.child)) // {"uid":1,"text":"test"}
```


More and more applications are giving users the ability to edit data in the browser. With a normalized data model, various instances of an entity can exist at the same time in different locations, depending on how data is received from the server and added to the local model / store.

This is inconvenient because:

  • Keeping all the instances in sync can be a daunting task.
  • It can make debugging hard.
  • It requires tracking each instance and makes reasoning about data complicated.
  • It can make the application structure needlessly complex.

Redux brings a great breakthrough by putting the entire application state in one place and mutating it only via dispatched actions. But it doesn't enforce entity uniqueness. One aims to take the concept a step further by making each entity unique and immutable in a single store (cache).
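The sync problem is easy to reproduce with a hand-rolled store. A hypothetical example (entity names invented for illustration):

```javascript
// Two screens each hold their own copy of the same user entity.
const userInList = { uid: 42, name: "Ada" };
const userInDetail = { uid: 42, name: "Ada" };

// An edit on the detail screen...
userInDetail.name = "Ada Lovelace";

// ...leaves the list copy stale: same uid, different data.
console.log(userInList.name); // "Ada"
console.log(userInList.name === userInDetail.name); // false

// With a single canonical entity per uid (One's approach), every
// location references the same frozen object and cannot diverge.
const cache = new Map();
cache.set(42, Object.freeze({ uid: 42, name: "Ada Lovelace" }));
console.log(cache.get(42) === cache.get(42)); // true
```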

Performance considerations

Yes, there is a performance cost in analyzing each entity deeply to track its dependencies. One offers a couple of ways to mitigate this: read optimization and queuing.

  • Read optimized: the penalty is incurred on write operations only, and writes happen far less frequently than reads. Read ops are super fast (a simple key lookup).
  • Queuing lets the developer choose when to perform the write operation. One defers the write analysis when writing to the queue, and the queue can commit between render operations. This way the UI remains fluid.

If you were to track all instances of an entity on each update, the write penalty could end up comparably high, on top of the complexity introduced by managing such tracking.
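Queuing can be sketched as deferring the expensive analysis until a chosen commit point (a conceptual sketch, not One's actual queue API):

```javascript
// Writes are appended cheaply; the expensive work (a stand-in here for
// One's deep dependency analysis) runs only when commit() is called,
// e.g. between renders.
const queue = [];
const cache = new Map();

function enqueue(entity) {
  queue.push(entity); // O(1), no analysis yet
}

function commit() {
  while (queue.length) {
    const entity = queue.shift();
    cache.set(entity.uid, Object.freeze(entity)); // analyze + freeze here
  }
}

enqueue({ uid: 1, text: "a" });
enqueue({ uid: 2, text: "b" });
console.log(cache.size); // 0 -- nothing analyzed yet

commit();
console.log(cache.size); // 2
```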

Data shape

One is not currently designed to work with cyclical data. It is best for non-cyclical objects received from the server as JSON (or other non-cyclical formats).
Support for cyclical data might be added later if there's a need.