
When I search for Node.js shared memory, I find only third-party solutions that rely on inter-process communication, OS memory-mapped files, third-party caching servers, and/or string serialization. All of these are less efficient than accessing native objects directly inside the same process.

To clarify further, I'd prefer shared memory that works this way:

  • node.js would have some kind of global object scope that every request can see.

  • The size of the global scope would only be limited by system memory size.

  • Node would need to provide "named locks" or some other method of locking. (I think node.js doesn't need this unless you use workers)

  • The solution must be written in pure JavaScript (no C add-on).

  • There is no serialization of the stored object.

I'm primarily a ColdFusion/CFML developer. To share native objects between requests, all I have to do is this:

application.myVar={ complex: "object", arr:["anything", "i", "want"] };

I already wrote an application that stores massive lookup tables, partial HTML, and more in memory using ColdFusion. There is no I/O or inter-process communication for most requests. Can this be done in Node.js?

1 Answer


Node.js is already a single-threaded server, so every request is handled within the same process and can see the same objects.
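A minimal sketch of what that means in practice: any object defined at module scope is shared by every request handler in the process, with no serialization or IPC. The `cache` object and the handler functions here are illustrative, not from any particular library.

```javascript
// A module-level object: shared by all request handlers in this process,
// because Node.js runs them all on the same thread.
const cache = {
  lookupTable: { US: "United States", FR: "France" },
  partialHtml: "<ul><li>cached</li></ul>",
};

// Two handlers (simulating separate requests) read and mutate the same
// native object directly -- no serialization, no inter-process communication.
function handlerA() {
  cache.hits = (cache.hits || 0) + 1;
  return cache.lookupTable.US;
}

function handlerB() {
  cache.hits = (cache.hits || 0) + 1;
  return cache.hits; // sees the mutation made by handlerA
}

console.log(handlerA());
console.log(handlerB());
```

In a real HTTP server the same pattern applies: a module that exports `cache` is loaded once per process (Node caches modules), so every `require` of it returns the same object.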

And there are already libs just for that: In memory storage on nodejs server

If you need to scale, you can move to some other store later on (Redis, MongoDB, etc.).
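On the question's "named locks" point: synchronous code is never preempted in Node.js, so locks only matter when a critical section spans asynchronous steps. A pure-JavaScript named lock can be sketched by chaining promises per name; `withLock` and the `chains` map are hypothetical names, not a standard API.

```javascript
// A sketch of "named locks" in pure JavaScript, assuming they are only
// needed to serialize async critical sections (Node's single thread never
// preempts synchronous code).
const chains = new Map(); // lock name -> tail of that lock's promise chain

function withLock(name, fn) {
  const tail = chains.get(name) || Promise.resolve();
  const next = tail.then(fn, fn); // run fn once previous holders finish
  chains.set(name, next.catch(() => {})); // keep the chain alive on errors
  return next;
}

// Usage: two async tasks contending for the same lock run in order,
// even though the first one yields to the event loop.
const order = [];
withLock("cache", async () => {
  await new Promise((resolve) => setTimeout(resolve, 10));
  order.push("first");
});
withLock("cache", async () => {
  order.push("second");
});
```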
