
cache-manager's People

Contributors

aletorrado, anchan828, bryandonovan, corradodellorusso, dabroek, dependabot[bot], deyhle, elliotttf, gswalden, imjohnbo, jaredwray, lchenay, lukechilds, marcoreni, mojtaba-na, orgads, philippeauriach, pukoren, quentinlemcode, raadad, renovate[bot], ricall, ricardomozartlino, seanzx85, sebelga, slukes, tirke, tmbobbins, v4l3r10, zzau13

cache-manager's Issues

get call stuck when redis server is down

I have encountered a case where a get call to the Redis store in a multiCaching setup gets stuck, because the Redis store does not throw the connection error the way the underlying Redis client does. The store's connection pool does emit a "redisError" event when it cannot connect, but that event gives us no way to fail the call itself and fall back to the original database query on connection failure.

Question: are calls waiting for the first to complete?

I think this module is a perfect match to use in my app. I just want to double check one thing.

If I have a wrapped function and call it 5 times before the wrapped async function has completed, will all 5 calls wait for the wrapped function to complete and return the same value? I.e., is the slow async function only called once?

I understand this is the case when something is already cached. My question is about calls made while the cache entry is being calculated.

Request: isCacheableRequest

I'd propose the opposite of isCacheableValue:

isCacheableRequest, a passthrough function that skips cache based on request key rather than value.

2 advantages come to mind:

  1. have specific data not be cacheable by request params (use-case specific: temporal data where any request older than 48 hours can be safely cached; anything younger cannot and should always pass through).
  2. kill-flag to disable caching and cache-checking globally

this could be implemented on two levels: an actual isCacheableRequest modifier method, and adding the request/key argument to isCacheableValue to disable caching for specific value+key combinations.

example:

var cacheEnabled = config.cacheEnabled;
var multiCache = cacheManager.multiCaching([memCache, redisCache], {
    isCacheableRequest: function (key) {
        return (key <= Date.now() - 60000 && cacheEnabled); // key older than 1 minute AND cacheEnabled
    }
});

Not returning cached value between script calls

Perhaps I am confused about how this library works, but I'm trying the example here, using the filesystem store:

var cacheManager = require('cache-manager');
var fsStore = require('cache-manager-fs');
var diskCache = cacheManager.caching({
    store: fsStore, options: {
      ttl: 60*60,
      maxsize: 1000*1000*1000,
      path: 'diskcache',
      preventfill: true
    }
  });

var ttl = 30;

function getUser(id, cb) {
    setTimeout(function () {
        console.log("Returning user from slow database.");
        cb(null, {id: id, name: 'Bob'});
    }, 5000);
}

var userId = 123;
var key = 'user_' + userId;

// Note: ttl is optional in wrap()
diskCache.wrap(key, function (cb) {
    getUser(userId, cb);
}, {ttl: ttl}, function (err, user) {
    console.log(user);

    // Second time fetches user from the disk cache
    diskCache.wrap(key, function (cb) {
        getUser(userId, cb);
    }, function (err, user) {
        console.log(user);
    });
});

Okay so, I have my getUser method, that simulates a database taking 5 seconds to return the value from the database.

When I run the script, it runs getUser(), caches the result for ttl = 30 seconds, and then the second call returns the value immediately.

Okay so that works fine.

But if I run the script again right away, it does not fetch the value from the filesystem store. Instead it runs getUser again and then caches the result again, to a second .dat file on the server.

Why doesn't it fetch the value from the cache when I run the script a second time? I run it within 30 seconds, which is what the ttl is set to, and the key (user_123) doesn't change between the script calls.

The whole point of caching is that after a value is cached, subsequent script calls (or web requests) fetch the cached value instead of running getUser() each time. Does this library only cache things within a single script invocation? If so, what is the point of the filesystem store?

Am I just completely misunderstanding how to use this?

multi_caching checking all caches in series, even if value is found in first cache

This seems like a pretty large performance hit, especially since it doesn't appear to be needed.

The code in question:
https://github.com/BryanDonovan/node-cache-manager/blob/master/lib/multi_caching.js#L80

It seems like relying on isCacheableValue to determine whether we should return the result immediately doesn't make much sense. Do we really care if the value is cacheable at this point? We just found it in the cache, and we should defer to the set function to decide whether it should be cached.

I've been doing some profiling and this popped out. I may raise a pull request if my testing proves this to be a net gain.

cross-process caches

I'm uncertain if this is a possibility, or even a good approach.

I have a RESTful service running in Express that has multiple instances through pm2.

I use your library to create an in-memory cache for various items in the project, and use the file-cache package to save a user's token/credentials on disk. The reason I do this separately is that I am afraid a client request may hop between processes (it shouldn't, but it could).

For example:
Client authenticates, gets the dashboard, sits idle for 2 minutes, goes to the client page, and gets a different server. At this point their token is no longer valid, as the token seems to exist only on the first process they were on.

I was hoping that pointing all the processes at a common folder might create a "shared" cache, but this doesn't seem to be the case. If I authenticate in one browser, then switch browsers, I have to re-authenticate (different process). If I run a single process this isn't the case; I can switch between browsers fine, as they all go to the same process.

Is it possible to have a "shared" cache between two processes in this case?

Thank you in advance.
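
One direction, for illustration only, is to point every process at a shared backing store such as Redis instead of per-process memory or files. A minimal sketch, assuming a third-party Redis store package (shown here as cache-manager-redis; the exact package and option names may differ):

var cacheManager = require('cache-manager');
var redisStore = require('cache-manager-redis'); // assumed third-party store

// Every pm2 worker talks to the same Redis instance, so a token cached by one
// process is visible to all of them.
var sharedCache = cacheManager.caching({
    store: redisStore,
    host: 'localhost',
    port: 6379,
    ttl: 600
});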

isCacheableValue doesn't seem to work

const cacheManager = require('cache-manager');
const memoryCache = cacheManager.caching({
  store: 'memory',
  isCacheableValue: val => val !== 'bar'
});

memoryCache.set('foo', 'bar')
  .then(() => memoryCache.get('foo'))
  .then(console.log);
  // "bar"

If I set isCacheableValue to return false for 'bar', I can still cache it. Am I using it incorrectly?

Can't make nested calls to a wrapped function

The cache.wrap() function can't be called multiple times in series (nested). E.g., this no longer works (from examples/redis_example/example.js):

function get_cached_user(id, cb) {
    memory_cache.wrap(id, function(cache_callback) {
        get_user(id, cache_callback);
    }, cb);
}

get_cached_user(user_id, function(err, user) {
    // First time fetches the user from the (fake) database:
    console.log(user);

    get_cached_user(user_id, function(err, user) {
        // Second time fetches from cache.
        console.log(user);
    });
});

This started failing in commit 46459ff

Cache Timeout

Using this module along with the Redis cache-manager store: if Redis is down, the cache call currently just hangs forever instead of returning an error. This could be considered a bug in the Redis store, but if we had a way to add a timeout to cache calls, it would solve the problem for us.
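
For illustration, a minimal timeout-wrapper sketch (a hypothetical helper, not part of cache-manager) that races the store call against a timer, so a dead Redis connection surfaces as an error instead of hanging:

// Hypothetical helper: fail the callback if the store does not answer within ms.
function getWithTimeout(cache, key, ms, cb) {
    var done = false;
    var timer = setTimeout(function () {
        if (!done) { done = true; cb(new Error('cache get timed out')); }
    }, ms);
    cache.get(key, function (err, value) {
        if (done) { return; } // the timer already fired; drop the late result
        done = true;
        clearTimeout(timer);
        cb(err, value);
    });
}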

multi_caching causes invalid TTL entries

It looks like multi_caching can create entries with an invalid TTL (https://github.com/BryanDonovan/node-cache-manager/blob/master/lib/multi_caching.js#L160). For example, assume the highest-priority store is memory, followed by a shared store such as a DB:

  • an entry is cached in DB
  • the service is stopped then restarted
  • the next time that entry is queried, it is populated from the DB cache into the memory cache with the default TTL, which is greater than the real remaining TTL in the DB.

A more realistic example is one where multiple services share the cache. In that scenario:

  • SERVICEA sets ENTRY1
  • SERVICEB gets ENTRY1; it is in the DB cache but not in memory, so it is set in the memory cache with the default TTL, leaving the two services with out-of-sync cache entries.

A solution is to pass the TTL from a lower cache level up to a higher cache level when populating it; this would require wrap or get to return the entry's TTL as an extra argument.

cache error fallback

If the call to the cache store fails, should wrap fall back to the wrapped function?

free memory for aged objects without calling get() again

Hi,

Currently I'm trying to use cache-manager in a first-in, first-out (FIFO) way. The cached values are "big" arrays (up to 2 MB per entry) which should be stored for at least 60 minutes.

The cache is queried by requests that are Unix timestamps, and the keys are timestamp-based, too. This makes the search for an entry fast and reliable.

My problem is that memory for outdated entries is never released. As far as I understand lru-cache, aged entries get deleted only on a request for the outdated key. I'm using the keys() function to search for matching keys, so older stored values are never requested again and their memory is "lost".

Is there a mistake in the way I'm thinking, or is lru-cache just not the right way to cache my values?
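
For what it's worth, here is a sketch of manual cleanup under the assumptions above (timestamp-based keys, a 60-minute window, and the memory store's keys()/del() methods):

// Periodically delete entries whose timestamp key is older than the window,
// releasing memory even though those keys are never requested via get() again.
setInterval(function () {
    memoryCache.keys(function (err, keys) {
        if (err || !keys) { return; }
        var cutoff = Date.now() - 60 * 60 * 1000; // 60-minute retention window
        keys.forEach(function (key) {
            if (Number(key) < cutoff) {
                memoryCache.del(key);
            }
        });
    });
}, 60 * 1000);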

Thanks for any kind of hint or reply ;-)!

process not responding after 3rd request

Hello,

I cache the product results from MongoDB with your module this way:

var checkexist = req.query.checkexist;

redis_cache.get('allunames', function (err, result) {
    if (result == null) {
        _products.find({}, 'uname').exec(function (err, product) {
            if (checkexist) {
                for (var i = 0; i < product.length; i++) {
                    if (product[i].uname == checkexist) {
                        res.json(200, {status: true});
                        return;
                    }
                }

                res.json(200, {status: false});
                return;
            } else {
                res.json(200, product);
            }

            redis_cache.set('allunames', product, function (err) {});
        });
    } else {
        product = result;

        if (checkexist) {
            for (var i = 0; i < product.length; i++) {
                if (product[i].uname == checkexist) {
                    res.json(200, {status: true});
                    return;
                }
            }

            res.json(200, {status: false});
            return;
        } else {
            res.json(200, product);
        }
    }
});

The first time, it returns the result from the DB. The second time, it returns from your cache module and Redis. The third time, the forever process stops responding and needs a forever restart.

I handle the cache exists/not-exists strategy with the if (result == null) check.

Could you please repost my code using your .wrap method? I didn't understand it.
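
A hedged sketch of that rewrite, keeping the res.json style and the checkexist filtering from the snippet above (redis_cache and _products are assumed to be the same objects):

// wrap() runs the DB query only when 'allunames' is missing from the cache and
// stores the result itself, replacing the manual get/set branches.
redis_cache.wrap('allunames', function (cacheCallback) {
    _products.find({}, 'uname').exec(cacheCallback);
}, function (err, product) {
    if (err) { return res.json(500, {status: false}); }
    if (checkexist) {
        for (var i = 0; i < product.length; i++) {
            if (product[i].uname == checkexist) {
                return res.json(200, {status: true});
            }
        }
        return res.json(200, {status: false});
    }
    res.json(200, product);
});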

Support promises

Native promises are already supported in Node.js 0.12. I think all methods should return a promise, and the wrap method could take a function returning a promise:

/**
 * @returns {Promise}
 */
function getCachedUser(id) {
    return memoryCache.wrap(id, function(/* optional cacheCallback */) {
        return getUser(id);
    }, ttl);
}

No longer runs in node 0.10.x

Updating from cache-manager 1.1.0 to 1.4.0 breaks on Node 0.10.x:

Unhandled rejection ReferenceError: Promise is not defined
at Object.self.set (/Users/bbergendahl/projects/bc-frontend/node_modules/cache-manager/lib/stores/memory.js:31:20)

Postgres support?

I'd love to be able to use this cache manager with my Postgres backend. I have been watching this page for some time, but it seems no one has written a Postgres store yet. Unfortunately I don't quite have the know-how to do it myself, so I'm just putting it out there that it would be a huge help!

ttl of zero

I'm trying to set a ttl of 0, but it seems my objects are cached indefinitely. I'm doing this because caching time in my system varies depending on the key (it can be 0 sometimes), but I still want to benefit from the concurrency behaviour of this module all of the time, such as when two calls to the same key are made at the same time.

var cacheManager = require('cache-manager').caching({ ttl: 0 });

function test () {
    console.log('trying cache')

    return cacheManager.wrap('key', function () {
        console.log('fetched')
        return Promise.resolve('value')
    }, { ttl: 0 });

}

test();

setTimeout(function () {
    test();    
}, 1000);

I would expect

trying cache
fetched
trying cache
fetched

but instead I get

trying cache
fetched
trying cache

Cached function can be called many times while waiting for it to complete using wrap

Since the function that is being cached is expected to be asynchronous, it seems strange to me that multiple wrap calls for the same key would result in multiple calls of the function.

Below is a code snippet that shows this behavior. node-caching behaves as I would expect, calling the function only once.

var sinon = require('sinon');
var async = require('async');

var construct = sinon.spy(function (cb) {
    setTimeout(function () {
        cb(null, 'value');
    }, 500);
});

var cacheManager = require('cache-manager');
var cache = cacheManager.caching({
    store: 'memory',
    max: 50,
    ttl: 5 * 60
});

var values = [];
for (var i = 0; i < 20; i++) {
    values.push(i);
}

async.each(values, function (val, cb) {
    cache.wrap('key', construct, cb);
}, function () {
    console.log('constructor called %d times using node-cache-manager', construct.callCount);
    construct.reset();

    var cache = new require('caching')('memory');
    async.each(values, function (val, cb) {
        cache('key', 1000, construct, cb);
    }, function () {
        console.log('constructor called %d times using node-caching', construct.callCount);
    });
});

Output:

constructor called 20 times using node-cache-manager
constructor called 1 times using node-caching

Multi-Store potentially returning incorrect value

I am replacing our current cache solution with this project and, while sniffing through the code to understand its implementation, I came across these lines:

if (result) {
  // break out of async loop.
  return cb(err, result, i);
}

This will fail if the stored value is falsy (null, 0, false, '', etc.), which is not correct. Shouldn't the condition be changed to if (result !== undefined) { ... } instead?

Cannot call method 'bind' of undefined

I'm trying to use this module which depends on cache-manager

https://github.com/lammertw/prerender-mongodb-cache

but I get the following error, and I'm not entirely sure what's at fault.

/node_modules/cache-manager/lib/caching.js:81
    self.del = self.store.del.bind(self.store);
                              ^
TypeError: Cannot call method 'bind' of undefined
    at Object.caching (/node_modules/cache-manager/lib/caching.js:81:31)
    at Object.module.exports.init (/node_modules/prerender-mongodb-cache/lib/mongoCache.js:18:36)
    at Object.server.use (/node_modules/prerender/lib/server.js:39:51)
    at Object.module.exports.http.customMiddleware (/config/prerender.js:18:14)
    at /node_modules/sails/lib/hooks/http/middleware/load.js:24:25
    at Function.forEach (/node_modules/lodash/dist/lodash.js:3297:15)
    at builtInMiddlewareLoader (/node_modules/sails/lib/hooks/http/middleware/load.js:19:5)
    at loadExpress (/node_modules/sails/lib/hooks/http/initialize.js:104:9)
    at /node_modules/sails/lib/hooks/http/index.js:190:18
    at /node_modules/sails/lib/app/private/after.js:91:14

Is getAndPassUp used?

I'm trying to implement a new feature in node-cache-manager, and I have stepped into the getAndPassUp method. I have trouble finding where it is used, as I can't see any reference to it in the project except in tests.

Is it used?

Common cache interface with atomic operations

Hey @BryanDonovan, great module.

I'm looking at this as a potential solution to replace the caching layer in one of my projects (ExpressBrute), which currently has a bunch of its own caching storage adapters that are a pain to maintain and don't make sense to tie to my specific project. It seems we both need a way to use multiple different caching stores with a unified interface.

The only trick is, I need a way to do more than just get, set, and delete. Specifically I need access to an atomic increment, so that multiple processes can increment a counter without losing updates. Without atomic increments, two processes can read the current count (e.g. 1), each increment it by one, and then both write back the same new value (so the count becomes 2 instead of 3).

So I guess my question is: do you define a set API for your cache storage engines to target right now? And how are updates to that API managed? It seems like we need something like accord or waterline, but for cache stores. I don't know if you're already doing that or if it's something you're interested in doing.

Obviously increment can be polyfilled in a non-atomic way with a get and a set, and that will work across all stores until they each add support for the atomic method. If an underlying store doesn't support atomic increments (Redis, Mongo, and memcached all do), the polyfill can just be left in place and it's not a big deal.
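
For illustration, a sketch of such a non-atomic polyfill (a hypothetical helper, not an existing API); it has exactly the lost-update race described above:

// Read-modify-write increment: two concurrent callers can read the same old
// value and both write back old + delta, losing one of the updates.
function increment(cache, key, delta, cb) {
    cache.get(key, function (err, value) {
        if (err) { return cb(err); }
        var next = (value || 0) + delta;
        cache.set(key, next, function (err) {
            cb(err, next);
        });
    });
}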

Thoughts on how to proceed?

Set different TTL for different functions

Hey

Firstly just want to say this is a really cool module

I was thinking it might be nice to be able to set a different TTL for different cached functions

e.g.

var getShortLivedField = function(id, callback) {
  ...
}

var getLongLivedField = function(id, callback) {
  ...
}

app.get('/short/lived', function(req, res) {
  var cache_key = "short:" + req.params.id;
  var ttl = 1;
  cache.wrap(cache_key, ttl, function(cache_cb) {
    getShortLivedField(req.params.id, cache_cb);
  }, function(err,result){
    res.send(result);
  });
})

app.get('/long/lived', function(req, res) {
  var cache_key = "long:" + req.params.id;
  var ttl = 600;
  cache.wrap(cache_key, ttl, function(cache_cb) {
    getLongLivedField(req.params.id, cache_cb);
  }, function(err,result){
    res.send(result);
  });
})

This could stay backwards compatible by checking the argument types in the cache.wrap function.

This would work well with Redis and other stores, but probably not with lru-cache, as that seems to have a global TTL.

Or would you suggest it is better to create a custom Redis store that has a map of TTLs to keys, so that the correct TTL can be used depending on the key passed in?

Happy to help out with any changes if needed.

Wrapped memory cache does not respect TTL

The default memory cache only uses the TTL that is given when it is created, not the TTL that is given when used as part of a wrap call.

This seems like a common scenario in multi-tiered caches.

It seems like this is caused by lru-cache not using the TTL that wrap passes on to the set method.

feature: cache size

It would be awesome to be able to get the current size in bytes of all items in the cache, tracking items as they are added to or expire from the cache. Is it as simple as keeping a counter that is adjusted whenever an item is added, deleted, or expired? Maybe it could be an instantiation option like trackSize: true.
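
A rough sketch of what such a counter could look like as an application-side wrapper (trackSize itself is only a proposed option here); expiry is the hard part, since entries evicted inside the store are invisible to a wrapper like this, which is why built-in support would help:

var totalBytes = 0;
var sizes = {}; // bytes per key, as last written through this wrapper

// Account for adds and overwrites.
function trackedSet(cache, key, value, options, cb) {
    var bytes = Buffer.byteLength(JSON.stringify(value));
    totalBytes += bytes - (sizes[key] || 0);
    sizes[key] = bytes;
    cache.set(key, value, options, cb);
}

// Account for explicit deletes.
function trackedDel(cache, key, cb) {
    totalBytes -= sizes[key] || 0;
    delete sizes[key];
    cache.del(key, cb);
}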

module does not compile on Windows

When I attempt to install on Windows Server 2012, the node-gyp build fails.
I have Visual Studio installed, and every other compiled module I use compiles without errors.

C:\node_prerender>npm install cache-manager
npm http GET https://registry.npmjs.org/cache-manager/0.0.4
npm http 304 https://registry.npmjs.org/cache-manager/0.0.4
npm http GET https://registry.npmjs.org/hiredis
npm http GET https://registry.npmjs.org/lru-cache
npm http GET https://registry.npmjs.org/redis
npm http GET https://registry.npmjs.org/async
npm http 304 https://registry.npmjs.org/hiredis
npm http 304 https://registry.npmjs.org/async
npm http 304 https://registry.npmjs.org/lru-cache
npm http 304 https://registry.npmjs.org/redis
npm http GET https://registry.npmjs.org/bindings
npm http 304 https://registry.npmjs.org/bindings

> [email protected] install C:\node_prerender\node_modules\cache-manager\node_modul
es\hiredis
> node-gyp rebuild


C:\node_prerender\node_modules\cache-manager\node_modules\hiredis>node "C:\Progr
am Files\nodejs\node_modules\npm\bin\node-gyp-bin\\..\..\node_modules\node-gyp\b
in\node-gyp.js" rebuild
Building the projects in this solution one at a time. To enable parallel build,
please add the "/m" switch.
C:\node_prerender\node_modules\cache-manager\node_modules\hiredis\build\binding
.sln : Solution file error MSB5004: The solution file has two projects named "h
iredis".
gyp ERR! build error
gyp ERR! stack Error: `C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe
` failed with exit code: 1
gyp ERR! stack     at ChildProcess.onExit (C:\Program Files\nodejs\node_modules\
npm\node_modules\node-gyp\lib\build.js:267:23)
gyp ERR! stack     at ChildProcess.EventEmitter.emit (events.js:98:17)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (child_process.js:789:
12)
gyp ERR! System Windows_NT 6.2.9200
gyp ERR! command "node" "C:\\Program Files\\nodejs\\node_modules\\npm\\node_modu
les\\node-gyp\\bin\\node-gyp.js" "rebuild"
gyp ERR! cwd C:\node_prerender\node_modules\cache-manager\node_modules\hiredis
gyp ERR! node -v v0.10.20
gyp ERR! node-gyp -v v0.10.10
gyp ERR! not ok
npm ERR! weird error 1
npm ERR! not ok code 0

Cache stampede problem

Hi,

Is there a "dog piling" aka "stampede" problem when a key has expired and multiple calls want to read that key?

Let me describe in detail what I mean first.

First, let's say we have an expensive network call, fetchResource, which takes about 10s and is wrapped with wrap:

function getResource(id) {
  return resourceCache.wrap(key, () => {
    return fetchResource(id); // takes 10s
  });
}

Second, let's say the cache is empty, or the key has expired, and we get 1000 calls to getResource within 10s. What will happen then? Will we see 1000 fetchResource calls?

Is there a recommended way to get out of that situation?

A way I was thinking about would be to detect if there is an ongoing request and return the promise associated with that request, instead of creating a new call to fetchResource. Would that work?
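
Sketching that idea (an application-side pattern, not a documented cache-manager feature): keep a map of in-flight promises per key and hand the same promise to every caller.

// All getResource(id) calls that arrive while fetchResource is running share
// one promise, so the expensive call happens at most once per key at a time.
const inflight = new Map();

function getResourceCoalesced(id) {
    const key = 'resource_' + id;
    if (inflight.has(key)) {
        return inflight.get(key);
    }
    const p = resourceCache.wrap(key, () => fetchResource(id));
    const clear = () => inflight.delete(key);
    p.then(clear, clear); // free the slot on success or failure
    inflight.set(key, p);
    return p;
}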

Thanks,
Martin

1.2 issue with multiCache wrap

Sorry this report is extremely light on details since I'm traveling. We use cache-manager with the Redis cache-manager store and use multiCaching with wrap. When upgrading from 1.1 to 1.2, our wrap function seems to return empty results without calling the source function that generates the result. The only major difference I see in the code is the stuff related to isCacheableValue. This may be a bug in the Redis cache-manager store, but I just wanted to post something here in case it hints at something that needs fixing. Thanks for the awesome project! We use it a lot.

purge?

Is cache purge functionality available?

[Proposal] - Official Stores for Redis and Memcached

Coming from other server-side languages and communities, I was a bit perplexed that there wasn't a good driver-based caching system for Node.

I came across this package and was really excited. However, I am a bit confused as to why there is only an in-memory store and no simple stores for Redis or Memcached.

I noticed the Redis example, but was surprised that a Redis store is not available when pulling this in with npm.

Would you be interested in bringing in configurable stores for these cache mechanisms?

Provide tags for releases

npm shows a 2.0 release; however, there is no tag for that release here. Could you please provide proper tags?

TTL Issues

First, thanks for a great plugin.

In some of your examples you show:
multiCache.set('foo2', 'bar2', {ttl: ttl}, function(err) {
and in others:
multiCache.set('foo2', 'bar2', ttl, function(err) {

I noticed that Redis only respects the first form, and Mongo respects neither.

caching of Buffer()

I've noticed a discrepancy when caching a Buffer (such as for an image). The memory cache set/get work fine. However, with the Redis or FS store modules, get returns a JSON string like this:

{"type":"Buffer","data":[65,66,67,68,69,70,71]}

Just wondering if there is a suggested way of handling this. Should the store modules themselves record metadata in the cache noting that a Buffer was stored as the value, and then deserialize it? Or should there be an options flag on the cache-manager set command indicating that a Buffer is being cached?
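
One possible application-side workaround, assuming cachedString is the JSON string returned by get(): JSON.stringify on a Buffer produces {"type":"Buffer","data":[...]}, and the byte array can be turned back into a Buffer.

// Revive a JSON-serialized Buffer; non-Buffer values pass through unchanged.
const raw = JSON.parse(cachedString);
const value = (raw && raw.type === 'Buffer' && Array.isArray(raw.data))
    ? Buffer.from(raw.data)
    : raw;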

Multiple requests during first cache

How does the wrap function handle multiple concurrent requests to the same key when the value is not yet cached? E.g.:

function getCachedUser(id, cb) {
    memoryCache.wrap(id, function (cacheCallback) {
        getUser(id, cacheCallback);
    }, ttl, cb);
}

If getUser() takes 500ms to respond and you call getCachedUser(123, cb) 10 times before 500ms has passed, does it make 10 calls to getUser() because the value is not cached until the first getUser() returns?

Store engines aren't fully compatible with node-cache-manager

From the readme it appears that the different store engines are drop-in replacements for the default memory store. However, they all seem to have varying degrees of support.

For example, I developed my module using the default memory store and everything works well. If I use the Redis store, it breaks my app, as it doesn't support promises on the get/set methods. I've submitted a PR to fix that, but I've also noticed that the max value in the options isn't respected when using Redis. On top of that, you can create multiple cache instances using the memory store and by default they use separate caches; using the Redis store, by default they all use the same DB and so will have cache collisions.

I've only tested the Redis store with my app, but looking at the code for the other store engines, it looks like most will have similar issues.

These problems can all be worked around on the application side, but that means the store engines aren't drop-in replacements; they require changing application code. I think it would be better if all settings were defined in node-cache-manager and required to be respected by all store engines.

High performance/availability cache setup - feature request

Hello!

I'm currently adding a new feature to node-cache-manager. However, I am not sure of the way to go yet, because I can't find a consistent API across the available stores.

The idea of the feature request is to add an always-available, low-latency multi-store setup. This way, even if our frontend server loses its connection to the backend for whatever reason, it will keep working at maximum speed for our front users.

Idea for the high perf/high availability setup using node-cache-manager:

Using a multi store, and wrap method

  • The first store has a TTL of 0 (always resolves; keeps data forever)
  • Secondary stores have a TTL > 0 (fetched from when the key is not in the primary store; entries expire after the TTL)
  • When data expires from a secondary store, it is still returned immediately from the first store while the secondary store is being refreshed; once done, the primary store is updated too.

However, since the store API is not defined, I can't get access to the TTL, the expiry date/time of keys, etc., and of course I don't want to call .get on every store each time wrap is called just to check whether it contains the key and needs updating.

Any idea on how to implement this in an efficient manner?

Feature specs available here: dial-once@9948303

Is there a way to invalidate/expire/delete a cache manually?

I have some cached data that is no longer relevant in certain scenarios. I therefore want to be able to invalidate the cache before its ttl expires, so that the next time the data is requested it is refetched and recached. Is there presently an API to drop particular cached data by its key?
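
For reference, caching instances do bind the store's del method (visible in the caching.js snippet quoted in another issue on this page), so a single entry can be dropped by its key. A minimal example, assuming the memoryCache and user_123 key from examples elsewhere on this page:

// Drop one cached entry before its ttl expires; the next get/wrap for this key
// will refetch and recache the data.
memoryCache.del('user_123', function (err) {
    if (err) { console.error(err); }
});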

CallBack_filler error

Hi,
I am getting a cache error even after the HTTP response has been sent:
/*/node_modules/cache-manager/lib/callback_filler.js:9
waiting.forEach(function(task) {
^

TypeError: Cannot read property 'forEach' of undefined
at CallbackFiller.fill (//node_modules/cache-manager/lib/callback_filler.js:9:12)
at //node_modules/cache-manager/lib/caching.js:117:44
at nextTickCallbackWith0Args (node.js:419:9)
at process._tickDomainCallback (node.js:389:13)

I'm not sure whether this is because of cache-manager or not.
Can you kindly help?

Error: Can't find ./stores/memory [undefined]

I am trying to run a unit test that uses node-cache-manager; however, I keep receiving this error from Chrome about being unable to find './stores/memory'. I need some assistance resolving this issue.

Chrome 55.0.2883 (Mac OS X 10.11.6) Queue Manager encountered a declaration exception FAILED
	Error: Can't find ./stores/memory [undefined] (required by /Users/cyrus_gomez-alcala/Development/CloneDir/ced-event-queue-processor/node_modules/cache-manager/lib/caching.js)
	    at require (node_modules/karma-typescript/lib/commonjs.js:13:23)
	    at node_modules/karma-typescript/lib/commonjs.js:18:24
	    at Object.caching (/var/folders/jd/06g9g2555_1fxy7zv7w3b0j00000gp/T/karma-typescript-bundle-17671uiqYs3E3lhyT.js:78496:22)
	    at Suite.describe (test/spec/manager/queueManger.spec.ts:28:15 <- test/spec/manager/queueManger.spec.js:25:32)
	    at Object.global.wrappers./Users/cyrus_gomez-alcala/Development/CloneDir/ced-event-queue-processor/test/spec/manager/queueManger.spec.ts.../../../src/manager/queueManager (test/spec/manager/queueManger.spec.ts:26:0 <- test/spec/manager/queueManger.spec.js:23:1)
	    at require (node_modules/karma-typescript/lib/commonjs.js:17:25)
	    at node_modules/karma-typescript/lib/commonjs.js:32:9
	    at Array.forEach (native)
	    at node_modules/karma-typescript/lib/commonjs.js:31:40
	    at node_modules/karma-typescript/lib/commonjs.js:34:3`

This line produces the error:

let test = caching({store : 'memory', max: 1, ttl: 60}); 

queueManager.spec.ts

import QueueManager from '../../../src/manager/queueManager';
import { caching } from 'cache-manager';

jasmine.DEFAULT_TIMEOUT_INTERVAL = 999999;

describe("Queue Manager", () => {
    let connection:any;
    let test = caching({store : 'memory', max: 1, ttl: 60});

    console.log("test " + test);

    beforeAll( (done) => {
        connection = QueueManager.connect(queueConfig);
        done();
    });


    it("Create Connection to Rabbit MQ", (done) => {
       expect(connection).not.toBe(null);
       done();
    });

});

KARMA.CONF.JS

module.exports = function(config) {
    config.set({
      basePath: '.',
      karmaTypescriptConfig: {
        bundlerOptions: {
          exclude: ["express"]
        },
        compilerOptions: {
          "target": "es6",
          "module": "commonjs",
          "moduleResolution": "node",
          "removeComments": false,
          "noImplicitAny" : false,
          "sourceMap": true,
          "typeRoots": [
            "node_modules/@types/"
          ],
          "compileOnSave": true
        },
        include: [
          "src/**/*.ts",
          "test/spec/**/*.spec.ts"
        ],
        exclude: ["node_modules", "src/router/queueProcessorRouters.ts"],
        reports: {
          "html": "coverage",
          "text-summary": ""
        },
        transformPath: function(filepath) {
          return filepath.replace(/\.(ts|tsx)$/, ".js");
        },
      },
      frameworks: ["jasmine", "karma-typescript"],
      files: [
          { pattern: "src/**/*.ts" },
          { pattern: "test/spec/**/*.spec.ts" }
      ],
      exclude:["node_modules", "src/router/queueProcessorRouters.ts"],
      preprocessors: {
          "src/config/*.ts": ["karma-typescript"],
          "src/manager/*.ts": ["karma-typescript"],
          "src/messageProcessor/*.ts": ["karma-typescript"],
          "src/service/*.ts": ["karma-typescript"],
          "src/util/*.ts": ["karma-typescript"],
          "test/spec/**/*.spec.ts": ["karma-typescript"]
      },
      mime: {
          "text/x-typescript":["ts","tsx"]
      },
      reporters: ["progress", "kjhtml", "karma-typescript"],
      browsers: ["Chrome"],
      logLevel: config.LOG_DEBUG,
    });
};

PACKAGE.JSON

"dependencies": {
    "@types/amqplib": "^0.5.0",
    "@types/cache-manager": "^1.2.4",
    "@types/express": "^4.0.34",
    "@types/jasmine": "^2.5.40",
    "@types/node": "^6.0.52",
    "@types/request": "0.0.36",
    "amqplib": "^0.5.1",
    "cache-manager": "^2.2.0",
    "express": "^4.14.0",
    "fibonacci": "^1.6.4",
    "file-stream-rotator": "0.0.7",
    "morgan": "^1.7.0",
    "pm2-check-express": "^0.1.0",
    "request": "^2.79.0",
    "typescript": "^2.1.4",
    "uuid": "^3.0.1",
    "winston": "2.3.0"
  },
  "devDependencies": {
    "grunt": "^1.0.1",
    "grunt-auto-install": "^0.3.1",
    "grunt-contrib-clean": "^1.0.0",
    "grunt-contrib-copy": "^1.0.0",
    "grunt-contrib-watch": "^1.0.0",
    "grunt-karma": "^2.0.0",
    "grunt-ts": "^6.0.0-beta.3",
    "jasmine-core": "^2.5.2",
    "karma": "^1.3.0",
    "karma-chrome-launcher": "^2.0.0",
    "karma-jasmine": "^1.1.0",
    "karma-jasmine-html-reporter": "^0.2.2",
    "karma-phantomjs-launcher": "^1.0.2",
    "karma-typescript": "^2.1.6",
    "karma-typescript-preprocessor": "^0.3.1"
  }
}

multi-caching - how to set options?

const memoryCache = cacheManager.caching({store: 'memory', max: 100, ttl: config.get('cache.memory.ttl')});

const isCacheableValue = function(value) {
    return value !== null && value !== false && value !== undefined;
};

const otherCache = cacheManager.caching({
    store: memcachedStore,
    isCacheableValue: isCacheableValue,
    options: {
        hosts: [host],
        backoffLimit: 10000,
        autodiscover: false,
        disabled: false,
        maxValueSize: 1048576,
        queue: true,
        netTimeout: 500,
        reconnect: true,
        onNetError: function onNetError(err) {
            winston.error('[CACHE] memcached error occured:', err);
        }
    },
    ttl: config.get('cache.memcached.ttl')
});

const cacher = cacheManager.multiCaching([memoryCache, otherCache]);

Now when I run wrap() I get:

 <rejected>   TypeError: Cannot read property 'isCacheableValue' of undefined

Without multiCaching, node-cache-manager runs fine.
