Comments (18)
I find that having Olric embedded in Caddy, while a good idea in theory, didn't work that well in practice, because Olric's configuration isn't naturally compatible with Caddy's graceful reloads and app module system. Pointing to a YAML config feels foreign and awkward, especially for a module under the caddyserver org.
If Olric could work that way and be fully configured via structs with JSON tags and not have trouble reloading config (but persisting cache storage via UsagePool) then I'd be more comfortable with that.
Souin acting as a client for Olric is a pretty good compromise - we just need to clearly document how to run Olric alongside Caddy to provide the cache clustering.
from cache-handler.
Yeah, I mean, as long as it's an embedded, distributed cache -- as automatic as practical -- that is the goal of this module. 👍
from cache-handler.
I don't think this will introduce more complexity, because I just have to declare a new provider and implement my provider interface; that's it, IMO.
from cache-handler.
I'd like to hear from @dunglas and @mholt on this, but 👍 to me. I was mainly concerned about the config UX but that's in a good place now I think. I don't know the specifics of the caching RFCs etc so I'd like people who know more to speak on that.
from cache-handler.
I saw how the configuration improved recently, so I am glad about that. I think this is fine, as long as it still has distributed caching. Would also need @dunglas' approval I think.
from cache-handler.
I have just realized that Souin doesn't use Olric in embedded-server mode. So Souin is just a client for an Olric cluster, and in this scenario there is no difference between memcached and Olric.
In the current version of cache-handler, the Caddy servers themselves form an Olric cluster to cache and serve content, so it functions as an in-process cache. With the current implementation of Souin, cache-handler is going to lose this functionality.
from cache-handler.
Ah, gotcha. One of the requirements of this cache handler is that it supports distributed caching.
from cache-handler.
I did like how easily autocache (groupcache wrapper) integrated as a Go library with JSON configuration, in the initial versions of this module. Wonder if that should be reconsidered. IMO the distributed caching should be embedded if possible.
from cache-handler.
If Olric could work that way and be fully configured via structs with JSON tags and not have trouble reloading config (but persisting cache storage via UsagePool) then I'd be more comfortable with that.
I think that JSON-based configuration is easy to implement alongside YAML. But reloading config without restarting the process is hard to implement and requires extensive testing.
from cache-handler.
I don't think it's a good idea to embed a distributed storage system, because the replication logic will keep growing over time and make the software grow indefinitely. IMO I could adapt the Olric provider in Souin to embed it, but that would imply dumping the Olric instance each time the configuration reloads. And if we choose to never drop the existing database, it will cause inconsistencies. The current approach prevents the data from being dropped when the configuration changes, and allows any cache or other software to access the data inside Olric.
TL;DR: I can embed it, but I'm not a big fan of that.
from cache-handler.
the replication logic will keep growing over time and make the software grow indefinitely.
Hm? The code doesn't grow indefinitely. The first version of this cache was mostly working, and wasn't a large amount of code.
from cache-handler.
@mholt Okay, if you're sure about that, I'll make changes to be able to use Olric this way.
from cache-handler.
It should be available on Friday. What do you think about a new configuration key like embed_distributed? It would be interesting to be able to choose between the existing Olric client mode and the embedded mode, wouldn't it?
from cache-handler.
I suppose that could be useful if people are already running a separate Olric instance. But it would probably add more complexity too.
from cache-handler.
@buraksezer Do you agree with that? darkweak/souin#80
from cache-handler.
@mholt @francislavoie Embedded Olric is now supported in Souin and can be configured through the Caddy configuration file.
from cache-handler.
Can you show us an example of how Caddy would be configured with Olric clustering? I don't see an example in the PR you merged.
from cache-handler.
{
  "apps": {
    "souin_cache": {
      "headers": [
        "Content-Type",
        "Authorization"
      ],
      "log_level": "info",
      "olric": {
        "configuration": {
          "olricd": {
            "bindAddr": "0.0.0.0",
            "bindPort": 3320,
            "serializer": "msgpack",
            "keepAlivePeriod": "20s",
            "bootstrapTimeout": "5s",
            "partitionCount": 271,
            "replicaCount": 2,
            "writeQuorum": 1,
            "readQuorum": 1,
            "readRepair": false,
            "replicationMode": 1,
            "tableSize": 1048576,
            "memberCountQuorum": 1
          },
          "client": {
            "dialTimeout": "-1s",
            "readTimeout": "30s",
            "writeTimeout": "30s",
            "keepAlive": "150s",
            "minConn": 1,
            "maxConn": 100
          },
          "logging": {
            "verbosity": 6,
            "level": "DEBUG",
            "output": "stderr"
          },
          "memberlist": {
            "environment": "local",
            "bindAddr": "0.0.0.0",
            "bindPort": 3322,
            "enableCompression": false,
            "joinRetryInterval": "10s",
            "maxJoinAttempts": 2
          }
        }
      },
      "ttl": "1000s"
    },
    "http": {
      "servers": {
        "": {
          "listen": [":80"],
          "routes": [
            {
              "match": [
                {
                  "header": {
                    "Content-Type": ["*"]
                  },
                  "path": [
                    "/a*"
                  ]
                }
              ],
              "handle": [
                {
                  "handler": "souin_cache",
                  "ttl": "30s"
                }
              ]
            },
            {
              "match": [
                {
                  "header": {
                    "Content-Type": ["*"]
                  },
                  "path": [
                    "/b*"
                  ]
                }
              ],
              "handle": [
                {
                  "handler": "souin_cache",
                  "headers": []
                }
              ]
            },
            {
              "match": [
                {
                  "header": {
                    "Content-Type": ["*"]
                  },
                  "path": [
                    "*"
                  ]
                }
              ],
              "handle": [
                {
                  "handler": "souin_cache"
                }
              ]
            }
          ]
        }
      }
    }
  }
}
from cache-handler.