
cache-handler's Introduction

Caddy Module: http.handlers.cache

This is a distributed HTTP cache module for Caddy based on Souin cache.

Features

Minimal Configuration

With the minimal configuration below, responses will be cached for 120 seconds:

{
    order cache before rewrite
    cache
}

example.com {
    cache
    reverse_proxy your-app:8080
}
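
To verify that caching is active, you can inspect the Cache-Status response header. An illustrative (hypothetical) example, where Souin is the default cache name and the ttl counts down from the configured 120s:

Cache-Status: Souin; hit; ttl=117; key=GET-https-example.com-/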

Global Option Syntax

Here are all the available options for the global option block:

{
    order cache before rewrite
    log {
        level debug
    }
    cache {
        allowed_http_verbs GET POST PATCH
        api {
            basepath /some-basepath
            prometheus
            souin {
                basepath /souin-changed-endpoint-path
            }
        }
        badger {
            path the_path_to_a_file.json
        }
        cache_keys {
            .*\.css {
                disable_body
                disable_host
                disable_method
                disable_query
                headers X-Token Authorization
                hide
            }
        }
        cache_name Another
        cdn {
            api_key XXXX
            dynamic
            email [email protected]
            hostname domain.com
            network your_network
            provider fastly
            strategy soft
            service_id 123456_id
            zone_id anywhere_zone
        }
        etcd {
            configuration {
                # Your etcd configuration here
            }
        }
        key {
            disable_body
            disable_host
            disable_method
            headers Content-Type Authorization
        }
        log_level debug
        mode bypass
        nuts {
            path /path/to/the/storage
        }
        olric {
            url url_to_your_cluster:3320
            path the_path_to_a_file.yaml
            configuration {
                # Your olric configuration here
            }
        }
        regex {
            exclude /test2.*
        }
        stale 200s
        ttl 1000s
        default_cache_control no-store
    }
}

:4443
respond "Hello World!"

Cache directive Syntax

Here are all the available options for the cache directive:

@match path /path

handle @match {
    cache {
        cache_name ChangeName
        cache_keys {
            (host1|host2).*\.css {
                disable_body
                disable_host
                disable_method
                disable_query
                headers X-Token Authorization
            }
        }
        cdn {
            api_key XXXX
            dynamic
            email [email protected]
            hostname domain.com
            network your_network
            provider fastly
            strategy soft
            service_id 123456_id
            zone_id anywhere_zone
        }
        key {
            disable_body
            disable_host
            disable_method
            disable_query
            headers Content-Type Authorization
        }
        log_level debug
        regex {
            exclude /test2.*
        }
        stale 200s
        ttl 1000s
        default_cache_control no-store
    }
}

Provider Syntax

Badger

The badger provider must have either the path or the configuration directive.

badger-path.com {
    cache {
        badger {
            path /tmp/badger/first-match
        }
    }
}
badger-configuration.com {
    cache {
        badger {
            configuration {
                # Required value
                ValueDir <string>

                # Optional
                SyncWrites <bool>
                NumVersionsToKeep <int>
                ReadOnly <bool>
                Compression <int>
                InMemory <bool>
                MetricsEnabled <bool>
                MemTableSize <int>
                BaseTableSize <int>
                BaseLevelSize <int>
                LevelSizeMultiplier <int>
                TableSizeMultiplier <int>
                MaxLevels <int>
                VLogPercentile <float>
                ValueThreshold <int>
                NumMemtables <int>
                BlockSize <int>
                BloomFalsePositive <float>
                BlockCacheSize <int>
                IndexCacheSize <int>
                NumLevelZeroTables <int>
                NumLevelZeroTablesStall <int>
                ValueLogFileSize <int>
                ValueLogMaxEntries <int>
                NumCompactors <int>
                CompactL0OnClose <bool>
                LmaxCompaction <bool>
                ZSTDCompressionLevel <int>
                VerifyValueChecksum <bool>
                EncryptionKey <string>
                EncryptionKeyRotationDuration <Duration>
                BypassLockGuard <bool>
                ChecksumVerificationMode <int>
                DetectConflicts <bool>
                NamespaceOffset <int>
            }
        }
    }
}

Etcd

The etcd provider must have the configuration directive.

etcd-configuration.com {
    cache {
        etcd {
            configuration {
                Endpoints etcd1:2379 etcd2:2379 etcd3:2379
                AutoSyncInterval 1s
                DialTimeout 1s
                DialKeepAliveTime 1s
                DialKeepAliveTimeout 1s
                MaxCallSendMsgSize 10000000
                MaxCallRecvMsgSize 10000000
                Username john
                Password doe
                RejectOldCluster false
                PermitWithoutStream false
            }
        }
    }
}

NutsDB

The nutsdb provider must have either the path or the configuration directive.

nuts-path.com {
    cache {
        nuts {
            path /tmp/nuts-path
        }
    }
}
nuts-configuration.com {
    cache {
        nuts {
            configuration {
                Dir /tmp/nuts-configuration
                EntryIdxMode 1
                RWMode 0
                SegmentSize 1024
                NodeNum 42
                SyncEnable true
                StartFileLoadingMode 1
            }
        }
    }
}

Olric

The olric provider must have the url directive to work in client mode.

olric-url.com {
    cache {
        olric {
            url olric:3320
        }
    }
}

The olric provider must have either the path or the configuration directive to work in embedded mode.

olric-path.com {
    cache {
        olric {
            path /path/to/olricd.yml
        }
    }
}
olric-configuration.com {
    cache {
        olric {
            configuration {
                # Your olric configuration here
            }
        }
    }
}

Redis

The redis provider must have either the url or the configuration directive.

redis-url.com {
    cache {
        redis {
            url 127.0.0.1:6379
        }
    }
}

You can also use the configuration block. Refer to the rueidis client options to define your configuration as key/value pairs.

redis-configuration.com {
    cache {
        redis {
            configuration {
                ClientName souin-redis
                InitAddress 127.0.0.1:6379
                SelectDB 0
            }
        }
    }
}

What do these directives mean?

Each entry below lists the directive key, its description, and an example value (with the default in parentheses where applicable).

allowed_http_verbs: The HTTP verbs allowed to be cached. Example: GET POST PATCH (default: GET HEAD)
api: The cache-handler API cache management.
api.basepath: Basepath for all APIs, to avoid conflicts. Example: /your-non-conflict-route (default: /souin-api)
api.prometheus: Enable the Prometheus metrics.
api.souin.basepath: Souin API basepath. Example: /another-souin-api-route (default: /souin)
badger: Configure the Badger cache storage.
badger.path: Configure Badger with a file. Example: /anywhere/badger_configuration.json
badger.configuration: Configure Badger directly in the Caddyfile or your JSON Caddy configuration. See the Badger configuration for the options.
cache_name: Override the cache name to use in the Cache-Status response header. Example: Another Caddy Cache-Handler Souin
cache_keys: Define the key generation rules for each URI matching the key regexp.
cache_keys.{your regexp}: Regexp that the URI should match to override the key generation. Example: .+\.css
cache_keys.{your regexp}.disable_body: Disable the body part in the key matching the regexp (GraphQL context). Example: true (default: false)
cache_keys.{your regexp}.disable_host: Disable the host part in the key matching the regexp. Example: true (default: false)
cache_keys.{your regexp}.disable_method: Disable the method part in the key matching the regexp. Example: true (default: false)
cache_keys.{your regexp}.disable_query: Disable the query string part in the key matching the regexp. Example: true (default: false)
cache_keys.{your regexp}.headers: Add headers to the key matching the regexp. Example: Authorization Content-Type X-Additional-Header
cache_keys.{your regexp}.hide: Prevent the key from being exposed in the Cache-Status HTTP response header. Example: true (default: false)
cdn: The CDN management; if you use any CDN to proxy your requests, Souin will handle that.
cdn.provider: The provider placed before Souin. One of akamai, fastly, souin
cdn.api_key: The API key used to access the provider. Example: XXXX
cdn.dynamic: Enable the dynamic keys returned by your backend application. (default: true)
cdn.email: The email used to access the provider, if required, depending on the provider. Example: XXXX
cdn.hostname: The hostname, if required, depending on the provider. Example: domain.com
cdn.network: The network, if required, depending on the provider. Example: your_network
cdn.strategy: The strategy used to purge the CDN cache; soft will keep the content as a stale resource. Example: hard (default: soft)
cdn.service_id: The service id, if required, depending on the provider. Example: 123456_id
cdn.zone_id: The zone id, if required, depending on the provider. Example: anywhere_zone
default_cache_control: Set the default value of the Cache-Control response header if not set by upstream (Souin treats an empty Cache-Control as public if omitted). Example: no-store
key: Override the key generation with the ability to disable unnecessary parts.
key.disable_body: Disable the body part in the key (GraphQL context). Example: true (default: false)
key.disable_host: Disable the host part in the key. Example: true (default: false)
key.disable_method: Disable the method part in the key. Example: true (default: false)
key.disable_query: Disable the query string part in the key. Example: true (default: false)
key.headers: Add headers to the key. Example: Authorization Content-Type X-Additional-Header
key.hide: Prevent the key from being exposed in the Cache-Status HTTP response header. Example: true (default: false)
mode: Bypass the RFC compliance checks. One of bypass, bypass_request, bypass_response, strict (default: strict)
nuts: Configure the Nuts cache storage.
nuts.path: Set the Nuts file path storage. Example: /anywhere/nuts/storage
nuts.configuration: Configure Nuts directly in the Caddyfile or your JSON Caddy configuration. See the Nuts configuration for the options.
etcd: Configure the Etcd cache storage.
etcd.configuration: Configure Etcd directly in the Caddyfile or your JSON Caddy configuration. See the Etcd configuration for the options.
olric: Configure the Olric cache storage.
olric.path: Configure Olric with a file. Example: /anywhere/olric_configuration.json
olric.configuration: Configure Olric directly in the Caddyfile or your JSON Caddy configuration. See the Olric configuration for the options.
redis: Configure the Redis cache storage.
redis.url: Set the Redis URL storage. Example: localhost:6379
redis.configuration: Configure Redis directly in the Caddyfile or your JSON Caddy configuration. See the Redis configuration for the options.
regex.exclude: The regex used to prevent paths from being cached. Example: ^[A-z]+.*$
stale: The stale duration. Example: 25m
timeout: The timeout configuration.
timeout.backend: The timeout duration after which the backend is considered unreachable. Example: 10s
timeout.cache: The timeout duration after which the cache provider is considered unreachable. Example: 10ms
ttl: The TTL duration. Example: 120s
log_level: The log level. One of DEBUG, INFO, WARN, ERROR, DPANIC, PANIC, FATAL (case-insensitive)
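
The timeout block listed above does not appear in the earlier syntax examples. As a minimal sketch, assuming it nests its backend and cache durations the same way the other sub-blocks do, it would look like this:

example.com {
    cache {
        # Assumed nesting: how long to wait before treating the backend
        # or the cache storage as unreachable.
        timeout {
            backend 10s
            cache 10ms
        }
        ttl 120s
    }
    reverse_proxy your-app:8080
}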

Other resources

You can find an example for the Caddyfile or the JSON file.
See the Souin configuration for the full configuration and its associated Caddyfile.

cache-handler's People

Contributors

buraksezer, darkweak, dunglas, frederichoule, mholt, princemaple, soyuka


cache-handler's Issues

Question: Would this work with hdhomerun for cache/cdn?

So, since you use reverse_proxy with hdhomerun, would it be possible to cache the incoming tuners you send out downstream so they share the same data, like a CDN?

If so, what would a Caddy config file look like?

My config so far.

{
  order request_id before header
  order rate_limit before basicauth
}

(theheaders) {
	header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
	header X-Xss-Protection "1; mode=block"
	header X-Content-Type-Options "nosniff"
	header Cache-Control "no-store, no-cache, must-revalidate, max-age=0"
	header Pragma "no-cache"
	header X-Frame-Options "SAMEORIGIN"
	header Permissions-Policy "accelerometer=(), ambient-light-sensor=(), autoplay=(self), camera=(), encrypted-media=(), fullscreen=(self), geolocation=(), gyroscope=(), magnetometer=(), microphone=(), midi=(), payment=(), picture-in-picture=(*), speaker=(), sync-xhr=(), usb=(), vr=()"
	header Access-Control-Allow-Credentials true {
		defer

		}
	header /* {
        -Server
		}
}

domain {
import theheaders
encode gzip
log {
	format single_field common_log # if getting errors this is removed later on, from https://caddyserver.com/docs/caddyfile/directives/log > https://github.com/caddyserver/format-encoder

	output file C:\stuff\caddy\logs\dvr_access.log {
	roll true # Rotate logs, enabled by default
	roll_size_mb 5 # Set max size 5 MB
	roll_gzip true # Whether to compress rolled files
	roll_local_time true # Use local time
	roll_keep 2 # Keep at most 2 log files
	roll_keep_days 7 # Keep log files for 7 days
	}
}
rate_limit {
	distributed
	zone dynamic_example {
		key    {remote_host}
		events 2
		window 6s
	}
}
rewrite * /a{path}
	reverse_proxy ip:port ip:port ip:port ip:port {
	# load balancing
	lb_policy least_conn
	lb_try_duration 500ms
	lb_try_interval 250ms
	# passive health checking
	max_fails 2
	transport http {
					dial_timeout 3s
					keepalive_idle_conns_per_host 4
					#keepalive_idle_conns 4
					#max_conns_per_host 2
			}
}
basicauth {
    user phash
	}	
}

Last-Modified header and cache skip

    @cacheable_sitemap {
       path_regexp "^/([a-z0-9-]+-sitemap([0-9]+)?|sitemap_index)\.xml"
    }

    header @cacheable_sitemap {
        -Last-Modified
        +X-Sitemap 1
    }

    cache @cacheable_sitemap {
        cache_name sitemap
        key {
            disable_method
            disable_query
        }
        mode bypass
        ttl 300s
    }

I would like to use such rules to cache sitemap files.

Unfortunately, no matter what I do, the Last-Modified header is still returned to the client and subsequent requests have If-Modified-Since set, making Souin request the backend due to: detail=REQUEST-REVALIDATION. For test purposes I have added the X-Sitemap header and it does get appended.

I have also tried the request_header matcher to remove "if-modified-since", but I guess it executes after the cache part.

Any ideas? :)

fwd=uri-miss

I keep getting fwd=uri-miss on all requests, on all URLs.

cache-status: TestCache; fwd=uri-miss; stored; key=GET-https-temporary-rul

I tried adding request_header -Cache-Control in case it was the Cache-Control header from the browser, but without success. What am I doing wrong?

My Caddyfile looks like this:

{
	order cache before rewrite

	on_demand_tls {
		ask http://localhost:5555/ask
		interval 2m
		burst 5
	}

	cache {
		allowed_http_verbs GET
		ttl 86400s
		redis {
			url 127.0.0.1:6379
		}
		key {
			disable_body
			disable_query
		}
	}
}

(upstream) {
	method GET

	request_header -Cache-Control

	header * {
		Server LPV2
		-X-Cloud-Trace-Context
		defer
	}

	tls [email protected] {
		on_demand
	}

	reverse_proxy endpoint.com {
		header_up Host {upstream_hostport}
	}

	cache
}

localhost {
	tls internal
	import upstream
}

https:// {
	import upstream
}

http:// {
	redir https://{host}{uri} permanent
}

Serve stale content from cache if origin responds with 5xx

As requested in this forum thread, I want to be able to use Caddy as a caching reverse proxy that makes a service more resilient to outages of the origin server, so that if the origin goes down or responds with errors, Caddy serves old content.

Specifically:

  1. I only want http 200 responses to be cached
  2. If an origin server responds with http 5xx, I want to serve stale content from the cache.

Also attaching a corresponding nginx.conf which achieves what I want for reference.

nginx.conf.gz
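
For reference, the stale window discussed here maps onto the module's stale directive documented above. A minimal sketch, assuming the intent is to keep expired entries around for an hour; whether those entries are actually served when the origin returns 5xx is exactly what this issue asks for:

example.com {
    cache {
        ttl 120s
        # Keep expired entries for up to an hour as stale candidates
        # (assumed intent; serving them on upstream 5xx is the feature
        # requested in this issue).
        stale 1h
    }
    reverse_proxy your-app:8080
}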

When is this going to launch?

A lot of people have referred to this repo when talking about distributed caching for on-demand TLS. Is there a timeline for when this will be launched?
Otherwise, what are the currently recommended alternatives?

Maintenance and docs re: integrating upstream changes

Hi 👋! I am trying to make some simple changes and running into trouble.

I've spent a few hours trying to do a simple patch version upgrade here,
upgrading darkweak/souin from v1.6.40 to v1.6.43 to fix the bug where it
info-logs the entire upstream response bodies on every cache fill(!!).
But between initial setup issues (from breaking changes in transitive
deps), breaking API changes in darkweak/souin patch versions, and sparse
documentation, I've come away tired and empty-handed.

I can see that substantive parts of this repo appear to be copy-pasted
from darkweak/souin upstream. But the files are not always identical,
and sometimes they don't even correspond to the same commit. For
example, at current master 7e61efad6316, file configuration.json
corresponds to upstream commits souin/v1.6.43~15 (103602ce8fe4) and
earlier, whereas file configuration.go corresponds to upstream commits
at souin/v1.6.43~7 (6b17da1cf027) and later. No upstream commit
contains both files at the versions currently vendored in this repo.

Since this repo pins darkweak/[email protected], I even tried to extract the
patches from that repo to apply here:

# generate 4 patch files for changes in just the Caddy plugin directory
git -C ../souin format-patch v1.6.40... ./plugins/caddy

But these patches fail to apply because of divergence from upstream:

# the first patch applies cleanly (after manually updating to go1.21)...
git am -p3 --exclude=go.mod ../souin/0001-feat-chore-allow-chained-storages-359.patch
# ...but the second fails because of skew in configuration.json, go.mod, go.sum
git am -p3 ../souin/0002-feat-rfc-Cache-invalidation-and-Cache-groups-363.patch
# (no use trying the rest)

Is there a plan for the maintenance and documentation of this project?
All I need is simple in-memory caching, and the Caddyfile API of this
project is attractive, but in its current state I'd need to make a
number of cleanup changes to make it viable. I'm willing to put in that
effort, but without a plan I fear that the effort will be wasted as
darkweak/souin continues to diverge.

(much love to caddy-team; i appreciate what y'all have built and am a
happy user overall 💜)

Debug logging but without body content

While configuring caching, I like to see in the logs when a response is pulled from upstream, cached, and served from cache. I've been able to see this by configuring debug logging. But when I configure debug logging, it also logs the response body that was cached. For my API reverse proxied content, that can be handy (though I don't need it). But for my file asset reverse proxied content, it spams the logs with binary data.

Is there a way to configure logging to not include the body content? Or even more granular control of what logs body content (API requests) and what doesn't (file asset downloads)?

Caddy returns 200 when 5xx from upstream

When I use the 'cache' handler, Caddy always returns status code 200 and an empty response whenever the upstream is unavailable or returns 5xx errors.
Please advise how upstream errors can be passed through the cache. Thanks.

Design/spec discussion

Currently, this handler works in the most basic sense, but it has a lot of potential and needs some TLC before being truly production-ready. Right now, it's a great foundation to build from.

Since this module was recently spliced out of Caddy's standard modules before the RC1 tag, previous discussions can be found here:

Plus the third-party v1 implementation of a cache middleware: https://github.com/nicolasazrak/caddy-cache

With the context from those previous discussions in mind, where should we take this project? Who will take the lead in developing it?

big memory usage, memory leak?

Hi
For educational purposes I am building a small CDN server using Caddy and your cache plugin.
I have a problem: in my opinion the cache-handler uses too much RAM.
I use loader.io to benchmark. When the benchmark runs and the server handles traffic, memory usage on the server grows very fast.

At startup, the server uses about 300 MB of RAM.
When I turn on benchmark loader.io with settings:

  1. Maintain client load - 0 to 500
  2. Accept-Encoding - gzip,deflate,br
  3. loader.io only loads a CSS file of about 1kb in size.

At the end of the test the server handles about 120k requests per minute and RAM consumption rises to 1.3 GB.
It should be noted that with nginx on only 1 CPU the server can handle 250k requests per minute with a larger number of connections, so performance is not at the highest level.

I added your plugin to Caddy using the command - caddy add-package github.com/caddyserver/cache-handler
Server - 2 CPU ARM Ampere Altra, 4 GB RAM, Ubuntu 22.
Caddy version 2.7.4

My caddy config

Global

{
	order cache before rewrite
	cache
}

# Import all websites configs
import websites/*

Website

cdn1.domain1.com {
	cache {
		cache_name xxx
       		stale 1m
        	ttl 1h
        	default_cache_control public
    	}
	encode {
		zstd
		gzip 4
		minimum_length 300
	}
	reverse_proxy https://www.domain2.com {
		header_up Host {upstream_hostport}
	}
}

Have any of you tested Caddy and cache-handler in this way? Where could the problem be?
Greetings

Panic on incorrect badger configuration

I configured a cache handler like this:

    cache {
        badger {
            configuration {
                 Dir ""
		 ValueDir ""
		 InMemory true
            }
        }
    }

and when I make a request to a cached resource, I get this panic:

Jul 17 12:41:42 dynamicsites caddy[3505454]: {"level":"debug","ts":1689597702.9780827,"logger":"http.stdlib","msg":"http2: panic serving 162.243.172.248:43536: runtime error: invalid memory address or nil pointer dereference\ngoroutine 80 [running]:\nnet/http.(*http2serverConn).runHandler.func1()\n\tnet/http/h2_bundle.go:6042 +0x145\npanic({0x1d4fc00, 0x34b9f80})\n\truntime/panic.go:884 +0x213\ngithub.com/dgraph-io/badger/v3.(*DB).IsClosed(...)\n\tgithub.com/dgraph-io/badger/[email protected]/db.go:537\ngithub.com/dgraph-io/badger/v3.(*DB).View(0x0?, 0x0?)\n\tgithub.com/dgraph-io/badger/[email protected]/txn.go:795 +0x2f\ngithub.com/darkweak/souin/pkg/storage.(*Badger).Prefix(0x0?, {0xc0000c2e80?, 0x0?}, 0xc00072ad80?)\n\tgithub.com/darkweak/[email protected]/pkg/storage/badgerProvider.go:127 +0x6c\ngithub.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP(0xc00016c000, {0x2529b30, 0xc0004e88d8}, 0xc0002f9d00, 0xc000681c00)\n\tgithub.com/darkweak/[email protected]/pkg/middleware/middleware.go:311 +0x7dd\ngithub.com/caddyserver/cache-handler.(*SouinCaddyMiddleware).ServeHTTP(0xc00019b2c0, {0x2529b30, 0xc0004e88d8}, 0xc0002f9d00, {0x251be60?, 0xc0009ffe30})\n\tgithub.com/caddyserver/[email protected]/httpcache.go:81 +0xb3\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x2529b30?, 0xc0004e88d8?}, 0x251be60?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x251be60?, {0x2529b30?, 0xc0004e88d8?}, 0x0?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:259 +0x3a8\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc00060b680?, {0x2529b30?, 0xc0004e88d8?}, 0x251be60?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Subroute).ServeHTTP(0xc0004fbba0, {0x2529b30, 0xc0004e88d8}, 0x1c94401?, {0x251be60, 0x22776b8})\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/subroute.go:74 +0x6d\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x2529b30?, 0xc0004e88d8?}, 0x251be60?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x251be60?, {0x2529b30?, 0xc0004e88d8?}, 0xc000438c78?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:259 +0x3a8\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d38640?, {0x2529b30?, 0xc0004e88d8?}, 0xe?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d38640?, {0x2529b30?, 0xc0004e88d8?}, 0xe?)\n\tgithub.com/caddyserver/caddy/[email 
protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d38640?, {0x2529b30?, 0xc0004e88d8?}, 0xe?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc0006524a0?, {0x2529b30?, 0xc0004e88d8?}, 0x1f3be40?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).enforcementHandler(0x0?, {0x2529b30?, 0xc0004e88d8?}, 0xc0006524a0?, {0x251be60?, 0xc00008a720?})\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:388 +0x252\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).wrapPrimaryRoute.func1({0x2529b30?, 0xc0004e88d8?}, 0x4d59d7?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:364 +0x3b\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc0005fb6c0?, {0x2529b30?, 0xc0004e88d8?}, 0xc0002f9d00?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).ServeHTTP(0xc0002dd400, {0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:300 +0xb66\nnet/http.serverHandler.ServeHTTP({0x1000?}, {0x2529b30, 0xc0004e88d8}, 0xc0002f9b00)\n\tnet/http/server.go:2936 +0x316\nnet/http.initALPNRequest.ServeHTTP({{0x252b6c8?, 0xc0009ff110?}, 0xc0005c4380?, {0xc0003e4a50?}}, {0x2529b30, 0xc0004e88d8}, 0xc0002f9b00)\n\tnet/http/server.go:3545 +0x245\nnet/http.(*http2serverConn).runHandler(0x25238d0?, 0x352c968?, 0x0?, 0x0?)\n\tnet/http/h2_bundle.go:6049 +0x83\ncreated by net/http.(*http2serverConn).processHeaders\n\tnet/http/h2_bundle.go:5762 +0x68a"}

Here's the stack trace formatted a little more nicely:

http2: panic serving 162.243.172.248:43536: runtime error: invalid memory address or nil pointer dereference
goroutine 80 [running]:
net/http.(*http2serverConn).runHandler.func1()
	net/http/h2_bundle.go:6042 +0x145
panic({0x1d4fc00, 0x34b9f80})
	runtime/panic.go:884 +0x213
github.com/dgraph-io/badger/v3.(*DB).IsClosed(...)
	github.com/dgraph-io/badger/[email protected]/db.go:537
github.com/dgraph-io/badger/v3.(*DB).View(0x0?, 0x0?)
	github.com/dgraph-io/badger/[email protected]/txn.go:795 +0x2f
github.com/darkweak/souin/pkg/storage.(*Badger).Prefix(0x0?, {0xc0000c2e80?, 0x0?}, 0xc00072ad80?)
	github.com/darkweak/[email protected]/pkg/storage/badgerProvider.go:127 +0x6c
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP(0xc00016c000, {0x2529b30, 0xc0004e88d8}, 0xc0002f9d00, 0xc000681c00)
	github.com/darkweak/[email protected]/pkg/middleware/middleware.go:311 +0x7dd
github.com/caddyserver/cache-handler.(*SouinCaddyMiddleware).ServeHTTP(0xc00019b2c0, {0x2529b30, 0xc0004e88d8}, 0xc0002f9d00, {0x251be60?, 0xc0009ffe30})
	github.com/caddyserver/[email protected]/httpcache.go:81 +0xb3
github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x2529b30?, 0xc0004e88d8?}, 0x251be60?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x251be60?, {0x2529b30?, 0xc0004e88d8?}, 0x0?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:259 +0x3a8
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc00060b680?, {0x2529b30?, 0xc0004e88d8?}, 0x251be60?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.(*Subroute).ServeHTTP(0xc0004fbba0, {0x2529b30, 0xc0004e88d8}, 0x1c94401?, {0x251be60, 0x22776b8})
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/subroute.go:74 +0x6d
github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x2529b30?, 0xc0004e88d8?}, 0x251be60?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x251be60?, {0x2529b30?, 0xc0004e88d8?}, 0xc000438c78?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:259 +0x3a8
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d38640?, {0x2529b30?, 0xc0004e88d8?}, 0xe?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d38640?, {0x2529b30?, 0xc0004e88d8?}, 0xe?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d38640?, {0x2529b30?, 0xc0004e88d8?}, 0xe?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc0006524a0?, {0x2529b30?, 0xc0004e88d8?}, 0x1f3be40?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).enforcementHandler(0x0?, {0x2529b30?, 0xc0004e88d8?}, 0xc0006524a0?, {0x251be60?, 0xc00008a720?})
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:388 +0x252
github.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).wrapPrimaryRoute.func1({0x2529b30?, 0xc0004e88d8?}, 0x4d59d7?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:364 +0x3b
github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc0005fb6c0?, {0x2529b30?, 0xc0004e88d8?}, 0xc0002f9d00?)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
github.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).ServeHTTP(0xc0002dd400, {0x2529b30, 0xc0004e88d8}, 0xc0002f9d00)
	github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:300 +0xb66
net/http.serverHandler.ServeHTTP({0x1000?}, {0x2529b30, 0xc0004e88d8}, 0xc0002f9b00)
	net/http/server.go:2936 +0x316
net/http.initALPNRequest.ServeHTTP({{0x252b6c8?, 0xc0009ff110?}, 0xc0005c4380?, {0xc0003e4a50?}}, {0x2529b30, 0xc0004e88d8}, 0xc0002f9b00)
	net/http/server.go:3545 +0x245
net/http.(*http2serverConn).runHandler(0x25238d0?, 0x352c968?, 0x0?, 0x0?)
	net/http/h2_bundle.go:6049 +0x83
created by net/http.(*http2serverConn).processHeaders
	net/http/h2_bundle.go:5762 +0x68a

I'm not sure what's going on -- I assume it's panicking because the badger configuration is incorrect. It would be much easier to debug my configuration if there were an error message instead of a panic here.

Caddy with cache-handler crashes frequently on virtuozzo/jelastic

Hi,

I have an issue with caddy terminating after a while when running on Virtuozzo/Jelastic. It is a bit difficult to debug this issue since logs are not easy to come by. But maybe there are known issues?

Basically, the behaviour is like this: Caddy runs just fine without the cache activated in the Caddyfile. Once the cache is activated, I can hammer it with requests and it is stable; however, after a period of about 24 hours Caddy terminates, and since Virtuozzo doesn't offer a restart policy for its implementation of running custom Docker containers, it has to be restarted manually. Unfortunately I am bound to use Jelastic for this setup.

The Caddyfile is quite simple:

# Global directives
{
	order cache before rewrite
	cache
}

# Domain directives
domain.xx {
	cache
	reverse_proxy <IP>:3000
}

The instance has 2GB of memory assigned and doesn't report any OOM issues. Load is minimal, even when running load-tests. Caddy is caching requests to a NextJS instance that serves certain responses with s-maxage/stale-while-revalidate headers and this works fine, too.

So my first question is, have there been reports of crashes for this environment or nextJS in particular?

I will try to get my hands on a log file to post here.

Caddy exit after enable cache

Hi, I've got a problem running the cache-handler module. When I tried to run Caddy with this module enabled, it crashed after a few minutes. I thought I hadn't left enough memory (2 GB) for Caddy, but then I got the same issue on my dedicated machine. I tried it with Redis enabled and with the default mode, but neither worked after pointing my domain to it. However, Caddy doesn't exit on my staging machine with this cache-handler module enabled. I am not sure what happened. Could you provide some debug commands so that I can share more information with you? Thanks.

http.handlers.cache Reused response from concurrent request with

I am getting the http.handlers.cache Reused response from concurrent request with .. log in my error logs consistently. I'm unsure if this is a legitimate issue and what the user experience is when they receive it. Is this something I should be concerned about and how should I interpret it?

memory leak when adding partial config

I'm trying to set up caddy as a caching reverse proxy for several sites and I will be using the config API to add, modify and remove sites as needed.
Everything works as expected, except when I configure new sites by adding new routes to the first HTTP server: after, say, 20 sites the resident set size is at around 3 GB, which seems like a lot. If I download the whole config using the API and reload it into a freshly started Caddy instance, the memory usage remains normal at around 350 MB, and the resident set size is just a bit higher than the used memory.

I've been trying to get more information using pprof, I have some data that might help narrow down the issue, but I think this part is a bit out of my league at this point.

I compiled caddy using the following command:
xcaddy build --with github.com/caddyserver/cache-handler --with github.com/corazawaf/coraza-caddy --with github.com/porech/caddy-maxmind-geolocation --with github.com/caddyserver/transform-encoder --with github.com/imgk/caddy-pprof

Caddy version is:
v2.5.1 h1:bAWwslD1jNeCzDa+jDCNwb8M3UJ2tPa8UZFFzPVmGKs=

The initial configuration is in the attached zip file, named empty_nosites.config

By configuring one site I mean (the onesite.config file is also in the attached zip):
curl -X POST -H "Content-Type: application/json" -d @onesite.config http://192.168.6.31:2019/config/apps/http/servers/srv0/routes/

The final configuration is in the cache_20_sites.config. I got to that by doing the above operation 20 times, with different hostnames.

I also included an SVG output from go pprof showing the alloc_space after adding the 20 sites one by one, named 20_sites_onebyone.svg

After saving the configuration using:
curl http://192.168.6.31:2019/config/ > cache_20_sites.config

I restarted caddy and posted the whole config right back:
curl -X POST -H "Content-Type: application/json" -d @cache_20_sites.config http://192.168.6.31:2019/config/

Then I took another snapshot with pprof. This can be found in the file 20_sites_allatonce.svg

bugreport.zip

Could someone please help me with this?

Add interface for cache?

When I was looking to start working on a similar module, I stumbled across this one. I love the work being done on this module; it's going to be a great addition to the Caddy modules list!

The only thing I was wondering is if there could be an interface for the caching backend, similar to the certmagic storage interface so the default storage, olric, could be swapped out with something like redis?

Installation

It was not clear to me how to install this. I think you used to have instructions for how to compile; now they have disappeared.
It would be nice if the README linked to the compilation instructions over on the main Caddy documentation page.
https://caddyserver.com/docs/build#xcaddy

I am so glad to see that this package now looks like it is mainstream-supported.

HURRAH.

API Config - Surrogate-key prune

I'm trying to configure the cache through the JSON API with the following config:

 "apps"  => [
                "cache" => [
                    'api'       => [
                        'basepath' => '/cache/',
                    ],
                    "log_level" => "info",
                    'ttl'       => '3600s',
                ],
]

I can see the cache working in the response (I've set a custom max-age on my outgoing response):

Age: 8
Cache-Control: max-age=200, public
Cache-Status: Souin; hit; ttl=192

I can also see that the following request no longer hits my upstream, only Caddy.

However, when I go to my Caddy instance at caddy:2019/config/cache/ (is this right?) I only get null as a response, not a list of cached keys. The purge method doesn't work either; it just returns "method not allowed".

I'm assuming I'm accessing the wrong API endpoint? Is the API endpoint served over the Caddy admin API, or do I need to set up separate routes in the Caddy config to actually handle it?

Can't update Caddy config when Olric is enabled

Caddy crashes when posting a new config to :2019/load when Olric is enabled as the cache driver.
It seems to not shut down the Olric port gracefully / can't reuse it on reboot.

caddy_1  | 2022/05/26 14:11:12 [INFO] Storage engine has been loaded: kvstore => service.go:129
caddy_1  | 2022/05/26 14:11:12 [DEBUG] memberlist: Got bind error: Failed to start TCP listener on "172.19.0.2" port 3322: listen tcp 172.19.0.2:3322: bind: address already in use
caddy_1  | Impossible to start the embedded Olric instance: Could not set up network transport: failed to obtain an address: Failed to start TCP listener on "172.19.0.2" port 3322: listen tcp 172.19.0.2:3322: bind: address already in use
caddy_1  | Closed EmbeddedOlricProvider chan

Also sometimes seeing this, but not sure if it is related or not:

caddy_1  | 2022/05/26 14:01:12 [INFO] Routing table has been pushed by 172.19.0.2:3320 => operations.go:87
caddy_1  | {"level":"debug","ts":1653573697.4362159,"logger":"http.stdlib","msg":"http: panic serving 172.19.0.1:40298: runtime error: invalid memory address or nil pointer dereference\ngoroutine 16465 [running]:\nnet/http.(*conn).serve.func1()\n\tnet/http/server.go:1825 +0xbf\npanic({0x1969700, 0x2c547a0})\n\truntime/panic.go:844 +0x258\ngithub.com/darkweak/souin/cache/providers.(*EmbeddedOlric).ListKeys(0x0)\n\tgithub.com/darkweak/[email protected]/cache/providers/embeddedOlricProvider.go:109 +0x1f2\ngithub.com/darkweak/souin/api.(*SouinAPI).GetAll(...)\n\tgithub.com/darkweak/[email protected]/api/souin.go:60\ngithub.com/darkweak/souin/api.(*SouinAPI).HandleRequest(0xc006e0f630, {0x7fc0373ab558, 0xc0008ec2a0}, 0xc000246400)\n\tgithub.com/darkweak/[email protected]/api/souin.go:90 +0x1b8\ngithub.com/darkweak/souin/plugins/caddy.(*SouinCaddyPlugin).ServeHTTP(0xc0000fa600, {0x7fc0373ab558?, 0xc0008ec2a0}, 0xc000246300, {0x1f50760, 0xc0008c1f60})\n\tgithub.com/darkweak/souin/plugins/[email protected]/httpcache.go:77 +0x94e\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*metricsInstrumentedHandler).ServeHTTP(0xc006e4c1c0, {0x1f5adf8?, 0xc0008ee000}, 0xc000246300, {0x1f50760, 0xc0008c1f60})\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/metrics.go:132 +0x53b\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x1f5adf8?, 0xc0008ee000?}, 0x1f50760?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:272 +0x3b\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1f50760?, {0x1f5adf8?, 0xc0008ee000?}, 0x7785c6?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:57 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x1f5adf8, 0xc0008ee000}, 0xc000246300)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:244 +0x322\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x400?, {0x1f5adf8?, 0xc0008ee000?}, 0x1b06960?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:57 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).enforcementHandler(0x0?, {0x1f5adf8?, 0xc0008ee000?}, 0xc0005fdcd0?, {0x1f50760?, 0xc006e4c200?})\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:318 +0x252\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).wrapPrimaryRoute.func1({0x1f5adf8?, 0xc0008ee000?}, 0x49ba97?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:294 +0x3b\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc0002eadc0?, {0x1f5adf8?, 0xc0008ee000?}, 0xc000246300?)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:57 +0x2f\ngithub.com/caddyserver/caddy/v2/modules/caddyhttp.(*Server).ServeHTTP(0xc00027ea20, {0x1f5adf8, 0xc0008ee000}, 0xc000246300)\n\tgithub.com/caddyserver/caddy/[email protected]/modules/caddyhttp/server.go:230 +0xac6\nnet/http.serverHandler.ServeHTTP({0xc0008a7f50?}, {0x1f5adf8, 0xc0008ee000}, 0xc000246100)\n\tnet/http/server.go:2916 +0x43b\nnet/http.(*conn).serve(0xc0008dc000, {0x1f5c808, 0xc006bf2bd0})\n\tnet/http/server.go:1966 +0x5d7\ncreated by net/http.(*Server).Serve\n\tnet/http/server.go:3071 +0x4db"}

Reproducible example here:

https://github.com/mattvb91/caddy-cache-olric-bug

Edit: Just to point out this is using @latest in the Dockerfile

Getting constant misses using API Platform

Hi,

I am trying to set up this module with an API Platform setup (not running in Docker, but standard PHP-FPM). I have configured Caddy as can be seen below, but I am getting constant misses (I do not get any hits).

Here is my Caddyfile:

{
        order cache before rewrite
        cache {
                cache_keys {
                        headers Authorization Content-Type
                }
                key {
                        headers Authorization Content-Type
                }
                log_level debug
        }
        log {
                level debug
        }
}
(cors) {
        @origin header Origin {args.0}
        header @origin Access-Control-Allow-Origin "{args.0}"
        header @origin Access-Control-Allow-Methods "OPTIONS,HEAD,GET,POST,PUT,PATCH,DELETE"
}

app.example {
        route {
                cache
                root * /var/www/app/api/public
                vulcain

                push
                header ?Permissions-Policy "browsing-topics=()"

                php_fastcgi unix//var/run/php/php-fpm.sock
                encode zstd gzip
                file_server
        }
}

other_host {
        other_not_cached_stuff
}

Here are my request headers:

GET /users/7 HTTP/2
Host: REDACTED_APP_DOMAIN
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/109.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Authorization: Bearer TOKEN
Origin: https://REDACTED_APP_DOMAIN
Connection: keep-alive
Referer: https://REDACTED_APP_DOMAIN
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-site
TE: trailers

Here are my response headers:

HTTP/2 200 OK
accept-patch: application/merge-patch+json
access-control-allow-credentials: true
access-control-allow-origin: https://REDACTED_APP_DOMAIN
access-control-expose-headers: link
alt-svc: h3=":443"; ma=2592000
cache-control: max-age=0, public, s-maxage=3600
cache-status: Souin; fwd=uri-miss; stored
content-type: application/ld+json; charset=utf-8
date: Sat, 04 Feb 2023 20:56:32 GMT
etag: "03407f50c18524d5cc417eaa8f0d214e"
link: <https://REDACTED_APP_DOMAIN/docs.jsonld>; rel="http://www.w3.org/ns/hydra/core#apiDocumentation"
permissions-policy: browsing-topics=()
server: Caddy
surrogate-key: /users/7
vary: Accept, Content-Type, Authorization, Origin
x-content-type-options: nosniff
x-frame-options: deny
content-length: 216
X-Firefox-Spdy: h2

Looking at the Caddy logs, all I get is:

Feb 04 20:56:32 app-dev caddy[37178]: {"level":"debug","ts":1675544192.855148,"logger":"http.handlers.cache","msg":"Incoming request: &{Method:GET URL:/users/13 Proto:HTTP/2.0 ProtoMajor:2 ProtoMinor:0 Header:map[Accept:[*/*] Accept-Encoding:[gzip, deflate, br] Accept-Language:[en-US,en;q=0.5] Authorization:[Bearer TOKEN] Date:[Sat, 04 Feb 2023 20:56:32 UTC] If-None-Match:[\"1e80b930ac97fcc265d9e68a93a5ebe5\"] Origin:[https://REDACTED_APP_DOMAIN] Referer:[https://REDACTED_APP_DOMAIN/] Sec-Fetch-Dest:[empty] Sec-Fetch-Mode:[cors] Sec-Fetch-Site:[same-site] Te:[trailers] User-Agent:[Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/109.0]] Body:0xc00ad91860 GetBody:<nil> ContentLength:0 TransferEncoding:[] Close:false Host:REDACTED_APP_DOMAIN Form:map[] PostForm:map[] MultipartForm:<nil> Trailer:map[] RemoteAddr:86.120.236.159:39426 RequestURI:/users/13 TLS:0xc008d5e000 Cancel:<nil> Response:<nil> ctx:0xc00b2e8f60}"}
Feb 04 20:56:32 app-dev caddy[37178]: {"level":"debug","ts":1675544192.859668,"logger":"http.handlers.cache","msg":"writeRequests called. Writing to value log"}
Feb 04 20:56:32 app-dev caddy[37178]: {"level":"debug","ts":1675544192.8601124,"logger":"http.handlers.cache","msg":"Sending updates to subscribers"}
Feb 04 20:56:32 app-dev caddy[37178]: {"level":"debug","ts":1675544192.8603485,"logger":"http.handlers.cache","msg":"Writing to memtable"}
Feb 04 20:56:32 app-dev caddy[37178]: {"level":"debug","ts":1675544192.860538,"logger":"http.handlers.cache","msg":"2 entries written"}

Is there anything I am missing here? Thank you!

Enabling cache increases response times

I have a Caddyfile like this:

{
    order cache before rewrite
    cache
}

ourdomain {
    encode gzip
    root * /home/ubuntu/landing
    try_files {path} {path}/index.html

    @fileExists file
    handle @fileExists {
        file_server
    }

    handle {
        cache
        reverse_proxy http://remoteserver:4567 {
            lb_try_duration 60s
            fail_duration 1s
            health_uri /health
            health_interval 1s
        }
    }
    log {
        output file /var/log/caddy/access.log
        format console
    }
}

When I request a file from it, using this command:

curl -o /dev/null -s -w 'Establish Connection: %{time_connect}s\nTTFB: %{time_starttransfer}s\nTotal: %{time_total}s\n'  https://ourdomain/file.js

I get a response like this:

Establish Connection: 0.182702s
TTFB: 2.137184s
Total: 3.592550s

If I remove the cache line from the handle block, I see a response like this:

Establish Connection: 0.304530s
TTFB: 0.960026s
Total: 2.472493s

Consistently, enabling the cache seems to make my requests take longer. Is this because the cache is downloading the entire file from upstream, before serving it to the client? As opposed to just streaming the entire thing immediately?

Is there a way to avoid this behaviour?

(This applies to both my first request and subsequent requests but I'm guessing that's because my max-age=0... still working on that)

Prevent caching based on response header

So, long story short, I'm trying to accelerate an old Drupal 7 website where more than 95% of the traffic is from anonymous users and the rest is mostly admins, editors and publishers working on the platform.
The problem is that Drupal 7 doesn't use an Authorization header, so any logged-in user's responses will be cached.

While reading #34 I thought that a good solution would be to cache only based on the response headers, since the bulk of the public content is pre-rendered by the CMS and tagged accordingly via headers. But to my understanding there is no way in Caddy to run a matcher on the response, because it is already too late.

A regex based on path isn't an option, since the path of unpublished content is the same as when it would be published.

But I also have to admit that I still don't fully understand how Souin works, in particular what the cache keys are. Reading this and this, it almost seems that the functionality of caching based only on response headers is already there; it's just that I don't understand how to do it.

fatal errors are not logged

I'm trying to use the cache handler and it wasn't working. The symptom:

tve@cloud /h/caddy> curl -o /dev/null 'https://example.com/data/binary/tagVisits'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:07 --:--:--     0
curl: (92) HTTP/2 stream 0 was not closed cleanly: INTERNAL_ERROR (err 2)

Only after I turned on debug logging I got:

Oct 10 09:12:24 cloud caddy[2743409]: {"level":"debug","ts":1665418344.1734848,"logger":"http.stdlib
","msg":"http2: panic serving 0.0.0.0:56316: Impossible to set value into Badger, Value with si
ze 2779485 exceeded 1048576 limit. Value:\n00000000  48 54 54 50 2f 30 2e 30  20 32 30 30 20 4f 4b 0
d  |HTTP/0.0 200 OK.|\n00000010  0a 41 6c 74 2d 53 76 63  3a 20 68 33 3d 22 3a 34  |.Alt-Svc: h3=\":
  1. why is this error not logged without debug?
  2. why is there a panic?
  3. it looks to me like this 'badger' thing needs some config to cache >1MB? do I now need to go on a quest to figure out what badger is, why http.cache uses it, and how to configure it?

Ability to force caching even if client sends no-cache

Hi,

Is it possible to force cache even if the client sends no-cache in the request? During testing, I saw that when I disable cache in chrome, all requests that are supposed to be cached on the proxy are instant misses. Can this behaviour be configured?
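
For reference, the mode directive documented above is the closest existing knob. A hedged sketch, assuming bypass_request relaxes the request-side RFC checks (such as a browser-sent no-cache), which the directive table does not state explicitly:

example.com {
    cache {
        # Assumption: bypass_request ignores request cache-control
        # constraints while still honouring the response side.
        mode bypass_request
        ttl 300s
    }
    reverse_proxy your-app:8080
}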

panic when using cache-handler

Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: panic: Header called after Handler finished
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         panic: Header called after Handler finished
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: goroutine 325119 [running]:
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: net/http.(*http2responseWriter).Header(0xc015d24d80?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         net/http/h2_bundle.go:6362 +0x7a
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/darkweak/souin/plugins.(*CustomWriter).Header(0xc015e8c060?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/darkweak/[email protected]/plugins/base.go:55 +0x37
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp/encode.(*responseWriter).Close(0xc009960aa0)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/encode/encode.go:260 +0x33
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: panic({0x1c9e7a0, 0x240dd70})
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         runtime/panic.go:890 +0x262
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: net/http.(*http2responseWriter).Header(0x1f335e0?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         net/http/h2_bundle.go:6362 +0x7a
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/darkweak/souin/plugins.(*CustomWriter).Header(0xc0148ef550?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/darkweak/[email protected]/plugins/base.go:55 +0x37
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp/reverseproxy.Handler.finalizeResponse({{0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, 0xc010952310, 0x0, {0xc00d07c320, 0x1, ...}, ...}, ...)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/reverseproxy/reverseproxy.go:966 +0x1e2
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp/reverseproxy.(*Handler).reverseProxy(0xc00c8656c0, {0x241edb0?, 0xc009960aa0}, 0xc018338a00, 0xc018a31400, 0xc015e8c060, {0xc00d05d1a0, {0xc0109482c0, 0>
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/reverseproxy/reverseproxy.go:934 +0x1065
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp/reverseproxy.(*Handler).proxyLoopIteration(0xc00c8656c0, 0xc018338a00, 0x4?, {0x241edb0, 0xc009960aa0}, {0x0, 0x0}, {0xc0cfc2cc7441fc32, 0x6ca57c957807,>
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/reverseproxy/reverseproxy.go:570 +0xf05
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp/reverseproxy.(*Handler).ServeHTTP(0xc00c8656c0, {0x241edb0, 0xc009960aa0}, 0xc018a31400, {0x2416340, 0xc015e8c1e0})
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/reverseproxy/reverseproxy.go:478 +0x3da
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x241edb0?, 0xc009960aa0?}, 0x2416340?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x2416340?, {0x241edb0?, 0xc009960aa0?}, 0x80?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x241edb0, 0xc009960aa0}, 0xc018a31400)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:259 +0x3a8
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x0?, {0x241edb0?, 0xc009960aa0?}, 0xc01617b280?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp/rewrite.Rewrite.ServeHTTP({{0x0, 0x0}, {0xc010948200, 0x1d}, {0x0, 0x0}, {0x0, 0x0}, {0x0, 0x0, ...}, ...}, ...)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/rewrite/rewrite.go:136 +0x3ff
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x241edb0?, 0xc009960aa0?}, 0x2416340?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x2416340?, {0x241edb0?, 0xc009960aa0?}, 0x4?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x241edb0, 0xc009960aa0}, 0xc018a31400)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:259 +0x3a8
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d360c0?, {0x241edb0?, 0xc009960aa0?}, 0xe?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x241edb0, 0xc009960aa0}, 0xc018a31400)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x1d360c0?, {0x241edb0?, 0xc009960aa0?}, 0xe?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapRoute.func1.1({0x241edb0, 0xc009960aa0}, 0xc018a31400)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:227 +0x336
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0xc01533ac50?, {0x241edb0?, 0xc009960aa0?}, 0x4?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp/encode.(*Encode).ServeHTTP(0xc00d067ec0, {0x241ef00, 0xc0158efc40}, 0x506055?, {0x2416340, 0xc015e8c260})
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/encode/encode.go:129 +0xa4
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x241ef00?, 0xc0158efc40?}, 0x0?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x51?, {0x241ef00?, 0xc0158efc40?}, 0xc01454cd78?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/caddyserver/cache-handler.(*SouinCaddyPlugin).ServeHTTP.func1({0x2427250?, 0xc011339080?}, 0x241e170?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/caddyserver/[email protected]/httpcache.go:130 +0x62
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: github.com/darkweak/souin/plugins.DefaultSouinPluginCallback.func3({0x241ef00?, 0xc0158efc40?}, 0x0?)
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/darkweak/[email protected]/plugins/base.go:207 +0xae
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]: created by github.com/darkweak/souin/plugins.DefaultSouinPluginCallback
Oct 30 23:09:37 VM-zlNr9iaPx8gT caddy[36302]:         github.com/darkweak/[email protected]/plugins/base.go:203 +0x95b
Oct 30 23:09:38 VM-zlNr9iaPx8gT systemd[1]: caddy.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 30 23:09:38 VM-zlNr9iaPx8gT systemd[1]: caddy.service: Failed with result 'exit-code'.
Nov 01 09:42:17 VM-zlNr9iaPx8gT systemd[1]: Starting Caddy...
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: caddy.HomeDir=/var/lib/caddy
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: caddy.AppDataDir=/var/lib/caddy/.local/share/caddy
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: caddy.AppConfigDir=/var/lib/caddy/.config/caddy
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: caddy.ConfigAutosavePath=/var/lib/caddy/.config/caddy/autosave.json
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: caddy.Version=v2.6.2 h1:wKoFIxpmOJLGl3QXoo6PNbYvGW4xLEgo32GPBEjWL8o=
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: runtime.GOOS=linux
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: runtime.GOARCH=amd64
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: runtime.Compiler=gc
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: runtime.NumCPU=4
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: runtime.GOMAXPROCS=4
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: runtime.Version=go1.19.2
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: os.Getwd=/
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: LANG=en_US.UTF-8
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: LANGUAGE=en_US:en
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/snap/bin
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: NOTIFY_SOCKET=/run/systemd/notify
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: HOME=/var/lib/caddy
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: LOGNAME=caddy
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: USER=caddy
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: INVOCATION_ID=e966785de59744ada980d9b074c52f2b
Nov 01 09:42:17 VM-zlNr9iaPx8gT caddy[57698]: JOURNAL_STREAM=8:3487217

Incomplete responses from reverse_proxy (context cancelled) are being cached

Doing some tests to use caddy + cache handler to cache deb packages from snapshot.debian.org.

It looks like in some cases the client connection is dropped after 10 seconds, causing the upstream connection to be dropped as well, and somehow the fragmentary response is still saved in the cache as valid.

Detailed info:

caddy version
f11c3c9f5a1be082450d64369853e1dacda22dde+modified (17 Aug 23 17:34 UTC)
{
  admin off
  skip_install_trust
  order basicauth after request_header
  order replace after encode
  order cache before rewrite

  log {
    output stdout
    format json
    level debug
  }

  cache {
    log_level debug
    cache_keys {
      .* {
        disable_body
      }
    }

    key {
      disable_body
    }

    stale 31536000s
    ttl 31536000s

    nuts {
      configuration {
        Dir "/data/cache"
        EntryIdxMode 1
        RWMode 0
        SegmentSize 1024
        NodeNum 42
        SyncEnable true
        StartFileLoadingMode 1
      }
    }
  }
}


http://: {
  cache
  reverse_proxy http://snapshot.debian.org {
    header_up Host snapshot.debian.org:80
    header_up -X-Forwarded-*
    header_up -X-Real-IP
    header_down -Server
  }
}

Here is the relevant part of the logs:

{"level":"error","ts":1708923817.4212139,"logger":"http.log.error","msg":"context deadline exceeded","request":{"remote_ip":"10.0.42.13","remote_port":"55452","client_ip":"10.0.42.13","proto":"HTTP/1.1","method":"GET","host":"snapshot.debian.org","uri":"/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_arm64.deb","headers":{"User-Agent":["DuboDubonDuponey/1.0 (apt)"]}},"duration":10.004700584}
{"level":"debug","ts":1708923817.4218266,"logger":"http.handlers.reverse_proxy","msg":"upstream roundtrip","upstream":"snapshot.debian.org:80","duration":10.005120997,"request":{"remote_ip":"10.0.42.13","remote_port":"55452","client_ip":"10.0.42.13","proto":"HTTP/1.1","method":"GET","host":"snapshot.debian.org:80","uri":"/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_arm64.deb","headers":{"Date":["Mon, 26 Feb 2024 05:03:27 UTC"],"User-Agent":["DuboDubonDuponey/1.0 (apt)"]}},"error":"context canceled"}
{"level":"debug","ts":1708923840.918993,"logger":"http.handlers.cache","msg":"Incomming request &{Method:GET URL:/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_amd64.deb Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[If-Range:[Tue, 10 Jan 2023 03:53:37 GMT] Range:[bytes=21-] User-Agent:[DuboDubonDuponey/1.0 (apt)]] Body:{} GetBody:<nil> ContentLength:0 TransferEncoding:[] Close:false Host:snapshot.debian.org Form:map[] PostForm:map[] MultipartForm:<nil> Trailer:map[] RemoteAddr:10.0.42.13:55454 RequestURI:/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_amd64.deb TLS:<nil> Cancel:<nil> Response:<nil> ctx:0xc012447080}"}
{"level":"debug","ts":1708923844.5852408,"logger":"http.handlers.reverse_proxy","msg":"upstream roundtrip","upstream":"snapshot.debian.org:80","duration":3.665729569,"request":{"remote_ip":"10.0.42.13","remote_port":"55454","client_ip":"10.0.42.13","proto":"HTTP/1.1","method":"GET","host":"snapshot.debian.org:80","uri":"/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_amd64.deb","headers":{"If-Range":["Tue, 10 Jan 2023 03:53:37 GMT"],"User-Agent":["DuboDubonDuponey/1.0 (apt)"],"Date":["Mon, 26 Feb 2024 05:04:00 UTC"],"Range":["bytes=21-"]}},"headers":{"Date":["Mon, 26 Feb 2024 05:04:02 GMT"],"Content-Range":["bytes 21-19266935/19266936"],"Permissions-Policy":["interest-cohort=()"],"X-Varnish":["544520629"],"Via":["1.1 varnish (Varnish/6.5)"],"Server":["Apache"],"X-Content-Type-Options":["nosniff"],"X-Xss-Protection":["1"],"Last-Modified":["Tue, 10 Jan 2023 03:53:37 GMT"],"Etag":["\"125fd78-5f1e0d427f099\""],"Cache-Control":["max-age=31536000, public"],"Age":["0"],"X-Frame-Options":["sameorigin"],"Referrer-Policy":["no-referrer"],"Accept-Ranges":["bytes"],"X-Clacks-Overhead":["GNU Terry Pratchett"],"Content-Length":["19266915"]},"status":206}
{"level":"debug","ts":1708924170.954446,"logger":"http.handlers.cache","msg":"Incomming request &{Method:GET URL:/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_amd64.deb Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[User-Agent:[DuboDubonDuponey/1.0 (apt)]] Body:{} GetBody:<nil> ContentLength:0 TransferEncoding:[] Close:false Host:snapshot.debian.org Form:map[] PostForm:map[] MultipartForm:<nil> Trailer:map[] RemoteAddr:10.0.42.13:55486 RequestURI:/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_amd64.deb TLS:<nil> Cancel:<nil> Response:<nil> ctx:0xc000c78a50}"}
{"level":"debug","ts":1708924170.9549901,"logger":"http.handlers.cache","msg":"The stored key GET-http-snapshot.debian.org-/archive/debian/20240220T000000Z/pool/main/g/gcc-12/gcc-12_12.2.0-14_amd64.deb matched the current iteration key ETag &{Matched:true IfNoneMatchPresent:false IfMatchPresent:false IfModifiedSincePresent:false IfUnmodifiedSincePresent:false IfUnmotModifiedSincePresent:false NeedRevalidation:true NotModified:false IfModifiedSince:0001-01-01 00:00:00 +0000 UTC IfUnmodifiedSince:0001-01-01 00:00:00 +0000 UTC IfNoneMatch:[] IfMatch:[] RequestETags:[] ResponseETag:\"125fd78-5f1e0d427f099\"}"}

Hope I am not missing something obvious here.

On the client side, apt rejects the resource (since it does not match the expected hash); that is how I noticed the issue.
Clearly, the resource has been inserted into the cache when it should not have been.

Any insight here?

Thanks in advance.

License

Hello,

Can you please specify a license? I see no mention of it in the repository.

Thanks!

How does stale work?

Hey!

I assumed that stale would serve cached content past its TTL:

  • User requests /foo
  • /foo is cached with TTL 60 and the key has an age of 65
  • Caddy responds with /foo from cache
  • Caddy gets a fresh /foo and stores it in cache instead

But that does not seem to be the case (by the way, this is how Varnish behaves when a grace period is set). So what does stale do?
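
Not an authoritative answer, but a hedged reading of the option: stale appears to define how long an expired entry is retained after its TTL so it can still be served in stale-while-revalidate / stale-if-error situations (RFC 5861), rather than being shipped unconditionally past its TTL the way a Varnish grace does. A minimal sketch under that assumption:

cache {
    ttl 60s
    # Assumption: keep expired entries a further 5 minutes so they can be
    # served stale (e.g. while revalidating, or if the origin errors).
    stale 300s
}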

Badger or Nuts?

@darkweak I've just seen this PR, which adds support for Nuts: #30

Is Nuts now recommended over Badger? If so, how is it enabled in the Caddyfile?

A related question: are Badger and Nuts in-memory by default?
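
For the Caddyfile question, here is a minimal sketch based on the nuts block shown in the global options (the path is a placeholder):

{
    order cache before rewrite
    cache {
        nuts {
            path /path/to/the/storage
        }
    }
}

As far as I can tell, both Badger and NutsDB are embedded key/value stores that write to the configured path on disk; whether the defaults used here keep everything in memory is an assumption I could not confirm.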

can't figure out badger

I'm trying to cache a 2.7MB response and I get a badger error (see #8):

Oct 10 09:12:24 cloud caddy[2743409]: {"level":"debug","ts":1665418344.1734848,"logger":"http.stdlib","msg":"http2: panic serving 0.0.0.0:56316: Impossible to set value into Badger, Value with size 2779485 exceeded 1048576 limit. Value:\n00000000  48 54 54 50 2f 30 2e 30  20 32 30 30 20 4f 4b 0

Looks like there's a 1MB limit. The README points to "See the Badger configuration for the options" which links to https://dgraph.io/docs/badger/get-started/. Well, I'm probably blind, but I don't see any documentation about configuration there.

I'm also not finding any information on how the cache handler uses badger (store metadata, store responses, store ...?). Nor on what the default configuration is. E.g. is it used as in-memory cache, or on disk? If the latter, where is the data stored on disk? ...

I tried the following, but it doesn't change the error:

        route /data/* {
                cache {
                        badger {
                                configuration {
                                        MemTableSize 8000000
                                        ValueThreshold 4000000
                                }
                        }
                }
                reverse_proxy https://motus.org {
                        header_up Host motus.org
                }
        }

Putting the badger config in the global cache directive has the same results.
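
Purely a speculative pointer, not a confirmed fix: the 1048576 figure matches Badger's per-value size check against its ValueLogFileSize option, so if the configuration block is passed straight through to Badger's options, raising that value might lift the limit. A sketch under that assumption (path and size are placeholders):

badger {
    configuration {
        ValueDir /tmp/badger/cache
        # Assumption: Badger rejects values larger than ValueLogFileSize;
        # 16777216 would raise the ceiling to 16 MiB.
        ValueLogFileSize 16777216
    }
}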

Bug with rewrite directive?

I found a behaviour that feels weird to me: the module looks for cached responses with the original request URI, but if there is an internal rewrite, the cache entry is written with the rewritten URI. That means the original request URI does not get a cache hit when requested again.

Minimal example:

{
	order cache before rewrite
}

http://:80 {

	cache {
		ttl 20s
	}
	rewrite /say/* /respond{path}
	respond /respond/* "{path.2}"

}

If I request http://localhost/say/hi several times, I would expect to get the cached response from the second request onward (within the 20 seconds, of course). What actually happens is that I always get a cache miss.
Requesting http://localhost/respond/say/hi as the second request yields me a cache hit.

I know I could just order cache after rewrite for this silly use case, but I have a more elaborate use case in combination with my image filter module.

elaborate use case
{
	order cache before rewrite
	order image_filter before file_server
}

http://:80 {
	@thumbnail {
		path_regexp thumb (?i)/w([0-9]+)(/.+)$
	}
	root .
	handle @thumbnail {
		cache {
			ttl 20s
		}
		rewrite {re.thumb.2}
		image_filter {
			resize {re.thumb.1} 0
		}
	}
}

It resizes an image whose size is given within the path, so http://localhost/w100/image.png would give me a 100-pixel-wide thumbnail of image.png. Because of the cache misses when ordering cache before rewrite, the image is resized on every request.
If I order cache after rewrite here and first request http://localhost/w100/image.png and then http://localhost/w200/image.png, I actually get the 100-pixel-wide cached response on the second request, even though I should get a newly resized 200-pixel-wide image.

how to configure redis with caddy?

Hello team,
I have read the configuration docs but still don't understand how to set this up, because the documentation is not clear about configuring Caddy with Redis. I have configured Redis in my Caddyfile as below, but it doesn't work with the cache. How can I configure it properly? Can someone give me a sample Caddyfile configuration?

{
        order coraza_waf first
        order cache before rewrite
}
caddy.manhtuong.net {
        coraza_waf {
                include /etc/caddy/waf/config.conf
                include /etc/caddy/waf/whitelist.conf
                include /etc/caddy/waf/coreruleset/crs-setup.conf.example
                include /etc/caddy/waf/coreruleset/rules/*.conf
        }
        cache {
                redis {
                        url 127.0.0.1:6789
                }
        }
        route / {
                rate_limit {body.id} 200r/m
        }
        file_server {
                hide .git
        }
        reverse_proxy x.x.x.x:9000
        log {
                output file /home/caddy/caddy.manhtuong.net.log
        }
}
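
Not an authoritative answer, but a minimal sketch of one arrangement that follows the global option syntax documented above, with the Redis block declared globally and the cache directive enabled in the site (the TTL value is an arbitrary placeholder):

{
        order coraza_waf first
        order cache before rewrite
        cache {
                redis {
                        url 127.0.0.1:6789
                }
                ttl 300s
        }
}

caddy.manhtuong.net {
        cache
        reverse_proxy x.x.x.x:9000
}

The idea is that the storage backend is configured once in the global options, and each site that should be cached opts in with the cache directive.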

Constant CPU usage

Caddy consumes a constant, non-negligible amount of CPU when the cache handler is registered.

Built with:

$ xcaddy build v2.6.4 --with github.com/caddyserver/[email protected]

Caddyfile:

{
	order cache before rewrite
}

:8080 {
	cache
	respond "hello"
}

Unexpected CPU usage:

$ ./caddy run &
$ top -p $!
    PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM     TIME+ COMMAND
 259235 root      20   0 1776392  85420  40576 S   6.2   2.5   0:17.95 caddy

The CPU % is a bit exaggerated in the example because that's running on a wimpy AMD GX-424CC (Linux x86_64), but also on an Apple M2 it constantly consumes ~1% while idle (macOS arm64).

When the cache configuration is removed from the Caddyfile, Caddy idles at virtually 0% CPU.


I've profiled Caddy in this situation with Go pprof, but haven't investigated further: cpu.pprof.

Don't Cache Logged In Users

I am building a mostly-read website. A few people edit content, many read it.
(Well, hopefully someday, many will read it.)
I would like anonymous users to see the cached version, but logged-in users should see the most recent version.

In Caddy 1, my application server marked images and anonymous user pages as public, and I set the cache key to include the JWT token. That way, if users were logged in, they had that token, and no one else would see their cached version of the page. It was simple and worked great.

How do I achieve the same effect with cache-handler? Reading through the options, it does not seem possible.
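
One hedged approach, not claiming it is the intended mechanism: the key / cache_keys options documented above accept a headers list, so folding the Authorization header (or whichever header carries the JWT) into the cache key would give each logged-in user their own entry while anonymous users share one:

cache {
    key {
        # Assumption: adding a header here makes its value part of the cache key,
        # so responses are segmented per token.
        headers Authorization
    }
}

This only separates entries per token; if logged-in users must always bypass the cache entirely, the cache directive would have to be scoped (e.g. with a matcher) so it is not applied to their requests.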

Possible to configure a disk based cache instead of in-memory?

I'm not sure if this is something that Badger or Olric can do, but I'm looking to cache pages to disk using the cache handler and then serve them from there, rather than keeping them in memory for the TTL.

Is this something that this plugin can do? If so, are there any docs or examples for this? I followed the link to the official badger docs, but I didn't see anything for this specifically.

Thank you!
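
A hedged sketch rather than a definitive answer: both the badger and nuts providers accept a path, and as far as I can tell that makes them write their data to that location on disk instead of holding everything in memory (the path is a placeholder):

cache {
    badger {
        # Assumption: entries are persisted under this directory on disk.
        path /var/cache/caddy/badger
    }
    ttl 1h
}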

HEAD gives zero-sized Content-Length

Minor issue I noticed, as I don't have HEAD in my allowed_http_verbs. I run a system in which clients perform a HEAD on an HTTP object to check the filesize which they will then offer to download to their users.

If Souin is set to cache HEADs (as per the default), the Content-Length is set to 0 (and then in this system's case: an "invalid file" error is presented to the user):

{"level":"info","ts":1692184054.9633253,"logger":"http.log.access.log3","msg":"handled request","request":{"remote_ip":"2a0c:XXXX:XX::XXXX","remote_port":"60900","client_ip":"2a0c:XXXX:XX::XXXX","proto":"HTTP/2.0","method":"HEAD","host":"media.domain.com","uri":"/40/40b010a7a2a0a43c89b84745389e654e/test.jpg","headers":{"Accept-Encoding":["identity"],"User-Agent":["..."]},"tls":{"resumed":false,"version":772,"cipher_suite":4865,"proto":"h2","server_name":"media.domain.com"}},"bytes_read":0,"user_id":"","duration":0.037238329,"size":0,"status":200,"resp_headers":{"Server":["Caddy"],"Content-Length":["0"],"Date":["Wed, 16 Aug 2023 11:07:34 GMT"],"Accept-Ranges":["bytes"],"Etag":["\"4a564749dbf2d3367446834ad05d8436\""],"Alt-Svc":["h3=\":443\"; ma=2592000"],"Content-Type":["image/jpeg"],"Cache-Control":["public, max-age=86400"],"Cache-Status":["Souin; fwd=uri-miss; stored; key=HEAD-https-media.domain.com-/40/40b010a7a2a0a43c89b84745389e654e/test.jpg"],"Last-Modified":["Wed, 16 Aug 2023 11:07:32 GMT"]}}
{"level":"info","ts":1692184055.2995944,"logger":"http.log.access.log3","msg":"handled request","request":{"remote_ip":"XXX.XXX.XXX.YYY","remote_port":"56648","client_ip":"XXX.XXX.XXX.YYY","proto":"HTTP/2.0","method":"HEAD","host":"media.domain.com","uri":"/40/40b010a7a2a0a43c89b84745389e654e/test.jpg","headers":{"Accept-Encoding":["identity"],"User-Agent":["..."]},"tls":{"resumed":false,"version":772,"cipher_suite":4865,"proto":"h2","server_name":"media.domain.com"}},"bytes_read":0,"user_id":"","duration":0.000410535,"size":0,"status":200,"resp_headers":{"Server":["Caddy"],"Age":["2"],"Content-Length":["0"],"Accept-Ranges":["bytes"],"Cache-Status":["Souin; hit; ttl=86398; key=HEAD-https-media.domain.com-/40/40b010a7a2a0a43c89b84745389e654e/test.jpg"],"Date":["Wed, 16 Aug 2023 11:07:34 GMT"],"Cache-Control":["public, max-age=86400"],"Last-Modified":["Wed, 16 Aug 2023 11:07:32 GMT"],"Alt-Svc":["h3=\":443\"; ma=2592000"],"Content-Type":["image/jpeg"],"Etag":["\"4a564749dbf2d3367446834ad05d8436\""]}}

Without the verb in allowed_http_verbs, the Content-Length is passed through:

{"level":"info","ts":1692184163.5322647,"logger":"http.log.access.log3","msg":"handled request","request":{"remote_ip":"2a0c:XXXX:XX::XXXX","remote_port":"43284","client_ip":"2a0c:XXXX:XX::XXXX","proto":"HTTP/2.0","method":"HEAD","host":"media.domain.com","uri":"/c8/c81171196700360c12b739f579ec550b/test.jpg","headers":{"Accept-Encoding":["identity"],"User-Agent":["..."]},"tls":{"resumed":false,"version":772,"cipher_suite":4865,"proto":"h2","server_name":"media.domain.com"}},"bytes_read":0,"user_id":"","duration":0.011600336,"size":0,"status":200,"resp_headers":{"Alt-Svc":["h3=\":443\"; ma=2592000"],"Cache-Status":["Souin; fwd=bypass; detail=UNSUPPORTED-METHOD"],"Content-Length":["3088829"],"Date":["Wed, 16 Aug 2023 11:09:22 GMT"],"Content-Type":["application/octet-stream"],"Etag":["\"f4f2e028a611a0c81dc3b79fa186e5be\""],"Last-Modified":["Wed, 16 Aug 2023 11:09:21 GMT"],"Accept-Ranges":["bytes"],"Server":["Caddy"]}}
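
Until the root cause is fixed, one hedged workaround consistent with the two logs above is to keep HEAD out of allowed_http_verbs so HEAD requests bypass the cache and keep the upstream Content-Length:

cache {
    # HEAD omitted on purpose: unsupported methods are bypassed
    # (fwd=bypass; detail=UNSUPPORTED-METHOD), as in the second log.
    allowed_http_verbs GET
}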

Cannot match current iteration key ETag

I have this plugin configured as follows:

{
	on_demand_tls {
		ask http://localhost:3903/check
		interval 2m
		burst 5
	}

	order cache before rewrite

	cache {
		allowed_http_verbs GET
		default_cache_control public
		nuts {
			path /var/db/caddy/souin/
		}
		ttl 8h
	}
}

https:// {
	cache {
		timeout {
			backend 30s
		}
	}

	tls {
		on_demand
	}

	reverse_proxy localhost:3902
}

However, every time I hit one of the domains fronted by this configuration, I see this in the logs:

$ curl -v https://www.domain.com/
...
 TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
< HTTP/2 200
< accept-ranges: bytes
< alt-svc: h3=":443"; ma=2592000
< cache-control: public
< cache-status: Souin; fwd=uri-miss; stored; key=GET-https-www.domain.com-/
< content-security-policy: default-src 'self';
< content-type: text/html
< date: Sun, 30 Jul 2023 22:46:54 GMT
< etag: "636849236767b0deb94dd80d04c5bc74"
< last-modified: Thu, 02 Feb 2023 12:56:22 GMT
...

And the ETag never changes between curl runs; however, in the logs I then see:

{"level":"info","ts":1690757113.7676728,"logger":"http.handlers.cache","msg":"The key GET-https-www.domain.com-/ didn't match the current iteration key ETag GET-https-www.domain.com-/"}
{"level":"info","ts":1690757115.2038972,"logger":"http.handlers.cache","msg":"The key GET-https-www.domain.com-/ didn't match the current iteration key ETag GET-https-www.domain.com-/"}
{"level":"info","ts":1690757116.2018719,"logger":"http.handlers.cache","msg":"The key GET-https-www.domain.com-/ didn't match the current iteration key ETag GET-https-www.domain.com-/"}
{"level":"info","ts":1690757117.201102,"logger":"http.handlers.cache","msg":"The key GET-https-www.domain.com-/ didn't match the current iteration key ETag GET-https-www.domain.com-/"}
{"level":"info","ts":1690757118.0917957,"logger":"http.handlers.cache","msg":"The key GET-https-www.domain.com-/ didn't match the current iteration key ETag GET-https-www.domain.com-/"}

Cache a Redirect Response from reverse_proxy

I have an external service behind a reverse proxy that returns a 302 Redirect response. I'd like to change the response into a 301 and cache the redirect so that it can be served from cache if re-requested within the configured TTL.

In my attempts to do this, I can get the response to change to a 301, but it always misses the cache with a header of: Cache-Status: Souin; fwd=uri-miss; detail=UPSTREAM-ERROR-OR-EMPTY-RESPONSE; key=GET-.... Is this possible, and if it is, how should I configure this type of behavior?

Panic if abort directive is used

/etc/caddy/Caddyfile:

{
        order cache before rewrite
        cache
}

:80 {
        cache
        abort
}

Start Caddy, then run curl http://127.0.0.1. Result: Caddy dies.

Feb 13 12:40:18 desrv systemd[1]: Started Caddy.
Feb 13 12:40:18 desrv caddy[4509]: {"level":"info","ts":1676320818.3576553,"logger":"tls.cache.maintenance","msg":"started background certificate maintenance","cache":"0xc00057b5e0"}
Feb 13 12:40:18 desrv caddy[4509]: {"level":"info","ts":1676320818.3577983,"logger":"tls","msg":"cleaning storage unit","description":"FileStorage:/var/lib/caddy/.local/share/caddy"}
Feb 13 12:40:18 desrv caddy[4509]: {"level":"info","ts":1676320818.3579395,"logger":"tls","msg":"finished cleaning storage units"}
Feb 13 12:40:18 desrv caddy[4509]: {"level":"info","ts":1676320818.3581178,"msg":"serving initial configuration"}
Feb 13 12:40:43 desrv caddy[4509]: panic: net/http: abort Handler
Feb 13 12:40:43 desrv caddy[4509]: goroutine 50 [running]:
Feb 13 12:40:43 desrv caddy[4509]: github.com/caddyserver/caddy/v2/modules/caddyhttp.StaticResponse.ServeHTTP({{0x0, 0x0}, 0x0, {0x0, 0x0}, 0x0, 0x1}, {0x24f43a0?, 0xc0001f6780?}, 0xc0000b8300, ...)
Feb 13 12:40:43 desrv caddy[4509]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/staticresp.go:184 +0xa38
Feb 13 12:40:43 desrv caddy[4509]: github.com/caddyserver/caddy/v2/modules/caddyhttp.wrapMiddleware.func1.1({0x24f43a0?, 0xc0001f6780?}, 0xc0006f0070?)
Feb 13 12:40:43 desrv caddy[4509]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/routes.go:290 +0x42
Feb 13 12:40:43 desrv caddy[4509]: github.com/caddyserver/caddy/v2/modules/caddyhttp.HandlerFunc.ServeHTTP(0x4487b9?, {0x24f43a0?, 0xc0001f6780?}, 0xffffffffffffffff?)
Feb 13 12:40:43 desrv caddy[4509]:         github.com/caddyserver/caddy/[email protected]/modules/caddyhttp/caddyhttp.go:58 +0x2f
Feb 13 12:40:43 desrv caddy[4509]: github.com/caddyserver/cache-handler.(*SouinCaddyPlugin).ServeHTTP.func1({0x0?, 0x185f960?}, 0xc0006f0070?)
Feb 13 12:40:43 desrv caddy[4509]:         github.com/caddyserver/[email protected]/httpcache.go:129 +0x62
Feb 13 12:40:43 desrv caddy[4509]: github.com/darkweak/souin/plugins.DefaultSouinPluginCallback.func2({0x24f43a0?, 0xc0001f6780?}, 0xf?)
Feb 13 12:40:43 desrv caddy[4509]:         github.com/darkweak/[email protected]/plugins/base.go:210 +0x7a
Feb 13 12:40:43 desrv caddy[4509]: created by github.com/darkweak/souin/plugins.DefaultSouinPluginCallback
Feb 13 12:40:43 desrv caddy[4509]:         github.com/darkweak/[email protected]/plugins/base.go:206 +0x81b
Feb 13 12:40:43 desrv systemd[1]: caddy.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Feb 13 12:40:43 desrv systemd[1]: caddy.service: Failed with result 'exit-code'.
$ /usr/bin/caddy version
v2.6.3 h1:QRVBNIqfpqZ1eJacY44I6eUC1OcxQ8D04EKImzpj7S8=

Built about 15 minutes ago with xcaddy build --with github.com/caddyserver/cache-handler

This is a minimal repro. I have a service that only approved IPs can access; otherwise Caddy aborts the request. When I hit it from an unapproved IP after adding caching, it killed Caddy.

Proposal to use Souin as cache system core to cache-handler

As we discussed on Slack, I suggest you use an external cache system.

What does Souin support?

All [TODO items](https://github.com/caddyserver/cache-handler#todo) except the Varnish tag (a PR to support that will be created on Souin in the next week).
It supports JSON and Caddyfile configuration, and you can find an example using Souin as a Caddy module here. Caddy durations and Caddy matchers are supported too. The only thing to maintain in this repository would be the Souin Caddy module integration, without managing the cache directives anymore. The difference between the Souin repository and this one would be the directive key: here it would be cache, while it stays souin_cache in the main repository.

Finally, I'll maintain Souin with the help of existing contributors, and I'm in contact with some pentesters to audit this cache system and ensure it is safe for production.

I'm open to discussing it with you.

Suspected cache pollution when adding multiple hosts

When adding multiple hostnames to an upstream, after a certain amount of time (or a certain number of hostnames) some kind of cache pollution happens.

Once the cache has been flushed, the files return to giving good responses for a certain amount of time before the same thing happens again.

This has been reproduced in this sample repository here with minimal config: https://github.com/mattvb91/caddy-cache-bug

Samples taken from discussions over from #26 (comment)

(screenshot)

After cache flush:

(screenshot)

Here is a video of it happening in action:

Screencast.2022-05-18.09.10.13.mp4

Olric nil pointer panic on last line of internal transport listenAndServe

In my thread about caching in Caddy, I posted a reply where I detailed an exception I was getting from Olric when I attempted to run my GOARCH=arm xcaddy build of Caddy 2.4.0 beta + this cache-handler package.

forest@thingpad:~/Desktop/git/caddy/cmd/caddy$ GOARCH=arm ~/Desktop/programs/xcaddy/xcaddy build --with github.com/caddyserver/cache-handler

...

Here's the Caddyfile I used:

{
    cache {
        olric_config olricd.yml
    }
}

localhost

route /cache-s-maxage {
    cache
    header Cache-Control "s-maxage=5, max-age=1"
    respond "Hello, s-maxage!"
}

route /cache-max-age {
    cache
    header Cache-Control "max-age=60"
    respond "Hello, max-age!"
}

And here's my olricd.yml file:

# Copied from https://github.com/buraksezer/olric/blob/master/cmd/olricd/olricd.yaml
olricd:
  bindAddr: "0.0.0.0"
  bindPort: 3320
  serializer: "msgpack"
  keepAlivePeriod: "300s"
  requestTimeout: "5s"
  partitionCount:  13
  replicaCount: 1
  writeQuorum: 1
  readQuorum: 1
  readRepair: false
  replicationMode: 0 # sync mode. for async, set 1
  tableSize: 1048576 # 1MB in bytes
  memberCountQuorum: 1

logging:
  verbosity: 6
  level: "DEBUG"
  output: "stderr"

memberlist:
  environment: "local"
  bindAddr: "0.0.0.0"
  bindPort: 3322
  enableCompression: false
  joinRetryInterval: "1ms"
  maxJoinAttempts: 1

Here's what it printed:

root@odroidxu4:~# ./caddy2 run
2021/02/18 22:51:21.269	INFO	using adjacent Caddyfile
[WARNING][caddyfile] Caddyfile:1: input is not formatted with 'caddy fmt'
2021/02/18 22:51:21.276	INFO	admin	admin endpoint started	{"address": "tcp/localhost:2019", "enforce_origin": false, "origins": ["localhost:2019", "[::1]:2019", "127.0.0.1:2019"]}
2021/02/18 22:51:21.276	INFO	http	server is listening only on the HTTPS port but has no TLS connection policies; adding one to enable TLS	{"server_name": "srv0", "https_port": 443}
2021/02/18 22:51:21.276	INFO	http	enabling automatic HTTP->HTTPS redirects	{"server_name": "srv0"}
2021/02/18 22:51:21.276	INFO	tls.cache.maintenance	started background certificate maintenance	{"cache": "0x2459180"}
2021/02/18 16:51:21 [INFO] Join completed. Synced with 0 initial nodes => olric.go:304
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x11db8]

goroutine 85 [running]:
runtime/internal/atomic.goStore64(0x25ac014, 0xd, 0x0)
	/usr/local/go/src/runtime/internal/atomic/atomic_arm.go:144 +0x1c
github.com/buraksezer/olric.(*Olric).setOwnedPartitionCount(0x25ac000)
	/home/forest/go/pkg/mod/github.com/buraksezer/[email protected]/routing.go:452 +0x15c
github.com/buraksezer/olric.(*Olric).updateRoutingOperation(0x25ac000, 0x13915e8, 0x22281b0, 0x13915e8, 0x2228180)
	/home/forest/go/pkg/mod/github.com/buraksezer/[email protected]/routing.go:496 +0x33c
github.com/buraksezer/olric.(*Olric).requestDispatcher(0x25ac000, 0x13915e8, 0x22281b0, 0x13915e8, 0x2228180)
	/home/forest/go/pkg/mod/github.com/buraksezer/[email protected]/olric.go:265 +0xac
github.com/buraksezer/olric/internal/transport.(*Server).processMessage(0x22d8a80, 0xa694d568, 0x26cc690, 0x25fa6d4, 0x20cc540, 0x0, 0x0)
	/home/forest/go/pkg/mod/github.com/buraksezer/[email protected]/internal/transport/server.go:194 +0x294
github.com/buraksezer/olric/internal/transport.(*Server).processConn(0x22d8a80, 0xa694d568, 0x26cc690)
	/home/forest/go/pkg/mod/github.com/buraksezer/[email protected]/internal/transport/server.go:218 +0x108
created by github.com/buraksezer/olric/internal/transport.(*Server).listenAndServe
	/home/forest/go/pkg/mod/github.com/buraksezer/[email protected]/internal/transport/server.go:257 +0xd8

For your viewing pleasure, here is a small section of /home/forest/go/pkg/mod/github.com/buraksezer/[email protected]/internal/transport/server.go annotated with line numbers:

Better Examples Please

I am a beginner at configuring Caddy and have never tried Souin.

You have one example that tosses everything in. It's way too complex for a beginner to figure out; I feel like giving up.

It would be great to have a simple proxy cache server example.
Then a more complex one caching based on path or cookies.
Then a few other simple examples.
I could read each example by itself, and soon I would understand what is going on. Right now there are just too many new things to learn to figure it out quickly.
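
In that spirit, a hedged minimal example of caching a single path with its own TTL, assembled from the directive syntax documented above (hostname, path, and upstream are placeholders):

example.com {
    @assets path /assets/*
    handle @assets {
        cache {
            ttl 10m
        }
        reverse_proxy your-app:8080
    }
    reverse_proxy your-app:8080
}

This assumes the usual global order cache before rewrite shown in the other examples; only /assets/* responses are cached, everything else is proxied straight through.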

Missing response on cached paths

The cache misses on the first request and hits on the next; that's fine. But I no longer receive a response from the endpoint.

This is my Caddyfile.

{
	order cache before rewrite
	cache
}

:{$APP_PORT} {
	handle_path /api/* {
		reverse_proxy http://localhost:{$SERVER_PORT} {
			transport http {
				versions h2c
			}
		}
	}

	handle_path /api/v1/translate {
		cache {
			allowed_http_verbs POST
			ttl 24h
		}
	}
}

Logs from cURL:

*   Trying [::1]:7861...
* Connected to localhost (::1) port 7861
> POST /api/v1/translate? HTTP/1.1
> Host: localhost:7861
> User-Agent: curl/8.3.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 97
>
< HTTP/1.1 200 OK
HTTP/1.1 200 OK
< Cache-Control:
Cache-Control:
< Cache-Status: Souin; fwd=uri-miss; stored; key=POST-http-localhost:7861--718c30564a9d539ac393940896d588a095e1b2e1dea02edb7d36e62d9d40f757
Cache-Status: Souin; fwd=uri-miss; stored; key=POST-http-localhost:7861--718c30564a9d539ac393940896d588a095e1b2e1dea02edb7d36e62d9d40f757
< Server: Caddy
Server: Caddy
< Date: Tue, 19 Sep 2023 21:05:15 GMT
Date: Tue, 19 Sep 2023 21:05:15 GMT
< Content-Length: 0
Content-Length: 0

Logs from Caddy:

{
  "level":"info","ts":1695157539.7626615,
  "logger":"http.handlers.cache",
  "msg":"Serve from cache &{Method:POST URL:? Proto:HTTP/1.1 ProtoMajor:1 ProtoMinor:1 Header:map[Accept:[*/*] Content-Length:[97] Content-Type:[application/json] Date:[Tue, 19 Sep 2023 21:05:39 UTC] User-Agent:[curl/8.3.0]] Body:{Reader:{\n         \"text\": \"world!\",\n         \"source\": \"eng_Latn\",\n         \"target\": \"spa_Latn\"\n      }} GetBody:<nil> ContentLength:97 TransferEncoding:[] Close:false Host:localhost:7861 Form:map[] PostForm:map[] MultipartForm:<nil> Trailer:map[] RemoteAddr:172.17.0.1:45620 RequestURI:/? TLS:<nil> Cancel:<nil> Response:<nil> ctx:0xc000201a40}"
}
