
graphql-cache's People

Contributors

afuno, dependabot[bot], jdorfman, jeromedalbert, sumitseth, thebadmonkeydev, vincedevendra, yonasb



graphql-cache's Issues

Invalidating cached fields

Is there any way to invalidate cached fields? It would be great if, during a mutation, you could also invalidate fields that have been cached.
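A hypothetical sketch of what mutation-time invalidation could look like: graphql-cache doesn't document an invalidation API, but its keys follow a predictable "namespace:Type:field:object_key" shape (visible in the cache-miss logs elsewhere on this page), so a mutation could delete the relevant entries directly. The key format and helper below are assumptions, and a plain Hash stands in for the cache store.

```ruby
# Hypothetical helper: build a key in the "namespace:Type:field:object_key"
# shape that graphql-cache appears to use (this exact format is an assumption).
def field_cache_key(namespace, type, field, object_key)
  [namespace, type, field, object_key].join(":")
end

store = {} # stand-in for Rails.cache / Redis

key = field_cache_key("GraphQL::Cache", "Tag", "title", "tags/10")
store[key] = "Old title"

# Inside the mutation's resolver, after updating the record:
store.delete(key)

store.key?(key) # => false
```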

Warming caches?

This is a great gem. Thank you for releasing it!

I was curious if you had a good strategy for warming caches in the background. Ideally, we'd want to set the expiry on the graph to an hour and run an ActiveJob that warms/refreshes the cache every 30-40 minutes.

Anyone have thoughts on how to do this? I'm happy to write an enhancement but I'm unsure what that would be. Maybe a flag/option passed into a query to ignore and regenerate the cache?

The goal would be for someone calling the graph to rarely, if ever, end up with a cache miss.
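The warming idea above can be sketched in plain Ruby: a recurring job re-runs a known set of queries so readers rarely see a cold cache. `executor` here is a stand-in for `Schema.execute`; in a Rails app this class would be an ActiveJob whose `perform` re-enqueues itself every 30-40 minutes. The class and its API are illustrative, not part of the gem.

```ruby
# Hypothetical cache warmer: re-executing each query repopulates any cached
# fields before their expiry. `executor` stands in for Schema.execute.
class CacheWarmer
  def initialize(queries, executor)
    @queries  = queries
    @executor = executor
  end

  # Run every query once; in an ActiveJob this would be `perform`,
  # followed by re-enqueueing the job ahead of the cache expiry.
  def warm
    @queries.each { |q| @executor.call(q) }
  end
end

ran = []
warmer = CacheWarmer.new(["{ tags { title } }"], ->(q) { ran << q })
warmer.warm
ran # => ["{ tags { title } }"]
```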

[FEAT] Cache introspection queries

Is your feature request related to a problem? Please describe.
This project looks almost perfect for addressing some hotspots in our code!

Describe the solution you'd like
I would like to be able to apply caching to the introspection fields/queries. Introspection queries are currently the heaviest GraphQL query we face because we have a very large schema—but our schema doesn't change while the server is running. I plan on using a cache key of the build number + schema name in production.

It isn't clear to me how to attach caching to these automatically provided fields.

Additional context
We are open to other solutions besides caching at the field level, but we aren't keen on matching directly against the query text, because once we open up our GraphQL API to more 3rd parties we can't ensure they always introspect in the same way.

query IntrospectionQuery {
  __schema {
    queryType {
      name
    }
    mutationType {
      name
    }
    subscriptionType {
      name
    }
    types {
      ...FullType
    }
    directives {
      name
      description
      locations
      args {
        ...InputValue
      }
    }
  }
}

fragment FullType on __Type {
  kind
  name
  description
  fields(includeDeprecated: true) {
    name
    description
    args {
      ...InputValue
    }
    type {
      ...TypeRef
    }
    isDeprecated
    deprecationReason
  }
  inputFields {
    ...InputValue
  }
  interfaces {
    ...TypeRef
  }
  enumValues(includeDeprecated: true) {
    name
    description
    isDeprecated
    deprecationReason
  }
  possibleTypes {
    ...TypeRef
  }
}

fragment InputValue on __InputValue {
  name
  description
  type {
    ...TypeRef
  }
  defaultValue
}

fragment TypeRef on __Type {
  kind
  name
  ofType {
    kind
    name
    ofType {
      kind
      name
      ofType {
        kind
        name
        ofType {
          kind
          name
          ofType {
            kind
            name
            ofType {
              kind
              name
              ofType {
                kind
                name
              }
            }
          }
        }
      }
    }
  }
}
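The cache key proposed in the feature request (build number + schema name) can be a plain string: both inputs are fixed for the life of the process, so a cached introspection result would live exactly as long as one deployment. The helper and inputs below are placeholders, not part of the gem.

```ruby
# Hypothetical key builder for caching introspection results per deployment.
# Both inputs are constant while the server runs, so a stale entry is
# impossible within one build.
def introspection_cache_key(schema_name, build_number)
  "introspection:#{schema_name}:#{build_number}"
end

introspection_cache_key("PublicSchema", "1042") # => "introspection:PublicSchema:1042"
```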

Reduce setup friction

Ideas

  • Is there a way to get rid of the need to set field_class GraphQL::Cache::Field to allow for cache metadata?
    • don't use metadata and instead make a DSL method cached_field that takes options for cache metadata?
    • Override the field class in the GraphQL namespace (probably not a great idea, and likely to break with each change of the class in the graphql-ruby gem).
  • Generators or setup tasks for setting up the middleware?
  • Make user set schema in configuration and then add middleware/plugins/field integrations automatically on the schema object?

Support Rails 5.2 recyclable key caching

With Rails 5.2 and Rails.application.config.active_record.cache_versioning set to true (which is the default with config.load_defaults 5.2), some_model.cache_key will always have the same value, e.g. SomeModel/1234. The changing part is #cache_version, which by default is the value of updated_at.

This is the concept of recyclable cache keys.

We currently have:

      def guess_id
        return object.cache_key if object.respond_to?(:cache_key)
       ...

This means that, by default, caching would never expire for models in a new Rails 5.2 app. 🙀 I may be wrong, as I am still trying to wrap my head around how the version is handled exactly.

If so, we probably want to change that, either with @cache.write(object.cache_key, "bar", version: object.cache_version) / @cache.read(object.cache_key, version: object.cache_version) syntax, or maybe just @cache.write(object, "bar") / @cache.read(object) syntax (which would internally handle whether cache versioning is enabled, based on the Rails settings, respond_to?(:cache_version), etc.). Or maybe key.cache_key_with_version if object.respond_to?(:cache_key_with_version) as a first quick fix that won't use cache recycling.
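The quick fix floated above can be sketched in plain Ruby: prefer #cache_key_with_version (stable key plus version in one string) when the model responds to it, falling back to the plain #cache_key. An OpenStruct stands in for an ActiveRecord model here; the actual guess_id in the gem is more involved.

```ruby
require "ostruct"

# Sketch of a version-aware guess_id: models on Rails >= 5.2 with cache
# versioning enabled expose #cache_key_with_version, which embeds the
# updated_at-derived version so stale entries are never served.
def guess_id(object)
  if object.respond_to?(:cache_key_with_version)
    object.cache_key_with_version
  elsif object.respond_to?(:cache_key)
    object.cache_key
  end
end

model = OpenStruct.new(
  cache_key: "SomeModel/1234",
  cache_key_with_version: "SomeModel/1234-20190522154014"
)

guess_id(model) # => "SomeModel/1234-20190522154014"
```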

Caching resolvers?

👋 Hey there! I've been looking at implementing graphql-cache in my application, and I'm running into a little trouble. Fair warning that I'm not totally sure, especially after reading your tests, that I've set things up in an idiomatic and expected way, but I'm mostly using resolver classes to manage my queries:

class Query < ApplicationObject
  field :author, resolver: Resolvers::Author
  # ...etc.
end

class ApplicationObject < GraphQL::Schema::Object
  field_class GraphQL::Cache::Field
end

module Resolvers
  class Author < GraphQL::Schema::Resolver
    description 'A single Author by slug or UUID.'

    type Types::Author, null: true

    argument :id, ID, required: true

    def resolve(id:)
      Users::Author.find_by(slug: id) || Users::Author.find(id)
    end
  end
end

class Schema < GraphQL::Schema
  query ::Query

  middleware GraphQL::Cache::Middleware
end

What I'm finding, though, is that if I try to tack cache: true onto one of the resolvers' fields, I get a nastygram and a 500 from Rails:

no _dump_data is defined for class Proc
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:807:in `dump'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:807:in `marshaled_value'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:787:in `should_compress?'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:718:in `initialize'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:445:in `new'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:445:in `block in write'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:663:in `block in instrument'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/notifications.rb:170:in `instrument'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:663:in `instrument'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/activesupport-5.2.0/lib/active_support/cache.rb:444:in `write'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/graphql-cache-0.2.5/lib/graphql/cache/marshal.rb:52:in `write'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/graphql-cache-0.2.5/lib/graphql/cache/marshal.rb:37:in `read'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/graphql-cache-0.2.5/lib/graphql/cache.rb:59:in `fetch'
/usr/local/rvm/gems/ruby-2.6.0-preview2@structur/gems/graphql-cache-0.2.5/lib/graphql/cache/middleware.rb:58:in `call'

I was hoping you might have some insight into how I could get these fields to be cacheable, unless this is just a totally backasswards way to build a schema, in which case I can reconsider my end of things. Thanks in advance for any thoughts you have!

[BUG] Cannot use custom cache key on root-level field

Describe the bug
A custom cache key cannot be used on a root-level field because it is skipped when there is no parent object.

gem version:
>= 0.6

graphql-ruby version:
Any

Expected behavior
The root-level field should be cached according to the custom key provided in the field definition.

Not compatible with GraphQL-Ruby 1.9

Resolving dependencies.....
Bundler could not find compatible versions for gem "graphql":
  In snapshot (Gemfile.lock):
    graphql (= 1.9.2)

  In Gemfile:
    graphql

    graphql-cache was resolved to 0.2.1, which depends on
      graphql (~> 1.8.0.pre10)

Running `bundle update` will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.

Cache miss: (GraphQL::Cache:Tag:title:Tag:tags/10-20190522154014980856)

I tried to cache the title:

class Types::TagType < Types::BaseObject
  field :id, ID, null: false
  field :title, String, null: false, cache: true
end

But in the console I get a lot of messages of this type:

Cache miss: (GraphQL::Cache:Tag:title:Tag:tags/10-20190522154014980856)

How to fix it?

Caching connection types is broken against graphql-ruby 1.9

There was a change to the way relay connections are wrapped post resolution (or at least by the time we resolve the cache proc). Now, when there is a cache hit on a relay connection field, we need to ensure that the result coming back from our resolver is a proper subclass of GraphQL::Relay::BaseConnection.

When the GraphQL::Relay::EdgesInstrumenter runs, it receives an object it is not expecting and raises NoMethodError: undefined method `edge_nodes' for <class of cached object>.

Remove gemer

It was fun. But I did it the right way which also supports documentation whereas gemer did not and I really don't feel like adding it. Rip it out!

Refactor middleware structure

The structure of the gem has been a concern for me for a while. It was mostly hacked together based on how StackShare needed it to work. I'd like to refactor so that less data flows so deeply into our objects and so that our abstraction is cleaner and more easily digested.

How do I invalidate the cache if my object is changed

Is your feature request related to a problem? Please describe.
I am using graphql with graphql-cache.

But now I want to invalidate the cache if my object is changed

Describe the solution you'd like
It will be better if we can just call a method in the after_commit callback and invalidate the cache of related records

Cannot use graphql-batch and graphql-cache on the same field

Out of the box, if using both graphql-cache and graphql-batch on the same field, queries return an error:

{"error":"no _dump_data is defined for class Proc"}

which is the error graphql-cache raises when it doesn't know how to dump a resolved value to the cache store. I'll have to investigate whether it is even possible to combine these two gems (at least on the same field; my current belief is that when they are used on separate fields there is no conflict out of the box, but that will require validation).
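The error itself comes straight from Ruby's Marshal, which ActiveSupport::Cache uses to serialize values: a batch loader hands the cache a lazy promise (a Proc) instead of a plain value, and Procs cannot be marshaled. A minimal reproduction:

```ruby
# Marshal cannot serialize closures, which is exactly what a lazy/batched
# resolver yields before it is forced. This is the root of the error above.
error = begin
  Marshal.dump(proc { 42 })
  nil
rescue TypeError => e
  e.message
end

error # => "no _dump_data is defined for class Proc"
```

A fix would need to force the lazy value (resolve the promise) before handing it to the cache store.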

Support for field-level expiration settings

Not sure of the best API for this, but initially I was thinking something like this:

field :foo, Int, 'A complex calculated field', cache: { expires_in: 180.minutes }

There's also the possibility of using some custom DSL to allow for something like this:

field :foo, Int, 'A complex calculated field' do
  cache_expiry 180.minutes
end

Passing GQL query context into cache key generation

Hey all!

Super awesome gem, digging into it a bit, I don't readily see any way to access the GQL query context when generating a cache key or resolver.

field :stats, SegmentStatsType, "Stats for this segment", null: false, 
    cache: { key: :guid, expiry: 6.hours.to_i }

I have one segment which is user specific that I'd like to use the current user's id as part of the cache key, and it's not readily apparent how to achieve this.

field :stats, SegmentStatsType, "Stats for this segment", null: false,
    cache: { key: -> (obj) { obj.what_is_my_cache_key(context[:current_user]) }, expiry: 6.hours.to_i }

Digging into the source code, I don't think this is a supported use case yet unless I'm just missing it. I'm happy to help implement something to support this use case but would love some insight on to how you would imagine this working.

Thanks again for all the work you've put into this thus far, and for open sourcing it.
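This isn't supported by the gem today (per the issue), but the ask boils down to a two-argument key lambda that receives the query context alongside the object. Everything below is hypothetical, including #what_is_my_cache_key, which is stubbed with string interpolation.

```ruby
# Hypothetical two-argument key lambda: if graphql-cache passed context as a
# second argument, per-user keys would fall out naturally.
key_proc = ->(obj, context) { "#{obj[:guid]}:user-#{context[:current_user_id]}" }

segment = { guid: "seg-1" }          # stand-in for the segment object
ctx     = { current_user_id: 42 }    # stand-in for the GraphQL query context

key_proc.call(segment, ctx) # => "seg-1:user-42"
```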

Is there any functionality built in that invalidates/updates the cache upon updates of a field?

I noticed in testing that the cache sometimes automatically gets updated after insertions, but I don't know when the field in the cache gets updated and when it doesn't.

For example, doing just puts seems to get me the old, cached value, but running inspect seems to give an updated value?

puts Rails.cache.fetch("GraphQL::Cache:Query:restaurants:ids:#{de_moete.id}").inspect doesn't seem to cache the value, but without the inspect it does seem to get cached?

"cache if" feature

Similar to Rails views:

- cache_if (current_user.nil?) do

And actionpack-action_caching:

caches_action :show, if: -> { current_user.nil? }

It would be nice to be able to do something like this in graphql-cache:

field :calculated_field, Int, cache: { if: -> { current_user.nil? } }

Although the double-brackets might feel awkward, so it might look cleaner to do

field :calculated_field, Int do
  cache if: -> { current_user.nil? }
end

(which is related to my second bullet point on #52 referring to the ability to have cache on its own line)
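Whichever API shape wins, the resolution-time check it boils down to can be sketched in plain Ruby: when the condition returns false, bypass the cache and resolve directly. The helper and store below are stand-ins, not gem internals.

```ruby
# Sketch of conditional caching: `condition` decides per-request whether the
# cache is consulted at all (e.g. current_user.nil? in the examples above).
def fetch_if(store, key, condition, &resolve)
  return resolve.call unless condition.call # condition false: skip the cache
  store[key] ||= resolve.call               # condition true: normal cached path
end

store = {}
anonymous = -> { true } # stand-in for current_user.nil?

fetch_if(store, "calculated_field", anonymous) { 7 } # => 7
store["calculated_field"]                            # => 7 (cached)

logged_in = -> { false }
fetch_if(store, "other_field", logged_in) { 9 }      # => 9, nothing stored
```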

[FEAT] Release a new version with gql-batch fix

Is your feature request related to a problem? Please describe.
We were testing out graphql-cache installed from RubyGems and found out graphql-batch wasn't compatible with it. We ended up writing our own initializer to overcome the issue.
As I was going to contribute this back, I found vincedevendra@6a8dced, which already solved the issue. I would've noticed and saved some time if this had been included in the latest release.

Describe the solution you'd like
A new release including the graphql-batch fixes would fix it.

Additional context
Apologies if this is not exactly the right place to ask for it, but I wasn't sure where to ask

undefined method `force=' for GraphQL::Cache:Module (NoMethodError)

Rails 5.2.3 application with GraphQL 1.9.6.
The problem is in the configuration file; I had to comment out the offending line:

GraphQL::Cache.configure do |config|
  config.namespace = 'GraphQL::Cache' # Cache key prefix for keys generated by graphql-cache
  config.cache     = Rails.cache      # The cache object to use for caching
  config.logger    = Rails.logger     # Logger to receive cache-related log messages
  config.expiry    = 5400             # 90 minutes (in seconds)
  # config.force     = false            # Cache override, when true no caching takes place
end

Error on caching list of scalars

We assume that all list types are lists of custom objects by calling raw.map(&:object). When a scalar list is cached, an undefined method `object` error is thrown.
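A sketch of one possible guard for this bug: only unwrap list members that actually respond to #object, leaving scalars untouched. The Wrapper struct below is a stand-in for graphql-ruby's type wrappers, and the helper name is illustrative.

```ruby
# Stand-in for a wrapped object type as produced by graphql-ruby.
Wrapper = Struct.new(:object)

# Unwrap object-type list members; pass scalars (which have no #object)
# through unchanged, avoiding the NoMethodError described above.
def unwrap_list(raw)
  raw.map { |item| item.respond_to?(:object) ? item.object : item }
end

unwrap_list([1, "a", :sym])                   # => [1, "a", :sym]
unwrap_list([Wrapper.new(1), Wrapper.new(2)]) # => [1, 2]
```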

Add means of forcing through request

I'm curious if there's a way we can add a means to force cache misses with a request from the front end. For instance, with an object like a profile with a collection of objects... it is likely someone would want to cache that collection when reading in most cases, but would want a real-time result when adding those objects to the resource.

It looks like we may be able to support some kind of special context value like context[:force_cache] = true and skip the cache altogether when that value is set.
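The special-context-value idea can be sketched as a fetch that consults context[:force_cache] before reading the store, recomputing and overwriting on a forced request. Hashes stand in for both the query context and the cache store; nothing here is gem API.

```ruby
# Sketch of a force-aware fetch: a forced request always recomputes and
# overwrites the stored value, so subsequent normal reads see fresh data.
def fetch(store, key, context, &resolve)
  if context[:force_cache]
    store[key] = resolve.call   # forced: recompute and overwrite
  else
    store[key] ||= resolve.call # normal cached path
  end
end

store = { "profile:objects" => :stale }

fetch(store, "profile:objects", { force_cache: true }) { :fresh } # => :fresh
store["profile:objects"]                                          # => :fresh
```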

Advanced custom cache keys

I am not sure that this is a good idea. I will need to think more about it tomorrow, or about whether I even actually need to do this. But here is my idea dump right now for future reference.

My use case is that my custom cache key value is potentially complicated and would need:

  • a lambda/proc on multiple lines (not super elegant, but already possible?)

    field :some_field, String, cache: { key: ->(obj) {
      "#{obj.id}123blah"
    } } do
      ...
    end
  • its own line, e.g.

    field :calculated_field, Int do
      cache <complicated code block goes here>
    end
  • or even its own method within the enclosing Type class.

    field :calculated_field, Int, cache: { key: :custom_cache_key }

    where :custom_cache_key is first looked up in the instance of the enclosing type class that is defining this field (with respond_to?) and only then calls the parent object if it's not defined.

  • or just use Rails.cache.fetch blocks manually inside the resolver
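The lookup order in the third bullet can be sketched in plain Ruby: try the enclosing type instance first, then fall back to the wrapped parent object. Both receivers below are stubs, and the helper name is illustrative.

```ruby
# Sketch of the proposed key resolution: the type instance wins when it
# defines the method; otherwise the parent object is consulted.
def resolve_cache_key(key_name, type_instance, object)
  if type_instance.respond_to?(key_name)
    type_instance.public_send(key_name)
  elsif object.respond_to?(key_name)
    object.public_send(key_name)
  end
end

KeyHolder = Struct.new(:custom_cache_key)

type_with_key = KeyHolder.new("from-type")
plain_object  = KeyHolder.new("from-object")

resolve_cache_key(:custom_cache_key, type_with_key, plain_object) # => "from-type"
resolve_cache_key(:custom_cache_key, Object.new, plain_object)    # => "from-object"
```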

"singleton can't be dumped" exception when trying to cache a BatchLoader field

Hello. When using the BatchLoader gem to... batch load a field, the following error is returned when trying to cache it:


  singleton can't be dumped
    /home/billk/.gem/ruby/2.3.0/gems/activesupport-5.1.6/lib/active_support/cache.rb:665:in `dump'
    /home/billk/.gem/ruby/2.3.0/gems/activesupport-5.1.6/lib/active_support/cache.rb:665:in `dup_value!'
    /home/billk/.gem/ruby/2.3.0/gems/activesupport-5.1.6/lib/active_support/cache/memory_store.rb:130:in `write_entry'
    /home/billk/.gem/ruby/2.3.0/gems/activesupport-5.1.6/lib/active_support/cache.rb:400:in `block in write'
    ...

I haven't tested with https://github.com/Shopify/graphql-batch. Maybe there is a problem with lazy fields in general?

Document common errors and solutions in README

Is your feature request related to a problem? Please describe.
Reducing developer adoption/bug triage time

Describe the solution you'd like
We need to include some kind of FAQ or Common Errors section in the README or wiki

Additional context
This is coming about because several old issues (that were mostly user error) are still getting a lot of traffic indicating that people are still finding them by searching out a solution to their own instance of that issue.

Add query document caching

We should include the ability to cache queries based on the actual query document similar to HTTP caching. This would require creating a "plugin" for graphql-ruby. Plugins are run during query parsing, middleware are run during resolution.

[FEAT] Never expire cache

I have a scenario where I want my cache to never expire, letting Redis evict the less frequently used keys if memory reaches its cap. In my current case there is normally enough space to cache all possible queries for one specific service. That said, the library is very easy to use and works just fine, so thanks for putting this together.

Describe the solution you'd like
Pass a constant or some sort of value so the cache never expires. I couldn't find anywhere whether this is possible. Is there a way to do such a thing?
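The gem doesn't document a "never expire" value, so this is only a guess: if config.expiry is passed through to the store's TTL, nil might mean "no TTL" (as it does for Rails.cache), and a very large number of seconds is a safe fallback either way.

```ruby
GraphQL::Cache.configure do |config|
  config.expiry = nil             # assumption: forwarded to the store as "no TTL"
  # config.expiry = 10.years.to_i # fallback: effectively never in practice
end
```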

Allow "cache: true" for interface types

When I try to add cache: true on an interface, I get unknown keyword: cache.

Maybe the underlying field class used in an interface is different, which could be why it doesn't work.
