dipdup-io / metadata

Tezos TZIP-16/TZIP-12 metadata indexer

Home Page: https://ide.dipdup.io/?resource=https://metadata.dipdup.net/v1/graphql

Languages: Go 96.51%, Dockerfile 2.16%, Makefile 0.67%, PLpgSQL 0.66%
Topics: dipdup, indexer, metadata, tezos, tzip-12, tzip-16

metadata's People

Contributors

852kerfunkle, aopoltorzhicky, dependabot[bot], dmirgaleev, droserasprout, gdsoumya, m-kus, vvuwei

metadata's Issues

TokenInfo.token_id is fixed point decimal

Seems a bit odd to me, but maybe there's a specific reason?

Maybe uint64 makes more sense here. Decimal ends up as numeric rather than bigint in the GraphQL schema, which makes for some odd queries.
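To make the mapping concrete, here is a hypothetical, simplified sketch (not the repository's actual model): a decimal-typed column surfaces as numeric in the generated GraphQL schema, while a uint64-backed column would surface as bigint.

package models

// Hypothetical, simplified struct for illustration only; the real model
// lives in cmd/metadata/models and may look different.
type TokenInfo struct {
    Network  string
    Contract string

    // Current shape, roughly: a fixed-point decimal column, which the
    // GraphQL layer exposes as numeric.
    // TokenID decimal.Decimal

    // Proposed shape: an integer column, which would surface as bigint.
    TokenID uint64
}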

Add ability to limit block scope similar to what's done in dipdup

In dipdup, it's currently possible to limit the block scope via config using the following settings:

indexes:
  my_index:
    first_level: 1000000
    last_level: 2000000

For testing, it would be useful to be able to do the same thing in the metadata plugin. This would also limit the tester's S3 footprint, which is nice for cost savings!
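A minimal sketch of how the metadata plugin could honour analogous bounds; the field and function names below are assumptions for illustration, not the plugin's actual configuration keys or API.

package indexer

// LevelBounds mirrors dipdup's first_level/last_level settings.
type LevelBounds struct {
    FirstLevel uint64 `yaml:"first_level"`
    LastLevel  uint64 `yaml:"last_level"` // 0 means "no upper bound"
}

// InScope reports whether a block level falls inside the configured range.
func (b LevelBounds) InScope(level uint64) bool {
    if level < b.FirstLevel {
        return false
    }
    if b.LastLevel > 0 && level > b.LastLevel {
        return false
    }
    return true
}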

URL encoding issues

Some IPFS links won't resolve because the URL isn't encoded properly.

For example, ipfs://bafybeid5xl7cwdaelgh2rrxeicul6v7xbuwvtgiywklui43todymiyxx3y/TZLAND#247_FINALE.glb (just imagine this could be a JSON file for the sake of this ticket) will fail to fetch because the # isn't URL-encoded. Filenames containing spaces appear to work fine, though.

I tried fixing this, but I can't quite figure out where it goes wrong, and adding additional URL encoding causes spaces to be double-encoded (%2520).
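For reference, a sketch of escaping each path segment exactly once when turning an ipfs:// URI into a gateway URL; the function and parameter names are illustrative, not the indexer's actual code. url.PathEscape turns # into %23, and decoding any existing percent-encoding first avoids the %2520 double-encoding mentioned above.

package resolver

import (
    "net/url"
    "strings"
)

// buildGatewayURL converts an ipfs:// URI into an HTTP gateway URL,
// escaping each path segment exactly once.
func buildGatewayURL(gateway, ipfsURI string) string {
    raw := strings.TrimPrefix(ipfsURI, "ipfs://")
    parts := strings.Split(raw, "/")
    for i, part := range parts {
        // Decode any existing percent-encoding first so a pre-encoded "%20"
        // is not doubled into "%2520", then escape the segment exactly once
        // (this turns "#" into "%23" and a literal space into "%20").
        if decoded, err := url.PathUnescape(part); err == nil {
            part = decoded
        }
        parts[i] = url.PathEscape(part)
    }
    return strings.TrimSuffix(gateway, "/") + "/ipfs/" + strings.Join(parts, "/")
}

With the URI from this ticket, TZLAND#247_FINALE.glb comes out as TZLAND%23247_FINALE.glb, while a segment that already contains %20 stays single-encoded.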

Thanks!

GORM field tags are not being processed

The primary key and index specifications made using the gorm tag in the structs are not being processed when the tables are generated in Postgres.

Reference:
https://github.com/dipdup-net/metadata/blob/a4d18d7b27f576d6460a0c6d47aae1a2e8b555f9/cmd/metadata/models/token_metadata.go#L11-L27

According to the token_metadata struct, there should be a composite primary key on token_id, contract, and network, but when I describe the table it shows id as the only primary key. I am using PostgreSQL v14.1.
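For comparison, this is how a composite primary key is declared with gorm v2 tags and created through AutoMigrate; whether the indexer actually runs gorm's migrator (if it creates the tables some other way, the tags would simply never be read) is an assumption here, so treat this as a sketch.

package models

import "gorm.io/gorm"

// TokenMetadata declares a composite primary key by tagging every
// participating column with primaryKey.
type TokenMetadata struct {
    Network  string `gorm:"primaryKey"`
    Contract string `gorm:"primaryKey"`
    TokenID  uint64 `gorm:"primaryKey"`
    Metadata string
}

// migrate creates or alters the table so the composite key above actually
// exists in Postgres.
func migrate(db *gorm.DB) error {
    return db.AutoMigrate(&TokenMetadata{})
}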

Reading chunked IPFS responses fails

Heya,

Another issue I found is that some files on IPFS that are large enough to be chunked in the response fail to read.

The issue is in pool.go, in request:

return ioutil.ReadAll(io.LimitReader(resp.Body, pool.limit))

I assume using LimitReader instead of resp.Body.Read bypasses the chunked reading done by the http module internally.

Using:

buf := make([]byte, resp.ContentLength)
_, err := resp.Body.Read(buf)
return buf, err

Makes that work.

But my question: I'm not quite sure what the limit is meant to achieve. Is it to filter out large files? Are partial responses desired?
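For what it's worth: a single resp.Body.Read call is allowed to return fewer bytes than the buffer holds, and resp.ContentLength is -1 for chunked responses, so the ContentLength-sized buffer approach can be fragile. A sketch that still drains a chunked body while keeping a hard size cap (names are illustrative, not the repository's code):

package ipfs

import (
    "fmt"
    "io"
    "net/http"
)

// readBody drains a (possibly chunked) response body but rejects payloads
// larger than limit bytes instead of silently truncating them.
func readBody(resp *http.Response, limit int64) ([]byte, error) {
    // Read at most limit+1 bytes; seeing the extra byte tells us the body
    // was larger than the cap.
    data, err := io.ReadAll(io.LimitReader(resp.Body, limit+1))
    if err != nil {
        return nil, err
    }
    if int64(len(data)) > limit {
        return nil, fmt.Errorf("response body exceeds %d byte limit", limit)
    }
    return data, nil
}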

Resolved metadata not being saved into DB

There seems to be a logic bug in the code ref: https://github.com/dipdup-net/metadata/blob/d2ec49730fa5a32039eb493ac64aadb90546daea/cmd/metadata/service/token.go#L161

This line makes the saver function store metadata only when there are at least 10 pending tokens, which means metadata isn't stored at all until that condition is met. This is problematic in many ways, especially for contracts that don't have a high frequency of token mints. It also affects any previously unresolved/failed token mints, which can be blocked on this condition as well.

An easy fix would be to go back to the previous code, where a separate case with a ticker autosaved the data after some amount of time, irrespective of the token count.
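A sketch of what that ticker-based fallback could look like: flush when the batch reaches the size threshold or when the ticker fires, whichever comes first, so low-activity contracts are not starved. All names and signatures below are illustrative assumptions, not the repository's actual API.

package service

import (
    "context"
    "time"
)

// saver batches incoming items and flushes them either on a size threshold
// or on a periodic tick.
func saver[T any](ctx context.Context, in <-chan T, flush func([]T), size int, interval time.Duration) {
    ticker := time.NewTicker(interval)
    defer ticker.Stop()

    batch := make([]T, 0, size)
    for {
        select {
        case <-ctx.Done():
            if len(batch) > 0 {
                flush(batch)
            }
            return
        case item, ok := <-in:
            if !ok {
                if len(batch) > 0 {
                    flush(batch)
                }
                return
            }
            batch = append(batch, item)
            if len(batch) >= size {
                flush(batch)
                batch = make([]T, 0, size)
            }
        case <-ticker.C:
            // Periodic flush regardless of how many items are pending.
            if len(batch) > 0 {
                flush(batch)
                batch = make([]T, 0, size)
            }
        }
    }
}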

How to set up

Heya, this is looking amazing. Currently I'm indexing metadata in dipdup-py. It's a little tedious that way.

I suppose I drop my dipdup-py config into the build dir and run the container build? Or is there some way to reference an external config that I missed?

Thanks.
