
go-ds-s3's Introduction

S3 Datastore Implementation

This is an implementation of the datastore interface backed by Amazon S3.

NOTE: Plugins only work on Linux and macOS at the moment. You can track the progress of this issue here: golang/go#19282

Quickstart

  1. Grab a plugin release from the releases section matching your Kubo version and install the plugin file in ~/.ipfs/plugins.
  2. Follow the instructions in the plugin's README.md

Building and installing

The plugin can be manually built/installed for different versions of Kubo (starting with 0.23.0) with:

git checkout go-ds-s3-plugin/v<kubo-version>
make plugin
make install-plugin

Updating to a new version

  1. go get the Kubo release you want to build for. Make sure any other dependencies are aligned to what Kubo uses.
  2. make install and test.

If you are building against dist-released versions of Kubo, you need to build using the same version of go that was used to build the release (here).

If you are building against your own build of Kubo you must align your plugin to use it.

If you are updating this repo to produce a new version of the plugin:

  1. Submit a PR so that integration tests run
  2. Make a new tag go-ds-s3-plugin/v<kubo_version> and push it. This will build and release prebuilt plugin binaries.

Bundling

As Go plugins can be finicky to compile and install correctly, you may want to consider bundling this plugin and re-building Kubo. If you do it this way, you won't need to install the .so file in your local repo (i.e. you can skip the Building and installing section above), and you won't need to worry about getting all the versions to match up.

# We use go modules for everything.
> export GO111MODULE=on

# Clone kubo.
> git clone https://github.com/ipfs/kubo
> cd kubo

# Pull in the datastore plugin (you can specify a version other than latest if you'd like).
> go get github.com/ipfs/go-ds-s3@latest

# Add the plugin to the preload list.
> echo -en "\ns3ds github.com/ipfs/go-ds-s3/plugin 0" >> plugin/loader/preload_list

# Try to build kubo with the plugin (this first pass is expected to fail)
> make build

# Update the deptree
> go mod tidy

# Now rebuild kubo with the plugin
> make build

# (Optionally) install kubo
> make install

Contribute

Feel free to join in. All welcome. Open an issue!

This repository falls under the IPFS Code of Conduct.

Want to hack on IPFS?

License

MIT

go-ds-s3's People

Contributors

aschmahmann, biglep, dbarbashov, dependabot-preview[bot], galargh, gammazero, guseggert, hinshun, hsanjuan, iand, ianopolous, imjoshholloway, ipfs-mgmt-read-write[bot], jorropo, josiasbruderer, koxon, kubuxu, lio-wd, marten-seemann, masih, michaelmure, rach-id, ribasushi, shadowjonathan, stebalien, tobowers, web-flow, web3-bot, whyrusleeping, yuvipanda

go-ds-s3's Issues

plugin was built with a different version of package github.com/jbenet/goprocess

Hello, sorry if this question is a dupe.
I am using go-ds-s3 at checkout v0.4.23 and go-ipfs v0.4.23, which I downloaded from
https://dist.ipfs.io/go-ipfs/v0.4.23/go-ipfs_v0.4.23_linux-amd64.tar.gz.
However, I got the error below. Did I do something wrong?

root@karnetif01:~/go-ds-s3# make install
./set-target.sh v0.4.23
go build -asmflags=all=-trimpath="/root/work" -gcflags=all=-trimpath="/root/work" -buildmode=plugin -o "s3plugin.so" "plugin/main/main.go"
chmod +x "s3plugin.so"
Built against v0.4.23
install -Dm700 s3plugin.so "/root/.ipfs/plugins/go-ds-s3.so"
root@karnetif01:~/go-ds-s3# ipfs version
Error: error loading plugins: loading plugin /root/.ipfs/plugins/go-ds-s3.so: plugin.Open("/root/.ipfs/plugins/go-ds-s3"): plugin was built with a different version of package github.com/jbenet/goprocess

tag a release ?

With go-datastore released with the Sync() interface change, it would be handy to have this released as well.

CC @aschmahmann

Excessive Has()/GetSize() calls, caching?

I'm currently having a problem using this plugin with Backblaze B2 storage, which exposes an S3 API. Backblaze charges for "Class B" transactions, which include HeadObject; GetSize uses HeadObject, and Has() in turn uses GetSize.

As far as I know, Has() is called every time a bitswap request comes over the network, to check whether the block can be provided to the other node. At that rate the transactions quickly build up: I'm looking at a 300K transaction count just five hours after bringing this node up.

That in turn translates to significant extra monthly costs. Would it be possible to cache the keys available in the S3 bucket? For example, perform a ListObjects call every minute or so to update a local key list, validate every Has() call against that list (performing no calls to the S3 API), and have Put automatically add its key to the list.

Error: error loading plugins: loading plugin /data/ipfs/plugins/go-ds-s3.so: not built with cgo support

After

  1. cloning the repo
  2. running make build (producing s3plugin.so)
  3. running make install
  4. making the appropriate changes in both /data/ipfs/config

"Datastore": {
    "BloomFilterSize": 0,
    "GCPeriod": "1h",
    "HashOnRead": false,
    "Spec": {
      "mounts": [
        {
         "child": {
           "type": "s3ds",
           "region": "us-east-1",
           "bucket": "ipfs-test2-node",
           "accessKey": "******",
           "secretKey": "******"
        },
        "mountpoint": "/blocks",
        "prefix": "s3.datastore",
        "type": "measure"
       },
        {
          "child": {
            "compression": "none",
            "path": "datastore",
            "type": "levelds"
          },

and /data/ipfs/datastore_spec,

{"mounts":[{"bucket":"ipfs-test2-node","mountpoint":"/blocks","region":"us-east-1","rootDirectory":""},{"mountpoint":"/","path":"datastore","type":"levelds"}],"type":"mount"}

I restarted ipfs (sudo systemctl restart ipfs)

and checked that it didn't have errors with

systemctl status ipfs

and then got

Error: error loading plugins: loading plugin /data/ipfs/plugins/go-ds-s3.so: not built with cgo support


garbage collection seems to be broken

Garbage collection appears to be broken with this plugin.

I have a working implementation of things where I can add / pin content, list content, get the size of the repo etc.

However, running ipfs repo gc returns nothing and shows errors in the daemon logs (screenshot omitted).

No content is deleted from the s3 bucket when this happens.

Oddly, I can run ipfs block rm CID just fine, and the content is removed from the S3 bucket.

feat: optimize keys for S3 performance

When I was doing the Dumbo Drop project I hit most of the performance bottlenecks you can find in S3 and Dynamo. One thing I stumbled upon was a much better pattern for storing IPLD blocks in S3.

From the aws documentation.

your application can achieve at least 3,500 PUT/COPY/POST/DELETE or 5,500 GET/HEAD requests per second per prefix in a bucket

This statement isn't 100% honest. Most of the time you will not truly see this performance against every prefix, but it's a good window into how S3 is architected and what the performance constraints are.

We're in a very lucky situation: we can really optimize for this, because every block already has a randomized key you can use as a prefix. I've recently built two block storage backends for IPLD, and in both cases I used the CID as a prefix rather than as the final key, so something like {cid.toString()}/data, and the performance I was able to get was tremendous.

If you really hammer a bucket with writes this way, you'll see moments in which it's re-balancing in order to get more throughput. Once I had a few billion blocks in a single bucket, I aimed 2000+ Lambda functions at it, writing 1MB blocks; Lambda started having issues before I could saturate the bucket, which was reliably sustaining about 40GB/s of write throughput.

This library, and any other IPFS/IPLD storage backends for S3, should probably take the same approach.

go: error loading module requirements

All other dependencies installed correctly... Same result on both Linux amd64, arm64.

$ make go.mod IPFS_VERSION=v0.4.21
./set-target.sh v0.4.21
go: github.com/dgraph-io/[email protected]+incompatible: go.mod has post-v2 module path "github.com/dgraph-io/badger/v2" at revision v2.0.0-rc.2
go: error loading module requirements
Makefile:20: recipe for target 'go.mod' failed
make: *** [go.mod] Error 1

go version go1.12.7 linux/amd64

Question: does/could the datastore support multiple ipfs instances?

Hi there,

Would the datastore support concurrent access (either rw or ro) to a given bucket? That is to say, can two distinct ipfs processes (be they on the same computer, in containers, different VMs, etc.) share a bucket? In the js-ipfs implementation they apparently use an object within the bucket as a lockfile, but I've had a quick look at the code and don't really see why that would be the case here (though I am no Go programmer).

Any thoughts/guidance would be gratefully received!

Sharding?

By default, go-ipfs provides a sharding option for the datastore. When using this plugin the datastore is not being sharded.

As described in previous issues, the serialization in the datastore_spec is not 1:1: when I try to add a shardFunc, it results in an error.

Is there a way to achieve sharding for the data stored in S3?

Update S3 SDK to v2

Version 2 of the Amazon S3 SDK is now modularized - see aws/aws-sdk-go-v2#444

Updating to it should massively reduce the size of this plugin and may have performance benefits too. This would solve #81.

Is this working for IPFS v0.6.0?

I'm trying to install this plugin for IPFS v0.6.0 but it's not working for me. Is it updated to work for this version yet?

When I try to bundle I get this...

make build
go version go1.14.4 linux/amd64
bin/check_go_version 1.14.2
plugin/loader/preload.sh > plugin/loader/preload.go
go fmt plugin/loader/preload.go >/dev/null
go build  "-asmflags=all='-trimpath='" "-gcflags=all='-trimpath='" -ldflags="-X "github.com/ipfs/go-ipfs".CurrentCommit=5a1953922-dirty" -o "cmd/ipfs/ipfs" "github.com/ipfs/go-ipfs/cmd/ipfs"
/usr/local/go/pkg/tool/linux_amd64/link: signal: killed
make: *** [cmd/ipfs/Rules.mk:22: cmd/ipfs/ipfs] Error 1

When I try to install as a plugin I get errors when trying to ipfs init...

make IPFS_VERSION=v0.6.0 install
./set-target.sh v0.6.0
go build  -asmflags=all=-trimpath="" -gcflags=all=-trimpath="" -buildmode=plugin -o "s3plugin.so" "plugin/main/main.go"
chmod +x "s3plugin.so"
Built against v0.6.0
install -Dm700 s3plugin.so "/root/.ipfs/plugins/go-ds-s3.so"
ipfs init --profile server
Error: error loading plugins: loading plugin /root/.ipfs/plugins/go-ds-s3.so: plugin.Open("/root/.ipfs/plugins/go-ds-s3"): plugin was built with a different version of package github.com/jbenet/goprocess

Am I doing something wrong or is it just not ready yet?

DOSsing S3 with GetSize calls even with bloom filter

Our S3-based ipfs instances have been flooding S3 with GetSize requests. We're seeing a constant 5Mb/s of requests to S3 while we are not using it. It seemed to start around August 21st, 2020. The only correlated event I can think of that may be significant is the Filecoin Space Race launch. The log looks like:

2021-06-16T10:46:43.995Z	ERROR	engine	blockstore.GetSize(QmUCRuDoHafwt4CCkWy6gr45a6voofy5b4JktWstbW18NQ) error: ServiceUnavailable: Service Unavailable
	status code: 503, request id: tx000000000000000000000-0000000000-0000-default, host id: 
2021-06-16T10:46:44.025Z	ERROR	engine	blockstore.GetSize(Qmets7C6YhHgvm5BvWPT9dyXd1U6JLMUekhJwNm2P7acTd) error: ServiceUnavailable: Service Unavailable
	status code: 503, request id: tx000000000000000000000-0000000000-0000-default, host id: 
2021-06-16T10:46:44.056Z	ERROR	engine	blockstore.GetSize(QmcSB5Exmymh3P9stGFKvJyWP8XPp4L92FpXN8mAT8hkxT) error: ServiceUnavailable: Service Unavailable
	status code: 503, request id: tx000000000000000000000-0000000000-0000-default, host id:

We have the bloom filter configured with:

"BloomFilterSize": 268435456,

The S3 bucket in question has ~ 900k blocks in it. The S3 service rate limits us to 750 req/s/IP address.

My understanding is incoming requests will be served even before the bloom filter is built on startup. Is it possible that this is flooding S3 and preventing the bloom filter from being built at all, and thus the flood continues? Is there any other way to stop this?

more detailed documentation ?

I was trying to test this plugin, but I can't find a way to load it properly in go-ipfs. I tried various things: using a go-ipfs binary from dist.ipfs.io, compiling go-ipfs myself, changing various import paths in this plugin, injecting this plugin as an internal one, etc.

The last error I get is:

ERROR cmd/ipfs: error loading plugins: loading plugin /home/michael/.ipfs/plugins/s3plugin.so: plugin.Open("/home/michael/.ipfs/plugins/s3plugin"): plugin was built with a different version of package gx/ipfs/QmSF8fPo3jgVBAy8fpdjjYqgG87dkJgUprRBHRd2tmfgpP/goprocess main.go:65

Could you describe the build and install process in a bit more detail? Could you also provide an example configuration?

Thanks !

Multiple IPFS instances one S3 bucket. Is this possible?

I want to set up multiple IPFS instances around the world that share the same S3 datastore.

Only one IPFS instance would need to be able to write to the S3 bucket. The rest will only read the data on S3 and serve the content via IPFS.

Is this possible with the S3 datastore? If so how would I go about doing this?

I've got two VPSes with IPFS and the S3 plugin set up to use the same bucket. I add and pin a file on one, then shut that IPFS server down.

I then try to receive the file from an IPFS client bootstrapped to the second IPFS server attached to the S3 bucket and nothing happens.

x509 Cannot validate certificate

I've set up an S3-compatible object store and connected my ipfs client to it following the instructions.

However when I run ipfs daemon

I get this error:

x509: cannot validate certificate for 192.168.X.X because it doesn't contain any IP SANs

Here is the full error:

2021-03-19T21:53:19.983Z        ERROR   cmd/ipfs        error from node construction: could not build arguments for function "reflect".makeFuncStub (reflect/asm_amd64.s:12): failed to build *mfs.Root: received non-nil error from function "github.com/ipfs/go-ipfs/core/node".Files (github.com/ipfs/[email protected]/core/node/core.go:106): failure writing to dagstore: RequestError: send request failed
caused by: Head "https://192.168.X.X:443/ipfsbucket/bucketsubdirectory/CIQFTFEEHEDF6KLBT32BFAGLXEZL4UWFNWM4LFTLMXQBCERZ6CMLX3Y": x509: cannot validate certificate for 192.168.X.X because it doesn't contain any IP SANs

Any pointers as to what may be causing this issue?

IPFS daemon error

Hello,

I followed steps to bundle go-ds-s3 into go-ipfs. Then I run ipfs init, update the config and datastore_spec files.
Then I run ipfs daemon and encounter this error

2021-10-28T08:56:47.862+0700	ERROR	cmd/ipfs	ipfs/daemon.go:422	error from node construction: could not build arguments for function "reflect".makeFuncStub (/usr/local/Cellar/go/1.16.5/libexec/src/reflect/asm_amd64.s:14): failed to build *mfs.Root: received non-nil error from function "github.com/ipfs/go-ipfs/core/node".Files (/Users/khoanguyen/hawking/go-ipfs/core/node/core.go:112): failure writing to dagstore: Forbidden: Forbidden
	status code: 403, request id: 06H1GESW1CNNPRY9, host id: zMyLTE9pnOkk/7mbtZQVeLFajmp2PAQmXXZoyBTt4jMZAtI+qttAgqX9YlVMkeATq9Tv4sQK9pM=

Error: could not build arguments for function "reflect".makeFuncStub (/usr/local/Cellar/go/1.16.5/libexec/src/reflect/asm_amd64.s:14): failed to build *mfs.Root: received non-nil error from function "github.com/ipfs/go-ipfs/core/node".Files (/Users/khoanguyen/hawking/go-ipfs/core/node/core.go:112): failure writing to dagstore: Forbidden: Forbidden
	status code: 403, request id: 06H1GESW1CNNPRY9, host id: zMyLTE9pnOkk/7mbtZQVeLFajmp2PAQmXXZoyBTt4jMZAtI+qttAgqX9YlVMkeATq9Tv4sQK9pM=

Any idea why this happens?

Thanks in advance.

Dependabot can't resolve your Go dependency files

Dependabot can't resolve your Go dependency files.

As a result, Dependabot couldn't update your dependencies.

The error Dependabot encountered was:

gopkg.in/[email protected]: unrecognized import path "gopkg.in/check.v1" (parse https://gopkg.in/check.v1?go-get=1: no go-import meta tags ())

If you think the above is an error on Dependabot's side please don't hesitate to get in touch - we'll do whatever we can to fix it.

View the update logs.

Although Data Is Persisted Across Node Reboots, It Does Not Persist In The Pin Store

I've been experimenting with this plugin running an IPFS node, and have noticed that although I may restart my IPFS node using this datastore and the data does remain stored long-term, subsequent queries such as ipfs pin ls indicate that nothing is pinned, which doesn't match what I'm seeing within my S3 datastore (minio).

I think this might be in part caused by #2 but I'm not sure what the actual issue is.

Getting error of `Error: unknown datastore type: s3ds` with a clean new ipfs node

I am playing around with the go-ds-s3 implementation and having an issue bringing up an ipfs node with s3ds as the datastore on a brand-new ipfs instance:
Error: unknown datastore type: s3ds

IPFS Specs:

go-ipfs version: 0.9.0
Repo version: 11
System version: amd64/linux
Golang version: go1.16.5

Below are the steps I have taken:

  1. Turned on modules

export GO111MODULE=on

  2. Cloned go-ipfs, pulled in the datastore plugin, and added the plugin to the preload list
git clone https://github.com/ipfs/go-ipfs
cd go-ipfs
go get github.com/ipfs/go-ds-s3@latest
echo "s3ds github.com/ipfs/go-ds-s3/plugin 0" >> plugin/loader/preload_list
  3. Rebuilt go-ipfs with the plugin and installed go-ipfs
make build
make install
  4. Ran ipfs init, edited Datastore in the config file (/.ipfs/config), edited datastore_spec
  • Config file
  "Datastore": {
    "StorageMax": "10GB",
    "StorageGCWatermark": 90,
    "GCPeriod": "1h",
    "Spec": {
      "mounts": [
        {
          "child": {
            "type": "s3ds",
            "region": "us-east-2",
            "bucket": "test-bucket-2",
            "rootDirectory": "test_root",
            "accessKey": "",
            "secretKey": ""
          },
          "mountpoint": "/blocks",
          "prefix": "s3.datastore",
          "type": "measure"
        },
        {
          "child": {
            "compression": "none",
            "path": "datastore",
            "type": "levelds"
          },
          "mountpoint": "/",
          "prefix": "leveldb.datastore",
          "type": "measure"
        }
      ],
      "type": "mount"
    },
    "HashOnRead": false,
    "BloomFilterSize": 0
  },
  • datastore_spec
{
  "mounts": [
    {
      "bucket": "test-bucket-2",
      "mountpoint": "/blocks",
      "region": "us-east-2",
      "rootDirectory": "test_root"
    },
    {
      "mountpoint": "/",
      "path": "datastore",
      "type": "levelds"
    }
  ],
  "type": "mount"
}
  5. Took the node online

ipfs daemon

  • Getting error
Error: unknown datastore type: s3ds

Dependabot can't resolve your Go dependency files

Dependabot can't resolve your Go dependency files.

As a result, Dependabot couldn't update your dependencies.

The error Dependabot encountered was:


If you think the above is an error on Dependabot's side please don't hesitate to get in touch - we'll do whatever we can to fix it.

View the update logs.

Unable to load plugin

I've built this plugin and used it with ipfs v0.4.23 and it works fine. However, when I rebuild ipfs v0.4.23 with the 2-line diff from ipfs/go-ipfs-pinner#2 I get the following error:

Error: error loading plugins: loading plugin /tmp/peergos9005038333179153292/.ipfs/plugins/go-ds-s3.so: plugin.Open("/tmp/peergos9005038333179153292/.ipfs/plugins/go-ds-s3"): plugin was built with a different version of package github.com/jbenet/goprocess

@Stebalien Any idea what could be happening here? That goprocess repo hasn't changed in almost a year.

ipfs load go-ds-s3 error

I cloned the latest go-ipfs and the latest go-ds-s3 release, then ran make build and make install, but when I run ipfs init this error comes out:

" Error: error loading plugins: loading plugin /root/.ipfs/plugins/go-ds-s3.so: plugin.Open("/root/.ipfs/plugins/go-ds-s3"): plugin was built with a different version of package internal/unsafeheader"

I don't understand this error. The "ReadMe" of go-ds-s3 says:
git clone https://github.com/ipfs/go-ipfs

cd go-ipfs

Pull in the datastore plugin (you can specify a version other than latest if you'd like).

go get github.com/ipfs/go-ds-s3@latest

Add the plugin to the preload list.

echo "s3ds github.com/ipfs/go-ds-s3/plugin 0" >> plugin/loader/preload_list

Rebuild go-ipfs with the plugin

make build

(Optionally) install go-ipfs

make install

I followed exactly those steps, so why do I get this error?

Invalid ELF header

Hello,

I built go-ds-s3.so using make install, with the destination directory being the mounted volume of a go-ipfs Docker container. We're running make install on macOS.

I got this error when trying to start:

Error: error loading plugins: loading plugin /data/ipfs/plugins/go-ds-s3.so: plugin.Open("/data/ipfs/plugins/go-ds-s3.so"): /data/ipfs/plugins/go-ds-s3.so: invalid ELF header

Please help me resolve this. Many thanks.

Error building/installing plugin

Hello,

Attempting to follow the Bundling steps in the README results in the following for me during the make build step:

go build "-asmflags=all='-trimpath='" "-gcflags=all='-trimpath='" -ldflags="-X "github.com/ipfs/go-ipfs".CurrentCommit=7a4752ac9-dirty" -o "cmd/ipfs/ipfs" "github.com/ipfs/go-ipfs/cmd/ipfs"
# github.com/ipfs/go-ds-s3/plugin
../../go/pkg/mod/github.com/ipfs/[email protected]/plugin/s3ds.go:13:2: cannot use &S3Plugin literal (type *S3Plugin) as type plugin.Plugin in slice literal:
        *S3Plugin does not implement plugin.Plugin (wrong type for Init method)
                have Init() error
                want Init(*plugin.Environment) error
make: *** [cmd/ipfs/Rules.mk:22: cmd/ipfs/ipfs] Error 2

This does not appear to be environment-related, though here is my system information anyway to save us a few steps:

GO111MODULE="on"
GOARCH="amd64"
GOBIN=""
GOCACHE="/home/taigrr/.cache/go-build"
GOENV="/home/taigrr/.config/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GONOPROXY=""
GONOSUMDB=""
GOOS="linux"
GOPATH="/home/taigrr/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/usr/lib/go"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/usr/lib/go/pkg/tool/linux_amd64"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD="/home/taigrr/ipfs/go-ipfs/go.mod"
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build151506377=/tmp/go-build -gno-record-gcc-switches"

unknown authority

Hi, I got this problem:
ERROR cmd/ipfs ipfs/daemon.go:344 error from node construction: could not build arguments for function "reflect".makeFuncStub (/usr/lib/go/src/reflect/asm_amd64.s:12): failed to build *mfs.Root: received non-nil error from function "github.com/ipfs/go-ipfs/core/node".Files (pkg/mod/github.com/ipfs/[email protected]/core/node/core.go:108): failure writing to dagstore: RequestError: send request failed
caused by: Head https://s3-hcm-r1.longvan.net/chinsu/test/CIQFTFEEHEDF6KLBT32BFAGLXEZL4UWFNWM4LFTLMXQBCERZ6CMLX3Y: x509: certificate signed by unknown authority

datastore_spec:

{"mounts":[{"bucket":"chinsu","mountpoint":"/blocks","region":"us-east-1","rootDirectory":"test"},{"mountpoint":"/","path":"datastore","type":"levelds"}],"type":"mount"}

ipfs config:

"Spec": {
  "mounts": [
    {
      "child": {
        "type": "s3ds",
        "region": "us-east-1",
        "bucket": "chinsu",
        "rootDirectory": "test",
        "regionEndpoint": "s3-hcm-r1.longvan.net",
        "accessKey": "key",
        "secretKey": "key"
      },
      "mountpoint": "/blocks",
      "prefix": "s3.datastore",
      "type": "measure"
    },
    {
      "child": {
        "compression": "none",
        "path": "datastore",
        "type": "levelds"
      },
      "mountpoint": "/",
      "prefix": "leveldb.datastore",
      "type": "measure"
    }
  ],
  "type": "mount"
},

Please help me, Thanks.

DO Spaces: Occasional 'Service Unavailable'

Hey! I have a go-ipfs server with the S3 plugin bundled in. As storage, I use DigitalOcean Spaces, and it works fine except that I see occasional 'Service Unavailable' errors in the ipfs logs. It's really unclear whether those are errors from the DO side or some kind of plugin error.

If I do 'ipfs cat' to one of the mentioned resources, I can easily fetch it.
Any ideas what might be wrong?

Dec  8 00:30:12 IPFS-node-03 ipfs[23909]: 2021-12-08T00:30:12.243Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(bafk2bzacebdnt6zlyjpslvmctnbpdrk6gzh3gibsqhzwty2s2jzitgcvb45yo) error: ServiceUnavailable: Service Unavailable
Dec  8 00:37:08 IPFS-node-03 ipfs[23909]: 2021-12-08T00:37:08.899Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(bafk2bzacecrb45xhu3fzlrzq7ea2wun7hgdcpo5obxjzejawllsvoxvisqkw2) error: ServiceUnavailable: Service Unavailable
Dec  8 00:44:12 IPFS-node-03 ipfs[23909]: 2021-12-08T00:44:12.650Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(bafkreid2hlsyidvc6p2ffsyw7bpyyqctiwkc6opsyaao5bvpiepjazc32e) error: ServiceUnavailable: Service Unavailable
Dec  8 01:35:04 IPFS-node-03 ipfs[23909]: 2021-12-08T01:35:04.099Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(bafkreicqhjcdpuznlir2ebumbciwxcqdklhiw6l77nkfhyfwnqxp4m4xum) error: ServiceUnavailable: Service Unavailable
Dec  8 02:15:10 IPFS-node-03 ipfs[23909]: 2021-12-08T02:15:10.814Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(bafk2bzacecdbdg6vgjgkbwqqbhtjqynrprzmam4ubxgjzeie4b6fha5h2plhc) error: ServiceUnavailable: Service Unavailable
Dec  8 02:56:52 IPFS-node-03 ipfs[23909]: 2021-12-08T02:56:52.556Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(bafk2bzaceaufcredcjmmrhqi4fkff4vgrbepsileb4gs6u4njzwfet2ldb6k6) error: ServiceUnavailable: Service Unavailable
Dec  8 04:24:25 IPFS-node-03 ipfs[23909]: 2021-12-08T04:24:25.678Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(QmZJohFaHKfaHv23SPRaujZf5h9UguE59uN8GjQfyRY4uG) error: ServiceUnavailable: Service Unavailable
Dec  8 04:41:25 IPFS-node-03 ipfs[23909]: 2021-12-08T04:41:25.447Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(QmStcUqCkc4vcpZFbEJxZ3gpENpYFdFTdGRXMrsdGViP92) error: ServiceUnavailable: Service Unavailable
Dec  8 04:49:28 IPFS-node-03 ipfs[23909]: 2021-12-08T04:49:28.713Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(QmaM4dMnXRrc3Nanb2TBYcg9cu2TCStFw3LSx5Duwj1QxW) error: ServiceUnavailable: Service Unavailable
Dec  8 07:37:42 IPFS-node-03 ipfs[23909]: 2021-12-08T07:37:42.284Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(QmPjnA8NDFgmRSth3CsJBv2rCkVW6CFepZScpsPRhnrWWx) error: ServiceUnavailable: Service Unavailable
Dec  8 07:44:51 IPFS-node-03 ipfs[23909]: 2021-12-08T07:44:51.634Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(QmbkZUSM6itZGgGyF6xfarqDN8B5PmWjCJJeewFZjsmSAh) error: ServiceUnavailable: Service Unavailable
Dec  8 08:37:50 IPFS-node-03 ipfs[23909]: 2021-12-08T08:37:50.750Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(QmPjdoRXh57xH9KsfrHUYFin2LhfjrUYcwiYywGa9BHdLT) error: ServiceUnavailable: Service Unavailable
Dec  8 08:43:09 IPFS-node-03 ipfs[23909]: 2021-12-08T08:43:09.966Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(Qmb3DxUyY986Nc7rx8kpUHXrCtM6LrH3radEPvnrhUJhBm) error: ServiceUnavailable: Service Unavailable
Dec  8 09:23:16 IPFS-node-03 ipfs[23909]: 2021-12-08T09:23:16.739Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(QmPAP3u9oPkA3Xd1FZXiBdNic2L3TbRYFgfj8zSPVyEET3) error: ServiceUnavailable: Service Unavailable
Dec  8 10:01:01 IPFS-node-03 ipfs[23909]: 2021-12-08T10:01:01.629Z#011ERROR#011engine#011decision/blockstoremanager.go:92#011blockstore.GetSize(Qmb2gKL2k3EjAeqHaASdKGA7wKcmELsaaEuSbHaiSErZDs) error: ServiceUnavailable: Service Unavailable
root@IPFS-node-03:/var/log# ipfs version --all
go-ipfs version: 0.11.0-dev-5a61bedef-dirty
Repo version: 11
System version: amd64/linux
Golang version: go1.16.8

config (datastore part):

  "Datastore": {
    "BloomFilterSize": 0,
    "GCPeriod": "1h",
    "HashOnRead": false,
    "Spec": {
      "mounts": [
        {
          "child": {
            "accessKey": "XXX",
            "bucket": "bucket",
            "region": "sfo2",
            "regionEndpoint": "sfo2.digitaloceanspaces.com",
            "rootDirectory": "ipfs",
            "secretKey": "XXX",
            "type": "s3ds"
          },
          "mountpoint": "/blocks",
          "prefix": "s3.datastore",
          "type": "measure"
        },
        {
          "child": {
            "compression": "none",
            "path": "datastore",
            "type": "levelds"
          },
          "mountpoint": "/",
          "prefix": "leveldb.datastore",
          "type": "measure"
        }
      ],
      "type": "mount"
    },
    "StorageGCWatermark": 90,
    "StorageMax": "100GB"
  },

trouble building for 0.4.22

After building ipfs and go-ds-s3, when I try to launch ipfs I get an error:
plugin was built with a different version of package github.com/jbenet/goprocess

My steps:

cd go-ipfs/
git checkout v0.4.22
make build IPFS_VERSION=v0.4.22
cp cmd/ipfs/ipfs /usr/local/bin/ipfs
ipfs version
cd ..
git clone https://github.com/ipfs/go-ds-s3
cd go-ds-s3
make go.mod IPFS_VERSION=v0.4.22
make build IPFS_VERSION=v0.4.22
mkdir -p ~/.ipfs/plugins
cp s3plugin.so ~/.ipfs/plugins/go-ds-s3.so
ipfs version

Environments:

GOARCH="amd64"
GOHOSTOS="linux"
GOPATH="/home/alex/go"
GOROOT="/usr/local/go"

Could you write out the steps for building the S3 plugin correctly?

Implement prefixed queries

The basic blockstore wraps its datastore in a "namespace" datastore that prefixes all keys with /blocks/. It also adds this prefix to all calls to Query. This means that calls to AllKeysChan on blockstores stored in S3 always fail, because this package doesn't support prefixes in queries.

So, this package needs to support them, even if "support" means filtering client side.

trouble building for 0.4.21

A new attempt to build this, but I still have trouble:

$ make go.mod IPFS_VERSION=v0.4.21
./set-target.sh v0.4.21
error writing go.mod: open /home/michael/go/pkg/mod/github.com/ipfs/go-ipfs@v0.4.21/go.mod883788688.tmp: permission denied
go mod edit: no flags specified (see 'go help mod edit').
make: *** [Makefile:13: go.mod] Error 1

Basically what's happening is that /home/michael/go/pkg/mod/**/**/* is read-only on my system (Debian), for a good reason IMHO. This is not something I configured specifically; it's how go modules normally behave.

I tried to remove the set-target script from the makefile and update go.mod myself:

$ make build
go build -buildmode=plugin -i -o "s3plugin.so" "plugin/main.go"
go build encoding: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/encoding.a: permission denied
go build unicode/utf16: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/unicode/utf16.a: permission denied
go build container/list: mkdir /usr/lib/go-1.12/pkg/linux_amd64_dynlink/container/: permission denied
go build crypto/internal/subtle: mkdir /usr/lib/go-1.12/pkg/linux_amd64_dynlink/crypto: permission denied
go build crypto/subtle: mkdir /usr/lib/go-1.12/pkg/linux_amd64_dynlink/crypto/: permission denied
go build internal/x/crypto/cryptobyte/asn1: mkdir /usr/lib/go-1.12/pkg/linux_amd64_dynlink/internal/x: permission denied
go build internal/nettrace: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/internal/nettrace.a: permission denied
go build internal/x/net/dns/dnsmessage: mkdir /usr/lib/go-1.12/pkg/linux_amd64_dynlink/internal/x: permission denied
go build internal/testlog: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/internal/testlog.a: permission denied
go build internal/x/crypto/curve25519: mkdir /usr/lib/go-1.12/pkg/linux_amd64_dynlink/internal/x: permission denied
go build io: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/io.a: permission denied
go build math/rand: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/math/rand.a: permission denied
go build sort: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/sort.a: permission denied
go build internal/singleflight: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/internal/singleflight.a: permission denied
go build internal/syscall/unix: mkdir /usr/lib/go-1.12/pkg/linux_amd64_dynlink/internal/syscall/: permission denied
go build time: open /usr/lib/go-1.12/pkg/linux_amd64_dynlink/time.a: permission denied
make: *** [Makefile:13: s3plugin.so] Error 1

It seems to be a problem with the -i flag of go build -buildmode=plugin -i -o "s3plugin.so" "plugin/main.go", in a similar fashion to golang/go#27285. Again, this attempts to overwrite the files shipped with go, which is a bad idea.

After removing the -i flag, the build finally succeeds:

$ make install
go build -buildmode=plugin -o "s3plugin.so" "plugin/main.go"
chmod +x "s3plugin.so"
Built against v0.4.21
install -Dm700 s3plugin.so "/home/michael/.ipfs/plugins/go-ds-s3.so"

Alas, loading the plugin fails:

$ ipfs version
Error: error loading plugins: loading plugin /home/michael/.ipfs/plugins/go-ds-s3.so: plugin.Open("/home/michael/.ipfs/plugins/go-ds-s3"): plugin was built with a different version of package github.com/ipfs/go-ipfs/filestore/pb

Note that I compiled the go-ipfs binary myself, on the same machine, in the same environment, exactly from the v0.4.21 tag.

Any help?

Error loading plugin on ipfs v0.7.0

I've built this against the pre-built go-ipfs v0.7.0 with

make install IPFS_VERSION=v0.7.0

My go version matches that used to build go-ipfs: https://dist.ipfs.io/go-ipfs/v0.7.0/build-info

go version go1.14.4 linux/amd64

When I try to load the plugin it errors with:

Error: error loading plugins: loading plugin /tmp/peergos15169824593847595019/.ipfs/plugins/go-ds-s3.so: plugin.Open("/tmp/peergos15169824593847595019/.ipfs/plugins/go-ds-s3"): plugin was built with a different version of package internal/cpu

plugin was built with a different version of package github.com/ipfs/go-datastore/query

Hello, I've checked all the existing issues and none of them helped me.

specification:

ipfs version v0.10.0
go version 1.16.8
go-ds-s3 - latest master

Error: error loading plugins: loading plugin /data/ipfs/plugins/go-ds-s3.so: plugin.Open("/data/ipfs/plugins/go-ds-s3"): plugin was built with a different version of package github.com/ipfs/go-datastore/query

Has anyone found a solution?
I've been trying to fix this for 3 days. I tried other versions, adding the IPFS version to the env, and a lot of other things, but it failed anyway.

P.S.
https://github.com/ipfs/go-ipfs/blob/master/docs/plugins.md#preloaded-plugins
and
https://github.com/ipfs/go-ds-s3#building-and-installing

These did not help; I followed them step by step, but I still get the error.

interface {} is nil, not string

Hi, I am attempting to set up a fresh IPFS node pointed at S3 storage using this plugin.

I have followed the instructions, tried various combinations of things, and finally gotten the IPFS daemon to run, but I now have the following error:

go-ipfs version: 0.10.0-dev-f63a997c3-dirty
Repo version: 11
System version: amd64/linux
Golang version: go1.17
panic: interface conversion: interface {} is nil, not string

goroutine 1 [running]:
github.com/ipfs/go-ipfs/plugin/plugins/levelds.(*leveldsPlugin).DatastoreConfigParser.func1(0x1dde8e0)
	go-ipfs/plugin/plugins/levelds/levelds.go:57 +0x1cd
github.com/ipfs/go-ipfs/repo/fsrepo.AnyDatastoreConfig(0x1e85800)
	go-ipfs/repo/fsrepo/datastores.go:88 +0x84
github.com/ipfs/go-ipfs/repo/fsrepo.MountDatastoreConfig(0x1dde8e0)
	go-ipfs/repo/fsrepo/datastores.go:113 +0x185
github.com/ipfs/go-ipfs/repo/fsrepo.AnyDatastoreConfig(0x1e2de80)
	go-ipfs/repo/fsrepo/datastores.go:88 +0x84
github.com/ipfs/go-ipfs/repo/fsrepo.(*FSRepo).openDatastore(0xc000c2a8a0)
	go-ipfs/repo/fsrepo/fsrepo.go:425 +0xef
github.com/ipfs/go-ipfs/repo/fsrepo.open({0xc00043c1f0, 0xb})
	go-ipfs/repo/fsrepo/fsrepo.go:169 +0x24c
github.com/ipfs/go-ipfs/repo/fsrepo.Open.func1()
	go-ipfs/repo/fsrepo/fsrepo.go:113 +0x25
github.com/ipfs/go-ipfs/repo.(*OnlyOne).Open(0x3478240, {0x1d49c80, 0xc0001db120}, 0xc000d3f630)
	go-ipfs/repo/onlyone.go:35 +0x155
github.com/ipfs/go-ipfs/repo/fsrepo.Open({0xc00043c1f0, 0xc000010018})
	go-ipfs/repo/fsrepo/fsrepo.go:115 +0x65
main.daemonFunc(0xc000129110, {0x1, 0x60}, {0x1f0c520, 0xc000c2a780})
	go-ipfs/cmd/ipfs/daemon.go:280 +0x565
github.com/ipfs/go-ipfs-cmds.(*executor).Execute(0x2515900, 0xc000129110, {0x2541d30, 0xc000c2a7e0}, {0x1f0c520, 0xc000c2a780})
	pkg/mod/github.com/ipfs/[email protected]/executor.go:77 +0x375
github.com/ipfs/go-ipfs-cmds/cli.Run({0x2541898, 0xc000cf8a40}, 0x34833c0, {0xc00003c040, 0x2, 0x2}, 0x40, 0xc000060708, 0xc000010020, 0x22d8760, ...)
	pkg/mod/github.com/ipfs/[email protected]/cli/run.go:137 +0x90e
main.mainRet()
	go-ipfs/cmd/ipfs/main.go:168 +0x4ea
main.main()
	go-ipfs/cmd/ipfs/main.go:71 +0x19

Here is my ~/.ipfs/datastore_spec

{
    "mounts": [
        {
            "bucket": "BUCKET_NAME",
            "mountpoint": "/blocks",
            "region": "us-east-2",
            "rootDirectory": "ipfs"
        },
        {
            "mountpoint": "/",
            "path": "datastore",
            "type": "levelds"
        }
    ],
    "type": "mount"
}

Here is my ~/.ipfs/config

{
    ...,
    "Datastore": {
        "StorageMax": "10GB",
        "StorageGCWatermark": 90,
        "GCPeriod": "1h",
        "Spec": {
            "mounts": [
                {
                    "child": {
                        "type": "s3ds",
                        "region": "us-east-2",
                        "bucket": "BUCKET",
                        "rootDirectory": "ipfs",
                        "accessKey": "",
                        "secretKey": ""
                    },
                    "mountpoint": "/blocks",
                    "prefix": "s3.datastore",
                    "type": "measure"
                },
                {
                    "mountpoint": "/",
                    "path": "datastore",
                    "type": "levelds"
                }
            ],
            "type": "mount"
        },
        "HashOnRead": false,
        "BloomFilterSize": 0
    },
    ...
}

Thanks!

Missing method for ipfs v0.6.0

Building against go-ipfs v0.6.0 errors with:

./s3.go:375:5: cannot use (*S3Bucket)(nil) (type *S3Bucket) as type datastore.Batching in assignment:
*S3Bucket does not implement datastore.Batching (missing Sync method)
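The Batching interface in the go-datastore version used by go-ipfs v0.6.0 gained a Sync method. Since this datastore flushes every Put straight to S3 and buffers nothing locally, a no-op Sync is a plausible fix. The sketch below uses stub types (`Key`, `S3Bucket`) for illustration rather than the plugin's real imports; check the upstream interface for the exact signature.

```go
package main

import "fmt"

// Key stands in for github.com/ipfs/go-datastore's ds.Key
// (stub for illustration).
type Key struct{ s string }

// S3Bucket is a stub standing in for this package's S3Bucket type.
type S3Bucket struct{}

// Sync satisfies the newer datastore.Batching interface. Every Put is
// already flushed to S3 by the time it returns, so there is nothing
// buffered to sync and returning nil is safe. (Assumed fix, sketched
// here, not the plugin's confirmed implementation.)
func (s *S3Bucket) Sync(prefix Key) error {
	return nil
}

func main() {
	var b S3Bucket
	fmt.Println(b.Sync(Key{"/blocks"}) == nil) // → true
}
```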

Error loading plugin with ipfs v0.6.0

I've built the plugin against go-ipfs v0.6.0 (with go 1.14.4), but it fails to load with:

plugin was built with a different version of package github.com/ipfs/go-ipfs/core/bootstrap

All I did was change go.mod in master to point to go-ipfs v0.6.0 and build with

make IPFS_VERSION=/home/ian/go-ipfs build

(and the go-ipfs dir had the v0.6.0 tag checked out)

Build error for IPFS_VERSION v0.10.0

macOS Monterey
Version 12.0.1

go version go1.17.3 darwin/amd64

======================================

% git clone https://github.com/ipfs/go-ds-s3.git
% make IPFS_VERSION=v0.10.0
% make install
./set-target.sh v0.10.0
go build -trimpath -buildmode=plugin -o "s3plugin.so" "plugin/main/main.go"

github.com/klauspost/compress/zstd/internal/xxhash

asm: xxhash_amd64.s:120: when dynamic linking, R15 is clobbered by a global variable access and is used here: 00092 (/Users/peter/go/pkg/mod/github.com/klauspost/[email protected]/zstd/internal/xxhash/xxhash_amd64.s:120) ADDQ R15, AX
asm: assembly failed

github.com/cespare/xxhash/v2

asm: xxhash_amd64.s:120: when dynamic linking, R15 is clobbered by a global variable access and is used here: 00092 (/Users/peter/go/pkg/mod/github.com/cespare/xxhash/[email protected]/xxhash_amd64.s:120) ADDQ R15, AX
asm: assembly failed
make: *** [s3plugin.so] Error 2

ipfs daemon error

Hello,

I get an error running ipfs daemon:

Initializing daemon...
go-ipfs version: 0.4.22-4e981576b-dirty
Repo version: 7
System version: amd64/linux
Golang version: go1.12.9
09:17:04.924 ERROR   cmd/ipfs: error from node construction:  could not build arguments for function "reflect".makeFuncStub (/usr/local/go/src/reflect/asm_amd64.s:12): failed to build provider.Provider: could not build arguments for function "github.com/ipfs/go-ipfs/core/node".ProviderCtor (/go/go-ipfs/core/node/provider.go:24): failed to build *provider.Queue: function "github.com/ipfs/go-ipfs/core/node".ProviderQueue (/go/go-ipfs/core/node/provider.go:19) returned a non-nil error: s3ds: filters or orders are not supported daemon.go:337

Error: could not build arguments for function "reflect".makeFuncStub (/usr/local/go/src/reflect/asm_amd64.s:12): failed to build provider.Provider: could not build arguments for function "github.com/ipfs/go-ipfs/core/node".ProviderCtor (/go/go-ipfs/core/node/provider.go:24): failed to build *provider.Queue: function "github.com/ipfs/go-ipfs/core/node".ProviderQueue (/go/go-ipfs/core/node/provider.go:19) returned a non-nil error: s3ds: filters or orders are not supported
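The "filters or orders are not supported" failure is the same limitation tracked in the "Implement prefixed queries" issue above: the provider queue issues a Query that this datastore rejects instead of evaluating. A sketch of the naive client-side fallback (names like `entry` and `applyNaive` are illustrative, not the plugin's API):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// entry mimics one datastore query result.
type entry struct {
	Key   string
	Value []byte
}

// applyNaive evaluates a filter and an optional key ordering on the
// client, the way a backend without server-side query support must.
func applyNaive(entries []entry, keep func(entry) bool, orderByKey bool) []entry {
	var out []entry
	for _, e := range entries {
		if keep == nil || keep(e) {
			out = append(out, e)
		}
	}
	if orderByKey {
		sort.Slice(out, func(i, j int) bool { return out[i].Key < out[j].Key })
	}
	return out
}

func main() {
	es := []entry{{"/b", nil}, {"/provider/x", nil}, {"/a", nil}}
	got := applyNaive(es, func(e entry) bool {
		return !strings.HasPrefix(e.Key, "/provider/")
	}, true)
	for _, e := range got {
		fmt.Println(e.Key)
	}
}
```

go-datastore's query package ships a NaiveQueryApply helper that does essentially this over a Results stream, which is one way the limitation can be worked around.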

I use this Dockerfile:

ENV GO111MODULE on
RUN git clone https://github.com/ipfs/go-ipfs.git
WORKDIR go-ipfs
RUN git checkout v0.4.22
RUN go get github.com/ipfs/go-ds-s3@latest
RUN echo "s3ds github.com/ipfs/go-ds-s3/plugin 0" >> plugin/loader/preload_list
RUN make build
RUN make install
WORKDIR /root
RUN apt-get update && apt-get -y install vim awscli jq
RUN go get github.com/whyrusleeping/ipfs-key
RUN go get -u github.com/Kubuxu/go-ipfs-swarm-key-gen/ipfs-swarm-key-gen
COPY entrypoint.sh /
RUN chmod a+x /entrypoint.sh
COPY config .

And this config:

{
  "Identity": {
    "PeerID": "xxxxx",
    "PrivKey": "xxxxx"
  },
  "Datastore": {
    "StorageMax": "10GB",
    "StorageGCWatermark": 90,
    "GCPeriod": "1h",
    "Spec": {
      "mounts": [
        {
          "child": {
            "type": "s3ds",
            "region": "eu-west-1",
            "bucket": "test-xxxxx",
            "rootDirectory": "data",
            "accessKey": "",
            "secretKey": ""
          },
          "mountpoint": "/blocks",
          "prefix": "s3.datastore",
          "type": "measure"
        },
        {
          "child": {
            "type": "s3ds",
            "region": "eu-west-1",
            "bucket": "test-xxxxx",
            "rootDirectory": "meta",
            "accessKey": "",
            "secretKey": ""
          },
          "mountpoint": "/",
          "prefix": "s3.datastore",
          "type": "measure"
        }
      ],
      "type": "mount"
    },
    "HashOnRead": false,
    "BloomFilterSize": 0
  },
  "Addresses": {
    "Swarm": [
      "/ip4/0.0.0.0/tcp/4001",
      "/ip6/::/tcp/4001"
    ],
    "Announce": [],
    "NoAnnounce": [],
    "API": "/ip4/127.0.0.1/tcp/5001",
    "Gateway": "/ip4/127.0.0.1/tcp/8080"
  },
  "Mounts": {
    "IPFS": "/ipfs",
    "IPNS": "/ipns",
    "FuseAllowOther": false
  },
  "Discovery": {
    "MDNS": {
      "Enabled": true,
      "Interval": 10
    }
  },
  "Routing": {
    "Type": "dht"
  },
  "Ipns": {
    "RepublishPeriod": "",
    "RecordLifetime": "",
    "ResolveCacheSize": 128
  },
  "Bootstrap": [
    "/dnsaddr/bootstrap.libp2p.io/ipfs/QmNnooDu7bfjPFoTZYxMNLWUQJyrVwtbZg5gBMjTezGAJN",
    "/dnsaddr/bootstrap.libp2p.io/ipfs/QmQCU2EcMqAqQPR2i9bChDtGNJchTbq5TbXJJ16u19uLTa",
    "/dnsaddr/bootstrap.libp2p.io/ipfs/QmbLHAnMoJPWSCR5Zhtx6BHJX9KiKNN6tpvbUcqanj75Nb",
    "/dnsaddr/bootstrap.libp2p.io/ipfs/QmcZf59bWwK5XFi76CZX8cbJ4BhTzzA3gU1ZjYZcYW3dwt",
    "/ip4/104.131.131.82/tcp/4001/ipfs/QmaCpDMGvV2BGHeYERUEnRQAwe3N8SzbUtfsmvsqQLuvuJ",
    "/ip4/104.236.179.241/tcp/4001/ipfs/QmSoLPppuBtQSGwKDZT2M73ULpjvfd3aZ6ha4oFGL1KrGM",
    "/ip4/128.199.219.111/tcp/4001/ipfs/QmSoLSafTMBsPKadTEgaXctDQVcqN88CNLHXMkTNwMKPnu",
    "/ip4/104.236.76.40/tcp/4001/ipfs/QmSoLV4Bbm51jM9C4gDYZQ9Cy3U6aXMJDAbzgu2fzaDs64",
    "/ip4/178.62.158.247/tcp/4001/ipfs/QmSoLer265NRgSp2LA3dPaeykiS1J6DifTC88f5uVQKNAd",
    "/ip6/2604:a880:1:20::203:d001/tcp/4001/ipfs/QmSoLPppuBtQSGwKDZT2M73ULpjvfd3aZ6ha4oFGL1KrGM",
    "/ip6/2400:6180:0:d0::151:6001/tcp/4001/ipfs/QmSoLSafTMBsPKadTEgaXctDQVcqN88CNLHXMkTNwMKPnu",
    "/ip6/2604:a880:800:10::4a:5001/tcp/4001/ipfs/QmSoLV4Bbm51jM9C4gDYZQ9Cy3U6aXMJDAbzgu2fzaDs64",
    "/ip6/2a03:b0c0:0:1010::23:1001/tcp/4001/ipfs/QmSoLer265NRgSp2LA3dPaeykiS1J6DifTC88f5uVQKNAd"
  ],
  "Gateway": {
    "HTTPHeaders": {
      "Access-Control-Allow-Headers": [
        "X-Requested-With",
        "Range",
        "User-Agent"
      ],
      "Access-Control-Allow-Methods": [
        "GET"
      ],
      "Access-Control-Allow-Origin": [
        "*"
      ]
    },
    "RootRedirect": "",
    "Writable": false,
    "PathPrefixes": [],
    "APICommands": [],
    "NoFetch": false
  },
  "API": {
    "HTTPHeaders": {}
  },
  "Swarm": {
    "AddrFilters": null,
    "DisableBandwidthMetrics": false,
    "DisableNatPortMap": false,
    "DisableRelay": false,
    "EnableRelayHop": false,
    "EnableAutoRelay": false,
    "EnableAutoNATService": false,
    "ConnMgr": {
      "Type": "basic",
      "LowWater": 600,
      "HighWater": 900,
      "GracePeriod": "20s"
    }
  },
  "Pubsub": {
    "Router": "",
    "DisableSigning": false,
    "StrictSignatureVerification": false
  },
  "Reprovider": {
    "Interval": "12h",
    "Strategy": "all"
  },
  "Experimental": {
    "FilestoreEnabled": false,
    "UrlstoreEnabled": false,
    "ShardingEnabled": false,
    "Libp2pStreamMounting": false,
    "P2pHttpProxy": false,
    "QUIC": false,
    "PreferTLS": false
  }
}

Any idea?

Thanks in advance,

Cyril

plugin was built with a different version of package

I get the following error when I run ipfs daemon

ERROR: install failed: version check failed: Error: error loading plugins: loading plugin /home/ec2-user/.ipfs/plugins/go-ds-s3.so: plugin.Open("/home/ec2-user/.ipfs/plugins/go-ds-s3"): plugin was built with a different version of package github.com/google/uuid

How can I rectify this error?

Reduce library size

The latest version of this has almost doubled in size, to 60 MB. This is way too much for a glorified REST client; for comparison, the Java S3 client is ~3 MB. Is there any way to reduce this by stripping out unused dependencies? Or by including just the S3 SDK rather than the entire AWS SDK?
