solo-io / gloo

The Feature-rich, Kubernetes-native, Next-Generation API Gateway Built on Envoy

Home Page: https://docs.solo.io/

License: Apache License 2.0

Languages: Makefile 0.85%, Go 97.71%, Shell 0.70%, Dockerfile 0.08%, HCL 0.15%, Python 0.14%, Mustache 0.19%, PowerShell 0.05%, Smarty 0.14%
Topics: gloo, envoy, api-gateway, serverless, api-management, kubernetes, kubernetes-ingress-controller, microservices, hybrid-apps, legacy-apps

gloo's Introduction

Gloo Gateway v2
An Envoy-Powered API Gateway

Important Update

Gloo Gateway is now a fully conformant Kubernetes Gateway API implementation!

The existing Gloo Edge v1 APIs were not changed and continue to be fully supported.

About Gloo Gateway

Gloo Gateway is a powerful Kubernetes-native ingress controller and API gateway built on the Kubernetes Gateway API. It excels at function-level routing; supports legacy apps, microservices, and serverless workloads; offers robust discovery capabilities; integrates seamlessly with open-source projects; and is designed to support hybrid applications that span technologies, architectures, protocols, and clouds.

Installation   |   Documentation   |   Blog   |   Slack   |   Twitter   |   Enterprise Trial


Gloo Gateway v2 Architecture

Quickstart with glooctl

Install Gloo Gateway and set up routing to the httpbin sample app.

  1. Install glooctl, the Gloo Gateway command line tool.

    curl -sL https://run.solo.io/gloo/install | GLOO_VERSION=v2.0.0-beta1 sh
    export PATH=$HOME/.gloo/bin:$PATH
  2. Install the Gloo Gateway v2 control plane, and wait for it to come up.

    glooctl install
  3. Deploy the httpbin sample app, along with a Gateway and HTTPRoute to access it.

    kubectl -n httpbin apply -f https://raw.githubusercontent.com/solo-io/gloo/v2.0.x/projects/gateway2/examples/httpbin.yaml
  4. Port-forward the Gateway.

    kubectl port-forward deployment/gloo-proxy-http -n httpbin 8080:8080
  5. Send a request through our new Gateway.

    curl -I localhost:8080/status/200 -H "host: www.example.com" -v

Congratulations! You successfully installed Gloo Gateway and used an HTTP gateway to expose the httpbin sample app.
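If the route is working, httpbin's /status/200 endpoint answers with a 200; the exact headers vary by httpbin version, but the response should begin with:

    HTTP/1.1 200 OK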

Note: To learn more about Gloo Gateway's support for the Kubernetes Gateway API, see the docs.

Quickstart with Helm

  1. Install the custom resources of the Kubernetes Gateway API.

    kubectl apply -f https://github.com/kubernetes-sigs/gateway-api/releases/download/v1.0.0/standard-install.yaml
  2. Install Gloo Gateway v2. This command creates the gloo-system namespace and installs the Gloo Gateway v2 control plane into it.

    helm install default -n gloo-system --create-namespace oci://ghcr.io/solo-io/helm-charts/gloo-gateway --version 2.0.0-beta1
  3. Verify that the Gloo Gateway v2 control plane is up and running and that the gloo-gateway GatewayClass is created.

    kubectl get pods -n gloo-system
    kubectl get gatewayclass gloo-gateway 
  4. Deploy the httpbin sample app, along with a Gateway and HTTPRoute to access it.

    kubectl -n httpbin apply -f https://raw.githubusercontent.com/solo-io/gloo/v2.0.x/projects/gateway2/examples/httpbin.yaml
  5. Port-forward the Gateway.

    kubectl port-forward deployment/gloo-proxy-http -n httpbin 8080:8080
  6. Send a request through our new Gateway.

    curl -I localhost:8080/status/200 -H "host: www.example.com" -v

Note: To learn more about Gloo Gateway's support for the Kubernetes Gateway API, see the docs.
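To tear everything down again, remove the same pieces in reverse order (a sketch; the release name, namespace, and URLs match the install steps above):

    kubectl delete -f https://raw.githubusercontent.com/solo-io/gloo/v2.0.x/projects/gateway2/examples/httpbin.yaml -n httpbin
    helm uninstall default -n gloo-system
    kubectl delete -f https://github.com/kubernetes-sigs/gateway-api/releases/download/v1.0.0/standard-install.yaml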

Using Gloo Gateway

  • Kubernetes Gateway API: Gloo Gateway is a feature-rich ingress controller, built on top of the Envoy Proxy and fully conformant with the Kubernetes Gateway API (see the sketch after this list).
  • Next-generation API gateway: Gloo Gateway provides a long list of API gateway features including rate limiting, circuit breaking, retries, caching, transformation, service-mesh integration, security, external authentication and authorization.
  • Hybrid apps: Gloo Gateway lets you build applications that route to backends implemented as microservices, serverless functions, and legacy apps. This can help users to: A) gradually migrate from legacy code to microservices and serverless; B) add new functionality using cloud-native technologies while maintaining the legacy codebase; C) allow different teams in an organization to choose different architectures. See here for more on the Hybrid App paradigm.
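To make the Gateway API bullet concrete, a minimal Gateway and HTTPRoute along the lines of what the httpbin example installs might look as follows. This is a sketch: the gloo-gateway GatewayClass and the "http" Gateway name come from the quickstart above, while the httpbin Service name and its port 8000 are assumptions about the sample app.

kubectl apply -n httpbin -f - <<EOF
apiVersion: gateway.networking.k8s.io/v1
kind: Gateway
metadata:
  name: http
spec:
  gatewayClassName: gloo-gateway   # created by the Gloo Gateway install
  listeners:
  - name: http
    protocol: HTTP
    port: 8080
---
apiVersion: gateway.networking.k8s.io/v1
kind: HTTPRoute
metadata:
  name: httpbin
spec:
  parentRefs:
  - name: http
  hostnames:
  - www.example.com        # matches the curl host header in the quickstart
  rules:
  - backendRefs:
    - name: httpbin        # assumed Service name of the sample app
      port: 8000           # assumed Service port
EOF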

What makes Gloo Gateway unique

  • Function-level routing allows integration of legacy applications, microservices and serverless: Gloo Gateway can route requests directly to functions. A request to a function can be a serverless function call (e.g. Lambda, Google Cloud Functions, OpenFaaS functions), an API call on a microservice or a legacy service (e.g. a REST API call, an OpenAPI operation, an XML/SOAP request), or a publish to a message queue (e.g. NATS, AMQP). This unique ability is what makes Gloo Gateway the only API gateway that supports hybrid apps, as well as the only one that does not tie the user to a specific paradigm. (A sketch of such a route follows this list.)
  • Gloo Gateway incorporates vetted open-source projects to provide broad functionality: Gloo Gateway supports high-quality features by integrating with top open-source projects, including gRPC, GraphQL, OpenTracing, NATS and more. Gloo Gateway's architecture allows rapid integration of future popular open-source projects as they emerge.
  • Fully automated discovery lets users move fast: Upon launch, Gloo Gateway creates a catalog of all available destinations and continuously keeps it up to date. This takes the 'bookkeeping' burden off developers and guarantees that new features become available as soon as they are ready. Gloo Gateway discovers across IaaS, PaaS and FaaS providers as well as Swagger, gRPC, and GraphQL.
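As an illustration of function-level routing, a route can target a single discovered function rather than a whole service. The glooctl invocation below is a sketch using the same flags that appear in the tutorial issues further down, with names taken from the petstore example:

    glooctl route create \
      --path-exact /petstore/findPet \
      --upstream default-petstore-8080 \
      --function findPetById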

Next Steps

Thanks

Gloo Gateway would not be possible without the valuable open-source work of projects in the community. We would like to extend a special thank-you to Envoy.

Security

Reporting security issues: We take Gloo Gateway's security very seriously. If you've found a security issue or a potential security issue in Gloo Gateway, please DO NOT file a public GitHub issue; instead, send your report privately to [email protected].

gloo's People

Contributors

arianaw66, artberger, ashleywang1, ben-taussig-solo, bewebi, bslabe123, christian-posta, davidjumani, eitanya, elcasteel, grahamgoudeau, gunnar-solo, ilackarms, jameshbarton, jbohanon, jenshu, kdorosh, marcogschmidt, mitchdraft, mlholland, nadine2016, ned1313, nfuden, npolshakova, rachael-graham, saiskee, sam-heilbron, sheidkamp, sodman, yuval-k


gloo's Issues

alternative to extending the translator

#220 addresses the need to add functionality to Gloo without modifying the core Gloo repo

it introduces the ability to add additional plugins to Gloo as well as provide callbacks for the Translator via TranslatorSyncerExtensions

just to keep track of an alternate solution, it may be prudent to add complexity to the plugins (essentially, offload the work of extending Gloo's functionality entirely to plugins).

just something to track as we continue to build out

petstore YAML k8s annotation service spelling

When I followed

kubectl apply \
-f https://raw.githubusercontent.com/solo-io/gloo/master/example/petstore/petstore.yaml 

it created the petstore setup in the default namespace. As I was investigating the YAML, I noticed it had "sevice": "petstore", which did not seem to matter, but I am guessing it should be "service". A very small, minute thing (a one-line fix is sketched below the YAML). LOVE this tool. Learning it today.

metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: >
      {"apiVersion":"v1","kind":"Service","metadata":{"annotations":{},"labels":{"sevice":"petstore"},"name":"petstore","namespace":"default"},"spec":{"ports":[{"port":8080,"protocol":"TCP"}],"selector":{"app":"petstore"}}}

BUG: gloo should make its own service account

Currently on a default install, glooctl binds its gloo-role to the default service account. This is a security issue, as most operators leave their default service account permissionless.

Have glooctl create its own service account and use that.

/tools# kubectl describe clusterrolebinding gloo-role-binding
Name:         gloo-role-binding
Labels:       <none>
Annotations:  kubectl.kubernetes.io/last-applied-configuration={"apiVersion":"rbac.authorization.k8s.io/v1","kind":"ClusterRoleBinding","metadata":{"annotations":{},"name":"gloo-role-binding","namespace":""},"roleR...
Role:
  Kind:  ClusterRole
  Name:  gloo-role
Subjects:
  Kind            Name     Namespace
  ----            ----     ---------
  ServiceAccount  default  gloo-system
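A possible workaround until glooctl does this itself, sketched with plain kubectl (the binding name mirrors the one above; "control-plane" is the deployment name used by this install):

    kubectl -n gloo-system create serviceaccount gloo
    kubectl delete clusterrolebinding gloo-role-binding
    kubectl create clusterrolebinding gloo-role-binding \
      --clusterrole=gloo-role \
      --serviceaccount=gloo-system:gloo
    kubectl -n gloo-system set serviceaccount deployment control-plane gloo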

What are the proper RBAC permissions for Gloo to read inside k8s/openshift

I see " RBAC permissions have not been granted to Gloo to read from the registry, " if my service, that has a /swagger.json returning a valid swagger setup, is not read correctly. How do I ensure the right permissions? I ran "glooctl install kube" to set this up. And I have 2 things to read, including the petstore YAML that I put into a separate project, not default, that is not found with "glooctl get upstreams".

Cannot access gloo UI

I've followed the installation instructions at https://github.com/solo-io/gloo/blob/master/docs/getting_started/kubernetes/1.md using minikube. For some reason the UI isn't available. As I understand it, the UI is provided by the control-plane service, which I exposed through "port-forward":

kubectl port-forward --namespace=gloo-system control-plane-<pod id> 8081:8081 

The port is available, but instead of HTML it replies with binary content (to be precise, the reply is 9 bytes: 00 00 00 04 00 00 00 00 00).

Could you explain how to connect to the gloo UI?

Also, there is a minor inconsistency in your document: the "Petstore" API is served at :8080/api/pets, not :8080/petstore. There is no list action, etc.

Tutorial #2 - error when updating route

cat <<EOF | glooctl route update --path-exact /petstore/findPet --upstream default-petstore-8080 --function findPetById --extensions -
parameters:
  headers:
    x-pet: '{id}'
EOF

Outputs the following:

Using virtual service: default
Unable to get updated route: unable to get old route: a matcher wasn't specified

Is this something I did wrong, or has something changed since the tutorial was written?

Fix generation of kube.yaml.go

In upgrading to solo-kit 0.2.1, the generate script for this file went away, so changes to kube.yaml don't update the CLI.

Update site for open source gloo docs

make site works from this repo, but only produces the site locally. We need to push an updated docker image, replace the existing site docs with the newly generated ones, and update the readmes in this repo to refer to the open source gloo site.

gateway should merge virtual services based on domain

currently, gloo returns an error on the vservice if its domains conflict with any other vservices

what this feature would do is perform an opinionated merge (with some kind of route sort) of one or more virtual services based on domain. if domains overlap, we will create a merged vhost for them (it would be good to LOG the merge for now)

Cleanup help text in glooctl

Related to #202, we misspell "permissions" in the glooctl help text when running glooctl create upstream kube --help.

We also have several typos in the flags.

  • listenen
  • defaukt

gateway crd

the gateway crd can get out of sync fairly easily; this has been noticed when virtual services are deleted.

Authentication Question

Hello, I've got a question: I have a new route to my monolith created with gloo ("glooctl route create --path-prefix / --upstream myoldapp-8080"), and I would like to know if I can set any kind of auth (at least basic) on this route, so users have to authenticate when they call "http://myserver/". Thanks a lot!

Build bug: multiple paths in GOPATH

This generate statement assumes the $GOPATH will be a single directory:

//go:generate protoc -I$GOPATH/src/github.com/lyft/protoc-gen-validate -I. -I$GOPATH/src/github.com/gogo/protobuf/protobuf --gogo_out=Mgoogle/protobuf/struct.proto=github.com/gogo/protobuf/types,Mgoogle/protobuf/duration.proto=github.com/gogo/protobuf/types:${GOPATH}/src/ filter.proto

If you have multiple paths in GOPATH, it fails:

GOPATH=/home/ceposta/gopath:/another/gopath:/foo/path
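One way around this is to move the protoc invocation into a small wrapper script that uses only the first GOPATH entry, and point go:generate at the script (//go:generate sh generate.sh) instead of at protoc directly. A sketch:

    #!/bin/sh
    # generate.sh: the generate directive breaks when GOPATH has multiple
    # entries, so use only the first element of a colon-separated GOPATH.
    GOPATH_FIRST=${GOPATH%%:*}
    protoc \
      -I"$GOPATH_FIRST/src/github.com/lyft/protoc-gen-validate" \
      -I. \
      -I"$GOPATH_FIRST/src/github.com/gogo/protobuf/protobuf" \
      --gogo_out=Mgoogle/protobuf/struct.proto=github.com/gogo/protobuf/types,Mgoogle/protobuf/duration.proto=github.com/gogo/protobuf/types:"$GOPATH_FIRST/src/" \
      filter.proto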

running gloo with istio

this issue should track progress on deploying Gloo with istio sidecars. the existing Gloo deployment has been tested to work with istio 1.0.3 without any additional configuration.

gloo performs the following network functions:

  • function discovery polls services for grpc reflection and openapi docs
  • envoy proxies services
  • gloo (the control plane itself) listens for grpc and serves envoys who connect

nomad installation failure

Ingress container keeps failing. Stderr:
[2018-09-25 10:03:16.979][6][info][main] external/envoy/source/server/server.cc:183] initializing epoch 0 (hot restart version=10.200.16384.127.options=capacity=16384, num_slots=8209 hash=228984379728933363 size=2654312)
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:185] statically linked extensions:
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:187] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:190] filters.http: envoy.buffer,envoy.cors,envoy.ext_authz,envoy.fault,envoy.filters.http.rbac,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash,io.solo.azure_functions,io.solo.function_router,io.solo.gcloudfunc,io.solo.lambda,io.solo.nats_streaming,io.solo.transformation
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:193] filters.listener: envoy.listener.original_dst,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:196] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy,io.solo.filters.network.consul_connect
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:198] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.statsd
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:200] tracers: envoy.dynamic.ot,envoy.lightstep,envoy.zipkin
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:203] transport_sockets.downstream: envoy.transport_sockets.capture,raw_buffer,tls
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:206] transport_sockets.upstream: envoy.transport_sockets.capture,raw_buffer,tls
[2018-09-25 10:03:16.983][6][critical][main] external/envoy/source/server/server.cc:78] error initializing configuration '': Proto constraint validation failed (BootstrapValidationError.Admin: ["value is required"]):
[2018-09-25 10:03:16.983][6][info][main] external/envoy/source/server/server.cc:449] exiting

(the same startup/crash sequence repeats on every container restart, at 10:03:38 and 10:04:00)

BUG: gloo does not respect named ports

I have this service/deployment combination in my clusters. It's a very standard default backend for ingress. Note that the service forwards from port 80 to the named port http on the pod.

kind: Service
apiVersion: v1
metadata:
  name: arch-default-backend
spec:
  ports:
  - port: 80
    targetPort: http
  selector:
    app: arch-default-backend
---
kind: Deployment
apiVersion: extensions/v1beta1
metadata:
  name: arch-default-backend
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: arch-default-backend
    spec:
      terminationGracePeriodSeconds: 60
      containers:
      - name: default-backend
        image: gcr.io/google_containers/defaultbackend:1.0
        livenessProbe:
          httpGet:
            path: /healthz
            port: 8080
            scheme: HTTP
          initialDelaySeconds: 30
          timeoutSeconds: 5
        resources:
          limits:
            cpu: 10m
            memory: 20Mi
          requests:
            cpu: 10m
            memory: 20Mi
        ports:
        - name: http
          containerPort: 8080
          protocol: TCP

With this setup gloo is logging this every second:

{"level":"warn","ts":1546956254.3821619,"logger":"gloo.v1.event_loop.setup.kubernetes_eds","caller":"kubernetes/eds.go:133","msg":"upstream gloo-system.default-arch-default-backend-80: port 80 not found for service arch-default-backend"}

function discovery: re-discovery failing after pod deletion

Today we tried to use gloo on kubernetes to serve a gRPC service to JSON-compatible clients; we ran into some problems and need some help.

  1. Function discovery depends on the gRPC reflection feature, which is relatively new and missing from the docs; for our Python server we forgot to import that package and it took some time to work out.
  2. Once the server is started and discovered by function discovery, we run performance testing using wrk -t4 -c2000 -d30s -T5s --latency http://xxxxxx; after that, newly added services are never discovered. All the pods in gloo-system are running, but the logs just say attempting to ..., with no more retries.
  3. The whole procedure is automatic, so I am a little worried about stability. If I hit the problems above in production, I can do nothing but reinstall.

Could you give me some suggestions on which log files I need to look at to debug? Thanks in advance.
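For what it's worth on item 1, a quick way to confirm reflection is actually enabled on a server is grpcurl (an external tool; the address is a placeholder):

    grpcurl -plaintext my-grpc-service:8080 list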

UDS fails when discovering for namespaces that start with a number

apparently, kubernetes allows namespaces with names that are not DNS-1035 labels, but will not allow upstreams (or presumably any CRD) to be written with a non-DNS-1035 name.

this means that a namespace can start with a number, but upstream may only begin with a letter.

this could be problematic if a user tries to run UDS on a namespace starting with a number, as upstreams are currently named by namespace-servicename-port

Installation question

Following the getting started doc and trying to install on a GKE cluster, I'm running into the following issue. Not sure whether I'm doing something wrong here.

kubectl apply \
>   --filename https://raw.githubusercontent.com/solo-io/gloo/master/install/kube/install.yaml
namespace "gloo-system" created
customresourcedefinition.apiextensions.k8s.io "upstreams.gloo.solo.io" created
customresourcedefinition.apiextensions.k8s.io "virtualservices.gloo.solo.io" created
customresourcedefinition.apiextensions.k8s.io "roles.gloo.solo.io" created
customresourcedefinition.apiextensions.k8s.io "attributes.gloo.solo.io" created
configmap "ingress-config" created
clusterrolebinding.rbac.authorization.k8s.io "gloo-cluster-admin-binding" created
clusterrolebinding.rbac.authorization.k8s.io "gloo-discovery-cluster-admin-binding" created
clusterrolebinding.rbac.authorization.k8s.io "gloo-knative-upstream-discovery-binding" created
deployment.apps "control-plane" created
service "control-plane" created
deployment.apps "function-discovery" created
deployment.apps "ingress" created

service "ingress" created
deployment.extensions "kube-ingress-controller" created
deployment.extensions "upstream-discovery" created
Error from server (Forbidden): error when creating "https://raw.githubusercontent.com/solo-io/gloo/master/install/kube/install.yaml": clusterroles.rbac.authorization.k8s.io "gloo-role" is forbidden: attempt to grant extra privileges: [PolicyRule{Resources:["pods"], APIGroups:[""], Verbs:["get"]} PolicyRule{Resources:["pods"], APIGroups:[""], Verbs:["watch"]} PolicyRule{Resources:["pods"], APIGroups:[""], Verbs:["list"]} PolicyRule{Resources:["services"], APIGroups:[""], Verbs:["get"]} PolicyRule{Resources:["services"], APIGroups:[""], Verbs:["watch"]} PolicyRule{Resources:["services"], APIGroups:[""], Verbs:["list"]} PolicyRule{Resources:["secrets"], APIGroups:[""], Verbs:["get"]} PolicyRule{Resources:["secrets"], APIGroups:[""], Verbs:["watch"]} PolicyRule{Resources:["secrets"], APIGroups:[""], Verbs:["list"]} PolicyRule{Resources:[
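On GKE this usually means the installing user does not itself hold the permissions it is trying to grant. The commonly documented fix is to bind your own account to cluster-admin before applying the manifest (assumes gcloud is configured for this cluster):

    kubectl create clusterrolebinding cluster-admin-binding \
      --clusterrole=cluster-admin \
      --user="$(gcloud config get-value account)"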

gRPC Example hang

Trying https://github.com/solo-io/gloo/tree/master/example/grpc
with the latest versions of VirtualBox, Minikube, kubectl, and glooctl
on a blank minikube.

at this step:

[07:01] curl $GRPC_URL/bookstore.Bookstore/ListShelves
curl: (7) Failed to connect to 10.0.2.15 port 30508: Connection timed out
[07:06] 

Logs

Bookstore

2018/05/30 06:57:59 listening on 8080
2018/05/30 06:58:15 /grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo

kube-ingress

ERROR: logging before flag.Parse: W0530 06:46:01.324885       1 client_config.go:529] Neither --kubeconfig nor --master was specified.  Using the inClusterConfig.  This might not work.
ERROR: logging before flag.Parse: W0530 06:46:01.326152       1 client_config.go:529] Neither --kubeconfig nor --master was specified.  Using the inClusterConfig.  This might not work.
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103"	registering crd v1.crd{
  Plural:    "upstreams",
  Group:     "gloo.solo.io",
  Version:   "v1",
  Kind:      "Upstream",
  ShortName: "us",
}
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103"	registering crd v1.crd{
  Plural:    "virtualservices",
  Group:     "gloo.solo.io",
  Version:   "v1",
  Kind:      "VirtualService",
  ShortName: "vs",
}
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103"	registering crd v1.crd{
  Plural:    "virtualmeshes",
  Group:     "gloo.solo.io",
  Version:   "v1",
  Kind:      "VirtualMesh",
  ShortName: "vm",
}
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/cmd/kube-ingress-controller/main.go:105"	starting ingress controller
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:113"	Starting "gloo-ingress-syncer" controller
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:116"	Waiting for informer caches to sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/cmd/kube-ingress-controller/main.go:81"	starting ingress status sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/cmd/kube-ingress-controller/main.go:94"	starting ingress sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:113"	Starting "kube-ingress-controller" controller
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:116"	Waiting for informer caches to sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:121"	Starting workers
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:127"	Started workers
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:121"	Starting workers
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:127"	Started workers

control-plane

a non-stop flow of:

ERROR: logging before flag.Parse: E0530 07:11:03.617946       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:04.620914       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:05.623252       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:06.626267       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:07.628632       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:08.633828       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:09.650339       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:10.653396       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:11.655923       1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"

ingress

[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:200] initializing epoch 0 (hot restart version=9.200.16384.127.options=capacity=16384, num_slots=8209 hash=228984379728933363)
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:202] statically linked extensions:
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:204]   access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:207]   filters.http: envoy.buffer,envoy.cors,envoy.ext_authz,envoy.fault,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash,io.solo.azure_functions,io.solo.function_router,io.solo.gcloudfunc,io.solo.lambda,io.solo.nats_streaming,io.solo.transformation
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:210]   filters.listener: envoy.listener.original_dst,envoy.listener.proxy_protocol
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:213]   filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:215]   stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.statsd
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:217]   tracers: envoy.dynamic.ot,envoy.lightstep,envoy.zipkin
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:220]   transport_sockets.downstream: raw_buffer,ssl
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:223]   transport_sockets.upstream: raw_buffer,ssl
[2018-05-30 06:46:50.294][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:46:50.294][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:46:50.294][1][info][config] external/envoy/source/server/configuration_impl.cc:52] loading 0 listener(s)
[2018-05-30 06:46:50.294][1][info][config] external/envoy/source/server/configuration_impl.cc:92] loading tracing configuration
[2018-05-30 06:46:50.294][1][info][config] external/envoy/source/server/configuration_impl.cc:114] loading stats sink configuration
[2018-05-30 06:46:50.294][1][info][main] external/envoy/source/server/server.cc:399] starting main dispatch loop
[2018-05-30 06:46:55.295][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:46:55.296][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:00.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:00.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:05.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:05.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:10.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:10.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:15.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:15.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:20.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:20.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:25.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:25.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:30.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:30.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:35.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:35.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:40.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:40.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:45.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1, 
[2018-05-30 06:47:45.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:47.799][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:127] cm init: initializing cds
[2018-05-30 06:47:50.313][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:379] add/update cluster gloo-system-control-plane-8081 during init
[2018-05-30 06:47:50.317][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:379] add/update cluster gloo-system-ingress-8080 during init
[2018-05-30 06:47:50.320][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:379] add/update cluster gloo-system-ingress-8443 during init
[2018-05-30 06:47:50.321][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:108] cm init: initializing secondary clusters
[2018-05-30 06:47:50.324][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:131] cm init: all clusters initialized
[2018-05-30 06:47:50.324][1][info][main] external/envoy/source/server/server.cc:383] all clusters initialized. initializing init manager
[2018-05-30 06:47:50.325][1][info][config] external/envoy/source/server/listener_manager_impl.cc:602] all dependencies initialized. starting workers
[2018-05-30 06:57:32.189][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:385] add/update cluster default-grpc-bookstore-8080 starting warming
[2018-05-30 06:57:32.189][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:392] warming cluster default-grpc-bookstore-8080 complete
[2018-05-30 06:58:15.764][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:385] add/update cluster default-grpc-bookstore-8080 starting warming
[2018-05-30 06:58:15.764][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:392] warming cluster default-grpc-bookstore-8080 complete
[2018-05-30 07:02:50.326][1][info][main] external/envoy/source/server/drain_manager_impl.cc:63] shutting down parent after drain

function-discovery

"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:256"	updated upstream &v1.Upstream{
  Name:              "default-grpc-bookstore-8080",
  Type:              "kubernetes",
  ConnectionTimeout: 0,
  Spec:              &types.Struct{
    Fields: {
      "service_port": &types.Value{
        Kind: &types.Value_NumberValue{
          NumberValue: 8080.000000,
        },
      },
      "service_name": &types.Value{
        Kind: &types.Value_StringValue{
          StringValue: "grpc-bookstore",
        },
      },
      "service_namespace": &types.Value{
        Kind: &types.Value_StringValue{
          StringValue: "default",
        },
      },
    },
  },
  Functions:   []*v1.Function{},
  ServiceInfo: &v1.ServiceInfo{
    Type:       "gRPC",
    Properties: &types.Struct{
      Fields: {
        "descriptors_file_ref": &types.Value{
          Kind: &types.Value_StringValue{
            StringValue: "grpc-discovery:Bookstore.descriptors",
          },
        },
        "service_names": &types.Value{
          Kind: &types.Value_ListValue{
            ListValue: &types.ListValue{
              Values: []*types.Value{
                &types.Value{
                  Kind: &types.Value_StringValue{
                    StringValue: "Bookstore",
                  },
                },
              },
            },
          },
        },
      },
    },
  },
  Status: &v1.Status{
    State:  1,
    Reason: "",
  },
  Metadata: &v1.Metadata{
    ResourceVersion: "1518",
    Namespace:       "gloo-system",
    Annotations:     {
      "generated_by":                                     "kubernetes-upstream-discovery",
      "kubectl.kubernetes.io/last-applied-configuration": "{\"apiVersion\":\"v1\",\"kind\":\"Service\",\"metadata\":{\"annotations\":{},\"labels\":{\"sevice\":\"grpc-bookstore\"},\"name\":\"grpc-bookstore\",\"namespace\":\"default\"},\"spec\":{\"ports\":[{\"port\":8080,\"protocol\":\"TCP\"}],\"selector\":{\"app\":\"grpc-bookstore\"},\"type\":\"LoadBalancer\"}}\n",
      "gloo.solo.io/discovery-type":                      "grpc",
    },
  },
}
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 06:58:16 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 06:58:16 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112"	beginning update for []string{
  "gloo-system-control-plane-8081",
  "gloo-system-ingress-8080",
  "gloo-system-ingress-8443",
  "default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112"	beginning update for []string{
  "gloo-system-control-plane-8081",
  "gloo-system-ingress-8080",
  "gloo-system-ingress-8443",
  "default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112"	beginning update for []string{
  "gloo-system-control-plane-8081",
  "gloo-system-ingress-8080",
  "gloo-system-ingress-8443",
  "default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112"	beginning update for []string{
  "gloo-system-control-plane-8081",
  "gloo-system-ingress-8080",
  "gloo-system-ingress-8443",
  "default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112"	beginning update for []string{
  "gloo-system-control-plane-8081",
  "gloo-system-ingress-8080",
  "gloo-system-ingress-8443",
  "default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98"	attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222"	attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:52 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:52 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:52 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64"	no more retries for "gloo-system-ingress-8443"

Upstream Discovery

ERROR: logging before flag.Parse: W0530 06:46:11.927839       1 client_config.go:529] Neither --kubeconfig nor --master was specified.  Using the inClusterConfig.  This might not work.
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103"	registering crd v1.crd{
  Plural:    "upstreams",
  Group:     "gloo.solo.io",
  Version:   "v1",
  Kind:      "Upstream",
  ShortName: "us",
}
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103"	registering crd v1.crd{
  Plural:    "virtualservices",
  Group:     "gloo.solo.io",
  Version:   "v1",
  Kind:      "VirtualService",
  ShortName: "vs",
}
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103"	registering crd v1.crd{
  Plural:    "virtualmeshes",
  Group:     "gloo.solo.io",
  Version:   "v1",
  Kind:      "VirtualMesh",
  ShortName: "vm",
}
2018/05/30 06:46:11 starting kubernetes service discovery
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:113"	Starting "kube-upstream-discovery" controller
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:116"	Waiting for informer caches to sync
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:121"	Starting workers
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:127"	Started workers
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:47:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:47:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:47:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:48:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:48:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:48:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:49:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:49:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:49:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:50:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:50:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:50:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:51:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:51:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:51:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:52:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:52:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:52:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:53:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:53:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:53:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:54:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:54:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:54:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:55:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:55:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:55:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:56:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:56:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:56:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:57:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101"	updating upstream "gloo-system-control-plane-8081"

Using an existing Ingress

Is it possible to use an existing Ingress that's already deployed? I have an nginx-ingress already wired up to a load balancer on my K8s cluster.

BUILD FEATURE: add go vet check after dep ensure

The e2e tests take a very long time to run, and they often fail because of build errors that a very quick go vet pass over the repo would have caught.
In general we should be following these conventions anyway.
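
A minimal sketch of the check itself, assuming the dep-based workflow the title describes (where exactly this hooks into CI is left open):

# after restoring dependencies, vet the whole repo;
# this catches build errors before the (slow) e2e suite runs
dep ensure
go vet ./...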

BUG: glooctl create upstream static does not respect --static-hosts flag

glooctl is returning "unknown flag" for --static-hosts, which appears to be supported according to the help output. Am I passing this flag incorrectly?

/tools# glooctl create upstream static --name test --static-hosts example.com:80
Error: unknown flag: --static-hosts
Usage:
  glooctl create upstream static [flags]

Flags:
  -h, --help                       help for static
      --service-spec-type string   if set, Gloo supports additional routing features to upstreams with a service spec. The service spec defines a set of features
      --static-hosts strings       list of hosts for the static upstream. these are hostnames or ips provided in the format IP:PORT or HOSTNAME:PORT. if :PORT is missing, it will default to :80
      --static-outbound-tls        connections Gloo manages to this cluster will attempt to use TLS for outbound connections. Gloo will automatically set this to true for port 443

Global Flags:
  -i, --interactive        use interactive mode
      --name string        name of the resource to read or write
  -n, --namespace string   namespace for reading or writing resources (default "gloo-system")
  -o, --output string      output format: (yaml, json, table)

Not able to test it in minikube

I followed all the steps and tried installing, and I got the YAML below when executing "glooctl upstream get default-petstore-8080 -o yaml":

metadata:
  annotations:
    generated_by: kubernetes-upstream-discovery
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"v1","kind":"Service","metadata":{"annotations":{},"labels":{"sevice":"petstore"},"name":"petstore","namespace":"default"},"spec":{"ports":[{"port":8080,"protocol":"TCP"}],"selector":{"app":"petstore"}}}
  namespace: gloo-system
  resource_version: "950"
  name: default-petstore-8080
spec:
  service_name: petstore
  service_namespace: default
  service_port: 8080
status:
  state: Accepted
type: kubernetes
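
For what it's worth, state: Accepted in that output indicates discovery created and accepted the upstream. Assuming the CRDs registered in the logs above, the discovered upstreams can also be listed directly with kubectl:

# list upstreams created by discovery (CRD group gloo.solo.io, short name "us")
kubectl get upstreams.gloo.solo.io -n gloo-system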

creating glooctl upstream gives "Error: unknown flag: --kube-service"

I have a service in the namespace datasvc-contacts and ran the command below, which I think is correct. It gives me an error saying --kube-service is not a valid flag:

glooctl create upstream kube -n gloo-datasvc-contacts \
  --kube-service datasvc-contacts-svc \
  --kube-service-namespace datasvc-contacts \
  --kube-service-port 8080

Error: unknown flag: --kube-service
Usage:
  glooctl create upstream kube [flags]

Flags:
  -h, --help                            help for kube
      --kube-service string             name of the kubernetes service
      --kube-service-labels strings     labels to use for customized selection of pods for this upstream. can be used to select subsets of pods for a service e.g. for blue-green deployment
      --kube-service-namespace string   namespace where the kubernetes service lives (default "default")
      --kube-service-port uint32        the port where the service is listening. for services listening on multiple ports, create an upstream for each port. (default 80)
      --service-spec-type string        if set, Gloo supports additional routing features to upstreams with a service spec. The service spec defines a set of features

Global Flags:
  -i, --interactive        use interactive mode
      --name string        name of the resource to read or write
  -n, --namespace string   namespace for reading or writing resources (default "gloo-system")
  -o, --output string      output format: (yaml, json, table)

glooctl version 0.5.0
Running on macOS with the latest OS update, dark mode enabled if it matters.
Latest generation 15" MacBook.

fix install script

  • Release needs SHA artifacts
  • Need to re-mount working install script on the site

ability to edit the settings crd from the cli

in order to more easily expose the Settings crd, cli commands for the following (a hypothetical invocation sketch is after the list):

  • add/remove watchNamespace (also the ability to configure RBAC for those namespaces?)
  • set configsource (consul, kube, vault, file)
  • reset to defaults
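
Purely a sketch of the desired UX; none of these subcommands or flags exist yet, they are all hypothetical:

# hypothetical: watch an additional namespace
glooctl edit settings --add-watch-namespace my-ns
# hypothetical: switch the config source
glooctl edit settings --config-source kube
# hypothetical: restore defaults
glooctl edit settings --reset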

add a preview option to install

For folks who wish to deal directly with the Kube yaml, it would be good to have a preview option:

glooctl install kube --preview, which spits out the kube yaml
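
Assuming the hypothetical --preview flag simply writes the manifest to stdout, the workflow might look like:

# hypothetical: render the install manifest without applying it
glooctl install kube --preview > gloo-install.yaml
# review the yaml, then apply it manually
kubectl apply -f gloo-install.yaml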

Diagnose problems during installation

Feature Request:
Provide a glooctl diagnose install-like feature that would notify you beforehand of possible problems you will encounter during the installation.
Suggested verifications (a scripted preflight sketch follows the list):

  • version mismatch or incompatibility

  • permissions

  • licenses (if it is the case)

  • access to bits (offline installation)
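
Until such a command exists, a rough preflight for the permissions item can be scripted with stock kubectl (the specific resources and namespace checked here are assumptions):

# check cluster reachability and version skew
kubectl version --short
# check that we may create the resources the install needs
kubectl auth can-i create customresourcedefinitions
kubectl auth can-i create deployments -n gloo-system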

gen_map_test can flake

/Users/rick/code/src/github.com/solo-io/gloo/pkg/cliutil/nsselect/gen_map_test.go:20

  Expected
      <[]string | len:3, cap:4>: ["ns2, s3", "ns1, s1", "ns1, s2"]
  to equal
      <[]string | len:3, cap:3>: ["ns1, s1", "ns1, s2", "ns2, s3"]

  /Users/rick/code/src/github.com/solo-io/gloo/pkg/cliutil/nsselect/gen_map_test.go:22
------------------------------


Summarizing 1 Failure:

[Fail] Generate Options [It] should create the correct Secret options and map
/Users/rick/code/src/github.com/solo-io/gloo/pkg/cliutil/nsselect/gen_map_test.go:22

Ran 1 of 1 Specs in 0.001 seconds
FAIL! -- 0 Passed | 1 Failed | 0 Pending | 0 Skipped
--- FAIL: TestGit (0.00s)
FAIL
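
The expected and actual slices hold the same elements in different orders, which points at Go's randomized map iteration when the options are generated from a map. A sketch of an order-insensitive assertion with Gomega (assuming the test uses Ginkgo/Gomega, as the failure output suggests; the names below are stand-ins for the real test):

package nsselect

import (
	"sort"

	. "github.com/onsi/ginkgo"
	. "github.com/onsi/gomega"
)

var _ = Describe("Generate Options", func() {
	It("should create the correct Secret options and map", func() {
		// stand-in for the options generated from a map
		opts := []string{"ns2, s3", "ns1, s1", "ns1, s2"}
		// compare contents regardless of order, since map iteration order is random
		Expect(opts).To(ConsistOf("ns1, s1", "ns1, s2", "ns2, s3"))
		// or pin a deterministic order before an exact comparison
		sort.Strings(opts)
		Expect(opts).To(Equal([]string{"ns1, s1", "ns1, s2", "ns2, s3"}))
	})
})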

Automate release process

Currently releasing gloo requires:

  • Manually build and push docker images (gloo, gateway, envoy-gloo, discovery) with the new release tag
  • Update kube install/kube.yaml in this repo to use the new image tags
  • Regenerate kube.yaml.go
  • Commit and merge PR to master
  • Create a tag and GH release
  • Manually create and upload the artifacts (glooctl, glooctl SHAs) to the GH release
  • Manually create site docs (make site), create and push docker image
  • Update prod with new docs docker image
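
As a first automation step, the image build-and-push could be scripted. Everything below except the image names (taken from the list above) and the soloio registry prefix is an assumption, including the Dockerfile locations:

#!/bin/sh
# hypothetical release helper: build and push all images for a given tag
set -e
TAG="$1"
for img in gloo gateway envoy-gloo discovery; do
    # Dockerfile paths are assumed, not the repo's actual layout
    docker build -t "soloio/$img:$TAG" -f "cmd/$img/Dockerfile" .
    docker push "soloio/$img:$TAG"
done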

Question on NATS-streaming integration

In the source_events_from_github example, I found some quite magic stuff:

# Deploy NATS and minio
# --- this command only deploys the NATS-streaming server and minio server
kubectl apply -f \
 https://raw.githubusercontent.com/solo-io/gloo/master/example/source_events_from_github/kube-deploy.yaml

# Create a route for nats
# --- Route HTTP requests to NATS
glooctl route create --sort \
    --path-exact /github-webhooks \
    --upstream default-nats-streaming-4222 \
    --function github-webhooks

I can track that NATS-streaming server is auto discovered: https://github.com/solo-io/gloo/blob/master/pkg/function-discovery/nats-streaming/discover_nats_test.go

Can you explain where the logic that pushes the event to NATS-streaming lives? I can't see where the logic for github-webhooks is implemented.

Thanks.

Update: I found that gloo provides a TCP filter for the NATS-streaming upstream via https://github.com/solo-io/envoy-nats-streaming/blob/master/source/common/nats/message_builder.cc. I think the NATS-streaming integration deserves better documentation. Thanks again for a great integration.
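
To make the flow concrete: with the route above in place, an HTTP request to the gateway is turned into a publish on the NATS-streaming subject by that filter. Something like this (GATEWAY_URL and the payload are illustrative):

# POST a (fake) webhook payload; the gateway publishes it to the
# github-webhooks subject on the default-nats-streaming-4222 upstream
curl ${GATEWAY_URL}/github-webhooks \
    -H "Content-Type: application/json" \
    -d '{"action": "opened"}'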

Tutorial #2 doesn't fail as expected

In https://github.com/solo-io/gloo/blob/master/docs/getting_started/kubernetes/2.md#steps the instructions state:

 curl ${GATEWAY_URL}/petstore/findPet

 bad request: Did not found json element: id

But it no longer errors like this.

I believe the function definitions have changed. The tutorial expects them to be:

glooctl upstream get default-petstore-8080 -o yaml
functions:
     ...
     - name: findPetById
       spec:
         body: ""
         headers:
           :method: GET
         path: /api/pets/{{id}}

But instead they are now:

functions:
...
- name: findPetById
  spec:
    body: ""
    headers:
      :method: GET
    passthrough_body: false
    path: /api/pets/{{ default(id, "") }}
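
If I read the new template right, the default(id, "") filter substitutes an empty string when no id is extracted from the request, so rendering no longer fails with the "Did not found json element: id" error; presumably the tutorial text needs updating to match.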

glooctl get routes error message a bit confusing

ceposta@postamaclab(~) $ glooctl get virtualservice routes
Error: virtualservice id provided was incorrect
Usage:
  glooctl get virtualservice route [flags]

Maybe we could update the usage to

glooctl get virtualservice [virtual service name] routes

thoughts?

BUG: glooctl v0.5.2 is actually 0.5.0

I'm unsure if the version is being reported incorrectly or if it's actually the wrong version:

/tmp# curl -LO https://github.com/solo-io/gloo/releases/download/v0.5.2/glooctl-linux-amd64
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   610    0   610    0     0   2850      0 --:--:-- --:--:-- --:--:--  2850
100 40.6M  100 40.6M    0     0  2771k      0  0:00:15  0:00:15 --:--:-- 2946k
/tmp# chmod +x glooctl-linux-amd64
/tmp# ./glooctl-linux-amd64 --version
glooctl version 0.5.0

Function discovery not running

Following the "running gloo with docker-compose" guide: the other services are up after running docker-compose up, but function discovery is not running.

docker ps
returns

  • soloio/envoy:0.2.28
  • soloio/petstore-example:v0.1
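
A first thing to check (assuming the compose service is named function-discovery in the compose file) is its logs:

# see why the container exited or never started
docker-compose logs function-discovery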
