
mercadona / postoffice


A dispatching service implemented in Elixir. Communicate between services, or send API calls to yourself, to process anything later.

License: Apache License 2.0

Elixir 65.43% Dockerfile 0.32% CSS 7.32% JavaScript 16.17% HTML 7.70% Makefile 1.19% Shell 1.86%
elixir pubsub

postoffice's People

Contributors

andrewgy8, awaistkd, daniel-ruiz, dependabot[bot], elreplicante, emiliocarrion, epalenque, ethervoid, fery, igarridot, igorrodriguez, jbonoraw, jjponz, jsansaloni, kianmeng, lonamiaec, lucasyarza, sanntt, vbergae


postoffice's Issues

Rename property `endpoint` from publishers model

This name is a bit confusing because when a publisher's type is pubsub, the topic name must be the same as the endpoint.

The model currently lives at:

lib/postoffice/messaging/publisher.ex

Fix find message feature

With the latest release, our finder was broken.
Before the changes, when someone searched for a message's id, we only needed to look in the messages table. Now that table no longer exists, so we should look for messages in oban_jobs. One thing to take into account here is that the oban_jobs table is pruned every 24 hours.

We have two new tables, sent_messages and failed_messages. In those tables we have the original message_id and its original content. We should look in all three of these tables:

  • oban_jobs for messages not yet sent, being sent, or scheduled
  • sent_messages for successfully sent messages
  • failed_messages for messages that could not be sent at some point

The information we'd love to show as a search result is the message's content and every success/failure (when it happened and which publisher was involved).
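
A minimal sketch of how the combined lookup could work, assuming schemaless Ecto queries and that the Oban job args carry the original message_id; the module, the selected columns, and the args layout are assumptions, not the project's actual code:

defmodule Postoffice.MessageFinder do
  # Hypothetical helper: search oban_jobs, sent_messages and failed_messages by message id.
  import Ecto.Query
  alias Postoffice.Repo

  def find(message_id) do
    %{
      pending:
        Repo.all(
          from j in "oban_jobs",
            # Assumes the job args JSON contains the original message_id.
            where: fragment("?->>'message_id' = ?", j.args, ^to_string(message_id)),
            select: j.args
        ),
      sent:
        Repo.all(
          from s in "sent_messages",
            where: s.message_id == ^message_id,
            select: %{payload: s.payload, publisher_id: s.publisher_id, inserted_at: s.inserted_at}
        ),
      failed:
        Repo.all(
          from f in "failed_messages",
            where: f.message_id == ^message_id,
            select: %{payload: f.payload, publisher_id: f.publisher_id, inserted_at: f.inserted_at}
        )
    }
  end
end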

Make topic creation and publisher creation symmetric

If we try to create a publisher that already exists, we return a 400 error, but if we try to create an existing topic, we return 200.

We should modify topic creation to return 400, the same way publishers work.

Create more contexts

At this moment, we have these contexts:

  • publisher_producer
  • messaging
  • message_producer_supervisor
  • message_producer
  • message_consumer_supervisor
  • message_consumer
  • dispatch

In the messaging context, for example, we access both Topic information and Publisher information.

I think we should create more contexts to manage the information correctly.

Re-enable publish past messages when creating a publisher

Re-enable publishing past messages when creating a publisher, but instead of using the from_now property, remove it from the API and replace it with start_from, which indicates the date from which to start sending old messages.

The new payload to create a publisher will be:

{
  "active": true,
  "target": "http://www.some_url.com",
  "topic": "some_topic",
  "type": "http",
  "start_from": "2020-01-01"
}

The start_from field is optional.
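
Since start_from is optional, the creation flow has to handle both cases. A hypothetical sketch of the parsing step (the helper name and its place in the publisher creation flow are assumptions):

# Parse the optional start_from field (ISO 8601 date); its absence means
# "do not publish past messages".
defp parse_start_from(%{"start_from" => date_string}) do
  case Date.from_iso8601(date_string) do
    {:ok, date} -> {:ok, date}
    {:error, _reason} -> {:error, :invalid_start_from}
  end
end

defp parse_start_from(_params), do: {:ok, nil}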

Mark publishers as deleted

The only way to "remove" a publisher nowadays is marking it as inactive, but that has a downside: the publisher's messages will never be sent, yet we keep marking them as scheduled, so we try to send them forever.
We need to add a new field to publishers so we can mark them as deleted. If we're trying to send a message and the publisher is inactive, we keep the message as scheduled, but if the publisher is also marked as deleted we need to set the message as discarded.
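
A hedged sketch of what the dispatch decision could look like, assuming an Oban worker and a new deleted boolean on publishers; the module and the Messaging.get_publisher!/1 call are illustrative, not existing code:

defmodule Postoffice.MessageWorker do
  use Oban.Worker, queue: :messages

  alias Postoffice.Messaging

  @impl Oban.Worker
  def perform(%Oban.Job{args: %{"publisher_id" => publisher_id} = args}) do
    case Messaging.get_publisher!(publisher_id) do
      # Publisher removed for good: stop retrying and mark the job as discarded.
      %{deleted: true} -> {:discard, "publisher deleted"}
      # Publisher only inactive: keep the message scheduled and retry later.
      %{active: false} -> {:snooze, 300}
      publisher -> deliver(publisher, args)
    end
  end

  # Placeholder for the existing delivery logic.
  defp deliver(_publisher, _args), do: :ok
end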

Manage errors correctly when creating a message

When creating a message, errors must be managed correctly.

At this moment, if we try to create a message with a nonexistent topic, we return an Internal Server Error because we don't handle the nonexistent-topic case.
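
A hypothetical controller sketch showing how the missing topic could be turned into a 400 instead of a 500; the controller module, the Messaging.get_topic_by_name/1 lookup and the error shape are all assumptions:

defmodule PostofficeWeb.Api.MessageController do
  use Phoenix.Controller
  import Plug.Conn

  alias Postoffice.Messaging

  def create(conn, %{"topic" => topic_name} = params) do
    case Messaging.get_topic_by_name(topic_name) do
      nil ->
        conn
        |> put_status(:bad_request)
        |> json(%{data: %{errors: %{topic: "topic #{topic_name} does not exist"}}})

      topic ->
        create_message(conn, topic, params)
    end
  end

  # Placeholder for the existing happy path.
  defp create_message(conn, _topic, _params), do: send_resp(conn, 201, "")
end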

Spike to move goth config to module

We need to set an env variable with the path to a credentials JSON file in order to start the app.
There is an option to use our own config module, so it would be interesting to read that JSON from the DB when we have a configured identity, or a dummy JSON when we don't, so the app is still able to run.
With this change we could stop copying files into our Docker image, and we'd be able to publish a public Postoffice image on Docker Hub, as it would run out of the box without any extra step.
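
A hedged sketch of the idea, assuming Goth 1.x's config_module option, where a module using Goth.Config can rewrite the configuration in init/1; the DB lookup and dummy credentials are illustrative placeholders, and the exact Goth API should be checked against the version in use:

defmodule Postoffice.GothConfig do
  use Goth.Config

  # Fallback so the app can boot with no real identity configured.
  @dummy_credentials ~s({"type": "service_account", "project_id": "dummy"})

  def init(config) do
    # Hypothetical context function reading stored credentials from the DB.
    json = Postoffice.Messaging.get_gcloud_credentials() || @dummy_credentials
    {:ok, Keyword.put(config, :json, json)}
  end
end

# In config/config.exs:
# config :goth, config_module: Postoffice.GothConfig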

Set topic name as unique

Topic name should be unique at the DB level.

Moreover, Ecto can do this validation automatically.
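
A minimal sketch of the migration and the matching changeset validation (the migration name is an assumption):

defmodule Postoffice.Repo.Migrations.AddUniqueIndexToTopicsName do
  use Ecto.Migration

  def change do
    create unique_index(:topics, [:name])
  end
end

# And in the Topic changeset:
# |> unique_constraint(:name)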

Subscribing from the client

AFAIK, the current model of this project is based on POSTing messages to a client endpoint. Basically, a client exposes an API that allows the broker to send it messages.

What I would like to see and try to do is subscribe to a topic.

Right now, I can only envision this one way, but I would love to hear other ideas.

My idea is to open a websocket in postoffice which the subscriber can start listening to. With this, a client can connect to the websocket and a stream of messages will start to flow. Of course, this leads to issues such as:

  • rate limiting
  • message ordering
  • retry strategies
  • and probably a lot more stuff.

However, I think it would be a really good addition to the project, since a project with more than a dozen subscribers would need to manage a dozen or more endpoints. It would be a very passive way to listen to messages/events on the client side.
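
For illustration only, assuming Phoenix Channels were added to Postoffice (the module, topic naming and broadcast point are all hypothetical), a subscription could look roughly like this:

defmodule PostofficeWeb.TopicChannel do
  use Phoenix.Channel

  # Subscribers join one channel per Postoffice topic, e.g. "topic:orders".
  def join("topic:" <> _topic_name, _params, socket) do
    {:ok, socket}
  end
end

# Wherever a message is dispatched, it could also be pushed to subscribers:
# PostofficeWeb.Endpoint.broadcast("topic:#{topic.name}", "message", payload)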

Keep message history clean

It would be nice to have a system that keeps the message history clean by removing old messages.

To achieve this, we should add an environment variable (MESSAGE_RETENTION_DAYS, for example) to define how many days we preserve messages (7, for example), and run a daily task to remove the old ones.

To do this, we can use quantum-elixir/quantum-core
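
A hedged sketch of how this could be wired with Quantum v3; the scheduler module, the HistoryCleaner task and the env variable handling are assumptions, not existing code:

defmodule Postoffice.Scheduler do
  use Quantum, otp_app: :postoffice
end

# Postoffice.Scheduler also has to be added to the application's supervision tree.
#
# In the runtime config:
# retention_days = String.to_integer(System.get_env("MESSAGE_RETENTION_DAYS", "7"))
#
# config :postoffice, Postoffice.Scheduler,
#   jobs: [
#     # Run once a day and drop everything older than the retention window.
#     {"@daily", {Postoffice.HistoryCleaner, :clean_messages, [retention_days]}}
#   ]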

Error using pubsub adapter

When trying to send messages via pubsub, Postoffice throws an error.

The problem seems to live in this piece of code:

request = %GoogleApi.PubSub.V1.Model.PublishRequest{
  messages: [
    %GoogleApi.PubSub.V1.Model.PubsubMessage{
      data: Base.encode64(Poison.encode!(payload)),
      attributes: attributes
    }
  ]
}

The file path is: lib/postoffice/adapters

The Google client expects attributes to be a map, but this value doesn't seem to be one.

We must fix this to be able to send messages via pubsub.
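
One possible fix, offered as an assumption rather than the agreed solution, is to normalize attributes before building the request:

# Coerce keyword lists (or nil) into the map the Google client expects.
attributes = if is_map(attributes), do: attributes, else: Map.new(attributes || [])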

Post messages with different topics in bulk

There are cases in which a group of use cases happens at the same time and eventual consistency must be guaranteed.
For each one of these use cases, an event is published for different topics.
An example would be the validation by scanning in dispatch, where the following events might happen and therefore be published:

  • Order picking area validated
  • Crates updated
  • Order staging area dispatched
  • Order dispatched

If an error occurred in the middle, the database would roll back, but some events would already have been published.

Furthermore, each request to Postoffice adds latency, which could be reduced to just one call.

The proposal would be to expose an endpoint that allows us to send messages for different topics in bulk.

Postoffice already exposes POST /bulk/messages/, but it only supports messages with the same topic.
I think that's misleading, as the topic belongs to the message.

In order not to break the current API, our proposal is to expose an endpoint like POST /bulk/multitopic-messages/ which would accept a body like:

{
  "messages": [
    {"topic": "a-topic", "payload": {...}},
    ...
  ]
}

Format numbers in dashboard

Dashboard widgets should format the numbers they present; here is an example:
[screenshot: current, unformatted widget values]
It would be nice to give them a proper format to help readability, as follows:
[screenshot: proposed, formatted widget values]

Add new model for "rescue" process

We're reading the origin_host from topics, so every system that owns a topic must specify the origin host, creating many duplicated values.
It would be easier to have a specific model to keep origin_host values and whether the recovery process is enabled.
The proposed schema would be:

  • id
  • origin_host
  • enabled

Once this model is in place, we would need to update the get_recovery_hosts() method from the Messaging module to read from the new model instead of the topic's data.
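
A minimal sketch of the proposed model as an Ecto schema; the module and table names are assumptions:

defmodule Postoffice.Messaging.RecoveryHost do
  use Ecto.Schema
  import Ecto.Changeset

  schema "recovery_hosts" do
    field :origin_host, :string
    field :enabled, :boolean, default: true

    timestamps()
  end

  def changeset(recovery_host, attrs) do
    recovery_host
    |> cast(attrs, [:origin_host, :enabled])
    |> validate_required([:origin_host])
  end
end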

Simplify payload of message api

The current API to create a message is:

{
  "message": {
    "attributes": {},
    "payload": {},
    "topic": ""
  }
}

We can remove the message key to simplify the payload:

{
  "attributes": {},
  "payload": {},
  "topic": ""
}

Simplify payload of topic api

The current API to create a topic is:

{
  "topic": {
    "name": "name"
  }
}

We can remove the topic key to simplify the payload:

{
  "name": "name"
}

Add origin_host to publisher's schema

To enable the recovery of undelivered messages, we need to know where to fetch them from.

Add origin_host to Publisher. It should be a nullable string field; we need it to be nullable because pubsub publishers won't require this recovery process.
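
A hedged migration sketch (the migration name is an assumption):

defmodule Postoffice.Repo.Migrations.AddOriginHostToPublishers do
  use Ecto.Migration

  def change do
    alter table(:publishers) do
      # Nullable on purpose: pubsub publishers don't take part in the recovery process.
      add :origin_host, :string, null: true
    end
  end
end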

Change response status in case the resource already exists

While trying to create topics/publishers via the API, we return 400 Bad Request when the topic/publisher we're trying to create already exists.
I propose changing it to return 409 Conflict when the resource already exists, and keeping 400 Bad Request for genuinely bad requests.
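
Inside the create action of the topic (or publisher) controller, an illustrative branch could look like this; the context function, view names and the exact changeset error message are assumptions:

case Messaging.create_topic(params) do
  {:ok, topic} ->
    conn |> put_status(:created) |> render("show.json", topic: topic)

  # "Already exists" is reported via the unique constraint on the name.
  {:error, %Ecto.Changeset{errors: [name: {"has already been taken", _}]} = changeset} ->
    conn |> put_status(:conflict) |> render("error.json", changeset: changeset)

  # Anything else is a genuinely bad request.
  {:error, changeset} ->
    conn |> put_status(:bad_request) |> render("error.json", changeset: changeset)
end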
