
sdk-csharp's Introduction


This SDK currently supports the following versions of CloudEvents:

  • v1.0

sdk-csharp

.NET Standard 2.0 (C#) SDK for CloudEvents

The CloudNative.CloudEvents package provides support for creating, encoding, decoding, sending, and receiving CNCF CloudEvents. Most applications will want to add dependencies on other CloudNative.CloudEvents.* packages for specific event format and protocol binding support. See the user guide for details of the packages available.

A few gotchas highlighted for the impatient who don't usually read docs

  1. The CloudEvent class is not meant to be used with object serializers like JSON.NET. If you need to serialize or deserialize a CloudEvent directly, always use a CloudEventFormatter such as JsonEventFormatter.
  2. Protocol binding integration is provided in the form of extensions and the objective of those extensions is to map the CloudEvent to and from the respective protocol message, such as an HTTP request or response. The application is otherwise fully in control of the client. Therefore, the extensions do not add security headers or credentials or any other headers or properties that may be required to interact with a particular product or service. Adding this information is up to the application.
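The first gotcha can be sketched as a minimal round-trip (this assumes the 1.x-era API used throughout the issues below, where EncodeStructuredEvent returns the event as a byte array; exact method names differ in later versions):

using System;
using System.Net.Mime;
using System.Text;
using CloudNative.CloudEvents;

class RoundTrip
{
    static void Main()
    {
        var cloudEvent = new CloudEvent("com.example.sample", new Uri("https://example.com/source"))
        {
            DataContentType = new ContentType("application/json"),
            Data = "{\"hello\":\"world\"}"
        };

        // Always go through a CloudEventFormatter, never a general-purpose
        // object serializer such as JSON.NET applied to the CloudEvent itself.
        var formatter = new JsonEventFormatter();
        byte[] bytes = formatter.EncodeStructuredEvent(cloudEvent, out var contentType);
        Console.WriteLine(Encoding.UTF8.GetString(bytes));

        var decoded = formatter.DecodeStructuredEvent(bytes);
        Console.WriteLine(decoded.Type);
    }
}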

User guide and other documentation

The docs/ directory contains more documentation, including the user guide. Feedback on what else to include in the documentation is particularly welcome.

Changes since 1.x

From version 2.0.0-beta.2, there are a number of breaking changes compared with the 1.x series of releases. New code is strongly encouraged to adopt the latest version rather than relying on the 1.3.80 stable release.

The stable 2.0.0 version was released on June 15th 2021, and all users are encouraged to use this (or a later) version.

A more detailed list of changes is provided within the documentation.

Community

The C# SDK welcomes community contributions; see the contributing document for more details.

Each SDK may have its own unique processes, tooling and guidelines. Common governance-related material can be found in the CloudEvents docs directory. In particular, there you will find information concerning how SDK projects are managed, guidelines for PR reviews and approval, and our Code of Conduct information.

If there is a security concern with one of the CloudEvents specifications, or with one of the project's SDKs, please send an email to [email protected].

Additional SDK Resources


sdk-csharp's Issues

Work out where to put the version history of each package

As the packages evolve independently, I suspect we'll need separate version history docs.

First thought: in the same directory as the source code, e.g. src/CloudNative.CloudEvents/HISTORY.md? We'd want to ensure it's not built into the package, but that's easy enough to do. It's not terribly discoverable though.

I'm very open to other ideas.

Kafka transport support

The specification for Kafka is currently in draft mode, but it seems stable enough (since it's very similar to other transports) to implement.

Serialize json using System.Text.Json

Are there any plans to use System.Text.Json as the serializer instead of Newtonsoft.Json?

If you have an application that uses System.Text.Json, it hurts to add a dependency on Newtonsoft.Json when using CloudEvents.

Consider versioning implications of InternalsVisibleTo

Currently CloudNative.CloudEvents exposes internals to other libraries in Attributes.cs:

[assembly: InternalsVisibleTo("CloudNative.CloudEvents.Amqp")]
[assembly: InternalsVisibleTo("CloudNative.CloudEvents.Mqtt")]
[assembly: InternalsVisibleTo("CloudNative.CloudEvents.Kafka")]

This means that those internals are effectively part of the public API for versioning purposes - if the Kafka project calls into an internal method in (say) version 1.4.0 of CloudNative.CloudEvents, that internal method can't be removed or changed in a breaking way until 2.0.
My own experience is that it's simpler to limit InternalsVisibleTo to test projects, which don't have the same issues.

I'll do a little audit of why those internals are made available like that - what uses there are. (To be absolutely bulletproof we'd have to check every reference from every published... but I think restricting it to 1.3.80 will be good enough, given the low usage of earlier versions.)

CloudNative.CloudEvents.AspNetCoreSample will not Auth with Azure Portal

Registering a WebHook in the Azure portal means you have to pass the authorization test. This sample code does not allow for it, and hence cannot be used.

I tried to understand how the OPTIONS verb is used, which requires CORS to be enabled, as well as a few other challenges. In the end I decided to create an event subscription to a Service Bus queue and read from that. That's likely better for reliability and retry, but again, IMO the sample code will not work as it stands today.

CloudEventController.cs, method ReceiveCloudEvent (line 19)

Revisit extensions design

I might be ignorant on this, but I haven't found any documentation to clarify regarding extensions. Right now, when you're writing a new event, I figured out how to attach extension properties using some of the extensions that are in this repo, but because of the way the Attach() method is formed, it requires some info up front in the constructor. I can set some private fields in the ctor and then reference them in the attach method. Everything builds and serializes just fine.

Coming the other direction, however... if I use JsonEventFormatter as suggested, it requires that I call out any extensions in the Decode method, and then it seems to execute those in exactly the same way, which means I need those ctor arguments up front - but I don't have them, because they're in the message. Short of deserializing to JToken and digging them out, I'm not sure how this is supposed to work.

Is this a design flaw or am I doing this wrong?

Azure Function HttpTrigger for CloudEvents

Hi,
I've created a new Azure Function HTTP trigger for receiving cloud events, and I've installed the CloudEvents NuGet package. Based on the documentation, for an HttpRequestMessage I should be able to use the ToCloudEvent() extension method to get the CloudEvent passed (https://github.com/cloudevents/sdk-csharp#http---systemnethttplistener-httprequestmessage). The function looks something like this:

[FunctionName("EventGridBlobEventHandler")]
public static async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Function, "options", "post", Route = null)] HttpRequestMessage req,
    ILogger log)
{
    if (req.Method.Method == "OPTIONS")
    {
        // If the request is for subscription validation, send back the validation code
        var response = req.CreateResponse(HttpStatusCode.OK);
        response.Headers.Add("Webhook-Allowed-Origin", "eventgrid.azure.net");
        return response;
    }

    var content = await req.Content.ReadAsStringAsync();

    var @event = req.ToCloudEvent();
    log.LogInformation("Received eventType: {EventType}", @event.Type);
    return req.CreateResponse(HttpStatusCode.OK);
}

I'm trying to test it locally with Postman, doing a POST request with a JSON-formatted cloud event in the body, but after the req.ToCloudEvent() call the event variable does not contain the data I'm passing. What am I missing? From the docs I can't see what I'm doing wrong.

Thanks!

Consider whether we need resources for error messages

Currently exception error messages are in a resource string lookup. This has two advantages and one disadvantage:

Advantages:

  • It's localizable
  • We can potentially improve consistency, using the same resource in multiple places

Disadvantage:

  • It makes the code throwing the exception more complex

Experience suggests we won't actually get round to localizing the exception messages... so do we think the remaining advantage is worth the complexity disadvantage?

HTTP header DateTime format should be in RFC 3339 format to avoid errors in Knative Eventing

The CloudEvent header date-time format is not in RFC 3339 format: there needs to be a "T" between the date and the time.

E.g.:
2020-06-17 12:52:43Z - wrong
2020-06-17T12:52:43Z - correct

Knative Eventing (broker filter) rejects non RFC 3339 formatted DateTime.

To fix:
In sdk-csharp/src/CloudNative.CloudEvents/HttpClientExtension.cs
Function: MapAttributesToListenerResponse()

Change line 471 from:

 httpListenerResponse.Headers.Add(HttpHeaderPrefix + attribute.Key,
                            ((DateTime)attribute.Value).ToString("u"));

To

 httpListenerResponse.Headers.Add(HttpHeaderPrefix + attribute.Key,
                            XmlConvert.ToString((DateTime)attribute.Value, XmlDateTimeSerializationMode.Utc));
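The difference between the two calls can be seen with plain BCL code (a small illustration, not part of the SDK):

using System;
using System.Xml;

class FormatDemo
{
    static void Main()
    {
        var utc = new DateTime(2020, 6, 17, 12, 52, 43, DateTimeKind.Utc);

        // "u" (universal sortable) uses a space separator - not valid RFC 3339:
        Console.WriteLine(utc.ToString("u"));   // 2020-06-17 12:52:43Z

        // XmlConvert emits ISO 8601 / RFC 3339 with the required 'T':
        Console.WriteLine(XmlConvert.ToString(utc, XmlDateTimeSerializationMode.Utc));
        // 2020-06-17T12:52:43Z
    }
}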

Support Custom JSON JsonSerializerSettings

I would like to customize the serializer to not send null properties in the message. I am able to do this by changing the default settings, but I do not want to change this for the entire application, and would rather not clone the default settings, do the work, and then set them back to the original settings.

Any other ideas?

I currently have this
JsonConvert.DefaultSettings = () => new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore };
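One workaround that avoids touching the global defaults is to pass settings per call when pre-serializing the payload, then assign the resulting string as the event Data (a sketch only; whether the formatter itself should accept settings is the open question here):

using Newtonsoft.Json;

class Payload
{
    public string Name { get; set; }
    public string Optional { get; set; } // null values should be omitted
}

class Program
{
    static void Main()
    {
        // Local settings: scoped to this call, leaving JsonConvert.DefaultSettings alone.
        var settings = new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore };

        string json = JsonConvert.SerializeObject(new Payload { Name = "x" }, settings);
        System.Console.WriteLine(json); // {"Name":"x"}
    }
}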

JsonEventFormatter UriFormatException with relative Uris

Hello,

When attempting to materialize a CloudEvent from a KafkaCloudEventMessage, I get a UriFormatException saying the source attribute cannot be constructed.

UriFormatException: Invalid URI: The format of the URI could not be determined.

I believe it has to do with the source URI supplied in my original CloudEvent, which is relative.

Now, the 1.0 spec recommends using absolute URIs for the source attribute, but it does not forbid relative ones.

Therefore, shouldn't JsonEventFormatter be able to handle relative URIs too?

How to reproduce

var formatter = new JsonEventFormatter();
var e = new CloudEvent("test", new Uri("/source/test", UriKind.RelativeOrAbsolute))
{
    DataContentType = new ContentType(MediaTypeNames.Application.Octet),
    Data = Encoding.UTF8.GetBytes("Hello, world!")
};
var kafkaMessage = new KafkaCloudEventMessage(e, ContentMode.Binary, formatter);
e = kafkaMessage.ToCloudEvent(formatter);

Support text/plain Media Type in CloudEventJsonInputFormatter

I don't know that this could be considered as an issue or not, but I want to ask/suggest it anyway.

We are trying to use cloud events on the OpenShift platform. Our problem is that the system sends events to our ASP.NET Core application with a text/plain Content-Type. We solved this by copying the source code of CloudEventJsonInputFormatter into our project, renaming it CloudEventTextInputFormatter, and changing SupportedMediaTypes to text/plain like this:

SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse("text/plain"));

Since the underlying ReadCloudEventAsync method does not make any transformation while reading the request body, it works like a charm. Here is the code for assigning the Data field:

cloudEvent.Data = await new StreamReader(httpRequest.Body, Encoding.UTF8).ReadToEndAsync();

What do you think? Could it be integrated into CloudEventJsonInputFormatter? All it needs is the SupportedMediaTypes line I shared above. It could look like this:

SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse("application/json"));
SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse("application/cloudevents+json"));
SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse("text/plain"));

Review HTTP tests carefully

Currently our ASP.NET Core integration test only works because when you create an HttpContent from an event without data, in binary mode, the formatter ends up creating content of "null" (as four bytes). If there's no data, we shouldn't be specifying a DataContentType or populating the request body - but that throws at the moment.

Likewise other HTTP tests specify a content type of XML despite using a JSON event formatter - it's not clear what the expected behaviour is there, in general. (That's related to #55.)

Feature Request: Protobuf formatter support

I have seen that we have JsonEventFormatter; it would be great if we could support a ProtobufFormatter too, given the gRPC template in .NET Core 3.0. Some use cases rely on the Protobuf format, such as putting Protobuf messages onto Kafka, or transferring Protobuf messages back and forth between gRPC services.

ContentType decode exception

Hello,

I receive this exception when I try to deserialize an event which I receive from Azure Service Bus.
Exception:

The 'contenttype' attribute value must be a content-type expression compliant with RFC2046
   at CloudNative.CloudEvents.CloudEventAttributes.ValidateAndNormalize(String key, Object& value)
   at CloudNative.CloudEvents.CloudEventAttributes.set_Item(String key, Object value)
   at CloudNative.CloudEvents.JsonEventFormatter.DecodeJObject(JObject jObject, IEnumerable`1 extensions)
   at CloudNative.CloudEvents.JsonEventFormatter.DecodeStructuredEvent(Byte[] data, IEnumerable`1 extensions)

The event looks like this:

{
  "DataContentType": {
    "Boundary": null,
    "CharSet": null,
    "MediaType": "application/json",
    "Name": null,
    "Parameters": []
  },
  "Data": {
    "OrderNumber": "#",
    "TrackingNumber": "#",
    "StateId": 5
  },
  "Id": "dbb05af1-d0fb-46ee-9c8e-df585c02362a",
  "DataSchema": null,
  "Source": "http://localhost:61974/",
  "SpecVersion": 3,
  "Subject": "510589659",
  "Time": "2020-11-20T05:26:56.8365438Z",
  "Type": "trackandtrace.order.statechanged"
}

Event creation looks like this:

var message = new CloudEvent(Types.OrderStateChanged, _configuration.BaseUri)
{
	Data = new TransportStateChanged
	{
		OrderNumber = orderInfo.OrderNumber,
		TrackingNumber = orderInfo.TrackingNumber,
		StateId = orderInfo.StateId
	},
	DataContentType = new ContentType("application/json"),
	Subject = orderInfo.OrderNumber
};

Event deserialization is done by JsonEventFormatter with byte[] parameter.

Is there anything I'm missing when I create or deserialize the message?

I checked the JsonEventFormatter code, and maybe this is the problem:

switch (item.Value.Type)
{
	case JTokenType.String:
		attributes[item.Key] = item.Value.ToObject<string>();
		break;
	case JTokenType.Date:
		attributes[item.Key] = item.Value.ToObject<DateTime>();
		break;
	case JTokenType.Uri:
		attributes[item.Key] = item.Value.ToObject<Uri>();
		break;
	case JTokenType.Null:
		attributes[item.Key] = null;
		break;
	case JTokenType.Integer:
		attributes[item.Key] = item.Value.ToObject<int>();
		break;
	default:
		attributes[item.Key] = item.Value;
		break;
}

because in CloudEventAttributes the value is checked against string or ContentType, but the actual type is Newtonsoft.Json.Linq.JObject.

Kafka CloudEvents extension slow for consumer

When casting a cloud event from a consume result, performance drops drastically from several hundred messages consumed per second to roughly 15 messages per second.

This is the method I am calling:

consumeResult.Message.ToCloudEvent();

Amqp extensions - bad implementation?

Hi!
I'm trying to write an Azure Event Hub extension for CloudEvents. As an example I took the AMQP extension, and noticed a few flaws in the current implementation, according to this description:

The CloudEvent.Data property is object typed, and may hold any valid serializable CLR type. The following types have special handling:

System.String: In binary content mode, strings are copied into the transport message payload body using UTF-8 encoding.
System.Byte[]: In binary content mode, byte array content is copied into the message payload body without further transformation.
System.Stream: In binary content mode, stream content is copied into the message payload body without further transformation.
Any other data type is transformed using the given event formatter for the operation, or the JSON formatter by default, before being added to the transport payload body.

All extension attributes can be reached via the CloudEvent.GetAttributes() method, which returns the internal attribute collection. The internal collection performs all required validations.

AmqpCloudEventMessage.cs

  1. formatter - it would be nice to have null as the default
  2. if formatter is null, it is not replaced by the default JSON formatter
  3. if contentMode is different from Structured, and cloudEvent.Data does not meet the special handling cases, cloudEvent.Data is not prepared with the formatter and attached to BodySection. Simply put, BodySection will be empty.
  4. if we try a fix for number 3 using formatter.EncodeAttribute(cloudEvent.SpecVersion, CloudEventAttributes.DataAttributeName(cloudEvent.SpecVersion), cloudEvent.Data, cloudEvent.Extensions.?);
  • EncodeAttribute ignores all Data attributes which are not a Stream ?!
  • the Stream branch serializes the byte[] array into JSON and then to a string - what is the purpose of that? Doesn't that violate the "System.Stream: In binary content mode, stream content is copied into the message payload body without further transformation" rule?
  • cloudEvent.Extensions is not accessible from the AmqpExtension project, as it is internal - should we provide access to it? (InternalsVisibleTo?)
  5. else if (attribute.Value is DateTime || attribute.Value is DateTime || attribute.Value is string)
    a small duplication of attribute.Value is DateTime

Most of this can be fixed fairly easily; I can introduce fixes in a PR if my understanding is correct - but what concerns me most is the EncodeAttribute method. With the current implementation I cannot implement:

Any other data type is transformed using the given event formatter for the operation or the JSON formatter by default before being added to the transport payload body.

as EncodeAttribute ignores all Data objects which are not a Stream.

Refactor HTTP handling

Even after #70 is merged, the HTTP code will contain a lot of repetition. Much of that could be avoided with some refactoring, I suspect.

Consider changing the type of CloudEvent.Time to DateTimeOffset

Two benefits of this:

  • It accurately reflects the spec, which talks about a "timestamp" type - a DateTime with a Kind of Local or Unspecified does not represent a timestamp in general. (A kind of Local can be converted to a timestamp, if we assume that assumption of "system default time zone" is actually correct, but it's pretty ugly.)
  • It allows the UTC offset described in RFC3339 to be preserved

One downside: there's no way of representing "Unqualified Local Time" or "Unknown Local Offset Convention" in DateTimeOffset

Personally I believe the benefits outweigh the downside, for values which are really expected to be timestamps, but I'm happy to discuss it.

Should CloudEvent.Id and CloudEvent.Time really be autogenerated?

Currently, if Id and Time aren't specified, they're defaulted via Guid.NewGuid().ToString() and DateTime.UtcNow - at least for most constructors. (But not all; currently if you use the CloudEvent(CloudEventsSpecVersion specVersion, IEnumerable<ICloudEventExtension> extensions) constructor, that doesn't default the attributes, but everything else does.)

It feels to me like it would be better to avoid defaulting anything other than the spec version. That allows for the idea of a "partial" CloudEvent, with only a few attributes set, to be later merged with another.

validate cloud event structure

Is there a way to validate the CloudEvent structure from a JSON string? The JsonEventFormatter substitutes default values for a message if values are not present. How can we check whether a given JSON document is a valid CloudEvent message?
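One possible workaround is a structural pre-check on the raw JSON before handing it to the formatter. The sketch below (LooksLikeCloudEvent is a hypothetical helper, not SDK code) checks the four attributes the v1.0 spec marks as REQUIRED, and assumes the Newtonsoft.Json dependency the 1.x SDK already uses:

using Newtonsoft.Json.Linq;

static class CloudEventValidator
{
    // Required context attributes per the CloudEvents v1.0 spec.
    static readonly string[] RequiredAttributes = { "specversion", "id", "source", "type" };

    // Returns true if the JSON object carries all required v1.0 attributes
    // as non-empty strings. This is a structural pre-check only; full
    // validation (URI syntax, timestamps, etc.) is left to the formatter.
    public static bool LooksLikeCloudEvent(string json)
    {
        JObject obj;
        try
        {
            obj = JObject.Parse(json);
        }
        catch (Newtonsoft.Json.JsonReaderException)
        {
            return false;
        }

        foreach (var name in RequiredAttributes)
        {
            if (obj[name]?.Type != JTokenType.String ||
                string.IsNullOrEmpty((string)obj[name]))
            {
                return false;
            }
        }
        return true;
    }
}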

Consider making the spec version immutable for any given event

I suspect it's quite rare to want to change a CloudEvent from one spec version to another - and being able to mutate an existing object in that way feels particularly odd, and leads to code complexity.

I propose that we prevent the CloudEvent spec version attribute from being modified (or cleared) from CloudEventAttributes, and instead provide a method that returns a new CloudEvent (or CloudEventAttributes) with equivalent attributes, but in a different version.

HttpRequestExtension.ReadCloudEventAsync seems overly permissive

I'm currently working on the .NET Functions Framework for Google Cloud Functions, implementing Cloud Event support.

I have an adapter type which reads a Cloud Event from an HttpRequest using the ReadCloudEventAsync method. I deliberately fail the HTTP request if that returns null, which it's documented to do if the request doesn't contain a Cloud Event. However, I can't get it to return null whatever I try - an empty request, a request with just plain text... they all come back as Cloud Events.

Looking at the code, I can't see how it would ever return null.

ASP.NET Core example to show how to return a CloudEvent

The ASP.NET Core example shows how to receive a CloudEvent. However, there's no example showing how to return a CloudEvent from a controller. I've been trying to figure this out with no success so far. A sample would be quite helpful.

Determine spec version support

Assuming we're creating a 2.0 (as various changes such as strong-naming are breaking changes) we should decide whether we still need to support CloudEvents prior to 1.0. For reference, the dates are:

  • 1.0: October 2019
  • 0.3: June 2019
  • 0.2: December 2018
  • 0.1: April 2018

I would hope that by the time the library 2.0 goes GA (which is unlikely to be before December), there'd be no requirement to support spec 0.3 and earlier any more.

HttpSend Project will fail on Authentication

I was not able to get it to work as-is, as the HttpClient does not send the aeg-sas-key header. I added it this way and started to get OKs from the HTTP requests:

  _HttpClient.DefaultRequestHeaders.Add("aeg-sas-key", new List<string>() { pAccessKey });

HttpSend project, Program.cs, lines 40 & 41:
var httpClient = new HttpClient();
var result = (await httpClient.PostAsync(this.Url, content));

(Also, I've never seen a Program.cs with Option attributes before - it might be worth a comment or two as to why this is created this way.)

Null Reference when using Extension Method

Greetings, I have an issue where when I use .IsCloudEvent() or .ToCloudEvent() extension methods, I get a NullReferenceException. My current flow is the following.

var output = new Output(search.SearchTerm);
string guid = Guid.NewGuid().ToString();
var cloudEvent = new CloudEvent("event", new Uri("urn:sample:sample"), guid, DateTime.UtcNow)
{
    DataContentType = new ContentType("application/json"),
    Data = JsonConvert.SerializeObject(output, Formatting.Indented)
};
var message = new AmqpCloudEventMessage(cloudEvent, ContentMode.Structured, new JsonEventFormatter());

I publish this successfully to a local ActiveMq via AMQPNetLite.

Afterwards, I consume the message successfully; however, I get an exception raised once I attempt to validate that the message is a CloudEvent.

Address address = new Address("amqp://admin:[email protected]:5672");
Connection connection = Connection.Factory.CreateAsync(address).Result;
Session session = new Session(connection);

ReceiverLink receiver = new ReceiverLink(session, "receiver-link", "CloudEvent");
var message = receiver.ReceiveAsync().Result;
bool isCloudEvent = message.IsCloudEvent();
receiver.Accept(message);

receiver.CloseAsync().Wait();
session.CloseAsync().Wait();
connection.CloseAsync().Wait();

I expected this to work out of the box, but I might be missing something. Would be glad if I could get any pointers, and have the documentation updated for this, since it is not very apparent and all the REQUIRED fields from the spec are contained in the message.

Add CloudEvents middleware

Description

As part of the Dapr project, we wrote our own middleware to unwrap a structured cloud event. This seems like a generally useful feature - but because we wrote it ourselves, separate from this project and tailored to Dapr's needs, it doesn't interact with any of the goodness here.

https://github.com/dapr/dotnet-sdk/blob/master/samples/AspNetCore/RoutingSample/Startup.cs#L78
https://github.com/dapr/dotnet-sdk/blob/master/src/Dapr.AspNetCore/CloudEventsMiddleware.cs

Dapr uses the cloudevents format (only structured json) for pub/sub messages. The middleware gives users a pretty idiomatic experience for using ASP.NET Core's primitives to interact with the payload of the cloudevent (data or data_base64).

How this works in practice:

  • Dapr sends an HTTP request to the app using the structured JSON format
  • The middleware unwraps the envelope
    • The envelope is read as JSON
    • We replay the contents of data or data_base64 into the request body
  • Some other piece of code (likely MVC) reads the request body and doesn't see the envelope, only its payload

What we're currently missing is that we don't persist the envelope of the cloudevent anywhere the user has access to. Example of what this might look like.... If we're going to expose the cloudevents envelope, then it seems useful to be able to do so in a strongly-typed way.

This ends up being a really useful pattern for an app that needs to receive a cloudevent, but the app code wants to use existing tech to read the payload. It feels like this is a generally useful pattern and we could converge this functionality with the cloudevents package rather than supporting it in Dapr as a one-off.

I'm starting to have some conversations with users that do want access to other properties on the cloudevent, so ultimately we want to leverage what's been built here already.

Why didn't we do this earlier?

We are unwilling to add Newtonsoft.Json as a dependency for just this feature, given that the other 95% of our functionality uses System.Text.Json. If #94 is going to happen, then it makes sense for us to contribute to and rely on this project, rather than building partial functionality that overlaps.

What would it look like?

I'd like to contribute the middleware if it can stay close to the current design, and hopefully support a super-set of the features we have today. Our users are using this pattern today (middleware unwraps the event payload, using MVC or some other mechanism for the app to read the payload).

We'd need to go through some kind of deprecation period and point users to this package, eventually removing our functionality in favor of this. If the middleware lands here, I'm not sure if we'd ultimately need to take a dependency on this package in our code, or just tell users to install it 🤷

If you don't think the middleware belongs here, I think it's likely we'll still want to start using functionality from this package once #94 happens.

Http Protocol Bindings do not correctly URL encode header values

As per the spec: https://github.com/cloudevents/spec/blob/master/http-protocol-binding.md#3132-http-header-values

Some CloudEvents metadata attributes can contain arbitrary UTF-8 string content, and per RFC7230, section 3, HTTP headers MUST only use printable characters from the US-ASCII character set, and are terminated by a CRLF sequence with OPTIONAL whitespace around the header value.

String values MUST be percent-encoded as described in RFC3986, section 2.4 before applying the header encoding rules described in [RFC7230, section 3.2.6][rfc7230-section-3-2s6].

This library should URL encode headers when creating requests, and decode headers before mapping them to cloud events when decoding events.
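The required encoding can be illustrated with BCL calls alone (a sketch; where the binding code should apply it is up to the implementation, and note Uri.EscapeDataString encodes more characters than the spec strictly requires, which is still valid):

using System;

class HeaderEncodingDemo
{
    static void Main()
    {
        // A CloudEvents attribute value containing non-US-ASCII characters.
        string subject = "café ☕";

        // Percent-encode per RFC 3986 before placing it in an HTTP header.
        string encoded = Uri.EscapeDataString(subject);
        Console.WriteLine(encoded); // caf%C3%A9%20%E2%98%95

        // Decode on the receiving side before mapping back to the attribute.
        string decoded = Uri.UnescapeDataString(encoded);
        Console.WriteLine(decoded == subject); // True
    }
}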

Stop using JsonEventFormatter in transport tests

As our only event formatter, we currently use JsonEventFormatter everywhere, even when the test has nothing to do with JSON.

Given that our tests tend to just have string content, it might be worth creating a TestEventFormatter that is as simple as possible, and will make debugging transport tests easier.

(We should also probably have a central place for "create a test event", "validate a test event" etc. There's a lot of duplication at the moment.)

CloudEvent should enforce lower-case attribute names

The spec states:

CloudEvents attribute names MUST consist of lower-case letters ('a' to 'z') or digits ('0' to '9') from the ASCII character set. Attribute names SHOULD be descriptive and terse and SHOULD NOT exceed 20 characters in length.

Unfortunately, this can easily be violated:

var evt = new CloudEvent(CloudEventsSpecVersion.V1_0, "my-type", new Uri("//my-source"), "my-id", DateTime.UtcNow);
evt.GetAttributes()["Subject"] = "my-subject";

var formatter = new JsonEventFormatter();
var bytes = formatter.EncodeStructuredEvent(evt, out _);
var text = Encoding.UTF8.GetString(bytes);
// Includes "Subject" instead of "subject"
Console.WriteLine(text);

I ran into this because the HttpRequest.ReadCloudEventAsync extension method in the ASP.NET Core package preserves header casing - so a header of "Ce-Type" ends up with a value of "Type" in the JSON.

I'm ambivalent as to whether we lower-case in the attributes, or reject an attempt to set an invalid attribute name, but we should definitely do one of the two.
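If we go the rejection route, the guard could be as simple as the following sketch (ValidateAttributeName is a hypothetical helper, not current SDK code):

using System;
using System.Text.RegularExpressions;

static class AttributeNames
{
    // Spec: attribute names MUST consist of lower-case ASCII letters or digits.
    static readonly Regex ValidName = new Regex("^[a-z0-9]+$");

    public static void ValidateAttributeName(string name)
    {
        if (!ValidName.IsMatch(name))
        {
            throw new ArgumentException(
                $"'{name}' is not a valid CloudEvents attribute name; " +
                "names must consist solely of lower-case ASCII letters or digits.",
                nameof(name));
        }
    }
}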

Clarify CloudEvent formatter responsibilities

I'm writing a doc to help myself think about this more clearly, but it feels to me like there's a lack of clarity around when data should be converted in what exact way.
We may want to change the actual code to make this clearer, or it may just be a matter of fixing docs. We'll see.

One key result I'd like to see: code that says (for example) "Populate HttpRequestMessage from CloudEvent" then "Convert HttpRequestMessage to CloudEvent" should be able to roundtrip in both directions, without gaining extra levels of encoding on each cycle, which I suspect it would do now without care.

JsonEventFormatter does not persist timezone/type for DateTimeOffset attribute values

CloudEvent 1.3.80

JsonEventFormatter does not persist timezone/type for DateTimeOffset attribute values.

Prooftest:

[Test]
public void Test()
{
	var formatter = new JsonEventFormatter();
	var now = DateTimeOffset.UtcNow;

	var cloudEvent = new CloudEvent("a", new Uri("http://example.com"))
	{
		Data = "Hello!"
	};
	cloudEvent.GetAttributes()["test"] = now;
	var serialized = formatter.EncodeStructuredEvent(cloudEvent, out _);

	var serializedText = Encoding.UTF8.GetString(serialized);
	Console.WriteLine(serializedText);

	var restored = formatter.DecodeStructuredEvent(serialized);
	var restoredTime = restored.GetAttributes()["test"]; // {23.03.2020 10:16:33} (DateTime; kind: local)
	restoredTime.Should().Be(now);
}

serialized payload:

{
  "specversion": "1.0",
  "type": "a",
  "source": "http://example.com",
  "id": "35ee3470-41a8-45ef-b311-233c148b8e52",
  "time": "2020-03-23T07:15:26.2260352Z",
  "data": "Hello!",
  "test": "2020-03-23T07:15:26.2236884+00:00"
}

CloudEventContent maps headers in structured mode

My understanding is that in structured mode, we shouldn't have ce-type (etc) HTTP headers, but I believe CloudEventContent still maps these.
Will write a test and then fix it, assuming I'm right.

Adding allow header makes validation fail

response.Headers.Add("Allow", "POST");

When this code is adding the Allow header I get this response:
Misused header name. Make sure request headers are used with HttpRequestMessage, response headers with HttpResponseMessage, and content headers with HttpContent objects.

It appears that you must add it to .Content.Headers.Add("Allow", "POST");
source: https://stackoverflow.com/questions/14286436/why-i-cannot-set-allow-in-http-response-header

Without this, I can't use this in an Azure Function on .NET Core 3.1.
I'm not sure if this is a bug in other versions; I assume it behaves this way in all .NET versions.

Add support/sample for ASP.NET Core

As far as I can tell, the package does not support ASP.NET Core. Either add support for ASP.NET Core, or add a sample showing how to parse CloudEvents on the server side with ASP.NET Core.

Check case sensitivity uses

We're fixing the use of CloudEvent attributes themselves not previously being case-sensitive, but every transport has its own rules, and we should check we're doing the right thing. In particular, often we're using an "invariant culture case-insensitive match" instead of an "ordinal case-insensitive match" and I don't know that that will always be the right approach.

Feature suggestion - Batched CloudEvents

What feature would you like?
The CloudEvents spec describes how events could also be batched together and given a specific Content-Type: application/cloudevents-batch+json.

The current SDK implementation only allows single events to be parsed.

Receiving
It would be nice if DecodeJObject took into account whether the token is an Object or an Array, so we could benefit from this approach.

Sending
A new HttpContent implementation for batched events could be added for this too. Maybe take in a sequence of cloud events for this.
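Until the SDK supports batches natively, a workaround along these lines may work (a sketch only: it re-serializes each array element and feeds it through the existing structured-mode decoder, assuming the Newtonsoft-based JsonEventFormatter API shown elsewhere in these issues):

using System.Collections.Generic;
using System.Text;
using CloudNative.CloudEvents;
using Newtonsoft.Json.Linq;

static class BatchDecoder
{
    // Decodes an application/cloudevents-batch+json payload by splitting the
    // JSON array and decoding each element as a structured-mode event.
    public static IEnumerable<CloudEvent> DecodeBatch(string json, JsonEventFormatter formatter)
    {
        foreach (JToken element in JArray.Parse(json))
        {
            byte[] single = Encoding.UTF8.GetBytes(element.ToString());
            yield return formatter.DecodeStructuredEvent(single);
        }
    }
}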

Consider not using inheritance for Kafka and Amqp

Currently we use constructors to create messages for Kafka and Amqp, whereas other transports use extension methods.
I see no obvious benefits in using inheritance here, and it does make it harder to write asynchronous operations.
