
Firely and Incendi's open source FHIR server

License: BSD 3-Clause "New" or "Revised" License

Languages: C# 97.38%, JavaScript 0.19%, HTML 1.09%, XSLT 1.06%, Batchfile 0.01%, Dockerfile 0.09%, Shell 0.17%
Topics: fhir, spark, stu3, dstu2, r4, c-sharp, fhir-server, spark-fhir-server, docker, fhir-api

spark's Introduction

[Build status badges: Tests, Integration Tests, Release and Docker Release for DSTU2, STU3 and R4]

Spark

Spark is an open-source FHIR server developed in C#, initially built by Firely. Further development and maintenance are now done by Incendi.

Spark implements a major part of the FHIR specification and has been used and tested during several HL7 WGM Connectathons.

Get Started

There are two ways to get started with Spark: either use the NuGet packages and follow the Quickstart Tutorial, or use the Docker images.

NuGet Packages

Read the Quickstart Tutorial on how to set up your own FHIR Server using the NuGet Packages. There is also an example project that accompanies the Quickstart Tutorial which you can find here: https://github.com/incendilabs/spark-example
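
For orientation, a minimal ASP.NET Core Startup sketch is shown below. The Spark-specific calls (AddFhir, UseFhir, SparkSettings) and their namespaces are assumptions based on the spark-example project and may differ between versions; follow the Quickstart Tutorial for the authoritative setup.

// Minimal sketch only. The Spark-specific calls below (AddFhir, UseFhir,
// SparkSettings) are assumptions based on the spark-example project; check
// the Quickstart Tutorial for the exact registration code for your version.
using System;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Spark.Engine;            // assumed namespace for SparkSettings
using Spark.Engine.Extensions; // assumed namespace for AddFhir/UseFhir

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Assumed API: register Spark's FHIR services and point them at the
        // public base URL of the server.
        services.AddFhir(new SparkSettings
        {
            Endpoint = new Uri("https://localhost:5001/fhir")
        });
    }

    public void Configure(IApplicationBuilder app)
    {
        // Assumed API: map the FHIR endpoint routes.
        app.UseFhir();
    }
}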

Docker Images

Set up the Spark FHIR server by using the Docker images. Make sure you have installed Docker; on Linux you will need to install Docker Compose as well. After installing Docker, you can run the Spark server with one of the following commands for your preferred FHIR version. Remember to replace the single quotes with double quotes on Windows. After startup, the Spark FHIR server will be available at http://localhost:5555.

R4

curl 'https://raw.githubusercontent.com/FirelyTeam/spark/r4/master/.docker/docker-compose.example.yml' > docker-compose.yml
docker-compose up

STU3

curl 'https://raw.githubusercontent.com/FirelyTeam/spark/stu3/master/.docker/docker-compose.example.yml' > docker-compose.yml
docker-compose up

DSTU2

curl 'https://raw.githubusercontent.com/FirelyTeam/spark/master/.docker/docker-compose.example.yml' > docker-compose.yml 
docker-compose up

Versions

R4

Source code can be found in the branch r4/master. This is the version of Spark running at https://spark.incendi.no. FHIR endpoint: https://spark.incendi.no/fhir

STU3

Source code can be found in the branch stu3/master; we try to keep it up to date with the STU3 version of FHIR. This is the version of Spark running at https://spark-stu3.incendi.no. FHIR endpoint: https://spark-stu3.incendi.no/fhir

DSTU2

DSTU2 is no longer maintained by this project. The source code can be found in the branch master.

DSTU1

DSTU1 is no longer maintained by this project. The source code can be found in the branch dstu1/master.

Contributing

If you want to contribute, see our guidelines.

Git branching strategy

Our strategy for git branching:

Branch from the r4/master branch, which contains the R4 FHIR version, unless the feature or bug fix targets a specific version of FHIR; in that case, branch from the relevant branch, which at this point is stu3/master.

See GitHub flow for more information.

spark's People

Contributors

andy-a-o, autark, brianpos, cknaap, corinaciocanea, dependabot[bot], erollando, gtakov, hvroege, jjrdk, kennethmyhra, losolio, mharthoorn, mjugl, mmsmits, pieteckhart-hci, rbauck, richardschneider, shaosss, tonyabell, whyfate


spark's Issues

Not able to build master-dstu2 (or develop)

I'm using Microsoft Visual Studio Community Edition 2015, and I'm trying to compile DSTU2 so that it's like the server installed on http://spark-dstu2.furore.com/

This is probably not an issue with the software, but with my knowledge of compiling C# programs with NuGet, and my setup. But if you're able to help, that'd be great.

I'm able to build DSTU-1 with this IDE, but not DSTU-2.

I get the following error (including a few others):

error : Unable to find version '0.50.1-alpha1' of package 'Hl7.Fhir.DSTU2'.

My NuGet explorer says that this version is installed, however.

(Is there a forum where I could post this instead of posting it as an issue?)

I checked out the code like this:

git clone https://github.com/furore-fhir/spark.git spark2
cd spark2/
git checkout dstu2/master

and then I opened spark.sln in my IDE and tried to do a Build All.

Update Server Conformance statement

For DSTU, Conformance statement has been extended, e.g. to include declaring chained parameters. Update the ConformanceBuilder to provide this new information.

Add tag SUBSETTED to Meta.tag for _summary results

A couple of days ago there was discussion of servers returning partial data (for whatever business reasons) and including the SUBSETTED Tag with them.
[03:40:34] Brian Postlethwaite: Should this also be included when a search result is returned from _summary?
[03:46:59] David Hay: or REDACTED…
[03:47:22] David Hay: and I’d assume yes - for the same arguments as Paul made...
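
If Spark were to do this, the change would amount to adding a SUBSETTED coding to each returned resource's Meta.tag. A minimal sketch using the Hl7.Fhir.Model classes (the helper name and the tag code system URI are assumptions; the URI differs between FHIR versions):

using Hl7.Fhir.Model;

public static class SummaryTagger
{
    // Sketch: tag a resource returned from a _summary (or otherwise filtered)
    // search as a partial representation. The code system URI below is an
    // assumption and differs between FHIR versions; check the version you target.
    public static void MarkSubsetted(Resource resource)
    {
        if (resource.Meta == null)
            resource.Meta = new Meta();

        resource.Meta.Tag.Add(new Coding(
            "http://terminology.hl7.org/CodeSystem/v3-ObservationValue",
            "SUBSETTED"));
    }
}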

Indexing elements for search

In the C# object model there are elements whose property names end in "_" (like AppointmentResponse.ParticipantStatus). If these paths occur in the definition of search parameters, the harvester will not be able to find the corresponding property on the object model. An easy fix would be for the harvester to retry with a property name suffixed with "_" when a property is not found.
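
A sketch of that retry idea using plain reflection (the helper name is hypothetical, not the actual harvester code):

using System;
using System.Reflection;

public static class ElementResolver
{
    // Sketch of the proposed fix: if no property matches the element name from
    // the search parameter path, retry with a trailing underscore, which the
    // C# object model appends to avoid name clashes
    // (e.g. "ParticipantStatus" vs "ParticipantStatus_").
    public static PropertyInfo FindProperty(Type modelType, string elementName)
    {
        return modelType.GetProperty(elementName)
            ?? modelType.GetProperty(elementName + "_");
    }
}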

Deadlock conflict with JsonFhirFormatter and Async DelegatingHandler

I'm using your JsonFhirFormatter in my webapi application.

I've created an async DelegatingHandler which saves the response content to the database, but for some reason it keeps deadlocking. I'm not certain which code is causing it (probably mine), but wondered if you had experienced this before?

This is the sample code I'm using for my Handler:

protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request,
    CancellationToken cancellationToken)
{
    var response = await base.SendAsync(request, cancellationToken).ConfigureAwait(false);

    string content = "";
    if (response.Content != null)
    {
        var bytes = await response.Content.ReadAsByteArrayAsync().ConfigureAwait(false);
        content = Encoding.UTF8.GetString(bytes);
    }

    //SAVE AUDIT TO DATABASE!

    return response;
}

It seems to lock on the .ReadAsByteArrayAsync(). I'm not sure the .ConfigureAwait(false) is absolutely necessary, but it seemed to resolve another deadlock issue I was having.

It might be useful, for me and others to include an Auditing handler in your server, as this could be a common scenario.
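
Not a confirmed diagnosis for this report, but a common mitigation is to buffer the response content before reading it. A sketch using only stock HttpClient/Web API types (the handler name is hypothetical):

using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public class AuditHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var response = await base.SendAsync(request, cancellationToken).ConfigureAwait(false);

        if (response.Content != null)
        {
            // Buffer the whole body first, so the subsequent read cannot block
            // while the content is still being produced further down the pipeline.
            await response.Content.LoadIntoBufferAsync().ConfigureAwait(false);
            string content = await response.Content.ReadAsStringAsync().ConfigureAwait(false);

            // SAVE AUDIT TO DATABASE!
        }

        return response;
    }
}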

Doing _include=Patient.managingOrganization fails

When using Spark, I am accessing the following: http://spark.furore.com/fhir/Patient?_include=Patient.managingOrganization and get the following:

<OperationOutcome xmlns="http://hl7.org/fhir">
<issue>
<severity value="error" />
<details value="NullReferenceException: Object reference not set to an instance of an object." />
</issue>
<issue>
<severity value="information" />
<details value="at Spark.Store.MongoFhirStore.<FindByIds>b__7(Uri uri) in d:\temp\4iiddlmd.0bf\input\Spark.Store\MongoFhirStore.cs:line 138 at System.Linq.Enumerable.WhereSelectListIterator`2.MoveNext() at MongoDB.Bson.BsonArray.AddRange(IEnumerable`1 values) at MongoDB.Driver.Builders.Query.In(String name, IEnumerable`1 values) at Spark.Store.MongoFhirStore.FindByIds(IEnumerable`1 ids) in d:\temp\4iiddlmd.0bf\input\Spark.Store\MongoFhirStore.cs:line 137 at Spark.Store.MongoFhirStore.Include(Bundle bundle, String include) in d:\temp\4iiddlmd.0bf\input\Spark.Store\MongoFhirStore.cs:line 786 at Spark.Store.MongoFhirStore.Include(Bundle bundle, ICollection`1 includes) in d:\temp\4iiddlmd.0bf\input\Spark.Store\MongoFhirStore.cs:line 797 at Spark.Service.FhirService.Search(String collection, IEnumerable`1 parameters, Int32 pageSize) in d:\temp\4iiddlmd.0bf\input\Spark.Service\Service\FhirService.cs:line 222 at Spark.Controllers.FhirController.Search(String type) in d:\temp\4iiddlmd.0bf\input\Spark\Controllers\FhirController.cs:line 118 at lambda_method(Closure , Object , Object[] ) at System.Web.Http.Controllers.ReflectedHttpActionDescriptor.ActionExecutor.<>c__DisplayClass10.<GetExecutor>b__9(Object instance, Object[] methodParameters) at System.Web.Http.Controllers.ReflectedHttpActionDescriptor.ActionExecutor.Execute(Object instance, Object[] arguments) at System.Web.Http.Controllers.ReflectedHttpActionDescriptor.ExecuteAsync(HttpControllerContext controllerContext, IDictionary`2 arguments, CancellationToken cancellationToken) --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.Web.Http.Controllers.ApiControllerActionInvoker.<InvokeUsingResultConverterAsync>d__8.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.Web.Http.Controllers.ActionFilterResult.<ExecuteAsync>d__2.MoveNext() --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.Web.Http.ApiController.<InvokeActionWithExceptionFilters>d__1.MoveNext()" />
</issue>
</OperationOutcome>

Martijn says:
Searching in Spark was reimplemented last month, and a search without any (filtering) parameters gives the null reference error (known issue).

Difference between birthdate and birthDate - should we be case-sensitive?

[16:05:23] Al Amyot: Ewout, I've also noticed that on your server, "birthdate=1945-01-01" returns 1 result and "birthDate=1945-01-01" returns ALL results. I've searched the FHIR spec for references to case sensitivity/capitalization and have not managed to find whether the terms should be case-sensitive or not. Is your server's behaviour correct? If the term should be case-sensitive, should the server return an error when the wrong one is supplied?

ResourceReference should not resolve to Chained parameter

The following search
~/fhir/Encounter?patient=d1
resolves to normalized criteria {patient.internal_id=d1}
Where it should just resolve to {patient=Patient/d1}

Because the encounter should be found even if the patient is not present in the database.

Bundle when retrieved in XML format is wrapped in CData encoding.

Hi,

In DSTU2, when we GET search results for a resource in XML format, the response is not XML; instead, it is wrapped in CDATA encoding. The XML parser then has to take care of representing this data as XML upon retrieval. When we use tools like Fiddler or Postman, the parser takes care of this automatically; however, when using SoapUI, all my scripts written for DSTU1 using XPath for data validation fail.

This is observed only in the case of Spark (DSTU2 code base and test server); it is not observed when testing against Grahame's test server, http://fhir-dev.healthintersections.com.au.

Thanks,
Revathy

Searching resources by id fails

When doing this:
http://spark.furore.com/fhir/DiagnosticReport?subject:Patient=pat2
the search will return all DiagnosticReports

However, when doing this:
http://spark.furore.com/fhir/DiagnosticReport?subject:Patient._id=pat2
search works correctly.

Search is also correct when doing this:
http://spark.furore.com/fhir/DiagnosticReport?subject=Patient/pat2

XmlSignatureHelper:Sign - ArgumentException parameters reversed

In the DSTU2 branch, at line 81 of Spark.Engine/Auxiliary/XmlSignatureHelper.cs:

if (!certificate.HasPrivateKey) throw new ArgumentException("certificate", "Certificate should have a private key");

Should be:

if (!certificate.HasPrivateKey) throw new ArgumentException("Certificate should have a private key", "certificate");

Posting to New Patient #Error 415

I try to post:
http://localhost:1396/fhir/Patient/?_format=json

I get the error message:
{
"resourceType": "OperationOutcome",
"issue": [
{
"severity": "error",
"details": "StatusCode: 415, ReasonPhrase: 'Unsupported Media Type', Version: 1.1, Content: System.Net.Http.ObjectContent`1[[System.Web.Http.HttpError, System.Web.Http, Version=5.2.2.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]], Headers:\r\n{\r\n Content-Type: application/json; charset=utf-8\r\n}"
}
]
}

A simple JSON FHIR message I use is:
{
"resourceType": "Patient",
"text": {
"status": "generated",
"div": "<div xmlns="http://www.w3.org/1999/xhtml\">Everywoman, Eve. SSN:\r\n 444222222\r\n \r\n \r\n \r\n "
},
"identifier": [
{
"label": "SSN",
"system": "sid/us-ssn",
"value": "444222222"
}
],
"name": [
{
"use": "official",
"family": [
"Everywoman"
],
"given": [
"Eve"
]
},
{
"family": [
"Kramer"
],
"given": [
"Ewout"
]
},
{
"family": [
"Kramer"
],
"given": [
"Ewout"
]
},
{
"family": [
"Kramer"
],
"given": [
"Ewout"
]
},
{
"family": [
"Kramer"
],
"given": [
"Ewout"
]
},
{
"family": [
"Kramer"
],
"given": [
"Ewout"
]
},
{
"family": [
"Kramer"
],
"given": [
"Ewout"
]
},
{
"family": [
"Kramer"
],
"given": [
"Ewout"
]
}
],
"telecom": [
{
"system": "phone",
"value": "555-555-2003",
"use": "work"
}
],
"gender": {
"coding": [
{
"system": "v3/AdministrativeGender",
"code": "F"
}
]
},
"birthDate": "1976-05-31",
"address": [
{
"use": "home",
"line": [
"3333 Home Street"
]
}
],
"managingOrganization": {
"reference": "Organization/hl7"
},
"active": true
}

ASP.Net 5 / Dependency Injection?

I'm writing a Couchbase Store for Spark and was wondering when you foresee adding Dependency Injection? I see a comment in FhirController that suggests it is planned.

Audit / log mechanism?

Does this platform contain or support an extension mechanism for logging or auditing?
I mean the product, not the FHIR standard.
I would like to register all the activity related to incoming and outgoing messages.

PUT to resourcetype sends no OperationOutcome

When trying to PUT a resource to a resource type without supplying an id, Spark returns a 405 Method Not Allowed, but doesn't give the error back as an OperationOutcome. Instead, it just sends a text message: {"Message":"The requested resource does not support http method 'PUT'."}
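
For reference, building such an OperationOutcome is short with the Hl7.Fhir.Model classes. A sketch (the factory name is hypothetical; property and enum names are as in recent SDK versions and may differ in the DSTU2-era model):

using Hl7.Fhir.Model;

public static class OperationOutcomeFactory
{
    // Sketch: an OperationOutcome body that could accompany the 405 response
    // instead of the plain {"Message": ...} text.
    public static OperationOutcome MethodNotAllowed(string method, string resourceType)
    {
        var outcome = new OperationOutcome();
        outcome.Issue.Add(new OperationOutcome.IssueComponent
        {
            Severity = OperationOutcome.IssueSeverity.Error,
            Code = OperationOutcome.IssueType.NotSupported,
            Diagnostics = string.Format(
                "The {0} endpoint does not support HTTP method '{1}' without a resource id.",
                resourceType, method)
        });
        return outcome;
    }
}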

Support for type search parameter

Hi,

Currently, as per DSTU1, the Location resource supports a type search parameter, which is used to search by the code for the type of location. When I query using the following URL:
https://localhost/FhirStudy/fhir/Location?type=rneu

The following exception is seen:
"Unknown resource collection rneu". This is because the resource type received by FHIRcontroller is rneu instead of Location.

Has anyone encountered this issue, or has a fix for it already been made? Please let me know.

Rework C#6 code to C#5

In ElementQuery.cs some C#6 code has been introduced in the ParsePredicate function. Rework this to C#5.
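
The actual ParsePredicate code is not shown here, but as a generic illustration of the kind of change, here is a C#6 null-conditional plus string interpolation rewritten in C#5 (the Segment type is a hypothetical stand-in):

public static class CSharp5Rework
{
    // Hypothetical stand-in type, used only to make the illustration compile.
    public class Segment
    {
        public string Name { get; set; }
    }

    // Generic illustration only (not the actual ParsePredicate code).
    // A C#6 version might read:
    //   var name = segment?.Name ?? "unknown";
    //   return $"Unknown element {name}";
    // The C#5 rework spells out the null checks and replaces string
    // interpolation with string.Format:
    public static string DescribeSegment(Segment segment)
    {
        string name = (segment != null && segment.Name != null) ? segment.Name : "unknown";
        return string.Format("Unknown element {0}", name);
    }
}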

Build Error in VS

Hi All, I went to download the latest build and tried to compile, but ran into some small errors. I do have a fix if you need it. The issue was in the web.config: I had to change the langversion from 6 to 5. Maybe it's just something on my end, but in case someone else has the same issue, here is the fix.

Error was in webpage
[No relevant source lines]

In web.config, the line below:
compilerOptions="/langversion:5
Thanks
Brandon

Need some help on Crucible for FHIR Tests

Hi All,

I am trying to use Crucible to determine compliance with the FHIR specification by executing the predefined FHIR tests. When I try to run the tests, they execute indefinitely. Can anyone please help me with this? I am unable to find good documentation on using this tool.

Also, can anyone please help me with the following information:

  1. Should the FHIR server which I am using be a public server to run these tests?
  2. Is it possible for me to download the tests and the tool and run them locally?
  3. Is supporting the software Conformance resource mandatory for the tool to be used?

Thanks,
Revathy

Unit Tests Failures

80: Failed
22: Pass

Are all the unit tests up to date and in working order?

I am planning on refactoring the project to make it easier to plug in different storage and index providers.

Should I make sure the tests are up to date, or are they not being maintained?

{type}/_search only accepts POST

It seems the dstu2 code only accepts POST requests for the _search format like this: {type}/_search

When changing it to GET in the code, it works. According to the specification, it seems GET should be acceptable?

For example, I think the following GET should be acceptable?

GET http://localhost:1396/fhir/Observation/_search?subject%3APatient=12345

The fix is probably in FhirController.cs? Currently the above URL gets interpreted as if "_search" were an {id}, and gives an error that _search is an unknown id.
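
A sketch of what an attribute-routed Web API 2 action for this could look like (the controller and route shown are hypothetical; the actual fix would live in Spark's FhirController and routing configuration):

using System.Web.Http;

// Sketch only: accept both GET and POST on {type}/_search so that a GET is
// not interpreted as a read of a resource with id "_search".
public class SearchController : ApiController
{
    [AcceptVerbs("GET", "POST")]
    [Route("fhir/{type}/_search")]
    public IHttpActionResult Search(string type)
    {
        // Delegate to the same search logic used for GET fhir/{type}?...
        return Ok();
    }
}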

Search returns incorrect "self" link in Bundle

Spark returns this for a search inside Bundle:

<link>
    <relation value="self"/>
    <url value="http://spark-dstu2.furore.com/fhir/_snapshot?id=27e5683a-a9a5-4d98-a43d-f5e0c74f9d78&start=0&_count=20"/>
</link>

The self link is supposed to contain the original query (the next/prev links not necessarily), repeating the query parameters that were understood by the server, so the client can determine which search parameters were taken into account.

Spark.Engine nuget-package

Hi guys!
I am developing a FHIR server using FHIRbase. I suggest you create a NuGet package from the assembly "Spark.Engine".

Inconsistent treatment of _id and :type modifier

[16-12-2014 16:44:15] Bill de Beaubien: I was playing around with pointing the SMART growth app at spark yesterday
[16-12-2014 16:44:31] Bill de Beaubien: of these requests..
[16-12-2014 16:44:33] Bill de Beaubien: http://spark.furore.com/fhir/Observation/_search?subject:Patient._id=1132446799
http://spark.furore.com/fhir/Observation/_search?subject:Patient=1132446799
[16-12-2014 16:44:52] Bill de Beaubien: it uses the 2nd form, which for some reason on spark finds nothing, though the first finds observations
[16-12-2014 16:44:56] Bill de Beaubien: is that expected?

Spark.Store Transaction Commit Fails Leaving Resource in 'queued' status

When creating, updating or deleting a Resource document, the result of committing the transaction is not evaluated for failure. When the sweep() function is called to move the Resource document from 'queued' to 'current', the collection save result is not evaluated. When this fails, no exception is raised and the transaction is not rolled back, leaving the Resource document in the 'queued' state.


The proposed fix is to evaluate the result of Transaction.cs line 144 to ensure the update was successful and that the number of documents affected is equal to 1. If this evaluation fails, the sweep() method will throw an exception with the details of the failure, so that the transaction fails and the Rollback() method is called.
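
A sketch of that check, assuming the legacy 1.x MongoDB C# driver where write operations return a WriteConcernResult (the helper name is hypothetical):

using MongoDB.Driver;

public static class SweepChecks
{
    // Sketch of the proposed check: verify that exactly one document moved
    // from 'queued' to 'current', and throw otherwise so the transaction can
    // be rolled back instead of silently leaving the document 'queued'.
    public static void EnsureSingleDocumentUpdated(WriteConcernResult result)
    {
        if (result == null || !result.Ok || result.DocumentsAffected != 1)
        {
            throw new System.InvalidOperationException(
                "Sweep failed: expected exactly 1 document to be updated, but got " +
                (result == null ? "no acknowledgement" : result.DocumentsAffected.ToString()));
        }
    }
}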

Using gzip compression does not work

Finally got around to testing compressed requests to a FHIR server. Sorry Grahame, but it appears both your and Ewout's servers fail. It looks like both servers are feeding the compressed content directly into XML processing without first decompressing the request content.

[14-6-2014 00:34:45] Richard Schneider:

PUT http://fhir.healthintersections.com.au/open/Profile/patient?_ignoreVersion=true HTTP/1.1
Category: http://hl7.fhir/example; label="FHIR example"; scheme="http://hl7.org/fhir/tag"
User-Agent: orchestral-client/0.80
Accept: application/atom+xml; q=1.0, application/xml+fhir; q=0.9, application/json+fhir; q=0.8, */*; q=0.1
Content-Type: application/xml+fhir; charset=utf-8
Content-Encoding: gzip
Host: fhir.healthintersections.com.au
Content-Length: 6709
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
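
A sketch of the missing server-side step: a message handler that decompresses gzip request bodies before they reach the XML/JSON formatters. Only stock .NET types are used; the handler name is hypothetical and this is not Spark's actual pipeline code.

using System.IO;
using System.IO.Compression;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public class GzipRequestHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.Content != null &&
            request.Content.Headers.ContentEncoding.Contains("gzip"))
        {
            // Decompress the body into memory.
            var decompressed = new MemoryStream();
            using (var compressed = await request.Content.ReadAsStreamAsync())
            using (var gzip = new GZipStream(compressed, CompressionMode.Decompress))
            {
                await gzip.CopyToAsync(decompressed);
            }
            decompressed.Position = 0;

            // Replace the content so downstream formatters see plain XML/JSON,
            // carrying over the original headers minus the Content-Encoding.
            var plainContent = new StreamContent(decompressed);
            foreach (var header in request.Content.Headers)
                plainContent.Headers.TryAddWithoutValidation(header.Key, header.Value);
            plainContent.Headers.ContentEncoding.Clear();
            plainContent.Headers.ContentLength = decompressed.Length;
            request.Content = plainContent;
        }

        return await base.SendAsync(request, cancellationToken);
    }
}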

Linux support

Any plans to port this product to Linux by using the new VS 2015 tools?
MongoDB supports Mono 3.x, so it is feasible.

URL param validation errors throw a 500 error

When I do e.g.
http://spark.furore.com/fhir/DiagnosticReport/Diagnosticreport-example-dxa

Spark returns a 500 (Internal Server Error), just because the report's id does not match the regex pattern for a valid id. This should return an appropriate 4xx status code (400 Bad Request, I think) instead. Also, the returned OperationOutcome is too cryptic.
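
For reference, a sketch of validating the id up front so a failure can be mapped to a client error instead of a 500 (the helper is hypothetical, and the pattern shown is the current spec's id grammar, which may be stricter or looser than the version this server implements):

using System.Text.RegularExpressions;

public static class FhirIdValidator
{
    // Assumption: current FHIR id grammar, letters, digits, '-' and '.',
    // 1..64 characters. Earlier drafts used a stricter pattern.
    private static readonly Regex IdPattern =
        new Regex(@"^[A-Za-z0-9\-\.]{1,64}$", RegexOptions.Compiled);

    // Validate before hitting the store, and map a failure to a 400 Bad
    // Request with a readable OperationOutcome rather than a 500.
    public static bool IsValid(string id)
    {
        return id != null && IdPattern.IsMatch(id);
    }
}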

MongoConnectionException

I've been trying to hit the http://spark-dstu2.furore.com/fhir/Patient endpoint since this evening, and I'm getting the following error:

at MongoDB.Driver.Internal.DirectMongoServerProxy.Connect(TimeSpan timeout, ReadPreference readPreference)\r\n   at MongoDB.Driver.Internal.DirectMongoServerProxy.ChooseServerInstance(ReadPreference readPreference)\r\n   at MongoDB.Driver.MongoServer.AcquireConnection(ReadPreference readPreference)\r\n   at MongoDB.Driver.MongoCursor`1.MongoCursorConnectionProvider.AcquireConnection()\r\n   at MongoDB.Driver.Operations.QueryOperation`1.GetFirstBatch(IConnectionProvider connectionProvider)\r\n   at MongoDB.Driver.Operations.QueryOperation`1.Execute(IConnectionProvider connectionProvider)\r\n   at MongoDB.Driver.MongoCursor`1.GetEnumerator()\r\n   at System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()\r\n   at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)\r\n   at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)\r\n   at Spark.Search.Mongo.MongoSearcher.CollectKeys(IMongoQuery query)\r\n   at Spark.Search.Mongo.MongoSearcher.CollectKeys(String resourceType, IEnumerable`1 criteria, SearchResults results)\r\n   at Spark.Search.Mongo.MongoSearcher.Search(String resourceType, SearchParams searchCommand)\r\n   at Spark.Mongo.Search.Common.MongoFhirIndex.Search(String resource, SearchParams searchCommand)\r\n   at Spark.Service.FhirService.Search(String type, SearchParams searchCommand)\r\n   at Spark.Controllers.FhirController.Search(String type)\r\n   at lambda_method(Closure , Object , Object[] )\r\n   at System.Web.Http.Controllers.ReflectedHttpActionDescriptor.ActionExecutor.<>c__DisplayClass10.<GetExecutor>b__9(Object instance, Object[] methodParameters)\r\n   at System.Web.Http.Controllers.ReflectedHttpActionDescriptor.ActionExecutor.Execute(Object instance, Object[] arguments)\r\n   at System.Web.Http.Controllers.ReflectedHttpActionDescriptor.ExecuteAsync(HttpControllerContext controllerContext, IDictionary`2 arguments, CancellationToken cancellationToken)\r\n--- End of stack trace from previous location where exception was thrown ---\r\n   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n   at System.Web.Http.Controllers.ApiControllerActionInvoker.<InvokeActionAsyncCore>d__0.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n   at System.Web.Http.Controllers.ActionFilterResult.<ExecuteAsync>d__2.MoveNext()\r\n--- End of stack trace from previous location where exception was thrown ---\r\n   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)\r\n   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)\r\n   at System.Web.Http.Controllers.ExceptionFilterResult.<ExecuteAsync>d__0.MoveNext()"

It was working fine this morning.
