graftjs / jschan
JavaScript port of libchan based around streams
License: MIT License
The async module is not used very much, just for closing down a few things. Maybe we can do better and remove it.
Hi,
Is there a plan for a TCP transport?
I guess that it will have the highest throughput on a local network, which is where jschan is going to be used anyway.
see GraftJS/graft#19
events.js:141
throw er; // Unhandled 'error' event
^
Error: Protocol "http:" not supported. Expected "https:".
at new ClientRequest (_http_client.js:53:11)
at Object.exports.request (http.js:31:10)
at ClientSession.newStream [as _createNewStream] (/Volumes/alien/projects/digs/digs-graft/node_modules/jschan/lib/spdy/client.js:104:18)
at createChannel (/Volumes/alien/projects/digs/digs-graft/node_modules/jschan/lib/spdy/client.js:143:11)
at ClientSession.WriteChannel (/Volumes/alien/projects/digs/digs-graft/node_modules/jschan/lib/spdy/client.js:157:10)
at SPDYClient.Client._write (/Volumes/alien/projects/digs/digs-graft/node_modules/graft/lib/client.js:36:30)
at doWrite (/Volumes/alien/projects/digs/digs-graft/node_modules/graft/node_modules/readable-stream/lib/_stream_writable.js:279:12)
at writeOrBuffer (/Volumes/alien/projects/digs/digs-graft/node_modules/graft/node_modules/readable-stream/lib/_stream_writable.js:266:5)
at SPDYClient.Writable.write (/Volumes/alien/projects/digs/digs-graft/node_modules/graft/node_modules/readable-stream/lib/_stream_writable.js:211:11)
at Graft.ondata (/Volumes/alien/projects/digs/digs-graft/node_modules/graft/node_modules/readable-stream/lib/_stream_readable.js:572:20)
Build out a simple example app (confer with libchan guys to make sure we are building the right thing)
Currently, we cannot pass a standard objectMode stream inside a channel with automated piping: we need to create a new channel and pipe manually.
Is it sound to use that pattern? Moreover, solving this issue might be more tricky, because Transform streams are not supported by the automated piping natively; you would need to flag them with something like jschanReadable or jschanWritable.
Any opinions on this?
I am creating an array of fs.createReadStream streams and putting them into a message. This message is then passed to my jschan/graft microservice, which pipes each file into a local fs.createWriteStream.
Data, end, and finish events fire correctly when I am using the in-memory graft transport, and also when using spdy on localhost with a small file (< 3746 bytes). However, all three of the same events fail to fire when sending larger files over spdy jschan.
Node version is v0.10.35
Sender:
// Convert filePaths into file streams that can be sent via graft jschan
if (msg.filePaths) {
  var fileStreams = {};
  msg.filePaths.forEach(function (filePath) {
    var fileName = path.basename(filePath);
    //var zlibCompress = zlib.createDeflate();
    var fileStream = fs.createReadStream(filePath);
    fileStream.resume();
    fileStream.on('data', function (chunk) {
      console.log('got %d bytes of data', chunk.length);
    });
    fileStream.on('end', function () { console.log('end'); });
    fileStreams[fileName] = fileStream;
  });
  msg.fileStreams = fileStreams;
}
Receiver:
// Save remote fileStreams to local temp directory.
async.map(Object.keys(msg.fileStreams), function _writeStream(fileName, cb) {
  var fileStream = msg.fileStreams[fileName];
  var outPath = path.join(dirPath, fileName);
  var outStream = fs.createWriteStream(outPath);
  console.log(outPath);

  // end event does not fire
  fileStream.resume();
  fileStream.on('data', function (chunk) {
    console.log('got %d bytes of data', chunk.length);
  });
  fileStream.on('end', function () { console.log('end'); });

  // finish event does not fire
  fileStream.pipe(outStream).on('error', function (err) {
    setImmediate(cb, err);
  }).on('finish', function () {
    setImmediate(cb, null, outPath);
  });
}, function (err, filePaths) {
  if (err) { cleanupTmp(); throw err; }
  ...
});
Would really appreciate the help nailing this bug down.
Let's first ask the libchan guys how they want to roll that out in libchan.
Currently a channel, once created, can be serialized only within the session that originated it.
This means that I cannot route messages between services easily, i.e. I cannot send a channel from one user to another client. Currently, this can be done only through explicit piping.
However, I think it's doable to automatically 'fix' this, so that when we send a channel bound to another session, it gets automatically piped. What do you think?
As per the title, is jschan compatible with Node 4.8.3?
If there is already a closed issue related to this, please send me a link here.
Thanks
Add our names to the jschan author's entry in package.json
Port usage.md to node proto-code, so we can see what it's like
Port in-memory execution to node so we can try out our api
Write up concerns about message queue patterns and post it in libchan issue queue for official feedback
Hello everyone,
Based on an email conversation I had with Matteo, he suggested I open an issue here, as his assumption was that the Node client should work with the Golang server, based on last time he tested it.
Basically, I am testing the Golang libchan rexec server:
https://github.com/docker/libchan/blob/master/examples/rexec/rexec_server/server.go
Against the Node.js jschan rexec sample:
https://github.com/GraftJS/jschan/blob/master/examples/rexec/client.js
The versions tested are the ones in master repo as of January 15, 2015.
I had two console windows open, on the same server. One running a compiled golang version of libchan rexec server, waiting on port 9323.
The other console would be for the client:
I added a few log lines in the server, and the approximate location where the server stops being responsive is right after:
t, err := tl.AcceptTransport() (line 63, it passes this one)
and before
receiver, err := t.WaitReceiveChannel() (line 71, it doesn't seem to pass this one, no errors thrown)
In other words, the t.WaitReceiveChannel() does not appear to be responsive to jsChan's channel.
My regards to the team, great work,
Silviu
Research and send a feedback email to this list about node-spdy
Currently to create a return channel we use:
var jschan = require('jschan');
var session = jschan.memorySession();
var chan = session.createWriteChannel();
var ret = chan.createReadChannel();
chan.write( { hello: 'world', ret: ret } );
However it will be much simpler to have:
var jschan = require('jschan');
var session = jschan.memorySession();
var chan = session.writeChannel(); // shorter
var ret = jschan.readChannel(); // simpler, channels are derived from the top-level objects, and then passed through.
chan.write( { hello: 'world', ret: ret } );
We need a SPDY transport.
Port the example app to node, to get it working.
@AdrianRossouw proposed this in chat, and I agree :).
@AdrianRossouw, when you are done with this, go ahead and publish as 0.1.0!
How much slower is this than standard streams?
How much slower is this than HTTP?
We just need some numbers, and maybe some optimization tricks.
It seems the encoding part of libchan is in flux at the moment: e.g. they want to support msgpack ([ref](https://github.com/docker/libchan/blob/72754f8294ce601ea4a947a635df3252bb009e2d/PROTOCOL.md)), but right now it's netstrings only, and they have a pull request that adds msgpack without removing netstrings (see docker/libchan#37).
It's completely unclear.
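For reference, the netstring framing libchan currently uses is simple enough to sketch in a few lines. This is the generic netstring format, `<byte length>:<payload>,`, not libchan's exact codec:

```javascript
// Generic netstring framing: "<byte length>:<payload>,"
function encodeNetstring(payload) {
  var buf = Buffer.from(payload);
  return Buffer.concat([
    Buffer.from(buf.length + ':'),
    buf,
    Buffer.from(',')
  ]);
}

function decodeNetstring(buf) {
  var colon = buf.indexOf(':');
  if (colon === -1) throw new Error('incomplete netstring header');
  var len = parseInt(buf.slice(0, colon).toString(), 10);
  var payload = buf.slice(colon + 1, colon + 1 + len);
  if (buf[colon + 1 + len] !== 0x2c) throw new Error('missing trailing comma');
  return payload.toString();
}

console.log(encodeNetstring('hello').toString()); // "5:hello,"
console.log(decodeNetstring(Buffer.from('5:hello,'))); // "hello"
```

Whether the payload inside the frame stays raw or becomes msgpack is exactly the part that is still unsettled upstream.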
In order to bootstrap an ecosystem for jschan as quickly as possible, do you think having a contributing rule similar to levelup might help?
See https://github.com/rvagg/node-levelup/blob/master/CONTRIBUTING.md
According to the spec at https://nodejs.org/api/stream.html#stream_readable_read_size_1
Note: once the _read() method is called, it will not be called again until the push method is called.
This causes done() to never be called when the ByteStream on the remote side has finished and is sending an { id: x } message. When done() is not called, back pressure is not released in the session's inStream.
ByteStream.prototype._read = function() {
  var done = this._lastDone;
  if (done) {
    this._lastDone = null;
    done();
  }
  return null;
};
first push to npm
Figure out enough of Go to be able to test libchan on its own!
We need this to have compatibility with libchan on the server side.
we didn't want to tackle this at the time, because we'd have to rewrite net.
but it looks like somebody has written a port in the last 2 weeks.
When sending large strings or objects (more than one frame) over a response channel, only the first frame appears to be handled correctly. The workaround for now is to use a byte stream. This issue is potentially due to a misconfiguration of node-spdy.
I'm not an owner here, can any of you add travis to this repo? I already added the .travis.yml file.
we discussed wanting to enable graft/jschan over webrtc data channels too. this issue is to collect some of the related concepts.
One of the things I've needed to do in aetherboard is to have a write channel for all my messages to the server, and then a read channel that is meant to contain the merged stream of everybody's messages in the order they were received.
Inside the message I send, I have another write channel that contains a series of messages that I will write to even after the initial message is sent. This is actually not that uncommon: think IRC with the ability to see everybody's text as they type.
Unfortunately the message I receive back from the server is somehow different for me than for all the other people who receive it, and beyond that, it is also different from my initial write channel.
Trying to pipe from it causes some kind of silent error and unpipes everything, meaning that my client becomes disconnected.
This behaviour can be reproduced with aetherboard by following this issue:
GraftJS/aetherboard#16
I have written some tests for graft in this PR:
GraftJS/graft#13
I'm pretty sure, though, that the bug is actually going to be found here in jschan.
I've also noticed that passing around the original channel is causing chan.streams to contain circular references.