lsp-server's Introduction

lsp-server

A language server scaffold exposing a crossbeam-channel API.

This crate has been vendored into the rust-analyzer repo

Description

This crate is a language server scaffold, exposing a synchronous crossbeam-channel based API. It handles protocol handshaking and parsing messages, while you control the message dispatch loop yourself.

See examples/goto_def.rs for a minimal example LSP server that can only respond to the gotoDefinition request. To use the example, execute it and then send an initialize request.
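For orientation, here is a minimal sketch of what such a server can look like, loosely modeled on the goto_def example (capabilities and error handling are simplified, so treat it as an outline rather than the example itself):

use lsp_server::{Connection, Message};
use lsp_types::{InitializeParams, ServerCapabilities};

fn main() -> Result<(), Box<dyn std::error::Error + Sync + Send>> {
    // Spawn the stdio transport threads and get a crossbeam-channel based Connection.
    let (connection, io_threads) = Connection::stdio();

    // Handshake: advertise capabilities and receive the client's InitializeParams.
    let server_capabilities = serde_json::to_value(ServerCapabilities::default())?;
    let init_params = connection.initialize(server_capabilities)?;
    let _params: InitializeParams = serde_json::from_value(init_params)?;

    // The dispatch loop is entirely under your control.
    for msg in &connection.receiver {
        match msg {
            Message::Request(req) => {
                if connection.handle_shutdown(&req)? {
                    break;
                }
                // ... dispatch other requests here ...
            }
            Message::Notification(_) | Message::Response(_) => {}
        }
    }

    io_threads.join()?;
    Ok(())
}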

lsp-server's People

Contributors

andrewradev, bjorn3, bollwyvl, bors[bot], darinmorrison, jneem, kjeremy, lnicola, markjansnl, matklad, sou-chon, steffengy, veykril, vsrs


lsp-server's Issues

Connection::socket doesn't seem to be working correctly

I'm trying to implement an LSP server which accepts socket connections over TCP.

I've tried initializing the connection like this to create an LSP server listening on port 5555:

let (connection, io_threads) = Connection::socket("localhost:5555");

but connection fails with the error:

'Couldn't connect to the server...: Os { code: 61, kind: ConnectionRefused, message: "Connection refused" }'

If I look into the source for Connection, what's happening doesn't make sense to me: my assumption was that this connection method would listen on the given TCP socket address, so that a client can connect and issue requests over JSON-RPC, but it seems to open a connection to the given socket address instead.

What exactly is happening here? Is it possible to set up an LSP server that listens for a client connection on a socket address, or is there something else I am missing?
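For reference: Connection::socket seems to dial out to an already-listening peer, so listening for a client would instead go through the crate's listen constructor. A minimal sketch, assuming Connection::listen binds a TCP listener and waits for a single client:

use lsp_server::Connection;

fn main() -> Result<(), Box<dyn std::error::Error + Sync + Send>> {
    // Bind to the address and block until one client (the editor) connects.
    let (connection, io_threads) = Connection::listen("127.0.0.1:5555")?;

    // ... run the initialize handshake and dispatch loop with `connection` here ...
    drop(connection);

    io_threads.join()?;
    Ok(())
}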

Invalid method name leads to a panic

[DEBUG][2021-10-08 06:46:46] .../vim/lsp/rpc.lua:395 "rpc.send" { id = 3, jsonrpc = "2.0", method = 0, params = "textDocument/formatting"}
[ERROR][2021-10-08 06:46:46] .../vim/lsp/rpc.lua:459 "rpc" "rust-analyzer" "stderr" "thread 'main' panicked at 'called Option::unwrap() on a None value', /home/puh/.cargo/registry/src/github.com-1ecc6299db9ec823/lsp-server-0.5.2/src/req_queue.rs:60:34\nstack backtrace:\n"

Trouble receiving SelectionRangeRequest

Hi there, I am trying to use lsp-server to capture selection events in VSCode; however, I only receive hover events.

let server_capabilities = serde_json::to_value(&ServerCapabilities {
    definition_provider: Some(OneOf::Left(true)),
    selection_range_provider: Some(SelectionRangeProviderCapability::Simple(true)),
    hover_provider: Some(HoverProviderCapability::Simple(true)),
    ..Default::default()
})
.unwrap();

Can anyone enlighten me as to where I am going wrong?
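One thing worth noting: advertising selection_range_provider only tells the client the capability exists. As far as I know, VSCode sends textDocument/selectionRange when the user invokes Expand Selection (there is no LSP event for ordinary cursor movement), and the request still has to be matched in your own dispatch loop. A rough sketch, with the actual range computation left out:

use lsp_server::{Request, RequestId, Response};
use lsp_types::request::{Request as _, SelectionRangeRequest};
use lsp_types::{SelectionRange, SelectionRangeParams};

// Try to interpret an incoming request as textDocument/selectionRange and answer it.
fn handle_selection_range(req: Request) -> Option<Response> {
    if req.method != SelectionRangeRequest::METHOD {
        return None;
    }
    let (id, _params): (RequestId, SelectionRangeParams) =
        req.extract(SelectionRangeRequest::METHOD).ok()?;
    // ... compute one SelectionRange chain per requested position here ...
    let ranges: Vec<SelectionRange> = Vec::new();
    Some(Response::new_ok(id, ranges))
}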

PublishDiagnostics issue

I am sending a diagnostic to VSCode with the following LSP server code:

let params = PublishDiagnosticsParams { /* ... */ };
let not = lsp_server::Notification::new(N::METHOD.to_string(), params);
connection.sender.send(not.into()).unwrap();

However, nothing happens in VSCode, even though the notification is sent successfully by the sender.send call. Can anyone enlighten me as to where the potential issue could be?

(The file path and the start and end positions are correct.)
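For comparison, a fuller sketch of publishing a diagnostic; the file URI is a made-up placeholder, and one thing worth double-checking is that the uri exactly matches the one the client used in textDocument/didOpen, since publishDiagnostics replaces the whole diagnostic set for that document:

use lsp_server::{Connection, Message, Notification};
use lsp_types::notification::{Notification as _, PublishDiagnostics};
use lsp_types::{Diagnostic, Position, PublishDiagnosticsParams, Range, Url};

fn publish_example(connection: &Connection) {
    let uri = Url::parse("file:///tmp/example.sol").unwrap(); // hypothetical file
    let params = PublishDiagnosticsParams {
        uri,
        diagnostics: vec![Diagnostic::new_simple(
            Range::new(Position::new(0, 0), Position::new(0, 5)),
            "example diagnostic".to_string(),
        )],
        version: None,
    };
    // textDocument/publishDiagnostics is a notification, so it carries no response.
    let not = Notification::new(PublishDiagnostics::METHOD.to_string(), params);
    connection.sender.send(Message::Notification(not)).unwrap();
}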

Need additional info on goto_def example and server interfacing.

Hi there,
I am working on a VSCode extension plus a Solidity language server in Rust, to provide support for the Solidity language in Hyperledger.
I was trying the goto_def example and I hit some hurdles while doing so.

  1. There is no documented way to interface the client side with the server. I tried running the goto_def example, but unfortunately its output sometimes appears in the VSCode console and sometimes does not, so I would like some clarification on that.
    (Screenshot for reference: servercli)

The repo: https://github.com/Hyperion101010/sls/tree/p1_work

Also, since there are very few resources available on the server side of VSCode extension development, I would appreciate any useful docs or blogs you have come across on language server development in languages other than TypeScript and Node.js, such as Rust or Python.

build error: error[E0658]: use of unstable library feature 'panic_any'

I was trying to build texlab and got this error:

 % cargo install --path ./
  Installing texlab v3.0.0 (C:\dev\rust\texlab)
    Updating crates.io index
    Updating git repository `https://github.com/cormacrelf/citeproc-rs`
   Compiling markup5ever v0.9.0
   Compiling lsp-server v0.5.1
   Compiling texlab v3.0.0 (C:\dev\rust\texlab)
   Compiling xml5ever v0.16.1
   Compiling html5ever v0.25.1
error[E0658]: use of unstable library feature 'panic_any'
  --> C:\Users\jtebokkel\.cargo\registry\src\github.com-1ecc6299db9ec823\lsp-server-0.5.1\src\stdio.rs:60:17
   |
60 |                 std::panic::panic_any(err)
   |                 ^^^^^^^^^^^^^^^^^^^^^
   |
   = note: see issue #78500 <https://github.com/rust-lang/rust/issues/78500> for more information

error[E0658]: use of unstable library feature 'panic_any'
  --> C:\Users\jtebokkel\.cargo\registry\src\github.com-1ecc6299db9ec823\lsp-server-0.5.1\src\stdio.rs:67:17
   |
67 |                 std::panic::panic_any(err);
   |                 ^^^^^^^^^^^^^^^^^^^^^
   |
   = note: see issue #78500 <https://github.com/rust-lang/rust/issues/78500> for more information

error: aborting due to 2 previous errors

For more information about this error, try `rustc --explain E0658`.
error: could not compile `lsp-server`

To learn more, run the command again with --verbose.
warning: build failed, waiting for other jobs to finish...
error: failed to compile `texlab v3.0.0 (C:\dev\rust\texlab)`, intermediate artifacts can be found at `C:\dev\rust\texlab\target`

Question regarding license

It's not clear what the licensing situation for this project is. Is it safe to assume it's licensed in the same way as rust-analyzer (dual MIT/Apache-2.0)?

Testing small projects - sending data to goto_def example results in error

I'm a bit new to LSP programs and am trying to test a small sample right now. I'm running cargo run --example goto_def, but I can't seem to figure out how to actually send commands over a stdio connection.

I saw PR #27, and I've tried copying JSON-RPC commands and piping them to the command, but neither approach seems to work properly:

$ cargo run --example goto_def

    Finished dev [unoptimized + debuginfo] target(s) in 0.05s
     Running `target/debug/examples/goto_def`
starting generic LSP server
{"jsonrpc":"2.0","method":"initialize","id":1,"params":{"capabilities":{}}}
Error: ProtocolError("expected initialize request, got error: receiving on an empty and disconnected channel")

$  cat send1                                                                     

{"jsonrpc":"2.0","method":"initialize","id":1,"params":{"capabilities":{}}}

$  cat send1|cargo run --example goto_def                                              

    Finished dev [unoptimized + debuginfo] target(s) in 0.06s
     Running `target/debug/examples/goto_def`
starting generic LSP server
Error: ProtocolError("expected initialize request, got error: receiving on an empty and disconnected channel")

$   cargo run --example goto_def < send1                                           

    Finished dev [unoptimized + debuginfo] target(s) in 0.02s
     Running `target/debug/examples/goto_def`
starting generic LSP server
Error: ProtocolError("expected initialize request, got error: receiving on an empty and disconnected channel")

I changed the sample to use sockets, but ran into a different error:

diff examples/goto_def.rs
-    let (connection, io_threads) = Connection::stdio();
+    let (connection, io_threads) = Connection::listen("127.0.0.1:5555")?;

$ cargo run                                                                         

    Finished dev [unoptimized + debuginfo] target(s) in 0.10s
     Running `target/debug/lumos-lsp`
starting generic LSP server
thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: Custom { kind: InvalidData, error: "malformed header: \"POST / HTTP/1.1\"" }', /Users/paula1/.cargo/registry/src/github.com-1ecc6299db9ec823/lsp-server-0.6.0/src/socket.rs:27:60
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Error: ProtocolError("expected initialize request, got error: receiving on an empty and disconnected channel")

# sent this curl cmd from a different tty during run
 $ curl -vv -XPOST -H 'Content-Type: application/json' -d '{"jsonrpc":"2.0","method":"initialize","id":1,"params":{"capabilities":{}}}' http://localhost:5555

Note: Unnecessary use of -X or --request, POST is already inferred.
*   Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 5555 (#0)
> POST / HTTP/1.1
> Host: localhost:5555
> User-Agent: curl/7.64.1
> Accept: */*
> Content-Type: application/json
> Content-Length: 75
>
* upload completely sent off: 75 out of 75 bytes
* Empty reply from server
* Connection #0 to host localhost left intact
curl: (52) Empty reply from server
* Closing connection 0

Am I missing something basic here about communication with the server?
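A note on framing, since both failures look consistent with unframed input: the LSP base protocol requires each message to be prefixed with a Content-Length header and a blank line, over plain stdio or a raw TCP stream. It is not HTTP, which is why the curl POST is rejected as a malformed header. A small sketch that prints a properly framed initialize request, which could then be piped into the example:

use std::io::Write;

fn main() -> std::io::Result<()> {
    let body = r#"{"jsonrpc":"2.0","method":"initialize","id":1,"params":{"capabilities":{}}}"#;
    let stdout = std::io::stdout();
    let mut out = stdout.lock();
    // LSP base protocol: Content-Length header, blank line, then the JSON-RPC body.
    write!(out, "Content-Length: {}\r\n\r\n{}", body.len(), body)?;
    out.flush()
}

Even with framing, the channel error can still show up if stdin closes immediately after the single request, because the server keeps waiting for the rest of the handshake; a real client holds the connection open.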

Connection::initialize might be the wrong interface

It turns out that ServerCapabilities may need to be modified based on ClientCapabilities before being sent out, but that cannot be done with the current library (we can only access ClientCapabilities after constructing the server capabilities). In addition, since LSP 3.15 InitializeResult can have a serverInfo field to pass back to the client, which is also not supported by this interface.

Context: rust-lang/rust-analyzer#4130
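For context, the shape this could take is a two-phase handshake. The sketch below assumes the initialize_start / initialize_finish split that later versions of the crate expose, with placeholder serverInfo values:

use lsp_server::Connection;
use lsp_types::{InitializeParams, ServerCapabilities};

fn handshake(connection: &Connection) -> Result<(), Box<dyn std::error::Error + Sync + Send>> {
    // Phase 1: receive the client's initialize request before deciding on capabilities.
    let (id, params) = connection.initialize_start()?;
    let _client_params: InitializeParams = serde_json::from_value(params)?;

    // _client_params.capabilities can now inform which ServerCapabilities to advertise.
    let capabilities = ServerCapabilities::default();

    // Phase 2: reply with a full InitializeResult, including serverInfo (placeholder values).
    let initialize_result = serde_json::json!({
        "capabilities": capabilities,
        "serverInfo": { "name": "example-server", "version": "0.1.0" }
    });
    connection.initialize_finish(id, initialize_result)?;
    Ok(())
}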

[Question] How to ship with a VSCode extension

Hi,

I am planning to build an LSP server in Rust and ship it with my VSCode extension.
What would be the best way to do this?
Should I have the builds ready and download them when the extension is first installed, or ship the builds for all platforms with the extension?

Thanks in advance!

Async version of the server

Thanks for creating this project! It's been a tremendous help when implementing deno lsp (https://github.com/denoland/deno/tree/master/cli/lsp).

However, due to the synchronous nature of the server we've hit a few problems that required us to "asyncify" it. We did this by spawning a separate thread that "pumps" messages received from the client. This solution works for now, but it requires additional infrastructure to cooperate with Tokio.
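A stripped-down sketch of that pumping thread, assuming a Tokio runtime with an unbounded tokio channel on the async side (an illustration rather than the actual deno code):

use lsp_server::{Connection, Message};
use tokio::sync::mpsc;

// Bridge the blocking crossbeam receiver into an async-friendly channel.
fn spawn_pump(connection: &Connection) -> mpsc::UnboundedReceiver<Message> {
    let (tx, rx) = mpsc::unbounded_channel();
    let receiver = connection.receiver.clone();
    std::thread::spawn(move || {
        // Blocking recv loop on a dedicated thread; forward each message to the async side.
        for msg in receiver {
            if tx.send(msg).is_err() {
                break; // the async side went away
            }
        }
    });
    rx
}

The async side can then await rx.recv() like any other Tokio channel.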

So here's my question: is there interest from the rust-analyzer team in having an async version of the server? The codebase is relatively small, and porting it to use Tokio should be a rather straightforward task. I'll also be happy to fork the project and provide a separate crate once #25 is answered.

CC @matklad

Recommended testing strategy

Hello!

Thanks for this library, it looks very handy :)

Is there a recommended testing strategy when using this library? Would you test it via the Connection, or would you focus on the functions that the Connection-handling code dispatches individual messages to?

Thanks,
Louis
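One approach that has worked for end-to-end style tests is Connection::memory(), which returns two connected in-memory Connections, so a test can hold one end and play the client while the other end is handed to the server's main loop. A minimal sketch (the didSave payload here is just an arbitrary example):

#[cfg(test)]
mod tests {
    use lsp_server::{Connection, Message, Notification};

    #[test]
    fn memory_connection_round_trip() {
        // One end goes to the code under test, the other is driven by the test as the "editor".
        let (server_side, client_side) = Connection::memory();

        let note = Notification::new("textDocument/didSave".to_string(), serde_json::json!({}));
        client_side.sender.send(Message::Notification(note)).unwrap();

        match server_side.receiver.recv().unwrap() {
            Message::Notification(n) => assert_eq!(n.method, "textDocument/didSave"),
            other => panic!("unexpected message: {:?}", other),
        }
    }
}

In a fuller test you would spawn your dispatch loop on a thread with server_side, drive the initialize/initialized handshake from client_side, and assert on the responses; unit-testing the per-request handler functions directly is the complementary option.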
