
scalameta / scalameta


Library to read, analyze, transform and generate Scala programs

Home Page: http://scalameta.org/

License: BSD 3-Clause "New" or "Revised" License

Languages: Scala 99.03%, Shell 0.12%, Python 0.08%, Java 0.38%, JavaScript 0.32%, CSS 0.03%, HTML 0.03%
Topics: metaprogramming, parser, pretty-printer, scala, semantic, semanticdb, syntactic

scalameta's Introduction

scalameta


User documentation

Head over to the user docs to learn more about the project and its roadmap.

Tutorial

If you'd like to find out how to use scalameta, see this tutorial.

Team

The current maintainers (people who can merge pull requests) are:

An up-to-date list of contributors is available here: https://github.com/scalameta/scalameta/graphs/contributors.


scalameta's Issues

AST interpretation

Along with AST persistence, interpretation is one of the fundamental axioms of scala.meta and a cornerstone of #160. @mutcianm prototyped an interpreter for scala.reflect, and @vjovanov prototyped an interpreter for scala.meta (https://github.com/scalameta/interpreter). Once the fundamentals of scala.meta are ready (i.e. once 0.1 is released), we'll start writing a full-fledged interpreter for our ASTs.

Better tokens for literals and comments

Character literals, string literals and comments are currently represented as atomic tokens. If we want to support unquoting into these language constructs (e.g. q"// $comment"), we need to split them into Start/Part/End tokens, akin to how string interpolation is handled.
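
A minimal sketch of what the split representation could look like, using toy case classes; CommentStart, CommentPart and CommentEnd are hypothetical names, not actual scala.meta token classes:

// Hypothetical token split for comments; the real scala.meta token classes may differ.
sealed trait CommentToken { def text: String }
final case class CommentStart(text: String) extends CommentToken // e.g. "//" or "/*"
final case class CommentPart(text: String)  extends CommentToken // the payload; the slot an unquote would fill
final case class CommentEnd(text: String)   extends CommentToken // e.g. "\n" or "*/"

object CommentTokenDemo extends App {
  // "// hello" tokenizes into three pieces instead of one atomic token,
  // so a quasiquote like q"// $comment" could unquote into the Part slot.
  val tokens = List(CommentStart("//"), CommentPart(" hello"), CommentEnd("\n"))
  println(tokens.map(_.text).mkString)
}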

Finish lexical model

All in all, the idea of having tokens associated with trees worked extremely well, because with tokens we: 1) get the lowest-level syntactic details (whitespace, comments, etc.), and 2) get positions for free.

However, recent experiments by @mdemarne and @Duhemm exposed some shortcomings of our implementation that have to be addressed before we can consider the model finished:

  1. The current encoding of tokens as essentially tuples of (Content, Dialect, Offset, Offset) is too wasteful, given that big files can produce tens or hundreds of thousands of tokens.
  2. The aforementioned encoding is not only wasteful, but also inadequate for synthetic tokens (ones generated by #149 or by token quasiquotes). Such tokens don't really have any associated Content, but we are forced to create one for them anyway (e.g. see the TODO comment "// TODO: Ugh, creating synthetic inputs with semi-meaningless text just to satisfy the framework.").

Def macros

Design and implement a language extension for Scala that would expose functionality equivalent to def macros based on the platform-independent scala.meta API.

upd. (ScalaDays San Francisco 2015) We already have a prototype of this at https://github.com/scalameta/scalahost/blob/f1a17f613e18673b570d70548837399deada7b84/tests/src/test/scala/macros/SerializerMetaJoint.scala; its inner workings are elaborated on at https://groups.google.com/forum/#!topic/scalameta/nE21r2TI8tI.

upd. (ScalaDays Berlin 2016) At ScalaDays Berlin, the question of the future of macros was very hot. I even had to change the title of my talk to make a definitive statement (https://twitter.com/mzywiol/status/743782879566630912). The main prerequisite to implementing this issue is #436 (semantic APIs), which itself depends on a bunch of other issues. Here's a picture that displays the dependency graph at the current moment:

Make fields like denot, sigma, etc private to meta

Regular users should never be able to write these fields (via copy) or read them (via getters). We need to think about how to consistently hide these fields everywhere. Making the getters private[meta] is simple, but making a named parameter private[meta] is not that easy.
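
A minimal sketch of the problem, using simplified stand-ins; Denot and the Name class below are placeholders for the real scala.meta internals:

// Toy model: hiding a getter vs. hiding a named copy parameter.
package meta {
  final case class Denot(value: String)

  // Hiding the getter is the easy part: the accessor can be private[meta].
  final class Name(val value: String, private[meta] val denot: Denot) {
    // A copy method with a named denot parameter is harder to hide selectively:
    // either the whole method becomes private[meta], or the parameter stays public.
    private[meta] def copy(value: String = this.value, denot: Denot = this.denot): Name =
      new Name(value, denot)
  }
}

package demo {
  object PrivacyDemo extends App {
    val n = new meta.Name("x", meta.Denot("term x"))
    println(n.value)       // ok: the public surface
    // println(n.denot)    // does not compile outside package meta
    // n.copy(value = "y") // hidden too, but only because the whole copy method is private[meta]
  }
}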

Incorrect behavior of ..$ quasiquotes

val q"($q, y: Y, ..$e)" = q"(x: X, y: Y, z: Z)"
q: scala.meta.Term.Param = x: X
e: scala.collection.immutable.Seq[scala.meta.Term.Param] = List(y: Y, z: Z)

Here e should have been List(z: Z), but it is not.

Incorrect error message in typecheckError

println(typecheckError("""
import scala.meta._
import scala.meta.dialects.Scala211
val l = List()
q"..$l"
"""))

gives a strange error:

<macro>:5: exception during macro expansion: 
scala.NotImplementedError: an implementation is missing
    at scala.Predef$.$qmark$qmark$qmark(Predef.scala:225)
    at org.scalameta.ast.Reflection$XtensionAstType$$anonfun$publish$1.apply(Reflection.scala:84)
    at org.scalameta.ast.Reflection$XtensionAstType$$anonfun$publish$1.apply(Reflection.scala:81)
    at scala.reflect.internal.Types$Type$$anon$5.apply(Types.scala:764)
    at scala.reflect.internal.Types$Type$$anon$5.apply(Types.scala:763)
    at scala.collection.immutable.List.loop$1(List.scala:173)
    at scala.collection.immutable.List.mapConserve(List.scala:189)
    at scala.reflect.internal.tpe.TypeMaps$TypeMap.mapOver(TypeMaps.scala:115)
    at scala.reflect.internal.Types$Type$$anon$5.apply(Types.scala:764)
    at scala.reflect.internal.Types$Type.map(Types.scala:765)
    at scala.reflect.internal.Types$Type.map(Types.scala:260)
    at org.scalameta.ast.Reflection$XtensionAstType.publish(Reflection.scala:81)
    at scala.meta.internal.quasiquotes.ast.ConversionMacros.unliftApply(ConversionMacros.scala:69)
    at sun.reflect.GeneratedMethodAccessor69.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at scala.reflect.macros.runtime.JavaReflectionRuntimes$JavaReflectionResolvers$$anonfun$resolveJavaReflectionRuntime$2.apply(JavaReflectionRuntimes.scala:34)
    at scala.reflect.macros.runtime.JavaReflectionRuntimes$JavaReflectionResolvers$$anonfun$resolveJavaReflectionRuntime$2.apply(JavaReflectionRuntimes.scala:22)
    at scala.tools.nsc.typechecker.Macros$class.macroExpandWithRuntime(Macros.scala:756)

      q"..$l"
      ^

More precise prettyprinting layouts

After #149 is implemented, we'll have basic support for merging different trees together using a simple 2-space indentation scheme. It would be nice to let users customize this scheme by providing various options to the token inferencer.
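
A minimal sketch of what such options could look like; LayoutOptions and its fields are hypothetical, not an existing scala.meta API:

object LayoutSketch extends App {
  final case class LayoutOptions(
    indent: Int = 2,       // spaces per indentation level (the current hard-coded scheme)
    maxColumn: Int = 100,  // a column limit the token inferencer could wrap at
    indentWithTabs: Boolean = false
  )

  // Toy use of the options: indent the statements of a synthetic block.
  def renderBlock(stats: List[String], options: LayoutOptions): String = {
    val pad = if (options.indentWithTabs) "\t" else " " * options.indent
    stats.map(pad + _).mkString("{\n", "\n", "\n}")
  }

  println(renderBlock(List("val x = 1", "val y = 2"), LayoutOptions(indent = 4)))
}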

Code cleanup

As a direct consequence of its experimental nature, our codebase has accumulated a lot of cruft. While a year of experiments has led us to a concrete understanding of what we're going to release in 0.1, it has also left a bunch of traps that preclude easy contributions to scala.meta. Some of the especially vicious devices that we need to disarm:

Hygiene

Hygiene (summarized in terms of scala.reflect in http://docs.scala-lang.org/overviews/quasiquotes/hygiene.html) is an extremely useful property of AST construction/deconstruction facilities. While hacking up a subset of hygiene is simple (scala.reflect has it in the form of reify), implementing full-fledged support on par with classic languages like Scheme and Racket requires significant effort.

Finish semantic model

One of our main experiments with scala.meta trees was finding a platform-independent model for semantic information, i.e. information about a program that can't be obtained by simply looking at the surface syntax, namely: 1) resolved references, 2) computed types, 3) desugared terms.

To express semantic information, scala.reflect uses symbols and types and operates on desugared trees; Dotty does the same, adding denotations to the mix; IntelliJ has types but not symbols and doesn't do any desugaring. In short, every potential platform does semantics differently, so we really had to start from scratch and design something that would be compatible with the multitude of popular approaches.

Our current prototype uses platform-independent denotations and sigmas for resolved references, plus platform-dependent scratchpads for everything else. I think the former worked quite well, while the latter has to be replaced (we can start with something as simple as adding Tree.tpe and Tree.desugaring fields; a sketch follows the checklist below).

  • Remove Tree.scratchpad
  • Add Term.tpe
  • Add Term.desugaring
  • Update ast.md
  • Update show[Semantics] to account for new semantic fields
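
A minimal sketch of the proposed surface, using toy stand-ins rather than real scala.meta trees; only the shape of the tpe/desugaring fields comes from the checklist above, the rest is made up for illustration:

object SemanticFieldsSketch extends App {
  sealed trait Type
  final case class TypeRef(name: String) extends Type

  // Toy Term carrying the two proposed semantic fields; a semantic host would fill them in.
  final case class Term(
    syntax: String,
    tpe: Option[Type] = None,        // computed type, e.g. Int for "x + y"
    desugaring: Option[Term] = None  // e.g. "x + y" desugared to "x.+(y)"
  )

  val sugared = Term("x + y")
  val attributed = sugared.copy(
    tpe = Some(TypeRef("Int")),
    desugaring = Some(Term("x.+(y)", tpe = Some(TypeRef("Int"))))
  )
  println(attributed)
}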

Documentation

There is already a lot in scala.meta, and there are early adopters (https://github.com/metadoc/metadoc and http://blog.codacy.com/2015/06/04/a-quick-look-at-scalameta/) who are using it to solve practical issues. However, there are no comprehensive guides that describe the available functionality and give lightweight examples (the readme in this repo introduces some of the key concepts, but it's by no means exhaustive, and scalameta/example is very much a DIY thing, plus its build is quite scary).

Denotations are too precise

The current idea of denotations including a prefix alongside a symbol is inspired by scala.reflect's problems with asSeenFrom/typeSignatureIn (in short, getting the type signature of a symbol requires computing and carrying around prefixes, e.g. see https://gitter.im/scala/scala?at=556cee1d05c872ce6ac7843d).

With denotations being equivalent to (Prefix, Symbol), we can automatically compute signatures in situations like t"List[Int]".defs("head").tpe without users ever needing to know that there's something non-trivial going on here.

Now that we have prefix information baked into names and this information affects the behavior of those names, it's only logical to account for prefixes in equality checks.

This, however, gives rise to an annoying problem: if one compares names that point to the same definition, the comparison will return false when the prefixes are different, e.g. t"42".defs("+") != t"Int".defs("+"). This is clearly unwanted and should be fixed.

The fix is complicated by the fact that equality checks can't use hosts: trees are platform-independent (they don't have links to hosts), and == and equals can't take implicit arguments. Therefore we can't just ask someone to resolve a given symbol against a given prefix during equality checks.

problems with unquoting subtrees into trees

We have several places in the current implementation of quasiquotes where unquoting a subtree into a tree is not as easy as writing ... $subtree .... We need to scan through the spec, detect all those cases and decide what to do with them.

Examples:

  • new $template doesn't work
  • class C $ctor or something doesn't work

inconsistent construction of Pat.Var.XXX nodes

We need easy ways to create Pat.Var.Term and Pat.Var.Type, and the spec promises them to us in the form of p"name" and pt"name".

However, if we dig deeper, we'll find that even though p"x" is a Pat.Var.Term, p"X" is a Term.Name, and both pt"x" and pt"X" are Type.Names. This is a direct consequence of how Scala's parser works: a Pat.Var.Term with a capitalized name is only allowed on the lhs of vals and vars, whereas Pat.Var.Type is only allowed in type arguments.

Term.tpe

After #165 has been merged, Tree.scratchpads that were used to implement Term.tpe no longer exist. We should adapt scalahost to provide a new implementation for this important API.

need dedicated quasiquotes to work with <smth>.Arg

Some constructions can only be used in argument position, for example def f(x: T*).
There should be special quasiquotes (qarg, targ, parg) to create them.

So, t"T*" or t"=> T" should give a parse error, but targ"T*", targ"=> T" and parg"_*" should produce the appropriate trees (for these examples, Type.Arg.Repeated, Type.Arg.ByName and Pat.Arg.SeqWildcard respectively).

Tune performance

We have always prioritized getting the concepts right over getting blazing speed, but now that the core ideas have proven their usefulness, it's time to benchmark and see where we can improve performance. Potential areas:

Modularization

Historically, we've divided the core artifacts of the scala.meta platform into: 1) foundation (miscellaneous helpers), 2) scalameta (trees + the entire surface of platform-independent APIs), 3) scalahost (a scalac <-> scalameta adapter).

Nowadays, with a bunch of functionality packed into scalameta, it seems reasonable to split it into several parts, e.g. tokens + trees + prettyprinting + semantics + parsing + tql. This might be useful both for documentation purposes (easier to understand) and for psychological reasons (minimalism).

We need to see, however, how this is going to play with the current design of "import scala.meta._ should be enough for everyone", where package object meta simultaneously inherits from all the aforementioned APIs.
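
A minimal sketch of that design, with hypothetical module traits standing in for the real token/parsing/prettyprinting surfaces:

package sketch {
  trait TokenApi  { def tokenize(code: String): List[String] = code.split("\\s+").toList }
  trait ParseApi  { def parse(tokens: List[String]): String  = s"Tree(${tokens.mkString(" ")})" }
  trait PrettyApi { def show(tree: String): String           = tree }

  // One package object mixes in every module surface,
  // so a single wildcard import is enough for everything.
  package object meta extends TokenApi with ParseApi with PrettyApi
}

package usage {
  object ModularizationDemo extends App {
    import sketch.meta._
    println(show(parse(tokenize("val x = 1"))))
  }
}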

update readme and roadmap

I, like many others, am very excited about how things are going. But the latest news, the roadmap and the readme have not been updated since summer...

Weird warning on `val q"$lit" = q"42"`

import scala.meta._
import scala.meta.dialects.Scala211

object Test {
  def main(args: Array[String]): Unit = {
    val q"$lit" = q"42"
  }
}
12:19 ~/Projects/core/sandbox (master)$ s
Test.scala:8: warning: patterns after a variable pattern cannot match (SLS 8.1.1)
    val q"$lit" = q"42"
        ^
Test.scala:8: warning: unreachable code due to variable pattern 'quasiquote$macro$2$hole$0' on line 8
    val q"$lit" = q"42"
        ^
Test.scala:8: warning: unreachable code
    val q"$lit" = q"42"
        ^
three warnings found

q"(i: Int) => 42" doesn't work

11:01 ~/Projects/core/sandbox (master)$ parse '(i: Int) => 42'
// Scala source: /var/folders/n4/39sn1y_d1hn3qjx6h1z3jk8m0000gn/T/tmpVj8xdZ
scala.meta.ParseException: ; expected but right arrow found at 9..11
    at scala.meta.internal.parsers.Reporter$class.syntaxError(Reporter.scala:13)
    at scala.meta.internal.parsers.Reporter$$anon$1.syntaxError(Reporter.scala:19)
    at scala.meta.internal.parsers.Reporter$class.syntaxError(Reporter.scala:14)
    at scala.meta.internal.parsers.Reporter$$anon$1.syntaxError(Reporter.scala:19)
    at scala.meta.internal.parsers.Parser.syntaxErrorExpected(Parsers.scala:478)
    at scala.meta.internal.parsers.Parser.accept(Parsers.scala:484)
    at scala.meta.internal.parsers.Parser.acceptStatSep(Parsers.scala:497)
    at scala.meta.internal.parsers.Parser.acceptStatSepOpt(Parsers.scala:501)
    at scala.meta.internal.parsers.Parser.statSeq(Parsers.scala:2699)
    at scala.meta.internal.parsers.Parser$$anonfun$parseStat$1.apply(Parsers.scala:46)
    at scala.meta.internal.parsers.Parser$$anonfun$parseStat$1.apply(Parsers.scala:46)
    at scala.meta.internal.parsers.Parser.parseRule(Parsers.scala:36)
    at scala.meta.internal.parsers.Parser.parseStat(Parsers.scala:46)
    at scala.meta.syntactic.Api$Parse$$anonfun$parseStat$1.apply(Api.scala:40)
    at scala.meta.syntactic.Api$Parse$$anonfun$parseStat$1.apply(Api.scala:40)
    at scala.meta.syntactic.Api$Parse$$anon$1.apply(Api.scala:39)
    at scala.meta.syntactic.Api$Parse$$anon$1.apply(Api.scala:39)
    at scala.meta.syntactic.Api$XtensionInputLike.parse(Api.scala:60)
    at scala.meta.tools.Metac$.delayedEndpoint$scala$meta$tools$Metac$1(Metac.scala:106)
    at scala.meta.tools.Metac$delayedInit$body.apply(Metac.scala:9)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
    at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
    at scala.App$class.main(App.scala:76)
    at scala.meta.tools.Metac$.main(Metac.scala:9)
    at scala.meta.tools.Metac.main(Metac.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at scala.reflect.internal.util.ScalaClassLoader$$anonfun$run$1.apply(ScalaClassLoader.scala:70)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:101)
    at scala.reflect.internal.util.ScalaClassLoader$class.run(ScalaClassLoader.scala:70)
    at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:101)
    at scala.tools.nsc.CommonRunner$class.run(ObjectRunner.scala:22)
    at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:39)
    at scala.tools.nsc.CommonRunner$class.runAndCatch(ObjectRunner.scala:29)
    at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:39)
    at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:65)
    at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:87)
    at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:98)
    at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:103)
    at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)

Impossible to use `val importee"$a => $b" = importee"x => _"`

So, the lhs results in Import.Selector.Rename(Name.Indeterminate.Quasi(0, (quasiquote$macro$4$hole$0 @ _)), Name.Indeterminate.Quasi(0, (quasiquote$macro$4$hole$1 @ _))), and the rhs in Import.Selector.Unimport(Name.Indeterminate("x")). Since Import.Selector.Unimport != Import.Selector.Rename, pattern matching fails with a MatchError.

One should probably merge Unimport into Rename, with Rename.to == _.
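
A minimal sketch of the proposed merge, with toy case classes rather than the real Import.Selector hierarchy:

object ImporteeSketch extends App {
  final case class Name(value: String)

  // Today: two distinct nodes, so a rename pattern cannot match an unimport.
  sealed trait Selector
  final case class Rename(from: Name, to: Name) extends Selector // x => y
  final case class Unimport(from: Name)         extends Selector // x => _

  // Proposal: fold Unimport into Rename by letting `to` be the wildcard name,
  // so that importee"$a => $b" matches both shapes.
  final case class UnifiedRename(from: Name, to: Name) {
    def isUnimport: Boolean = to.value == "_"
  }

  println(UnifiedRename(Name("x"), Name("_")).isUnimport) // prints true
}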

Exception while unquoting names like '$x'

Here is an example (note the space in the second line). Also, the second line should probably result in "y", not "$y".

scala> val $y = q"y"
$y: scala.meta.internal.ast.Term.Name = y

scala> q" $$y"
res2: scala.meta.internal.ast.Term.Name = $y

scala> q"$$y"
<console>:15: error: exception during macro expansion:
java.lang.UnsupportedOperationException: position-changing adjust on Token.BOF: expected 0..0, actual 1..1
    at scala.meta.syntactic.Token$BOF.adjust(Token.scala:145)
    at scala.meta.internal.quasiquotes.ast.ReificationMacros$$anonfun$5$$anonfun$apply$1.apply(ReificationMacros.scala:169)
    at scala.meta.internal.quasiquotes.ast.ReificationMacros$$anonfun$5$$anonfun$apply$1.apply(ReificationMacros.scala:167)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
    at scala.collection.Iterator$class.foreach(Iterator.scala:742)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at scala.meta.internal.quasiquotes.ast.ReificationMacros$$anonfun$5.apply(ReificationMacros.scala:167)
    at scala.meta.internal.quasiquotes.ast.ReificationMacros$$anonfun$5.apply(ReificationMacros.scala:104)
    at scala.collection.immutable.List.map(List.scala:273)
    at scala.meta.internal.quasiquotes.ast.ReificationMacros.parseSkeleton(ReificationMacros.scala:104)
    at scala.meta.internal.quasiquotes.ast.ReificationMacros.expand(ReificationMacros.scala:64)
    at scala.meta.internal.quasiquotes.ast.ReificationMacros.apply(ReificationMacros.scala:61)

       q"$$y"
       ^

Attach tokens to quasiquotes

The current implementation of quasiquotes (described in detail in #157) relies on materializeAst to reify quasiquoted trees. This materializer only persists tree shapes and semantic information, but not tokens. We need to fix this if we want to implement things like #46.

scala.reflect -> scala.meta tree converter

The reflect -> meta AST converter was one of the initial pieces of functionality implemented in scalahost (the first one was the prototype for lightweight macros and the second one was the converter).

At the time when the AST converter was being developed, we had a different vision for scala.meta, which led to several fundamental assumptions in scalahost: 1) the only environment in which the converter will be invoked is a post-typer compiler plugin for scalac, 2) it is possible to undo the desugarings that scalac performs on scala.reflect trees, 3) it is practical to do so.

Now, one year later, we live in a different world: 1) it turned out that converting unattributed trees is also important (e.g. to support macro annotations), 2) most desugarings can indeed be undone, but 3) doing so requires patching the typechecker to store out-of-band information about trees, and that has absolutely no chance of scaling to TASTY.

Finally, the converter ended up being too smart for its own good. I wanted to experiment with a typeclass-based approach, in which a macro would analyze the code of the converter and automatically infer fundeps between input and output types, but that ended up being an abomination to write and support, and now I just want it gone.

Therefore, we are planning to rewrite the converter from scratch according to the following scheme:

  1. Figure out source code of the output scala.meta tree. If the input scala.reflect tree has associated source code, take it. Otherwise, prettyprint the input to obtain synthetic source code.
  2. Parse the source code with the scala.meta parser to obtain the skeleton of the output. If there was actual, human-written source code, our parser will retain all syntactic details, otherwise it doesn't matter anyway.
  3. Traverse the skeleton and the input in parallel, taking note of: a) symbols, b) types, c) desugarings. This conflates the ensugaring and conversion steps of the current converter, which is not that good, but, more importantly, obviates the need to guess tree shapes from scala.reflect's lossy representation, which is priceless.

This will realize our long-term dream of perfect trees that combine syntactic comprehensiveness and semantic richness, the cornerstone of the 0.1 release: https://github.com/scalameta/scalameta/milestones.

Upd. One more year later, the vision has changed again. See #148 (comment) for more information.

Quasiquotes

We currently have a prototype of quasiquotes that works as follows:

  1. String interpolators are defined for the major non-terminals of the Scala grammar, e.g. @astQuasiquote[Stat]('q) implicit class XtensionQuasiquoteTerm(ctx: StringContext).
  2. These interpolators dispatch to a macro.
  3. The macro assembles and parses the tokens that comprise the quasiquote.
  4. The resulting syntactic tree is populated with attributes.
  5. The resulting semantic tree is then reified using a combination of auto-derived liftables and hand-written handlers for unquotes and splices.

This prototype already works for construction and deconstruction when quasiquotes don't contain any unquotes. However, unquoting is currently problematic because it requires support from both the parser and the reifier.
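
A minimal, non-macro sketch of steps 1 and 3: a toy interpolator that reassembles the quasiquote's parts and hands them to a stubbed parser. The real implementation does all of this inside a whitebox macro so that unquotes can be spliced at compile time:

object QuasiquoteSketch extends App {
  final case class Tree(syntax: String)

  implicit class XtensionToyQuasiquote(private val sc: StringContext) {
    def q(args: Any*): Tree = {
      // Interleave the literal parts with the unquoted arguments back into source text...
      val code = sc.parts.zipAll(args.map(_.toString), "", "").map { case (p, a) => p + a }.mkString
      // ...and hand it to the parser (stubbed here as a plain wrapper).
      Tree(code)
    }
  }

  val x = Tree("x")
  println(q"$x + 1") // Tree(Tree(x) + 1) -- the real q"" splices the tree itself, not its toString
}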

The following steps need to be taken to make this prototype truly useful:

The naive approximation of hygiene should also be replaced, but that's a really big topic that deserves its own issue: #156.

Denotation.Multi

Currently, our semantic model assumes that every name unambiguously binds to a single definition represented by a denotation. Leaving aside the situation with partially typechecked trees that can arise in IDEs or the REPL, this model doesn't cover imports in Scala or union types in Dotty. Therefore, we need to introduce a notion of multidenotations (e.g. by adding another ADT case to Denotation).
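
A minimal sketch of the proposal, with toy stand-ins for Prefix and Symbol; the point is only the extra Multi case, not the exact shapes of the real internals:

object DenotationSketch extends App {
  final case class Prefix(syntax: String)
  final case class Symbol(fullName: String)

  sealed trait Denotation
  case object Zero                                              extends Denotation
  final case class Single(prefix: Prefix, symbol: Symbol)       extends Denotation
  // Proposed addition: a name that binds to several definitions at once,
  // e.g. an ambiguous import or a member of a Dotty union type.
  final case class Multi(prefix: Prefix, symbols: List[Symbol]) extends Denotation

  val ambiguous = Multi(Prefix("p"), List(Symbol("a.X"), Symbol("b.X")))
  println(ambiguous)
}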

Correct processing of optional terms

Operations like val q"try $a finally $b" = q"try x finally y" should return Option-wrapped values where the spec says so. So, in the example above, b should be Some[scala.meta.Term] instead of scala.meta.Term.

This issue affects pattern-matching unquoting with non-existent parts on the right-hand side. For example, in val q"try $a finally $b" = q"try x", b should be None, whereas at the moment a MatchError occurs.

The bug is probably inside the reifier.

Missing value when unquoting enumerators

val q"for (a <- as; if $cond; ..$enumz) bar" = q"for (a <- as; if foo; b <- bs) bar"
produces the following error: not found: value quasiquote$macro$20$hole$0

Here is the full output with -Dquasiquote.debug:

scala> val q"for (a <- as; if $cond; ..$enumz) bar" = q"for (a <- as; if foo; b <- bs) bar" 

Adhoc(List(BOF (0..0), for (0..3),   (3..4), ( (4..5), a (5..6),   (6..7), <- (7..9),   (9..10), as (10..12), ; (12..13),   (13..14), if (14..16),   (16..17), foo (17..20), ; (20..21),   (21..22), b (22..23),   (23..24), <- (24..26),   (26..27), bs (27..29), ) (29..30),   (30..31), bar (31..34), EOF (34..34)))
Adhoc(List(BOF (0..0), for (0..3),   (3..4), ( (4..5), a (5..6),   (6..7), <- (7..9),   (9..10), as (10..12), ; (12..13),   (13..14), if (14..16),   (16..17), foo (17..20), ; (20..21),   (21..22), b (22..23),   (23..24), <- (24..26),   (26..27), bs (27..29), ) (29..30),   (30..31), bar (31..34), EOF (34..34)))
for (a <- as; if foo; b <- bs) bar
Term.For(List(Enumerator.Generator(Pat.Var.Term(Term.Name("a")), Term.Name("as")), Enumerator.Guard(Term.Name("foo")), Enumerator.Generator(Pat.Var.Term(Term.Name("b")), Term.Name("bs"))), Term.Name("bar"))
_root_.scala.meta.internal.ast.Term.For(_root_.scala.collection.immutable.Seq(_root_.scala.meta.internal.ast.Enumerator.Generator(_root_.scala.meta.internal.ast.Pat.Var.Term(_root_.scala.meta.internal.ast.Term.Name("a").copy(denot = _root_.scala.meta.internal.semantic.Denotation.Zero, typing = _root_.scala.meta.internal.semantic.Typing.Unknown, expansion = _root_.scala.meta.internal.semantic.Expansion.Identity)), _root_.scala.meta.internal.ast.Term.Name("as").copy(denot = _root_.scala.meta.internal.semantic.Denotation.Zero, typing = _root_.scala.meta.internal.semantic.Typing.Unknown, expansion = _root_.scala.meta.internal.semantic.Expansion.Identity)), _root_.scala.meta.internal.ast.Enumerator.Guard(_root_.scala.meta.internal.ast.Term.Name("foo").copy(denot = _root_.scala.meta.internal.semantic.Denotation.Zero, typing = _root_.scala.meta.internal.semantic.Typing.Unknown, expansion = _root_.scala.meta.internal.semantic.Expansion.Identity)), _root_.scala.meta.internal.ast.Enumerator.Generator(_root_.scala.meta.internal.ast.Pat.Var.Term(_root_.scala.meta.internal.ast.Term.Name("b").copy(denot = _root_.scala.meta.internal.semantic.Denotation.Zero, typing = _root_.scala.meta.internal.semantic.Typing.Unknown, expansion = _root_.scala.meta.internal.semantic.Expansion.Identity)), _root_.scala.meta.internal.ast.Term.Name("bs").copy(denot = _root_.scala.meta.internal.semantic.Denotation.Zero, typing = _root_.scala.meta.internal.semantic.Typing.Unknown, expansion = _root_.scala.meta.internal.semantic.Expansion.Identity))), _root_.scala.meta.internal.ast.Term.Name("bar").copy(denot = _root_.scala.meta.internal.semantic.Denotation.Zero, typing = _root_.scala.meta.internal.semantic.Typing.Unknown, expansion = _root_.scala.meta.internal.semantic.Expansion.Identity)).copy(typing = _root_.scala.meta.internal.semantic.Typing.Unknown, expansion = _root_.scala.meta.internal.semantic.Expansion.Identity)
Adhoc(List(BOF (0..0), for (0..3),   (3..4), ( (4..5), a (5..6),   (6..7), <- (7..9),   (9..10), as (10..12), ; (12..13),   (13..14), if (14..16),   (16..17), $cond (0..5), ; (0..1),   (1..2), .. (2..4), $enumz (0..6), ) (0..1),   (1..2), bar (2..5), EOF (5..5)))
Adhoc(List(BOF (0..0), for (0..3),   (3..4), ( (4..5), a (5..6),   (6..7), <- (7..9),   (9..10), as (10..12), ; (12..13),   (13..14), if (14..16),   (16..17), $cond (0..5), ; (0..1),   (1..2), .. (2..4), $enumz (0..6), ) (0..1),   (1..2), bar (2..5), EOF (5..5)))
for (a <- as; if $cond; ..$enumz) bar
Term.For(List(Enumerator.Generator(Pat.Var.Term(Term.Name("a")), Term.Name("as")), Enumerator.Guard(Term.Quasi(0, (quasiquote$macro$20$hole$0 @ _))), Enumerator.Generator.Quasi(1, Enumerator.Generator.Quasi(0, (quasiquote$macro$20$hole$1 @ _)))), Term.Name("bar"))
{
  final class $anon extends scala.AnyRef {
    def <init>() = {
      super.<init>();
      ()
    };
    def unapply(input: _root_.scala.meta.Tree) = input match {
      case _root_.scala.meta.internal.ast.Term.For($plus$colon(_root_.scala.meta.internal.ast.Enumerator.Generator(_root_.scala.meta.internal.ast.Pat.Var.Term(_root_.scala.meta.internal.ast.Term.Name("a")), _root_.scala.meta.internal.ast.Term.Name("as")), (quasiquote$macro$20$hole$1 @ _)), _root_.scala.meta.internal.ast.Term.Name("bar")) => scala.Tuple2(Unlift.unapply[_root_.scala.meta.Tree](quasiquote$macro$20$hole$0), Unlift.unapply[Seq[_root_.scala.meta.Tree]](quasiquote$macro$20$hole$1)) match {
        case scala.Tuple2(_root_.scala.Some((quasiquote$macro$20$result$0 @ _)), _root_.scala.Some((quasiquote$macro$20$result$1 @ _))) => _root_.scala.Some(scala.Tuple2(quasiquote$macro$20$result$0, quasiquote$macro$20$result$1))
        case _ => _root_.scala.None
      }
      case _ => _root_.scala.None
    }
  };
  new $anon()
}.unapply(<unapply-selector>)
<console>:16: error: not found: value quasiquote$macro$20$hole$0
       val q"for (a <- as; if $cond; ..$enumz) bar" = q"for (a <- as; if foo; b <- bs) bar"
                              ^

Interoperability with TASTY

One of the fundamental axioms of scala.meta is AST persistence. Since TASTY is the platform for AST persistence, we need to be able to interop with it, i.e. to load TASTY as scala.meta trees and to save scala.meta trees into TASTY.
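
A minimal sketch of the intended surface; toTasty and fromTasty below are hypothetical names with stubbed round-trips, not an existing scala.meta or TASTY API:

object TastyInteropSketch extends App {
  final case class Tree(syntax: String)

  // Stubs: the real implementation would write and read the binary TASTY format.
  def toTasty(tree: Tree): Array[Byte]    = tree.syntax.getBytes("UTF-8")
  def fromTasty(bytes: Array[Byte]): Tree = Tree(new String(bytes, "UTF-8"))

  println(fromTasty(toTasty(Tree("class C"))))
}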

Token inference

Tokens are our way of comprehensively capturing syntactic details of Scala source code. Each scala.meta tree has an associated collection of tokens that keep track of everything that comprises its corresponding source code, including whitespace, comments, etc.

Our parser automatically populates tokens for the trees that it emits (which means that trees coming from human-written source code are guaranteed to have tokens), but tokens for synthetic trees (e.g. ones created via tree transformations) are currently stubbed: https://github.com/scalameta/scalameta/blob/master/scalameta/src/main/scala/scala/meta/internal/ui/InferTokens.scala.

Bootstrap quasiquotes

The current implementation of quasiquotes (see #157 for details) is written as a scala.reflect macro in terms of the scala.reflect API. We need to bootstrap it to make it useful to Dotty and IntelliJ. Note that this doesn't mean we need to fully implement #160; it just means that the implementation needs to be rewritten to use only scala.meta APIs.

Possible right-associativity of operator should be handled

At the moment, right-associative operators are processed as left-associative ones.

scala> q"x*y".show[Structure]
res4: String = Term.ApplyInfix(Term.Name("x"), Term.Name("*"), Nil, List(Term.Name("y")))

scala> q"x::y".show[Structure]
res5: String = Term.ApplyInfix(Term.Name("x"), Term.Name("::"), Nil, List(Term.Name("y")))
