sbt-unidoc's Introduction

sbt-unidoc

sbt plugin to unify scaladoc/javadoc across multiple projects.

how to add this plugin

For sbt 1.x (requires sbt 1.5.x or above), add the following to your project/plugins.sbt:

addSbtPlugin("com.github.sbt" % "sbt-unidoc" % "0.5.0")

Note: We changed the organization from "com.eed3si9n" to "com.github.sbt".

For older sbt 1.x and sbt 0.13 add the following to your project/plugins.sbt:

addSbtPlugin("com.eed3si9n" % "sbt-unidoc" % "0.4.3")

how to unify scaladoc

  1. Enable ScalaUnidocPlugin in your root project's settings.

Note: If one of your subprojects defines def macros, add ScalaUnidoc / unidoc / scalacOptions += "-Ymacro-expand:none" to the root project's settings to temporarily halt macro expansion.
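
For example, a minimal sketch of that note applied to the root project used below:

val root = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .settings(
    name := "foo",
    // halt macro expansion only for the unidoc run
    ScalaUnidoc / unidoc / scalacOptions += "-Ymacro-expand:none"
  )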

Here's an example setup using a multi-project build.sbt:

ThisBuild / organization := "com.example"
ThisBuild / version := "0.1-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.15"
ThisBuild / autoAPIMappings := true

val library = (project in file("library"))
  .settings(
    name := "foo-library"
  )

val app = (project in file("app"))
  .dependsOn(library)
  .settings(
    name := "foo-app"
  )

val root = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .aggregate(library, app)
  .settings(
    name := "foo"
  )

From the root project, run the unidoc task:

foo> unidoc
...
[info] Generating Scala API documentation for main sources to /unidoc-sample/target/scala-2.10/unidoc...
[info] Scala API documentation generation successful.
[success] Total time: 10 s, completed May 16, 2013 12:57:10 AM

A Scala unidoc is created under crossTarget / "unidoc" containing entities from all projects under the build.
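
If you want the output somewhere other than crossTarget / "unidoc", the target of the task can be rescoped. A sketch (the "api" directory name is just an example):

ScalaUnidoc / unidoc / target := crossTarget.value / "api"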

how to exclude a project

  1. Construct ScalaUnidoc / unidoc / unidocProjectFilter in the root project's settings.
val root = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .aggregate(library, app)
  .settings(
    name := "foo",
    ScalaUnidoc / unidoc / unidocProjectFilter := inAnyProject -- inProjects(app)
  )

This will skip Scaladoc for the app project.
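
The filter is an ordinary sbt scope filter, so several projects can be excluded at once. A sketch, where tools is a hypothetical extra subproject:

ScalaUnidoc / unidoc / unidocProjectFilter := inAnyProject -- inProjects(app, tools)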

how to include multiple configurations

  1. Construct ScalaUnidoc / unidoc / unidocConfigurationFilter in the root project's settings.
val root = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .aggregate(library, app)
  .settings(
    name := "foo",
    TestScalaUnidoc / unidoc / unidocConfigurationFilter := inConfigurations(Compile, Test)
  )

Running test:unidoc will now create a unidoc that includes both the Compile and Test configurations.

how to publish Scala unidoc to GitHub Pages

Add sbt-site and sbt-ghpages to your project/site.sbt:

addSbtPlugin("com.typesafe.sbt" % "sbt-site" % "1.3.2")

addSbtPlugin("com.typesafe.sbt" % "sbt-ghpages" % "0.6.2")

Then, in build.sbt, import GitKeys:

import com.typesafe.sbt.SbtGit.GitKeys._

Add ScalaUnidoc / packageDoc / mappings to the site's mappings:

val root = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin, GhpagesPlugin)
  .aggregate(library, app)
  .settings(
    name := "foo",
    ScalaUnidoc / siteSubdirName := "latest/api",
    addMappingsToSiteDir(ScalaUnidoc / packageDoc / mappings, ScalaUnidoc / siteSubdirName),
    gitRemoteRepo := "git@github.com:user/foo.git"
  )

Here's how to preview and publish it:

foo> previewSite
foo> ghpagesPushSite

how to unify javadoc

  1. Enable GenJavadocPlugin in child projects.
  2. Enable JavaUnidocPlugin in the root project.
ThisBuild / organization := "com.example"
ThisBuild / version := "0.1-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.15"
ThisBuild / autoAPIMappings := true

val library = (project in file("library"))
  .enablePlugins(GenJavadocPlugin)
  .settings(
    name := "foo-library"
  )

val app = (project in file("app"))
  .enablePlugins(GenJavadocPlugin)
  .dependsOn(library)
  .settings(
    name := "foo-app"
  )

val root = (project in file("."))
  .enablePlugins(JavaUnidocPlugin)
  .aggregate(library, app)
  .settings(
    name := "foo"
  )

GenJavadocPlugin adds a compiler plugin called genjavadoc, which generates Java source code into target/java from Scala source code so that javadoc can be run on it. The main benefits of javadoc are natural documentation for the Java API, IDE support, and Java enum support. However, genjavadoc does not always generate compilable Java code; if you see misbehavior, please open an issue with genjavadoc.
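
If you need to pass extra options to the javadoc run, scoping javacOptions to the unidoc task is a reasonable starting point. A sketch (the exact scoping is an assumption, and the window title is just an example value):

JavaUnidoc / unidoc / javacOptions ++= Seq("-windowtitle", "foo API")  // assumed scope; adjust to your plugin version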

First clean all projects (in the example above, root aggregates both children), then run the unidoc task from the root project:

foo> clean
[success] Total time: 0 s, completed May 16, 2013 1:13:55 AM
foo> unidoc
[info] Compiling 10 Scala sources ...
[info] Generating Java API documentation for main sources to /unidoc-sample/target/javaunidoc...
[warn] Loading source file /unidoc-sample/app/target/java/foo/App$.java...
....
[info] Java API documentation generation successful.
[success] Total time: 1 s, completed May 16, 2013 1:14:12 AM

Cleaning is necessary since genjavadoc does not track the correspondence between Scala sources and emitted Java files, leading to extraneous Java sources when deleting classes. A Java unidoc is created under target/javaunidoc containing entities from all projects under the build.

how to publish genjavadoc instead of scaladoc

  1. Enable PublishJavadocPlugin in child projects.

This will substitute the Compile / packageDoc with Genjavadoc / packageDoc to use the enhanced Javadoc.
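
A minimal sketch of what that looks like in a child project (the same library project as above):

val library = (project in file("library"))
  .enablePlugins(PublishJavadocPlugin)
  .settings(
    name := "foo-library"
  )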

how to unify both Scaladoc and Javadoc

  1. Enable GenJavadocPlugin (or PublishJavadocPlugin) in child projects.
  2. Enable ScalaUnidocPlugin and JavaUnidocPlugin in the root project.

This combines both Scala unidoc settings and Java unidoc settings. Run unidoc from the root project to execute both.
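
For example, a sketch of the root project when combining the two (the child projects keep GenJavadocPlugin enabled as above):

val root = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin, JavaUnidocPlugin)
  .aggregate(library, app)
  .settings(
    name := "foo"
  )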

credits

The original implementation of the unidoc task was written by Peter Vlugter (@pvlugter) for the Akka project. I took Unidoc.scala from akka/akka@ed40dff7d7353c5cb15b7408ec049116081cb1fc and refactored it. sbt-unidoc is open source and available under the Apache 2 License.

sbt-unidoc's People

Contributors

2m, artemkorsakov, daenyth, dependabot[bot], eed3si9n, emanresusername, halcat0x15a, ktoso, liff, magnolia-k, pikinier20, rkuhn, scala-steward, seratch, sethtisue, sullis, travisbrown, xirc, xuwei-k


sbt-unidoc's Issues

does not work with Scala 3.0.0-M2, sbt-dotty 0.4.6

--- /dev/null
+++ b/a1/A.scala
@@ -0,0 +1 @@
+class A
diff --git a/a2/B.scala b/a2/B.scala
new file mode 100644
index 0000000..179f0d2
--- /dev/null
+++ b/a2/B.scala
@@ -0,0 +1 @@
+class B
diff --git a/build.sbt b/build.sbt
new file mode 100644
index 0000000..57221b1
--- /dev/null
+++ b/build.sbt
@@ -0,0 +1,16 @@
+val commonSettings = Def.settings(
+  organization := "com.example",
+  version := "0.1.0-SNAPSHOT",
+  scalaVersion := "3.0.0-M2",
+)
+
+val a1 = project.settings(
+  commonSettings,
+)
+
+val a2 = project.settings(
+  commonSettings,
+)
+
+commonSettings
+enablePlugins(ScalaUnidocPlugin)
diff --git a/project/build.properties b/project/build.properties
new file mode 100644
index 0000000..7de0a93
--- /dev/null
+++ b/project/build.properties
@@ -0,0 +1 @@
+sbt.version=1.4.4
diff --git a/project/plugins.sbt b/project/plugins.sbt
new file mode 100644
index 0000000..6e5893d
--- /dev/null
+++ b/project/plugins.sbt
@@ -0,0 +1,2 @@
+addSbtPlugin("com.eed3si9n" % "sbt-unidoc" % "0.4.3")
+addSbtPlugin("ch.epfl.lamp" % "sbt-dotty" % "0.4.6")

run sbt unidoc

[info] Main Scala API documentation to /home/runner/work/unidoc-scala-3/unidoc-scala-3/target/scala-3.0.0-M2/unidoc...
[error] java.lang.RuntimeException: java.lang.ClassNotFoundException: dotty.tools.dottydoc.Main
[error] 	at xsbt.DottydocRunner.run(DottydocRunner.java:85)
[error] 	at xsbt.ScaladocInterface.run(ScaladocInterface.java:11)
[error] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] 	at java.lang.reflect.Method.invoke(Method.java:498)
[error] 	at sbt.internal.inc.AnalyzingCompiler.invoke(AnalyzingCompiler.scala:330)
[error] 	at sbt.internal.inc.AnalyzingCompiler.doc(AnalyzingCompiler.scala:176)
[error] 	at sbt.internal.inc.AnalyzingCompiler.doc(AnalyzingCompiler.scala:134)
[error] 	at sbt.Doc$.$anonfun$scaladoc$1(Doc.scala:52)
[error] 	at sbt.Doc$.$anonfun$scaladoc$1$adapted(Doc.scala:40)
[error] 	at sbt.RawCompileLike$.$anonfun$prepare$1(RawCompileLike.scala:79)
[error] 	at sbt.RawCompileLike$.$anonfun$prepare$1$adapted(RawCompileLike.scala:72)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$4(RawCompileLike.scala:63)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$4$adapted(RawCompileLike.scala:61)
[error] 	at sbt.util.Tracked$.$anonfun$inputChangedW$1(Tracked.scala:219)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$1(RawCompileLike.scala:68)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$1$adapted(RawCompileLike.scala:52)
[error] 	at sbtunidoc.Unidoc$.apply(Unidoc.scala:30)
[error] 	at sbtunidoc.BaseUnidocPlugin$.$anonfun$baseUnidocSettings$1(BaseUnidocPlugin.scala:25)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] 	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] 	at sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error] 	at sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] 	at sbt.Execute.work(Execute.scala:291)
[error] 	at sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] 	at sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] 	at java.lang.Thread.run(Thread.java:748)
[error] Caused by: java.lang.ClassNotFoundException: dotty.tools.dottydoc.Main
[error] 	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
[error] 	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
[error] 	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
[error] 	at java.lang.Class.forName0(Native Method)
[error] 	at java.lang.Class.forName(Class.java:264)
[error] 	at xsbt.DottydocRunner.run(DottydocRunner.java:79)
[error] 	at xsbt.ScaladocInterface.run(ScaladocInterface.java:11)
[error] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] 	at java.lang.reflect.Method.invoke(Method.java:498)
[error] 	at sbt.internal.inc.AnalyzingCompiler.invoke(AnalyzingCompiler.scala:330)
[error] 	at sbt.internal.inc.AnalyzingCompiler.doc(AnalyzingCompiler.scala:176)
[error] 	at sbt.internal.inc.AnalyzingCompiler.doc(AnalyzingCompiler.scala:134)
[error] 	at sbt.Doc$.$anonfun$scaladoc$1(Doc.scala:52)
[error] 	at sbt.Doc$.$anonfun$scaladoc$1$adapted(Doc.scala:40)
[error] 	at sbt.RawCompileLike$.$anonfun$prepare$1(RawCompileLike.scala:79)
[error] 	at sbt.RawCompileLike$.$anonfun$prepare$1$adapted(RawCompileLike.scala:72)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$4(RawCompileLike.scala:63)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$4$adapted(RawCompileLike.scala:61)
[error] 	at sbt.util.Tracked$.$anonfun$inputChangedW$1(Tracked.scala:219)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$1(RawCompileLike.scala:68)
[error] 	at sbt.RawCompileLike$.$anonfun$cached$1$adapted(RawCompileLike.scala:52)
[error] 	at sbtunidoc.Unidoc$.apply(Unidoc.scala:30)
[error] 	at sbtunidoc.BaseUnidocPlugin$.$anonfun$baseUnidocSettings$1(BaseUnidocPlugin.scala:25)
[error] 	at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] 	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] 	at sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error] 	at sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error] 	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error] 	at sbt.Execute.work(Execute.scala:291)
[error] 	at sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error] 	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error] 	at sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] 	at java.lang.Thread.run(Thread.java:748)
[error] (Scalaunidoc / doc) java.lang.ClassNotFoundException: dotty.tools.dottydoc.Main

Weird Compilation Error while generating docs for class with scala-logging

Hello,

I'm encountering this strange behaviour while executing sbt unidoc in a module of my project which has simple ADTs and only "com.typesafe.scala-logging" %% "scala-logging" % "3.7.2" as a dependency.

Here is an example

sealed trait Statement {
  def db: String
}

case class SelectStatement(override val db: String)
    extends SQLStatement
    with LazyLogging {

  private def filterToExpression(dimension: String, operator: String, value:String): Option[Expression] = {
    operator match {
      case ">"         => Some(ComparisonExpression(dimension, GreaterThanOperator, value))
      case ">="        => Some(ComparisonExpression(dimension, GreaterOrEqualToOperator, value))
      case "="         => Some(EqualityExpression(dimension, value))
      case "<="        => Some(ComparisonExpression(dimension, LessOrEqualToOperator, value))
      case "<"         => Some(ComparisonExpression(dimension, LessThanOperator, value))
      case op @ _      =>
        logger.warn("Ignored filter with invalid operator: {}", op)
        None
    }
  }

scala version is 2.12.4

The exception I get is:

[error] 
[error]   last tree to typer: Ident(op)
[error]               symbol: value op (flags: <triedcooking>)
[error]    symbol definition: val op: String
[error]                  tpe: String
[error]        symbol owners: value op -> method filterToExpression -> class SelectSQLStatement -> package statement
[error]       context owners: method filterToExpression -> class SelectSQLStatement -> package statement
[error] 
[error] == Enclosing template or block ==
[error] 
[error] Block(
[error]   Apply(
[error]     "logger"."warn"
[error]     // 2 arguments
[error]     "Ignored filter with invalid operator: {}"
[error]     "op" // val op: String, tree.tpe=String
[error]   )
[error]   "None"
[error] )
[error] 
[error] == Expanded type of tree ==
[error] 
[error] TypeRef(
[error]   TypeSymbol(
[error]     final class String extends Serializable with Comparable[String] with CharSequence
[error]     
[error]   )
[error] )
[error] 
[error] uncaught exception during compilation: scala.MatchError

Honestly, I have no clue what could cause such a low-level exception.
Any hint or suggestion is much appreciated.

Thanks for the support,
Saverio

warning: Supported source version less than target

Hello,

I'm not sure if this error belongs in this project or the genjavadoc project. I am getting an error that seems to state that annotations are processed with an assumed input source version of Java 6 (just a guess). I tried using a newer version of the genjavadoc plugin (0.9) as well, since the diff between 0.8 and 0.9 mentioned some Java 8 improvements, but I got the same error. I looked at both projects but I don't see any reference to the sisu project that this warning seems to come from. Anyway, I'm looking for some help to track down, resolve, or work around the issue.

[warn] warning: Supported source version 'RELEASE_6' from annotation processor 'org.eclipse.sisu.space.SisuIndexAPT6' less than -source '1.8'

Thanks!
John Murray

Unidoc may fail if dependencies of submodules don't agree

See akka/akka#29067

That's because unidoc just concatenates the classpaths from all the aggregated submodules. That might mean that two versions of the same artifact end up on the classpath, and which one is chosen depends on ordering. It doesn't do conflict resolution like update does.

In the best case, it would instead try to build an aggregate module and run a full update on that (might be hard to achieve).

How to use it in maven?

My project is a mix of Scala and Java, built with Maven. I need to generate some JSON data for my REST API description with the help of the Javadoc production process, but I don't know how to use this plugin with Maven.

Does unidoc support '<' and '>'?

When building docs from the Scala docs in the Spark repository, I found that I couldn't use > or <. Is there a way to escape these characters if they are supported?

> build/sbt unidoc
...
[error] ./spark/sql/core/target/java/org/apache/spark/sql/functions.java:2834: error: malformed HTML
[error]    *    limit < 0: <code>regex</code> will be applied as many times as possible, and the resulting
[error]               ^

Turn into an autoplugin

I've actually started working on that and created the issue to start a discussion. While adapting the plugin, I realised that unidoc actually provides two different sets of settings:

  • scalaUnidocSettings
  • scalaJavaUnidocSettings

My first instinct, but I'd like to confirm that it's alright, is to provide two auto plugins, one per set of settings:

  • UnidocPlugin
  • UnidocJavaPlugin

Is that the right course?

Duplicate symbols when referring to a project that uses cross-build JVM / JS

https://github.com/Sciss/UnidocCrossProblem

val audioFileURI = uri("https://github.com/Sciss/AudioFile.git#v2.3.1")
val lAudioFile   = ProjectRef(audioFileURI, "rootJVM")

val root = project.in(file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .aggregate(lAudioFile)

When running sbt ++2.13.4! unidoc, I basically just get errors as if every symbol was already defined:

[info] Main Scala API documentation to /data/temp/UnidocCrossProblem/target/scala-2.13/unidoc...
[error] /home/hhrutz/.sbt/1.0/staging/a9131e4c8070a18aff25/audiofile/jvm/src/main/scala/de/sciss/audiofile/AudioFilePlatform.scala:23:7: AudioFilePlatform is already defined as trait AudioFilePlatform
[error] trait AudioFilePlatform {
[error]       ^
[error] /home/hhrutz/.sbt/1.0/staging/a9131e4c8070a18aff25/audiofile/jvm/src/main/scala/de/sciss/audiofile/AudioFileTypePlatform.scala:21:7: AudioFileTypePlatform is already defined as trait AudioFileTypePlatform
[error] trait AudioFileTypePlatform {
[error]       ^
[error] /home/hhrutz/.sbt/1.0/staging/a9131e4c8070a18aff25/audiofile/jvm/src/main/scala/de/sciss/audiofile/ReaderFactoryPlatform.scala:19:7: ReaderFactoryPlatform is already defined as trait ReaderFactoryPlatform
[error] trait ReaderFactoryPlatform {
[error]       ^
[error] /home/hhrutz/.sbt/1.0/staging/a9131e4c8070a18aff25/audiofile/jvm/target/scala-2.13/src_managed/main/sbt-buildinfo/BuildInfo.scala:7:13: BuildInfo is already defined as case class BuildInfo
[error] case object BuildInfo {
[error]             ^
[error] /home/hhrutz/.sbt/1.0/staging/a9131e4c8070a18aff25/audiofile/shared/src/main/scala/de/sciss/audiofile/AsyncAudioFile.scala:21:7: AsyncAudioFile is already defined as trait AsyncAudioFile
[error] trait AsyncAudioFile extends AudioFileBase with AsyncChannel {
[error]       ^
[error] /home/hhrutz/.sbt/1.0/staging/a9131e4c8070a18aff25/audiofile/shared/src/main/scala/de/sciss/audiofile/AsyncBufferHandler.scala:26:26: AsyncBufferHandler is already defined as trait AsyncBufferHandler
[error] private[audiofile] trait AsyncBufferHandler extends BufferHandler {
[error]                          ^
[error] /home/hhrutz/.sbt/1.0/staging/a9131e4c8070a18aff25/audiofile/shared/src/main/scala/de/sciss/audiofile/AsyncBufferHandler.scala:30:26: AsyncBufferReader is already defined as trait AsyncBufferReader
[error] private[audiofile] trait AsyncBufferReader extends AsyncBufferHandler {
[error]                          ^

etc.

I don't know where this comes from. Why does scaladoc/unidoc say all these symbols were already defined? Is there a workaround?

Usage with android-sdk-plugin

Hi,

sbt-unidoc fails if the sub-projects use android-sdk-plugin:

> unidoc
[info] Main Scala API documentation to /home/stanch/projects/macroid/target/scala-2.10/unidoc...
[error] /home/stanch/projects/macroid/macroid-akka/src/main/scala/macroid/akkafragments/Akka.scala:6:object app is not a member of package android
[error] import android.app.Activity
[error]                ^

My guess is that android-sdk-plugin uses some trickery to inject android.jar into the classpath, but I’m not sure how to integrate sbt-unidoc with it.

Reproduction: https://github.com/macroid/macroid/tree/ft-unidoc
Compilation will require this minimal setup: https://github.com/macroid/macroid/blob/ft-unidoc/.travis.yml

<object> is already defined as trait ...

Unidoc is giving me errors when a trait or class has a companion object.

For example:
RunId is already defined as object RunId
RunId is already defined as case class RunId

The project containing the class is a Scala/Scala.js shared "crossProject" in a multi-project build (Scala 2.11.7).
When I remove the shared project, the unidoc problems go away.
This seems to also be a problem with the sbt gh-pages plugin.

Assertion failure compiling macros with Scala 2.12

Hi!

I recently switched to using Scala 2.12, and have since found that Unidoc fails when processing sources containing macro definitions. Specifically, an assertion error results.

The -Ymacro-no-expand option has been deprecated in favor of -Ymacro-expand:none as of Scala 2.12, so I'm using that, but otherwise there aren't too many changes I can recall from the sources I used with Scala 2.11.

Unidoc worked fine with the same code under Scala 2.11. Am I missing something?

Here's what I'm seeing:

java.lang.AssertionError: assertion failed:

     while compiling: /home/some-user/src/some-project/some-module/macro/src/main/scala/some-url/util/package.scala
        during phase: globalPhase=terminal, enteringPhase=typer
     library version: version 2.12.1
    compiler version: version 2.12.1
  reconstructed args: -Ymacro-expand:none -diagrams -doc-footer Copyright © Some copyright message. -deprecation -groups -doc-title Some API Documentation -Xfatal-warnings -doc-version 0.0.4-SNAPSHOT -no-prefixes -classpath /some/path/macro/target/scala-2.12/classes:/home/some-user/.ivy2/cache/org.scoverage/scalac-scoverage-runtime_2.12/jars/scalac-scoverage-runtime_2.12-1.3.0.jar:/home/some-user/.ivy2/cache/org.scala-lang/scala-reflect/jars/scala-reflect-2.12.1.jar -bootclasspath /usr/lib/jvm/java-8-oracle/jre/lib/resources.jar:/usr/lib/jvm/java-8-oracle/jre/lib/rt.jar:/usr/lib/jvm/java-8-oracle/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jsse.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jce.jar:/usr/lib/jvm/java-8-oracle/jre/lib/charsets.jar:/usr/lib/jvm/java-8-oracle/jre/lib/jfr.jar:/usr/lib/jvm/java-8-oracle/jre/classes:/home/some-user/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.1.jar -d /home/some-user/src/some-project/target/scala-2.12/api -implicits

  last tree to typer: Ident(<argument>)
       tree position: <unknown>
            tree tpe: some-url.util.Resource
              symbol: <none>
   symbol definition: <none> (a NoSymbol)
      symbol package: <none>
       symbol owners: 
           call site: package <root> in <none>

<Cannot read source file>
    at scala.tools.nsc.doc.model.ModelFactory.findTemplateMaybe(ModelFactory.scala:811)
    at scala.tools.nsc.doc.model.MemberLookup.internalLink(MemberLookup.scala:15)
    at scala.tools.nsc.doc.model.MemberLookup.internalLink$(MemberLookup.scala:14)
    at scala.tools.nsc.doc.DocFactory$$anon$1.internalLink(DocFactory.scala:68)
    at scala.tools.nsc.doc.base.MemberLookupBase.$anonfun$memberLookup$6(MemberLookupBase.scala:59)
    at scala.tools.nsc.doc.base.MemberLookupBase.memberLookup(MemberLookupBase.scala:59)
    at scala.tools.nsc.doc.base.MemberLookupBase.memberLookup$(MemberLookupBase.scala:48)
    at scala.tools.nsc.doc.DocFactory$$anon$1.memberLookup(DocFactory.scala:68)
    at scala.tools.nsc.doc.base.CommentFactoryBase.$anonfun$parseAtSymbol$16(CommentFactoryBase.scala:371)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:234)
    at scala.collection.immutable.Map$Map2.foreach(Map.scala:146)
    at scala.collection.TraversableLike.map(TraversableLike.scala:234)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:227)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at scala.tools.nsc.doc.base.CommentFactoryBase.linkedExceptions$1(CommentFactoryBase.scala:370)
    at scala.tools.nsc.doc.base.CommentFactoryBase.parse0$1(CommentFactoryBase.scala:388)
    at scala.tools.nsc.doc.base.CommentFactoryBase.parseAtSymbol(CommentFactoryBase.scala:416)
    at scala.tools.nsc.doc.base.CommentFactoryBase.parseAtSymbol$(CommentFactoryBase.scala:200)
    at scala.tools.nsc.doc.DocFactory$$anon$1.parseAtSymbol(DocFactory.scala:68)
    at scala.tools.nsc.doc.model.CommentFactory.parse(CommentFactory.scala:85)
    at scala.tools.nsc.doc.model.CommentFactory.parse$(CommentFactory.scala:83)
    at scala.tools.nsc.doc.DocFactory$$anon$1.parse(DocFactory.scala:68)
    at scala.tools.nsc.doc.model.CommentFactory.defineComment(CommentFactory.scala:75)
    at scala.tools.nsc.doc.model.CommentFactory.defineComment$(CommentFactory.scala:38)
    at scala.tools.nsc.doc.DocFactory$$anon$1.defineComment(DocFactory.scala:68)
    at scala.tools.nsc.doc.model.CommentFactory.$anonfun$comment$1(CommentFactory.scala:31)
    at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:79)
    at scala.tools.nsc.doc.model.CommentFactory.comment(CommentFactory.scala:31)
    at scala.tools.nsc.doc.model.CommentFactory.comment$(CommentFactory.scala:29)
    at scala.tools.nsc.doc.DocFactory$$anon$1.comment(DocFactory.scala:68)
    at scala.tools.nsc.doc.model.ModelFactoryImplicitSupport.makeImplicitConversions(ModelFactoryImplicitSupport.scala:102)
    at scala.tools.nsc.doc.model.ModelFactoryImplicitSupport.makeImplicitConversions$(ModelFactoryImplicitSupport.scala:85)
    at scala.tools.nsc.doc.DocFactory$$anon$1.makeImplicitConversions(DocFactory.scala:68)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.<init>(ModelFactory.scala:346)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$$anon$7.<init>(ModelFactory.scala:642)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$.createDocTemplate$1(ModelFactory.scala:642)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$.createTemplate(ModelFactory.scala:689)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember0$1(ModelFactory.scala:783)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember(ModelFactory.scala:796)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.$anonfun$ownMembers$1(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.<init>(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$PackageImpl.<init>(ModelFactory.scala:465)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$$anon$4.<init>(ModelFactory.scala:675)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$.createTemplate(ModelFactory.scala:675)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember0$1(ModelFactory.scala:783)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember(ModelFactory.scala:796)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.$anonfun$ownMembers$1(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.<init>(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$PackageImpl.<init>(ModelFactory.scala:465)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$$anon$4.<init>(ModelFactory.scala:675)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$.createTemplate(ModelFactory.scala:675)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember0$1(ModelFactory.scala:783)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember(ModelFactory.scala:796)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.$anonfun$ownMembers$1(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.<init>(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$PackageImpl.<init>(ModelFactory.scala:465)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$$anon$4.<init>(ModelFactory.scala:675)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$.createTemplate(ModelFactory.scala:675)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember0$1(ModelFactory.scala:783)
    at scala.tools.nsc.doc.model.ModelFactory.makeMember(ModelFactory.scala:796)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.$anonfun$ownMembers$1(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$DocTemplateImpl.<init>(ModelFactory.scala:356)
    at scala.tools.nsc.doc.model.ModelFactory$PackageImpl.<init>(ModelFactory.scala:465)
    at scala.tools.nsc.doc.model.ModelFactory$RootPackageImpl.<init>(ModelFactory.scala:473)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$$anon$1.<init>(ModelFactory.scala:657)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$.createTemplate(ModelFactory.scala:657)
    at scala.tools.nsc.doc.model.ModelFactory$modelCreation$.createRootPackage(ModelFactory.scala:605)
    at scala.tools.nsc.doc.model.ModelFactory$$anon$21.<init>(ModelFactory.scala:50)
    at scala.tools.nsc.doc.model.ModelFactory.makeModel(ModelFactory.scala:47)
    at scala.tools.nsc.doc.DocFactory.makeUniverse(DocFactory.scala:81)
    at scala.tools.nsc.doc.DocFactory.generate$1(DocFactory.scala:124)
    at scala.tools.nsc.doc.DocFactory.document(DocFactory.scala:131)
    at xsbt.Runner.run(ScaladocInterface.scala:26)
    at xsbt.ScaladocInterface.run(ScaladocInterface.scala:10)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:107)
    at sbt.compiler.AnalyzingCompiler.doc(AnalyzingCompiler.scala:73)
    at sbt.compiler.AnalyzingCompiler.doc(AnalyzingCompiler.scala:68)
    at sbt.Doc$$anonfun$scaladoc$1.apply(Doc.scala:23)
    at sbt.Doc$$anonfun$scaladoc$1.apply(Doc.scala:23)
    at sbt.RawCompileLike$$anonfun$prepare$1.apply(RawCompileLike.scala:64)
    at sbt.RawCompileLike$$anonfun$prepare$1.apply(RawCompileLike.scala:56)
    at sbt.RawCompileLike$$anonfun$cached$1$$anonfun$2$$anonfun$apply$1.apply(RawCompileLike.scala:49)
    at sbt.RawCompileLike$$anonfun$cached$1$$anonfun$2$$anonfun$apply$1.apply(RawCompileLike.scala:47)
    at sbt.Tracked$$anonfun$outputChanged$1.apply(Tracked.scala:84)
    at sbt.Tracked$$anonfun$outputChanged$1.apply(Tracked.scala:79)
    at sbt.RawCompileLike$$anonfun$cached$1.apply(RawCompileLike.scala:54)
    at sbt.RawCompileLike$$anonfun$cached$1.apply(RawCompileLike.scala:39)
    at sbtunidoc.Plugin$Unidoc$.apply(Plugin.scala:134)
    at sbtunidoc.Plugin$$anonfun$baseCommonUnidocTasks$2.apply(Plugin.scala:27)
    at sbtunidoc.Plugin$$anonfun$baseCommonUnidocTasks$2.apply(Plugin.scala:27)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:237)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

This is then followed by a repeat of the original error message heading, without the stack trace. I'm not sure what line of the file is causing the failure.

Any ideas?

UPDATE: Incidentally, if I generate documentation using "doc" instead, I don't get any errors.

No API doc is generated for a multi-module project and Scala 3

I first tried on a large project, then on the simplest project I could think of, but either:

  • nothing is generated
  • or I see the scaladoc help message
sbt:specs2> unidoc
[info] Main Scala API documentation to /Users/etorreborre/projects/specs2/test-unidoc/target/scala-3.0.0/unidoc...
[info] Usage: scaladoc <options> <source files>
[info] where possible standard options include:
[info] -P                 Pass an option to a plugin, e.g. -P:<plugin>:<opt>
[info] -bootclasspath     Override location of bootstrap class files.
[info] -classpath         Specify where to find user class files.

This is using sbt-1.5.4 and sbt-unidoc-0.4.3.

support for scalajs crossprojects

Hi,
sbt-unidoc does not work well with Scala.js cross-projects. The issue seems to be the shared part, which both subprojects have on the classpath; when unidoc puts all projects in one, the shared part is available twice and thus generates a conflict.
Are there any plans to support Scala.js cross-projects?

How to publish only the unidoc (aggregating) project

For example, my build file

val baseName = "ScalaCollider"
val baseNameL = baseName.toLowerCase

val PROJECT_VERSION           = "1.27.0"
val scalaColliderSwingVersion = "1.39.0"
val ugensVersion              = "1.19.0"
val audioFileVersion          = "1.5.0"
val oscVersion                = "1.1.6"

val lOSC                = RootProject(uri(s"git://github.com/Sciss/ScalaOSC.git#v$oscVersion"))
val lAudioFile          = RootProject(uri(s"git://github.com/Sciss/AudioFile.git#v$audioFileVersion"))
val lUGens              = RootProject(uri(s"git://github.com/Sciss/ScalaColliderUGens.git#v$ugensVersion"))
val lScalaCollider      = RootProject(uri(s"git://github.com/Sciss/$baseName.git#v${PROJECT_VERSION}"))
val lScalaColliderSwing = RootProject(uri(s"git://github.com/Sciss/ScalaColliderSwing.git#v$scalaColliderSwingVersion"))

scalaVersion in ThisBuild := "2.12.8"

val root = project.withId(s"$baseNameL-unidoc").in(file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .settings(
    name                 := s"$baseName-unidoc",
    version              := PROJECT_VERSION,
    organization         := "de.sciss",
    mappings in packageDoc in Compile := (mappings  in (ScalaUnidoc, packageDoc)).value,
    scalacOptions in (Compile, doc) ++= Seq(
      "-skip-packages", Seq(
        "de.sciss.osc.impl", 
        "de.sciss.synth.impl",
        "snippets"
      ).mkString(":"),
      "-doc-title", s"${baseName} ${PROJECT_VERSION} API"
    ),
    publishArtifact in (Compile, packageBin) := false, // there are no binaries
    publishArtifact in (Compile, packageSrc) := false  // there are no sources
  )
  .aggregate(/* lOSC, lAudioFile, */ lUGens, lScalaCollider, lScalaColliderSwing)

What I want to do is publish the javadoc.jar of the unified docs. But if I run sbt scalacollider-unidoc/publishLocal, sbt attempts to also publish all the dependencies, e.g.

[info] Wrote /home/hhrutz/.sbt/1.0/staging/7a469ac4827c285b0be5/scalacolliderugens/spec/target/scalacolliderugens-spec-1.19.0.pom
[info] Wrote /home/hhrutz/.sbt/1.0/staging/7a469ac4827c285b0be5/scalacolliderugens/api/target/scala-2.12/scalacolliderugens-api_2.12-1.19.0.pom
[info] Done updating.
[info] Wrote /home/hhrutz/.sbt/1.0/staging/46c805c0efc1cd242273/scalacolliderswing/core/target/scala-2.12/scalacolliderswing-core_2.12-1.39.0.pom
[info] :: delivering :: de.sciss#scalacolliderugens_2.12;1.19.0 :: 1.19.0 :: release :: Mon Jan 14 01:08:38 CET 2019
[info] 	delivering ivy file to /home/hhrutz/.sbt/1.0/staging/7a469ac4827c285b0be5/scalacolliderugens/target/scala-2.12/ivy-1.19.0.xml
[info] Wrote /data/temp/unidoc-test/target/scala-2.12/scalacollider-unidoc_2.12-1.27.0.pom
[warn] Attempting to overwrite /home/hhrutz/.ivy2/local/de.sciss/scalacolliderugens_2.12/1.19.0/ivys/ivy.xml (non-SNAPSHOT)
[warn] 	You need to remove it from the cache manually to take effect.
[warn] Attempting to overwrite /home/hhrutz/.ivy2/local/de.sciss/scalacolliderugens_2.12/1.19.0/ivys/ivy.xml.sha1 (non-SNAPSHOT)
[warn] 	You need to remove it from the cache manually to take effect.
[warn] Attempting to overwrite /home/hhrutz/.ivy2/local/de.sciss/scalacolliderugens_2.12/1.19.0/ivys/ivy.xml.md5 (non-SNAPSHOT)
[warn] 	You need to remove it from the cache manually to take effect.
...

etc.

Is there a way to only publish the one artifact ~/.ivy2/local/de.sciss/scalacollider-unidoc_2.12/1.27.0/docs/scalacollider-unidoc_2.12-javadoc.jar (eventually publishSigned)?

problems on 2.10

Great project, thank you for making it! Consider adding a warning that you can run into problems with Scala 2.10 - I work on a cross-compiled library that kept getting weird errors when I tried to run unidoc without specifying that I wanted it to run on 2.11.5.
Let me know if you would like more details about the errors I was seeing.

NullPointerException issue

After upgrading to Scala 2.12.3, my project compiles & runs fine. But unidoc throws

[debug] 	/websocket/protocol/Protocol.scala
java.lang.NullPointerException
	at scala.tools.nsc.typechecker.Typers$Typer.callToCompanionConstr(Typers.scala:3277)
	at scala.tools.nsc.typechecker.Typers$Typer.tryNamesDefaults$1(Typers.scala:3541)
	at scala.tools.nsc.typechecker.Typers$Typer.doTypedApply(Typers.scala:3564)

This looks like some Scala bug. I'm curious: what does unidoc do differently than compile? Does the stack trace mean the issue is happening in the Protocol.scala file?

The environment is sbt 0.13.15, Scala 2.12.3.

unidoc fails to link members that doc has no problem with

A brief example: I have a file in which one of my methods contains this link to a 3rd party library's class.

import com.datastax.driver.core.exceptions._

/**
 * [[com.datastax.driver.core.exceptions.AlreadyExistsException AlreadyExistsException]]
 */
...

This is no problem for doc, but unidoc tells me: "Could not find any member to link for "com.datastax.driver.core.exceptions.AlreadyExistsException"."

Does it have anything to do with it being a third party library? It's Datastax's Cassandra driver.

Links between scala and java unidocs

When generating both Scala and Java API documentation, it would be nice to be able to easily switch between the two.

After a while you get used to manipulating the URL directly when Google sends you to the wrong variant, but that's not too neat (and does not always work reliably) :).

Upgrade to genjavadoc 0.9

When I attempted to use Scala version 2.11.7 I got an error because genjavadoc could not be found for that version. It seems they did not publish a 0.8 for 2.11.7, only a 0.9. It would be awesome if, by default, this project used 0.9.

scaladoc output is not uniquely determined

I'm not sure if this is a unidoc or a scaladoc issue, but if I call sbt unidoc on https://github.com/KWARC/MMT/tree/stable/src the generated HTML files always differ literally (see UniFormal/MMT#2).

I've got no idea what is causing these differences, but it is annoying for our committed documentation and for reproducing (identical) documentation from the same sources.

If it is a scaladoc problem (ant has the same problem), please forward the issue. But maybe it is just the order in which the .scala files are processed (which changes for some reason, depending on processing time or resources).

Exclude a package from all projects

We want to exclude a particular package named "internal" from scaladoc and javadoc for all projects. We have a UnidocSite plugin as follows:

object UnidocSite extends AutoPlugin {
  import sbtunidoc.{BaseUnidocPlugin, JavaUnidocPlugin, ScalaUnidocPlugin}
  import JavaUnidocPlugin.autoImport._
  import ScalaUnidocPlugin.autoImport._
  import BaseUnidocPlugin.autoImport.unidoc

  import com.typesafe.sbt.site.SitePlugin.autoImport._

  override def requires: Plugins = ScalaUnidocPlugin && JavaUnidocPlugin

  def excludeJavadoc: Set[String] = Set("internal", "scaladsl", "csw_protobuf")
  
  def excludeScaladoc: String     = Seq("internal", "csw_protobuf", "akka").mkString(":")

  override def projectSettings: Seq[Setting[_]] = Seq(
    siteSubdirName in ScalaUnidoc := "/api/scala",
    addMappingsToSiteDir(mappings in (ScalaUnidoc, packageDoc), siteSubdirName in ScalaUnidoc),
    siteSubdirName in JavaUnidoc := "/api/java",
    filterNotSources(sources in (JavaUnidoc, unidoc), excludeJavadoc),
    addMappingsToSiteDir(mappings in (JavaUnidoc, packageDoc), siteSubdirName in JavaUnidoc),
    
    scalacOptions in (ScalaUnidoc, unidoc) ++= Seq("-skip-packages", excludeScaladoc),
    
    autoAPIMappings := true
  )

  def filterNotSources(filesKey: TaskKey[Seq[File]], subPaths: Set[String]): Setting[Task[Seq[File]]] = {
    filesKey := filesKey.value.filterNot(file => subPaths.exists(file.getAbsolutePath.contains))
  }
}

But somehow "internal" package gets excluded only from javadoc and not from scaladoc.

Is there something we are missing ?

javaunidoc:doc does not properly depend on compile

Observation: when I run javaunidoc:doc after clean, all relevant Scala sources are compiled, but their resulting Java sources are not taken into account for the javadoc run. The full Javadoc is only created when running javaunidoc:doc a second time. I think there is a dependency missing on the compile task somewhere, but the formulation of the plugin is so convoluted that I cannot figure out where to place it.

unidoc with play project

Can you provide an example of how to configure unidoc with a multi-module Play 2.3.1 Scala project? I keep getting errors about the Assets, ReverseController and routes objects already being defined.

Thanks Markus

Unidoc is broken on Scala 2.12.8

I'm not able to generate functioning Scaladocs with Scala 2.12.8.

Here's a reproducible example: https://github.com/torkelrogstad/scaladoc-test

The generated docs in target/scala-2.11/unidoc work just fine. The ones in scala-2.12 have broken relative links. You can reproduce this by doing sbt unidoc, and changing the Scala version in build.sbt on line 4.

issue upgrading sbt-unidoc from 0.3 to 0.4.1

I was looking at upgrading the Spark dependency on sbt-unidoc to allow for eventual uptake of sbt 1.0.
The unidoc plugin changed a fair bit between 0.3 and 0.4.
I'm not very familiar with this plugin and if someone with some experience with this plugin had a little time to review what changes are needed in Spark build, that would be great.
Spark has its own sbt shell script in its build dir (e.g. ./build/sbt).
The sbt-unidoc configuration is in https://github.com/apache/spark/blob/master/project/SparkBuild.scala

unidoc compile issue

Hi,

I am getting the below error while running sbt unidoc:
/Users/dhavalkolapkar/Documents/Workspace/Kamanja/trunk/EnvContexts/SimpleEnvContextImpl/src/main/scala/com/ligadata/SimpleEnvContextImpl/SimpleEnvContextImpl.scala:2105: type mismatch;
[error] found : (com.ligadata.KvBase.Key, com.ligadata.KvBase.Value) => Unit
[error] required: (com.ligadata.StorageBase.Key, com.ligadata.StorageBase.Value) => Unit
[error] dataStore.get(containerName, callbackFunction)
[error] ^
[error] /Users/dhavalkolapkar/Documents/Workspace/Kamanja/trunk/EnvContexts/SimpleEnvContextImpl/src/main/scala/com/ligadata/SimpleEnvContextImpl/SimpleEnvContextImpl.scala:2154: overloaded method value get with alternatives:
errorUnit

I have 15 subprojects and my build.sbt contains:

lazy val commonSettings = Seq(
  scalaVersion := "2.11.4",
  autoAPIMappings in unidoc := true
)

val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(unidocSettings: _*)
  .aggregate(subprojects)

Unidoc seems to ignore scalacOptions

Hello,

I'm trying to get scaladoc source links working with unidoc. When I add

scalacOptions in (Compile, doc) ++=
  Opts.doc.sourceUrl("https://github.com/choffmeister/secloud/blob/master/core€{FILE_PATH}.scala")

to my build.sbt and execute sbt doc, everything works fine. To get it working with unidoc I also added

scalacOptions in sbtunidoc.Plugin.UnidocKeys.unidoc ++=
  Opts.doc.sourceUrl("https://github.com/choffmeister/secloud/blob/master/core€{FILE_PATH}.scala")

Now when I execute sbt unidoc, there are no source code links in the combined API documentation.

I checked with sbt "show unidoc::scalacOptions", but everything looks fine:

[info] Loading project definition from /Users/choffmeister/Development/secloud/project
[info] Set current project to secloud (in build file:/Users/choffmeister/Development/secloud/)
[info] core/*:unidoc::scalacOptions
[info]  List(-unchecked, -feature, -deprecation, -language:postfixOps, -encoding, utf8, -sourcepath, /Users/choffmeister/Development/secloud/core, -doc-source-url, https://github.com/choffmeister/secloud/blob/master/core€{FILE_PATH}.scala)
[info] commandline/*:unidoc::scalacOptions
[info]  List(-unchecked, -feature, -deprecation, -language:postfixOps, -encoding, utf8, -sourcepath, /Users/choffmeister/Development/secloud/commandline, -doc-source-url, https://github.com/choffmeister/secloud/blob/master/commandline€{FILE_PATH}.scala)
[info] secloud/*:unidoc::scalacOptions
[info]  List(-unchecked, -feature, -deprecation, -language:postfixOps, -encoding, utf8, -sourcepath, /Users/choffmeister/Development/secloud)
[success] Total time: 0 s, completed Feb 2, 2014 1:50:17 PM

The -doc-source-url arguments appear properly, but still: no source code links in the generated API. What am I doing wrong? Is it a problem that I have a multi-project build?

At https://github.com/choffmeister/secloud/tree/api-source-links you can find my project. When you clone it, then executing sbt doc already works, but sbt unidoc does not.

Thanks in advance,
Christian

apiMappings does not seem to work with unidoc task

If I run sbt doc, references are resolved correctly based on my apiMappings. If I run sbt unidoc, references are not resolved.

I have tried the following:

apiMappings in unidoc := apiMappings.value,
apiMappings in ScalaUnidoc := apiMappings.value,
apiMappings in unidoc := (apiMappings in (Compile, doc)).value,
apiMappings in ScalaUnidoc := (apiMappings in (Compile, doc)).value,
apiMappings in (ScalaUnidoc, unidoc) := (apiMappings in (Compile, doc)).value,
mySettings ++ inTask(unidoc)(Seq(
  apiMappings in ScalaUnidoc := (apiMappings in (Compile, doc)).value
))

Weird behavior with evicted dependencies

There is a weird interaction between libraryDependencies in sub-projects that depend on one another. If the dependee has an older version of a dependency than the depender, then this old version might be used in the depender when running unidoc!

I've made an example project here. Note that with compile the depender sub-project uses the new cats-effect version (and compiles OK), yet with unidoc it uses the old version and fails. This is also flaky in a weird way - the behavior seems to depend on the folder in which the build is located. For me, the bug is always present when running the example project from /tmp/sbt-test, but can't be reproduced in /home/nigredo/dev/various-examples...

A workaround for this issue might be to ensure all the sub-projects use consistent dependencies - but this looks painful in the long run.
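
One way to express that workaround is to pin the conflicting dependency build-wide, for example with sbt's dependencyOverrides. A sketch (the cats-effect version here is hypothetical):

ThisBuild / dependencyOverrides += "org.typelevel" %% "cats-effect" % "2.5.4"  // hypothetical pinned version shared by all subprojects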

'xsbt ghpagesPushSite' failing with java unidoc: 'fatal: Not a git repository'

I have a smallish java project here:
https://github.com/erikerlandson/gibbous/tree/unidoc-issue

I attempted to set it up for generating javadoc per the instructions on your site. You can browse my sbt setup at the branch linked above.

xsbt previewSite seems to be generating what I expect, but xsbt ghpagesPushSite is failing in a strange way. I have attached the full output, but the error is this:

[info] Loading settings from build.sbt ...
[info] Set current project to gibbous (in build file:/home/eje/git/gibbous/)
[error] fatal: Not a git repository (or any parent up to mount point /home)

full output

javadoc: error - java.lang.OutOfMemoryError

When I ran unidoc recently, it exploded with an OOM exception.

$ sbt jancyCommon/unidoc
...
[info] Generating /home/j/moje/src/jancy/jancy-common/target/javaunidoc/index-all.html...
[info] 1 error
[error] javadoc: error - java.lang.OutOfMemoryError: Please increase memory.
[error] For example, on the JDK Classic or HotSpot VMs, add the option -J-Xmx
[error] such as -J-Xmx32m.
[error] (jancyCommon/javaunidoc:doc) javadoc returned nonzero exit code
[error] Total time: 245 s, completed Apr 19, 2017 10:01:56 PM

I increased the maximum memory in the sbt launcher script, but unidoc didn't pick it up and launched javadoc with default settings. It only worked when I temporarily changed it system-wide:

$ export JAVA_TOOL_OPTIONS='-Xmx2048M'

but this feels like a hack.

Could we have a unidoc setting for tweaking this, like in maven-javadoc?

EDIT: And by 'could we have' I mean 'I would add it myself and send a PR if somebody was interested in merging it in'.

Error in combination with a Simulacrum @typeclass

Hello,

I've got this PR on the cats-effect project, trying to enable the sbt-unidoc plugin: typelevel/cats-effect#29

Unfortunately unidoc fails:

> unidoc
[info] Compiling 9 Scala sources to cats-effect/core/jvm/target/scala-2.12/classes...
[info] Main Scala API documentation to cats-effect/target/scala-2.12/unidoc...
[error] Symbol 'type cats.effect.Effect.ToEffectOps' is missing from the classpath.
[error] This symbol is required by 'package cats.effect.implicits.package'.
[error] Make sure that type ToEffectOps is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'package.class' was compiled against an incompatible version of cats.effect.Effect.
[info] No documentation generated with unsuccessful compiler run
[error] one error found
[error] (root/scalaunidoc:doc) Scaladoc generation failed
[error] Total time: 9 s, completed Apr 24, 2017 11:04:43 PM

The class it is complaining about is generated by a Simulacrum macro, see: https://github.com/mpilquist/simulacrum

I must mention that this error does not happen when running the normal doc command, so I think it is unidoc-related.

Please advise.

Compiling issue with type members

The following code compiles (using sbt compile) but does not compile using sbt unidoc:

      item.expr match {
        case p: Param[item.Data] => // FAILS HERE
          updateParam(p, item.value)
        case _ =>
      }

Support for configurations other than Compile

I am in the process of converting our Kiama build across to use sbt-unidoc. We have used a locally hacked version of the Akka unidoc approach for a while, but we'd like to use sbt-unidoc so we can remove those hacks, particularly since some of them will need non-trivial modifications to work with sbt 0.13.

I have successfully got things working for the Compile configuration. I use the following setting modifications to make the doc task in our root project run unidoc and to put the output in the "api" subdir instead of "unidoc". These mods mean that publishing picks up the unified docs.

def compileUnidocSettings : Seq[Setting[_]] =
  unidocSettings ++ Seq (
    doc in Compile := (doc in ScalaUnidoc).value,
    target in unidoc in ScalaUnidoc := crossTarget.value / "api"
  )

My current stumbling block is getting a similar setup to work for the Test config. We publish our (unified) docs for the tests, examples etc. They are normally produced using the test:doc task which is also used when publishing (via a "test-api" output directory).

As far as I can see at the moment sbt-unidoc doesn't support this use case since the settings seem to be assuming a Compile config. It appears that I would need to duplicate much of the plugin to get equivalent settings for the Test config.

It would be ideal if sbt-unidoc's settings could be parameterised by the configuration, instead of just assuming Compile.

How to exclude certain classes

Hey, I have an issue where I cannot generate docs for my multi-project build because of a shared BuildInfo class:

[info] Main Scala API documentation to C:\code\mmlspark\target\scala-2.12\sbt-1.0\unidoc...
[error] C:\code\mmlspark\core\target\scala-2.12\sbt-1.0\src_managed\main\sbt-buildinfo\BuildInfo.scala:6:13: BuildInfo is already defined as case class BuildInfo
[error] case object BuildInfo {
[error]             ^
[error] C:\code\mmlspark\deep-learning\target\scala-2.12\sbt-1.0\src_managed\main\sbt-buildinfo\BuildInfo.scala:6:13: BuildInfo is already defined as case class BuildInfo
[error] case object BuildInfo {
[error]             ^
[error] C:\code\mmlspark\lightgbm\target\scala-2.12\sbt-1.0\src_managed\main\sbt-buildinfo\BuildInfo.scala:6:13: BuildInfo is already defined as case class BuildInfo
[error] case object BuildInfo {
[error]             ^
[error] C:\code\mmlspark\opencv\target\scala-2.12\sbt-1.0\src_managed\main\sbt-buildinfo\BuildInfo.scala:6:13: BuildInfo is already defined as case class BuildInfo
[error] case object BuildInfo {
[error]             ^
[error] C:\code\mmlspark\target\scala-2.12\sbt-1.0\src_managed\main\sbt-buildinfo\BuildInfo.scala:6:13: BuildInfo is already defined as case class BuildInfo
[error] case object BuildInfo {
[error]             ^
[error] C:\code\mmlspark\vw\target\scala-2.12\sbt-1.0\src_managed\main\sbt-buildinfo\BuildInfo.scala:6:13: BuildInfo is already defined as case class BuildInfo
[error] case object BuildInfo {
[error]             ^
[info] No documentation generated with unsuccessful compiler run

Is there any way to ignore this class in the generation?

unidoc/sbt-site

I have been trying to follow the guidelines here to migrate to the newest version of unidoc:

Before:

import UnidocKeys._
val root = (project in file("."))
  .aggregate(library, app)
  .settings(commonSettings: _*)
  .settings(unidocSettings: _*)

After:

val root = (project in file("."))
  .enablePlugins(ScalaUnidocPlugin)
  .aggregate(library, app)
  .settings(commonSettings: _*)

However, this seems to break something when I pass 'ScalaUnidoc' through to sbt-microsites.

addMappingsToSiteDir(mappings in (ScalaUnidoc, packageDoc), apiSubDirName),

It complains about Reference to undefined setting. Not sure if it's really an sbt-site issue, but the newer version of unidoc seems to break it.

autoAPIMappings does not seem to work with unidoc task

I have tried to get the sbt setting autoAPIMappings to work with unidoc, unsuccessfully. I already tried the following settings:

autoAPIMappings := true

autoAPIMappings in unidoc := true

autoAPIMappings in ScalaUnidoc := true

autoAPIMappings in (ScalaUnidoc, unidoc) := true

Is it currently ignored, or is this an "I am too stupid to use sbt" issue?
