avsystem / scala-commons

AVSystem commons library for Scala

Home Page: http://www.avsystem.com

License: MIT License

Scala 99.91% Shell 0.07% HTML 0.02%
scala language-utilities java-interoperability-utilities macros serialization rpc redis hacktoberfest

scala-commons's Introduction

AVSystem Commons Library for Scala

Continuous Integration Maven Central

API reference

NOTE: This library is available to the public but is mostly used internally at AVSystem. Therefore, the API may frequently change in incompatible ways.

Modules and features

  • commons-core - basic language utilities and generic features not associated with any particular library or framework:
    • GenCodec: format-agnostic, typeclass based serialization framework with automatic typeclass derivation for case classes and sealed hierarchies
      • built-in serialization formats include JSON (raw string), CBOR and BSON (in commons-mongo).
    • Typesafe RPC/proxy framework used in particular by Udash Framework for client-server communication
    • Better enumeration support for Scala - ValueEnum, SealedEnumCompanion, NamedEnumCompanion, OrderedEnum
    • Java interoperability utilities - JavaInterop
    • Google Guava interoperability utilities (dependency on Guava is optional)
    • Various Scala language-level utilities
    • Lightweight alternatives for Scala's Option: Opt (guarantees no nulls), OptArg, NOpt and OptRef, all implemented as value classes
    • Components and Dependency Injection library
  • commons-redis - Scala driver for Redis
  • commons-macros contains implementations of macros used in other modules and reusable macro utilities:
    • MacroCommons trait with several convenience functions for implementing macros
    • TypeClassDerivation - implements infrastructure for automatic type class derivation
  • commons-analyzer - static analyzer for Scala code, i.e. a compiler plugin that enforces various (mostly unrelated) rules and conventions on source code
  • commons-jetty - Jetty server utilities
  • commons-mongo - MongoDB utilities for Scala & Java MongoDB drivers, integration with GenCodec
  • commons-hocon - Utilities for working with HOCON
    • HoconInput - an Input implementation for GenCodec that can read Lightbend Config (com.typesafe.config.Config)
    • An AST (HTree) and a lexer/parser for HOCON (HLexer, HParser)
  • commons-spring - Spring framework utilities:
    • HoconBeanDefinitionReader - a utility that lets you define a Spring application context using the HOCON format
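The Opt type listed above can be sketched in plain Scala to show why it "guarantees no nulls". This is an illustrative stand-in (the name MyOpt is made up, and the real Opt is a value class to avoid allocation), not the library's actual implementation:

```scala
// Illustrative sketch of a null-free optional in the spirit of Opt.
// The real Opt is a value class; this sketch skips that detail for brevity.
final class MyOpt[+A] private (private val rawValue: Any) {
  def isEmpty: Boolean = rawValue.asInstanceOf[AnyRef] eq MyOpt.EmptyMarker
  def isDefined: Boolean = !isEmpty
  def get: A =
    if (isEmpty) throw new NoSuchElementException("MyOpt.Empty")
    else rawValue.asInstanceOf[A]
  def getOrElse[B >: A](default: => B): B = if (isEmpty) default else get
}
object MyOpt {
  private object EmptyMarker
  val Empty: MyOpt[Nothing] = new MyOpt(EmptyMarker)
  // unlike Some(null), wrapping null yields Empty - hence "no nulls"
  def apply[A](value: A): MyOpt[A] =
    if (value == null) Empty else new MyOpt(value)
}
```

Usage: `MyOpt(null)` collapses to `MyOpt.Empty`, so downstream code never has to defend against a "present null".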

scala-commons's People

Contributors

attomir, crossy147, ddworak, ghik, grzesiul, halotukozak, hamidzr, mhawryluk, mikkolaj, mjjaniec, mkej, najder-k, p1t4g0r45, passarinho4, plokhotnyuk, pulewicz, scala-steward, sebaciv, ugon, wodomierz, wojciech-milewski


scala-commons's Issues

Error logging in Jetty RPCHandler

In RPCHandler of JettyRPCFramework, when an exception is thrown during call handling, the only thing done is sending a 500 Internal Server Error response with the exception message.
When e.g. an NPE with a null message is thrown, such a problem is very hard to debug, because we only get an error on the client side, which tells us nothing.
Please add some form of logging there.

Compilation error when importing GenCodec in Scala 2.11

Got the following error with v1.28.2:

[error] /home/travis/build/plokhotnyuk/jsoniter-scala/jsoniter-scala-benchmark/src/main/scala/com/github/plokhotnyuk/jsoniter_scala/macros/AVSystemCodecs.scala:6:8: Symbol 'term com.github.ghik' is missing from the classpath.
[error] This symbol is required by ' <none>'.
[error] Make sure that term ghik is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'SharedExtensions.class' was compiled against an incompatible version of com.github.
[error] import com.avsystem.commons.serialization.GenCodec
[error] 

Perhaps the cause is the use of the com.github.ghik.silencer.silent annotation in GenCodec sources, combined with the provided scope declared for the silencer-lib dependency:
http://search.maven.org/remotecontent?filepath=com/avsystem/commons/commons-macros_2.11/1.28.2/commons-macros_2.11-1.28.2.pom

Conditional beans include support in commons-spring

It would be nice to have an option to include other bean files in HOCON based on a specified condition, so that certain functionalities could be disabled via a feature flag and the unnecessary beans would not be built. A similar effect can be achieved by choosing whether to include a file at all instead of setting a boolean feature flag, but that is not ideal because it tangles configuration with dependency injection.

It could look something like this:

beans {
  ...
}

conditional = [
  {condition: ${ump.feature.enabled}, beans: {include "feature.conf"}},
  {condition: ${ump.coolStuff.enabled}, beans: {include "coolStuff.conf"}}
] 

Configurable JsonStringOutput/Input

Make the JSON backend of GenCodec configurable by allowing users to adjust:

  • JSON formatting (indentation)
  • String character escaping rules
  • Number limits - if the JSON is to be parsed inside a JS runtime, all JSON numbers must fit into a double-precision float; otherwise they may be arbitrarily large
  • Math context for parsing BigInteger and BigDecimal, to avoid problems like #65
  • Date format for timestamps
  • Representation of binary data (hex, base64, array of numbers) - #64
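The number-limits bullet can be illustrated in plain Scala: integers above 2^53 cannot be represented exactly as a double-precision float, which is what a JS runtime parses every JSON number into.

```scala
// 2^53 is the largest power of two at which doubles are still exact
// for every smaller integer; one past it already rounds.
val exact = 1L << 53          // 9007199254740992
val beyond = exact + 1        // 9007199254740993

assert(exact.toDouble.toLong == exact)    // round-trips exactly
assert(beyond.toDouble.toLong != beyond)  // rounds back down to 2^53
```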

GenCodec.materialize fails for Type Members and works for Type Parameters

Please consider the following Type Parameter hierarchy:

sealed trait A[X] {
  def foo: X
}

object A {
  final case class First[Type : GenCodec](foo: Type) extends A[Type]
  object First {
    implicit def genCodec[Type : GenCodec]: GenCodec[First[Type]] = GenCodec.materialize
  }

  implicit def genCodec[T : GenCodec]: GenCodec[A[T]] = GenCodec.materialize
}

The code above compiles and works nicely. However, after turning the Type Parameter into the Type Member:

sealed trait A {
  type X
  def foo: X
}

object A {
  final case class First[Type : GenCodec](foo: Type) extends A { type X = Type }
  object First {
    implicit def genCodec[Type : GenCodec]: GenCodec[First[Type]] = GenCodec.materialize
  }

  implicit def genCodec[T : GenCodec]: GenCodec[A { type X = T }] = GenCodec.materialize
}

the compiler claims: Cannot automatically derive GenCodec for com.avsystem.yanush.components.shared.search.model.A{type X = T}.

Accepting Ints as Doubles in BSON

Hi,

We're working with an existing database where doubles are sometimes written as ints (e.g. 10 instead of 10.0). This causes the following problem: Encountered BsonValue of type INT32, expected DOUBLE.

I was thinking about how to approach this properly. I have two working solutions, but I'm not 100% happy with either of them.

  1. Write a custom codec for Double, for example:

import scala.util.control.NonFatal

object DoubleCodec extends GenCodec.SimpleCodec[Double] {

  override def readSimple(input: SimpleInput): Double =
    try input.readDouble()
    catch {
      case NonFatal(_) => // catching something more specific would be better
        input.readInt().toDouble
    }

  override def writeSimple(output: SimpleOutput, value: Double): Unit = output.writeDouble(value)

  override def nullable: Boolean = false
}

What I don't like here is that we have to use a try-catch block (or Scala's Try, but it's the same thing under the hood), because SimpleInput only has readX methods and nothing to check the type before trying to read it.

  2. Make a change in the input classes; this requires a change in the library. This seems more proper, but again there is a catch:
  • We either have to accept ints but always produce doubles in outputs, which is somewhat inconsistent (not a problem in our case),
  • Or we have to accept ints and sometimes produce ints in outputs. This is also fine in our case, but it breaks the compatibility of the library.

For the second approach, I created a draft PR: #413 (it needs more tests and probably some other improvements, but I'm posting it here as a base for a better solution).

Do you know any other, better way?

Regards,
Paweł Ulewicz.

GenCodec for Seq creates a Vector instead of List when reading

Originally reported in #62

Technically, this is not a bug - GenCodec.read[Seq[A]] is free to choose whatever implementation of Seq it wants. However, this is non-intuitive because the standard Seq(...) factory creates a List by default.

It seems that the problem stems from an inconsistency in the Scala collections library. GenCodec.seqCodec[Seq,A] is based on an implicit instance of CanBuildFrom[Nothing,A,Seq[A]] which, for whatever reason, creates a Vector ¯\_(ツ)_/¯. This quirk can manifest in simpler examples, e.g.

scala> Seq(1,2,3)
res19: Seq[Int] = List(1, 2, 3)

scala> Seq(1,2,3).to[Seq]
res18: Seq[Int] = Vector(1, 2, 3)

So I'm not sure what to do about it yet, because it's not clear to me what the "correct" behaviour would be here. If we patched GenCodec specifically for Seq, we'd have to look for all other similar inconsistencies in the collections library and patch them accordingly...

Mapping scala BigDecimal to BSON NumberDecimal instead of binary

What is the best way to map scala.math.BigDecimal to NumberDecimal instead of binary?

I created a repo with this example; currently it looks like this:

case class TypeWithBigDecimal(x: BigDecimal, y: BigDecimal)

TypeWithBigDecimal(x = 103234.42343, y = 12344.423)

{"x": {"$binary": {"base64": "AmdTOqcAAAAF", "subType": "00"}}, "y": {"$binary": {"base64": "ALxcZwAAAAM=", "subType": "00"}}}

https://github.com/pulewicz/mongo-commons-playground/blob/master/src/main/scala/PlaygroundApp.scala#L89

From what I see, this is currently hardcoded in all of the BSON outputs, so I guess the only way would be to create your own output implementation, wouldn't it?

https://github.com/AVSystem/scala-commons/blob/master/commons-mongo/jvm/src/main/scala/com/avsystem/commons/mongo/BsonValueOutput.scala#L39
https://github.com/AVSystem/scala-commons/blob/master/commons-mongo/jvm/src/main/scala/com/avsystem/commons/mongo/BsonWriterOutput.scala#L26
https://github.com/AVSystem/scala-commons/blob/master/commons-mongo/jvm/src/main/scala/com/avsystem/commons/mongo/BsonWriterOutput.scala#L63

Are you planning to make it configurable?

Regards,
Paweł Ulewicz.

GenCodec regression in 1.34.15

GenCodec materialization fails for the following code:

trait Bound[T <: Bound[T]]
sealed trait Sealed[T <: Bound[T]]
object Sealed {
  case class A[T <: Bound[T]](i: T) extends Sealed[T]

  implicit def codec[Tpe <: Bound[Tpe] : GenCodec]: GenCodec[Sealed[Tpe]] = {
    GenCodec.materialize[Sealed[Tpe]]
  }
}
[error] Test.scala:11:31: Macro at line 14 failed: Cannot materialize com.avsystem.commons.serialization.GenCodec[Sealed.A[tpref$macro$3T] forSome { type tpref$macro$3T <: Bound[tpref$macro$3T] }] because of problem with parameter i:
[error] No GenCodec found for tpref$macro$3T
[error]   case class A[T <: Bound[T]](i: T) extends Sealed[T]
[error]                               ^
[error] Test.scala:14:25: class type required but Sealed.A[tpref$macro$3T] forSome { type tpref$macro$3T <: Bound[tpref$macro$3T] } found
[error]     GenCodec.materialize[Sealed[Tpe]]
[error]                         ^

It works fine with 1.34.14.

Compilation error in `materializeRecursively` call when upgrading to 1.34.3 version

It was detected by PR from @scala-steward plokhotnyuk/jsoniter-scala#201

Compilation error message is:

[error] Error while emitting AVSystemCodecs.scala
[error] value deferred$macro$124

It fails at the following line, when generating a codec for the GeoJSON type:
https://github.com/plokhotnyuk/jsoniter-scala/blob/a1dfb1355f80ae0927b2809bf9aab4062a6ed836/jsoniter-scala-benchmark/src/main/scala/com/github/plokhotnyuk/jsoniter_scala/macros/AVSystemCodecs.scala#L55

Possible DoS when parsing a JSON number to BigInt or BigDecimal

It turns out that parsing BigInt and BigDecimal on the latest JVM versions has O(n^2) complexity, where n is the number of significant digits. This means that a JSON body of ~1 MB can fully load one CPU core for several seconds:

scala> def timed[A](f: => A): A = { val t = System.currentTimeMillis(); val r = f; println(s"Took ${System.currentTimeMillis() - t} millis"); r } 
timed: [A](f: => A)A

scala> List(1000, 10000, 100000, 1000000).foreach(x => timed(BigInt("9" * x)))
Took 0 millis
Took 2 millis
Took 135 millis
Took 13221 millis

scala> List(1000, 10000, 100000, 1000000).foreach(x => timed(BigDecimal("9" * x)))
Took 0 millis
Took 2 millis
Took 138 millis
Took 13440 millis
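One way to guard against this (a hypothetical helper sketched here, not part of this library) is to cap the number of digits before delegating to the JDK parser, since the quadratic cost only bites on very long literals:

```scala
import scala.util.Try

// Hypothetical guard: reject overly long numeric literals up front, since
// BigInt/BigDecimal parsing cost grows quadratically with the digit count.
def safeBigInt(s: String, maxDigits: Int = 1000): BigInt = {
  require(s.length <= maxDigits, s"numeric literal too long: ${s.length} chars")
  BigInt(s)
}
```

With such a cap, a hostile ~1 MB literal is rejected in O(1) instead of occupying a core for seconds; the limit itself would need to be configurable, which ties into the "number limits" item of the configurable-JSON issue above.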

Invalid condition generation in filtered positional operator

Using the exact example from docs:
https://www.mongodb.com/docs/manual/reference/operator/update/positional-filtered/#update-all-array-elements-that-match-arrayfilters

case class Grades(id: Grades.ID, grades: List[Grade]) extends MongoEntity[Int]
object Grades extends MongoEntityCompanion[Grades]

case class Grade(grade: Int, mean: Int, std: Int)
object Grade extends MongoDataCompanion[Grade]

val update = Grades.ref(_.grades).updateFiltered(
 filter = _.ref(_.grade).gte(85),
 update = _.ref(_.mean).set(100)
)

We can see that the generated array filter looks as follows:

println(update.toBsonAndArrayFilters._2.asScala.head.toJson)
> {"filter0": {"grade": {"$gte": 85}}}

instead of {"filter0.grade": {"$gte": 85}}

This actually matches documents in the array that are equal to {"grade": {"$gte": 85}} instead of evaluating the condition on their fields, which makes it impossible to actually query the documents in the array.

[TypedMongo] Allowing the discriminator field at any position

What is the easiest way of reading class hierarchies with discriminator fields at the end of the object? Unfortunately, we need to adapt to an existing schema.

As explained in https://github.com/AVSystem/scala-commons/blob/master/docs/GenCodec.md#flat-sealed-hierarchy-format:

It may also be sensitive to object field order because the discriminator field must be read before any other fields (this is only a problem with Input implementations that do not support random field access by name - JsonStringInput does not have this problem).

I did some checks, and it seems that BsonReaderInput has this problem, but BsonValueInput doesn't: https://github.com/pulewicz/mongo-commons-playground/blob/master/src/main/scala/PlaygroundApp.scala#L47

Questions:

  1. Is there any easy way to use TypedMongoCollection with value input and output? From what I see it's currently hardcoded, and it would require creating your own versions of the following classes - is that correct?

  2. Is using the value input and output a good solution, or are there any disadvantages?

Regards,
Paweł Ulewicz.

Emptiness and Definition of Opt

It's common to write tests like

sth shouldBe Opt.Empty

It could be simplified with org.scalatest.enablers.Emptiness

sth shouldBe empty

Possible implementation

implicit def emptinessOfOpt[A]: Emptiness[Opt[A]] = _.isEmpty

Even more useful, the same for org.scalatest.enablers.Definition

To be honest, I don't know if it's possible to do this without requiring the ScalaTest dependency.
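A self-contained sketch of the idea (Emptiness here is a local stand-in for org.scalatest.enablers.Emptiness, and shouldBeEmpty is an invented helper, so nothing below requires ScalaTest or this library):

```scala
// Stand-in for org.scalatest.enablers.Emptiness, modeled as a SAM trait.
trait Emptiness[-T] { def isEmpty(t: T): Boolean }

// The proposed one-liner, written against the stand-in and against Option
// (in real code it would target Opt and ScalaTest's actual trait).
implicit val emptinessOfOption: Emptiness[Option[Any]] = _.isEmpty

// Rough equivalent of what `sth shouldBe empty` resolves to.
def shouldBeEmpty[T](t: T)(implicit e: Emptiness[T]): Unit =
  assert(e.isEmpty(t), s"$t was not empty")
```

Usage: `shouldBeEmpty(Option.empty[Int])` passes, while `shouldBeEmpty(Option(1))` throws; contravariance of Emptiness lets the single Option[Any] instance serve every element type.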

Referencing nested array fields in typed Mongo

Assuming a hierarchy like this:

case class InnerRecord(
  int: Int,
  str: String,
)
object InnerRecord extends MongoDataCompanion[InnerRecord]

case class RecordTestEntity(
  id: String,
  innerList: List[InnerRecord]
) extends MongoEntity[String]
object RecordTestEntity extends MongoEntityCompanion[RecordTestEntity]

How can I construct a projection so that only the innerList.int fields are returned? I know I can reference the fields at a particular index of the list, e.g. RecordTestEntity.ref(_.innerList.head.int), but I would like to achieve something like RecordTestEntity.ref(_.innerList.map(_.int)). Is that possible?

quasi-Opaque types in Scala 2

In standard Scala 2, we have a way to alias types. This is great for making code more readable, but if your type logically has a more limited range of valid values, the alias won't enforce it. And if the aliased type is used in different places to mean different things, implicits won't know which instance to pick.

Opaque type aliases from Scala 3 provide type abstraction without any overhead. In Scala 2, a similar result can be achieved with value classes, but they have limitations and enforce "boxed" syntax, e.g.:

final case class ID(value: String) extends AnyVal

Seq[ID](...).foreach { id =>
  println("Sth sth " + id.value)
}

Seq[String](ID("asdsadas").value)

My proposal allows defining a quasi-opaque type without runtime overhead and with full IDE support.

object ID extends Opaque.Default[String]

  Seq[String](ID("DASDAS"))
  /*
  type mismatch;
    found   : CastableTest.this.ID.Type
    (which expands to)  com.avsystem.commons.opaque.Opaque.Hidden[String,CastableTest.this.ID.Tag]
    required: String
   */
  Seq[ID.Type]("DASDAS")
  /*
  type mismatch;
     found   : String("DASDAS")
     required: CastableTest.this.ID.Type
        (which expands to)  com.avsystem.commons.opaque.Opaque.Hidden[String,CastableTest.this.ID.Tag]
  Seq[ID.Type]("DASDAS")
   */

Seq[ID.Type](...).foreach { id =>
  println("Sth sth " + id)
}

object ID2 extends Subopaque.Default[String]

Seq[String](ID2("DASDAS")) //ok

Opaque is equivalent to Scala 3's

opaque type T = Original

while Subopaque is equivalent to

opaque type T <: Original = Original
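For readers unfamiliar with the trick, the type-tagging approach underlying such quasi-opaque types can be sketched in a few lines of plain Scala (all names here are illustrative, not the library's API):

```scala
object Tagging {
  // A with U erases to A, so tagged values carry zero runtime overhead.
  type Tagged[A, U] = A with U
  trait IdTag
  type Id = Tagged[String, IdTag]
  // The cast is free after erasure: an Id *is* a String at runtime.
  def Id(raw: String): Id = raw.asInstanceOf[Id]
}
import Tagging._

val id: Id = Id("abc")
val widened: String = id  // widening compiles: the Subopaque behaviour
// val bad: Id = "abc"    // does not compile: a plain String is not an Id
```

The real Opaque variant additionally hides the subtyping relation so that the widening above is also rejected, which is what distinguishes it from Subopaque.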

Cannot pull single element by value in Mongo update

I have a write command like this:

MongoWrite.UpdateMany(
  filter = MyEntity.ref(_.someField).in(listOfFields),
  update  = MyEntity.ref(_.listOfUuids).pull(_ === uuid),
  setupOptions = _.upsert(true)
)

Running this write results in an exception from Mongo:

Bulk write operation error on server localhost:27017. Write errors: [BulkWriteError{index=0, code=2, message='unknown top level operator: $eq', details={}}]

I assume that it constructs a similar filter like in this SO question: https://stackoverflow.com/questions/49617088/how-to-remove-element-from-array-in-mongodb

Is it possible to construct an update that doesn't have this issue? Currently I've found two workarounds:

  • .pullAll(uuid)
  • .pull(_.in(uuid))

Macro commons fails to initialize when commons annotations not present in classpath

final val AnnotationAggregateType = getType(tq"$CommonsPkg.annotation.AnnotationAggregate") - line 43 in MacroCommons trait

scala.reflect.macros.TypecheckException: object annotation is not a member of package com.avsystem.commons
	at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$3(Typers.scala:32)
	at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$2(Typers.scala:26)
	at scala.reflect.macros.contexts.Typers.doTypecheck$1(Typers.scala:25)
	at scala.reflect.macros.contexts.Typers.typecheck(Typers.scala:39)
	at scala.reflect.macros.contexts.Typers.typecheck$(Typers.scala:20)
	at scala.reflect.macros.contexts.Context.typecheck(Context.scala:6)
	at scala.reflect.macros.contexts.Context.typecheck(Context.scala:6)
	at com.avsystem.commons.macros.MacroCommons.getType(MacroCommons.scala:403)
	at com.avsystem.commons.macros.MacroCommons.getType$(MacroCommons.scala:402)
	at com.avsystem.commons.macros.AbstractMacroCommons.getType(MacroCommons.scala:10)
	at com.avsystem.commons.macros.MacroCommons.$init$(MacroCommons.scala:43)
	at com.avsystem.commons.macros.AbstractMacroCommons.<init>(MacroCommons.scala:10)

There is no dependency on commons-annotations in commons-macros.
