lagom / lagom-recipes
License: Other
Take for example @renatocaval's work at https://github.com/renatocaval/account-scala-multi-jdbc
Adding the recipes to Lightbend's Tech Hub would increase their visibility. Copy/pasting the contents of the main README into the Lagom docs would make these recipes more easily searchable.
As Lagom moves forward, some recipes will need upgrades as versions bump. It would be great to have some form of build server to ease maintenance.
Taking the k8s example for a spin, some suggestions from our k8s engineer:
- Rather than the downwardAPI, it might be better to use a ConfigMap and mount it as a volume. The main reason to prefer ConfigMaps over environment variables is that you can change a ConfigMap on the fly and don't need to re-create the whole deployment.
- imagePullPolicy: IfNotPresent should be Always in production, because if an old image with the same tag exists locally, new images will never be pulled.
- An additional remark: remove the dependency on minikube and its ingress and make the example more generic, for example based on Istio, which can be deployed on any k8s host (including Docker for Mac with the k8s extension).
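A sketch of what the ConfigMap-as-volume suggestion could look like, combined with the imagePullPolicy fix (all names here — my-lagom-service, the registry path, the mount path, the config key — are hypothetical, not taken from the recipe):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: my-lagom-service-config        # hypothetical name
data:
  application.conf: |
    # configuration that can be edited on the fly,
    # without re-creating the whole deployment
    greeting = "Hello"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-lagom-service
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-lagom-service
  template:
    metadata:
      labels:
        app: my-lagom-service
    spec:
      containers:
        - name: my-lagom-service
          image: registry.example.com/my-lagom-service:1.0.0
          # Always, not IfNotPresent: otherwise a stale local image
          # with the same tag would silently be reused in production.
          imagePullPolicy: Always
          volumeMounts:
            - name: config
              mountPath: /opt/service/conf
              readOnly: true
      volumes:
        - name: config
          configMap:
            name: my-lagom-service-config
```

Applying an edited ConfigMap refreshes the mounted file in the running pod, whereas environment variables are fixed at pod creation and require re-creating the deployment to change.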
The recipe should be the Java equivalent of https://github.com/lagom/lagom-recipes/tree/master/file-upload/file-upload-scala-sbt
This has come up a few times (e.g. https://discuss.lightbend.com/t/how-lagom-java-use-play-java-controller/909/3).
We, the team of maintainers, have decided to make a few changes around the code structure, repository reorganization and maintained samples. One such change is the archiving of Lagom Recipes.
We decided to move a few recipes into the new https://github.com/lagom/lagom-samples repository and archive https://github.com/lagom/lagom-recipes. That means that any recipe that was not moved to https://github.com/lagom/lagom-samples will no longer be editable. We invite all recipe contributors to take their recipes under their own umbrella and keep working on them. We'll be happy to cross-link to external repositories demonstrating how to build great features in Lagom; we just decided not to be a blocker to getting this published. We will keep the index of external recipes on the https://github.com/lagom/lagom-samples repo and archive lagom-recipes, preventing new changes.
Thanks all for your hard work, contributions, ideas and for pushing forward this collection of samples.
Is there any example of a file upload implementation using Maven?
In some cases it's necessary to test that the wire protocol complies with the contract the service provides to its clients.
In those situations, writing tests that use Lagom's client doesn't work because there's no access to the lower layers.
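One way to get at those lower layers is to skip the Lagom client entirely and assert on the raw HTTP exchange. A minimal, self-contained sketch of the idea using only JDK classes — the /api/hello/Alice path and the JSON payload are hypothetical stand-ins, and the in-process stub server stands in for the actually running service:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class WireContractTest {
    public static void main(String[] args) throws Exception {
        // Stand-in for the running service: a stub answering on the contract's path.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/api/hello/Alice", exchange -> {
            byte[] body = "{\"message\":\"Hello, Alice!\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        int port = server.getAddress().getPort();

        // Hit the endpoint with a plain HTTP client and assert on the raw bytes,
        // not on deserialized response objects, so the wire format itself is tested.
        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<String> response = client.send(
                HttpRequest.newBuilder(URI.create("http://localhost:" + port + "/api/hello/Alice"))
                        .GET().build(),
                HttpResponse.BodyHandlers.ofString());

        if (response.statusCode() != 200)
            throw new AssertionError("expected 200, got " + response.statusCode());
        if (!response.body().equals("{\"message\":\"Hello, Alice!\"}"))
            throw new AssertionError("wire format changed: " + response.body());

        System.out.println(response.body());
        server.stop(0);
    }
}
```

In a real contract test the request would target the service started by the test harness, and the expected body would be the documented wire format, so a change to serialization breaks the test even if Lagom's own client would still round-trip it.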
#34 enabled Travis for sbt recipes but not for mvn ones.
These are the changes I tried to make to use Postgres: marvinmarnold@81b6f39
When I do sbt runAll, things start off nicely:
16:17:04.840 [info] application [] - Creating Pool for datasource 'default'
16:17:04.851 [info] com.zaxxer.hikari.HikariDataSource [] - HikariPool-1 - Starting...
16:17:04.960 [info] com.zaxxer.hikari.HikariDataSource [] - HikariPool-1 - Start completed.
16:17:04.970 [info] play.api.db.HikariCPConnectionPool [] - datasource [default] bound to JNDI as DefaultDS
16:17:04.983 [info] play.api.db.DefaultDBApi [] - Database [default] connected at jdbc:postgresql://localhost:5431/postgres
16:17:05.659 [info] org.hibernate.jpa.internal.util.LogHelper [] - HHH000204: Processing PersistenceUnitInfo [
name: default
...]
But then the Postgres connection ultimately fails:
16:17:06.795 [warn] org.hibernate.engine.jdbc.spi.SqlExceptionHelper [] - SQL Error: 0, SQLState: null
16:17:06.795 [error] org.hibernate.engine.jdbc.spi.SqlExceptionHelper [] - null
16:17:06.799 [warn] com.lightbend.lagom.internal.javadsl.persistence.jpa.JpaSessionImpl [] - Exception while initializing JPA EntityManagerFactory
javax.persistence.PersistenceException: [PersistenceUnit: default] Unable to build Hibernate SessionFactory
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.persistenceException(EntityManagerFactoryBuilderImpl.java:967)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:892)
at org.hibernate.jpa.HibernatePersistenceProvider.createEntityManagerFactory(HibernatePersistenceProvider.java:58)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:55)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:39)
at com.lightbend.lagom.internal.javadsl.persistence.jpa.JpaSessionImpl.lambda$createEntityManagerFactory$1(JpaSessionImpl.java:86)
at com.lightbend.lagom.internal.javadsl.persistence.jpa.Retry.$anonfun$retry$1(Retry.scala:35)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:655)
at scala.util.Success.$anonfun$map$1(Try.scala:251)
at scala.util.Success.map(Try.scala:209)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:289)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at slick.util.AsyncExecutor$$anon$2$$anon$3$$anon$4.run(AsyncExecutor.scala:165)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.hibernate.exception.GenericJDBCException: Unable to open JDBC Connection for DDL execution
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:47)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:111)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:97)
at org.hibernate.resource.transaction.backend.jdbc.internal.DdlTransactionIsolatorNonJtaImpl.getIsolatedConnection(DdlTransactionIsolatorNonJtaImpl.java:69)
at org.hibernate.tool.schema.internal.exec.ImprovedExtractionContextImpl.getJdbcConnection(ImprovedExtractionContextImpl.java:60)
at org.hibernate.tool.schema.extract.internal.SequenceInformationExtractorLegacyImpl.extractMetadata(SequenceInformationExtractorLegacyImpl.java:40)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.initializeSequences(DatabaseInformationImpl.java:65)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.<init>(DatabaseInformationImpl.java:59)
at org.hibernate.tool.schema.internal.Helper.buildDatabaseInformation(Helper.java:132)
at org.hibernate.tool.schema.internal.AbstractSchemaMigrator.doMigration(AbstractSchemaMigrator.java:96)
at org.hibernate.tool.schema.spi.SchemaManagementToolCoordinator.performDatabaseAction(SchemaManagementToolCoordinator.java:183)
at org.hibernate.tool.schema.spi.SchemaManagementToolCoordinator.process(SchemaManagementToolCoordinator.java:72)
at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:313)
at org.hibernate.boot.internal.SessionFactoryBuilderImpl.build(SessionFactoryBuilderImpl.java:452)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:889)
... 16 common frames omitted
Caused by: java.sql.SQLFeatureNotSupportedException: null
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:135)
at org.hibernate.engine.jdbc.connections.internal.DatasourceConnectionProviderImpl.getConnection(DatasourceConnectionProviderImpl.java:122)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator$ConnectionProviderJdbcConnectionAccess.obtainConnection(JdbcEnvironmentInitiator.java:180)
at org.hibernate.resource.transaction.backend.jdbc.internal.DdlTransactionIsolatorNonJtaImpl.getIsolatedConnection(DdlTransactionIsolatorNonJtaImpl.java:43)
... 27 common frames omitted
16:17:06.800 [info] com.lightbend.lagom.internal.javadsl.persistence.jpa.JpaSessionImpl [] - Will retry initializing JPA EntityManagerFactory 10 times in 5 seconds
I'm able to connect to that DB from DBeaver using the same credentials. Port 5431 is intentional.
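For what it's worth, SQLFeatureNotSupportedException thrown from HikariDataSource.getConnection via Hibernate's DatasourceConnectionProviderImpl is the signature of Hibernate calling the two-argument getConnection(user, password) overload, which Hikari deliberately does not support. Hibernate only uses that overload when credentials are present in the persistence unit itself, so one thing worth checking (an assumption from the stack trace, not a confirmed diagnosis of this commit) is that persistence.xml carries no user/password properties and credentials live only in the Play datasource configuration:

```xml
<!-- persistence.xml sketch: let the container-managed datasource supply credentials.
     If javax.persistence.jdbc.user / javax.persistence.jdbc.password (or
     hibernate.connection.username) were set here, Hibernate would call
     HikariDataSource.getConnection(user, password), which Hikari rejects
     with SQLFeatureNotSupportedException. -->
<persistence-unit name="default" transaction-type="RESOURCE_LOCAL">
  <non-jta-data-source>DefaultDS</non-jta-data-source>
  <properties>
    <property name="hibernate.dialect" value="org.hibernate.dialect.PostgreSQL95Dialect"/>
  </properties>
</persistence-unit>
```

This would also explain why DBeaver connects fine with the same credentials: the JDBC URL and credentials are valid, but the getConnection overload Hibernate chooses is not.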
The sample https://github.com/ignasi35/lagom-scala-sbt-multiple-rdbms should be a recipe. It needs reviewing and a more detailed explanation in the README.
It could be merged with https://github.com/ignasi35/lagom-multiple-databases to further demo other use cases.
In both Couchbase recipes, the /api/user-greetings endpoint always returns [].
/cc @johanandren
The current LICENSE is an abbreviated Apache-2.0. We should:
This is the current configuration for the Cluster Bootstrap Discovery Method:
This globally sets the default discovery method to kubernetes-api, which is appropriate only for Cluster Bootstrap and not for inter-service discovery.
While this recipe does not currently use Akka Discovery for anything else, users are likely to copy it as-is. Anyone using the Akka gRPC integration will need another global default discovery method, and in the future, Lagom will have a ServiceLocator implementation based on Akka Discovery.
The recipe should be updated to set akka.management.cluster.bootstrap.contact-point-discovery.discovery-method instead of akka.discovery.method, as described in akka/akka-management#316.
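In application.conf terms, the change described above would look like this (a sketch; the key paths are the ones named in the issue):

```
# Too broad: sets the default discovery method for every user of
# Akka Discovery, including future ServiceLocator / Akka gRPC uses.
# akka.discovery.method = kubernetes-api

# Scoped correctly: only Cluster Bootstrap's contact-point discovery
# uses the kubernetes-api mechanism; the global default stays untouched.
akka.management.cluster.bootstrap.contact-point-discovery {
  discovery-method = kubernetes-api
}
```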
Add an example of using Play's JPA API to do CRUD-oriented persistence in a Lagom service.
e.g. at least log correlation is missing:
How do I use MDC-Log correlation IDs with Lightbend Telemetry (Cinnamon) in Lagom? ([Java/Maven example](./lightbend-telemetry/log-correlation-java-mvn/README.md))