Comments (4)
Not sure if implicits are supported through py4j calls; in that case we would need a Scala wrapper.
from jgit-spark-connector.
- Python wrapper
- Add jar automagically
- configure DataSource
- select/join
- wrapper for the proposed API
- Scala side
- check our api works the way it was described in the document
I've been playing a little bit with our Spark API in PySpark, and it seems we might not need the Scala wrapper. Even though implicits don't work in PySpark, we can always import the Implicits class and use it manually in the Python wrapper:
```python
from py4j.java_gateway import java_import

# Grab the JVM handle from the active SparkSession's py4j gateway
jvm = sparkSession.sparkContext._gateway.jvm

# Import the Scala Implicits object so it is reachable from Python
java_import(jvm, 'tech.sourced.api.Implicits')
implicits = jvm.tech.sourced.api.Implicits

# Point the DataSource at the siva files
sparkSession.conf.set('tech.sourced.api.repositories.path', '/siva/files/path')

# Chain the API manually through ApiDataFrame
reposDf = implicits.getDataSource('repositories', sparkSession._jsparkSession)
refsDf = implicits.ApiDataFrame(reposDf).getReferences()
commitsDf = implicits.ApiDataFrame(refsDf).getCommits()
filesDf = implicits.ApiDataFrame(commitsDf).getFiles()
```
NOTE: ApiDataFrame can only be instantiated with a Java DataFrame, so if the DataFrame comes from Python it needs to be implicits.ApiDataFrame(pyDf._jdf) instead.
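Since that distinction is easy to get wrong, a tiny helper could normalize both cases before handing the DataFrame over. This is only a hedged sketch, not part of the current API; `to_jdf` is a hypothetical name:

```python
# Hypothetical helper (not in the API): accept either a PySpark DataFrame,
# which exposes its underlying Java DataFrame as ._jdf, or a raw Java
# DataFrame coming straight from py4j, and always return the Java one.
def to_jdf(df):
    return df._jdf if hasattr(df, '_jdf') else df
```

With something like this, every call site could write implicits.ApiDataFrame(to_jdf(df)) regardless of where the DataFrame came from.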
Of course, more experimenting is needed to see if this is feasible in a real API, but it seems promising.
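One possible direction for the Python wrapper: hide the ApiDataFrame plumbing behind a small class so users chain calls without touching py4j directly. Everything below is a sketch; the class and method names are assumptions, and only the ApiDataFrame / getReferences / getCommits / getFiles calls come from the snippet above:

```python
class WrappedDataFrame(object):
    """Hypothetical wrapper around a Java DataFrame plus the Implicits object."""

    def __init__(self, jdf, implicits):
        self._jdf = jdf              # underlying Java DataFrame
        self._implicits = implicits  # e.g. jvm.tech.sourced.api.Implicits

    def _api(self):
        # ApiDataFrame must receive the *Java* DataFrame, per the note above
        return self._implicits.ApiDataFrame(self._jdf)

    def references(self):
        return WrappedDataFrame(self._api().getReferences(), self._implicits)

    def commits(self):
        return WrappedDataFrame(self._api().getCommits(), self._implicits)

    def files(self):
        return WrappedDataFrame(self._api().getFiles(), self._implicits)
```

A user could then chain WrappedDataFrame(reposDf, implicits).references().commits().files() without ever seeing py4j.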
PR: #39