byron / google-apis-rs

A binding and CLI generator for all Google APIs

Home Page: http://byron.github.io/google-apis-rs

License: MIT License


google-apis-rs's Introduction

Maintenance Mode

No new features will be implemented, but contributions are welcome. From time to time new releases will be published to crates.io.

Also, we are looking for a new maintainer.

WARNING

The following crates are not under our control and have been published by another party:

  • google-privateca1-cli
  • google-recaptchaenterprise1-cli
  • google-resourcesettings1-cli

Use at your own risk.


This repository holds Mako scripts to generate all Google APIs as described by the Google discovery service.

The generated source code of each Google API can be found in the gen subdirectory. Each Google API resides in its own crate, which can be used like any other crate.

To find a library of interest, have a look at the API documentation index.
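For example, assuming you want to use the YouTube Data API, you could add the generated crate to your project like any other dependency (google-youtube3 is used purely as an illustration here; the documentation index lists the exact crate name and latest version for each API):

$ cargo add google-youtube3
# or add the crate manually to the [dependencies] section of your Cargo.toml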

Maintenance Mode

These crates are considered done, and only minimal time will be invested in keeping them relevant. This includes the following tasks:

  • reply to issues, triage them
  • provide support for PRs
  • occasional updates of all crates to bring in the latest API definitions, probably no more than twice a year
  • dependency updates to avoid security issues and keep the crates usable in modern projects

New features will not be implemented but PRs are welcome. Please feel free to start a discussion to talk about desired features.

Please be aware of the alternative implementation of these crates, which may be better suited for you.

Project Features

  • provide an idiomatic Rust implementation for Google APIs
  • first-class documentation with cross-links and complete code examples
  • support all features, including downloads and resumable uploads
  • safety and resilience are built in, allowing you to create highly available tools on top of them. For example, you can trigger retries for all operations that may temporarily fail, e.g. due to a network outage.

Build Instructions

Prerequisites

To generate the APIs yourself, you will need to meet the following prerequisites:

  • make: used to automate and efficiently call all involved programs.
  • python: Mako is a Python program, so you will need Python installed on your system to run it. Some other programs we call depend on Python being present as well. Note that you need Python 3.8, as 3.9+ introduced breaking changes that break the dependencies.
  • an internet connection and wget: make will download all other prerequisites automatically into hidden directories within this repository, which requires it to make some downloads via wget.
  • Rust (stable): this project compiles on stable Rust 1.6 or greater only. You might consider using Rustup to control the toolchain on a per-project basis.

Using Make

The makefile is written to be self-documenting. Just calling make will yield a list of all valid targets.

➜  google-apis-rs git:(main) make
using template engine: '.pyenv/bin/python etc/bin/mako-render'

Targets
help-api       -   show all api targets to build individually
help-cli       -   show all cli targets to build individually
docs-all       -   cargo-doc on all APIs and associates, assemble them together and generate index
docs-all-clean -   remove the entire set of generated documentation
github-pages   -   invoke ghp-import on all documentation
regen-apis     -   clear out all generated apis, and regenerate them
license        -   regenerate the main license file
update-json    -   rediscover API schema json files and update api-list.yaml with latest versions
deps           -   generate a file to tell how to build libraries and programs
help           -   print this help
make: Nothing to be done for `help'.

You can easily build the documentation index using make docs-all and individual API documentation using make <api-name>-doc. Run doctests on all APIs with make cargo-api ARGS=test, or on individual ones using make <api-name>-cargo ARGS=test. To see which API targets exist, run make help-api.

The same goes for command-line programs: just use -cli instead of -api, and have a look at make help-cli for individual targets.
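Putting it together, a typical documentation and test session could look like this (youtube3 stands in for whichever API you care about; run make help-api to see the real target names):

# build the documentation index, then docs for a single API
$ make docs-all
$ make youtube3-doc

# run doctests on all APIs, or on a single one
$ make cargo-api ARGS=test
$ make youtube3-cargo ARGS=test

# list all per-API and per-CLI targets
$ make help-api
$ make help-cli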

Make and parallel job execution

In theory, you can run make with -j4 to process 4 jobs in parallel. However, please note that cargo may be run for some jobs and do a full build, which can fail while it is updating its index: the index is a shared resource and cannot be updated by multiple cargo processes at once.

Nonetheless, once you have built all dependencies for all APIs once, you can safely run cargo in parallel, as it will not update its index again.

In other words: the first time you run make docs and make cargo ARGS=test, you shouldn't run things in parallel. After that, you are free to parallelize at will.
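For example, a first-then-subsequent run could look like this (using the target names shown above):

# first run: sequential, so cargo can update its index without competition
$ make docs-all
$ make cargo-api ARGS=test

# later runs: dependencies and the index are in place, so parallel jobs are safe
$ make -j4 docs-all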

Adding new APIs and updating API schemas

The list of available APIs to generate is based on a query of the Google discovery API, and then baked into a make-compatible dependency file. This file acts as a cache, and the only way to enforce a full update is to delete it and run make again.

For example, to update all json files and possibly retrieve new API schemas, do as follows:

# -j8 will allow 8 parallel schema downloads
rm -f .api.deps .cli.deps && FETCH_APIS=1 make update-json -j8

Now run make cargo-api ARGS=check; each time an API fails, add it to the list of forbidden APIs along with a note on why it fails, and regenerate all APIs with the make invocation above. Once all APIs pass cargo check, commit the changes to the shared.yml file.
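A sketch of that iteration loop, using the commands mentioned above (the commit message is just an example):

# see which APIs fail to compile
$ make cargo-api ARGS=check

# add each failing API to the forbidden list in shared.yml with a note on why it fails,
# then regenerate:
$ rm -f .api.deps .cli.deps && FETCH_APIS=1 make update-json -j8

# repeat until `make cargo-api ARGS=check` passes, then commit shared.yml
$ git add .
$ git commit -m "chore: forbid broken APIs and update definitions"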

Setup API and CLI version numbers

The version numbers for the respective program types are set up in etc/api/type-*.yaml, where * resolves to the supported program types, cli and api at the time of writing. You can change the version for all generated artifacts by editing the respective key inside the YAML file (cargo.build_version at the time of writing).

The following script would regenerate all higher-level programs (CLI), add the result to git and push it.

$ make gen-all-cli
# Use the version you are comfortable with in the changelog - sometimes you only want to
# update one program type. Here we go with a multi-version, containing all version strings
# in one heading. It's just for the visuals, after all.
$ clog --setversion=api-v<api-version>
$ git add .
$ git commit -m "chore(versionup): added code for latest version"
$ git tag api-v<api-version> cli-v<cli-version> && git push --tags origin master

Publish API to Cargo

Before anything else happens, be sure to publish the latest APIs to crates.io.

# We want as many publishes to work as possible ... -k keeps going on error.
# sometimes uploads fail spuriously, and just need to be retried.
$ make publish-api publish-cli -k
# another attempt to do this should not do actual work
$ make publish-api publish-cli
# all clear ? Please proceed ... .

The previous call will have created plenty of marker files, which need to be committed to the repository to prevent attempting multiple publishes of the same version.

git add .
git commit -m "chore(cargo): publish latest version to crates.io"
git push origin main

Build Documentation and post it onto GitHub

The last step will update the documentation index to point to the latest program versions. For now we assume hosting on GitHub-Pages, but the compiled documentation directory can be hosted statically anywhere if required.

# This will run rustdoc on all APIs, generate the index HTML file, and run ghp-import at the end.
# The latter might require user-input in case no credentials are setup.
$ make github-pages

License

Everything not explicitly under a different license is licensed as specified in LICENSE.md.

License

Licensed under either of

  • Apache License, Version 2.0
  • MIT License

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

Archive of Development Videos


All work done on the project is based on github issues, not only to track progress and show what's going on, but to have a place to link screen-recordings to. Milestones are used to provide a bigger picture.

Additionally, there is a development diary which serves as a summary of major steps taken so far. As opposed to issue screencasts, it is not recorded live, but is authored and narrated, which should make it more accessible.

Click the image below to see the playlist with all project related content:

(playlist thumbnail image)

Super-Entertaining Videos

Each episode sums up one major step in project development:

google-apis-rs's People

Contributors

byron, causal-agent, chaoky, cristicbz, davids, debomac, dermesser, djrodgerspryor, dreaming-codes, dvaerum, edelangh, emberian, fiadliel, gheoan, harababurel, imfede, ivanukhov, joell, kogai, kylegentle, michael-f-bryan, mihirsamdarshi, omgeeky, petrosagg, philippeitis, samoylovfp, serprex, thebiggerguy, worr, ytanay


google-apis-rs's Issues

Bring compiler warnings down to 0

Also, re-enable all linters, and fix what needs to be fixed.

If a structure is unused, we shouldn't actually emit it. Therefore, the UnusedType marker can be removed as well.

`Option<_>` in schema properties only if needed

Currently Option is used all the time to aid encoding and decoding to json. However, some schemas do not need that, for example, response-values, and maybe others as well.

Do some research and implement it accordingly.

Integrate CLI docs into central index

This will allow them to be seen when updating github pages for example.
Somewhat relates to #50 , as the latter seems to be the last big doc-related thing to do.

build and update instructions in README

Other parties should be able to generate the APIs themselves, which requires some documentation of the make based build system (which as we know is self-documenting).

Make sure to cover special cases, like how to get new APIs, which requires deletion of the .api.deps file.

Assure Downloads can work

Drive2 for example supports downloading files, and we should make sure that this actually works within the bounds of our API.

per-API crate version with appended build info: +<revision>

We control crate versions with one version for all, which should only change if the generator changes, and thus produces different code. Standard semver rules apply.

To assure we don't degenerate the exact API version the crate represents, we should append build metadata, to get versions such as 0.1.0+20150309.

That way, we will only produce new crate versions if something truly changed.

new release

Just to get the Result changes online, and improve the docs.

Refactor `Result` to standard `Ok/Err` enum

That way, matches will be more hierarchical. Err contains another enum with all possible failure states.

Also, let query_transfer_state() return a std::Result, which is natural to it.

Making it compile again ...

With the latest rustc, the dependencies don't compile anymore. Make it work again.

And re-release with a new patch-level.
The crates currently online wouldn't work anymore.

handle 'methods' which are unrelated to resources

APIs like 'freebase' support methods which are not tied to resources. However, our system really wants that right now, and there should be a good way to upgrade it to deal with that case gracefully.

These 'methods' are added to the existing ones, but might actually be callable directly from the hub.

Add 'google-' prefix to crates

This should ONLY affect crate names, everything else stays as is.
Remember to adjust the library docs accordingly, possibly fitting the new RFC already.

Support for 'variant' schema type

See mapsengine1 - `GeoJsonGeometry`.

The 'Go' implementation supports this already, so should I. An enum seems to be the native data-type, even though Jsonification would certainly need to be tested beforehand.

Remember to update the documentation urls - they are hardcoded to structs.

Dev-Diary #1: Up until now ...

This episode gives you an idea why this project exists, and of course, how it got as far as it got.

Some of the topics are:

  • Authentication
  • The mako template engine
  • make for dependency handling and process invocation
  • generation of deeply nested datatypes
  • serde for Json encoding and decoding
  • documentation is no afterthought

rustdoc search broken in doc-index

It seems that copying all files into one folder doesn't cut it, as it will also overwrite data related to the search index. This was done initially just to reduce redundancy, as all dependent crate docs are duplicated once per API.

The good news is that Git will deduplicate for us, even though the demand in space will rise on github.

A fix could be to just copy all doc-data into its own subdirectory. The benefit will be that there is no special case anymore (in terms of URL) for the CLI and the API docs.

link to 404 on docs index

Hey, even though it's no longer Friday 13th, I just found a 404 on my first try to click on a link on http://byron.github.io/google-apis-rs/ ;)

The link for qpxexpress v1 is to qpxexpress1/index.html, which doesn't exist.

Looking at the gh-pages branch tells me the correct path is qpxExpress1/index.html (with a capitalized E).

Switch JSON [de,en]coding to serde !

Actually all structures currently generated will not encode-decode into anything useful as their identifiers have been mangled. This doesn't work though, as these must remain unchanged.

As this is not possible with rustc_serialize, this task should be given to serde.

Per-API installation instructions

Should tell people how to modify their cargo toml. This should possibly be put in front of the complete usage example.

Additionally, the documentation should be changed to use youtube3::new() instead of youtube3::YouTube::new(...) in all code samples.

'publish' files to keep track of publishes to crates.io

Currently make doesn't know anything about whether or not a crate is already on crates.io. To track releases, one should create a central 'publish' command, which depends on the latest generated libraries (together), and publish files that are checked in.

Such a file could look like this: etc/api/youtube/v3/crates/0.1.0+2014040201.

`plus1` build error

illegal recursive struct type; wrap the inner value in a box to make it representable

documentation index and gh-pages

Add a make target to generate all API documentation and link it together in a html front page that just links to the respective sub-directories using relative paths.

The documentation must be self-contained for upload on gh-pages. For the latter, there must be a target as well using gh-import, to help simply pushing latest docs to github.

Dynamic 'Required Request Value'

The documentation for the request value should represent the request value itself and document every single (nested) property.

Be sure you think about how we will later parse these values to finally build the API call - the same framework/functions should be used in these cases.

Assure 'alt=media' param is handled

This is supported in some YouTube APIs at least, and we should be sure to either support it or reject it right away.

While at it, we may want to check other parameters that we cannot support, for example, as they change the encoding of the return value.

nicer library names

What I want is names like

  • youtube3
  • drive2

Sometimes, versions look different and should be converted as shown

  • v1beta4 -> name1_beta4
  • v1.3 -> name1p3
  • directory_v1 -> name1_directory
  • directory_v1.3 -> name1p3_directory
  • v1management -> name1_management

implement doit(): resumable media upload

Something to consider here is how the delegate is involved to store the upload information to allow actually resuming it in another call.
Supporting this might ripple through our method builder a bit, as it might need more information to allow this. Or the delegate provides it at runtime, which would certainly be prepared.

Goal is to support storing the current result, and resuming a few minutes/hours later without issues, i.e. the API must allow to do that natively.

setup CLI docs and central index integration

As Docs are a first-class citizen, we should integrate them into our central doc index right away.

For the CLI, I imagine mkdocs should be used to generate beautiful static documentation from possibly shared markdown files.

Also, rustdoc shouldn't be invoked for the CLI at all - it's all done by mkdocs. deps.mako may have to learn how to handle that.

use travis-ci

It should only work on a subset of all APIs, but besides that, test doc generation and actual unit-tests.

use discovery API to update json files !

Currently these are pulled from the google go api client repository, but that's not the original source of the information.

Turns out there is a discovery service which can easily be used to get the json files ! Write something to help doing it, ideally with make to support simple parallelization.

Implement doit(): simple media upload

This includes 'multi-part' uploads which are not resumable. It will possibly change the structure of our current implementation quite a bit, as it needs more looping and multiple requests.

This RFC was used as basis for implementation.

Provide method details to delegate, and `begin()/end()` calls

When calling the delegate, currently the only information it gets about the method it should handle is its name.
As the delegate can be seen as state-machine, it would benefit from knowing more about what's going on. There should be information about a method that is passed to it right at the beginning, and it should be informed as well when the method is done executing, passing in the result for inspection.

Even though the begin()/end() calls are not strictly required as the user knows when he is calling into the API and returning from it, it will nonetheless make handling API work more convenient.

Develop CLI interface grammar

Figure out what the grammar should look like; it basically specifies how the CLI can be controlled. Think about the future addition of a Queue, and prefer designs that allow adding new functionality easily.

It's clear that the command-subcommand pattern is to be used here, and that it needs to be able to set all the values supported by the API.

Even though documentation is provided separately via mkdocs, the command must be self-documenting when invoked from the commandline (ideally providing a URL to the public HTML docs as well).

This ticket could theoretically be tackled by writing the README file that contains the command overview, including the respective MAKO code. This basically makes it half-way towards implementing the CLI parser (which probably wants to be docopt for ease-of-use).

`rtc_map` shouldn't be needed

Instead, the resource-to-activity map should keep fully qualified activity names.
This will make the code cleaner and easier to understand/maintain.

docs review

  • add #Arguments descriptions for required parameters in MethodBuilders
  • fix TODO in upload_resumable() method docs
  • fix whitespace issue in mbuild example (see upload_resumable)
  • link library overview (Hub introduction) to MethodBuilder trait and CallBuilder trait,

Make user-agent field configurable

It should be a field on the Hub, which is preset with the default one, but which can be set by the user to affect all future calls.

That way, the library will be suited to other programs which just want to use it, keeping their own user agent or allowing them to maintain their own identity towards the google servers.
