Long-running periodic job scheduler and runner.
moxie
License: MIT License
if you're not hosting in UTC, I think it's off a bit.
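One way to avoid the offset: do all the scheduling math in timezone-aware UTC so the host's timezone can't skew intervals. A minimal sketch (the helper name is hypothetical):

```python
from datetime import datetime, timezone

# Sketch: compute delays from timezone-aware UTC timestamps, never from
# the host's local clock, so hosting outside UTC doesn't shift schedules.
def seconds_until(next_run_utc: datetime) -> float:
    # assumes next_run_utc is timezone-aware (UTC)
    now = datetime.now(timezone.utc)
    return (next_run_utc - now).total_seconds()
```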
or at least subscriptions?
add alerts
so we don't have to remove the container every time. which isn't a big deal, but it's another moving part.
it'd be bad to get pwned.
alternatively, see if websockets tolerates htaccess
allow launching a build with a parameter; needed for stuff like the openstates historical scrape
Possibly just webhooks in general, but GitHub web hooks would be really handy to be able to have jobs fire off in response to pushes, comments, PRs, issues, etc.
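The routing piece could be a small pure function that maps a webhook event to a job, which an aiohttp handler would then enqueue. A sketch under assumptions (the mapping and job names here are hypothetical, not moxie's real scheme):

```python
# Sketch: decide which moxie job (if any) a GitHub webhook event should
# trigger. The handler wiring (reading X-GitHub-Event, parsing JSON,
# enqueueing) is omitted; job-name format is an assumption.
def job_for_event(event: str, payload: dict):
    if event == "push":
        return "build:{}".format(payload["repository"]["full_name"])
    if event == "pull_request" and payload.get("action") == "opened":
        return "ci:{}".format(payload["repository"]["full_name"])
    return None  # ignore comments, stars, etc.
```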
(moxie-init/load in a container, pupa init in a container, etc)
aiopg is really ugly. make it less ugly
for job results and stuff
since that's currently blind before start - might look like the daemon died when it didn't
expose all of the db over ssh
ssh localhost -p 2222 kill test
\r\n everywhere: I remember seeing some view exceptions. Handle EOF gracefully; likely related to an aiohttp API change.
not sure how this'd work just yet, but magic. do we want to remove everything related?
so that you can change the impl that's created
docker rm $(docker ps -a | grep Exit | awk '{print $1}')
docker rmi $(docker images | grep "<none>" | awk '{print $3}')
online / offline
ansible is amazing. shared variables are amazing. I hate setting SUNLIGHT_API_KEY 50 times.
rate limit the number of jobs being brought up at once. Let's do 2 at a time at most?
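A semaphore is the usual asyncio way to cap concurrent launches. A sketch with the limit of 2 from the note above (run_job stands in for the real container runner):

```python
import asyncio

# Sketch: start at most `limit` jobs at once; the rest wait their turn.
async def run_all(jobs, run_job, limit=2):
    slots = asyncio.Semaphore(limit)

    async def launch(job):
        async with slots:  # blocks while `limit` jobs are already running
            return await run_job(job)

    return await asyncio.gather(*(launch(j) for j in jobs))
```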
reload config if there's a db change or something.
communicate exceptions, such as missing images, or unlinkable bits.
During pip install -r requirements.txt, aiohttp installs successfully from git, but then aiodocker doesn't see it during install/compilation:
Obtaining aiohttp from git+https://github.com/KeepSafe/aiohttp#egg=aiohttp (from -r requirements.txt (line 6))
Updating /home/eric/.virtualenvs/moxie/src/aiohttp clone
Running setup.py (path:/home/eric/.virtualenvs/moxie/src/aiohttp/setup.py) egg_info for package aiohttp
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files found matching 'aiohttp/_multidict.html'
warning: no previously-included files found matching 'aiohttp/_multidict.*.so'
warning: no previously-included files found matching 'aiohttp/_multidict.pyd'
warning: no previously-included files found matching 'aiohttp/_multidict.*.pyd'
warning: no previously-included files found matching 'aiohttp/_websocket.html'
warning: no previously-included files found matching 'aiohttp/_websocket.*.so'
warning: no previously-included files found matching 'aiohttp/_websocket.pyd'
warning: no previously-included files found matching 'aiohttp/_websocket.*.pyd'
Installing extra requirements: 'egg'
Downloading/unpacking aiodocker (from -r requirements.txt (line 7))
Downloading aiodocker-0.6.tar.gz
Running setup.py (path:/home/eric/.virtualenvs/moxie/build/aiodocker/setup.py) egg_info for package aiodocker
Traceback (most recent call last):
File "<string>", line 17, in <module>
File "/home/eric/.virtualenvs/moxie/build/aiodocker/setup.py", line 4, in <module>
from aiodocker import __version__
File "/home/eric/.virtualenvs/moxie/build/aiodocker/aiodocker/__init__.py", line 3, in <module>
from .docker import Docker
File "/home/eric/.virtualenvs/moxie/build/aiodocker/aiodocker/docker.py", line 5, in <module>
import aiohttp
ImportError: No module named 'aiohttp'
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "<string>", line 17, in <module>
File "/home/eric/.virtualenvs/moxie/build/aiodocker/setup.py", line 4, in <module>
from aiodocker import __version__
File "/home/eric/.virtualenvs/moxie/build/aiodocker/aiodocker/__init__.py", line 3, in <module>
from .docker import Docker
File "/home/eric/.virtualenvs/moxie/build/aiodocker/aiodocker/docker.py", line 5, in <module>
import aiohttp
ImportError: No module named 'aiohttp'
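The traceback shows aiodocker's setup.py importing the package (which imports aiohttp) before aiohttp is installed. The common package-side fix is to read the version string with a regex instead of importing; a sketch (the file path is an assumption):

```python
import re

# Sketch: pull __version__ out of the source file without importing the
# package, so setup.py doesn't need the package's dependencies installed.
def find_version(path="aiodocker/__init__.py"):
    with open(path) as f:
        match = re.search(r"^__version__\s*=\s*['\"]([^'\"]+)['\"]", f.read(), re.M)
    if not match:
        raise RuntimeError("version string not found in {}".format(path))
    return match.group(1)
```

On the user side, installing aiohttp in a separate pip invocation before aiodocker also sidesteps the ordering problem.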
right now it checks on the pk and skips if it exists
I think we're making too many connections to postgres, and that might cause 500s when we use too many.
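Pooling is the standard fix: reuse a fixed set of connections instead of opening one per request (with aiopg itself, aiopg.create_pool with a maxsize does this). A stdlib sketch of the idea, with connect standing in for the real connection factory:

```python
import asyncio

# Sketch: a tiny connection pool. At most `size` connections are ever
# created; callers reuse released connections or wait for one.
class TinyPool:
    def __init__(self, connect, size=5):
        self._connect = connect
        self._free = asyncio.Queue()
        self._size = size
        self._made = 0

    async def acquire(self):
        if self._free.empty() and self._made < self._size:
            self._made += 1
            return await self._connect()
        return await self._free.get()  # wait for a released connection

    async def release(self, conn):
        await self._free.put(conn)
```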
re-run jobs, etc
so that we can set a static time for the next interval
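One way to get a static next-run time: align runs to a fixed grid from the epoch instead of "now + interval", so restarts don't drift the schedule. A sketch (helper name hypothetical):

```python
# Sketch: next run on a fixed `interval`-second grid. Given now=100 and
# interval=30, the next aligned run is at t=120 regardless of when the
# scheduler last restarted.
def next_run(now: float, interval: float) -> float:
    return now - (now % interval) + interval
```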
so we can sort by project, etc
It'd be great to hand a job a Git repo path and have it do git pull
and then docker build
directly instead of having to wait for either Hub automated builds or pushes from my local box in order to deploy.
it broke
using the integers was a hack. use a unique pk for each - e.g. email on maintainer
just be sure we don't hang coroutines anywhere. I don't think we do, but it's worth checking
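A cheap guard while checking: wrap awaits that talk to the outside world (docker, postgres) in asyncio.wait_for, so a stuck coroutine surfaces as a TimeoutError instead of hanging silently. A sketch:

```python
import asyncio

# Sketch: bound any awaitable with a timeout (30s default is arbitrary).
async def bounded(coro, seconds=30):
    return await asyncio.wait_for(coro, timeout=seconds)
```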
ugh
for a mobile app
having one moxied is fine, since it's pretty lazy / slow, but letting it handle many workers is super needed.
perhaps capture a high file descriptor? might need some docker tweaks.
skydns / skydock just took down my server
The ability to have multiple servers running jobs would be a pretty big improvement, especially if they can all have a single web UI for management.
We talked on IRC, and tagging (à la #29) is probably the cleanest way to get to this.
https://github.com/devunt/aioirc is a thing, and it might be useful for this (haven't dug into it)
Running the same job against a whole cluster of machines (or even just with slightly different configuration, like a different GOOS/GOARCH set, for example) would be an interesting way to get a build matrix of sorts.
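Expanding the matrix into per-job environments is just a cross product. A sketch (the GOOS/GOARCH values are examples, not a fixed list):

```python
from itertools import product

# Sketch: one job environment per cell of the GOOS x GOARCH matrix.
def build_matrix(goos=("linux", "darwin"), goarch=("amd64", "arm64")):
    return [{"GOOS": o, "GOARCH": a} for o, a in product(goos, goarch)]
```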
(just recording this as an issue so we can discuss further ideas/thoughts on it asynchronously)