Greetings.
By way of introduction, my name is rueX, and I'm a cool guy who found ControlPlane to his liking and wanted to contribute. My contributions live on the "rueX" branch. Feel free to clone it and make use of it as you please.
I'd like to take ControlPlane in the direction of an ANN (artificial neural network), with almost no user interface at all. The minimal UI would be a single "learn this" button, which the user would press before performing some action or changing a setting somewhere.
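To make the idea concrete, here is a minimal sketch (not part of ControlPlane, just an illustration) of what "learn this" could mean underneath: each button press records the machine's current conditions as a labelled training example, and a tiny single-neuron network learns which conditions predict a context. The feature names and the "Home" context are hypothetical.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class ContextNeuron:
    """One logistic unit per learned context (e.g. a hypothetical "Home")."""

    def __init__(self, n_features, lr=0.5):
        self.weights = [0.0] * n_features
        self.bias = 0.0
        self.lr = lr

    def predict(self, features):
        # Confidence (0..1) that this context applies right now.
        z = self.bias + sum(w * f for w, f in zip(self.weights, features))
        return sigmoid(z)

    def learn(self, features, label):
        # One gradient step; label=1 means "this context applies now".
        error = label - self.predict(features)
        for i, f in enumerate(features):
            self.weights[i] += self.lr * error * f
        self.bias += self.lr * error

# Hypothetical binary condition features:
# [wifi_ssid_is_home, bluetooth_headset_connected, on_ac_power]
home = ContextNeuron(n_features=3)
for _ in range(200):
    home.learn([1, 1, 1], 1)  # user pressed "learn this" while at home
    home.learn([0, 0, 0], 0)  # counter-example recorded elsewhere

print(home.predict([1, 1, 1]))  # high confidence when conditions match
print(home.predict([0, 0, 0]))  # low confidence otherwise
```

A real implementation would of course use many more condition features and one unit (or layer) per context, but the shape of the training loop is the same.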
An alternative UI would be via natural-language input, of the sort: "When I connect my Bluetooth headset, change the bitpool to the highest setting, maybe 80, and switch the audio output to the headset. But only do this when I'm at home." This could be typed, or spoken via OS X's dictation.
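As a rough sketch of how such a sentence might be decomposed into a rule, the toy parser below pulls out a trigger and a context with a hypothetical, hand-written pattern. A real implementation would need proper natural-language processing; this only illustrates the target structure (trigger, actions, context).

```python
import re

SENTENCE = ("When I connect my bluetooth headset, change the bitpool to the "
            "highest setting, maybe 80, and switch the audio output to the "
            "headset. But only do this when I'm at home.")

def parse_rule(text):
    # Hypothetical grammar:
    # "When I <trigger>, <actions>. But only do this when I'm <context>."
    trigger = re.search(r"When I (.+?),", text).group(1)
    context_match = re.search(r"only do this when I'm (?:at )?(\w+)", text)
    context = context_match.group(1) if context_match else None
    return {"trigger": trigger, "context": context}

rule = parse_rule(SENTENCE)
print(rule)  # the trigger and context extracted from the sentence
```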
A pre-configured "Common Actions" area would permit quick access to simpler, common items.
Behind the scenes, ControlPlane would generate and update contexts, learning and incorporating subtleties of conditions and behaviour as time goes on. An advanced UI could expose the weights and states of the ANN model, but this would not be essential, and would not be visible by default.
The original repository for ControlPlane is here: https://github.com/dustinrue/ControlPlane.
Cheers,
rueX