
hog's Introduction

The Power Hog

The power hog is a tool that periodically collects energy statistics of your Mac and makes them available to you.

There are two main aims:

  1. Identify which apps are using a lot of energy on your machine.
  2. Collect data from as many machines as possible to identify wasteful apps globally.

We provide a website for detailed analytics of your data. The hog by default uploads your measurement data to our Green Metrics Tool backend. We put in a lot of effort to make sure that no confidential information is exposed but please refer to the settings section if you want to disable the upload or submit the data to your own backend.

The hog consists of two apps that run on your local system. You need the power logger, but the desktop app is optional!

Power logger

The background process power_logger.py saves the power statistics to the database. We use the Mac-internal powermetrics tool to collect the data. Because powermetrics needs to be run as root, so does the power_logger script. The script accepts one argument, -d, to run it in debug mode. It can also be sent the SIGINFO signal to print some statistics. You can either call it by hand and send it to the background with &, or define it as a launch agent. For development purposes we recommend first running the program in the foreground to see if everything works fine, and then using the launch agent.

If you want to avoid running the desktop app, you can call the power_logger.py script with -w, which will give you the details URL.

You can also run the powermetrics process yourself and then use power_logger.py to process the data and upload it. Use the -f parameter with a filename. Please submit the data in the plist format. To run powermetrics yourself, you can use the following call: powermetrics --show-all -i 5000 -f plist -o FILENAME
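If you process such a file programmatically, note that powermetrics can write several plist documents into one file. A minimal sketch of reading such a dump with Python's plistlib; the NUL-byte separator between samples and the coalitions/name/energy_impact key names are assumptions based on typical powermetrics output, not guaranteed by the hog:

```python
import plistlib

def parse_powermetrics_file(path):
    """Parse a powermetrics plist dump into a list of sample dicts.

    Assumption: powermetrics separates consecutive samples with a NUL
    byte, so the file may contain several plist documents.
    """
    with open(path, "rb") as f:
        raw = f.read()
    samples = []
    for chunk in raw.split(b"\x00"):
        chunk = chunk.strip()
        if chunk:
            samples.append(plistlib.loads(chunk))
    return samples

def coalition_energy(sample):
    # The key names "coalitions", "name", and "energy_impact" are
    # assumptions; verify them against your own powermetrics output.
    return {c["name"]: c.get("energy_impact", 0)
            for c in sample.get("coalitions", [])}
```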

Parameter list

  • -d: Sets debug/development mode to true. Settings are switched to the local environment and statistics are printed while running.
  • -w: Prints the URL of the analysis website and exits. This is especially useful when not using the desktop app.
  • -f filename: Use the file as powermetrics input and don't start the process internally.

Setup of the power collection script

This is a description on how to set everything up if you did a git clone. You can also just do

sudo bash -c "$(curl -fsSL https://raw.githubusercontent.com/green-coding-solutions/hog/main/install.sh)"

which will do the whole install for you.

Do it manually

Make the power_logger.py script executable with chmod a+x power_logger.py

Please modify the io.green-coding.hog.plist file to reference the right path. There is a script below that does everything for you.

Place the .plist file in the /Library/LaunchDaemons directory. For security reasons, files in /Library/LaunchDaemons/ should have their permissions set to be owned by root:wheel and should not be writable by others.

sed -i.bak "s|PATH_PLEASE_CHANGE|$(pwd)|g" io.green-coding.hog.plist
sudo cp io.green-coding.hog.plist /Library/LaunchDaemons/

sudo chown root:wheel /Library/LaunchDaemons/io.green-coding.hog.plist
sudo chmod 644 /Library/LaunchDaemons/io.green-coding.hog.plist

After placing the .plist file in the right directory, you need to tell launchd to load the new configuration:

sudo launchctl load /Library/LaunchDaemons/io.green-coding.hog.plist

You can check if your service is loaded with:

sudo launchctl list | grep io.green-coding.hog

If you want to unload or stop the service:

sudo launchctl unload /Library/LaunchDaemons/io.green-coding.hog.plist

Settings

It is possible to configure your own settings by placing a settings.ini file in the same directory as the power_logger.py script, or by adding a hog_settings.ini file to /etc/. The file in /etc/ takes priority.

The following keys are currently used:

  • powermetrics: The sampling interval for powermetrics in ms. If you set this to 5000, powermetrics will return the aggregated values every 5 seconds.
  • upload_delta: The interval in seconds at which data is uploaded.
  • api_url: The URL endpoint the data should be uploaded to. You can use the https://github.com/green-coding-solutions/green-metrics-tool if you want, but you can also write/use your own backend.
  • web_url: The URL where the analytics can be found. The machine ID is appended to this string, so make sure it ends with a =
  • resolve_coalitions: macOS groups processes into app coalitions rather than looking at single processes. It can therefore happen that your shell shows up as the main power hog, because the shell spawned the process that is actually using the resources. Add the name of such a coalition to this list to resolve the attribution.
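A sketch of how this two-file priority could be implemented with Python's configparser. The [hog] section name is an assumption, and the fallback defaults mirror values seen in the hog's debug output; this is illustrative, not the hog's actual implementation:

```python
import configparser
import os

# Defaults as seen in the hog's debug output; treat them as illustrative.
DEFAULTS = {
    "powermetrics": 5000,     # sampling interval in ms
    "upload_delta": 300,      # upload interval in seconds
    "api_url": "https://api.green-coding.berlin/v1/hog/add",
    "web_url": "https://metrics.green-coding.berlin/hog-details.html?machine_uuid=",
}

def load_settings(script_dir, etc_path="/etc/hog_settings.ini"):
    """Return settings, letting /etc/hog_settings.ini override a local settings.ini."""
    settings = dict(DEFAULTS)
    # Later files win, so /etc/hog_settings.ini takes priority.
    candidates = [os.path.join(script_dir, "settings.ini"), etc_path]
    for path in candidates:
        parser = configparser.ConfigParser()
        if parser.read(path) and parser.has_section("hog"):
            for key in DEFAULTS:
                if parser.has_option("hog", key):
                    settings[key] = parser.get("hog", key)
    return settings
```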

The desktop App

The hog desktop app gives you analytics of the data that was recorded. Please move it into your Applications folder.

Description of the headings

  • Name: This is the name of the process coalition. A coalition can contain multiple processes. For example, a program might fork new processes, which will all show up in its coalition. Sometimes a shell might turn up here; please tell us so we can add it as an exception.
  • Energy Impact: This is the value macOS assigns to its processes. The exact formula is not known, but we know that quite a few factors are considered. For now this is the best value we've got 🫣
  • AVG Cpu Time %: This is how long the coalition has spent on the CPUs, expressed as a percentage of the sampling interval. The value can exceed 100% because a coalition can run on multiple CPUs at the same time: a process that fully occupies 4 CPUs for the whole interval shows up as 400%.
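The percentage can be computed from the raw CPU time and the sampling interval. A minimal sketch; the nanosecond unit for the raw CPU time is an assumption, not a documented fact about the hog:

```python
def avg_cpu_time_percent(cputime_ns, interval_ms):
    """CPU time as a percentage of the sampling interval.

    Can exceed 100% because a coalition may run on several CPUs at
    once: fully occupying 4 CPUs for the whole interval yields 400%.
    The nanosecond unit for cputime_ns is an assumption.
    """
    return cputime_ns / (interval_ms * 1_000_000) * 100

# A coalition that fully occupies 4 CPUs during a 5000 ms interval:
print(avg_cpu_time_percent(4 * 5000 * 1_000_000, 5000))  # 400.0
```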

Database

All data is saved in an SQLite database located at:

/Library/Application Support/io.green-coding.hog/db.db
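If you want to poke around in the database yourself, the schema is easiest to discover from sqlite_master. A small helper sketch; the hog's table layout is not documented here, so list the tables before writing queries against assumed names:

```python
import sqlite3

DB_PATH = "/Library/Application Support/io.green-coding.hog/db.db"

def list_tables(db_path=DB_PATH):
    """Return the table names in the hog database.

    The schema is not documented here, so inspect it before querying.
    """
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
        return [r[0] for r in rows]
    finally:
        con.close()
```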

Updating

We currently don't support automatic updates. You will have to run:

sudo mv /etc/hog_settings.ini /etc/hog_settings.ini.back
sudo bash -c "$(curl -fsSL https://raw.githubusercontent.com/green-coding-solutions/hog/main/install.sh)"

Contributing

PRs are always welcome. Feel free to drop us an email or look into the issues.

The hog is developed to not need any dependencies.

Debugging

It sometimes helps to enable debugging for the logger process. You can do this by editing the /Library/LaunchDaemons/io.green-coding.hog.plist file:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>io.green-coding.hog</string>

    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/hog/power_logger.py</string>
        <string>-v</string>
        <string>debug</string>
        <string>-o</string>
        <string>/tmp/hog.log</string>
    </array>

    <key>RunAtLoad</key>
    <true/>

    <key>KeepAlive</key>
    <true/>
</dict>
</plist>

Please remember that the log file can become quite big. The hog does not use logrotate or similar out of the box.
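If the log size bothers you, Python's standard library can cap it without logrotate. A sketch using RotatingFileHandler; the format string mirrors the log lines shown in the issues below, but this is an illustrative setup, not the hog's actual logging code:

```python
import logging
from logging.handlers import RotatingFileHandler

def make_logger(path="/tmp/hog.log", max_bytes=10 * 1024 * 1024, backups=3):
    """Logger that caps the file at ~10 MB and keeps 3 rotated copies."""
    handler = RotatingFileHandler(path, maxBytes=max_bytes, backupCount=backups)
    # Format chosen to match the hog's log lines, e.g.
    # "[DEBUG] 2023-12-03 11:47:46,921 - Program started."
    handler.setFormatter(
        logging.Formatter("[%(levelname)s] %(asctime)s - %(message)s"))
    logger = logging.getLogger("hog")
    logger.setLevel(logging.DEBUG)
    logger.addHandler(handler)
    return logger
```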

Screenshots

Sources

Misc

  • If you can't see the hog logo in the menu bar because of the notch, there are multiple solutions.

hog's People

Contributors

ribalba


hog's Issues

track usage by website?

I'm not sure if this is even possible, but it definitely would be interesting (IMO):

The same way I want to know which processes that are run by my terminal use the energy, I would love to know which websites use the most energy in my browser.

"measurement systems not running" in the morning

I'm testing the power-hog on my dev machine.

Every time I open the computer in the morning (so after suspend?) I see the message.

[screenshot]

Currently I just re-run the setup script, and then it works.

I wasn't yet able to look into the setup script, or to validate the setup using the manual setup steps.

data is not uploaded anymore

According to the logs this is probably related to the domain change of the API.

https://api.green-coding.berlin/v1/hog/add is the current endpoint. This delivers 403 forbidden atm.

Energy Impact goes down when closing the power_logger

I've also noticed unexpected behavior: When I have the HOG.app open and then unload the power_logger, the Energy Impact decreases...
I suspect it's because the internal clock keeps running...? However, I would expect the Energy Impact to remain the same...? I don't know the formula, so it might not be a bug, but it's somehow unexpected.

Sorting on macOS 14

I just upgraded to macOS 14, and sorting by CPU %, Energy Impact, or name does not work anymore.

Not exactly sure whether the last update of the hog itself or the macOS update is the culprit, though.

[screenshot]

Sprint 1

  • Show if data is uploaded and where to
  • Add a boolean upload setting
  • Add defaults if there is no settings file
  • Add a section to the README on how to create a settings file
  • Move from joule to Wh in the frontend. Still save in mJ
  • Make CPU time relative to the interval time
  • Capitalize the titles
  • Change "provider App running" to "All measurement systems functional"
  • Add units to the table
  • Add a table foreground color for dark mode
  • Add more details to the demo data button
  • Add demo data
  • Nicer welcome screen for first-time usage
  • Show how big the upload queue is
  • Enable Intel Macs

Email feedback

When I noticed the thing with Reminders, I immediately wanted to do a mini drill-down.

What is currently visually impossible for me is to understand how large the number shown in the table actually is.

When I look at Reminders, for example, I often open the 24-hour view after the 5-minute view and want to see how much Reminders consumed in the last 5 minutes compared to the last 24 hours. But the number is simply too hard to read.
I already mentioned the readability issue, but my new wish would be to not only format the number but also append an order of magnitude.

For me, instead of 242997621956 => 242.997.621.956, it would be even better to turn it into 242.99 billion, or, if billion has an abbreviation, that would be fine too.
If you'd rather do it scientifically, I'd also be fine with 242.99 10^9.
Even better, but tricky because we have no dimension, would be something like 242.99 GEI (for the Energy Impact), and converting the time into minutes/hours.

I don't know off the top of my head what 242997621956 means as CPU time. Is that nanoseconds?

3. What I would also like to do is check whether an app shows a discrepancy between Energy Impact and CPU time. I want to click on the table header and have the table sort by that column.

If the ordering then differs between Energy Impact and CPU time, that is a possible anomaly I would like to look at more closely.

Which column is currently sorted by?

Bugs

  • Change typo from PATH_PLASE_CHANGE to PATH_PLEASE_CHANGE
  • Check with @arne why install script wasn't run with sudo
  • Check why the Unzip fails @ArneTR
  • Change sed script to be BSD sed
  • Investigate cpu_time
  • Check why settings file is not picked up when not using -d
  • Check why upload fails and kills the program

Refactor away from the collect/write model

Initially we had two processes running: one that collected the data from powermetrics and one that then wrote everything to a file. Now that we use a database, we can simplify things and write the data directly.

reduce disk space usage

[screenshot]

Having run the hog for some time, the DB size keeps growing.

Apart from the fact that this is quite a lot of data for not that much time (a month), I feel like we should at least have the option to delete data older than a certain age. Or, best case, older data gets aggregated automatically, so we lose the details but still keep all the past statistics.
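The deletion option suggested above could be sketched with Python's sqlite3. The measurements table and time column names are hypothetical placeholders for the hog's real schema, and the milliseconds-since-epoch unit is an assumption:

```python
import sqlite3
import time

def prune_older_than(db_path, table, time_column, days=30):
    """Delete rows older than `days` and reclaim the freed file space.

    `table` and `time_column` are placeholders for the hog's real
    schema, and the time column is assumed to hold milliseconds since
    the epoch. Identifiers are interpolated directly, so only pass
    trusted names.
    """
    cutoff_ms = int((time.time() - days * 86400) * 1000)
    con = sqlite3.connect(db_path)
    try:
        con.execute(f"DELETE FROM {table} WHERE {time_column} < ?",
                    (cutoff_ms,))
        con.commit()
        con.execute("VACUUM")  # shrink the database file after deleting
    finally:
        con.close()
```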

Add debugging for no disk

We are currently seeing that in some cases powermetrics does not report a disk being present. We should add some debugging to see when this happens, as normally a disk doesn't vanish.

Data upload delay due to hanging thread

The hog at its current patch level constantly displays current data locally.

However, I am experiencing missing data when using the Detailed Analytics.

My upload backlog is already at 2000+ and the daemon has been running uninterrupted since then.

Apparently the upload did not succeed and the thread crashed. The other threads were running fine.

The reason for this is currently unknown and I cannot spot this behaviour in my prior logs.

The fix is to unload/load the daemon.

[DEBUG] 2023-12-03 11:47:46,921 - Program started.
[DEBUG] 2023-12-03 11:47:46,921 - Using db: /Library/Application Support/berlin.green-coding.hog/db.db
[DEBUG] 2023-12-03 11:47:46,922 - Using /etc/hog_settings.ini as settings file.
[DEBUG] 2023-12-03 11:47:46,969 - Setting: {'powermetrics': 5000, 'upload_delta': 300, 'api_url': 'https://api.green-coding.berlin/v1/hog/add', 'web_url': 'https://metrics.green-coding.berlin/hog-details.html?machine_uuid=', 'upload_data': True, 'resolve_coalitions': ['com.googlecode.iterm2', 'com.apple.terminal', 'com.vix.cron']}
[DEBUG] 2023-12-03 11:47:46,969 - Upload thread started
[DEBUG] 2023-12-03 11:47:46,969 - DB checker thread started
[DEBUG] 2023-12-03 11:47:46,969 - Ticker thread started
[INFO] 2023-12-03 11:47:46,970 - Starting powermetrics process: powermetrics --show-all -i 5000 -f plist
[INFO] 2023-12-03 11:47:47,050 - Uploading 10 rows to: https://api.green-coding.berlin/v1/hog/add
[DEBUG] 2023-12-03 11:47:51,938 - Uploaded.
[INFO] 2023-12-03 11:47:52,073 - Uploading 10 rows to: https://api.green-coding.berlin/v1/hog/add
[DEBUG] 2023-12-03 11:51:09,217 - Parsing new input
[DEBUG] 2023-12-03 11:51:09,346 - Data added to the DB
[INFO] 2023-12-03 11:51:09,346 - {'combined_energy': 3774, 'cpu_energy': 3726, 'gpu_energy': 48, 'ane_energy': 0, 'energy_impact': 2863}
[DEBUG] 2023-12-03 11:51:09,540 - DB Check
[DEBUG] 2023-12-03 11:51:09,632 - Power metrics running check
[DEBUG] 2023-12-03 11:51:14,375 - Parsing new input
[DEBUG] 2023-12-03 11:51:14,478 - Data added to the DB
[INFO] 2023-12-03 11:51:14,478 - {'combined_energy': 9657, 'cpu_energy': 9468, 'gpu_energy': 189, 'ane_energy': 0, 'energy_impact': 7556}
[DEBUG] 2023-12-03 11:51:19,481 - Parsing new input
[DEBUG] 2023-12-03 11:51:19,562 - Data added to the DB
[INFO] 2023-12-03 11:51:19,562 - {'combined_energy': 14164, 'cpu_energy': 13613, 'gpu_energy': 551, 'ane_energy': 0, 'energy_impact': 10617}
[DEBUG] 2023-12-03 11:51:24,625 - Parsing new input
[DEBUG] 2023-12-03 11:51:24,695 - Data added to the DB
[INFO] 2023-12-03 11:51:24,696 - {'combined_energy': 19772, 'cpu_energy': 18943, 'gpu_energy': 829, 'ane_energy': 0, 'energy_impact': 14064}
[DEBUG] 2023-12-03 11:51:29,692 - Parsing new input
[DEBUG] 2023-12-03 11:51:29,750 - Data added to the DB
[INFO] 2023-12-03 11:51:29,750 - {'combined_energy': 30182, 'cpu_energy': 28842, 'gpu_energy': 1340, 'ane_energy': 0, 'energy_impact': 21226}
[DEBUG] 2023-12-03 11:51:34,834 - Parsing new input
[DEBUG] 2023-12-03 11:51:34,890 - Data added to the DB
[INFO] 2023-12-03 11:51:34,890 - {'combined_energy': 38847, 'cpu_energy': 36999, 'gpu_energy': 1848, 'ane_energy': 0, 'energy_impact': 27390}

....

[DEBUG] 2023-12-03 12:31:26,931 - DB Check
[DEBUG] 2023-12-03 12:31:26,940 - Parsing new input
[DEBUG] 2023-12-03 12:31:27,015 - Power metrics running check
[DEBUG] 2023-12-03 12:31:27,054 - Data added to the DB
[INFO] 2023-12-03 12:31:27,055 - {'combined_energy': 1030542, 'cpu_energy': 961519, 'gpu_energy': 69023, 'ane_energy': 0, 'energy_impact': 660009}
[DEBUG] 2023-12-03 12:31:32,072 - Parsing new input
[DEBUG] 2023-12-03 12:31:32,124 - Data added to the DB
[INFO] 2023-12-03 12:31:32,125 - {'combined_energy': 1031471, 'cpu_energy': 962408, 'gpu_energy': 69063, 'ane_energy': 0, 'energy_impact': 660649}
[DEBUG] 2023-12-03 12:31:37,272 - Parsing new input
[DEBUG] 2023-12-03 12:31:37,319 - Data added to the DB
[INFO] 2023-12-03 12:31:37,319 - {'combined_energy': 1033139, 'cpu_energy': 964038, 'gpu_energy': 69101, 'ane_energy': 0, 'energy_impact': 661731}
[DEBUG] 2023-12-03 12:31:42,449 - Parsing new input
[DEBUG] 2023-12-03 12:31:42,500 - Data added to the DB
[INFO] 2023-12-03 12:31:42,501 - {'combined_energy': 1033865, 'cpu_energy': 964723, 'gpu_energy': 69142, 'ane_energy': 0, 'energy_impact': 662177}
[DEBUG] 2023-12-03 12:31:47,589 - Parsing new input
[DEBUG] 2023-12-03 12:31:47,647 - Data added to the DB
[INFO] 2023-12-03 12:31:47,647 - {'combined_energy': 1035421, 'cpu_energy': 966239, 'gpu_energy': 69182, 'ane_energy': 0, 'energy_impact': 663228}
[DEBUG] 2023-12-03 12:31:52,641 - Parsing new input
[DEBUG] 2023-12-03 12:31:52,685 - Data added to the DB
[INFO] 2023-12-03 12:31:52,685 - {'combined_energy': 1036295, 'cpu_energy': 967072, 'gpu_energy': 69223, 'ane_energy': 0, 'energy_impact': 663761}
[DEBUG] 2023-12-03 12:31:57,846 - Parsing new input
[DEBUG] 2023-12-03 12:31:57,900 - Data added to the DB
[INFO] 2023-12-03 12:31:57,901 - {'combined_energy': 1037642, 'cpu_energy': 968376, 'gpu_energy': 69266, 'ane_energy': 0, 'energy_impact': 664652}
[DEBUG] 2023-12-03 12:32:03,037 - Parsing new input
[DEBUG] 2023-12-03 12:32:03,082 - Data added to the DB
[INFO] 2023-12-03 12:32:03,082 - {'combined_energy': 1039071, 'cpu_energy': 969759, 'gpu_energy': 69312, 'ane_energy': 0, 'energy_impact': 665573}

....

[DEBUG] 2023-12-03 17:11:54,933 - Parsing new input
[DEBUG] 2023-12-03 17:11:54,992 - Data added to the DB
[INFO] 2023-12-03 17:11:54,992 - {'combined_energy': 4335394, 'cpu_energy': 4021331, 'gpu_energy': 314056, 'ane_energy': 7, 'energy_impact': 2893909}

Make table font fixed-width

Currently the table does not use a fixed-width font, which makes it very hard to compare the values.

[screenshot]

Hog crashes and is not restarted by OS

[INFO] 2023-10-24 18:16:21,363 - {'combined_energy': 770053, 'cpu_energy': 738248, 'gpu_energy': 31764, 'ane_energy': 41, 'energy_impact': 571755}
[INFO] 2023-10-24 18:31:28,981 - Uploading 10 rows to: https://api.green-coding.berlin/v1/hog/add
[ERROR] 2023-10-24 18:31:29,092 - No new data in DB. Exiting to be restarted by the os
[DEBUG] 2023-10-24 18:31:29,092 - DB Check ✅
[DEBUG] 2023-10-24 18:31:29,988 - Upload 👌
[DEBUG] 2023-10-24 18:31:31,677 - Program started 🎉
[DEBUG] 2023-10-24 18:31:31,677 - Using db: /Library/Application Support/berlin.green-coding.hog/db.db
[DEBUG] 2023-10-24 18:31:31,700 - Setting: {'powermetrics': 5000, 'upload_delta': 300, 'api_url': 'https://api.green-coding.berlin/v1/hog/add', 'web_url': 'https://metrics.green-coding.berlin/hog-details.html?machine_uuid=', 'upload_data': True, 'resolve_coalitions': ['com.googlecode.iterm2', 'com.apple.terminal', 'com.vix.cron']}
[DEBUG] 2023-10-24 18:31:31,700 - Upload thread started
[DEBUG] 2023-10-24 18:31:31,700 - DB checker thread started
[INFO] 2023-10-24 18:31:31,700 - Starting powermetrics process: powermetrics --show-all -i 5000 -f plist
[INFO] 2023-10-24 18:31:31,713 - Uploading 10 rows to: https://api.green-coding.berlin/v1/hog/add
[ERROR] 2023-10-24 18:52:34,941 - No new data in DB. Exiting to be restarted by the os
[DEBUG] 2023-10-24 18:52:34,997 - DB Check ✅
 % sudo launchctl list | grep berlin.green-coding.hog
29641	0	berlin.green-coding.hog
 % ps -ef | grep power
    0   327     1   0 Thu02AM ??         3:46.83 /System/Library/CoreServices/powerd.bundle/powerd
    0 29641     1   0  6:31PM ??         0:00.13 /Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/Resources/Python.app/Contents/MacOS/Python /usr/local/bin/hog/power_logger.py -v debug -o /tmp/hog.log

It looks like after a suspend the DB checker thread still has the old time and then kills the process. But why does the system come back up and then crash again right away?

Sprint 2

  • Correlate energy usage to battery
  • Add pylint
  • Look at why the values are not correct for CPU %
  • Look at why the overall power values are weird for tasks
  • Update the views in the hog every minute

Show subprocesses?

I'm not sure if this fits the ideas behind this tool.

When I look at my statistics, the view is not wrong, but limited (IMO).
[screenshot]

In my terminal-based setup, everything is grouped under the terminal emulator.

Is this how it should be?

Add grid intensity to the hog interface

We would like to show the grid intensity when you open the hog. This should enable the user to make an informed decision when to start energy hungry workloads.

  1. Look at which open and free APIs are out there for as many countries/regions as possible
  2. Implement an API endpoint that returns the energy mix for a given region
  3. Extend the hog Swift program to query that endpoint based on the location
  4. Display the grid intensity

Open questions:

  1. Should we get the location in the form of lat/long from macOS?
  2. Should the user set the location during the install?
  3. How do we expose the grid intensity without violating the license of the provider?

Links:


Make process coalitions configurable

On development machines it often happens that the shell turns up as the head of a process coalition, and as such all the energy impact and CPU time is accounted to the shell process. #11

There are multiple ways this could be resolved on the client. We show this data on the server.

  1. Add a list of coalition names to the settings file and then resolve them.
  2. Also save the subprocesses of coalitions and show them in the hog app view. (This will increase the local DB size quite a bit.)
