
figures's People

Contributors

bbrsofiane, bryanlandia, dependabot[bot], estherjsuh, grozdanowski, jfaman, johnbaldwin, melvinsoft, natea, omarithawi, ponytojas, shadinaif, thraxil


figures's Issues

traceback during populate_daily_metrics

found this traceback in the logs:

Traceback (most recent call last):
  File "/edx/app/edxapp/venvs/edxapp/src/figures/figures/tasks.py", line 110, in populate_daily_metrics
    force_update=force_update)
  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/celery/local.py", line 188, in __call__
    return self._get_current_object()(*a, **kw)
  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/celery/app/trace.py", line 439, in __protected_call__
    return orig(self, *args, **kwargs)
  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/celery/app/task.py", line 428, in __call__
    return self.run(*args, **kwargs)
  File "/edx/app/edxapp/venvs/edxapp/src/figures/figures/tasks.py", line 56, in populate_single_cdm
    course_id).load(date_for=date_for, force_update=force_update)
  File "/edx/app/edxapp/venvs/edxapp/src/figures/figures/pipeline/course_daily_metrics.py", line 335, in load
    data = self.get_data(date_for=date_for)
  File "/edx/app/edxapp/venvs/edxapp/src/figures/figures/pipeline/course_daily_metrics.py", line 279, in get_data
    date_for=date_for)
  File "/edx/app/edxapp/venvs/edxapp/src/figures/figures/pipeline/course_daily_metrics.py", line 261, in extract
    course_id, date_for,)
  File "/edx/app/edxapp/venvs/edxapp/src/figures/figures/pipeline/course_daily_metrics.py", line 178, in get_average_days_to_complete
    days_to_complete = get_days_to_complete(course_id, date_for)
  File "/edx/app/edxapp/venvs/edxapp/src/figures/figures/pipeline/course_daily_metrics.py", line 164, in get_days_to_complete
    days.append((cert.created_date - ce[0].created).days)
  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/django/db/models/query.py", line 289, in __getitem__
    return list(qs)[0]
IndexError: list index out of range
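The IndexError comes from indexing ce[0] when a certificate has no matching CourseEnrollment row. A minimal sketch of the guard in plain Python (helper and record names are made up for illustration; the real code indexes a Django queryset, which raises IndexError when empty):

```python
from datetime import date

def days_to_complete(cert_created, enrollment_created):
    # Hypothetical helper: return days between enrollment and
    # certificate, or None when either record is missing.
    if cert_created is None or enrollment_created is None:
        return None
    return (cert_created - enrollment_created).days

records = [
    (date(2020, 3, 1), date(2020, 1, 1)),  # normal cert/enrollment pair
    (date(2020, 4, 2), None),              # certificate with no enrollment row
]
# Skip incomplete records instead of crashing
days = [d for d in (days_to_complete(c, e) for c, e in records) if d is not None]
# days == [60]
```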

Upgrade tox lint to use Python 3.x

Currently the tox lint environment uses Python 2.7. Questions to decide: which Python 3.x version to use if we pick just one "tox lint" environment, or whether we should run lint in all of the 3.x environments.

Koa compatibility

I'm taking a look at whether the current 0.4.dev8 version could already be (or be made) compatible with Koa.

I'm pushing my work here and I'm using this fork of the tutor-figures plugin to install it on edX.

What works:

  • The active user count works
  • The enrolled user count worked after I ran python manage.py lms populate_figures_metrics --no-delay inside the LMS container

What doesn't work:

  • The course-specific numbers are all 0 or N/A

This is probably because when I run python manage.py lms populate_figures_metrics --no-delay this error occurs many times:

2021-02-17 16:24:04,984 ERROR 77 [figures.tasks] [user None] [ip None] tasks.py:110 - FIGURES:PIPELINE:DAILY:SITE:COURSE:FAIL:populate_daily_metrics_for_site. site_id:4, date_for:2021-02-17. course_id:course-v1:Totem+TP_SM_FA+001 exception:["'None' value must be a decimal number."]
Traceback (most recent call last):
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 554, in update_or_create
    obj = self.select_for_update().get(**kwargs)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 406, in get
    raise self.model.DoesNotExist(
figures.models.CourseDailyMetrics.DoesNotExist: CourseDailyMetrics matching query does not exist.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/fields/__init__.py", line 1560, in to_python
    return decimal.Decimal(value)
decimal.InvalidOperation: [<class 'decimal.ConversionSyntax'>]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/openedx/venv/src/figures/figures/tasks.py", line 103, in populate_daily_metrics_for_site
    populate_single_cdm(course_id=course_id,
  File "/openedx/venv/lib/python3.8/site-packages/celery/local.py", line 191, in __call__
    return self._get_current_object()(*a, **kw)
  File "/openedx/venv/lib/python3.8/site-packages/celery/app/task.py", line 393, in __call__
    return self.run(*args, **kwargs)
  File "/openedx/venv/src/figures/figures/tasks.py", line 65, in populate_single_cdm
    cdm_obj, _created = CourseDailyMetricsLoader(
  File "/openedx/venv/src/figures/figures/pipeline/course_daily_metrics.py", line 352, in load
    return self.save_metrics(date_for=date_for, data=data)
  File "/opt/pyenv/versions/3.8.6/lib/python3.8/contextlib.py", line 75, in inner
    return func(*args, **kwds)
  File "/openedx/venv/src/figures/figures/pipeline/course_daily_metrics.py", line 310, in save_metrics
    cdm, created = CourseDailyMetrics.objects.update_or_create(
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/manager.py", line 82, in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 559, in update_or_create
    obj, created = self._create_object_from_params(kwargs, params, lock=True)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 575, in _create_object_from_params
    obj = self.create(**params)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 422, in create
    obj.save(force_insert=True, using=self.db)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/base.py", line 743, in save
    self.save_base(using=using, force_insert=force_insert,
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/base.py", line 780, in save_base
    updated = self._save_table(
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/base.py", line 873, in _save_table
    result = self._do_insert(cls._base_manager, using, fields, update_pk, raw)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/base.py", line 910, in _do_insert
    return manager._insert([self], fields=fields, return_id=update_pk,
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/manager.py", line 82, in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 1186, in _insert
    return query.get_compiler(using=using).execute_sql(return_id)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1376, in execute_sql
    for sql, params in self.as_sql():
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1318, in as_sql
    value_rows = [
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1319, in <listcomp>
    [self.prepare_value(field, self.pre_save_val(field, obj)) for field in fields]
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1319, in <listcomp>
    [self.prepare_value(field, self.pre_save_val(field, obj)) for field in fields]
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1260, in prepare_value
    value = field.get_db_prep_save(value, connection=self.connection)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/fields/__init__.py", line 1569, in get_db_prep_save
    return connection.ops.adapt_decimalfield_value(self.to_python(value), self.max_digits, self.decimal_places)
  File "/openedx/venv/lib/python3.8/site-packages/django/db/models/fields/__init__.py", line 1562, in to_python
    raise exceptions.ValidationError(
django.core.exceptions.ValidationError: ["'None' value must be a decimal number."]

I assume this error happens for all the courses. I hope to be able to figure out why tomorrow. In the meantime, any advice would definitely help!
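For what it's worth, the ValidationError means a non-nullable DecimalField on CourseDailyMetrics received None from the pipeline. A hedged sketch of a coalescing guard before update_or_create (the helper name and the field names in the example dict are assumptions, not the actual Figures fix):

```python
from decimal import Decimal

def coalesce_decimal(value, default=Decimal("0.00")):
    # Hypothetical guard: a non-nullable DecimalField must never
    # receive None, so substitute a default before saving.
    return default if value is None else Decimal(str(value))

data = {"average_progress": None, "average_days_to_complete": "4.5"}
cleaned = {k: coalesce_decimal(v) for k, v in data.items()}
# cleaned["average_progress"] == Decimal("0.00")
```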

Could figures be used on Python 3.8.5?

Hi,

I am using python 3.8.5.

In my code, the following import encounters this issue:

from figures import set_limits, plot_coords, plot_bounds, plot_line_issimple

Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2020.3.1\plugins\python-ce\helpers\pydev\pydevd.py", line 1477, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2020.3.1\plugins\python-ce\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "C:/doc/code_python/har/annotation/linestring.py", line 4, in <module>
    from figures import set_limits, plot_coords, plot_bounds, plot_line_issimple
ImportError: cannot import name 'set_limits' from 'figures' (C:\ProgramData\Anaconda3\lib\site-packages\figures\__init__.py)

Hawthorn GeneratedCertificate Mock is wrong

As you discovered, @estherjsuh, the GeneratedCertificate mock is wrong. In Hawthorn, the mock code is missing the on_delete=models.CASCADE argument:

This is the production model:

https://github.com/appsembler/edx-platform/blob/appsembler/tahoe/master/lms/djangoapps/certificates/models.py#L233

So we need to fix this for the Hawthorn mock.

The Juniper mock looks correct: https://github.com/appsembler/figures/blob/master/mocks/juniper/lms/djangoapps/certificates/models.py#L11

Address Tox configuration tech debt

Right now tox uses two separate requirements files that are duplicated except for edx-organizations. Also, I'm just starting to use tox, so I'm sure there are other maintainability improvements that can be made.

Ficus - incompatible edx-django-oauth2-provider

@bryanlandia mentioned

I installed the appsembler/amc branch of edx-organizations, but its requirement for edx-django-oauth2-provider>=1.2.0 is not compatible with Ficus, and it does appear that for our needs on [this specific server] 1.1.4 is fine, so I included an extra requirement in configs for that version, to override the edx-organizations dependency.

Make the integration with Open edX more seamless

Going through the README I see several requests to edit the settings. I'd like to suggest better alternatives:

pip install figures
That's great! It's much faster than git!


	"ADDL_INSTALLED_APPS": [
		"figures"
	]

This is acceptable practice, especially if there are migrations (as in this case).


	"FEATURES": {
		... 
		"ENABLE_FIGURES": true,
		...
	}

In my opinion this is redundant: since we're adding the app to ADDL_INSTALLED_APPS, there's no need for a feature flag; it's not like a feature inside the edX platform itself.


from figures.settings import FIGURES

Please avoid this pattern. There are about 200 dependencies/apps in the edX Platform and almost none of them uses it. It is problematic because Figures depends on the settings, and I don't see any reason why the settings should depend on Figures.


	if FEATURES.get('ENABLE_FIGURES'):
		from figures.settings import FIGURES

Same goes here.

if settings.FEATURES.get('ENABLE_FIGURES'):
    	urlpatterns += (
    		url(r'^figures/',
    		    include('figures.urls', namespace='figures')),
    	)

That is, replace if settings.FEATURES.get('ENABLE_FIGURES'): with if 'figures' in settings.INSTALLED_APPS:
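A runnable sketch of the suggested check (the settings class here is an illustrative stand-in for django.conf.settings, and the appended string stands in for the url()/include() call shown above):

```python
class _Settings:
    # Stand-in for django.conf.settings; in edx-platform, 'figures'
    # would arrive here via ADDL_INSTALLED_APPS.
    INSTALLED_APPS = ["django.contrib.admin", "figures"]

settings = _Settings()

urlpatterns = []
if "figures" in settings.INSTALLED_APPS:
    # In the real urls.py this would be:
    #   urlpatterns += [url(r'^figures/', include('figures.urls', namespace='figures'))]
    urlpatterns.append("figures/")
```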


The way static files are integrated should be completely decoupled from the edX Platform, such that:

  1. The platform should know nothing about how the static files are managed.
  2. Figures should know nothing about how the platform manages its static files.
  3. Static assets such as compiled js and css files should be committed to the git repo.
  4. Figures views should serve those files; the XBlock ResourceLoader is a good example of such a pattern.

Migration error

I followed the instructions to install Figures on Ironwood, but at the final step I encounter an error.
With the command "python.edxapp /edx/bin/manage.edxapp lms migrate figures" I get the error "django.db.utils.OperationalError: unable to open database file"

With the command "python.edxapp /edx/bin/manage.edxapp lms syncdb figures" I get the error "manage.edxapp syncdb: error: unrecognized arguments: figures"

With the command "python.edxapp /edx/bin/manage.edxapp lms syncdb --settings aws figures" I get the error "AttributeError: 'module' object has no attribute 'update_settings'"

How can I fix it?

Add useful docstrings to views module

This is an early documentation win for the community to give everyone more insight into using the REST API

  • endpoints
  • data returned
  • filtering (query params)

But we'll also want to coordinate this update with enabling more authentication options for the views.

Intermittent test failure with TestSiteDailyMetricsExtractor

This test fails intermittently:

___________________________________________ TestSiteDailyMetricsExtractor.test_extract ___________________________________________

self = <tests.pipeline.test_site_daily_metrics.TestSiteDailyMetricsExtractor object at 0x10a7ab6d0>

    def test_extract(self):
        expected_results = dict(
            cumulative_active_user_count=65,
            todays_active_user_count=15,
            total_user_count=3,
            course_count=len(CDM_INPUT_TEST_DATA),
            total_enrollment_count=150,
        )
        actual = pipeline_sdm.SiteDailyMetricsExtractor().extract(
            date_for=self.date_for)
        for key, value in expected_results.iteritems():
>           assert actual[key] == value, 'failed on key: "{}"'.format(key)
E           AssertionError: failed on key: "total_user_count"
E           assert 2 == 3

tests/pipeline/test_site_daily_metrics.py:177: AssertionError

Fix naive date warnings in tests

Running the automated tests outputs naive date warnings. Sample console output:

RuntimeWarning: DateTimeField GeneratedCertificate.created_date received a naive datetime (2018-04-01 00:00:00) while time zone support is active
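The fix is typically to make the test fixture datetimes timezone-aware. A minimal stdlib sketch (a Django project would usually use django.utils.timezone.make_aware or timezone.now() instead):

```python
from datetime import datetime, timezone

naive = datetime(2018, 4, 1)                # the kind of value that triggers the warning
aware = naive.replace(tzinfo=timezone.utc)  # timezone-aware equivalent

# aware carries tzinfo, so DateTimeField no longer warns when
# time zone support (USE_TZ) is active
```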

Add installation instructions

Issue formerly: "Add instructions for custom lms.env.json"
edx-figures requires new vars to be loaded from the lms.env.json file. Please see here:

Until/unless we get a 'vendor additions' PR implemented upstream, we'll want to add instructions for implementors to update their fork of edx-platform to enable the custom variables required by edx-figures

Mock data generator may fail with out of range error

The seed_data management command failed during the Figures workshop (Open edX 2019 Conference). I re-ran it and it passed. Unfortunately, I closed the terminal window before creating this issue, so I can't paste the stack trace. The failing function is seed_student_module in the devsite.seed module.

Support Ironwood

It seems that Figures does not yet support Ironwood. Any guidance on implementing it, please? I want to contribute if possible.

/figures/api/users/detail/?enrolled_in_course_id=course-v1:edX+DemoX+Demo_Course 404

URL: http://192.168.56.10/figures/course/course-v1:edX+DemoX+Demo_Course

==> /edx/var/log/nginx/access.log <==
- - 127.0.0.1 - edx [15/Oct/2018:00:37:35 +0000]  "GET /xqueue/get_queuelen/?queue_name=certificates HTTP/1.1" 200 61 0.104 "-" "python-requests/2.3.0 CPython/2.7.12 Linux/4.4.0-31-generic"

==> /edx/var/log/lms/edx.log <==
Oct 15 00:37:39 vagrant [service_variant=lms][track.contexts][env:sandbox] WARNING [vagrant  9652] [contexts.py:31] - unable to parse course_id "detail/course-v1:edX+DemoX+Demo_Course"
Traceback (most recent call last):
 File "/edx/app/edxapp/edx-platform/common/djangoapps/track/contexts.py", line 25, in course_context_from_url
   course_id = SlashSeparatedCourseKey.from_deprecated_string(course_id_string)
 File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/opaque_keys/edx/locations.py", line 49, in from_deprecated_string
   return CourseLocator.from_string(serialized)
 File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/opaque_keys/__init__.py", line 197, in from_string
   return cls.deprecated_fallback._from_deprecated_string(serialized)
 File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/opaque_keys/edx/locator.py", line 379, in _from_deprecated_string
   raise InvalidKeyError(cls, serialized)
InvalidKeyError: <class 'opaque_keys.edx.locator.CourseLocator'>: detail/course-v1:edX+DemoX+Demo_Course

==> /edx/var/log/nginx/access.log <==
- - 192.168.56.1 - - [15/Oct/2018:00:37:40 +0000]  "GET /figures/api/courses/detail/course-v1:edX+DemoX+Demo_Course/ HTTP/1.1" 200 335 0.287 "http://192.168.56.10/figures/course/course-v1:edX+DemoX+Demo_Course" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36"
- - 127.0.0.1 - edx [15/Oct/2018:00:37:40 +0000]  "GET /xqueue/get_queuelen/?queue_name=certificates HTTP/1.1" 200 61 0.119 "-" "python-requests/2.3.0 CPython/2.7.12 Linux/4.4.0-31-generic"

==> /edx/var/log/lms/edx.log <==
Oct 15 00:37:40 vagrant [service_variant=lms][openedx.core.djangoapps.content.block_structure.store][env:sandbox] INFO [vagrant  9658] [store.py:176] - BlockStructure: Read from cache; <openedx.core.djangoapps.content.block_structure.store.StubModel object at 0x7fc16df21d10>, size: 27906
Oct 15 00:37:40 vagrant [service_variant=lms][lms.djangoapps.grades.new.course_grade_factory][env:sandbox] INFO [vagrant  9658] [course_grade_factory.py:226] - Grades: Update, Course: course_key: course-v1:edX+DemoX+Demo_Course, version: 5bb02b9e636239465f78ab68, edited_on: 2018-09-30 01:49:18.055000+00:00, grading_policy: 2HDb6cWz6xgUQ8b9YhsgN5sukP0=, User: 4, Course Grade: percent: 0.0, letter_grade: None, passed: False, persisted: False
Oct 15 00:37:40 vagrant [service_variant=lms][openedx.core.djangoapps.content.block_structure.store][env:sandbox] INFO [vagrant  9658] [store.py:176] - BlockStructure: Read from cache; <openedx.core.djangoapps.content.block_structure.store.StubModel object at 0x7fc16e057090>, size: 27906
Oct 15 00:37:41 vagrant [service_variant=lms][lms.djangoapps.grades.new.course_grade_factory][env:sandbox] INFO [vagrant  9658] [course_grade_factory.py:226] - Grades: Update, Course: course_key: course-v1:edX+DemoX+Demo_Course, version: 5bb02b9e636239465f78ab68, edited_on: 2018-09-30 01:49:18.055000+00:00, grading_policy: 2HDb6cWz6xgUQ8b9YhsgN5sukP0=, User: 5, Course Grade: percent: 0.0, letter_grade: None, passed: False, persisted: False

==> /edx/var/log/nginx/access.log <==
- - 192.168.56.1 - - [15/Oct/2018:00:37:41 +0000]  "GET /figures/api/users/detail/?enrolled_in_course_id=course-v1:edX+DemoX+Demo_Course HTTP/1.1" 404 57 1.285 "http://192.168.56.10/figures/course/course-v1:edX+DemoX+Demo_Course" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36" (edited)

Months may be skipped in monthly metrics histories when last day of first month in series is greater than in a subsequent month

It appears that we never get any metrics for February (unless, presumably, it's the current month). This affects all types of metrics.


Looks like the problem is in metrics.previous_months_iterator

>>> from datetime import datetime
>>> date_for = datetime.utcnow().date()
>>> from figures import helpers, metrics
>>> date_for = helpers.as_date(date_for)
>>> date_for
datetime.date(2021, 5, 31)
>>> metrics.previous_months_iterator(month_for=date_for, months_back=6)
<generator object previous_months_iterator at 0x7fbc30ec0190>
>>> [mon for mon in metrics.previous_months_iterator(month_for=date_for, months_back=6)]
[(2020, 11, 30), (2020, 12, 31), (2021, 1, 31), (2021, 3, 31), (2021, 4, 30), (2021, 5, 31)]
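The output above jumps from (2021, 1, 31) straight to (2021, 3, 31): carrying the start month's day (31) into every month can never land on a valid February date. A sketch of an iterator that computes each month's own last day with calendar.monthrange (not the actual Figures implementation; the exact boundary semantics are an assumption):

```python
import calendar

def previous_months(year, month, months_back):
    # Walk backward month by month, recording each month's own last
    # day so short months like February are never skipped.
    out = []
    y, m = year, month
    for _ in range(months_back):
        out.append((y, m, calendar.monthrange(y, m)[1]))
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    return list(reversed(out))

months = previous_months(2021, 5, 6)
# [(2020, 12, 31), (2021, 1, 31), (2021, 2, 28),
#  (2021, 3, 31), (2021, 4, 30), (2021, 5, 31)]
```

Note that February 2021 appears with its own last day (28) instead of being skipped.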

Refactor requirements files to minimize packages needed to run tests

We want only the packages needed for each of the test environments. Then we have a "_dev.txt" requirements file that does a -r <named-release>_test.txt and includes the packages not needed for tests but needed for dev.

Like: "ginkgo.txt" -> "ginkgo_test.txt" and "ginkgo_dev.txt"
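For example, the dev requirements file could be as small as this (the package name below is a placeholder, not a proposed list):

```
# ginkgo_dev.txt (sketch)
-r ginkgo_test.txt    # everything the test environment installs
ipython               # dev-only tooling not needed by the tests
```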

After refactoring, then enable the OpenAPI feature on devsite by default. Read Omar's comment here: #292

Look into removing "path" from webpack-stats.json

We have or will have two persistent webpack-stats.json files. One for production and one for testing.

BundleTracker adds a "path" var into the webpack stats file. This is the absolute path to the generated resources. We want either to omit this var if production and test don't need it, or to make it a path relative to the package root or project root.

Planning Figures Juniper upgrade (Python 3, Django 2)

Here are my thoughts for doing the Figures Juniper upgrade:

Overview

The core parts of Figures:

  • Figures Django reusable app Python package - ./figures/ directory. This is where the main application code lives
  • Figures React JavaScript front end - ./frontend
  • Platform mocks - ./mocks/ directory - Each subdirectory here maps to an Open edX release
  • Tests - ./tests/ directory - This is the home for the unit tests
  • Figures development server and settings - ./devsite directory. This is a minimal server configuration. There are two settings files. One to run the development environment (web server, Django shell). The other to run the unit tests

And then there are the configuration files for pytest, tox, codecov in the project root

Suggestion for approach

We would like to do this incrementally with a number of small PRs rather than a huge PR.

First, this is the first real effort for community contribution to Figures. I want to make sure community members get the most value for their development time. Starting small means we can have "small miscommunications" instead of "big miscommunications" and reduce risk of rejecting contributions because of a misunderstanding.

Second, we must maintain backward compatibility. We still have systems on Ginkgo and Hawthorn running Figures. I am NOT maintaining multiple branches for different versions.

Over my career, I've learned a single codebase to support multiple targets is more maintainable and less costly in the long run than keeping different codebases synchronized. Where one finds there are hard incompatibilities that necessitate branching, that is a cue to pull that incompatible functionality out of the core code and into an adapter/plugin.

Steps

I suggest we approach the upgrade in the following sequence so we can "layer in" Juniper support with minimal initial changes to existing files:

  1. Add a new ./devsite/requirements/juniper_community.txt Pip requirements file.

Copy the hawthorn_community.txt file and update the versions to those in Juniper. There might be new packages to add, and there might be packages no longer needed; we can discuss as we find them.

  2. Create a pytest-juniper.ini file in the project root. Just as we have a special pytest.ini file for Ginkgo, we'll start with one for Juniper.

See the Ginkgo file for reference: https://github.com/appsembler/figures/blob/master/pytest-ginkgo.ini

  3. Create Juniper mocks

3.a. Create a new ./mocks/juniper/ directory, copying from ./mocks/hawthorn

3.b. Review platform changes made to the models and supporting code in the mocks.

For example, look over the student module in the Figures mocks (https://github.com/appsembler/figures/blob/master/mocks/hawthorn/student/models.py). What changes were made in the platform's Juniper upgrade?

https://github.com/edx/edx-platform/blob/open-release/juniper.master/common/djangoapps/student/models.py

We really only want to mock what we absolutely need to.

The Juniper mocks are probably going to be one of the bigger tasks for the Django 2/Python 3 upgrade. Although this is a lot of effort, I find the mocks to be well worth it. They help speed up development and make integration with the platform more purposeful.

Question: Why not just mock everything in the unit tests with Mock?
Answer: The platform mocks let us run Figures development server so we can interact and explore Figures standalone without requiring the platform to be installed on the developer's system. There are other reasons, but that's a key one.

  4. Update test settings (optionally also update the devsite development server settings too)

Up for discussion on the approach here. At a minimum, we need to update ./devsite/devsite/test_settings.py or we need to create a new test settings file (copy existing test_settings.py and modify) for Juniper testing.

It might be helpful to just update the whole devsite so we have a working sandbox for Figures. Note we need to maintain backward compatibility.

Updating settings:

We have two basic options here:

Option A. Add a new juniper_settings.py for the development web server and juniper_test_settings.py for the pytest unit tests

Option B. Update the existing settings.py and test_settings.py files to support Juniper while maintaining backward compatibility

Let's discuss which approach when we see what needs to be changed. What I suggest is we do a draft PR modifying the existing settings to identify the required changes. From this PR, we'll know which direction we want to go in for maintainability.

  5. Add a Juniper environment to ./tox.ini

https://github.com/appsembler/figures/blob/master/tox.ini#L31-L39

  6. Update ./figures/ and ./tests

IMPORTANT: We must maintain backward compatibility. We use the figures.compat module to serve as a compatibility layer to abstract platform release specifics from the rest of Figures code. See how it currently works:

https://github.com/appsembler/figures/blob/master/figures/compat.py

Front end

The frontend code has no Python and therefore should not need to be updated for the Python 3 upgrade.

Figures Questions (Candidates for updating docs or a FAQ)

In openedx slack #figures channel (slack link: https://openedx.slack.com/archives/CD0H6H8P5/p1594795591093400), I was asked:

  1. Can Figures work without Appsembler's edx-platform fork? I mean with the Open edX version of edx-platform?

  2. Is Figures scalable? If yes, then how much? Do we have any metrics from a stress test?

  3. Do we need an intermediary cloud data warehouse in the future to keep it working for a huge pool of users?

  1. Can Figures work without Appsembler's edx-platform fork? I mean with the Open edX version of edx-platform?

Figures should be able to work on the upstream/community (Open edX) Hawthorn release (as of July 2020). If Figures does not work with the upstream/community Hawthorn release of Open edX, please open a ticket in the GitHub issues (here: https://github.com/appsembler/figures/issues) and ping John in the openedx Slack #figures channel.

For the releases of Open edX supported: We know we need to get Figures upgraded to work on Juniper, but I have other work that I need to address for our customers first, primarily with improving the API, metrics served and response performance.

It is important to note that multisite support for Figures has been coded to use Appsembler's fork.

  2. Is Figures scalable? If yes, then how much? Do we have any metrics from a stress test?

Figures scalability is directly dependent on the scalability of the LMS infrastructure running it. Figures is designed to use the existing LMS infrastructure: MySQL; Celery for async data processing jobs, like the daily metrics extraction and aggregation; and the capacity of the server hosting the app server.

With that, one of my main focuses right now is performance improvements to make Figures more performant, such as making API queries (Django QuerySet queries) more efficient, aggregating data to reduce the need for live queries on built-in LMS models (like courseware.models.StudentModule), and scaling out the pipeline Celery tasks while ensuring resiliency against failed Celery tasks.

"Stress test metrics": nothing formal. As Appsembler's Tahoe data grow, we find API performance issues and address them, so our stress testing is using Figures in production. I don't have any formal metrics I can release at this time.

I am working piece by piece on building a development environment that can do stress testing with synthetic data. I just released an early version of Celery on Docker for the Figures development environment, "devsite":

https://github.com/appsembler/figures/blob/master/devsite/README.md

In the backlog is getting a MySQL docker container option implemented, then incrementally improve synthetic data generation.

  3. Do we need an intermediary cloud data warehouse in the future to keep it working for a huge pool of users?

This is an open question. Appsembler is committed to enabling Figures to work on small deployments (which it does now) for the members of the community who need to deploy standalone servers. As I mentioned above, Figures uses available LMS resources. This helps Figures scale as its underlying infrastructure scales. We are also committed to our customers, which means that Figures also has to scale to meet our customer needs in a multi-tenant platform.

Pytest is missing devsite .env file and default Open edX release

Issue #1: pytest fails because devsite/devsite/.env does not exist

Running pytest fails with the following:

$ pytest
/home/jbaldwin/.pyenv/versions/3.5.10/envs/figpy35/lib/python3.5/site-packages/environ/environ.py:630: UserWarning: /home/jbaldwin/work/appsembler/ws/figures/devsite/devsite/.env doesn't exist - if you're not configuring your environment separately, create one.
  "environment separately, create one." % env_file)

Tox does work: tox calling pytest succeeds because tox defines the environment variables that the devsite settings file expects.

The package used is here: https://github.com/joke2k/django-environ

It is unfortunate that the package seems not to be actively maintained. However, we use it only for the Figures devsite and not in production.

Issue #2: There is no default Open edX release defined. This can be seen if one creates an empty devsite/devsite/.env file.

The .env file loads environment variables that are then read in the devsite settings.py and test_settings.py files

There are options to fix this (here are just some):

  1. Improve the error message if .env is missing
  2. Add a default .env file with a default Open edX release
  3. For the test_settings.py, use a test.env file
  4. Remove use of django-environ and implement an alternate strategy for setting environment variables
  5. Dynamically create a .env file with default settings if one does not already exist

TODO: dig into the options and choose:

  1. A quick fix for now
  2. A long-term solution

Fix sites handling

The new site-filtering addition breaks devsite in multisite mode. There may be community installation issues too.

https://github.com/appsembler/figures/blob/master/figures/sites.py#L312

  File "/Users/jbaldwin/work/appsembler/repos/figures/devsite/devsite/seed.py", line 476, in seed_all
    backfill_figures_ed()
  File "/Users/jbaldwin/work/appsembler/repos/figures/devsite/devsite/seed.py", line 415, in backfill_figures_ed
    for site in get_sites():
  File "/Users/jbaldwin/work/appsembler/repos/figures/figures/sites.py", line 312, in get_sites
    sites_backend_path = settings.ENV_TOKENS['FIGURES'].get('SITES_BACKEND')
KeyError: 'FIGURES'
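One defensive way to avoid the KeyError is to treat both ENV_TOKENS and its FIGURES key as optional. The following is a hypothetical rewrite of that lookup, not the actual patch; the function name is illustrative.

```python
# Hypothetical defensive rewrite of the lookup that raises above:
# fall back to None when ENV_TOKENS or its 'FIGURES' key is absent.
def get_sites_backend_path(settings):
    """Return the configured SITES_BACKEND path, or None if not configured."""
    figures_tokens = getattr(settings, 'ENV_TOKENS', {}).get('FIGURES', {})
    return figures_tokens.get('SITES_BACKEND')
```

With this shape, devsite (which has no ENV_TOKENS['FIGURES']) would fall through to the default sites behavior instead of crashing during seeding.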

Where is update_settings?

On a Hawthorn production server I ran
pip install -e git+https://github.com/appsembler/figures.git#egg=figures
and set all the related settings. But Django raises an error:

  File "/edx/app/edxapp/edx-platform/lms/envs/aws.py", line 1137, in <module>
    figures.update_settings(
AttributeError: 'module' object has no attribute 'update_settings'

Then I tried to find update_settings:
grep -r -l -m 1 'update_settings' venvs/edxapp/src/figures/figures/
returned an empty result.
When I installed from PyPI (pip install figures) and ran migrate, I got a different error:

  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/figures/urls.py", line 8, in <module>
    from figures import views
  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/figures/views.py", line 29, in <module>
    from .filters import (
  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/figures/filters.py", line 50, in <module>
    class CourseEnrollmentFilter(django_filters.FilterSet):
  File "/edx/app/edxapp/venvs/edxapp/local/lib/python2.7/site-packages/figures/filters.py", line 55, in CourseEnrollmentFilter
    course_id = django_filters.MethodFilter(action='filter_course_id')
AttributeError: 'module' object has no attribute 'MethodFilter'

Backfill monthly metrics for site test fails

Figures tests/test_backfill.py::test_backfill_monthly_metrics_for_site fails because the number of months expected to backfill does not match the number actually backfilled. See the following:

>       backfilled = backfill_monthly_metrics_for_site(site=site, overwrite=True)
E       AssertionError: assert 6 == 5
E        +  where 6 = len([{'created': True, 'dt': datetime.datetime(2020, 11, 1, 0, 0, tzinfo=<UTC>), 'obj': <SiteMonthlyMetrics: id:1, month_f...ime(2021, 4, 1, 0, 0, tzinfo=<UTC>), 'obj': <SiteMonthlyMetrics: id:6, month_for:2021-04-01, site:site-3.example.com>}])
E        +  and   5 = len([{'month': datetime.datetime(2020, 11, 29, 13, 7, 24, tzinfo=<UTC>), 'month_sm': [<StudentModule: StudentModule object...entModule object>, <StudentModule: StudentModule object>, <StudentModule: StudentModule object>, ...], 'sm_count': 14}])

tests/test_backfill.py:104: AssertionError

This code is NOT run as part of normal daily or monthly pipeline job execution. Therefore, I'm marking the test with xfail to avoid blocking the Figures fixes for Juniper. We DO need to address it, but the purpose of this function, and of the Django management command that drives it, is backfilling MAU history for a new Figures deployment.

xfail added to this test in this PR: #355

https://appsembler.atlassian.net/browse/RED-2173
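For reference, marking a test as an expected failure in pytest looks like the sketch below; the reason string and the test body are placeholders, not the actual code from tests/test_backfill.py.

```python
import pytest

# Sketch of the xfail marking described above; the test body here is a
# placeholder reproducing the mismatch, not the real backfill test.
@pytest.mark.xfail(reason="backfilled month count mismatch; see RED-2173")
def test_backfill_monthly_metrics_for_site():
    raise AssertionError("assert 6 == 5")
```

An xfail-marked test still runs; it is reported as XFAIL when it fails and XPASS if it unexpectedly starts passing, so the underlying issue stays visible until it is fixed.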

Add basic integration tests with edx/edx-platform

The goal here is to ensure the bare minimum integration tests including:

  • Both of the edX Platform's and edX Figures database migrations work
  • URLs are callable from within the platform
  • The default configuration works out of the box
  • That's about it!

Choosing edx/edx-platform over appsembler/edx-platform is for a few reasons:

  • We do the manual testing on our fork, so we have this by default.
  • We want community adoption.
  • We want to minimize changes to our fork.
