Finish the Labs - Ansible based Lab Grading and Solving Platform
FTL Overview
FTL, Finish The Labs, is currently a student-invoked automated lab grading system
designed to allow students to self-mark their OPENTLC labs and see whether they
have completed them correctly. Other uses include lab and lab environment validation.
On failure it highlights the failed sections via instructor-supplied error
messages. Students have unlimited attempts and can run the grading at any time,
without penalty.
On success it:
Informs the student
Updates the LMS (at this point the grade-update-lms role is a stub)
toc::[]
├── README.adoc # This document
├── ansible.cfg # Default ansible.cfg - can be overridden
├── courses/ # Directory containing courses and their graders and solvers
├── docs/ # Documentation - see below for overview
├── main.yml # Main entry point to grade or solve a lab
├── roles/ # Top level roles directory, can be supplemented at a course/lab level
├── vars/ # Global *common* vars, reports, etc.
└── devel # developer tools, data, scripts, playbooks
Deployer Guide - Deploying, Installing, and Executing Lab Tests
Terminology
A small number of terms are defined below:
grader host - Host the student works from, typically bastion or workstation
graders - Graders are playbooks that grade a lab; they are typically invoked via the grade_lab command, which is installed at deployment time.
solvers - Solvers are playbooks that solve labs, i.e. they apply all the Ansible tasks necessary to complete a lab.
grader roles - Grader roles, in the roles directory, provide pre-built roles for testing specific, and common, scenarios e.g.
grader_check_command_output
grader_check_file_exists
grader_check_ocp_node_exists
grader_check_ocp_route_exists
grader_check_ocp_svc_exists
grader_check_package_installed
grader_check_service_enabled
grader_check_service_started
grader_check_user_exists
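As a sketch, invoking one of these roles from a grader playbook might look like the following. The section_number and student_error_message variables appear elsewhere in this README; the user parameter name is an assumption, so consult the role's defaults for the actual interface.

[source,yaml]
----
# Hypothetical sketch - the "user" variable name is an assumption,
# not a confirmed parameter of grader_check_user_exists.
- name: Section 2.1, Check user bar exists
  include_role:
    name: grader_check_user_exists
  vars:
    section_number: 2.1
    student_error_message: User bar does not exist
    user: bar   # assumed parameter name
----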
Workflow - Deployment
FTL can be deployed via push, i.e. inserted into a lab at the deployment
stage. An obvious example is insertion via the Ansible Agnostic Deployer in the
Post Software stage of a config.
Note
FTL currently clones the entire repo, i.e. all tests for all labs, so no
additional metadata needs to be supplied at deployment time.
Alternatively it could be modified to be deployed via pull, i.e. the student
clones the FTL repo locally. This would require either a permissions change on
the repo or an API token/login to be provided, as FTL is a private repo today.
Workflow - Developing a Lab Test
Instructors create a series of lab-checking playbooks in the correct sub-directory
of the courses directory. Tests are written in Ansible and check one or more
conditions in the lab. For example:
Has package foo been installed?
Does user bar exist?
Does a file contain a certain line?
Does a URI give a certain response, e.g. 200, or a certain output?
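For instance, the URI condition above can be hand-written as a plain Ansible task using the stock uri module; a minimal sketch, assuming a hypothetical lab app listening on localhost:8080:

[source,yaml]
----
- name: Check the lab web app answers with HTTP 200
  uri:
    url: "http://localhost:8080/"   # lab-specific URL, hypothetical
    status_code: 200                # the task fails if the response differs
----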
Whilst instructors can create tests from scratch, some convenience roles have
been created to both test conditions and track the score; these are detailed in the
Lab Grading Author Guide
Sample of a typical lab test leveraging an existing role:

[source,yaml]
----
- name: Section 1.5, Check student has installed ansible
  include_role:
    name: grader_check_package_installed
  vars:
    section_number: 1.5
    student_error_message: Package not installed
    package: ansible
----
Instructors can author custom, and more complex, tests as long as these can be
executed in an Ansible playbook. See the Documentation for more details.
Usage
Students are supplied with grade_lab and solve_lab wrapper scripts, in /usr/local/bin, that typically take two arguments, course name and lab number:
$ grade_lab ansible_engine_foundations 03_01
Questions:
Do students invoke the check via:
CLI on the lab machine
Via button in the LMS
Does the check action pull in the playbook(s) or are they already there?
Change the grader to allow G or Gi for storage quota sizes. Even though we usually use G for storage (and Gi for memory), the documentation actually shows Gi for the storage quota.
Add a conditional at the beginning of the grader to check for the presence of the projects being checked. If they exist, delete them; otherwise the grader will fail because it can't create the project from scratch and pick up the contents of the project request template.
In the default CPU check we are looking for the first object in the list, but if the student creates the limits in a different order, the check fails when it shouldn't.
These checks need to be more robust since not everyone copies and pastes from the docs.
[source,yaml]
----
- name: Check if the default CPU request is 500m in the LimitRange
  include_role:
    name: grader_check_ocp_resource
  vars:
    resource_kind: LimitRange
    resource_namespace: george-test
    resource_name: project-limits
    resource_definition_checks:
      - error_message: "Default CPU request is not 500m in the LimitRange project-limits"
        json_query: "spec.limits[0].defaultRequest.cpu"
        value: "500m"
    task_description_message: Check if the default CPU request is 500m in the LimitRange
    student_error_message: "Default CPU request is not 500m in the LimitRange project-limits"
----
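On the ordering fragility noted earlier: spec.limits[0] assumes the relevant limit entry is first in the list. A JMESPath filter can select by type instead of position; a sketch, assuming the LimitRange entry uses type: Container:

[source,yaml]
----
# Order-independent variant: filter limits by type rather than index.
# Assumes the LimitRange declares "type: Container" on the relevant entry.
json_query: "spec.limits[?type=='Container'].defaultRequest.cpu | [0]"
----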
FAIL: Check if cluster monitoring is running on infra nodes: Cluster monitoring is not running on infra nodes
FAIL: Check if Elasticsearch is running on infra nodes: Elasticsearch is not running on infra nodes
FAIL: Check if image registry is running on infra nodes: The image registry is not running on infra nodes
FAIL: Check if the CPU quota is 4 in the project george-test: CPU ResourceQuota is not set to 4 in the project george-test
FAIL: Check if the CPU request is 500m for Elasticsearch: CPU request is not 500m for Elasticsearch
FAIL: Check if the default CPU request is 500m in the LimitRange: Default CPU request is not 500m in the LimitRange project-limits
FAIL: Check if the default memory request is 500Mi in the LimitRange: Default memory request is not 500Mi in the LimitRange project-limits
FAIL: Check if the memory request is 4Gi for Elasticsearch: Memory request is not 4Gi for Elasticsearch
FAIL: Check if the storage request quota is 20G in the project george-test: Storage requests quota is not set to 20G in the project george-test
FAIL: Check if the storage size is 20G for Elasticsearch: Storage size is not 20G for Elasticsearch
FAIL: Check LimitRange in project george-test: LimitRange project-limits doesn't exist in project george-test
FAIL: Check NetworkPolicy allow-from-openshift-ingress in project george-test: NetworkPolicy allow-from-openshift-ingress doesn't exist in project george-test
FAIL: Check NetworkPolicy allow-same-namespace in project george-test: NetworkPolicy allow-same-namespace doesn't exist in project george-test
FAIL: Check ResourceQuota in project george-test: ResourceQuota project-quota doesn't exist in project george-test
PASS: Check if 5 user identities exist
PASS: Check if john is a cluster admin
PASS: Check if the cluster has 2 infra nodes
PASS: Check if the cluster has 2 router pods
PASS: Check if the cluster has 2 worker nodes
PASS: Check if the cluster has 3 masters
PASS: Check if the routers are running on infra nodes