code-dot-org / code-dot-org

The code powering code.org and studio.code.org

Home Page: http://code.org

License: Other

blockly computer science educational online learning code studio coding

code-dot-org's Introduction

Code.org

Welcome! You've found the source code for the Code.org website and the Code Studio platform. Code.org is a non-profit dedicated to expanding access to computer science education. You can read more about our efforts at code.org/about.

Quick start

  1. Follow our setup guide to configure your workstation.
  2. Run rake build to build the application if you have not done so already.
  3. Run bin/dashboard-server to launch the development server.
  4. Open your browser to http://localhost-studio.code.org:3000/.

To see a list of all build commands, run rake from the repository root.

How to help

Wondering where to start? See our contribution guidelines.

What's in this repo?

Here's a quick overview of the major landmarks:

dashboard

The server for our Code Studio learning platform, a Ruby on Rails application responsible for:

  • Our courses, tutorials, and puzzle configurations
  • User accounts
  • Student progress and projects
  • The "levelbuilder" content creation tools

pegasus

The server for the Code.org website, a Sinatra application.

apps

The JavaScript 'engine' for all of our tutorials, puzzle types, and online tools. It gets built into a static package that we serve through dashboard. Though there are currently some exceptions, the goal is that all JS code ultimately lives here, so that it gets the benefit of linting/JSX/ES6/etc. Start here if you are looking for:

Everything else

  • aws: Configuration and scripts that manage our deployments.
  • bin: Developer utilities.
  • cookbooks: Configuration management through Chef.
  • shared: Source and assets used by many parts of our application.
  • tools: Git commit hooks.

code-dot-org's People

Contributors

ajpal, aoby, ashercodeorg, bcjordan, bencodeorg, bethanyaconnor, bjvanminnen, breville, caleybrock, cpirich, davidsbailey, deploy-code-org, dmcavoy, erin007, hacodeorg, hamms, hannahbergam, islemaster, jmkulwik, joshlory, laurelfan, maddiedierker, maureensturgeon, mehalshah, molly-moen, nkiruka, philbogle, sureshc, tanyaparker, wjordan


code-dot-org's Issues

Design performant schema for formerly-Firebase data

Part of the firebase deprecation project: #55084

We have approximately 4 billion records currently in Firebase (by record we mean one row in a student project dataset), spread across millions of student data tables (contained in millions of applab projects).

Now that we have full data imports going (#55189) we can start performance testing possible schemas to store this data in MySQL.

  1. Store each student-channel (=student project) in a mysql row
  2. Store each student-table (multiple tables per channel) in a JSON mysql row
  3. Store each student-table in a TEXT mysql row?
  4. Store each student-record (multiple records per table) in a TEXT mysql row
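The options above imply very different MySQL row counts. A rough back-of-envelope sketch (only the 4-billion-record total comes from this issue; the channel and table counts below are illustrative placeholders):

```javascript
// Back-of-envelope row counts for each candidate schema.
const totalRecords = 4e9;    // from the issue: ~4 billion records in Firebase
const channels = 5e6;        // assumption: millions of applab projects with data
const tablesPerChannel = 2;  // assumption: a couple of tables per project

const rowsPerSchema = {
  rowPerChannel: channels,                  // option 1: one row per student project
  rowPerTable: channels * tablesPerChannel, // options 2 and 3: one row per table
  rowPerRecord: totalRecords,               // option 4: one row per record
};

console.log(rowsPerSchema);
```

The spread is several orders of magnitude, which is why the schema choice matters before any index or query tuning.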

RFC: "style" attribute overused in apps preventing external modifications

I am currently trying to create a Stylus theme to add a dark mode to Code.org (which should at least be a feature by now, but that's a GitHub issue for another day). Part of my frustration with the project has been the lack of element styles that can be overridden.

Take the JSX file LessonProgress.jsx, which handles part of the header used in virtually every lesson on Code.org. The element .react_stage, which serves as a container for lesson bubbles, is styled through the variable styles.container. At first glance, sure, this seems like a reasonable approach to styling the element. It's not a class that's used hundreds of times and should be able to be adapted with one change in the CSS.

However, also consider the CSS hierarchy.

(Screenshot of the CSS hierarchy omitted.)

Yes, it's possible to overwrite these styles with !important, but generally !important is bad practice. This is a case where it might not be, because the point of !important is to override website styles especially when your hands are tied, but if (for instance) custom themes are ever introduced to certain elements like the phone in App Lab, it's not going to be easy to manage that feature when you have style attributes.

I don't see having to use workarounds as the fault of myself and an excuse to not change the system, but rather the fault of the site itself and a problem that needs to be addressed. There's a reason you don't see Google using inline styles on search pages even if it's just a simple banner.

I believe in writing clear and concise code like this:

.react-stage {
  background-color: #000;
}

not like:

.react-stage[style] {
  background-color: #000 !important;
}

The only potential excuse I see is that, because of how JSX works, you can just define your colors in a JS file and control all of the colors via that JavaScript file. The problem is that there's already SCSS that matches up with whatever's in util.

In a big project like Code.org (which is 11.6 GB), separating markup from styles is almost necessary and even mentioned in the style guide. Some day, I might even try to merge that aforementioned dark mode into this repository. That's not going to be possible if certain elements can't be styled or require more work than usual. Even if there are workarounds, that's not good practice or a good example to set to developers who might be new to CSS/SCSS.

I obviously have respect for all kinds of developers, and if this seems like a stern rant, it's not. I like code that's readable and clean, and I value being able to alter a website's design at whim. I want to open up conversation about this, especially since it's mentioned in the style guide and an issue for me.

Got a few questions, if someone can guide.

Hi,

I am researching options for my company; we want our own coding platform for kids.
So I just wanted to know:
Can we put code-dot-org on my company's website domain?
Can we redesign it to our needs? What is allowed and what is not allowed?
Can we set up App Lab and Game Lab separately?
Can we use it for commercial purposes?

Thanks in advance.
RYG81

Missing Dash on code.org/research

Under "Research Partnerships", there is a missing dash for Dr. Briana Morrison between "University of Nebraska at Omaha" and "Subgoal Labels Study in CSP Unit 3".

Presumably this needs to be fixed in the content pipeline rather than directly in GitHub (thus the issue report rather than a PR).

Honeybadger error

Hi,
After the command "rake install:hooks" I get the following error message:
rake install:hooks
** [Honeybadger] Initializing Honeybadger Error Tracker for Ruby. Ship it! version=3.3.0 framework=ruby level=1 pid=12878
** [Honeybadger] Development mode is enabled. Data will not be reported until you deploy your app. level=2 pid=12878
ln -s ../../tools/hooks/pre-commit .git/hooks/pre-commit
** [Honeybadger] Unable to send error report: API key is missing. id=7cac4f20-3b23-4f0c-87d4-fad4fb8baf68 level=3 pid=12878
rake aborted!
'ln -s ../../tools/hooks/pre-commit .git/hooks/pre-commit' returned 1
ln: Die symbolische Verknüpfung '.git/hooks/pre-commit' konnte nicht angelegt werden: Keine Berechtigung (the symbolic link '.git/hooks/pre-commit' could not be created: permission denied)
Tasks: TOP => install:hooks
(See full trace by running task with --trace)

What does this mean, and how can I avoid it?

Thanks

Implement Library Manifest System

This is a significant work item. We have a dataset browser that includes categories and lots of nice metadata. Currently that data is stored in Firebase: https://console.firebase.google.com/project/cdo-v3-shared/database/cdo-v3-shared/data/~2Fv3~2Fchannels~2Fshared~2Fmetadata~2Fmanifest

  • Design a DB schema for storing this data
  • Design backend APIs for fetching/updating the data, including provisions for our cron jobs to update "current tables"
  • Update Levelbuilder (I think?) to use the new APIs for editing/updating the data
  • Implement DatablockStorage.getLibraryManifest() using new backend

Fixed by: #56621
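The last bullet could look roughly like this; a minimal sketch assuming the new backend exposes a REST endpoint (the path is hypothetical), with a per-page-load cache since the manifest only changes via cron jobs and Levelbuilder edits:

```javascript
// Sketch: DatablockStorage.getLibraryManifest backed by a REST endpoint
// instead of the Firebase manifest node. Endpoint path is hypothetical.
let manifestCache = null;

async function getLibraryManifest(fetchImpl = fetch) {
  if (manifestCache) return manifestCache; // manifest changes rarely; cache it
  const response = await fetchImpl('/datablock_storage/library_manifest');
  manifestCache = await response.json();
  return manifestCache;
}
```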

properties_encryption_key

my local.yml

properties_encryption_key: 'significantly-sl'

When I run bundle exec rake install:dashboard --trace
I get this error:
ArgumentError: key must be 16 bytes
[CDO]/dashboard/lib/encryption.rb:29:in `key='
[CDO]/dashboard/lib/encryption.rb:29:in `decrypt_string'
[CDO]/dashboard/lib/encryption.rb:37:in `decrypt_object'
[CDO]/dashboard/app/dsl/base_dsl.rb:14:in `encrypted'

setup problem with ruby

When I start the setup, I get an error after the command "sudo apt-get install -y git mysql-server mysql-client libmysqlclient-dev libxslt1-dev libssl-dev zlib1g-dev imagemagick libmagickcore-dev libmagickwand-dev openjdk-9-jre-headless libcairo2-dev libjpeg8-dev libpango1.0-dev libgif-dev curl pdftk enscript libsqlite3-dev phantomjs build-essential redis-server rbenv ruby-build npm ruby2.5-dev". I get the following error messages:
E: Unable to locate package ruby2.5-dev
E: Couldn't find any package by glob 'ruby2.5-dev'
E: Couldn't find any package by regex 'ruby2.5-dev'
Do you have a solution for this problem?
Thanks

Consider batching create/update record requests in JS

Right now we do a browser => rails => db round-trip for every create_record call, of which there could be very many per second. If we don't like the load we're seeing on the server, then, since we don't have onRecordEvent anyway, it might make sense to cache create_record calls for, say, 0.5 s and batch-execute them (which our backend API allows for).
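The batching idea could be sketched like this; a minimal queue that flushes every 500 ms, assuming a hypothetical sendBatch function wrapping the bulk-create backend API:

```javascript
// Sketch of client-side batching for create_record: queue calls and flush
// them periodically in a single request. `sendBatch` stands in for a
// hypothetical bulk endpoint on the backend.
class RecordBatcher {
  constructor(sendBatch, intervalMs = 500) {
    this.sendBatch = sendBatch;
    this.intervalMs = intervalMs;
    this.queue = [];
    this.timer = null;
  }

  createRecord(tableName, record) {
    this.queue.push({tableName, record});
    if (!this.timer) {
      // first call in this window schedules the flush
      this.timer = setTimeout(() => this.flush(), this.intervalMs);
    }
  }

  flush() {
    clearTimeout(this.timer);
    this.timer = null;
    if (this.queue.length > 0) {
      this.sendBatch(this.queue.splice(0)); // drain the queue into one request
    }
  }
}
```

The trade-off is up to 500 ms of extra write latency in exchange for one round-trip per window instead of one per call.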

We can decide this as we investigate switching a small number of projects on prod, and see how the loading looks.

How do we update the view after doing a dataset browser action?

See #56349 for a PR in this direction

The problem is that the way Firebase was coupled to the redux store was via subscription methods in applab.js. These are triggered by changes to the redux store (e.g. what data view is currently displayed), but flow through applab.js.

We don't have a subscription system like Firebase's; instead we need to refresh the data to reload the current view in cases where, say, "Add Table" was pressed in DataTable.jsx: basically after any of the storageBackend().THINGHERE methods executed from apps/src/storage/dataBrowser/*.jsx.
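One way to restore that behavior without subscriptions is a thin wrapper around each mutating action; a sketch, where withRefresh and refreshCurrentView are hypothetical names:

```javascript
// Sketch: wrap each mutating dataset-browser action so it re-fetches the
// current view once the write completes. `refreshCurrentView` would re-read
// whatever table/key-value view the redux store says is on screen.
function withRefresh(action, refreshCurrentView) {
  return (...args) =>
    Promise.resolve(action(...args)).then(result => {
      refreshCurrentView();
      return result;
    });
}
```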

overriding settings for local ip 10.0.xxx.xxx type not localhost

In locals.yml, we set the IP address so the dashboard and pegasus servers work.

nano /home/codedotorg/code-dot-org/locals.yml

Add at the end of the file:

override_dashboard: local ip
override_pegasus: local ip

and change this:

dashboard_enable_pegasus: true
to
dashboard_enable_pegasus: false

and finally run:

rake build

Does this still work in a recent repo?

Improving stats

Hello, I was interested in making a contribution related to the stats on your platform that are displayed in each section. I was thinking about using a tool called Chartkick to show them. Would it be possible to use this, given the way the platform is designed?

Thank you.

Implement tested backend for Firebase functions

This is a checklist as we go through implementing in DatablockStorage the many methods originally in FirebaseStorage, plus the new ones we have extracted.

Data blocks

  • FirebaseStorage.getKeyValue = function (key, onSuccess, onError) {}
  • FirebaseStorage.setKeyValue = function (key, value, onSuccess, onError) {}
  • FirebaseStorage.createRecord = function (tableName, record, onSuccess, onError) {}
  • FirebaseStorage.readRecords = function (tableName, searchParams, onSuccess, onError) {}
  • FirebaseStorage.updateRecord = function (tableName, record, onSuccess, onError) {}
  • FirebaseStorage.deleteRecord = function (tableName, record, onSuccess, onError) {}

Dataset browser

  • FirebaseStorage.createTable = function (tableName, onSuccess, onError) {}
  • FirebaseStorage.deleteTable = function (tableName, type, onSuccess, onError) {}
  • FirebaseStorage.clearTable = function (tableName, onSuccess, onError) {}
  • FirebaseStorage.addColumn = function (tableName, columnName, onSuccess, onError) {}
  • FirebaseStorage.deleteColumn = function (tableName, columnName, onSuccess, onError) {}
  • FirebaseStorage.renameColumn = function (tableName, oldName, newName, onSuccess, onError) {}
  • FirebaseStorage.coerceColumn = function (tableName, columnName, columnType, onSuccess, onError) {}
  • FirebaseStorage.importCsv = function (tableName, tableDataCsv, onSuccess, onError) {}

Newly Created Dataset Browser methods extracted from applab.js

  • FirebaseStorage.subscribeToListOfProjectTables = function (onTableAdded, onTableRemoved) {}
  • FirebaseStorage.subscribeToKeyValuePairs = function (onKeyValuePairsChanged) {}
  • FirebaseStorage.subscribeToTable = function (tableName, onColumnsChanged, onRecordsChanged) {}
  • FirebaseStorage.previewSharedTable = function (sharedTableName, onColumnsChanged, onRecordsChanged) {}
  • FirebaseStorage.unsubscribeFromTable = function (tableName) {}
  • FirebaseStorage.unsubscribeFromKeyValuePairs = function () {}

Levelbuilder dataset injection hooks

  • FirebaseStorage.populateTable = function (jsonData) {}
  • FirebaseStorage.populateKeyValue = function (jsonData, onSuccess, onError) {}

Other

  • FirebaseStorage.deleteKeyValue = function (key, onSuccess, onError) {}
  • FirebaseStorage.getLibraryManifest = function () {}
  • FirebaseStorage.getColumnsForTable = function (tableName, tableType) {}
  • FirebaseStorage.channelExists = function () {}
  • FirebaseStorage.clearAllData = function (onSuccess, onError) {}
  • FirebaseStorage.addCurrentTableToProject = function (tableName, onSuccess, onError) {}
  • FirebaseStorage.copyStaticTable = function (tableName, onSuccess, onError) {}

Testing related functions

  • FirebaseStorage.resetForTesting = function () {}
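To stay drop-in compatible, each DatablockStorage method would keep the (…, onSuccess, onError) callback signature above while talking to the new backend. A sketch for the first item on the list (the endpoint path and injectable fetchImpl parameter are hypothetical):

```javascript
// Sketch: getKeyValue preserving FirebaseStorage's callback signature
// while fetching from a REST backend instead of Firebase.
function getKeyValue(key, onSuccess, onError, fetchImpl = fetch) {
  fetchImpl(`/datablock_storage/get_key_value?key=${encodeURIComponent(key)}`)
    .then(response => response.json())
    .then(body => onSuccess(body.value))
    .catch(onError);
}
```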

DatablockStorage.addColumn doesn't seem to work

The problem is that the column names come from a DatablockStorage.loadTableAndColumns call, and our implementation is inferring the columns from the records (when it should use the DatablockStorageTables.columns field in the DB).

We should instead get the columns from a call to datablock_storage_controller.rb's get_columns_for_table.

See: https://github.com/code-dot-org/code-dot-org/blob/8039db32b60a94f47c44a39d237689510f8a4ef1/apps/src/storage/datablockStorage.js#L233-L240

PR to fix this: #56464
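The fix amounts to asking the backend for the authoritative column list instead of inferring it from record keys; a sketch (the endpoint path and parameter names are hypothetical stand-ins for get_columns_for_table):

```javascript
// Sketch: fetch the authoritative column list (DatablockStorageTables.columns)
// rather than inferring columns from whatever keys appear in the records,
// which misses empty columns.
async function getColumnsForTable(tableName, fetchImpl = fetch) {
  const params = new URLSearchParams({table_name: tableName});
  const response = await fetchImpl(`/datablock_storage/get_columns_for_table?${params}`);
  return response.json(); // e.g. a plain array of column names
}
```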

Figure out Rate Limiting for Datasets on MySQL

Rate Limiting

In Firebase we maintain a rate limit per project/channel: 300 writes per 15 seconds, 600 writes per 60 seconds.

  1. Where does this data rate come from? It's pretty high. Is this required by the microbit projects?
  2. On MySQL we see inserts taking 0.15 s each, which, if we're not inserting in parallel, gives us an insert rate of about 6 inserts/s: 90 writes per 15 seconds, or 360 per 60 seconds. This is less than our existing limits.
  3. Should we be batching these together?
The per-channel counters stored in Firebase look like this (excerpt):

{
  "counters": {
    "limits": {
      "15": {
        "lastResetTime": 1654619184297,
        "writeCount": 175
      },
      "60": {
        "lastResetTime": 1654524726608,
        "writeCount": 188
      }
    }
  },
  "serverTime": 1661532798036,

Optimize getColumn

As per @cnbrenci, curriculum apparently uses the getColumn() block instead of readRecords(). Right now getColumn() is implemented by doing a full table fetch and extracting the column in the JS. This is slow, and results in a LOT of extra traffic going over the wire.

Instead, for datablock storage, we can optimize this block by creating a backend get_column method.
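The waste in the current path is easy to see with a toy table (the record shape below is illustrative): the client fetches every field of every record and then discards all but one column.

```javascript
// Contrast sketch: today getColumn() does a full table fetch and plucks one
// field client-side; a backend get_column would send only those values.
const records = [
  {id: 1, word: 'cat', definition: 'a small feline'},
  {id: 2, word: 'dog', definition: 'a domestic canine'},
];

// Current client-side extraction after a full-table fetch:
const column = records.map(r => r.word);

// Rough measure of the extra bytes crossing the wire today:
const fullBytes = JSON.stringify(records).length;
const columnBytes = JSON.stringify(column).length;
```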

bundle exec rake install error

WARNING: level 'Allthethings Encrypted Play Lab Level' not seeded properly due to missing CDO.properties_encryption_key
site_1 | warning: unable to decrypt level Blocks to Math 1, skipping
Why does this happen?

rake install then got stuck

dev@iZ2zec6ll6nvbh4acez5unZ:~/code-dot-org/dashboard$ bundle exec rake db:drop
Dropped database 'dashboard_development'
Dropped database 'dashboard_test'
dev@iZ2zec6ll6nvbh4acez5unZ:~/code-dot-org/dashboard$ cd ../
dev@iZ2zec6ll6nvbh4acez5unZ:~/code-dot-org$ ls
apps  codecov.yml      dashboard      experimental  i18n     locals.yml          NOTICE    README.md  STYLEGUIDE.md  Vagrantfile
aws   CONTRIBUTING.md  deployment.rb  Gemfile       lib      locals.yml.default  pegasus   SETUP.md   TESTING.md
bin   cookbooks        docs           Gemfile.lock  LICENSE  log                 Rakefile  shared     tools
dev@iZ2zec6ll6nvbh4acez5unZ:~/code-dot-org$ rake install
bundle --without production adhoc staging test levelbuilder integration --quiet --jobs 4
mysql://root@localhost/dashboard_development
RAILS_ENV=development RACK_ENV=development bundle exec rake dashboard:setup_db

It got stuck at this step for about 30 minutes (the SSH connection gets aborted). Did I do something wrong?

Allow marking applab projects as using datablock storage

As part of the Firebase migration for student data storage (https://github.com/orgs/code-dot-org/projects/4/), we'd like a way to progressively roll out Datablock Storage.

A major implementation goal has been to keep the old Firebase code paths present in the same codebase. For this issue, we just need a DB-persisted way for applab.js to determine whether a project is stored in Firebase or has been migrated to Datablock Storage.

Current idea:

  1. Add a use_datablock_storage boolean column to the projects table, defaulting to false.
  2. Add a DCDO field, applab_percent_use_datablock_storage, to roll the dice over whether a new project is created in Firebase or Datablock Storage.
  3. Inject use_datablock_storage into AppOptions (how?).
  4. Increase the percentage gradually, fixing any issues that come up, until we hit 100% Datablock Storage for new projects.
  5. Convert old projects, until 100% of project rows have use_datablock_storage=true.
  6. Merge a cleanup PR that removes the use_datablock_storage column and removes the Firebase code paths.

PR that implements part of this idea: #56536
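Step 2's dice roll could be sketched as follows; the DCDO flag name comes from this issue, while the function name and injectable rng parameter are hypothetical:

```javascript
// Sketch of step 2: roll the dice once at project-creation time.
// `dcdoPercent` would be the value of the "applab_percent_use_datablock_storage"
// DCDO flag; the resulting boolean is persisted on the new project row and
// later injected into AppOptions so applab.js picks the right code path.
function chooseStorageBackend(dcdoPercent, rng = Math.random) {
  return {use_datablock_storage: rng() * 100 < dcdoPercent};
}
```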

rake install fail

Hello,

I get "rake aborted!" when running rake install.

Is there anything missing?

Here is the log

root@localhost:~/data/code-dot-org# rake install
bundle --without production adhoc staging test levelbuilder integration --quiet --jobs 1
mysql://root@localhost/dashboard_development
RAILS_ENV=development RACK_ENV=development bundle exec rake dashboard:setup_db
rake aborted!
'RAILS_ENV=development RACK_ENV=development bundle exec rake dashboard:setup_db' returned 137
Created database 'dashboard_development'
Created database 'dashboard_test'
-- create_table("activities", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0177s
-- create_table("ap_cs_offerings", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0084s
-- create_table("ap_school_codes", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0078s
-- create_table("authored_hint_view_requests", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0136s
-- create_table("callouts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0098s
-- create_table("census_inaccuracy_investigations", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0112s
-- create_table("census_overrides", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0093s
-- create_table("census_submission_form_maps", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0110s
-- create_table("census_submissions", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0125s
-- create_table("census_submissions_school_infos", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0104s
-- create_table("census_summaries", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0104s
-- create_table("channel_tokens", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0142s
-- create_table("circuit_playground_discount_applications", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0159s
-- create_table("circuit_playground_discount_codes", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0132s
-- create_table("cohorts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0105s
-- create_table("cohorts_deleted_users", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0197s
-- create_table("cohorts_districts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0107s
-- create_table("cohorts_users", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0102s
-- create_table("concepts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0087s
-- create_table("concepts_levels", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0109s
-- create_table("contained_level_answers", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0095s
-- create_table("contained_levels", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0116s
-- create_table("course_scripts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0116s
-- create_table("courses", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0130s
-- create_table("districts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0098s
-- create_table("districts_users", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0107s
-- create_table("experiments", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0140s
-- create_table("facilitators_workshops", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0078s
-- create_table("featured_projects", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0117s
-- create_table("followers", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0098s
-- create_table("gallery_activities", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0155s
-- create_table("games", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0091s
-- create_table("hint_view_requests", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0113s
-- create_table("ib_cs_offerings", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0083s
-- create_table("ib_school_codes", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0090s
-- create_table("level_concept_difficulties", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0087s
-- create_table("level_source_images", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0096s
-- create_table("level_sources", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0147s
-- create_table("level_sources_multi_types", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0109s
-- create_table("levels", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0107s
-- create_table("levels_script_levels", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0113s
-- create_table("metrics", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0089s
-- create_table("paired_user_levels", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0126s
-- create_table("pd_accepted_programs", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0062s
-- create_table("pd_applications", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0188s
-- create_table("pd_attendances", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0159s
-- create_table("pd_course_facilitators", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0111s
-- create_table("pd_district_payment_terms", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0102s
-- create_table("pd_enrollment_notifications", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0103s
-- create_table("pd_enrollments", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0110s
-- create_table("pd_facilitator_program_registrations", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0600s
-- create_table("pd_facilitator_teachercon_attendances", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0082s
-- create_table("pd_fit_weekend1819_registrations", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0078s
-- create_table("pd_payment_terms", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0097s
-- create_table("pd_pre_workshop_surveys", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0097s
-- create_table("pd_regional_partner_cohorts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0123s
-- create_table("pd_regional_partner_cohorts_users", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0099s
-- create_table("pd_regional_partner_contacts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0125s
-- create_table("pd_regional_partner_mappings", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0116s
-- create_table("pd_regional_partner_program_registrations", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0106s
-- create_table("pd_sessions", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0118s
-- create_table("pd_teacher_applications", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0146s
-- create_table("pd_teachercon1819_registrations", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0112s
-- create_table("pd_teachercon_surveys", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0098s
-- create_table("pd_workshop_material_orders", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0156s
-- create_table("pd_workshop_surveys", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0115s
-- create_table("pd_workshops", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0205s
-- create_table("pd_workshops_facilitators", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0632s
-- create_table("peer_reviews", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0187s
-- create_table("plc_course_units", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0136s
-- create_table("plc_courses", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0106s
-- create_table("plc_enrollment_module_assignments", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0147s
-- create_table("plc_enrollment_unit_assignments", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0142s
-- create_table("plc_learning_modules", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0133s
-- create_table("plc_learning_modules_tasks", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0119s
-- create_table("plc_tasks", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0093s
-- create_table("plc_user_course_enrollments", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0139s
-- create_table("puzzle_ratings", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0108s
-- create_table("regional_partner_program_managers", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0137s
-- create_table("regional_partners", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0095s
-- create_table("regional_partners_school_districts", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0091s
-- create_table("school_districts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0580s
-- create_table("school_infos", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0125s
-- create_table("school_stats_by_years", {:primary_key=>["school_id", "school_year"], :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0107s
-- create_table("schools", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.1010s
-- create_table("script_levels", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0126s
-- create_table("scripts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0096s
-- create_table("secret_pictures", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0117s
-- create_table("secret_words", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0094s
-- create_table("section_hidden_scripts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0111s
-- create_table("section_hidden_stages", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0103s
-- create_table("sections", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0130s
-- create_table("seeded_s3_objects", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0095s
-- create_table("segments", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0118s
-- create_table("sign_ins", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0105s
-- create_table("stages", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0096s
-- create_table("state_cs_offerings", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0092s
-- create_table("studio_people", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0120s
-- create_table("survey_results", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0137s
-- create_table("teacher_profiles", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0090s
-- create_table("unexpected_teachers_workshops", {:id=>false, :force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0118s
-- create_table("user_geos", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0124s
-- create_table("user_levels", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0136s
-- create_table("user_module_task_assignments", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0146s
-- create_table("user_permissions", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0087s
-- create_table("user_proficiencies", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0448s
-- create_table("user_scripts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0108s
-- create_table("users", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0319s
-- create_table("videos", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0065s
-- create_table("workshop_attendance", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0162s
-- create_table("workshop_cohorts", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0087s
-- create_table("workshops", {:force=>:cascade, :options=>"ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci"})
-> 0.0856s
-- add_foreign_key("ap_school_codes", "schools")
-> 0.0225s
-- add_foreign_key("authored_hint_view_requests", "levels")
-> 0.0235s
-- add_foreign_key("authored_hint_view_requests", "scripts")
-> 0.0222s
-- add_foreign_key("authored_hint_view_requests", "users")
-> 0.0244s
-- add_foreign_key("census_inaccuracy_investigations", "census_overrides")
-> 0.0229s
-- add_foreign_key("census_inaccuracy_investigations", "census_submissions")
-> 0.0209s
-- add_foreign_key("census_inaccuracy_investigations", "users")
-> 0.0282s
-- add_foreign_key("census_overrides", "schools")
-> 0.0192s
-- add_foreign_key("census_submission_form_maps", "census_submissions")
-> 0.0206s
-- add_foreign_key("census_summaries", "schools")
-> 0.0200s
-- add_foreign_key("circuit_playground_discount_applications", "schools")
-> 0.0207s
-- add_foreign_key("hint_view_requests", "users")
-> 0.0220s
-- add_foreign_key("ib_school_codes", "schools")
-> 0.0174s
-- add_foreign_key("level_concept_difficulties", "levels")
-> 0.0185s
-- add_foreign_key("pd_payment_terms", "regional_partners")
-> 0.0176s
-- add_foreign_key("pd_regional_partner_cohorts", "pd_workshops", {:column=>"summer_workshop_id"})
-> 0.0181s
-- add_foreign_key("pd_teachercon1819_registrations", "regional_partners")
-> 0.0195s
-- add_foreign_key("pd_workshops", "regional_partners")
-> 0.0186s
-- add_foreign_key("peer_reviews", "level_sources")
-> 0.0229s
-- add_foreign_key("peer_reviews", "levels")
-> 0.0270s
-- add_foreign_key("peer_reviews", "scripts")
-> 0.0251s
-- add_foreign_key("peer_reviews", "users", {:column=>"reviewer_id"})
-> 0.0310s
-- add_foreign_key("peer_reviews", "users", {:column=>"submitter_id"})
-> 0.0237s
-- add_foreign_key("plc_course_units", "scripts")
-> 0.0193s
-- add_foreign_key("plc_courses", "courses")
-> 0.0179s
-- add_foreign_key("plc_learning_modules", "stages")
-> 0.0228s
-- add_foreign_key("plc_tasks", "script_levels")
-> 0.0198s
-- add_foreign_key("school_infos", "school_districts")
-> 0.0209s
-- add_foreign_key("school_infos", "schools")
-> 0.0199s
-- add_foreign_key("school_stats_by_years", "schools")
-> 0.0173s
-- add_foreign_key("schools", "school_districts")
-> 0.0776s
-- add_foreign_key("sections", "courses")
-> 0.0240s
-- add_foreign_key("state_cs_offerings", "schools", {:column=>"state_school_id", :primary_key=>"state_school_id"})
-> 0.0194s
-- add_foreign_key("survey_results", "users")
-> 0.0194s
-- add_foreign_key("user_geos", "users")
-> 0.0204s
-- add_foreign_key("user_proficiencies", "users")
-> 0.0220s
-- initialize_schema_migrations_table()
-> 0.0133s
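Everything up to this point is standard Rails schema-load output: each `-- create_table(...)` or `-- add_foreign_key(...)` line is paired with the elapsed time (`-> 0.0187s`) printed beneath it. When a schema load feels slow, those pairs are easy to mine. A minimal sketch — the `TXT` sample is trimmed from the log above, and `slowest_operations` is our own helper, not part of the codebase:

```ruby
# Summarize Rails schema-load output: pair each "-- op(...)" line with the
# "-> 0.0123s" timing that follows it, then report the slowest operations.
LOG = <<~TXT
  -- create_table("schools", {:id=>false, :force=>:cascade})
     -> 0.1010s
  -- add_foreign_key("schools", "school_districts")
     -> 0.0776s
  -- create_table("scripts", {:force=>:cascade})
     -> 0.0096s
TXT

def slowest_operations(log, top: 2)
  # Ruby's ^ and $ anchor at line boundaries, so scan captures
  # [operation, seconds] pairs across the whole multi-line log.
  ops = log.scan(/^\s*-- (.+)$\n\s*-> ([\d.]+)s/).map { |op, t| [op, t.to_f] }
  ops.sort_by { |_, t| -t }.first(top)
end

slowest_operations(LOG).each { |op, t| puts format('%.4fs  %s', t, op) }
```

Against the full log above, the standouts are the `schools` and `school_districts` tables and their foreign key, at roughly 0.05–0.10s each versus ~0.01s for most other tables.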
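The `warning: unable to decrypt level …, skipping` lines below are expected on a fresh workstation: some Levelbuilder levels are stored encrypted, and the seed step skips any level it cannot decrypt. Decryption requires a shared key in your developer `locals.yml`; the key name in this sketch is an assumption on our part, so confirm it against the setup guide or with a team member before relying on it:

```yaml
# locals.yml — developer overrides (key name assumed; verify before use).
# Without this key, encrypted levels are skipped during seeding, which is
# harmless unless you specifically need those levels locally.
properties_encryption_key: 'ask-a-teammate-for-the-shared-key'
```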
warning: unable to decrypt level Blocks to Math 1, skipping
warning: unable to decrypt level U1L13 - Assess Text Compression reverse process, skipping
warning: unable to decrypt level U2L07 Assessment4, skipping
warning: unable to decrypt level Algo MST Student Lesson Introduction, skipping
warning: unable to decrypt level brad-repro-bug-000, skipping
warning: unable to decrypt level CSD U3 Collision Detection, skipping
warning: unable to decrypt level CSD U3 complex sprite movement SFLP, skipping
warning: unable to decrypt level csp_unit_assessment_overview, skipping
warning: unable to decrypt level CSPU5_U3L20 Student Lesson Introduction, skipping
warning: unable to decrypt level CSPU5_U3L25 Student Lesson Introduction, skipping
warning: unable to decrypt level CSPU5_U3L27 Student Lesson Introduction, skipping
warning: unable to decrypt level CSPU5_U3L28 Student Lesson Introduction, skipping
warning: unable to decrypt level OPD-K5 Celebrate, skipping
warning: unable to decrypt level OPD-K5 CS, skipping
warning: unable to decrypt level swipeRightPassThrough, skipping
warning: unable to decrypt level swipeRightPassThrough-test123, skipping
warning: unable to decrypt level Terminology Recap, skipping
warning: unable to decrypt level Test External Markdown, skipping
warning: unable to decrypt level U1L12 Student Lesson Introduction, skipping
warning: unable to decrypt level U1L13 Student Lesson Summary, skipping
warning: unable to decrypt level U1L14 Student Lesson Summary, skipping
warning: unable to decrypt level U1L15 Student Lesson Summary, skipping
warning: unable to decrypt level U1L1 Student Lesson Introduction, skipping
warning: unable to decrypt level U1L5 Student Lesson Introduction, skipping
warning: unable to decrypt level U1L9 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L13 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L14 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L15 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L18 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L20 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L3 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L6 Student Lesson Introduction, skipping
warning: unable to decrypt level U2L7 Student Lesson Introduction, skipping
warning: unable to decrypt level U3L20 Student Lesson Introduction, skipping
warning: unable to decrypt level U3L25 Student Lesson Introduction, skipping
warning: unable to decrypt level U3L27 Student Lesson Introduction, skipping
warning: unable to decrypt level U3L28 Student Lesson Introduction, skipping
warning: unable to decrypt level U4L09 Student Lesson Intro Practice PT big data dilemma, skipping
warning: unable to decrypt level Unit 4 Lesson 1 Introduction, skipping
warning: unable to decrypt level Unit 4 Lesson 2 Introduction, skipping
warning: unable to decrypt level Unit 4 Lesson 3 Introduction, skipping
warning: unable to decrypt level Unit 4 Lesson 4 Introduction, skipping
warning: unable to decrypt level Unit 4 Lesson 5 Introduction, skipping
warning: unable to decrypt level Unit 4 Lesson 6 Introduction, skipping
warning: unable to decrypt level Unit 4 Lesson 8 Introduction, skipping
warning: unable to decrypt level Unit 5 Lesson 11 Introduction, skipping
warning: unable to decrypt level Unit 5 Lesson 16 Introduction, skipping
warning: unable to decrypt level v2 U1L14 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U1L5 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U1L8 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U1L9 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U2L11 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U2L15 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U2L1 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U2L3 Student Lesson Summary, skipping
warning: unable to decrypt level v2 U2L4 Student Lesson Summary, skipping
warning: unable to decrypt level v2 U2L5 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U2L6 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U2L9 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U3L10 Student Lesson Introduction, skipping
warning: unable to decrypt level v2 U4L05 Student Lesson Intro Encryption Caesar, skipping
warning: unable to decrypt level v2 U4L06 Student Lesson Introduction Encryption Vigenere, skipping
warning: unable to decrypt level Welcome to TeacherCon!, skipping
warning: unable to decrypt level U2L06 - Matching: Label the Diagram, skipping
warning: unable to decrypt level U2L14 Assessment, skipping
warning: unable to decrypt level U2L14 Assessment4, skipping
warning: unable to decrypt level U2L17 - Vocabulary matching, skipping
warning: unable to decrypt level U2L18 Match terms to cups and beans analogy, skipping
warning: unable to decrypt level U2L3 Assessment, skipping
warning: unable to decrypt level CB_Question_1, skipping
warning: unable to decrypt level CB_Question_10, skipping
warning: unable to decrypt level CB_Question_11, skipping
warning: unable to decrypt level CB_Question_12, skipping
warning: unable to decrypt level CB_Question_13, skipping
warning: unable to decrypt level CB_Question_14, skipping
warning: unable to decrypt level CB_Question_15, skipping
warning: unable to decrypt level CB_Question_16, skipping
warning: unable to decrypt level CB_Question_17, skipping
warning: unable to decrypt level CB_Question_18, skipping
warning: unable to decrypt level CB_Question_19, skipping
warning: unable to decrypt level CB_Question_1_copy, skipping
warning: unable to decrypt level CB_Question_2, skipping
warning: unable to decrypt level CB_Question_20, skipping
warning: unable to decrypt level CB_Question_21, skipping
warning: unable to decrypt level CB_Question_22, skipping
warning: unable to decrypt level CB_Question_3, skipping
warning: unable to decrypt level CB_Question_3update, skipping
warning: unable to decrypt level CB_Question_4, skipping
warning: unable to decrypt level CB_Question_5, skipping
warning: unable to decrypt level CB_Question_6, skipping
warning: unable to decrypt level CB_Question_7, skipping
warning: unable to decrypt level CB_Question_8, skipping
warning: unable to decrypt level CB_Question_9, skipping
warning: unable to decrypt level CPS Unit2 Ch 1 MC lossless compression, skipping
warning: unable to decrypt level CSP Unit 2 Ch 2 MC data bias, skipping
warning: unable to decrypt level cspu3_assess1_callfunction, skipping
warning: unable to decrypt level cspu3_assess1_codeoutcome, skipping
warning: unable to decrypt level cspu3_assess1_collaboration, skipping
warning: unable to decrypt level cspu3_assess1_functionsfalse, skipping
warning: unable to decrypt level cspu3_assess1_loop_or_function2, skipping
warning: unable to decrypt level cspu3_assess1_loop_or_function3, skipping
warning: unable to decrypt level cspu3_assess1_loop_or_function4, skipping
warning: unable to decrypt level cspu3_assess1_namingconvention, skipping
warning: unable to decrypt level cspu3_assess1_naturallanguage, skipping
warning: unable to decrypt level cspu3_assess1_outputdrawing, skipping
warning: unable to decrypt level cspu3_assess1_parameters, skipping
warning: unable to decrypt level cspu3_assess1_removeline, skipping
warning: unable to decrypt level cspu3_assess1_robotpath, skipping
warning: unable to decrypt level cspu3_assess1_truefunctions, skipping
warning: unable to decrypt level cspu4_assess1_DDoS, skipping
warning: unable to decrypt level cspu4_assess1_encryption, skipping
warning: unable to decrypt level cspu4_assess1_keyEncryption, skipping
warning: unable to decrypt level cspu4_assess1_mod, skipping
warning: unable to decrypt level cspu4_assess1_moore, skipping
warning: unable to decrypt level cspu4_assess1_phishing, skipping
warning: unable to decrypt level cspu5_assess1_additem, skipping
warning: unable to decrypt level cspu5_assess1_algorithm, skipping
warning: unable to decrypt level cspu5_assess1_codevalues, skipping
warning: unable to decrypt level cspu5_assess1_debugging, skipping
warning: unable to decrypt level cspu5_assess1_drawingoutcome, skipping
warning: unable to decrypt level cspu5_assess1_elementid, skipping
warning: unable to decrypt level cspu5_assess1_eventprograms, skipping
warning: unable to decrypt level cspu5_assess1_fivemore, skipping
warning: unable to decrypt level cspu5_assess1_flowchart, skipping
warning: unable to decrypt level cspu5_assess1_increase, skipping
warning: unable to decrypt level cspu5_assess1_snowman, skipping
warning: unable to decrypt level cspu5_assess1_swap, skipping
warning: unable to decrypt level cspu5_assess1_turtle, skipping
warning: unable to decrypt level cspu5_assess1_value, skipping
warning: unable to decrypt level cspu5_assess2_addstrings, skipping
warning: unable to decrypt level cspu5_assess2_appLabError, skipping
warning: unable to decrypt level cspu5_assess2_boolean, skipping
warning: unable to decrypt level cspu5_assess2_combineString, skipping
warning: unable to decrypt level cspu5_assess2_false, skipping
warning: unable to decrypt level cspu5_assess2_goodbye, skipping
warning: unable to decrypt level cspu5_assess2_hello, skipping
warning: unable to decrypt level cspu5_assess2_numValues, skipping
warning: unable to decrypt level cspu5_assess2_oldEnough, skipping
warning: unable to decrypt level cspu5_assess2_pseudoCode, skipping
warning: unable to decrypt level cspu5_assess2_setAlarm, skipping
warning: unable to decrypt level cspu5_assess2_statementOutput, skipping
warning: unable to decrypt level cspu5_assess2_wrongLogic, skipping
warning: unable to decrypt level cspu5_assess3_ageList, skipping
warning: unable to decrypt level cspu5_assess3_array1, skipping
warning: unable to decrypt level cspu5_assess3_array2, skipping
warning: unable to decrypt level cspu5_assess3_array3, skipping
warning: unable to decrypt level cspu5_assess3_array4, skipping
warning: unable to decrypt level cspu5_assess3_array5, skipping
warning: unable to decrypt level cspu5_assess3_bakeSale, skipping
warning: unable to decrypt level cspu5_assess3_counter, skipping
warning: unable to decrypt level cspu5_assess3_impossibleOutput, skipping
warning: unable to decrypt level cspu5_assess3_listAppend, skipping
warning: unable to decrypt level cspu5_assess3_loopValue1, skipping
warning: unable to decrypt level cspu5_assess3_loopValue2, skipping
warning: unable to decrypt level cspu5_assess3_loopValue3, skipping
warning: unable to decrypt level cspu5_assess3_mysterySwap, skipping
warning: unable to decrypt level cspu5_assess3_robot, skipping
warning: unable to decrypt level cspu5_assess3_rollDie, skipping
warning: unable to decrypt level cspu5_assess3_swap, skipping
warning: unable to decrypt level cspu5_assess3_whileTrue, skipping
Killed
Tasks: TOP => install => install:all => install:dashboard
(See full trace by running task with --trace)

Turkish Translation

I want to help this project by translating it into Turkish. I'm Turkish, 25 years old, and work as an English teacher at a school. I think I can help. Please just show me which files I should translate, and you'll see. You won't regret it, because I take my work seriously.

Datablock Storage - deprecate Firebase (tracking issue)

We're looking to move away from Firebase due to high costs, a security model we never locked down fully, and increased local-dev infrastructure complexity.

Eventually this description will be updated to include more detail, but for now, this is a continuation of an investigation documented in #54643 .

Design

Data Model

See #55344 for details on the data model and #55481 for details on rate limiting and defining performant queries.

Direct Query via JS to Firebase -> Rails Controller

#55554 describes some of the investigations we've done while working out the backend storage implementation, and #55853 breaks down the Firebase interface that's in use today so that we can override it in a systematic way.

The plan is, given a Rails controller that interacts with the new DB, to override FirebaseStorage so we can conditionally swap out the storage layer for our new DatablockStorage implementation.
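
As a sketch, the conditional swap could look like the following. The `datablock_storage_enabled` flag and both class shapes are assumptions for illustration, not the actual code-dot-org classes:

```ruby
# Stand-ins for the two storage layers; real classes wrap Firebase / MySQL.
class FirebaseStorage
  def backend
    :firebase
  end
end

class DatablockStorage
  def backend
    :datablock
  end
end

# Pick the storage layer per project, so individual projects can be
# migrated (and rolled back) without a global cutover.
def storage_for(project)
  if project[:datablock_storage_enabled]
    DatablockStorage.new
  else
    FirebaseStorage.new
  end
end

storage_for(datablock_storage_enabled: true).backend   # => :datablock
storage_for(datablock_storage_enabled: false).backend  # => :firebase
```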

Shared Tables and Current Tables

There is a subset of special tables which come from the Data Library. Current tables (like 'Spotify Top 50') are kept up to date, and student projects reference a shared version of each. Shared tables are copied into the student's project, creating a lot of duplication.

Our design is to treat both Shared and Current tables as Shared tables, and add deduplication logic that treats Shared tables the same as Current tables until the first write to the table. In other words, when a user adds a table from the Data Library to their project, that table references a common table and is therefore kept up to date with any changes to that table. On first write, a copy of the table is made alongside the other tables in the project, and it is no longer kept up to date with the main table.
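
The copy-on-first-write behavior described above can be sketched in plain Ruby. `SharedTableRef` is a made-up name standing in for a project's reference to a Data Library table; this is illustrative, not the actual DatablockStorage API:

```ruby
class SharedTableRef
  def initialize(shared_table)
    @shared = shared_table # common table, kept live-updated
    @local = nil           # project-private copy, created on first write
  end

  # Reads come from the shared table until the project has written.
  def rows
    (@local || @shared).dup
  end

  # The first write detaches the project from the shared table.
  def write(row)
    @local ||= @shared.dup
    @local << row
  end
end

spotify = ["track A", "track B"] # stands in for a 'Spotify Top 50'-style table
ref = SharedTableRef.new(spotify)
spotify << "track C"             # upstream updates still propagate...
ref.write("my track")            # ...until the first project write
spotify << "track D"             # later upstream changes no longer propagate
ref.rows                         # => ["track A", "track B", "track C", "my track"]
```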

Technical Decisions & Reasoning

Which DB engine? We went with MySQL (InnoDB) for simplicity and for consistency with what the project already uses for backend storage.

Rollout: incremental vs. all-at-once migration? We chose incremental rollout based on project configuration. This makes it easier to test individual projects in production early without turning the feature on globally, and avoids downtime while the DB migration happens.

Decisions and reasoning are documented throughout the related issues in this project, but this is a place to summarize those decisions for the future.

Rollout Plan

Incremental rollout on a project-by-project basis via a column in the projects DB. Decisions on details of rollout (batched or rolling based on project access? Symmetric writes and checking, or no?) TBD.

Work Plan

Phase 1: design and investigation

  • Select architecture to replace Firebase.
    • Evaluate suitability of the simplest architectures first: "just mysql" vs more complex architectures
      • See: #54643 (comment)
      • Identify Firebase usage in the apps/ codebase
        • Used by applab for the Data tab / Data Browser UX
          • Probably doesn't need to be realtime
        • Used by applab for the onRecordEvent block available to students
        • Referenced by P5lab = "gamelab" in the UX
      • Identify Firebase usage in the dashboard/ codebase
      • Evaluate usage to determine if we could use a "non realtime" architecture
        • How is P5Lab using it? answer: using setKeyValue and getKeyValue, not in a way that requires holding a connection
        • Sample student applab Project usage to determine how often the onRecordEvent block is used in TOS-valid projects. Results: only 0.02% of projects use it, see: #54643 (comment)
        • Get approval from Product/Curriculum to drop the onRecordEvent block.
  • Design a performant schema for Firebase data (~2 billion rows): #55344 see #55344 (comment)
  • Bulk import 1TB of Firebase data
    • Build bulk import tool: see #55189
    • Evaluate de-duplicating the Firebase data, which has LOTS of stock table datasets, see #55345

Phase 2: implement Datablock Storage backend

  • Implement skeleton Datablock Storage rails controller backend
    • Support operations in firebaseStorage.js
      • TODO: flesh out list of firebaseStorage commands we want to support
    • Support operations in firebase_helper.rb, see full list here: #55853
      • table_as_csv()
      • delete_shared_table()
      • upload_shared_table()
      • upload_live_table()
      • get_shared_table()
      • get_shared_table_list()
      • get_library_manifest()
      • set_library_manifest()
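
A minimal skeleton mirroring the firebase_helper.rb operation list above might look like this. Method bodies are illustrative stubs over an in-memory hash, not the real MySQL-backed implementation, and upload_live_table() is omitted for brevity:

```ruby
class DatablockStorageHelper
  def initialize
    @shared_tables = {}    # table name => array of row hashes
    @library_manifest = {}
  end

  # Render a shared table as CSV: header row from keys, then data rows.
  def table_as_csv(name)
    rows = @shared_tables.fetch(name)
    ([rows.first.keys] + rows.map(&:values)).map { |r| r.join(",") }.join("\n")
  end

  def upload_shared_table(name, rows)
    @shared_tables[name] = rows
  end

  def delete_shared_table(name)
    @shared_tables.delete(name)
  end

  def get_shared_table(name)
    @shared_tables[name]
  end

  def get_shared_table_list
    @shared_tables.keys
  end

  def get_library_manifest
    @library_manifest
  end

  def set_library_manifest(manifest)
    @library_manifest = manifest
  end
end
```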

Phase 3: migrate existing code to Datablock Storage

  • Port Data Browser and Data Library React UX to Datablock Storage
    • applab.js: subscribe the redux store to firebase changes
      • applab.js: subscribeToTable() does a firebase .on('child_added') to live refresh table as rows are added
        • UX decision: drop feature and refresh on a page refresh (natural behavior?)
      • applab.js: subscribes to current_tables to get a live updating list of tables in the project
        • UX question: when you go to the data tab, is the data auto-refreshed? (is that like a reload?)
        • UX decision: drop keeping the list of tables live refreshed
        • Question: Does this affect blocks too?
    • onDataViewChange(): implements the UX changes for the dataset browser
  • Port applab blocks to Datablock Storage
    • Port apps/src/storage (firebaseStorage, firebaseCounters, etc) to use the new architecture
    • drop onRecordEvent from commands.js, api.js, dropletConfig.js (see drop PR)
    • applab/commands.js: all calls should pass through to firebaseStorage.js, shouldn't need changes (?)
    • applab/api.js: re-exports commands.js, all calls thereby pass through to firebaseStorage, shouldn't need changes
  • Port gamelab blocks to Datablock Storage
    • gamelab/commands.js
    • Confirm: P5Lab is the same thing as gamelab?
  • Port Levelbuilder mode level_data pieces to Datablock Storage
    • Only need to port all the commands in firebase_helper.rb?
    • datasets_controller.rb and dashboard/app/views/datasets
      • Appears to be a CRUD app for managing the stock datasets, probably only in levelbuilder mode, possibly /datasets
      • Need to migrate this to the Firebase API
      • @firebase points to firebase_helper.rb
    • levels_controller.rb
      • e.g. /levels/1/edit lists `@firebase.get_library_manifest`
  • Port live dataset features to Datablock Storage (see bin/cron/applab_datasets)
  • Rate limiting, see #55481
    • Design rate limiting
      • what are current rate limits? are they actually enforced?
        • Current limits are 300 writes per 15 s or 600 per 60 s. Our insert rate is 6 per second, which is lower than Firebase's current limits.
          • Will this be an issue for microbit projects?
      • Determine if session-based limiting (per user) works or if we need firebase "global to channel" style limiting
    • Implement rate limiting
  • Support exported projects, see #22522
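
The rate limits quoted above (300 writes per 15 s, 600 per 60 s) could be enforced with a sliding-window limiter like this sketch. State is in-memory per process here, purely for illustration; a real deployment would need shared state across Rails workers:

```ruby
class WriteRateLimiter
  LIMITS = [[15, 300], [60, 600]].freeze # [window seconds, max writes]

  def initialize
    @writes = Hash.new { |h, k| h[k] = [] } # key => timestamps of writes
  end

  # Returns true and records the write if every window still has headroom.
  def allow?(key, now = Time.now.to_f)
    log = @writes[key]
    log.reject! { |t| t < now - 60 } # drop entries older than largest window
    ok = LIMITS.all? { |window, max| log.count { |t| t >= now - window } < max }
    log << now if ok
    ok
  end
end

limiter = WriteRateLimiter.new
300.times { limiter.allow?("proj-1", 100.0) } # exhaust the 15 s window
limiter.allow?("proj-1", 100.0)               # => false
limiter.allow?("proj-1", 120.0)               # => true (15 s window rolled over)
```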

Phase 4: rollout and migration

  • Decide on details of rollout (batched or rolling based on project access? Symmetric writes and checking, or no?)
  • Finish migration Tool

Git LFS Migration

Advance preparation of systems:

  1. Merge PR adding git-lfs to chef + drone dockerfile, see: #53310
  2. Check staging, test & prod to ensure git-lfs --version >= 3.0
  3. Check levelbuilder to ensure git-lfs --version >= 3.0
  4. Post to #developers suggesting folks install git-lfs and verify their version is >= 3.0
  5. Investigate whether we'll need to migrate our Drone AMI and/or the master and autoscaler servers for git-lfs support too. After discussion, we don't think this will be an issue, and it's hard to check, so we'll attempt the migration; if it doesn't work with Drone, we'll back out and look into this.

Practice conversion:

  1. Update file patterns to match https://docs.google.com/document/d/1YpB9CN-jswgZ3XTh4BBNrK-rf4RblyLse2cJu88aUjg
  2. Run a test conversion against current staging to make sure it hasn't gone stale since September (conclusion: the old conversion technique with BFG no longer worked; we had to switch to git lfs import)
  3. Build migration script, see: https://gist.github.com/snickell/ca717b3266568b4ee05b0af20f5a25d8
  4. Do a full dry run on https://github.com/snickell/code-dot-org
  5. Validate the new repository works and performs as expected

GitHub Migration:

  1. Levelbuilder - commit content before starting, and warn content editors not to make changes until the work is complete (changes made during the transition will not be stash/pop-able)
  2. Notify #developers the day before, and day of migration that they'll need to push any branches and will need to check them out again after migration
  3. Add branch protection rule with pattern * and [x] Lock Branch. This will prevent unexpected pushes.
  4. Create a git clone --mirror of the repo in multiple locations, including AWS
  5. Start running migration script https://gist.github.com/snickell/ca717b3266568b4ee05b0af20f5a25d8
  6. Wait for migration script to complete (several hours)
  7. Disable --force branch protections on staging, test, production, levelbuilder, and staging-next. UPDATE: we were forced to actually delete these due to a bug in GitHub, so the last step is now recreating the branch protection rules.
  8. Run push in chunks section of the migration script
  9. Do a fresh git clone of staging from github
  • verify that LFS files are materialized in the fresh clone
  • inspect cloned repo with git-sizer to verify LFS goals have been achieved

Infra Migration

  • production-daemon / production-console: git clone --depth 1 --branch production git@github.com:code-dot-org/code-dot-org.git production.postlfs
  • staging: git clone --depth 1 --branch staging git@github.com:code-dot-org/code-dot-org.git staging.postlfs
  • test: git clone --depth 1 --branch test git@github.com:code-dot-org/code-dot-org.git test.postlfs
  • levelbuilder: git clone --depth 1 --branch levelbuilder git@github.com:code-dot-org/code-dot-org.git levelbuilder.postlfs
  • AMI builder: will we have to start it to update it?
  • verify staging build succeeds
  • verify test build succeeds
  • verify production build succeeds
  • verify drone can see and checkout new commits pushed to a PR, update: drone is broken, see https://drone.cdn-code.org/code-dot-org/code-dot-org/39482/1/2
  • FIX DRONE!
  • push a full PR through all the way to production

Post Migration Restoration

  • Remove * branch protection rule.
  • IMPORTANT: re-create branch protection rules from #55759 (comment)
  • Re-open PRs that were closed during the initial staging pushes. This appears not to be doable, so instead this becomes writing directions for how best to cope with this sadness.
  • Re-add code-dot-org/trusted-contributors to the branch protection rules, see: #55759 (comment)

Post Migration Dev Support (Sunday)

  • Write directions for recloning (including preserving your locals.yml; I believe you also have to bundle install)
  • Write suggestions for how to deal with PRs being force closed and non-reopenable
  • Test how to cherry-pick commits from a forgotten branch, document and refresh memory
  • Post directions to #developers
  • Notify github support we have switched to Git LFS

Question on fetching code from link.

Hello,
I'm here to ask whether it's possible to scrape the code from a code.org project using a scraper, or whether it's hidden inside a .json file somewhere. If you can help, or know whether this is possible, please let me know! Thank you.
I've tried scraping, and I could only get some, not ALL, of the code.

How to build to production?

I see that when I set RACK_ENV=production and RAILS_ENV=production, it tries to connect to a database named dashboard_production, but that database is not created when I run rake install.

How do I configure database HOST/PORT/USER/PASSWORD for mysql and redis?
Are there any other considerations before running it in production?

Evaluate de-duplication of formerly-Firebase data

The current Firebase data has a LOT of duplicate tables that are unmodified copies brought in by the dataset browser. We might consider deduping our data as we import it; it could reduce the overall size of the dataset substantially.

Part of the overall Firebase deprecation project: #55084
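
One cheap way to estimate the dedup win described above is to fingerprint each table's serialized contents and count duplicates. SHA-256 over `to_s` is just an illustrative fingerprint; a real import would hash a canonical serialization of each table:

```ruby
require "digest"

# Group tables by a content fingerprint and report how many are redundant.
def dedup_stats(tables)
  groups = tables.group_by { |t| Digest::SHA256.hexdigest(t.to_s) }
  { unique: groups.size, duplicates: tables.size - groups.size }
end

stock = [{ "city" => "Seattle", "high" => 70 }]
dedup_stats([stock, stock.dup, [{ "city" => "Austin", "high" => 90 }]])
# => { unique: 2, duplicates: 1 }
```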

Design backend for Datablock Storage

Datablock Storage (#55084) is switching project storage from a browser => Firebase connection to a browser => Rails => MySQL connection. We have tentatively settled on a schema (see #55344 (comment)) and measured its performance characteristics (#55189); now we need to design the Rails backend.

  • Characterize async behavior of the existing Firebase backend
  • Determine acceptable load (particularly invocations/s) for the Rails backend
  • Decide batching size (if batching is possible within async semantics)
  • Write out & profile SQL queries for doing all required checks (rate limit, table count, etc) for each data block
  • Figure out how we want to implement de-duped tables, which also solves for "current table" (the live tables like Daily Weather, Spotify, and COVID caseload)
  • Design backend API
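
For the batching bullet above, one possible shape is a buffer that flushes when it fills. The flush size is an arbitrary placeholder, and the flush callback stands in for a single multi-row INSERT:

```ruby
class WriteBatcher
  def initialize(flush_size: 50, &flush)
    @flush_size = flush_size
    @flush = flush # e.g. a block issuing one multi-row INSERT
    @buffer = []
  end

  # Buffer a record; flush automatically once the batch fills.
  def add(record)
    @buffer << record
    flush! if @buffer.size >= @flush_size
  end

  # Emit any remaining records as a final partial batch.
  def flush!
    return if @buffer.empty?
    @flush.call(@buffer)
    @buffer = []
  end
end

flushed = []
batcher = WriteBatcher.new(flush_size: 3) { |batch| flushed << batch }
5.times { |i| batcher.add(i) }
batcher.flush!
flushed # => [[0, 1, 2], [3, 4]]
```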

Translations

Hi, I don't know where to ask these questions.

I am currently working on your Crowdin project and have translated about 25% of it.
I was just wondering whether you also need Filipino / Tagalog translations here or in any of your GitHub repositories?
I can also translate documentation and guides if needed, aside from the tutorials.

Thanks.

`git diff` not showing .haml diffs (for example)

After our LFS migration (#55759), @kelbyhawn reports not seeing HAML, CSS, and SCSS diffs in GitHub Desktop.
In theory, we have excluded these from LFS via .gitattributes.

I was able to repro:
image

UPDATE:
Curiously, the file is NOT in LFS (it should be a regular git file), and diffing does work on GitHub.com:
image

Is this a GH Desktop bug?

code-studio-common.js is about 10M

code-studio-common.js is about 10 MB, and essential.js is about 1.6 MB.
I assume the version we downloaded has been compiled. A single JS file that large makes the website very slow to open on a first visit.
There must be some historical reason for this. Are there any ideas for improving it?

How can I configure this in the recent repo?

# These enable the local code-studio build
build_code_studio: true
use_my_code_studio: true
# This enables the local blockly-core build
build_blockly_core: true

Cache repo between drone runs

  • We currently get ~200 clones per day of our repo: https://github.com/code-dot-org/code-dot-org/graphs/traffic
  • Each clone will consume more than 20GB of LFS data (possibly much more, tbd)
  • The majority of our clones are from Drone (see below)
  • If we implement a caching mechanism, we can dramatically reduce the number of clones.

Elijah had the idea that we could cache a zip download of the tagged releases from https://github.com/code-dot-org/code-dot-org/releases

By implementing a mechanism for caching git checkouts and "fast-forwarding" them to the current branch, we can dramatically reduce LFS data usage, at least the usage attributable to Drone.

Fixed by #56389

Where can I edit these variables?

A few years ago I could change the Pegasus port number:

nano /home/codedotorg/code-dot-org/deployment.rb

'pegasus_port'                => 3000,
To
'pegasus_port'                => 3001,

How can I do this with the recent repo?
