Logging metrics and flags #59
Metrics used in "Discovering Patterns in Student Activity" were classified as follows.

UNIVERSE events describe the state of the universe in which the students are working. They are time-based events that represent fixed points relative to the due date.
• N days before the assignment due date (N = 5, 4, 3, 2, 1)

SUBMISSION events describe how far along a student is in his or her submission sequence.
• Student has made his/her first submission

METRIC events describe changes that a student has made to his or her code, based on code metrics computed for each submission.
• Number of non-commented lines of (test, non-test) code (increased, decreased)
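A sketch of how this taxonomy could be encoded for logging, assuming plain JavaScript constants; the names and example payloads are illustrative, not taken from the paper:

```js
// Illustrative event categories mirroring the paper's taxonomy.
const EVENT_CATEGORIES = {
  UNIVERSE: 'universe',     // fixed points relative to the due date
  SUBMISSION: 'submission', // progress through the submission sequence
  METRIC: 'metric'          // changes in computed code metrics
};

// One hypothetical example event per category.
const examples = [
  { category: EVENT_CATEGORIES.UNIVERSE, name: 'days_before_due', value: 3 },
  { category: EVENT_CATEGORIES.SUBMISSION, name: 'first_submission' },
  { category: EVENT_CATEGORIES.METRIC, name: 'non_comment_lines_change', delta: 12 }
];
```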
Metrics from another learning system are focused on completion time and error frequency.
Another metric is flow, which can be measured by asking, after each challenge ends: a) How challenging did you find the last activity, on a scale of 1 to 5?
From the MSLQ (Motivated Strategies for Learning Questionnaire), to be administered after the first week, after the first quiz, and before the exam (maybe alongside other course material): intrinsic motivation questions and extrinsic motivation questions.
I will start modifying the challenges to adopt the new code multiple-choice challenge format, which the content of the challenges should follow from now on.

In order to track a code challenge, I need to log every run and submission and save that in the trial. To record the history, I would need to log an array of activities on the trial. The trial activities are events that occur while debugging.

I also need to prompt the user for the flow rating at the end of each trial and store that result in the trial itself, for now under the field metrics, which is a Mixed type holding other stuff we log such as start time, end time, and averages of activity logs (percentage of errors, etc.).

The events that can be logged can be categorized by mode. Some events need to be appended by time proximity: logging debug.step ten times makes little sense, so we can just lump it into a single debug.step entry with a start and end time. See the sketch below.
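A minimal mongoose sketch of this trial logging, assuming the shapes described above; the schema, field names (activities, flowRating, metrics), and the 2-second lumping window are illustrative assumptions, not the project's actual implementation:

```js
const mongoose = require('mongoose');
const { Schema } = mongoose;

// One coalesced activity event: repeated events of the same kind
// (e.g. debug.step) are lumped into a single entry with a time span.
const activitySchema = new Schema({
  event: String,             // e.g. 'debug.step', 'run', 'submit'
  mode: String,              // mode/category the event belongs to
  startTime: Date,
  endTime: Date,
  count: { type: Number, default: 1 } // how many raw events were lumped
}, { _id: false });

const trialSchema = new Schema({
  user: { type: Schema.Types.ObjectId, ref: 'User' },
  activities: [activitySchema], // history of events during the trial
  flowRating: Number,           // 1-5 flow answer prompted at trial end
  metrics: Schema.Types.Mixed   // start time, end time, error %, averages
});

// Lump a new event into the previous entry if it is the same kind
// and arrives within windowMs; otherwise append a fresh entry.
function logActivity(trial, event, mode, time, windowMs = 2000) {
  const last = trial.activities[trial.activities.length - 1];
  if (last && last.event === event && time - last.endTime <= windowMs) {
    last.endTime = time;
    last.count += 1;
  } else {
    trial.activities.push({ event, mode, startTime: time, endTime: time });
  }
}

module.exports = mongoose.model('Trial', trialSchema);
```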
As for code assessment by multiple-choice challenges:
challengeTrial and arenaTrial will no longer use mongoose-version; instead, updates to user metrics should be maintained in an associated collection dedicated to logging metrics (a sketch follows).
For now, that collection's metrics payload should be a Mixed type.
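A sketch of what that dedicated logging collection could look like in place of mongoose-version snapshots; the model name (UserMetricsLog) and its fields are assumptions for illustration:

```js
const mongoose = require('mongoose');
const { Schema } = mongoose;

// Append-only log of user metric updates, kept outside the trial
// documents so challengeTrial/arenaTrial no longer need mongoose-version.
const userMetricsLogSchema = new Schema({
  user: { type: Schema.Types.ObjectId, ref: 'User', index: true },
  trial: Schema.Types.ObjectId,  // the trial the update came from
  trialType: { type: String, enum: ['challenge', 'arena'] },
  metrics: Schema.Types.Mixed,   // free-form metric payload, Mixed for now
  createdAt: { type: Date, default: Date.now }
});

module.exports = mongoose.model('UserMetricsLog', userMetricsLogSchema);
```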
Challenges and arenas should have a flags Mixed attribute, which has known default settings but can also be freely set in order to account for custom challenges. A sketch is below.
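A sketch of such a flags attribute on a challenge schema, assuming mongoose; the specific default flag names here are hypothetical:

```js
const mongoose = require('mongoose');
const { Schema } = mongoose;

// Hypothetical default flag settings; custom challenges can override
// or extend these freely because the field is a Mixed type.
const DEFAULT_FLAGS = {
  promptFlowRating: true, // ask the 1-5 flow question after the trial
  logActivities: true,    // record the run/submit/debug event history
  multipleChoice: false   // whether this is a code multiple-choice challenge
};

const challengeSchema = new Schema({
  // ...existing challenge fields...
  flags: { type: Schema.Types.Mixed, default: () => Object.assign({}, DEFAULT_FLAGS) }
});

module.exports = mongoose.model('Challenge', challengeSchema);
```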