A one-hour workshop where we mainly discuss and sketch:
- UI for teaching in this context.
- UI for batch teaching, teach-as-you-go, or a combination of the two.
- How to handle the case where the ML model is unsure of a prediction (the model outputs a percentage indicating how "far away" the input is from the teaching data). An uncertain prediction can still be right or wrong, so this connects to the next point.
- How should the user interact when the model predicts incorrectly?
- What does it mean that the whole feature space of location/weekday/timeofday/activity is available, and that any point in this space can be connected to any journey?
- Discuss teaching as a way to program [1].
- Teach the app a made-up, perhaps unrealistic, behaviour.
- Evaluate the behaviour: "Does it work the way you expect?"
- Write down the behaviour you taught the app and whether the results are logical with respect to the teaching.
- Think about/sketch how the behaviour you taught could be presented/visualised so that you would understand it.
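The teach/predict loop in the points above can be sketched in code. This is a minimal, hypothetical sketch: the feature names (location, weekday, timeofday, activity) come from the notes, but the model, the similarity measure, the threshold value, and all function names are assumptions, not the app's actual implementation.

```python
# Hypothetical sketch of the teach-as-you-go loop discussed above.
# The real app's model and its "% far from teaching data" measure are
# unknown; here a crude feature-match fraction stands in for both.

TAUGHT = []  # list of (features, journey) pairs supplied by the user

FEATURES = ["location", "weekday", "timeofday", "activity"]


def teach(features, journey):
    """Store one taught example (batch teaching would append many at once)."""
    TAUGHT.append((features, journey))


def similarity(a, b):
    """Fraction of matching feature values, in [0, 1]."""
    return sum(a[k] == b[k] for k in FEATURES) / len(FEATURES)


def predict(features, threshold=0.75):
    """Return (journey, confidence); journey is None when the input is
    'far away' from the teaching data and the UI should ask the user."""
    if not TAUGHT:
        return None, 0.0
    best_features, best_journey = max(
        TAUGHT, key=lambda example: similarity(features, example[0]))
    confidence = similarity(features, best_features)
    if confidence < threshold:
        return None, confidence  # low confidence: hand over to the user
    return best_journey, confidence


# Teach a made-up behaviour, then evaluate it.
teach({"location": "home", "weekday": "Mon", "timeofday": "morning",
       "activity": "walking"}, "bus 3 to work")
journey, conf = predict({"location": "home", "weekday": "Mon",
                         "timeofday": "morning", "activity": "walking"})
print(journey, conf)  # exact match: full confidence
```

The threshold is where the UI questions above bite: below it the app must either ask the user or show its best guess marked as uncertain, and each choice implies a different teaching interaction.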
Add feedback here: general issues, thoughts, sketches, or just notes as reminders; the more the better. Anything that relates to the research questions above is especially valuable:
https://github.com/k3larra/commuter/blob/master/UserStudy/week6/workshop1a.md
[1] P. Y. Simard et al., “Machine Teaching: A New Paradigm for Building Machine Learning Systems,” 2017.