Replace our subjective assessments of a student's technical skills with quantitative, objective quizzes. Provide students with instant feedback and direction on their technical mastery.
At Bloc, we use a lot of time and human capital trying to identify which students are doing well vs. those who are not. What might we add to the Bloc experience that would help efficiently assess these students in terms of their ability to get a job at the end of the course?
In addition to a low-cost way to assess fundamental technical skills, we also knew we needed:
Coming out of this brainstorming meeting, the lowest-cost idea that hit all three requirements was a quizzing feature. I was tasked with building out an MVP to soft-test as a potential product solution.
I iterated on this feature independently from February through April, when it got on the roadmap to be built as a feature in-platform. From April 2016 onward, I continued to scale, evaluate, and iterate on quizzes as a measure of technical proficiency.
Quizzes are only effective if they're testing the right thing. We quickly found that we had no way of discussing whether we were testing the right thing or not.
I had crowdsourced quiz questions from our mentorship, curriculum, and engineering teams. This meant that quizzes were written either based on what was taught in the curriculum, or based on each engineer's personal bias about which knowledge was important.
However, if a student gets a question wrong (or if many students get the same question wrong), it could also mean that we are not teaching effectively. Or, we could even be teaching the wrong subject matter.
One of the mentors on the team introduced us to learning objectives, which are often used in schools. Learning objectives are a tool used in Mastery Learning, which requires that a student demonstrate mastery of one level of a subject before proceeding to the next level.
After much iteration, we found it to be a great method for aligning on our intent before we even began writing the curriculum. Moving forward, the process for editing or adding curriculum looked like this:
This new process ensured that we were teaching practical skills that would land students jobs, and that the curriculum and student proficiency would be evaluated independently. More valuable than the quizzing feature itself was the discovery that we needed learning objectives.
When I began this project, there was a lot of skepticism about the value of adding quizzes to the Bloc platform. For a lot of stakeholders, it felt like it would get complicated and out of hand very quickly.
To mitigate these concerns, I developed 3 principles:
This process helped me identify major risks early on (such as challenge #1), which allowed me to mitigate them in a controlled environment. If I had spent all my time designing a complex quizzing feature, it would have wasted resources and been harder to course correct.
I started out using many third-party tools (GoConqr, Quizlet, Typeform, Google Sheets). Getting hands-on experience with the utility of these different tools made it easy to evaluate a build-borrow-buy decision when it came to integrating the quiz feature into the platform. We quickly found that some features we initially thought were important (e.g., a timer for each question) turned out to be less so.
The original intent of the project was to provide students with instant feedback loops, so they could support their own learning between mentor meetings.
Through various iterations, we found out that this meant:
There are many opportunities to improve on the groundwork that quizzes have set.
These are big problems that take educators many years to tackle. As I learned in challenge #2, it is much more important to create shared ownership and understanding around the problem than to "own" the problem myself.
I left Bloc without having implemented all the changes I wished to see, but with full confidence that there is a shared culture of understanding around the nuances, the opportunities, and the limitations of each path we might take.