Behind Our Learning Curriculum

November 17, 2023

What happens behind NewCampus’ curriculum? Megan (Learning Experience Design Lead) shared her approach to creating user-centric learning experiences, exploring learners’ understanding, and appreciating imperfections in the process.

Decoding learning dynamics: Megan's approach to user testing in cohort learning environments

Siska: Could you walk us through your approach to user testing in creating the learning materials for Management Essentials, Leadership Essentials, and Stakeholder Essentials?

Megan: In any sort of cohort-based learning, the potential for failure or friction is quite high.

You're never going to be able to fully eliminate that chaos or that failure, but it's more of a process of figuring out how to make sense of it or ensure that tension is happening at the right time and that it's also balanced against moments of success, of feeling good.

User testing is really a process of understanding where those different highs and lows are, how we can either make the highs higher or make the lows not as devastatingly low, and how the sequencing can help the entire narrative come together.

User testing is also acknowledging that our learners and our users are fundamentally quite different from us in terms of the work they do, the headspace they're in, their level of motivation, and their needs. It helps us understand not only how a particular product works but also helps us understand that group of users much better.

In tactical terms, this means always making sure that we are testing something out on different levels before we actually go live with it. First, we test individual activities to see if they hit the objectives we set out. Second, we test the sequence: does the sequence of activities make sense? Could we have done them slightly differently, and would that actually help people understand better? Third, we test instructional clarity, making sure that whatever instructions we provide, and whatever we are teaching, are clear and coherent regardless of people's cultural backgrounds or their level of language ability.

It’s really important to us that two learners, perhaps coming from very different industries and countries, can still understand the instructions as quickly as possible and actually spend their time in deep discussion.

Involving learners long-term: from practicality to impact

Siska: I remember an invitation went out to alumni to sit in on Stakeholder Essentials while it was being developed. How do you incorporate their feedback into the design process and the end product, especially when trying to balance theory and practicality for people managers?

Megan: That was really rewarding for us, because previously we usually relied on some sort of internal testing mechanism; we were our own guinea pigs. Though that’s a fast way to test, the limitation is that it can't fully replicate how our learners think.

When we brought in those alumni for Stakeholder Essentials during development, it was quite an interesting process to see how people responded to the content, what they found relevant, and how they brought in their own stories.

Apart from just doing dry runs, we’ve also run the same programs multiple times with different clients and cohorts, which serves as an ongoing process of testing and continuously improving the program.

There's no real way to test for practicality beyond perceived practicality. In an actual dry run, we can only understand how practical people think the program is, not necessarily how practical it actually is. This is where community engagement comes in for us.

What happens with learners two, four, or six months after the program is particularly interesting to me. We conduct quite a lot of interviews with learners, and reading them is very helpful because I can understand how the learnings have played out in the real world. I learn which specific skills managers say have had a big impact on them.

This is the practical aspect, because implementing soft skills in the real world requires constant reflection. Confidence also becomes very important for people to actually try those skills out for themselves and fully embrace the identity of being a good manager or an emerging leader.

To come back to the question, theoretical knowledge can be assessed within the testing process, where we can test for understanding and changes in mindset. However, there's no real way to test for practicality unless we maintain a long-term relationship with the learners and continuously adapt our program based on their real-world feedback.

Siska: That's an interesting concept you mentioned: the "blending" moment sometimes happens after the program, and it's something you didn't expect. Can you share a specific story of learning that surprised you the most?

Megan: One moment in testing that surprised me was actually during our internal testing process. We were designing Leadership Essentials, which is specifically tailored for individuals who are thinking at a leadership level in terms of the impact they can have. We believe that true effectiveness as leaders only comes about when the group is aligned on the vision, culture, and behaviours they want to embody.

During our internal testing, I was surprised to see our own advocates at NewCampus engaging in these discussions and reaching some level of alignment on where we’re going and what we’re focussing on. It was also remarkable to hear thoughts and reflections that had previously gone unspoken but were now being openly discussed with one another.

This was surprising because I didn't expect the user testing process to have such a profound impact on our team's thinking and internal dynamics. It's a joyful moment to witness something working as designed and also benefiting the people and teammates I care about.

The productivity of imperfection: failing fast, learning faster

Siska: This question might be a bit broad, but reflecting on your experience, is there a common thread about people that you've noticed through user testing?

Megan: It's really about imperfection as something that is precious, but also as something that can be quite productive and powerful.

It ties in with a lot of things: how businesses are built, knowledge about leadership, and education. This appreciation of imperfection runs counter to what we, as a society, often consider important. Too often we strive towards perfect, watertight, efficient experiences, processes, and products.

Coming back to that original thought about the points of friction and failure and why that is really important, the user testing process itself is really trying to fail as much and as hard and as fast as possible for the purpose of learning.

Because when things go well, when things are going exactly as you imagine, there's not a lot of room for learning and conversation, unfortunately. But in the user testing process, you have to learn to respond quite positively and optimistically to things that are falling apart.

For me, that's a common thread across human existence. Nowadays, in product development or in the way tech companies are set up, things start with failing fast and embracing mistakes as opportunities. But unfortunately, as tech companies grow in size, perfection, this idea of excellence as having no room for failure, somehow takes hold.

As things scale, that's usually the way it goes, because the risk is a lot higher: if you fail, there's a lot more money on the table and a lot more to lose.

But what’s great about observing the testing process is that everyone’s willing to embrace those moments of failure, and even gather around them. Knowing that they’re in an environment where something imperfect is being run somehow also frees people up to be more reflective, and accepting of their own messy thought processes. That's the thing that resonates with me.

I guess the reason we're having this conversation is that a lot of these spaces for failure are usually kept internal. We don't often share our user testing process, because it's also a process of going, oh, this one was really messy, so how do we fix it and move on? The result is that we tend to learn only privately.
