
What past technology shifts have taught L&D about human engagement
L&D has never lacked tools. What it has often lacked is discipline in deciding what to adopt, when, and how. Those who have lived through multiple cycles of change will recognize the pattern: Flash, rapid authoring tools, mobile learning, microlearning, ever-evolving learning platforms, and the AR/VR boom. Each wave came with a promise. Each brought real progress and created new challenges.
Now AI has entered the room, and once again L&D is said to be on the verge of transformation. The risks, too, are familiar, and they are being created faster than ever. So what can we learn from the past?
Human involvement is a design choice
Human involvement is often described as a review step: a way to check AI output before it goes into production. That description undersells what is really at stake. So let’s remember who is in charge.
In L&D, human involvement is about ownership.
Who decides what is important? Who decides what can be cut? Who decides how much complexity is helpful and when it becomes a burden?
AI can suggest structures, examples, and assessments. It cannot determine what matters in a particular context. These are design decisions, they have consequences, and humans must own them.
When speed impairs learning
AI makes content creation faster and cheaper. That is genuinely useful, and it is also something senior L&D leaders should pause to reflect on: as speed becomes the dominant metric, L&D quietly shifts its focus from learning design to content production.
The question shifts from “What do learners need to be able to do?” to “How fast can I create it?” Efficiency alone does not improve learning. Without judgment, it just creates more noise.
Instructional design goes beyond content creation
As automation increases, the role of instructional design is quietly changing.
If instructional designers are judged primarily on output volume, AI will soon outpace them. That does not mean the role is obsolete; it is a sign that L&D has been defining its value too narrowly.
In more mature L&D functions, instructional design is responsible for learning architecture, clear intent, thoughtful sequencing, and deliberate decisions about what to leave out.
Human judgment belongs here: not at the end of the process, but at its center.
What AR and VR have taught L&D
This isn’t the first time L&D has been here. AR and VR once held great promise: immersive learning, safe simulation, high engagement. A very attractive future. Several years on, AR and VR are still around, but many teams struggle to use them meaningfully or at scale.
That is not because the technology failed, but because success required strong instructional design and a complex ecosystem that was hard to build and expensive to maintain. Without clear learning intent, immersion becomes spectacle; experience replaces purpose. That lesson is worth remembering.
Why this matters with AI
AI feels different because it is more accessible: implementation is faster and experimentation is cheaper. But the underlying risks are familiar. When technology leads and learning intent follows, L&D ends up creating content that looks sophisticated and confident but still fails to change behavior.
AR and VR taught L&D that engagement does not automatically produce learning. If human involvement is treated as optional, AI will teach the same lesson.
Quiet risks L&D leaders should watch for
The biggest risk is not obviously poor content; it is content that looks right but adds little value. AI-generated learning often uses the appropriate language and follows accepted patterns. Without human judgment, L&D risks filling the platform rather than supporting performance.
The questions worth asking are simple.
Are we designing learning or just adding content? Are we respecting our learners’ time? Are we clear about what they actually need to learn? Are we getting real return on our learning budget?
Human involvement as an L&D function
Human involvement cannot be left solely to individual designers. It needs to be reflected in how L&D operates: clear expectations about learning quality, space to challenge content that feels unnecessary, support for thoughtful pushback, and ongoing reflection on design decisions.
When this is done well, L&D does not produce more learning. It produces better learning.
Questions that shape what comes next
AI will continue to evolve. That is inevitable. The more important question for L&D is which learning decisions we are willing to hand over, and which must remain human. If we don’t answer that intentionally, it will be answered for us by tools, schedules, and convenience. And there will be no going back.
Keeping humans in the loop is not about resisting change. It’s about taking responsibility for learning.
