
Lessons learned from building educational technology
There are statistics that should concern any L&D leader considering custom learning technology. According to research by the Standish Group, approximately 66% of software projects fail to meet expectations or are abandoned altogether. In educational technology, where student outcomes and taxpayer dollars are at stake, those numbers should be unacceptable. But here is what most people misunderstand about why EdTech projects fail: it is almost never the coding, and it is almost never the budget. In most cases, eLearning architecture decisions, fundamental choices made in the first two weeks of a project, determine everything that follows.
I’ve spent over a decade building custom software, with the majority of that time focused on education technology for K-12 schools and charter school networks. Successful platforms shared a set of common architectural patterns. Those that failed shared another set. Here’s what I learned.
In this article…
1. Design with teacher workflows in mind, not administrator wishlists.
The most common mistake in EdTech platform development is building from the top down. Administrators or district leaders define the requirements. The development team builds to those specifications. The platform launches. Teachers hate it.
This happens because administrators think in terms of data such as enrollment numbers, compliance reports, and performance metrics. Teachers think in terms of workflow. “I need to check attendance, distribute today’s assignments, see who is late, and contact three parents before lunch.”
Something interesting happens when you architect your eLearning platform around teacher workflows first: the data administrators need emerges naturally as a byproduct of teachers' work. Attendance data, engagement metrics, performance trends, and more are captured without adding a single extra click to a teacher's day.
Practical points
Before writing a single line of code, spend a day each observing three to five teachers. Map their workflows minute by minute. Then design your data model to capture what teachers are already doing, rather than asking them to do something new.
Research from the International Society for Technology in Education (ISTE) consistently shows that teacher buy-in is the strongest predictor of technology implementation success in schools. An eLearning architecture that respects teacher workflows is not just good design; it is fundamental to successful implementation.
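As a minimal sketch of this idea (the class and field names here are hypothetical, not from any real platform), a single attendance-taking action can serve as both the teacher's workflow step and the source of the administrator's compliance metric:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AttendanceRecord:
    student_id: str
    status: str          # "present", "tardy", or "absent"
    recorded_at: datetime

@dataclass
class Classroom:
    records: list[AttendanceRecord] = field(default_factory=list)

    def take_attendance(self, student_id: str, status: str) -> None:
        # The teacher's workflow step: one tap per student, nothing extra.
        self.records.append(
            AttendanceRecord(student_id, status, datetime.now(timezone.utc))
        )

    def attendance_rate(self) -> float:
        # The administrator's metric: derived from the same records,
        # never re-entered by the teacher.
        if not self.records:
            return 0.0
        present = sum(1 for r in self.records if r.status != "absent")
        return present / len(self.records)

room = Classroom()
room.take_attendance("s1", "present")
room.take_attendance("s2", "tardy")
room.take_attendance("s3", "absent")
print(round(room.attendance_rate(), 2))  # → 0.67
```

The point of the design is that `attendance_rate` is computed, not collected: the reporting need is satisfied without changing what the teacher does.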
2. Build FERPA compliance into the data layer, not the application layer
The Family Educational Rights and Privacy Act (FERPA) governs how student education records are handled. Most development teams treat FERPA compliance as a feature, something that is added on top of the working platform. This approach raises two serious problems.
First, forcing compliance onto an existing architecture inevitably creates gaps. When student data flows through systems that were not designed with privacy in mind from the ground up, it is nearly impossible to guarantee that personally identifiable information (PII) will not leak through logging systems, error reports, third-party analytics, or cached API responses. Second, retrofitted compliance is expensive. I have seen organizations spend more on FERPA compliance audits of existing platforms than it would have cost to build compliance in properly from the start. The solution is architectural: compliance must live in the data layer itself.
In practice, this means implementing data classification at the schema level. Every piece of data entering the system is tagged as one of three categories: directory information (usually shareable), education records (protected by FERPA), or anonymized data (aggregated and de-identified). Access controls, audit logging, and data retention policies are then applied automatically based on these classifications, regardless of which application feature is accessing the data.
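A minimal sketch of schema-level classification might look like the following. The field names, roles, and access rules are illustrative assumptions, not legal guidance; the point is that access checks and audit logging key off the classification, not the calling feature:

```python
from enum import Enum, auto

class Classification(Enum):
    DIRECTORY = auto()         # usually shareable
    EDUCATION_RECORD = auto()  # FERPA-protected
    ANONYMIZED = auto()        # aggregated and de-identified

# Hypothetical schema: every field is tagged at definition time.
SCHEMA = {
    "student_name": Classification.DIRECTORY,
    "grade_level": Classification.DIRECTORY,
    "assessment_scores": Classification.EDUCATION_RECORD,
    "cohort_average": Classification.ANONYMIZED,
}

# Hypothetical role policy, expressed against classifications only.
ROLE_ACCESS = {
    "public": {Classification.DIRECTORY, Classification.ANONYMIZED},
    "teacher": {Classification.DIRECTORY, Classification.ANONYMIZED,
                Classification.EDUCATION_RECORD},
}

audit_log: list[tuple[str, str, bool]] = []

def read_field(role: str, field_name: str, row: dict) -> object:
    # The check depends only on the field's classification, so every
    # application feature inherits the same policy and audit trail.
    allowed = SCHEMA[field_name] in ROLE_ACCESS.get(role, set())
    audit_log.append((role, field_name, allowed))
    if not allowed:
        raise PermissionError(f"{role} may not read {field_name}")
    return row[field_name]

row = {"student_name": "A. Student", "assessment_scores": [88, 92]}
print(read_field("teacher", "assessment_scores", row))  # → [88, 92]
```

Denied reads still land in `audit_log`, which is what makes a later compliance audit a query rather than an archaeology project.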
Practical points
If your development partner cannot explain their data classification strategy in the initial architecture meeting, they are planning to retrofit compliance later. That's a red flag.
3. Separate the learning engine from the content layer
One of the most important decisions in eLearning architecture is how tightly the learning logic (assessment, progress tracking, adaptation pathways) is tied to the content itself (lessons, videos, quizzes, reading material). Tightly coupled systems, where the quiz logic is embedded directly in the lesson content, are faster to build initially, but they are a nightmare to maintain. As the curriculum changes (and it always does), updating a tightly coupled system means touching content and logic at the same time. This introduces bugs and requires developer involvement in what should be a content editor's job.
Loosely coupled systems separate these concerns. Content editors manage lessons through a content management layer, while the learning engine handles sequencing, assessment scoring, and progress tracking independently. The two communicate through a well-defined interface, often using standards such as SCORM, xAPI, or LTI to ensure interoperability between the content layer and external systems. This separation pays dividends in three specific ways:
Curriculum updates become a content task, not an engineering task. Teachers or curriculum experts can update lessons without developer support.
The learning engine can be reused across programs. A charter school network, for example, can run the same assessment and progress-tracking engine across campuses with different curricula.
Analytics become more meaningful. When learning logic is separated from content, student performance can be compared across content versions, providing powerful data for curriculum improvement.
Practical points
Ask your development team whether a curriculum specialist can update a lesson without filing a support ticket. If the answer is no, content and logic are too tightly coupled.
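To make the separation concrete, here is a minimal sketch (the content structure and function names are hypothetical, and real systems would use SCORM/xAPI/LTI rather than this toy interface). The engine depends only on a small, stable contract: question IDs and answer keys. Everything else in the content is opaque to it:

```python
# Content layer: plain data a curriculum specialist can edit directly,
# with no developer involvement and no embedded scoring logic.
quiz_content = {
    "lesson": "Fractions 101",
    "questions": [
        {"id": "q1", "prompt": "Reduce 6/8.", "answer": "3/4"},
        {"id": "q2", "prompt": "Reduce 2/4.", "answer": "1/2"},
    ],
}

def score_quiz(content: dict, responses: dict[str, str]) -> float:
    """Learning engine: scores any content that satisfies the interface.

    It reads only question IDs and answer keys, so editing prompts,
    adding media, or swapping the whole curriculum never touches engine
    code, and the same engine is reusable across programs.
    """
    questions = content["questions"]
    correct = sum(
        1 for q in questions if responses.get(q["id"]) == q["answer"]
    )
    return correct / len(questions)

print(score_quiz(quiz_content, {"q1": "3/4", "q2": "2/3"}))  # → 0.5
```

Because scores are keyed to question IDs rather than lesson text, performance can also be compared across content versions, which is the analytics benefit described above.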
4. Measure everything from day one
In my experience, the most underrated aspect of EdTech platform architecture is instrumentation: the practice of embedding data collection points throughout the system to capture how students and teachers actually interact with the platform. Most teams plan to "add analytics later." This is a mistake for a simple reason: you cannot retroactively collect data about interactions that have already happened. If you launch without instrumentation in September and realize in December that you want engagement data from the first semester, that data is gone. Effective instrumentation for educational platforms goes beyond page views and clicks. The metrics that actually inform learning outcomes include:
Time on task by content type. Are students spending more time on videos or on reading? This tells you how effective each content format is.
Assessment attempt patterns. How many attempts does mastery take? Where do students abandon an assessment? This reveals spikes in curriculum difficulty.
Help-seeking behavior. When and how do students ask for help? This shows where instructional support is needed.
Session patterns. When and for how long do students engage? This informs scheduling and pacing decisions.
A key decision in your e-learning architecture is to build an event-driven data pipeline that captures these interactions in real time without impacting platform performance. This typically means implementing an asynchronous event bus that writes interaction data to a separate analytics datastore, keeping your primary application fast while building rich datasets for analytics. As AI capabilities shape K-12 education software, this instrumentation data becomes even more valuable and feeds into adaptive learning models that personalize the student experience.
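A minimal sketch of such a pipeline follows, using an in-process queue as a stand-in for a real event bus and a list as a stand-in for the analytics datastore; the event names and payload fields are illustrative. The hot path only enqueues and returns, so tracking never blocks a student's request:

```python
import queue
import threading
import time

events: "queue.Queue[dict]" = queue.Queue()
analytics_store: list[dict] = []  # stand-in for a separate analytics database

def track(event_type: str, **payload) -> None:
    # Called from the request path: enqueue and return immediately,
    # so instrumentation adds no user-visible latency.
    events.put({"type": event_type, "ts": time.time(), **payload})

def worker() -> None:
    # Background consumer: drains the queue into the analytics store.
    # A real system would batch-insert into a warehouse here.
    while True:
        event = events.get()
        if event is None:  # shutdown sentinel
            break
        analytics_store.append(event)
        events.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

track("video_paused", student_id="s1", lesson="Fractions 101", position=42)
track("quiz_attempt", student_id="s1", quiz="q-frac-1", attempt=2)

events.put(None)  # signal shutdown and wait for the worker to finish
t.join()
print(len(analytics_store))  # → 2
```

The design choice worth noting is the separate store: the primary application database stays fast and small, while the analytics side accumulates the rich interaction history that adaptive models can later consume.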
Practical points
Define your instrumentation strategy before your feature list. The data collected during the first three months after launch is the data that will determine whether the platform is actually improving learning outcomes.
5. Plan offline from the architectural level
This is the decision that separates platforms built by people who have actually visited schools from platforms built by people who have not. School internet connections are unstable. They are unreliable in rural areas, unreliable during peak usage in urban areas, and unreliable when 30 students stream video simultaneously in a classroom wired for 1990s internet loads. Despite this reality, most learning platforms are designed as purely cloud-based applications that require a constant connection. When the connection drops, the platform becomes unusable. Students lose their work. Teachers lose class time. Frustration builds. Adoption declines.
Designing for offline use does not mean building a fully offline application. It means implementing a graceful degradation strategy in which core workflows (completing assessments, viewing previously loaded content, recording attendance) keep working during connectivity gaps and synchronize when connectivity returns.
The technical approach involves client-side caching of critical content and a queue-based synchronization system that handles conflict resolution gracefully. This adds complexity to the initial architecture, but it eliminates one of the most common complaints educators have about custom learning platforms.
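A minimal sketch of the queue-and-sync idea is below. The client, key names, and last-write-wins conflict rule are deliberate simplifications for illustration; production systems need durable local storage and more careful conflict resolution:

```python
from dataclasses import dataclass, field

@dataclass
class OfflineClient:
    online: bool = False
    pending: list[dict] = field(default_factory=list)
    server_state: dict = field(default_factory=dict)  # stand-in for the backend

    def record(self, key: str, value: object) -> None:
        # Core workflows always succeed locally; connectivity only
        # decides whether the write is sent now or queued.
        action = {"key": key, "value": value}
        if self.online:
            self.server_state[key] = value
        else:
            self.pending.append(action)  # persisted to disk in a real client

    def reconnect(self) -> None:
        # When connectivity returns, replay queued actions in order.
        # Replaying in order gives last-write-wins conflict resolution,
        # the simplest possible policy.
        self.online = True
        for action in self.pending:
            self.server_state[action["key"]] = action["value"]
        self.pending.clear()

client = OfflineClient()
client.record("attendance:s1", "present")  # taken while WiFi is down
client.record("quiz:q1:answer", "3/4")
client.reconnect()
print(client.server_state["attendance:s1"])  # → present
```

The classroom-facing consequence is the one that matters: nothing the teacher or student did during the outage is lost; it is simply delivered late.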
Practical points
Ask your platform provider what happens when a student loses WiFi in the middle of an assessment. If the answer involves lost work, the architecture is not ready for a real classroom.
The common thread
These five decisions share a common philosophy: build for how education actually works, not how you wish it worked. Teachers are busy. Student data is sensitive. Curricula are always changing. Learning happens in imperfect environments with imperfect infrastructure. The platforms that succeed are the ones whose architecture acknowledges these realities from the very first design conversation.
If you're an L&D leader evaluating custom learning technology, these five questions give you a framework for assessing whether a platform is built for the real world of education:
Was the platform designed around teacher workflows or administrator wishlists?
Is compliance built into the data layer, or added on as a feature?
Can content be updated independently of the learning logic?
What interaction data is collected from day one?
What happens when the internet goes down?
The answers to these questions will tell you more about a platform's long-term viability than any feature list or demo.
Read more:
Building a custom LMS: When an off-the-shelf platform isn’t enough
