Nursiscope – Nursing Education Platform

Overview

A dedicated platform for nurses delivering courses, books, and certifications — giving a professional community their own digital home. I developed both the backend (course management, payments, authentication) and the full frontend experience.

Engineering a Nursing Education Platform from 0 to 1

Role: Full-Stack Engineer
Stack: SvelteKit, Payload CMS, MongoDB, AWS S3, PayPal
Type: B2B2C SaaS

Nursing education in 2024 is still surprisingly fragmented. Students bounce between YouTube tutorials, PDF drives, random Teachable courses, and WhatsApp groups just to piece together CPD-compliant learning. There's rarely one authoritative place where you can take a structured course, sit an assessment, get a verifiable certificate, and read supporting literature — all under one login.

NursiScope is that place.

It's a full-stack SaaS platform purpose-built for nursing professionals and students. Users can enroll in structured courses, complete module-by-module assessments, purchase educational books, engage with a curated nursing blog, and — once they complete a course — receive a programmatically generated PDF certificate delivered straight to their inbox.

The business model is B2B2C: administrators and editors publish and moderate content, while nurses and students consume and pay for it. The platform handles everything from content management to payment processing to certificate issuance without a human needing to touch a button.

The Core Problem Set

| Problem | How NursiScope Solves It |
| --- | --- |
| Fragmented nursing resources | Single platform for courses, books, and research |
| No accessible certification | Automated PDF cert on course completion |
| Password-based auth on shared hospital PCs | OTP-based login — zero credential exposure risk |
| Manual enrollment after payment | PayPal capture → auto-enrollment in < 1s |

Tech Stack

| Layer | Technology | Why |
| --- | --- | --- |
| Frontend | SvelteKit | Smaller bundles, SSR/CSR hybrid, reactive without a vDOM |
| Backend / CMS | Payload CMS (Next.js) | Headless CMS with built-in admin UI, RBAC, and REST/GraphQL |
| Database | MongoDB (Atlas) | Document model fits nested course/module/assessment schemas |
| File Storage | AWS S3 | Managed, scalable, CDN-ready for media and PDFs |
| Payments | PayPal JS SDK | Widely trusted, fast checkout, webhook-free capture model |
| Email | Nodemailer + React Email | Composable email templates in TSX |
| Monorepo | Turborepo + pnpm | Parallel builds, shared packages, single lockfile |
| Auth | JWT + OTP | Stateless, mobile-friendly, secure on shared devices |

System Architecture

The Decoupled Monorepo Approach

The entire platform lives in a single Turborepo monorepo with two deployable apps — apps/web (SvelteKit) and apps/server (Payload CMS) — and shared internal packages under packages/.

This was a deliberate choice over a "one big Next.js app" setup. Keeping the frontend and backend completely separate means:

  1. Independent deployments. The SvelteKit frontend deploys to Netlify's edge network. The Payload backend deploys to its own Node.js environment. A broken CSS change never takes down the API.
  2. Shared types, zero duplication. Auto-generated TypeScript types from Payload's schema (payload-types.ts, ~2,089 lines) are the single source of truth for every API shape the frontend consumes.
  3. Team scalability. Frontend and backend engineers can work in parallel without stepping on each other's builds.
bash
    monorepo/
├── apps/
│   ├── web/          → SvelteKit (Netlify)
│   └── server/       → Payload CMS + Next.js (Node)
└── packages/
    ├── ui/           → Shared Svelte components (@repo/ui)
    └── eslint-config → Shared lint rules
  
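The shared-types point deserves a concrete illustration. A minimal sketch (the interfaces below are simplified stand-ins for what payload-types.ts actually generates, and the function name is illustrative) of how the frontend consumes the backend's schema with full type safety:

```typescript
// In the real repo these interfaces come from the auto-generated
// payload-types.ts; they are reproduced here (simplified) so the
// example is self-contained.
interface Course {
  id: string;
  title: string;
  modules: { title: string; order: number }[];
}

// Payload list endpoints wrap results in a `docs` array.
interface PaginatedDocs<T> {
  docs: T[];
  totalDocs: number;
}

// A typed accessor the SvelteKit side might use: the compiler now
// checks every field the UI touches against the backend schema.
function firstCourseTitle(res: PaginatedDocs<Course>): string | undefined {
  return res.docs[0]?.title;
}

const sample: PaginatedDocs<Course> = {
  docs: [{ id: "1", title: "Wound Care Basics", modules: [] }],
  totalDocs: 1,
};
// firstCourseTitle(sample) → "Wound Care Basics"
```

If the backend renames a field, the frontend fails at compile time instead of at runtime — which is the whole point of generating types from one schema.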

Why Payload CMS Over a Custom REST API

This is probably the biggest architectural decision in the project, and honestly the one that saved the most time.

Building a nursing education platform means you need: a content editor that non-engineers can use, fine-grained access control per content type, file upload handling, and a full CRUD API — before you've written a single line of business logic.

A custom Express/Fastify API would have required building all of that from scratch. Payload gives you:

  • A production-quality admin panel with rich-text editing (Lexical), media management, and relationship fields — for free.
  • Collection-based schema that acts as both the database model and the API definition. Define your courses collection once; Payload auto-generates GET /api/courses, POST /api/courses, etc.
  • Hook system for business logic. beforeChange, afterChange, afterRead hooks let you intercept any database operation cleanly — no middleware spaghetti.
  • Built-in RBAC at both the collection and field level.

The trade-off? Payload brings the overhead of a React-based admin UI and imposes its own opinions on project structure. For a platform where content managers need a real editing experience, it was completely worth it.
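To make the "define once, get an API for free" claim concrete, here is a sketch of a Payload-style collection config. The field names are illustrative, not the actual NursiScope schema, and in a real project this object would be typed as CollectionConfig from the payload package:

```typescript
// Hypothetical courses collection — field names are illustrative.
const Courses = {
  slug: "courses",
  access: {
    // Anyone may read; only admins/editors may create.
    read: () => true,
    create: ({ req }: { req: { user?: { role?: string } } }) =>
      ["admin", "editor"].includes(req.user?.role ?? ""),
  },
  fields: [
    { name: "title", type: "text", required: true },
    {
      name: "modules",
      type: "array",
      fields: [
        { name: "title", type: "text" },
        { name: "content", type: "richText" },
      ],
    },
    { name: "price", type: "number", min: 0 },
  ],
};
// From a definition like this, Payload auto-generates the admin UI
// plus GET/POST /api/courses and the per-document routes.
```

The access functions double as both API guards and admin-panel visibility rules, which is what makes the collection the single source of truth.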

High-Level Architecture Diagram

(Diagram: SvelteKit frontend on Netlify ↔ Payload CMS API on Node, backed by MongoDB Atlas, AWS S3, PayPal, and email delivery.)

The Automation Engine: Payment → Certificate

This is where the platform earns its keep. The entire enrollment-to-certificate pipeline is fully automated — no human intervention required.

The Problem with "Manual" Certification

The traditional flow at most ed-tech platforms looks like this: user pays → admin gets notified → admin manually enrolls user → user eventually gets a PDF from someone's Google Drive. That's slow, error-prone, and doesn't scale.

NursiScope's approach: the payment capture event is the trigger for everything downstream.

The Trigger-Action Chain

When a user clicks "Pay Now" on a course page, here's exactly what happens:

  • PayPal JS SDK creates an order on the client and returns an orderId.
  • User approves the payment. PayPal calls .capture() and returns the full transaction details.
  • The frontend sends POST /api/transactions with the PayPal data — paypalOrderId, paypalTransactionId, amount, payer, productId.
  • Payload creates a Transaction document with status: 'COMPLETED'.
  • The afterChange hook fires. This is where the magic is.
apps/server/src/collections/transactions.ts
    hooks: {
  afterChange: [
    async ({ doc, operation, req }) => {
      if (operation === 'create' && doc.status === 'COMPLETED') {

        // 1. Send purchase confirmation email
        // (double quotes: the apostrophe would terminate a single-quoted string)
        await req.payload.sendEmail({
          to: doc.payer.email,
          subject: "You're enrolled! 🎓",
          html: renderCoursePurchasedEmail({ user: doc.payer, course: doc.course })
        });

        // 2. Create UserProgress entry (the enrollment record)
        await req.payload.create({
          collection: 'userProgress',
          data: {
            user: doc.payer,
            courses: [{
              course: doc.course,
              progress: { status: 'not_started', currentCourseIndex: 0 },
              assessments: [],
              certificateUrl: ''
            }]
          }
        });

        // 3. Update user financial fields
        // (assumes the payer relationship is populated on this doc)
        const updatedExpenses = (doc.payer.totalExpenses ?? 0) + doc.amount;
        await req.payload.update({
          collection: 'users',
          id: doc.payer,
          data: { totalExpenses: updatedExpenses }
        });
      }
    }
  ]
}
  
  • Once the user completes all modules and passes the final assessment, certificateEnabled: true triggers PDF generation via pdf-lib.
  • The generated certificate PDF is uploaded to AWS S3.
  • The S3 URL is written back to UserProgress.courses[n].certificateUrl.
  • A certificate email is dispatched using a React Email template.
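The tail of this pipeline can be sketched as pure data transformations. The helper names below are illustrative — the real code draws the PDF with pdf-lib and uploads via the AWS SDK — but the deterministic S3 key and the write-back shape are what the data model depends on:

```typescript
// Shape of one entry in UserProgress.courses (simplified).
interface CourseProgress {
  course: string;
  certificateUrl: string;
}

// One stable key per user+course means re-issuing a certificate
// overwrites the old S3 object instead of accumulating duplicates.
function certificateKey(userId: string, courseId: string): string {
  return `certificates/${userId}/${courseId}.pdf`;
}

// Write the uploaded certificate's URL back onto the matching course
// entry without mutating the original array.
function writeBackCertificate(
  courses: CourseProgress[],
  courseId: string,
  url: string,
): CourseProgress[] {
  return courses.map((c) =>
    c.course === courseId ? { ...c, certificateUrl: url } : c,
  );
}

const updated = writeBackCertificate(
  [{ course: "c1", certificateUrl: "" }],
  "c1",
  `https://cdn.example.com/${certificateKey("u1", "c1")}`,
);
// updated[0].certificateUrl now points at the uploaded PDF
```

Keeping this step pure makes it trivial to unit-test the enrollment record independently of S3 and pdf-lib.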

Why Hooks Over a Job Queue

The natural follow-up question is: why not use a message queue (Redis/Bull) for this?

The honest answer: atomicity and simplicity at this scale.

With hooks, if the Transaction document creation fails — say, the DB throws an error — the afterChange hook never fires. No orphaned emails go out for a payment that didn't actually complete. The failure is clean and traceable.

A job queue adds infra complexity (you need Redis, a worker process, retry logic, dead-letter queues) that isn't justified until you're processing hundreds of concurrent purchases. For the current scale, async/await in a Node.js hook is perfectly fast enough. The architecture is already designed so that migrating to a queue later is a clean swap — just move the hook body into a worker.

Security & RBAC

The OTP Decision — This One's Interesting

Most modern platforms default to magic links for passwordless auth. Send a link to the user's email, they click it, they're in. Simple.

NursiScope uses OTP (One-Time Password) instead. Here's why that was the right call for this specific user base.

Nurses and nursing students often work on shared hospital computers — ward stations, library PCs, break room desktops. A magic link gets clicked and opens a session. The user walks away. The next person to open that browser is now authenticated as someone else.

OTP changes the equation:

  • The code is short-lived and single-use.
  • It requires active input from the user. You can't accidentally inherit someone else's session by clicking a leftover browser tab.
  • It works identically on mobile (copy-paste from the email app) and desktop.
  • No link in the email means no phishing risk from URL-spoofed magic links.
apps/server/src/utilities/generate-otp.ts
    // OTP stored as JSON in the _verificationToken field (code hashed before storage):
// { OTP: "<hash of '482917'>", verifying: true, email: "user@hospital.nhs.uk" }
  

The OTP is stored hashed in _verificationToken, consumed on first use, and nullified post-verification. Password reset uses a completely separate OTP flow so the two paths can never interfere.
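A minimal sketch of that flow, using Node's built-in crypto module (these are not the actual NursiScope utilities — the function names and the choice of SHA-256 are illustrative assumptions):

```typescript
import { randomInt, createHash, timingSafeEqual } from "node:crypto";

// randomInt is cryptographically secure and avoids modulo bias;
// padStart keeps leading zeros so codes are always 6 digits.
function generateOtp(): string {
  return randomInt(0, 1_000_000).toString().padStart(6, "0");
}

// Store only a hash, never the raw code.
function hashOtp(otp: string): string {
  return createHash("sha256").update(otp).digest("hex");
}

// Constant-time comparison so response timing leaks nothing
// about how many characters matched.
function verifyOtp(candidate: string, storedHash: string): boolean {
  const a = Buffer.from(hashOtp(candidate), "hex");
  const b = Buffer.from(storedHash, "hex");
  return a.length === b.length && timingSafeEqual(a, b);
}

const otp = generateOtp();
const stored = hashOtp(otp); // what goes into _verificationToken
// verifyOtp(otp, stored) → true; after first use, `stored` is nullified
```

Single-use semantics come from nullifying the stored hash after a successful verification, exactly as described above.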

Engineering Decisions — The Real Reasoning

Compressed Cookies for User State

SvelteKit's server hooks (hooks.server.ts) run on every request. For auth to work, the user object needs to be available server-side on every route. The naive approach is to re-fetch the user from the database on every request — slow and wasteful.

The solution: store the serialized user object in a cookie. Problem: HTTP cookies have a ~4KB limit, and a user object with roles, profile data, and preferences easily blows past that.

The fix was LZ-String compression:

Code
    // Cookie → LZ-compress(JSON.stringify(user)) → ~60-70% size reduction
const userString = await compressUser(user);
event.cookies.set(COOKIE_KEYS.USER, userString, { secure: true, sameSite: 'none' });

// On every request in hooks.server.ts:
const user = await deCompressUser(cookieValue);
event.locals.user = user;
event.locals.api = new NursiscopeApi(token, tokenExp);
  

Decompression overhead on every request is microseconds. The benefit — no DB round-trip on every page load — is worth it.
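The round-trip itself is simple. The repo uses lz-string; this sketch demonstrates the same compress-serialize-decompress cycle with Node's built-in zlib so it runs anywhere without extra dependencies (function names mirror the snippet above but the implementation is a stand-in):

```typescript
import { deflateSync, inflateSync } from "node:zlib";

// base64url keeps the value cookie-safe: no '=', '+', or '/' characters.
function compressUser(user: object): string {
  return deflateSync(JSON.stringify(user)).toString("base64url");
}

function deCompressUser(cookieValue: string): object {
  return JSON.parse(
    inflateSync(Buffer.from(cookieValue, "base64url")).toString("utf8"),
  );
}

const user = { id: "u1", roles: ["student"], prefs: { theme: "dark" } };
const cookieValue = compressUser(user);
// deCompressUser(cookieValue) deep-equals `user`
```

One caveat worth knowing: very small objects can compress to *more* bytes than their input, so the technique pays off only once the user object is large enough to threaten the 4KB cookie limit.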

Dynamic CORS for Netlify Preview URLs

Netlify generates a unique preview URL for every pull request: deploy-preview-42--nursiscope.netlify.app. Without handling this, every PR would fail with CORS errors when trying to hit the API — killing the review workflow.

The solution is a bit cheeky but practical:

Code
    const allowedOrigins = [
  'https://nursiscope.com',
  // Pre-generate the first 100 possible preview deploy URLs
  ...Array.from({ length: 100 }, (_, i) =>
    `https://deploy-preview-${i}--nursiscope.netlify.app`
  ),
];
  

No manual config updates. No CORS-blocked PRs. Just works.
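An alternative worth noting (not the shipped implementation): if the CORS layer accepts a predicate rather than a fixed list, an anchored pattern covers preview deploy #101 and beyond too, without pre-generating anything:

```typescript
// Anchored pattern: ^ and $ prevent spoofed origins like
// https://deploy-preview-42--nursiscope.netlify.app.evil.com
const previewPattern =
  /^https:\/\/deploy-preview-\d+--nursiscope\.netlify\.app$/;

function isAllowedOrigin(origin: string): boolean {
  return origin === "https://nursiscope.com" || previewPattern.test(origin);
}

// isAllowedOrigin("https://deploy-preview-42--nursiscope.netlify.app") → true
// isAllowedOrigin("https://evil.example.com") → false
```

The static-array approach has one advantage — it works with CORS middleware that only accepts a plain allowlist — which is presumably why it was chosen.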

Virtual Fields for Computed Properties

Payload's virtual fields let you compute derived values at read time without storing them in the database. NursiScope uses this pattern throughout:

Code
    // fullName isn't stored — it's always fresh
createVirtualField({ name: 'fullName' },
  ({ siblingData }) => `${siblingData.firstName} ${siblingData.lastName}`
);

// Notification preview — truncated content for list views
createVirtualField({ name: 'preview' },
  ({ siblingData }) => siblingData.content ? siblingData.content.slice(0, 50) + '...' : ''
);

// Verification status — readable label for the admin UI
createVirtualField({ name: 'status' },
  ({ siblingData }) => siblingData._verified ? 'verified' : 'unverified'
);
  

What I'd Do Differently

Job Queue for async operations. The afterChange hook approach works well at current scale, but email delivery and PDF generation happening synchronously in a request lifecycle will eventually cause timeout issues under load. The architecture is already positioned for a clean migration to BullMQ or similar — the hook just becomes a job dispatcher.

Rate limiting on auth endpoints. The OTP and password reset endpoints don't currently have explicit rate limiting. For a production healthcare platform, brute-force protection on these routes is a must-add.
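A minimal sketch of what that protection could look like — an in-memory fixed-window limiter, which is an assumption on my part, not planned NursiScope code; a production deployment would back this with Redis so counts survive restarts and apply across instances:

```typescript
// Fixed-window rate limiter: at most `max` attempts per key per window.
class RateLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();

  constructor(private max: number, private windowMs: number) {}

  // `now` is injectable to keep the logic deterministic and testable.
  allow(key: string, now = Date.now()): boolean {
    const entry = this.hits.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First hit, or the previous window expired: start a fresh window.
      this.hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= this.max;
  }
}

// e.g. 5 OTP attempts per email address per 10 minutes
const otpLimiter = new RateLimiter(5, 10 * 60 * 1000);
```

Keying on the email address (rather than IP) matters here precisely because of the shared-hospital-PC scenario: many legitimate users share one IP.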

Token blacklisting on logout. Current logout clears client-side state, but the JWT technically remains valid until its 8-hour expiry. A Redis-backed token blacklist would close that window properly.

The Bottom Line

Building NursiScope from the ground up was a masterclass in balancing "developer speed" with "system stability." By choosing a headless foundation like Payload CMS and a reactive frontend like SvelteKit, I was able to build a platform that feels like it was built by a whole team, despite being a solo effort.

More than just a technical exercise, it was about solving a real problem for the nursing community. I’m proud of how the automation turned out—there’s something incredibly satisfying about seeing a student pass a quiz and knowing the system is handling the enrollment, certification, and accounting in the background without me lifting a finger.

The platform is built to scale, and I’m excited to keep pushing the boundaries of what this stack can do.
