Tuesday, January 27, 2026

The Ethics Layer: Designing AI-Driven Learning with Integrity

Innovation without ethics is a risk multiplier

AI is transforming how we build and deliver learning, but without guardrails, it can amplify the very inequities we aim to dismantle. I’ve developed training modules that delve into hallucination, misinformation, and bias, not in theory, but in real-world enterprise use cases. When ChatGPT generates “plausible but false” answers, how do we help learners validate its outputs? When adaptive tools reinforce biased patterns, how do we design for inclusion?

Instructional designers are now ethicists

I no longer treat responsible AI as a bonus topic. It’s a design pillar. I embed bias mitigation frameworks directly into learning content and scenario-based practice. I partner with SMEs to align ethical awareness with workflow reality. One project included role-based choices with immediate feedback; learners had to spot hallucinations and course-correct. The result? Stronger decision-making, not just faster automation.

Takeaway

Responsible AI training isn’t just about content; it’s about intent, clarity, and accountability.

Discussion Prompt

Where have you seen AI's risks surface in learning, and how did your team address them?


Posted to LinkedIn

Tuesday, January 20, 2026

The LMS Isn’t the Strategy

Why platforms are tools, not solutions

Don’t confuse infrastructure with impact

I’ve led LMS migrations for institutions and built learning ecosystems from scratch for startups. In every case, the temptation was the same: expect the platform to “solve” engagement or scale. However, technology alone can’t fix what is missing in strategy. A great LMS amplifies a great plan; it doesn’t replace one, nor does it prop up a bad one. I’ve seen beautiful dashboards with zero uptake and basic systems with off-the-charts learner impact. The difference? Clear purpose and design thinking.

Start with the learner, not the platform

Effective learning ecosystems begin with mapping the journey: what learners need to know, feel, and do. Then we design the experience. Only then should we pick the tools. When we flip that order, we build backwards. I once consulted on a rollout where the LMS was locked before content existed. It added six months to the timeline and forced the team to retrofit ideas. Strategy should lead; platforms should follow.

Don't design for the tool. Design the experience, then find the right tool for the job.

Takeaway

Technology should support your learning vision, not dictate it.

Discussion Prompt

When has a platform helped, or hurt, the impact of your learning program?


Posted to LinkedIn

Thursday, January 15, 2026

What I’ve Learned Building 100+ Programs

Takeaways for scaling learning that sticks

Designing at scale takes more than tools

After 25+ years and over 100 digital programs, I’ve learned that strategy, empathy, and iteration consistently outperform trend-chasing. Whether it’s onboarding, compliance, or leadership training, real impact comes from getting the basics right, and adapting them relentlessly.

Here’s what I carry forward

  • Start with the learner’s reality, not the stakeholder’s wishlist.
  • Define success before you pick the platform.
  • Inclusive design isn’t optional; it’s strategic.
  • Don’t just deliver content, engineer performance.
  • Treat data as dialogue, not just validation.
  • Build feedback into the process, not just the postmortem.
  • Invite learning teams into business strategy early and often.

These ideas aren’t flashy. But there’s a difference between programs that look good and ones that work. I’m continually refining, always learning, and always ready to collaborate with teams that share the same values.


Takeaway

Learning that scales starts small: with clarity, care, and conversations that matter.


Discussion Prompt

Which of these lessons have you seen make (or miss) the biggest difference?


Posted to LinkedIn

Tuesday, January 13, 2026

The Designer Is the Strategist: Rethinking what “instructional designer” means in 2026

Designers aren’t order-takers anymore

In too many organizations, instructional designers are still brought in after the strategy is set. “Make it look nice.” “Turn this into a course.” But the most impactful work I’ve done happened when I was involved early, shaping goals, surfacing risks, and mapping user journeys. Instructional design is strategic work. We aren’t just building courses. We’re building performance systems.

We speak both languages

Designers translate business needs into learner experiences. That means understanding KPIs and adult learning, brand voice and Universal Design for Learning (UDL), AI tools and team dynamics. When I’ve helped shift a program from slides to systems thinking, it wasn’t about prettier graphics. It was about identifying bottlenecks, shaping behavior, and showing measurable value. And that shift earned a seat at the strategy table.

Takeaway

Instructional designers shouldn’t just execute the vision; we should help define it.

Discussion Prompt

How early are designers invited into strategic conversations where you work?


Tuesday, December 23, 2025

Fail-Proofing Your Learning Strategy

How to bulletproof initiatives from day one

Prevention beats rework every time

I’ve seen learning initiatives fall apart, not because the design was weak, but because alignment wasn’t locked in. One healthcare client launched a compliance module without manager buy-in. Completion rates lagged for months. We rebuilt with stakeholder interviews, scenario mapping, and clearer metrics. Same topic, same tech, radically different outcome. Why? This time, it was anchored in real context.

Map expectations early and often

Before I create anything, I ask: What does success look like to each group: leadership, learners, and front-line managers? That simple question surfaces misalignment fast. I then design feedback loops into the rollout: check-ins, usage data, short surveys. This helps spot issues while they’re still small. It’s not flashy work, but it’s what keeps programs from stalling post-launch.

Takeaway

The best learning plans aren’t just well-designed, they’re co-owned from day one.

Discussion Prompt

What’s one thing you wish you'd asked before launching a learning initiative?


Posted to LinkedIn

Thursday, December 18, 2025

What AI Can’t Do (Yet)

Keeping the human edge in an algorithmic age

AI is a partner, not a proxy

I’ve tested AI tools across course design, feedback loops, and learning analytics. They’re fast, scalable, and occasionally brilliant. But they still don’t understand nuance. AI can suggest a quiz; it can’t read a learner’s frustration. It can tag learning objectives, but not reframe them in language that inspires. In short, AI does pattern. People do meaning. And right now, that gap still matters.

Human insight builds learning that lands

I once used an AI tool to auto-generate course outlines. They looked sharp, but when we tested them with learners, they missed tone, pacing, and relevance. We kept the bones, but had to rewrite for clarity and connection. The real win was combining AI’s speed with a human filter for empathy and impact. That’s where I see the future, not automation, but augmentation.

Takeaway

AI can boost learning delivery, but people still craft the experience that connects.

Discussion Prompt

Where has AI helped, or hindered, your work in learning or design?


Posted to LinkedIn


Tuesday, December 16, 2025

Data as Dialogue

What learning analytics should actually be telling you

Numbers should spark action, not just reporting

I’ve worked with institutions that had dashboards for everything: course completions, quiz scores, and attendance logs. But the real question is: What decisions are we making with this data? Learning analytics only matter when they trigger meaningful dialogue. I once ran a workshop where we shared quiz drop-off data with faculty, not to critique, but to redesign. That single conversation led to a 16% lift in assessment completion.

My own background is in statistics and data analysis. I love data, sometimes even data for the sake of having data. But data alone doesn’t solve problems.

Ask better questions, get better learning

Instead of asking, “Did they finish?” I push teams to ask, “What made them stop?” or “What’s missing in the behavior we expected?” Data becomes useful when it’s tied to experience and outcomes. That’s why I embed reflection checkpoints, pulse feedback, and usage maps into every program I design. The goal isn’t just insight, it’s iteration.

Takeaway

Analytics aren’t the answer. They’re the start of a smarter conversation.

Discussion Prompt

What’s one learning metric you think we overuse, or overlook?