Graham Dumpleton
Lead Software Engineer

Reviewing workshops with AI

· 7 min read
Graham Dumpleton
Lead Software Engineer

In our previous post we walked through deploying an AI-generated Educates workshop on a local Kubernetes cluster. The workshop was up and running, accessible through the training portal, and ready to be used. But having a workshop that runs is only the first step. The next question is whether it's actually any good.

Workshop review is traditionally a manual process. You open the workshop in a browser, click through each page, read the instructions, run the commands, check that everything works, and make notes on what could be improved. It's time-consuming and somewhat tedious, especially when you're the person who wrote the workshop in the first place and already know what it's supposed to do. Even this task, though, is one where AI can help.

Deploying Educates yourself

· 6 min read
Graham Dumpleton
Lead Software Engineer

In our last post we showed how an AI skill can generate a complete interactive workshop for the Educates training platform. The result was a working workshop for the Air Python web framework, and you can browse the source in the GitHub repository. But having workshop source files sitting in a repository is only half the story. The question that naturally follows is: how do you actually deploy it?

If you've used platforms like Killercoda, Instruqt, or Strigo, the answer is straightforward: you push your content to the platform, and it handles the rest. But that convenience comes with a trade-off that's easy to overlook until it bites you.

Teaching an AI about Educates

· 14 min read
Graham Dumpleton
Lead Software Engineer

The way we direct AI coding agents has changed significantly over the past couple of years. Early on, the interaction was purely conversational. You'd open a chat, explain what you wanted, provide whatever context seemed relevant, and hope the model could work with it. If it got something wrong or went down the wrong path, you'd correct it and try again. It worked, but it was ad hoc. Every session started from scratch. Every conversation required re-establishing context.

What's happened since then is a steady progression toward giving agents more structured, persistent knowledge to work with. Each step in that progression has made agents meaningfully more capable, to the point where they can now handle tasks that would have been unrealistic even a year ago. We've been putting these capabilities to work on a specific challenge: getting an AI to author interactive workshops for the Educates training platform. In our previous posts we talked about why workshop content is actually a good fit for AI generation. Here we want to explain how we've been making that work in practice.

Clickable actions in workshops

· 8 min read
Graham Dumpleton
Lead Software Engineer

The idea of guided instruction in tutorials isn't new. Most online tutorials these days provide a click-to-copy icon next to commands and code snippets. It's a useful convenience. You see the command you need to run, you click the icon, and it lands in your clipboard ready to paste. Better than selecting text by hand and hoping you got the right boundaries.

But this convenience only goes so far. The instructions still assume you have a suitable environment set up on your own machine. The commands might reference tools you haven't installed, paths that don't exist in your setup, or configuration that differs from what the tutorial expects. The copy button solves the mechanics of getting text into your clipboard, but the real friction is in the gap between the tutorial and your environment. You end up spending more time troubleshooting your local setup than actually learning the thing the tutorial was supposed to teach you.

When AI content isn't slop

· 7 min read
Graham Dumpleton
Lead Software Engineer

In a post on my personal site I talked about the forces reshaping developer advocacy. One theme that kept coming up was content saturation. AI has made it trivially easy to produce content, and the result is a flood of generic, shallow material that exists to fill space rather than help anyone. People have started calling this "AI slop," and the term captures something real. Recycled tutorials, SEO-bait blog posts, content that says nothing you couldn't get by asking a chatbot directly. There's a lot of it, and it's getting worse.

The backlash against AI slop is entirely justified. But I've been wondering whether it has started to go too far.

Educates becomes an independent OSS project

· 2 min read
Jorge Morales Pou
Lead Software Engineer
Graham Dumpleton
Lead Software Engineer

I am happy to announce a significant change regarding the Educates project, an interactive training platform for hands-on labs hosted on Kubernetes.

Educates is a collaborative open source initiative spearheaded by Graham Dumpleton and me (Jorge Morales), developed over the past five years. It was initially created while we worked for VMware, and subsequently Broadcom following its acquisition of VMware.

Recently, however, both Graham and I were affected by the latest round of cuts by Broadcom following its takeover of VMware. This left Broadcom in possession of the Educates project without active maintainers.