
Prompt Engineering - steps towards AI Native Tooling - devops edition

talks · 2 min read

ChatGPT reached a million users in five days. Nothing in tech history had moved that fast. GitHub Copilot had felt like magic just a year before, and then this arrived and changed everything. This talk maps out where prompt-based AI tooling was already showing up across the DevOps lifecycle – and where it was heading.

For coding and configuration, ChatGPT already understood Nginx configs, Kubernetes manifests, IAM policies, and cron job syntax, and could even pass the AWS certification exam. It was getting integrated into IDEs as a conversational assistant alongside your code, into terminal tools that translated natural language to git commands, and into data tools that converted plain English to SQL queries. Regex? Just describe what you want. Test generation? Ask it. Even AWS audit findings could be explained along with remediation steps.
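
To make that natural-language-to-command pattern concrete, here is a minimal sketch of a "plain English to git" helper. It assumes the `openai` Python package (v1+) and an `OPENAI_API_KEY` in the environment; the model name and prompt wording are illustrative, not taken from the talk.

```python
# A minimal sketch: translate a natural-language request into a git command.
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def to_git_command(request: str) -> str:
    """Ask the model to turn a plain-English request into a single git command."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {
                "role": "system",
                "content": (
                    "Translate the user's request into a single git command. "
                    "Reply with the command only, no explanation."
                ),
            },
            {"role": "user", "content": request},
        ],
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(to_git_command("undo my last commit but keep the changes staged"))
    # Expected shape of output: git reset --soft HEAD~1
```

The same shape works for the SQL and regex cases: swap the system prompt and keep the human in the loop to review the generated command before running it.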

The AI art world was further along and offered a preview of where text tooling was heading. Stable Diffusion’s open source release was the inflection point – suddenly everyone could experiment, inspect training data, and fine-tune models. The ecosystem that emerged around image generation – prompt search engines, prompt marketplaces, auto-completion for prompts, negative prompts for steering away from unwanted artifacts, and reverse-engineering prompts from images – all of this was a roadmap for what would happen with code prompts.
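
For a sense of what a negative prompt looks like in code, here is a minimal sketch using the Hugging Face `diffusers` library; the checkpoint name, prompt text, and GPU assumption are illustrative and not from the talk.

```python
# A minimal negative-prompt sketch with diffusers' StableDiffusionPipeline.
# Assumes: pip install diffusers transformers accelerate torch, and a CUDA GPU.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative checkpoint
    torch_dtype=torch.float16,
).to("cuda")

image = pipe(
    prompt="a tidy server room, soft studio lighting, photorealistic",
    # The negative prompt describes what the model should avoid,
    # rather than what it should draw.
    negative_prompt="blurry, distorted, extra cables, low quality, watermark",
).images[0]

image.save("server_room.png")
```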

Beyond images, text-to-video generation was already real. Google and Meta had their closed models, but open tools could generate motion, change backgrounds via prompts, transfer poses between images, create 3D depth from 2D, and control lighting. Sound generation from prompts added another dimension. The full creative pipeline – from storyboards through video to audio – was becoming prompt-driven.

The question of whether prompt engineering is a real profession mirrors the early debates about DevOps engineering. Some argue it is a temporary UX failure that will get abstracted away. Others see it as a fundamental new skill. What seems clear: we are heading toward AI-native products where the interface is intent-driven conversation rather than explicit commands. The term Nat Friedman coined – AI-native products – captures where this is going. The takeaway: experiment with this now and learn how it works under the hood, because no school will teach it to you yet.

Watch on YouTube — available on the jedi4ever channel

This summary was generated using AI based on the auto-generated transcript.
