
How Cow-Shed Built a Full Brand Using AI (And What We Learned Along the Way)
How Cow-Shed used a real project to test AI tools properly: building “Catch Up With AI”
Introduction
When people talk about “learning AI”, the conversation often drifts into theory: long lists of tools, jargon, and promises about productivity.
We don’t learn that way – and we don’t teach that way either.
So this autumn, at Cow-Shed, we decided to test AI properly by doing something real: we took our new workshop concept, Catch Up With AI, and built an entire brand using AI tools end-to-end.
Not hypotheticals. Not examples. A genuine project with real constraints, deadlines, aesthetic decisions, and technical challenges.
Below is the step-by-step process – what we used, what worked, what didn’t, and where AI genuinely accelerated the work.
1. Brand Strategy with ChatGPT
Every brand needs direction before you touch the visuals.
We fed ChatGPT our creative brief for Catch Up With AI and asked it to help refine the brand strategy:
- the core message (“don’t be left behind”)
- value proposition
- audience needs and pain points
- tone, voice and editorial direction
- brand essence and taglines
This wasn’t ChatGPT “doing the thinking for us” – it was a structured conversation that clarified what the brand was trying to say. It challenged assumptions, filled gaps, and turned loose ideas into a coherent narrative.
It felt like working with a strategist who never gets tired of iterating.
2. Moodboarding (not an AI step – but AI guided the process)
We could have used an AI moodboard generator, but in this case we wanted human taste.
Instead, we asked ChatGPT to guide us through building a moodboard that matched the brand strategy:
- warm, editorial, design-led
- premium magazine aesthetic
- minimal styling, soft natural light
- peach, black, and plum palette
- Playfair Display + Manrope typography
We pulled references manually into a board, but the direction came from ChatGPT – almost like an art director reminding us what to look for and why it mattered.
3. Logo Creation with AI (and the realities of iteration)
Next, we moved into Ideogram for logo creation.
This was the point where the limits of AI showed themselves.
Ideogram produced interesting concepts quickly, but fine-tuning was difficult, and we ran out of credits before getting something fully resolved. It was a useful exploration, but not a replacement for a designer – at least not for logos that require polish.
The takeaway?
AI is excellent for volume and variation, but branding still needs a human eye to decide what’s “right”.
4. Image Generation with Adobe Firefly
Originally, we expected to use Midjourney, but it required a subscription.
Since we already had access to Adobe Firefly, we tested that instead – and it was very good.
ChatGPT helped by:
- writing precise, editorial photography prompts
- suggesting which images to create for the project (hero images, metaphor-driven visuals, workshop scenes)
- adjusting the prompts until the images matched the brand’s aesthetic
Firefly produced magazine-quality images with consistent colours and lighting – and it fitted seamlessly into our existing Adobe workflow.
For a real client project, this would absolutely be usable.
5. Creating an Instagram Carousel (with Canva)
We wanted a simple, clean Instagram post explaining the workshop.
ChatGPT walked us through:
- choosing the right Firefly image
- designing a multi-slide carousel
- writing the copy
- keeping it legible and on-brand
- setting it up in Canva quickly
It was the closest thing to having a creative director sitting beside us saying,
“Use this image. Put the headline here. Slide 3 needs more space. Try a soft overlay.”
Again – AI didn’t replace design judgement, but it sped up the thinking.
6. Turning Images into Video with Runway + ElevenLabs
This step surprised us.
We used four Firefly images.
Runway took each one and generated short cinematic clips – slow movement, shifting light, subtle depth. Very polished.
Then ChatGPT wrote the voiceover script.
We generated the narration in ElevenLabs (which handled tone and pacing beautifully), and then:
- imported everything into Adobe Premiere
- added transitions and subtitles
- layered in music
- adjusted audio levels
- stitched the clips into a coherent intro video
You could do the same in iMovie or CapCut, but Premiere made it easier to control timing.
This was the most AI-augmented part of the project – a true combination of human editing and AI-generated components.
7. Website Build in Webflow (coming next)
The final step will be creating a small landing page for Catch Up With AI using the assets, copy and brand direction produced through this process.
The plan:
- use the colour palette from the moodboard
- bring in the Firefly images
- create a simple hero section
- use the brand story and benefit bullets generated earlier
- add the intro video as a header or supporting section
Webflow itself has AI tools now, but the website will mostly be a thoughtful assembly of everything created above.
And once it’s done, the full end-to-end workflow will be complete.
What This Experiment Confirmed for Us
- AI is most powerful when you use it on a real project. Learning tools in isolation rarely sticks; using them inside an actual workflow does.
- AI works best as a collaborator, not as a replacement. It’s the strategist, the sounding board, the lighting assistant, the draft-maker – not the entire studio.
- People don’t need to fear these tools – they need to see them in context. When you sit at a desk with real work, AI becomes clearer, calmer, and much less mysterious.
And this is exactly the kind of experience we bring into Catch Up With AI: real tasks, real examples, real workflows – because that’s where the value is.

