An AI-powered content engine that turns live API data into structured marketing content across video, social, long-form articles, and PDF reports.
This project is an AI-powered marketing engine designed to turn live product data into structured content across video, social media, long-form writing, and PDF reports.
The core workflow is simple but powerful: raw API data is converted into a canonical signal, passed through content-specific adapters, transformed by generators, and then prepared for publishing. The full operating loop extends this further into monitoring and feedback, so the system is not just creating content, but learning from how that content performs.
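The stages described above can be sketched as a chain of small, typed steps. This is an illustrative sketch only; all of the names here (CanonicalSignal, short_video_adapter, the field names) are assumptions for the example, not the project's actual code.

```python
from dataclasses import dataclass

@dataclass
class CanonicalSignal:
    """The single normalised truth every downstream stage consumes."""
    asset: str
    metric: str
    value: float
    summary: str

def to_canonical(raw: dict) -> CanonicalSignal:
    """Normalisation step: raw API payload -> canonical signal."""
    return CanonicalSignal(
        asset=raw["symbol"].upper(),
        metric=raw["metric"],
        value=float(raw["value"]),
        summary=f'{raw["symbol"].upper()} {raw["metric"]}: {raw["value"]}',
    )

def short_video_adapter(sig: CanonicalSignal) -> dict:
    """Adapter layer: reshape the canonical signal for one content format."""
    return {"hook": f"Why {sig.asset} just moved", "data_point": sig.value}

def generate(adapted: dict) -> str:
    """Generator layer: turn the adapted brief into publishable copy."""
    return f'{adapted["hook"]} -- key number: {adapted["data_point"]}'

# One pass through the pipeline with a hypothetical payload.
raw = {"symbol": "btc", "metric": "funding_rate", "value": "0.012"}
post = generate(short_video_adapter(to_canonical(raw)))
```

The point of the shape, rather than the specific fields, is that adapters and generators never touch the raw payload directly; everything downstream depends only on the canonical signal.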
I built the system around a clear principle: the audience should not feel like they are being marketed to. They should feel informed. The marketing becomes a side effect of the quality of the information. That principle shaped the whole content engine, from short-form video hooks to technical developer tutorials, LinkedIn research posts, Substack articles, and downloadable whitepaper-style PDFs.
The engine is structured around different content groups. Group 1 covers video content, including 30-second shorts, 60-second investigative shorts, and long-form video formats. Group 2 covers static social content for platforms like LinkedIn, Twitter/X, Instagram, and TikTok. Group 3 covers long-form written content, including Dev.to tutorials and Substack articles. Group 4 covers PDF whitepapers designed as standalone intelligence briefs.
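A grouping like this naturally becomes a routing table: given a content format, find the group (and therefore the platforms and rules) that own it. The group keys, format names, and platform lists below are illustrative placeholders based on the description above, not the real configuration.

```python
# Hypothetical routing table for the four content groups.
CONTENT_GROUPS = {
    "group_1_video": {
        "formats": ["short_30s", "investigative_60s", "long_form_video"],
        "platforms": ["YouTube", "TikTok"],
    },
    "group_2_social": {
        "formats": ["static_post"],
        "platforms": ["LinkedIn", "Twitter/X", "Instagram", "TikTok"],
    },
    "group_3_written": {
        "formats": ["tutorial", "article"],
        "platforms": ["Dev.to", "Substack"],
    },
    "group_4_pdf": {
        "formats": ["whitepaper"],
        "platforms": ["direct_download"],
    },
}

def route(fmt: str) -> str:
    """Return the content group that owns a given format."""
    for group, spec in CONTENT_GROUPS.items():
        if fmt in spec["formats"]:
            return group
    raise KeyError(f"unknown content format: {fmt}")
```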
What makes the project different is that the content is not generated from vague prompts. Each output is mapped back to real API fields, audience intent, narrative structure, visual rules, and platform-specific formats. The aim was to build a repeatable marketing system where content quality comes from data, structure, and narrative discipline rather than ad-hoc content ideas.
I designed the full content engine workflow, including the raw API to canonical signal process, the adapter layer, the generator layer, the publishing flow, and the monitoring/feedback loop.
I also created the content architecture across multiple groups: video content, static social content, long-form written content, and PDF whitepapers. Each group was mapped to specific audiences, platforms, data fields, content types, and narrative structures.
I defined the narrative blueprint approach, where each content output has a story arc, structure, data sequence, emotional journey, governing principle, exit feeling, repeat trigger, and share trigger. This made the system more disciplined than a normal AI content generator.
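One way to read the blueprint idea is as a typed contract: every listed element is a required field, and generation is blocked until all of them are filled in. The class below is a minimal sketch of that contract; the field names mirror the elements listed above, but the code itself is illustrative, not taken from the project.

```python
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class NarrativeBlueprint:
    """One blueprint per content output; every field must be set."""
    story_arc: str
    structure: str
    data_sequence: list       # which API fields appear, in what order
    emotional_journey: str
    governing_principle: str
    exit_feeling: str
    repeat_trigger: str
    share_trigger: str

def is_complete(bp: NarrativeBlueprint) -> bool:
    """Gate: generation only runs when no element is empty."""
    return all(getattr(bp, f.name) for f in fields(bp))
```

Making the blueprint a frozen dataclass means a half-specified brief fails fast instead of producing a half-specified piece of content.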
One of the main challenges was turning raw API data into content without making the output feel generic or automated. A simple AI content generator would not be enough, because the system needed to understand the difference between a trading signal, a developer tutorial, a LinkedIn research post, a data lab insight, and a PDF briefing.
Another challenge was creating a proper content pipeline. The workflow had to move from raw API data into a canonical signal, then through adapters, generators, publishers, and eventually monitoring and feedback. That meant treating marketing as an operating system rather than a collection of one-off posts.
The hardest part was keeping the data, narrative, and format aligned. Every content type needed a clear audience, platform, trigger, promise, field map, narrative arc, and visual structure. Without that discipline, the content would become inconsistent very quickly.
This project gives me a strong example of how AI can be used beyond simple chatbots or copy generation. It shows how an AI system can transform live product data into a full marketing operation with different content types, audiences, platforms, and publishing formats.
It also creates a reusable model for future products. The same workflow can be applied to other data-driven brands where raw signals need to become content, reports, social posts, videos, and customer-facing insights.
The project also strengthens PulseBit as a product because the marketing engine is directly connected to the data product itself. Instead of creating random promotional content, the engine turns the product’s own intelligence into useful public-facing material.
This project taught me that AI content systems need a strong structure before generation. The quality does not come from asking AI to “write a post”. It comes from defining the data contract, the narrative job of each field, the audience, the content type, and the intended emotional journey before anything is generated.
I also learned the importance of canonical data. If the canonical signal is weak, every adapter and generator downstream becomes unreliable. That is why the workflow has to prioritise stable canonical truth first, then adapter compliance, and only then generator refinement.
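"Canonical truth first" can be enforced mechanically: validate the signal once, at the boundary, so every adapter downstream may assume the fields exist. A minimal sketch, with assumed field names:

```python
# Fields every canonical signal must carry (hypothetical list).
REQUIRED_FIELDS = ("asset", "metric", "value", "timestamp")

def validate_canonical(signal: dict) -> dict:
    """Fail fast on a weak signal before any adapter sees it.

    None and empty string count as missing; a legitimate zero value
    does not, so numeric fields are not rejected by accident.
    """
    missing = [f for f in REQUIRED_FIELDS if signal.get(f) in (None, "")]
    if missing:
        raise ValueError(f"weak canonical signal, missing: {missing}")
    return signal
```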
The project also pushed me to think about marketing as a system. Good content is not just copywriting. It is product data, audience psychology, distribution format, visual design, feedback loops, and repeatable execution working together.
I help founders and teams turn messy ideas into reliable systems — from MVPs and APIs to AI-enabled automation workflows.