SXSW 2026: AI Is Reshaping Storytelling – With Human Craft At The Center
BY DAVID TAMAYO, CREATIVE AI DIRECTOR

It was another great year in the sunny heat of Austin at SXSW 2026. This year, I had the chance to dive into some of the most pressing conversations at the intersection of AI, creativity, and media. Austin had it all, with conversations ranging from practical insights on responsible AI use to big-picture ideas about human ingenuity in the age of automation. There were powerful experiential activations like JBL’s immersive space and plenty of insights from leaders at the Webby Awards, Pixar, ElevenLabs, and more. Throughout, it was clear that AI is no longer just a tool – it’s shaping how we think, create, and collaborate.
AI carries inherent risk
The takeaway here is simple: all tools, including AI tools, carry risk. The legal landscape is gradually becoming more comfortable with AI, but many clients, especially in regulated industries, still require a cautious approach. In those environments, protecting confidential information is critical. A more conservative stance often leads teams to use AI tools quietly, without openly discussing their use. A more collaborative approach, combined with genuine client interest, tends to create greater transparency and healthier adoption.
From a legal perspective, a key priority is avoiding IP infringement. That means controlling inputs, being careful with source material, and following clear best practices. Ownership is also a nuanced topic: purely AI-generated output may not qualify for copyright protection, but AI-assisted work with meaningful human creative contribution can be more protectable. The creative direction, prompting strategy, editing, selection, refinement, and transformation process all matter, as does documentation of responsible usage. That is exactly the foundation of Vermeer.ai, Havas’ Gen-AI solution created by POP. Vermeer.ai was built with responsibility in mind: all models are vetted by our legal teams against predefined criteria, aligned with industry standards, and backed by rigorous, responsible AI training.

From “human-first” to “human-preferred”
The framing around AI use in the creative process has shifted. AI can absolutely be part of creativity – the challenge is that the legal definition of what constitutes “enough” human contribution is still unclear, and that ambiguity remains a major issue for legislation and disclosure. A middle ground should ultimately emerge: one that protects creators and rights holders while still allowing innovation to move forward.
Overall, AI should be integrated into the traditional production pipeline as a creative tool, not treated as a replacement. The goal is to preserve artistic intent, authenticity, and creativity while using AI to open up new kinds of images, workflows, and storytelling possibilities.
This was the topic of conversation during the VIP breakfast hosted by Havas in Austin, where I had the chance to hear Dan Hagen, Havas’ Global Chief Data & Technology Officer, speak about “The Rise of the Thinking Class”. From how AI can amplify human intelligence to the organizational shifts needed to harness it responsibly, the conversation reinforced a key theme emerging across SXSW: the future of innovation is still deeply human.

Audience trust is more important than ever
Brands need to be seen as ethical in their use of AI, because ethics are directly tied to reputation, trust, and long-term brand value. Transparency is also becoming increasingly important, as labeling AI-generated output may be the right move, especially where trust, authenticity, or public perception are at stake. Even when disclosure is not legally required, it can still be strategically and ethically smart.
During the panel “The Trust Paradox: AI, Creativity & Credibility” presented by the Webby Awards, striking statistics from the Webby Trend Report put this in perspective: 94% of people believe AI content should be clearly labeled, and 62% said they would trust a brand more if it disclosed AI usage.
At the same time, many questions remain unresolved: What counts as sufficient human authorship? When should AI-generated content be labeled? How should ownership be treated across prompts, outputs, edits, and training data? What responsibilities do platforms, users, and clients each carry? For now, the best path is to stay careful, transparent, documented, and ethically grounded while the law catches up.

Constraints promote creativity
Creativity does not come from unlimited possibility alone – constraints are essential. Limitations help give direction, avoid paralysis, and make work feel more intentional and believable. In visual production, realism and quality often come from respecting real-world rules, such as how a camera behaves, what lens is being used, or how movement is physically motivated. Those kinds of constraints actually strengthen creative output.
The overall perspective is optimistic: if AI lowers costs while preserving strong artistic direction and thoughtful constraints, it could empower smaller teams and studios to innovate faster. This could potentially lead to a new golden era of creative work, where breakthrough stories, films, games, or franchises can emerge from much leaner production models.

The evolving role of creatives
As execution becomes easier, taste, judgment, and curation become even more important. In that context, generalists who understand the full creative pipeline, from story and pre-production through modeling, lighting, animation, and final image-making, may become more valuable as AI begins to blur the boundaries between previously separate disciplines, creating new feedback loops for creativity.
AI is already being used in practical, narrow ways inside major studios like Pixar, for example to accelerate rendering. By reducing time and cost, these tools can make production more efficient and potentially allow studios to take bigger creative risks.

SXSW 2026 was the edition where the iconic futurist Amy Webb opened her keynote with a symbolic “funeral” for her traditional annual trend reports. Her argument: the world and its technology are now changing too fast for an annual trend report to stay relevant by the time it is published. Brands must act quickly at the intersection of multiple technological, social, and economic forces.
This moment says a lot about the pace of change we are living through – and AI is, of course, at the forefront. Brands and agencies must adapt to keep up, weighing legal, ethical, and transparency implications alongside shifts in consumer behavior. In the end, we must keep observing how humans change in order to embrace AI; one feeds the other. It’s not AI versus human – it’s AI empowered by humans who lead with intention, ethics, and craft.
Want more insights from POP’s leaders? Make sure to follow us on LinkedIn and Instagram.