IBC Accelerators 2026 speed towards an agentic future

Agentic AI, content-aware broadcast chains and consumer personalisation were key trends at the IBC Accelerator 2026 Kickstart event this week. Taking place at BBC Broadcasting House in London on 25 February, it was a chance for broadcasters, studios, platforms, vendors, startups and academia to champion a range of innovative proofs of concept (POCs) tackling issues facing the M&E industry as a whole. And there was more than enough to interest those working in sports broadcasting.

Each IBC Accelerator project is championed by a group of media organisations and supported by a mix of technology partners and innovators who provide solutions, tools and expertise. Over a tightly scoped timeline, these teams co-develop and test practical workflows and showcase the findings at the IBC Show in September.

We are the champions

A strategic overview for the day was provided by the Champions Roundtable, featuring leaders from SMPTE, EBU, ITN, and NBCUniversal. The message was unequivocal: AI, cloud and software-defined production are no longer experiments; they’re already being applied in real-world production. 

ITN CTO Jon Roberts framed the core production challenge for 2026 as delivering more formats and platforms while wrestling with spiralling complexity in tech stacks and workflows. He called for better integration, targeted automation and more intuitive interfaces, with AI as a potential way through the skills and complexity bottleneck.

Alex Bassett, vice president, innovation, NBCUniversal, observed that teams across editorial, production and engineering are building and testing their own tools, with the gap between proof of concept and live deployment shrinking fast. The question now is how to deploy solutions at scale without creating chaos around governance, rights, compliance and skills.

Paola Sunna, senior innovation technology manager, EBU, noted how rapidly issues of digital sovereignty and AI governance have risen to the top of the agenda. Sunna advocated for coordinated European action to ensure that cloud and AI infrastructures respect European values, legal frameworks and editorial independence.

Sunna and Roberts also highlighted control room innovation and HTML-based modular graphics in today’s multi-platform live productions, such as sports.

SMPTE president Rich Welsh stated that traditional innovation methods “are not working now” due to the rapid pace of AI advancement. He argued instead that SMPTE’s community-driven standards work and the IBC Accelerators provide an agile, sprint-based, and deeply collaborative way to tackle shared problems. Welsh also highlighted world-building and real-world experiences as the next big frontier, moving towards richer, three-dimensional AI-driven environments, ideal for sports and for addressing younger audiences’ hunger for live, authentic experiences.

On the pitch

Twelve fast-paced pitches (five slides in five minutes) followed, several of particular interest from a sports broadcasting perspective.

The first pitch, ‘Network Control: Your Connection, Your Choice’, proposed by GSMA Fusion, BBC, Neutral Wireless and University of Strathclyde, highlights bonded-cellular technology using public 4G/5G networks. The champions noted its use as an alternative to satellite links for on-location live contributions, as well as how it can struggle in high-demand environments. The issue was identified as network control being “locked behind closed priority APIs, which limits flexibility and slows down adoption across different regions and vendors”. Instead, the proposal is to trial GSMA Open Gateway and Camara open APIs at live events to let broadcasters dynamically request specific latency and throughput in real time, “turning the network itself into a programmable tool for live production”.
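To make the idea concrete: the public CAMARA Quality-on-Demand (QoD) APIs let an application ask the network for a specific service level for a given device and duration. The sketch below follows the general shape of such a session request, but the profile name, schema details and IP address are illustrative assumptions, not the project's actual integration:

```python
import json

def build_qod_session_request(device_ip: str, profile: str = "QOS_L",
                              duration_s: int = 3600) -> dict:
    """Assemble a CAMARA-style Quality-on-Demand session request asking the
    mobile network for a specific service level during a live contribution.
    Field names loosely follow the public CAMARA QoD API shape; treat them
    as illustrative, not a verified schema."""
    return {
        "device": {"ipv4Address": {"publicAddress": device_ip}},
        "qosProfile": profile,      # e.g. a low-latency tier for live video
        "duration": duration_s,     # requested session length in seconds
    }

payload = build_qod_session_request("203.0.113.7")
print(json.dumps(payload, indent=2))
```

A broadcaster-side client would POST a payload like this to the operator's QoD endpoint before going live, then tear the session down afterwards.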

‘AI-Powered Delta Streaming: Live Media Reinvented’, proposed by DAZN, sets out to prototype the first AI-native transmission model. “Live video was built to shift pixels. The next decade will shift intelligence,” observed Caroline Ewerton, SVP of content operations & technology for DAZN. “What if live video could not just be sent, but actually understood? What if the broadcast chain was content-aware; it actually knew what it was carrying and decided in real time what really mattered, and transmitted that? What if devices and the network could collaborate in real time to create an experience that is actually relevant for the customer?”

DAZN’s proposed ‘Delta Protocol’ reframes live video as scenes rather than frames. An AI intelligence layer identifies motion, contact and significance in real time, transmitting only the “meaningful change” alongside a base continuity layer. Ewerton called this a “change in the transmission layer”. A live football proof of principle will test whether the model can genuinely reduce bandwidth and complexity without breaking existing formats.
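As a rough illustration of the scenes-not-frames idea (and only that: DAZN's AI layer judges significance, whereas this toy uses a fixed pixel threshold), a delta transmitter might periodically emit a full base frame and otherwise send only the changes that cross a significance bar:

```python
def delta_stream(frames, threshold=10, base_interval=5):
    """Toy sketch of delta transmission: emit a full 'base' frame
    periodically, otherwise only pixel positions whose change against the
    last base exceeds a significance threshold. Purely illustrative of the
    concept, not DAZN's actual Delta Protocol."""
    last_base = None
    out = []
    for i, frame in enumerate(frames):
        if i % base_interval == 0 or last_base is None:
            out.append(("base", frame))          # continuity layer
            last_base = frame
        else:
            deltas = [(j, v) for j, (v, b) in enumerate(zip(frame, last_base))
                      if abs(v - b) > threshold]  # "meaningful change" only
            out.append(("delta", deltas))
    return out

# Three tiny 3-"pixel" frames: only one pixel changes meaningfully.
frames = [[0, 0, 0], [0, 50, 0], [1, 2, 3]]
encoded = delta_stream(frames, threshold=10, base_interval=5)
```

In this toy run, the second frame reduces to a single changed pixel and the third to nothing at all, which is the bandwidth argument in miniature.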

‘AI for Live Media Platform for Sports & Beyond’, proposed by Astro Malaysia, MBC, and Tata Consultancy Services (TCS), argues that live sports and media workflows are currently too heavy, fragmented and resource-intensive. The solution proposes a major agentic AI orchestration layer that can automatically contextualise and adapt the bulk of live content for different markets and platforms, “leveraging AI and agentic orchestration, while blending essential human creative intelligence”. It envisages plug-and-play reusable agent components that can build unified multifunctional workflows for production, distribution, engagement, personalisation and monetisation.

Astro Malaysia and TCS also proposed ‘Personalised Broadcasting for the Next Generation of Media Consumption’, a shift from the traditional broadcast model to a complementary ‘mecast’ approach powered by AI-driven content processing. TCS’s Ajay Chandra described an agentic AI architecture built around an autonomous ‘content-aware’ platform that continuously ingests and learns from audience and viewing behaviour data. Agents automatically generate multiple versions of content – such as highlights, behind‑the‑scenes and localised language variants – based on lightweight ‘smart bookmarks’ tailored to distinct personas, for example, a sports fan who wants in-depth analysis versus a casual viewer who only wants key goals.
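The 'smart bookmark' idea can be pictured as tagged timecodes filtered per persona. A minimal sketch, with the bookmark structure and tag vocabulary invented for illustration:

```python
def select_for_persona(bookmarks, persona_tags):
    """Pick the bookmarked moments whose tags overlap a persona profile.
    Bookmark structure and tag names are invented for illustration; a real
    agentic pipeline would generate these bookmarks from the live feed."""
    return [b["t"] for b in bookmarks if b["tags"] & persona_tags]

bookmarks = [
    {"t": "12:04", "tags": {"goal", "highlight"}},
    {"t": "31:40", "tags": {"tactics", "analysis"}},
    {"t": "78:15", "tags": {"goal", "highlight"}},
]

casual = select_for_persona(bookmarks, {"goal"})               # key goals only
analyst = select_for_persona(bookmarks, {"goal", "analysis"})  # deeper cut
```

The same bookmark set thus yields a short goals reel for the casual viewer and a longer, analysis-heavy edit for the dedicated fan, without re-processing the source video.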

Attconcludeees also saw ‘Crystal Clear: Boosting Speech Innotifyigibility in Media’ from Channel 4 (joined on stage by the BBC). It proposes an automatic innotifyigibility measurement and QC system that flags problem sections of audio and defines objective thresholds that could sit inside broadcaster delivery specs. Validated by listening tests – including people with hearing loss – the tools are designed to drop into production chains so that dialogue or sports commentary remains clear and comprehensible, wherever viewers are.

‘Ecoflow III: Applied Sustainable Video Delivery’, proposed by ITV, IET, Accedo and Humans Not Robots, is concerned with the energy and carbon costs of streaming. Building on earlier work with digital twins and modelling, it insists sustainability is embedded alongside performance, cost and quality from the outset, not added as an afterthought. The project is developing an observability framework that unifies client-side measurement with back-end workflows, enabling techniques such as energy-aware content steering and AI-driven optimisation of delivery paths. The team wants this adopted by broadcasters in the real world. “There are no barriers to adoption anymore. This is not impacting a roadmap, and it’s helping with cost, efficiency and performance.”
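Energy-aware content steering can be sketched as a constrained choice: among candidate delivery paths that clear a quality floor, prefer the lowest estimated carbon cost. Field names and figures below are invented for illustration:

```python
def pick_delivery_path(paths, quality_floor=0.9):
    """Choose the lowest-carbon candidate path that still meets the
    quality floor. A toy model of energy-aware content steering; a real
    system would also weigh cost, capacity and performance."""
    viable = [p for p in paths if p["quality"] >= quality_floor]
    return min(viable, key=lambda p: p["gco2_per_gb"])

paths = [
    {"name": "cdn-a", "quality": 0.95, "gco2_per_gb": 42.0},
    {"name": "cdn-b", "quality": 0.97, "gco2_per_gb": 18.5},
    {"name": "cdn-c", "quality": 0.85, "gco2_per_gb": 9.0},   # greenest, but too lossy
]
best = pick_delivery_path(paths)
```

Note that the greenest path loses here because it fails the quality floor, which is the "sustainability alongside performance and quality" trade-off in miniature.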

‘FRAME’, proposed by RAI, MovieLabs, EBU, and ETC/University of Southern California, is an agentic project for pre-production. The team wants to turn archives into a living, semantically rich resource by automatically tagging content with a true filmmaking vocabulary – shots, angles, look and feel, narrative elements – and mapping this to MovieLabs’ media ontology. On top, specialised AI agents will handle everything from smart archive retrieval to selected post-production tasks such as upscaling and inpainting.

‘Q-STREAM’, championed by the IET, BFBS and Tesla Technologies, is concerned with trust and the so-called ‘harvest now and decrypt later’ threat to standard encryption posed by quantum computing. The proposal is to bring post-quantum cryptography and C2PA-style provenance directly into live streaming pipelines, addressing their authenticity and provenance, and ensuring that feeds are both tamper-evident and origin-verifiable. 
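The tamper-evident half of that goal can be illustrated with a plain hash chain over stream segments, where altering any segment invalidates every later link. The real project would layer post-quantum signatures and C2PA-style manifests on top; this sketch shows only the chaining idea, with invented segment data:

```python
import hashlib

def chain_segments(segments, origin_id):
    """Build a hash-chained manifest: each digest covers the segment
    payload plus the previous digest, so tampering with any segment
    breaks every subsequent link. Illustrative only; real provenance
    would bind this to signed C2PA assertions."""
    prev = origin_id.encode()
    manifest = []
    for seg in segments:
        digest = hashlib.sha256(prev + seg).hexdigest()
        manifest.append(digest)
        prev = digest.encode()
    return manifest

good = chain_segments([b"seg1", b"seg2", b"seg3"], "camera-1")
forged = chain_segments([b"seg1", b"SEGX", b"seg3"], "camera-1")
```

Comparing the two manifests shows the chain agreeing up to the tampered segment and diverging from it onwards, which makes feeds origin-verifiable and tamper-evident at segment granularity.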

‘VooPla: The Decision Twin’, proposed by Digital Catapult and Solve Evolve, focuses on the front end of virtual production decision-making. Rather than treating digital twins as operational tools used once a show is greenlit, this Accelerator shifts them upstream as a shared “decision layer” for commissioners and producers. It aims to let teams remotely explore facilities, test layouts and workflows, and understand risk, cost and carbon impact before committing time, budget or travel costs.

‘Software Defined Workflows for Interoperable Movie and TV Production’, championed by MovieLabs, Holli.st, FX-DMZ and Entertainment Technologists, focuses on viewing the production process as a ‘knowledge graph’ rather than disparate task-specific silos. Based around the MovieLabs 2030 Vision, it seeks faster turnaround and better coordination across complex production ecosystems, by ‘using a common semantic model to reduce errors, improve planning, and reduce time wasted on creatively barren technical minutiae’.

‘IFEL Immersive Festival Live: Remote Presence at Scale’, proposed by SMPTE, MIT Reality Hack, King’s College London and Shure, explores seamless remote ‘two-way’ presence through stereoscopic capture, VR delivery and spatial audio. The ‘end-to-end live immersive event workflow’ that is being proposed may be trialled at Coachella this year, but it also has obvious implications for live sports events.

The final presentation came from the 2026 Incubator Project, ‘The Story Intelligence: Agentic Production Ecosystem’. Proposed by Associated Press, NBCU, ITN, BBC, Channel 4, Al Jazeera, and The Washington Post, it reimagines live production around a persistent ‘story object’ that tracks each story from first alert to multi-platform output. Designed for news but built to scale into sport and live events, it uses coordinated AI agents to drive graphics, highlights and social content from a single source of truth, while keeping safety, transparency and human editorial control baked into the stack.


