A FACE OF ONE’S OWN : A NEW AGE OF STORYTELLING AND CIVIC ENGAGEMENT IN FILM, MUSIC AND THEATRE
This article is an opinion piece by Olivia Gillman, Film, Theatre and Music Representative for The Ethical AI Alliance. Olivia is based in London, England. Her attachments include The National Theatre and Pinewood Film Studios. Instagram: @omgillman. Click here to read Olivia’s National Theatre guide to acting practice and photorealism (as taught at Columbia University and Guildhall School of Drama).
Anatomy Lesson of Dr. Willem van der Meer by Michiel van Mierevelt
A Brief History of Human Creative Engagement and Our Current Predicament
“It's too late for collaboration: time to accept co-operation.” So said a Silicon Valley executive, addressing a room of 500 people at a Civic AI conference at Oxford University in March 2026. His speech followed introductions from three Oxford-endorsed leaders in care and neuroscience development. He had not entered the space as a winner: his first words had been, “I hope I am not encroaching here on whatever it is you're trying to do.” He patted the air next to his fellow speakers, as though to say: “this whole public assembly thing… it’s cute you’re trying.” His lack of awareness as a guest in a historic location (teaching began in Oxford in 1096) summarises an increasingly common confusion in the tech industry: forgetting that tech and AI are relatively recent developments on a human-run planet with a civilizational history spanning over twelve thousand years.
In entertainment specifically, we have been doing something the tech industry incentivizes now, the harvesting of human intimacy, attention and emotion, since Ancient Greece. Tragedies from The Oresteia through to Medea would compete alongside comedies such as Lysistrata or The Frogs at the Festival of Dionysia, as crowds flocked to the Athenian auditorium for their annual dose of entertainment.
Shakespeare's London would, much like biometric data scraping today, dissect human bodies in front of its audience. Like a surveillance agent, Othello demands of Iago: “I'll know thy thoughts.” “You cannot, if my heart were in your hand,” replies Iago: a resistant victim. And this victimhood was all too usual at the time: criminals were dissected in live anatomizations as public punishment, in anatomy theatres running side by side with venues like The Globe. The Anatomy Act of 1832 ended this, but the market in body parts continued well past Shelley’s 1818 Frankenstein and into the common cadaver theft from graves of the Victorian era. Sadistic ownership of human body parts by other humans is not, then, altogether new.
Over twelve thousand years, vehicles for human entertainment have had to adapt, from Commedia dell’Arte in Italian street theatre through to the use of OLED screens and generative AI in The Mandalorian, catering to what producers are willing to invest in and what human audiences are willing to purchase. Ancient Greek audiences would visualise sets around their toga-clad actors against blue Athenian skies (and, much like Hamilton on Broadway today, how much they enjoyed this might determine how much they spent on Greek vases at merchandise stands). Today, following the toy cars that backgrounded early Hitchcock sets in the mid twentieth century, virtual production visualizes full crowd sequences for audiences, as in Disguise's volumetric crowd capture used on Wicked, at a competitive cost. A deepfake or generative AI human avatar can take things one step further and visualise an entire human in virtual, augmented or mixed reality.
Currently, social media algorithms often incentivize user attention through a maximization of tragic or horrific content: a business decision potentially damaging to human psychology, when this algorithmic clustering could instead be used for positivity, humour, and the promotion of civic contentment. Take, for example, a modern AI tool such as Polis, which can aggregate public opinion into democratic outcomes. Audrey Tang, former Digital Minister of Taiwan, speaking recently at the Oxford University Civic AI conference, cited using such tools for democratic purpose: the default trajectory with AI does not have to be human annihilation and self-punishment, but can lead to fair surveying and the production of democratic data.
If the twenty-first-century Western world were Ancient Greece, the mainstream would currently be stuck watching Oedipus Rex on repeat on their social media feeds, because powerful Silicon Valley executives are choosing to incentivize machines and profit over the safety and quality of human lived experience. The AI harms and shock-value stories of industry shifts are not hard to find: see our Ethical Alliance Harms Map. A particularly striking case study: 16-year-old Adam Raine of Orange County, California, died by suicide in April 2025 after months of engaging with ChatGPT. At a recent meeting among Hollywood filmmakers led by The Creators’ Coalition on Artificial Intelligence, a SAG-AFTRA representative described being deepfaked without consent in a feature film in mid 2025. They acknowledged this is an increasingly common event among creatives in film, with body scans, photogrammetry, mo-cap and generative AI practices often confusingly overlaid at the point of human interface on set. Performers and on-ground creatives alike are left unsure about what happens to their body data after their performances leave the camera.
I was recently approached to use our Columbia-taught National Theatre theory of photorealism (or naturalism) in acting to train generative AI human avatars: to assist a coder and content creator in reducing an $80,000 human-team-led advertisement shoot for big brands to $5,000, should they replace human actors and models with effective, emotionally authentic avatars. In my shock, I declined to work with this company appropriating my directing skills this way, but it started me on the journey to where I am now: researching protections for humans in our industry, and noticing that currently, whether we have consented or not, all of our creative outputs and audio-visual likenesses are vulnerable to data scraping.
But forget just us narcissists in entertainment (we’re a useful case study because we’re at the frontline of this tech-human meeting): in many Western democracies, because laws lag behind tech industry progression, everyday citizens have just lost their right to anonymity and to the terrain of their own face, even if it is their choice to remain hidden. In Britain alone, whilst protections against deepfake harassment are freshly solid, no citizen has a default legal right to their own personality or image data: just protection against abuse and harassment. Creative performers notice this first, and yet one day it could affect everyone: and that day could be this year. Data scraping and the widespread use of AI is a practice we now have to accept is here, which makes championing privacy and freedom for the oppressed all the more urgent.
Virginia Woolf’s vision of intellectual and personal privacy resonates anew in debates on digital identity and AI surveillance
Virginia Woolf wrote A Room of One’s Own in 1929 as she advocated for private space for artists: but is our twenty-first-century rebuttal A Face of One’s Own? The earliest known musical instruments were bone flutes, carved from materials such as mammoth tusk some 35,000 to 40,000 years ago: and yet we also invented AI. In a favourite moment of mine on a recent call with SAG-AFTRA creatives and the CCAI, a member called Eric contributed: “if the tech industry can't even deliver a licence framework, why are we relying on them to be the creators of superintelligence?”
But if these are the immediate, pressing problems, talk and philosophy are all very well: what can we actually think and do to maintain the careers and worlds we have built?
Poor Actors - The Far-Reaching Power of Tech
My instinct tells me, after six months of attending tech industry events and representing the creative industry, that identifying the problem, at least, is simple: there are immoral structural decisions being taken in a dark room somewhere that need challenging, unpicking and reordering by a collaborative community, in the interest of all tech users and impacted citizens. We are in what Will.i.am (currently developing sovereign human-controlled data agents at the University of Arizona) terms a ‘wild west.’
Twenty-first-century colonialism is poor actors in the tech industry turning up in pirate ships on the shores of our cultures to scrape not our gold coins and jewellery, but our biometric (facial, vocal and bodily) data. And, just like pirates, some are not merely impolite or carelessly disrespectful about this, but behaving with abject, front-footed immorality.
Dario Amodei of Anthropic provided a recent example of an ethical, civic move in refusing, under pressure from Republican governance, to allow Claude to be used for domestic surveillance. However, our aforementioned impolite Silicon Valley executive in Oxford exemplifies the complacency and disrespect toward the boundaries, time and selfhood of civilians that AI leaders in Silicon Valley can currently display. There is a tendency not to view themselves as one co-industry among others (where a London artists’ agent might view the tech industry as little more than an inflated IT department), but as the leading, supreme industry. This is something I only realised by attending tech industry events in 2025-26, and the culture is shocking: industry defectors, for example, labelling culture the last line of defence. By sharp comparison, where I work day to day in the arts, I, like so many, see poorly behaving tech companies as an unwanted incursion on our industry territory and revenue streams. Perhaps my perspective is all the more purist because my original grounding is as a theatre director in in-person spaces, where Claude AI may as well be Windows 98. I felt dissociated from that in the 1990s, preferring the studio with peers, and I feel dissociated from AI now. The point is that it isn't the humanity that I, like so many in our industry, trained to work with.
Monday 28 April 2025 saw the Spanish and Portuguese blackout. Interrupted on a meander down Las Ramblas, I witnessed Spanish society go into shutdown that day. Within an hour I went from drinking coffee as a tourist in a cafe to being huddled round a white van with twenty locals, as one man disproportionately catastrophised in his translation of a battery-powered car radio, telling us all of Europe was down: energy, electricity, the borders were closing, and a dictator was likely invading overhead. His extreme, inaccurate reaction was perhaps that of a descendant of Catalans who lived under the Franco regime, and of the fear of a return to it that can grind at the Catalan psychology. Whilst the issue itself was more low-key (a day-long Iberian Peninsula power cut), the apocalyptic twelve-hour experience was an insight into a capitalist human society suddenly thrown off a cliff by a technological blackout.
Having to use executive and social skills to navigate, asking for directions and seeking information, was my hardest learning curve: we weren't used to talking in this way. It was when I sat on a balcony, drawing the scene ahead to replace scrolling on Instagram, waving at the woman opposite to replace WhatsApping friends, watching overhead aeroplane activity (or the lack thereof) to replace tracking the airline website, and looking out for medieval-style town criers arriving at our square with needed information to replace news scrolling, that I truly realised how reliant we are on big tech running our infrastructure. Human collective power is retained so long as we are as self-sufficient as possible without our landlord. The local Chinese takeaway understood this: they were the only local building with a domestically run back-up power generator, and made a killing on a new noodles, baguettes and phone-charge product that day.
In an address to the Atlantic Institute, Kay Firth-Butterfield recently summarised how the term ‘ethics’ is often watered down to ‘responsibility’ in big tech: ethics can be diluted with the justification that it constrains the exponential growth of major capitalist monopolies in the AI race. This is language that can all too quickly become gendered, where the feminine (‘irrational’) is associated with care, HR and sustainability, whilst the masculine (‘rational’) is associated with fast pace, growth and product. The result is that ethics- and human-resources-led decision making can be abandoned as easily as a young female employee might be ignored in an unequal meeting. And yet rights and freedoms affect every human on the planet: including the CEOs wilfully erasing them. So what can we do?
Proposals For Future Action In The Creative Industry: Protecting the Community, Your Freedom and Livelihood
Entertainment is a key civic tool: cinema, music and theatre engage civil society at audience level by gathering large audiences around common purpose. This is the kind of crowd work (look to Augusto Boal’s forum theatre in Brazilian favelas, or even Cynthia Erivo holding a room in her hand with song) that fuels democracies by mobilising shared common purpose. It is why art is the first thing autocracies cut: notably, Franco had all of Gaudí’s colourful lampposts painted black in Barcelona’s town squares during his mid-twentieth-century authoritarian reign.
Group-mind is also the kind of community conscience and cohesion that The Atlantic Institute’s Naseer Eledroos recently observed tech companies are often trying to carve through in neighbouring industries and societies ‘by design’: placing humans in silos in which they are easier to scrape value and data from. If you have never read E. M. Forster’s The Machine Stops, it is key reading for understanding the potential dystopia and pain of such a design for mass society: every human kept in their white box, where other humans are deemed unclean. And, now, this could happen whilst we lie down with our chatbot partners. I saw fiery flamenco dance at the Palau de la Música Catalana in early 2025: surely it is engaging in this rhythmic, colourful expression, set against, let's say, a backdrop of verdant green fields and lush nature, that would be the complete opposite of E. M. Forster's terrifying imaginary? These are our extremes: but it is important we know them in order to generate micro-actions that maintain and improve our greater whole, where nature, collective community conscience, and human irregularity (musician Jacob Collier proudly advocates for this) could be the key opposite to the technologically determinist, authoritarian international society we risk if poor actors in tech and AI aren't held to account by us: their neighbours.
Every profession is going to have its own answers on how to create a toolkit whilst we await neat, civic regulation in this new AI age. And every profession should meet and engage on this topic. We are, in effect, much like our forefathers in Ancient Greece, tasked with building methods for a new, adaptive, democratic society from the ground up: something many in the Western world felt we comfortably had, until, perhaps, the sand somehow began to slip from beneath our feet. The film, music and theatre industries, as society’s storytellers, cannot overlook our role in this. We also cannot carry the burden of every vertical of capitalism or society. But these are some common things I think we all should know as members of the creative industry (creators and consumers):
Avoid ‘default trajectory’ thinking. It is not a given that technological or AI supremacy/super-intelligence runs the world. This has not been the case across more than 12,000 years of human-led international civilization, and the tech industry is a neighbour of the equally valuable food, education, creative and care industries, amongst many more human industries.
Local, human, community action has the power to drive a future society where tech is our tool and we are driving the journey: if this is what we want as a society.
A lot of AI companies are currently operating like frightened PR machines, releasing products at an escalated pace because they need to return large profits (not currently made) on mass early investment. This is a weakness in the sustainable functionality of many of their operative business models.
AI and virtual production tools are a promising technological innovation, but they are a separate entity from the morality, culture and design of the tech industry that currently provides them. There are well-intended individuals in the tech industry who are moving to improve the moral fabric of the business: they are often not currently the dominant voices. Storytelling could play a role here.
Be aware that whenever you casually post images online, you are now providing biometric and valuable data related to your marketable artist identity. If you are not paying for something, you are the product.
And these are some common things I would recommend we all try, should we want to protect creative agency and grow AI storytelling that is engaged as a tool for civic society:
For individual performers: sign up to your local creators’ guild or union. Examples: Equity UK for Britain, or SAG-AFTRA for the USA.
Stay engaged with the new workers’ alliances being formed. For example: The Ethical AI Alliance (global), where, across borders, artists and academics like myself, technologists, lawyers, unions and students from North America to Africa and Asia come together to discuss new pathways of AI accountability; or the CCAI (creative industries, USA).
Investigate trademarking your audio-visual likeness and/or relevant artistic output. Processes like this can be expensive, and it is worth noting that these local human defences are needed for as long as regional, national and international regulation fails to keep up with the fast global tech and AI developments of 2025-2026. Consult your agent.
If AI is creating in your area of creativity, keep a tab on what it is capable of and on how your output remains different and unique compared with AI's determined patterns and probabilities. This keeps you on top of what your human product is and may, in the more positive uses of this reading, enhance your understanding of your own creative identity or uniqueness. What we have that AI does not is our feelings, our irregularities and our place in a three-dimensional living, breathing world.
Organise tools for carrying out creative practice in person, in a manner that does not interact with computers: for example, handwritten notebooks and physical paintings. This gives you an analogue back-up of your work.
Whenever given the opportunity, engage other industry professionals in value-driven conversation in which you understand your own body, output and creativity as data with value, and also understand the value of your presence as a user of a technological tool. The recent mass ChatGPT boycott demonstrates that the ethical decisions of tech companies can be challenged most effectively by mass economic behaviour. Ted Tremper’s team at the CCAI discussed this approach with the Ethical AI Alliance when we spoke to them about economic strategies across international creative projects in a Q&A this April.
Provide allyship to all human demographics in this swiftly changing industry. A less-represented demographic than yours may need your support more than ever before. So where you do have power to comfortably wield in business, wield it to protect those less powerful than you, so long as you are safe in doing so. All humans need dignity, access to money, and private space to maintain and optimize freedom.
Question any elements of contracts you do not understand. Hollywood creatives have recently asked for proof, for themselves and their lawyers, that their creative output or biometric (body, voice and movement) data has been destroyed at the close of projects. You might need to assert these rights to your own image and output.
Read, watch, listen, but most importantly: act. I can recommend Tristan Harris’s Your Undivided Attention podcast (the Center for Humane Technology) and Dan Kwan and Ted Tremper’s documentary The AI Doc: Or How I Became an Apocaloptimist (released early 2026) to ground yourself in balanced theory on these topics. Our National Theatre guide to photorealism and feminism in digital spaces is here. Read Laura Bates’ The New Age of Sexism for digital feminism. For UK-based readers, keep an eye on the new CoSTAR lab at Pinewood Studios, the Oxford Institute for Ethics in AI at The Schwarzman Centre, and The Atlantic Fellows for talks and broadcast media.
Film poster for The AI Doc, a documentary on the risks and promises of AI.
The Silicon Valley CEO who startled a 500-strong Oxford audience one day in March 2026, with ideas of co-operation for world citizens over collaboration with big tech, had something else going on in his day. He was struggling to string a sentence together across a single plane of logic. In fact his sentences could run to ten meandering clauses with little pause for breath. He disclosed he was working daily with algorithms, training an LLM. He had the appearance and sound of someone drunk on AI: the loop in the human, rather than a human kept in the loop. It is my hope that this article equips our civic industries of film, music and theatre with enough information to begin remaining in the loop, and protecting the worlds we have already built, in a challenging new age.