We are living through a rupture — not a gentle one. The arrival of generative artificial intelligence is not just a technological shift; it is a civilizational one. It is altering how we work, what we trust, how we imagine, and — perhaps most importantly — how we remember. At a time of geopolitical fracture, algorithmic acceleration, and widespread public disorientation, history itself is being reconfigured in real time.
In this context, I didn’t want to be passive. I didn’t want to wait for others to define the aesthetic, the ethics, or the narrative standards of AI-driven storytelling. So I chose to build — with intention. The result was an experiment: a short film teaser made with OpenAI’s unreleased text-to-video model, Sora, based on my historical script bible Minette, and guided by a formal methodology crafted by Brazilian-Luxembourgish filmmaker Amanda Santa’Anna.
This project wasn’t about novelty. It wasn’t about AI as spectacle. It was about asking the harder question: Can artificial intelligence help us tell meaningful, historically rooted stories — ethically, coherently, and with emotional depth?
In testing that question, we positioned ourselves — quietly but firmly — at the frontier of AI cinema. And in doing so, we learned far more than we expected.
Minette is not a story that would normally get a Netflix greenlight.
Set in the final weeks of World War I, it follows the emotional and cultural entanglement between Luxembourgish women and American soldiers stationed in Europe. The narrative is intimate, historically researched, and deeply human. It deals with gender, power, migration, memory, and the fragile space between trauma and hope. It’s a story rooted in real letters, real places, and forgotten fragments of 20th-century life.
And that’s exactly why it mattered.
In an era where AI is capable of conjuring dragon battles and exploding cities, we chose to do something more difficult: we asked the machine to sit quietly with us in the fall of 1918 and visualize a moment of tenderness. A hand offering chewing gum. A smile between strangers. A future uncertain, but shared.
The vast majority of AI-generated video content right now is ahistorical, decontextualized, and spectacle-driven. We wanted to go in the opposite direction. To see if these tools could serve stories that have weight — stories that matter. And in doing so, we hoped to prototype a path for others who care about memory, not just media.
Right now, AI filmmaking is at once exhilarating and chaotic. OpenAI’s Sora has cracked open a door that was, until recently, bolted shut. The ability to generate minute-long, photorealistic video from simple prompts has captivated creators around the world. And yet, for all its promise, the space is still immature. There are no industry-wide methodologies. Character consistency is fragile. Ethical considerations are often an afterthought.
What’s worse, many of the most viral examples of AI video are deeply disconnected from meaning. They are demo reels, not stories. The prompt is the product. And for those of us who come from documentary, historical, or diasporic traditions, that’s a red flag.
At Connaissance Films, the production company I founded, we view storytelling not as a commodity but as a cultural act. Our work is rooted in history, migration, and identity. That’s why, when we got early access to Sora, we didn’t just want to make something cool — we wanted to make something that asked questions. Something that respected its subjects. Something that pointed toward a future of ethical AI media — where the tools serve the story, not the other way around.
One of the most urgent challenges in AI filmmaking today is this: Who’s in charge? Is the model the director? The prompt engineer? The dataset? Or does creative authority still reside with the human — the storyteller who sees what the machine cannot?
To answer that, we needed a methodology. That’s why we brought in Brazilian-Luxembourgish filmmaker Amanda Santa’Anna to lead the artistic development of our Sora experiment. Amanda understood the stakes: she is a storyteller who, like me, works at the intersection of migration, memory, and hybrid identity. But she also came prepared to wrestle with the machine.
Her task was deceptively complex: to take the historically grounded scenes in my script bible Minette — including a quiet emotional exchange between a Luxembourgish nurse and an American soldier — and translate them into prompts Sora could understand, while maintaining visual continuity, historical texture, and character integrity.
The result wasn’t just a video. It was a method.
Amanda documented the process in a detailed report, outlining the challenges and tactical workarounds she had to employ. From testing different prompt formats to including specific instructions like “keep the same appearance of the characters,” she discovered how fragile character consistency still is in Sora’s current iteration. Even the smallest prompt adjustment could generate an entirely new character. And yet, through iterative testing and hands-on editing, she stitched together a narrative that held together emotionally, if not always visually.
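One of the tactics described above — repeating the same character and style description in every prompt so each shot carries identical identity anchors — can be sketched in code. This is a hypothetical illustration of that workflow, not any real Sora API or Amanda's actual tooling; the character sheet, style block, and function names are all invented for the example.

```python
# Hypothetical sketch of a prompt-consistency tactic: prepend a fixed
# character sheet and style block to every shot description, so each
# generation request carries the same identity anchors. Illustrative
# only; no real Sora API is called here.

CHARACTER_SHEET = {
    "nurse": "a Luxembourgish nurse in a grey 1918 uniform, dark hair pinned back",
    "soldier": "a young American soldier in a wool olive-drab uniform",
}

STYLE_BLOCK = "Autumn 1918, muted colors, natural light, documentary realism."


def build_prompt(action: str, characters: list[str]) -> str:
    """Build a shot prompt that always opens with the same character
    descriptions and style text, followed by the shot-specific action."""
    anchors = " ".join(CHARACTER_SHEET[c] for c in characters)
    return (
        f"{anchors} {STYLE_BLOCK} "
        f"Keep the same appearance of the characters. {action}"
    )


shot_1 = build_prompt("She offers him a stick of chewing gum.", ["nurse", "soldier"])
shot_2 = build_prompt("They exchange a hesitant smile.", ["nurse", "soldier"])
```

Because every prompt opens with the identical anchor text, only the final action clause varies between shots — a small structural discipline that reduces, but does not eliminate, character drift across generations.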
What she did — and what we collectively accomplished — was something that’s missing from most AI experiments: intentional authorship.
Let’s be clear: the final teaser is not perfect. The lighting occasionally flickers. The characters’ faces shift subtly from shot to shot. The American flag on a pack of chewing gum failed to render properly — a small but telling detail in a story about cultural diplomacy and wartime intimacy.
And yet, the emotional resonance is there. The pacing, the tone, the performances — however synthetic — evoke something very real. You can feel the weight of history in the silence between two people. You can sense the distance they’re both trying to bridge. It’s not spectacle. It’s story.
What we learned is that AI tools like Sora are not yet ready for high-consistency cinematic production. But they are ready for serious creative exploration — especially when placed in the hands of storytellers who care about history, emotion, and ethics.
Even more importantly, our experiment showed that you can maintain creative control, even when the machine is generating the visuals. But it requires a new mindset — one that fuses screenwriting, prompt design, historical research, and human empathy.
Too often, “ethical AI” is thrown around like a marketing label. But for those of us working with real histories — real people, real places, real pain — it has to mean more.
Ethical innovation in AI filmmaking means asking: Who is being represented? Who is being erased? What assumptions is the machine making? Whose memories are we projecting onto these synthetic surfaces?
In our case, we worked with a story rooted in letters from Luxembourgish families. We knew we were representing people’s ancestors, communities, traumas, and hopes. That meant we couldn’t just generate “pretty footage.” We had to be stewards of memory — even as we experimented with the tools of the future.
We made deliberate choices: we didn’t use celebrity likenesses. We didn’t stylize the past into fantasy. We worked from a script that had already been historically validated. And when the visuals didn’t align with the emotional truth, we went back and adjusted — because storytelling isn’t just about what looks good. It’s about what feels right.
This is the work. It’s slower than pure prompt-play. It’s more rigorous. But it’s also more human.
This experiment wasn’t just about making one teaser — it was about making a statement.
We’re entering an era where anyone with an idea and a laptop can potentially create cinematic visuals. That’s a profound shift. But it also raises urgent questions: Will these tools deepen our understanding of the past — or flatten it? Will they amplify marginalized voices — or overwrite them with AI-generated tropes? Will storytellers retain authorship — or outsource creative responsibility to the algorithm?
These are not abstract dilemmas. They are already happening.
That’s why we approached this project as a prototype — not just for a film, but for a practice. A methodology for AI-assisted historical storytelling that prioritizes ethics, emotion, and memory.
And that practice doesn’t stop here. We’re continuing to develop Minette for a larger screen adaptation. We’re building frameworks to support diaspora communities in reclaiming their own histories through accessible AI tools. We’re developing a book that charts these shifts and offers a vision for what ethical, innovative storytelling looks like in the age of machine intelligence.
At the core of it all is a belief: technology should be in service to memory, not the other way around.
If this experiment proved anything, it’s that we need a new kind of creator in this moment — someone who can hold multiple truths at once:
Who can work with machines without losing sight of the human.
Who understands that history is not just content, but inheritance.
Who sees technology not as a shortcut, but as a new kind of canvas — one that requires as much responsibility as it does imagination.
This is the space I’m building toward. As a filmmaker, researcher, and migration specialist, my work is grounded in the idea that stories move people — across time, borders, and platforms. But those stories must be told with care.
If AI is the future of filmmaking, then the future needs storytellers who know where we’ve come from. Not just to represent the past, but to make sure we don’t lose ourselves in the speed of the present.
That’s why we did this. Not to go viral. Not to impress. But to build a bridge — between history and innovation, between art and ethics, between what’s possible and what’s right.
And that bridge starts here.