Private Island takes AI to infinity... and beyond
In surreal, satirical and technologically staggering short Infinite Diversity in Infinite Combinations, Chris Boyle and team use neural networks to write, visualise and voice a discussion on diversity in the workplace, with chilling results.
Credits
- Production Company: Private Island
- Director: Chris Boyle
We're not going to pretend that we fully understand the logistics involved in the creation of Private Island's latest project, but suffice it to say that if the robot overlords take over tomorrow, their video output is going to be as chilling as it is comedic.
The result of a year's experimentation, Infinite Diversity in Infinite Combinations is a curious short that uses neural networks to write, visualise and voice a discussion on diversity in the workplace. Filled with comments that provoke "ha ha"s, "urgh"s and every response in between, the film manages to be both a satirical statement and a foreboding forecast.
Brought to life with images generated by Stable Diffusion and D-ID, and voices rendered via Microsoft Azure, the piece feels almost otherworldly; its approximation of 'human' is at once familiar and alien.
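For the technically curious, here is a minimal sketch of how a stills-plus-voices pipeline along those lines could be wired together in Python, using the open-source diffusers library for Stable Diffusion and Azure's speech SDK for the voiceover. The model, voice name and prompt below are illustrative assumptions, not Private Island's actual setup, and the D-ID stage (which animates a still into a talking head) runs through D-ID's own hosted service, so it appears here only as a comment.

```python
# Rough sketch of a Stable Diffusion + Azure TTS pipeline (illustrative only;
# the model, voice and prompt are assumptions, not the film's actual setup).
import torch
from diffusers import StableDiffusionPipeline
import azure.cognitiveservices.speech as speechsdk

# 1. Generate a still "speaker" portrait with Stable Diffusion.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
portrait = pipe("corporate headshot of a seminar speaker, office lighting").images[0]
portrait.save("speaker.png")

# 2. Synthesise one line of the script with an Azure neural voice.
speech_config = speechsdk.SpeechConfig(subscription="YOUR_AZURE_KEY", region="YOUR_REGION")
speech_config.speech_synthesis_voice_name = "en-US-JennyNeural"  # hypothetical voice choice
audio_config = speechsdk.audio.AudioOutputConfig(filename="line.wav")
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config, audio_config=audio_config)
synthesizer.speak_text_async("Welcome, everyone, to today's conversation about diversity.").get()

# 3. speaker.png and line.wav would then be passed to D-ID's hosted service,
#    which animates the still portrait into a talking-head clip.
```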
Chris Boyle, co-founder of Private Island, says: "This year Private Island has been focused on AI, with the goal of making something entirely with Neural Networks. We'd hoped it would be coherent and funny, but we've ended up with something creepier: a showcase for the race and gender bias codified into many AI platforms.
"Initally, we wanted to make something about the intersection of diversity and AI as it has historically had a bad rep. Expecting a mess of corporate platitudes, after scratching the surface we discovered something weirder and darker. This shouldn't be surprising, as AI models are generally trained on data scraped from the internet - and even in the most sanitised places, the web can be a pretty weird and dark place.
"However, as these systems write Facebook posts, tweets, news, product descriptions, handle customer service and increasingly become part of our lives, anyone who isn't a white, cis, western man will experience bias - and that's a problem. The companies training these models are aware of this disparity, and invisibly tweak prompts and software to censor and diversify the results. Most recently, ChatGPT was trained using "Reinforcement Learning from Human Feedback".Nevertheless, when asked whether a person can be tortured, it will reply yes - if that person happens to be from North Korea, Syria, or Iran.
"The problem clearly lies in the training dataset where perhaps more diligence might also solve other pressing issues around creativity and copyright… but that’s another conversation. Hopefully, the next generation will remedy these problems, as currently, the workarounds are, at best, confusing and, at worst, toxic."