Compiled by | Ishise
Edited by | Curly
AI hasn't made everyone a director, but it's given wannabe directors the tools to build their dreams.
A wave of video experimentation is forming around the new technology of AI video generation. In a landmark shift, the Runway AI Film Festival (AIFF), known as the "Oscars of AI," saw its entries grow tenfold in 2024, from 300 to 3,000.
Beyond that, enthusiasm for AI video creation has spread from Runway, a leading startup, to well-known film organizations, top film schools around the world, and even major Chinese internet platforms, including Kuaishou, Douyin, and Bilibili. Over the past year, AI film festivals and AI short film competitions have sprung up like bamboo shoots after rain, together sketching the outline of AI's real-world application in video.
As the year draws to a close, "AI New List" has compiled 10 AI short films and their behind-the-scenes production stories, using the award winners of major AI film competitions as the main reference and factoring in word of mouth from AIGC creator communities at home and abroad.
We spoke with the creators of many of these short films, some of whose production details have never been fully disclosed. Some have no experience in traditional film and television production; some turned AI short film making from a side project into a main business within a year; and some are simply students majoring in digital media.
The hands-on experience these frontline creators have accumulated caps a year of rapid growth for AI video in 2024, while anchoring the starting point for its exploration and development in the new year.
How can an AI short film avoid tasting "AI-flavored"? Here is one sample.
Work: Get Me Out
Creator: Daniel Antebi
Awards/Word of Mouth: Gold Award, 2nd Runway AIFF
"Get Me Out," a surreal short film combining live action and AI effects, was recognized by Runway's second annual AI Film Festival as "an example of using AI technology to express a character's inner emotions."
The film tells the story of Aka, a young Asian American man who fights with his "second self" while trying to escape from a house. The production team relied on three main AI tools: Luma AI to capture and recreate the 3D environment; Runway Video to Video (V2V) to enhance the visual effects; and ComfyUI to transform the image of the actor playing the "second self," stripping away the skin texture to show only muscle lines, symbolizing the protagonist's struggle with his inner self.
Using the latest AI video generation technology, "Get Me Out" explores second-generation immigrant identity in the U.S., as well as the broader topic of adolescent mental health, in a way that is both creative and low-cost.
A programmer uses AI to "film" a noir crime story
Work: Ghost Diet
Creator: JimHuiHui
Awards/Word of Mouth: Selected for "Singularity Theatre," Bilibili's AI video support program; Gold Award for Best Animated Short Film, Independent Shorts Awards (Los Angeles)
"Ghost Diet" is highly regarded in AI video creation circles at home and abroad for its strong narrative. It is a noir animated short about Xu Xia, a young man in a county town in southwestern China who, facing his father's serious illness, walks the edge of morality and the law.
The short film's creator, "JimHuiHui," works as a game UI programmer in the United States and made "Ghost Diet" out of personal interest, completing 242 AIGC-generated shots in nearly three months of spare time. Aside from the imagery, which used AI tools, the rest of the production was done by hand.
"JimHuiHui" told "AI New List" that he has no background in film or television. The tools he uses include Midjourney for image generation; Kling, Pika, PixVerse, and Runway for video generation; and CapCut for editing.
A key story location: exterior design of the hotpot restaurant
In his view, one way to work around the current instability of AI-generated footage is to compensate through shot design and editing. "Last September, AI-generated clips ran 3 to 5 seconds, and after trimming the flaws the average usable length was only 1 to 2 seconds. To cut them together in an interesting way, we chose a fast-paced editing style, keeping the information density of every frame high enough that the audience never gets bored." For some shots, "JimHuiHui" also used AE layered animation and Photoshop (adding Chinese text, for example) at the text-to-image stage to raise the information density of static frames, achieving a level of detail full of sincerity.
An AI lover you can't hug
Work: Give Me a Hug for Tomorrow
Production Team: Liu Yixuan / Luo Zizhu / Zhang Rongyu / Yan Mengtong
Awards/Word of Mouth: Special Honor for Digital Media Short Film, New Art Talent Program, BeiFang Film Festival; Best AI Short Film, China-Europa Youth Film Festival
Four digital media students from the Central Academy of Fine Arts (CAFA) produced this short film together. In 2024, ChatGPT's "Dan" mode went viral, and many young people "fell in love" with AIs, posting videos online about daily life with their AI partners. Using those videos as inspiration, the students made this short film that looks at human lovers through the eyes of an AI.
The production blends some live action with AI tools: ChatGPT generated the dialogue; Luma AI simulated the AI's perspective (3D capture); Tripo AI generated a number of prop models for scene building and rendering; and at the editing stage, Midjourney generated transition frames that simulate the AI's loading screens. The only live-action footage in the entire film is of the female lead on a video call.
Some of the small props in the scenes (such as the cake on the table) were generated with Tripo, presenting the AI lover's memories
In creator Liu Yixuan's view, the past year has brought a qualitative leap in image, video, and 3D generation technology, but also countless pieces of soulless AI electronic garbage. A year after the Sora release sparked debate about film and television practitioners being replaced by AI, the takeaway remains: AI is still just a tool for creators, and what matters most in creation is still the content of the work.
When AI meets the Classic of Mountains and Seas: a 10-person team grinds out a hit fantasy micro-drama
Work: Wonder Mirror of Mountains and Seas: Splitting the Waves
Creator: "Idle Man Yikun"
Awards/Word of Mouth: Selected for Kuaishou's Xingmang Short Drama summer lineup; over 430 million topic exposures across the web
"Wonder Mirror of Mountains and Seas: Splitting the Waves" is China's first original AIGC fantasy micro-drama. Inspired by the Classic of Mountains and Seas, it tells the adventure of a young man who endures hardships to save his mother and ultimately fights ancient beasts to the death. It was created by Kuaishou's Kling in cooperation with AI film producer and creator "Idle Man Yikun."
The workflow behind it: Midjourney generated the initial image frames, which were then fed into image-to-video models such as PixVerse and Kling to produce the trailer and final footage. With a production team of just 10 people, the work was completed in about 10 days, and many of its dynamic effects shots rival those of traditional VFX blockbusters.
Gonggong, the God of Water, bursts through the rocks
In fact, before the series officially launched, "Idle Man Yikun" had released a related trailer, "Wonder Mirror of Mountains and Seas," in January 2024. It stirred plenty of discussion in China's AI circles at the time, and together with the full series came to be seen by the industry as a typical case of AI upending traditional film and television effects production.
Assisted by traditional 2D + 3D animation, a video love letter to all women
Work: To My Dear Self
Director: "Children's Picture"
AI Creation: Chen Liu Fang / Zhou Di / Hai Xin / Simon Arvin
Awards/Word of Mouth: First Prize, Venice AI-Generated Short Film International Film Festival; Best Film, AIGC Short Film Section, 14th Beijing International Film Festival; Finalist, Tirana International Film Festival
A prominent example of AI being integrated into traditional animation production in 2024 is "To My Dear Self." This AI-assisted animated short focuses on female growth, telling the story of a girl who discovers her inner strength and learns to grow, to love herself, and to love others.
Most members of the creative team come from Communication University of China. The short film was produced with tools such as ComfyUI, ControlNet, and AnimateDiff; the team trained multiple LoRA models, including a style LoRA and character-image LoRAs, and fine-tuned the SDXL base model with DreamBooth.
In addition, the team used Midjourney to generate a style dataset of more than 300 images referencing a variety of art styles, which ultimately produced the film's distinctive blue-toned look. To explore the fusion of 2D animation, 3D animation, and AI technology, some shots also used C4D, Blender, AE, and traditional hand-drawn animation techniques, with AI handling the final style transfer of the frames. Here AI not only boosted the visual effects but also handled tasks like frame interpolation and texture refinement.
Comparison before and after the 2D style transfer
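The "frame fill" step mentioned above can be illustrated with a toy sketch. Real AI interpolators estimate motion between keyframes; the naive linear cross-fade below is only a stand-in showing the input/output shape of the step (two keyframes in, a denser frame sequence out), not the team's actual pipeline:

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_between):
    """Insert n_between linearly blended frames between two keyframes.

    A naive cross-fade: learned frame interpolation estimates motion
    instead of blending pixels, but the I/O contract is the same --
    two frames in, a denser frame sequence out.
    """
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    frames = [frame_a]
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)  # blend weight in (0, 1)
        frames.append(((1 - t) * a + t * b).astype(frame_a.dtype))
    frames.append(frame_b)
    return frames

# Two tiny 2x2 grayscale "keyframes"
f0 = np.zeros((2, 2), dtype=np.uint8)
f1 = np.full((2, 2), 200, dtype=np.uint8)
seq = interpolate_frames(f0, f1, n_between=3)
print(len(seq))        # 5 frames total
print(seq[2][0, 0])    # midpoint frame pixel: 100
```

In production, each in-between frame would be synthesized by a motion-aware model rather than a pixel blend, but the surrounding editing pipeline treats both the same way.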
A game effects artist uses AI to open the door to anime-style creation
Work: Formatted
Creator: "Soft Tree"
Awards/Word of Mouth: Popular Choice Award, Jimeng AI Future Image Program
"Formatted" won the Popular Choice Award in the AI short film challenge organized by Jimeng AI and Jianying (CapCut). The film tells the story of a cyborg singer who, after his memory card is formatted, gradually loses his memories in his dreams. Its creator, "Soft Tree," previously worked on game special effects and entered the AIGC field as an independent creator in February 2024.
"Formatted," produced in the first half of 2024, has a core workflow of Midjourney + Jimeng AI + AE. "Soft Tree" told us he has long created with a combination of AI and AE: AI generates the footage, and AE edits and composites the clips into a complete work. His most-used AI tools are Midjourney, Kling, Jimeng, Vidu, and Runway.
Over the past year, several of "Soft Tree"'s works have won awards in AI video competitions in China. For example, "Your Memory Consumed by Disease, I Will Fill It with Love" won the Silver Award in the Original IP Film category of the VACAT AI Visual Creativity Competition. He describes himself as "specializing in anime-style AI short films and MVs," and AIGC creation has gone from a side hobby to a direction he plans to pursue full time.
When a Mathematician's Virtual Character Becomes "Self-Aware"
Work: e^(iπ) + 1 = 0
Creator: Junie Lau
Awards/Word of Mouth: Silver Award, 2nd Runway AIFF; Best AI Application, inaugural Artefact AI Film Festival
The beauty of mathematics is visualized in this AI short film. It tells the story of three protagonists who try to uncover the truth of a virtual universe created by a mathematician, only to discover that the so-called "truth" is just part of the code the mathematician wrote.
In terms of production, it is an AI short film that incorporates live action. According to creator Junie, it uses a variety of AI techniques and tools, including Luma AI (which uses NeRF technology to capture and reconstruct 3D environments), Runway Gen-2, Stable Diffusion, Warpfusion (an open-source AI animation tool), ElevenLabs, and ChatGPT. These tools were applied to building the real and virtual worlds, English voice acting for the characters, translation, creating key physical props for the art scenes, and more.
Interpreting the "Loser" Multiverse with AI
Work: The Loser Universe
Creator: "Uncle Xixi"
Awards/Word of Mouth: First Prize, CCTV6 x Bilibili AI Image Contest
In April 2024, CCTV6's Movie Channel and Bilibili launched an AI image competition, and "The Loser Universe" won first prize in the sci-fi track. Set across parallel universes, the short film follows the identities and encounters of a "loser" in different universes.
Creator "Uncle Xixi" told us the short film used an image-to-video workflow: most of the storyboard frames were generated with Midjourney, and image-to-video conversion was done mainly with Kling AI, mixed with Jimeng AI for motion generation.
He has been involved in AIGC text-to-image creation since early 2023. Over the past year he has independently produced original works such as "Instant Dreams," "Kling Burger," "The Loser Universe," and "Superpowered Moms," winning three gold medals in domestic AI short film competitions. He also runs a film and media company, and AIGC is now part of his team's daily workflow. He believes AIGC creation places higher aesthetic demands on creators and tests their ability to make choices amid the massive possibilities AI offers.
Sora's ill-fated viral short film
Work: Air Head
Creator: Canadian production company Shy Kids
Awards/Word of Mouth: "The best release in Sora's history"
"Air Head" is a surreal short film created by Canadian production company Shy Kids with Sora and released publicly by OpenAI in March 2024. Thanks to its complete plot and strong narrative, netizens once hailed it as "the best Sora release ever."
Dramatically, shortly after the film's release, the production team revealed that "Air Head" was not a one-click Sora output. The actual process relied on extensive manual post-production: the team iterated with Sora to generate raw footage, then edited and modified it with traditional film and television tools such as AE.
At the time, Sora could not natively render camera movements such as pans; the team created those effects by hand, cropping and animating in post.
Still, thanks to its unique concept and wide distribution on social media, "Air Head" accumulated real influence and recognition in the AI creative community. It became an important reference for discussion and study among many creators in the first half of 2024, some of whom borrowed its ideas or remade the work with newer AI video tools.
100 AI shots in two days: telling a good story within the limits of time and technology
Work: A Million Miles of Stars and a Thousand Tent Lights
Creator: Zhu Shang
Awards/Word of Mouth: Best Film, MIT Global AI Film Hackathon
"A Million Miles of Stars and a Thousand Tent Lights" is a short film made in February 2024 by creator Zhu Shang, who led a team at the MIT Global AI Film Hackathon and completed production in just two days. It tells the story of an astronaut, once a tech hero, who grows disillusioned with his past and the world and joins an expedition to Mars; when the journey turns perilous, he draws on the courage he has accumulated to pull through.
The main production process: AI-generated storyboard frames were imported into Runway for video generation; some effects shots, such as the rocket's vertical landing, used Runway's Motion Brush for precise control; and the footage was sped up and refined during editing and other post-processing.
Zhu Shang said the team deliberately avoided the complex character building and action fight scenes that AI struggles with, and instead leaned into AI's strengths in generating static images and simple motion effects. For AI shorts, he argues, what still matters is telling a story and conveying universal human emotion, not creating simply to showcase technology.
Over the past year, AI video has moved from the model layer to the application layer, gradually splitting into two routes on the content creation side.
One is UGC mass creation. Internet meme-makers are adept at using AI for absurdist content, and platforms are happy to see it: absurd AI videos on Douyin and other mainstream social media platforms easily pass a million plays, sometimes approaching ten million.
The other is PGC and PUGC professional creation. Looking at the 10 AI short films compiled above, most of the AI shorts that have won awards or earned a reputation in AI circles come from creators with professional film and television backgrounds.
Although the creators of "Ghost Diet" and "Formatted" have no traditional film and television production experience, both have solid fundamentals, such as self-taught audiovisual language and proficiency with software like AE.
It is also for this reason that these works commonly combine multiple AI tools, such as Midjourney, Runway, and ComfyUI, with traditional film and television software such as AE and Blender, to achieve richer visual effects and finer creative control.
Another notable commonality: most of the award-winning creators use an image-to-video workflow, and the AI video tools mentioned most often are well-productized flagship products. Only some creators bring open-source tools with a higher barrier to entry, such as ComfyUI, into their workflows.
Of course, this is closely tied to each contest's scale, positioning, and judging panel. What is certain is that in 2024, AI video, driven by all parties, attracted not only creators with professional backgrounds but also a cohort of PUGC creators sitting between UGC and PGC, together forming the full ecology of AI video creation.
It is safe to say that AI's advances in the image and video space over the past year have genuinely liberated some people's productivity.
That is what we hope to see more of in the new year. After all, AI video tools are still technically immature and need more active frontline creators to keep exploring, pushing AI video further into consumer social entertainment, short- and medium-form video content creation, and professional-grade film and television production.