Behind the Scenes of GOLDEN | a short film made in 24 hours

So, there’s this thing at the College of William and Mary (where I went to school) that a bunch of us do every year called 24 Speed, where teams of students and alumni race the clock to shoot a short film in under 24 hours. It’s wild, it’s weird, it’s a ton of fun, and it tends to encourage stupid creativity in the best of ways.

This year, the prompts were A Frog and the line of dialogue “Regret, what a funny word,” and our team was assigned ‘Oscar Bait’ and ‘Experimental’ as our genres to play with.

Roughly 24 hours later, we’d come up with and exported this:

The timeline of Golden, edited in Final Cut Pro X

For GOLDEN, most of the imagery was created with a few different AI image generators, like Midjourney and Bing Image Creator, with assistance from Adobe Photoshop’s Generative Fill.

The different models have different strengths and weaknesses, with Bing Image Creator being particularly good at logos, like Poison Arrow Softworks:

Some of the draft generations on the way to the Poison Arrow Softworks logo.

An alternate logo for Poison Arrow Softworks created with Bing Image Creator

The final logo for POISON ARROW SOFTWORKS

And the Artificial Intelligence Golden Boy:

Alternates for the Golden Boy (TM) Logo created by Bing Image Creator

The final Golden Boy Logo

Meanwhile, Midjourney is better for creating settings, backdrops, and characters (like Deepfake DiCaprio):

A selection of Deepfake DiCaprios

Server rooms to slap the Poison Arrow Softworks logo onto.

The Los Angeles skyline at twilight, as perceived by Midjourney

That same skyline, but toasty.

With Photoshop’s new Generative Fill feature, I can quickly select the subject in an image, move them to a new layer, and then fill in the space they occupied with an extension of the background.

This let me keyframe each element at a different rate, creating an illusion of motion and a parallax effect.

So now this still image becomes two separate elements…

…a PNG of the host with a transparent background…

…and the background itself, with no host.
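For anyone curious what this looks like outside of Photoshop, here’s a rough scripted equivalent in Python. To be clear, this is just a sketch of the idea, not what I actually did for GOLDEN (I clicked buttons in Photoshop like a normal person); it assumes the open-source rembg library for the cutout and a Stable Diffusion inpainting model standing in for Generative Fill, with made-up file names and prompt.

```python
# A sketch of the cutout-and-fill idea, not the actual GOLDEN workflow.
# Assumes: pip install rembg diffusers transformers torch pillow
import torch
from PIL import Image
from rembg import remove
from diffusers import StableDiffusionInpaintPipeline

still = Image.open("host_still.png").convert("RGBA")  # hypothetical file

# Step 1: cut the subject onto its own layer with a transparent background.
host_layer = remove(still)
host_layer.save("host_cutout.png")

# Step 2: mark the pixels the subject occupied (white = region to fill in).
alpha = host_layer.getchannel("A")
mask = alpha.point(lambda a: 255 if a > 0 else 0)

# Step 3: have an inpainting model extend the background into that region,
# which is the same job Generative Fill does inside Photoshop.
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting", torch_dtype=torch.float16
).to("cuda")
background = pipe(
    prompt="empty studio set, warm golden lighting",  # illustrative prompt
    image=still.convert("RGB").resize((512, 512)),
    mask_image=mask.resize((512, 512)),
).images[0]
background.save("background_plate.png")
```

And once the host and the background are separate layers, the fake-parallax move is just sliding them at different speeds. I keyframed this in Final Cut Pro, but here’s a minimal sketch of the same idea using the moviepy library (the pixel speeds and positions are made up):

```python
from moviepy.editor import ImageClip, CompositeVideoClip

W, H, DUR = 1920, 1080, 6  # frame size and shot length (seconds), assumed

# The background plate pans slowly while the host cutout drifts faster;
# the difference in speed is what sells the parallax. (Render the plate
# wider than the frame so the pan never exposes its edge.)
background = (ImageClip("background_plate.png")
              .set_duration(DUR)
              .set_position(lambda t: (-20 * t, 0)))      # 20 px/sec
host = (ImageClip("host_cutout.png")
        .set_duration(DUR)
        .set_position(lambda t: (300 - 60 * t, 200)))     # 60 px/sec

CompositeVideoClip([background, host], size=(W, H)).write_videofile(
    "parallax_shot.mp4", fps=24)
```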

I’ve also found that Midjourney is really good at coming up with gonzo concepts based on text descriptions. While I was writing the original script for Golden and bouncing around ideas for what ‘Oscar Bait’ could mean, one of my thoughts was baiting a giant Oscar statue rampaging through the streets of downtown LA. Putting a few prompts into Midjourney, I got some real weird images of a giant golden head smashing through buildings, which inspired the ending of the movie.

Oscar awakens…

And the Academy Award for apocalyptic destruction goes to…

We don’t need no water…

Now THAT’S a boulevard of broken dreams.

To animate the city being on fire for the last couple of shots, I used a new AI tool called Pika to take some of the stills Midjourney had created and turn them into videos.

A lot of things get set on fire by the end of this story. This tends to happen in the stuff I write.

Like most AI tools, Pika involves a lot of trial and error, with an occasional side serving of unintentional nightmare fuel, like when I tried to animate the two deepfake actors (to spare the audience, I decided to just keyframe those stills instead).

Oh good, I think the Acid tab just kicked in.

FOR THE LOVE OF ONES AND ZEROES WHYYYYY?!!

It’s wild how these tools can output so much material so quickly. For an indie filmmaker working against the clock in a timed film competition like this, it’s awesome. In less than 24 hours, I was able to produce multiple variants of every asset I could possibly want and refine them to my liking through successive iterations.

One of the lines I’ve heard about tools like Midjourney is that they turn basically anyone into an art director, curating pieces to create a specific look and feel. And I’ve definitely found that they’ve helped me as a writer and filmmaker by letting me visualize story concepts I otherwise had only an ambiguous grasp on. As a person who likes to make stuff, particularly high-concept, low-budget sci-fi stuff, I’m thrilled. (Look, I made this, with my computer!)

But I’m also rather wary of the effects tools like these will have on the filmmaking industry as large companies try to use them to replace jobs and disrupt what have long been good creative careers.

These tools are only capable of doing what they do because of the collected efforts of the millions of artists who made the works these AI models and image generators were trained on, and the idea that they might be used to put those same artists out of a job is horrifying. Personally, having thought about this a bit, I don’t think we should be able to trademark or copyright anything produced with Generative AI like Midjourney or Bing Image Creator, unless the contributors to the training data were fairly compensated (which is a whole can of worms in and of itself). If anything, these outputs should be in the public domain: available to all, but owned by no one.

There’s a lot to unpack here, something I’m hoping to get to in another blog entry soon, but for now, I hope this has been an informative window into some of the themes and process of Golden, and an exploration of some of the big ideas behind this (honestly kinda dumb) little short.

•••


CHANCE OF STORMY SKIES - Webseries premiere!

One more day until the premiere of CHANCE OF STORMY SKIES!

A genre-bending anthology series of weird tiny films. Often darkly comedic, always high concept, occasionally genuinely upsetting. Sometimes, there are spaceships. Created by Ted Hogeman and Sudeshna Mukherjee.

The first episode, SPACE INVASION THE MOVIE FILM, is going online TOMORROW, January 10, 2024, at Noon!

Available on YouTube, Threads, and TikTok (on a related note, Laughing with the Storm apparently has a TikTok now?)

Catch more episodes of Chance of Stormy Skies each Wednesday at noon over the next several weeks!

•••

Project Sneak Peek | Operation: Wet Paint

Looking back on a few projects this year, I wanted to highlight some of the other filmmakers I’ve been lucky enough to work with outside of official Laughing with the Storm projects. One of those was Operation: Wet Paint, directed by Thomas and Curtis Nishimoto and produced by Jack Strayton.

OPERATION: WET PAINT is a comedy short about a group of kids who hatch a complex, heist-style plan to get revenge on the bullies who have been tormenting them.

Here’s a peek behind the scenes at the production of Operation: Wet Paint

The protagonists test a secret weapon for their plot in Operation: Wet Paint

Directors Thomas and Curtis Nishimoto stand at the monitor for a scene.

Director of Photography Daniela Mileykovsky stays out of the sun with an umbrella as she operates the camera.

The Sony FX9 that was the primary camera used to film Operation: Wet Paint

DP Daniela Mileykovsky and Production Sound Mixer Raven Jackson prepare for a key sequence with actors Sean DiGiorgio and Isaiah Owens.

Actors Isaiah Owens and Sean DiGiorgio prepare for a scene while the rest of the crew uses umbrellas to stay out of the summer sun.

Gaffer Berlin Waechter takes a behind the scenes still as Directors Thomas and Curtis Nishimoto demonstrate the blocking for actor Jaeden White.

Producer Jack Strayton stands at the monitor as Gaffer Berlin Waechter brings additional supplies to set.

