Category Archives: Video

"Eclipse" music video

What does the future hold for storytelling – can a machine create cinema?

The release of the trailer for Morgan is a further sign that Artificial Intelligence (A.I.) is creeping slowly onto the creative stage. After initial progress in 1996, IBM’s Deep Blue beat World Chess Champion Garry Kasparov in 1997; IBM’s Watson then beat human champions at Jeopardy!, and more recently Google’s DeepMind conquered the ancient Chinese game of Go. Google Photos even generates videos (with music) automatically from the images on my phone. It’s becoming obvious that deep strategic thinking is at least possible for machines.

So, can a machine create a video narrative? Could we tell the difference?

The unfortunate fact is, of course, that the Morgan trailer is hollow and poorly-paced (even with the help of an “IBM filmmaker”), and the musicians behind the AI-directed “Eclipse” music video have distanced themselves from the end product.

Looking deeper into each of these projects, both still required a human hand to direct, collate and guide the machine. Each is a ground-up approach to A.I. – there’s no off-the-shelf “Watson Video Editing Software” on the market.

However, the building blocks already exist – the Google search engine uses natural-language processing, and Wolfram Alpha accepts queries in plain English. We now have (pretty good) automatic web summaries and headline analysers. There’s a reason why Google’s Principal Filmmaker Jessica Brillhart thinks Zork’s language processing will heavily influence the future of VR.

It seems that although we probably have a while to go before creativity is realistically threatened, most people won’t care whether something was created by a computer. Much of the in-house promotion we currently see on TV channels is already packaged in a way that wouldn’t require human intervention – so for specific situations it might not be that long after all.

So a machine can assemble a video (I hesitate to use the words ‘edit’ or ‘direct’) – but not very well, at least not yet. However, as the linguist Noam Chomsky said, perhaps we are asking the wrong question:

“Thinking is a human feature. Will AI someday really think? That’s like asking if submarines swim. If you call it swimming then robots will think, yes.”


The Rise of the Diegetic Intertitle

Prior to the integration of sound, movies often displayed text-based information using title cards or intertitles. This form of communication is non-diegetic content: it sits outside the world of the story, and the characters cannot see it.

However, even after the invention of the talkie, other types of information needed to be displayed — such as translations for a foreign audience. This was usually done in a very perfunctory way, with non-descriptive text (typically achieved using a font such as Times New Roman, with a black outline to contrast with any background) on the lower third of the screen. This text is external to the story, so it seemed natural that it should be stylistically different.

In more recent years, demands on visual storytelling have increased – small-screen devices (e.g. mobiles) and computers have become part of the language of cinema (and, by extension, of the video and digital screens on which it is ‘projected’). Additionally, in today’s more multicultural world, the need to show multiple languages within the same film means that different typographic techniques can be used to enhance this aspect of the story. This is really an extension of traditional subtitling – where sound effects and untranslated languages are often still included.

As mobile and internet technology started to appear on screen, an editor would typically cut to a shot of the device, allowing the viewer to read the display. As post-production technology improved – and as TV’s demand for faster plot exposition, product placement costs and legal clearances required a more generic approach – this evolved into showing the interface incorporated directly into the visual frame.

Subtitles, captions and interface design typically sit independently on top of the content as a layer added in post-production – a semi-transparent wall between the story and the viewer. Integrating these titles to make them appear part of the content can be quite a technical challenge — especially when they need to be tracked to a moving camera.

This overlaying technique was demonstrated in movies such as Man On Fire (2004), Stranger Than Fiction (2006), Disconnect (2012), and 2014’s The Fault In Our Stars, John Wick, and Non-Stop, as well as TV shows such as (perhaps most influentially) Sherlock (2010) and House of Cards in 2013.

There are two main types of elements in modern cinema: diegetic – anything the characters would recognise as happening within the world of the narrative – and non-diegetic – anything that sits outside the story (opening credit sequences, for example).

However, (much like modern media itself) on-screen typography has surpassed merely being integrated visually into the background plate. It is now becoming increasingly self-reflexive, and blurs these diegetic lines. This is often referred to as “breaking the fourth wall”, and is perhaps best demonstrated in the opening titles to the 2016 film Deadpool, where even the actual names of producers are subverted into narrative elements.

For more exegesis of the diegesis (sorry, I couldn’t help it), see Tim Carmody’s excellent 2011 SVA Interaction Design presentation “The Dictatorial Perpendicular: Walter Benjamin’s Reading Revolution”.

“A fourth wall break inside a fourth wall break? That’s like, sixteen walls.” – Deadpool

Masters of Videomontage

Cyriak (Brighton, UK), Fernando Livschitz (Buenos Aires, Argentina) and Till Nowak (Hamburg, Germany) are three of the most fascinating video animators I’ve ever seen. Using found footage and masks, they create a surreal and often disturbing view of reality.

As mentioned in the ‘Heroes of Animation’ film, Cyriak sees this style as a natural evolution of the Terry Gilliam school – taking photographic elements and moving them in unexpected ways. I would go further and say that it takes Russian Constructivist fine-art photomontage to its natural conclusion.


We are so used to amateur camcorder and mobile video these days that this approach seems to even transcend animation – and we are drawn into their world. So much so that The Institute for Centrifugal Research seems to be (remotely) plausible.

And here’s how it’s done.

This profile of Cyriak includes a history of his work, and a demonstration of his process. This behind-the-scenes video from The Centrifuge Brain Project shows the CGI overlaid over the source footage, and this After Effects tutorial explains the basics, using a locked-off camera (then you can add natural camera movement afterwards).

The Others - Hiroshi Kondo


Hiroshi Kondo captures the energy and the loneliness of living in a vast metropolis in his experimental short, The Others. The slit-scanning film bends time and place into a moving portrait of a Tokyo square, highlighting the individual and the crowd moving both separately and in haunting unison. The overall product sits somewhere between glitch art and augmented reality.
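Slit-scanning is simple to describe in code: each column of the output image is sampled from a different moment in the source clip, so horizontal position in the result maps to time. Here is a minimal Python sketch of the idea (my own illustration, not Kondo’s actual pipeline), using plain lists for a grayscale frame stack:

```python
def slit_scan(frames):
    """Build a slit-scan image from a stack of video frames.

    frames: list of frames; each frame is a list of rows; each row
    is a list of pixel values. Output column x is taken from a frame
    chosen proportionally to x, so the left edge of the result shows
    the start of the clip and the right edge shows the end.
    """
    num_frames = len(frames)
    height = len(frames[0])
    width = len(frames[0][0])
    out = [[0] * width for _ in range(height)]
    for x in range(width):
        # map column position to a source frame index
        t = x * (num_frames - 1) // max(width - 1, 1)
        for y in range(height):
            out[y][x] = frames[t][y][x]
    return out
```

With a moving crowd as input, each output column freezes a different instant – which is what produces the smeared, time-bent look of the film.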

You can see more of his work at his website below.


Bob Dylan – Like A Rolling Stone

This interactive video (from 2013) is the song’s first official video. It allows viewers to use their keyboards or cursors to flip through 16 channels that mimic TV formats such as game shows, shopping networks and reality series. The people on each channel, whatever TV trope they represent, all lip-sync the lyrics.

“I’m using the medium of television to look back right at us,” director Vania Heymann told Mashable. “You’re flipping yourself to death with switching channels [in real life].” Adds Interlude CEO Yoni Bloch: “You’ll always miss something because you can’t watch everything at the same time.”

The stations you can flip through include a cooking show, The Price Is Right, Pawn Stars, local news, a tennis match, a children’s cartoon, BBC News and a live video of Dylan and the Hawks playing “Like a Rolling Stone” in 1966.


Eric Prydz - Hologram

Eric Prydz – EPIC 4.0 Tour Visuals

Coming off the very successful campaign for Eric Prydz’s Generate music video, our friend Michael Sershall hired the team back to design the visuals for his EPIC 4.0 tour. The setup for the live show was fairly insane, with content screens forming a cube: a 28mm see-through LED in front, a Holo Gauze through the middle for a mesmerizing hologram projection, and finally a 12mm 4:1 widescreen LED in the back, enclosing the cube and playing back the key content.

For the gig, Munkowitz tapped his favorite collaborators, the great Conor Grebel and Michael Rigley, both ridiculously talented Cinema4D Artists and Animators, who brought their A-Game for this throwdown. All the content was rendered with the amazing Octane Renderer which meant the team bought two superComputers and a fuckLoad of graphics cards to render all the wetness. In the end, the project was about making art for entertainment, and these kinds of paying gigs are what we love.


Werner Herzog Talks Virtual Reality

“I am convinced that this is not going to be an extension of cinema or 3-D cinema or video games. It is something new, different, and not experienced yet,” the filmmaker Werner Herzog said of virtual reality. An interview by Patrick House with the filmmaker about simulation and experience.


Wanderers – a short film by Erik Wernquist

The film is a vision of humanity’s future expansion into the Solar System. Although admittedly speculative, the visuals are all based on scientific ideas and concepts of what our future in space might look like, if it ever happens. All the locations depicted in the film are digital recreations of actual places in the Solar System, built from real photos and map data where available.


Wireless DMX Lighting Control Using Arduino and Vixen

A step-by-step tutorial on how to control and sequence wireless lighting effects – either for installations, displays, or wearable designs. It’s based on the Arduino board (or Freaklabs’ Fredboard) using Vixen software (v3). Everything you need – from scratch right through to code and working examples.
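The tutorial’s own code is linked above; as a rough illustration of what the Arduino ends up transmitting, here is a minimal Python sketch of the data portion of a DMX512 frame – a start code followed by up to 512 channel bytes. The function name and validation are my own, not taken from the tutorial; the electrical framing (break, mark-after-break, start/stop bits) is handled by the transmitter hardware.

```python
def dmx_packet(channels):
    """Build the data portion of a DMX512 packet.

    A DMX frame is a start code (0x00 for standard dimmer data)
    followed by up to 512 single-byte channel values (0-255).
    Sequencing software like Vixen supplies these channel values
    for each frame of the light show.
    """
    if len(channels) > 512:
        raise ValueError("DMX allows at most 512 channels per universe")
    if any(not 0 <= c <= 255 for c in channels):
        raise ValueError("channel values are single bytes (0-255)")
    return bytes([0x00]) + bytes(channels)
```

For example, `dmx_packet([255, 0, 128])` sets channel 1 to full, channel 2 off, and channel 3 to half – the same three bytes the wireless link would carry to the fixture.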


A 3D Fractal Artist Is Building an ‘Interstellar’ Inspired VR World

Filmmaker Julius Horsthuis, a frequent explorer of fractalized caverns and endless alien planets, has begun a line of computer-generated experiments that could let us explore our own Interstellar-like multidimensional realities. His impressive series of sweeping fractal vistas, beginning with Giger’s Nightmare nearly a year ago, has given him a wealth of knowledge about making gorgeous fractals. Now, he has channeled that experience into building Hallway 360VR, the first in a line of 360-degree virtual reality animations.


Lightpainting with Pixelsticks

Pixelstick consists of 200 full color RGB LEDs inside a lightweight aluminum housing. The mounted controller reads images from an SD card and displays them, one vertical line at a time, on the LEDs. Each LED corresponds to a pixel in the image.
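The column-at-a-time display maps naturally to code. As a hedged sketch of the principle (my own illustration, not Pixelstick’s firmware): scale the image so its height matches the LED count, then step through it one vertical line per exposure slice.

```python
def image_to_columns(image, num_leds=200):
    """Yield one vertical line of pixels at a time, Pixelstick-style.

    image: list of rows, each a list of (r, g, b) tuples. The image is
    assumed to be pre-scaled so its height equals the LED count.
    Each yielded column is the list of colours to light on the stick
    for one slice of the light painting; walking the stick past the
    camera during a long exposure reassembles the full image in air.
    """
    assert len(image) == num_leds, "scale the image to the LED count first"
    width = len(image[0])
    for x in range(width):
        yield [image[y][x] for y in range(num_leds)]
```

The one-LED-per-pixel mapping is why the source image’s height matters: a 200-pixel-tall image uses the stick at full resolution.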


Cymatics – Nigel Stanford

Cymatics is the first single from Nigel Stanford’s new album Solar Echoes. It was shot in 6k resolution on Red Dragon cameras, and finished in 4k / Ultra HD.

The team went through months of research, testing, and development to make sure the experiments – including a Chladni plate, speaker dish, hose pipe, ferrofluid, Rubens tube, and Tesla coil – looked great in the final film.

Cymatics is the study of visible sound and vibration, a subset of modal phenomena. Typically the surface of a plate, diaphragm, or membrane is vibrated, and regions of maximum and minimum displacement are made visible in a thin coating of particles, paste, or liquid. Different patterns emerge in the excitatory medium depending on the geometry of the plate and the driving frequency.
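For an idealised square plate, the Chladni patterns can be approximated with a classic closed-form standing wave: the sand collects along the nodal lines, where the amplitude is (near) zero. A small Python sketch of that approximation (real plates behave differently in detail, but the qualitative patterns match):

```python
import math

def chladni_amplitude(x, y, m, n):
    """Standing-wave amplitude on an idealised unit square plate.

    Uses the classic approximation
        u = cos(n*pi*x)*cos(m*pi*y) + cos(m*pi*x)*cos(n*pi*y).
    Particles gather along the nodal lines where u is near zero;
    higher mode numbers (m, n) give the more intricate patterns
    seen at higher driving frequencies.
    """
    return (math.cos(n * math.pi * x) * math.cos(m * math.pi * y)
            + math.cos(m * math.pi * x) * math.cos(n * math.pi * y))

def nodal_points(m, n, grid=50, eps=0.05):
    """Sample the plate on a grid and return points near a nodal line."""
    pts = []
    for i in range(grid + 1):
        for j in range(grid + 1):
            x, y = i / grid, j / grid
            if abs(chladni_amplitude(x, y, m, n)) < eps:
                pts.append((x, y))
    return pts
```

Plotting `nodal_points` for increasing (m, n) reproduces the progression seen in the video: simple crosses at low frequencies, dense lattices at high ones.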


3D Projection (without a screen)

A team of researchers in Japan led by Akira Asano has developed a technology they call ‘Aerial Burton.’ The device works by firing a 1 kHz infrared pulsed laser into a 3D scanner, which focuses and reflects the laser to a specific point in the air. Air molecules at that focal point ionize, releasing energy in the form of photons.


Why good leaders make you feel safe

What makes a great leader? Management theorist Simon Sinek suggests it’s someone who makes their employees feel secure, who draws staffers into a circle of trust. But creating trust and safety – especially in an uneven economy – means taking on big responsibility.


Isaac Delusion’s “Pandora’s Box”


Created by Studio Clée – comprised of Alizée Ayrault and Claire Dubosc, with additional contributions from Romain Avalle – the project includes 600 videos and 1,500 possible slots, yielding a seemingly infinite number of combinations. After 20 seconds, you get the option to share the unique video with friends, who can either watch your personalized video or experience a fresh, reincarnated one. All are adorned with eye-catching vintage stock footage – cuts from North by Northwest and other 20th-century gems – taken from the Prelinger Archives, a public-domain database created in 1983.

View the project here