on Monday, December 19, 2011
Autodesk, together with Paramount Pictures and Weta Digital, invited industry professionals to a private advance screening of Steven Spielberg's 3D film, “The Adventures of Tintin,” held December 13, 2011 at the Landmark Theatre in Los Angeles. David Morin, Autodesk representative and Co-Chair of the ADG Virtual Production Committee (VPC), led the Q&A session that followed with Senior Visual Effects Supervisor Joe Letteri. The audience was a good mix of cinematographers, producers and VFX artists, many of whom are members of the Previs Society and the VPC, another offshoot of the ADG Technology Committee.

I was eager to attend this event, the Q&A in particular, since there is concern among the ADG membership that no Production Designer was credited. I had read David Denby's review in the New Yorker the day before, in which he briefly describes the making of the film:
The play between fantasy and realism is what gives the film its special look. Spielberg and his collaborators (Peter Jackson was the producer) have come up with the equivalent of Hergé’s clean-limbed, lean-forward manner (the characters in Hergé’s comic books seem always to be moving into the next panel). The animators labored for two years establishing settings—a street, a ship, a Moroccan city—and then the actors worked in a featureless room with reflectors attached to their bodies while dozens of digital cameras all around them picked up their movements. The animators used the movements—shrugs, strains, thrusts—to build the animated version of the characters, and added the completed figures to the preset backgrounds. The technique is similar to the one that James Cameron used for “Avatar,” but the look is drier, plainer, airier.
I give Tintin two "thumbs-up," not only for beautiful 3D cinematography but also for 3D CG characters that are finally crossing the "uncanny valley." The characters' eyes are lifelike; the hair and fabrics are finely detailed, showing realistic movement (especially the hair) in the wind. The hybrid nature of the film and the use of actor performance capture were commented on by the audience following the film. Letteri emphasized that the animator as artist is at work: yes, the actor's performance is used as a template, but it is interpreted by the animator.

The lighting and settings were beautifully designed. All of the settings were designed and drawn up just as if they were to be built in the real world, but modeled in the virtual world instead. Letteri told how original source material from Hergé's studio was used as reference by the artists. They tried to remain as true as possible to the original books, but when they found the lighting flat and uninteresting, Spielberg suggested they go dark, resulting in the dramatic film noir look.

The animation itself is captivating. As Denby states, “The Adventures of Tintin” is "a virtual non-stop scramble of running, jumping, swinging, dangling, plunging, and flying." Many of the action sequences, according to Letteri, were conceptualized by the animation team and developed in advance of the script. I found the morphing and the use of scale as a device for the transitions particularly imaginative. In some of the flying scenes, however, there was so much animation that I could not watch without risking a bit of motion sickness.

During the Q&A, Letteri described how he and Peter Jackson introduced Spielberg to the virtual camera while Avatar was in production, shutting down Avatar for a few days to produce a screen test, which he presented to us and which is included in this "making of" trailer:



Tintin and Letteri's approach to filmmaking with the virtual camera are highlighted in an interview from last year in Below the Line News:
After five years of work on Avatar, what became most apparent to Letteri is that much of what he used to consider postproduction he now has to do before any shooting commences. “Avatar gives you that direct feedback,” he said. “It’s taking the last century of filmmaking and applying it to a virtual world. When you are doing that, you want to retain as much of what you know about making good films as possible.

Now that Cameron and company have set up a system for creating a film with largely virtual elements, other productions are picking up on it. Peter Jackson’s own Tintin— with Steven Spielberg directing the first installment — was shot using the same technology that Cameron set up.
Letteri also noted that during production of Tintin, all departments worked together simultaneously. They used an Avid on set to feed editorial, and the film itself was kept manageable by working in game-quality CGI. Time was not lost rendering and re-rendering shots that might be edited out of the story; a final-quality render was done at the end, after editorial was satisfied.
on Thursday, December 8, 2011
I started following writer, artist, humanist and technologist Jonathan Harris after attending a rare talk he gave at UCLA, where I first heard about Cowbird, his work in progress. At the time Harris had just turned 30 and started a daily photoblog. He created a mailing list, to which I subscribed, and was emailed his one image a day. I looked forward to these daily updates but was not surprised when, after a few months, he STOPPED.
His mailing list is still intact, and this is one I got today:


Hey there, you beautiful birds. 

After 2+ years of work, 145,000+ lines of code, one Icelandic grass hut, one night in jail, one serving of jellied ram's testicles with fermented shark meat, and countless pieces of toast with orange marmalade, it is my great pleasure to introduce you to Cowbird, a labor of love, and hopefully something that will have a long and meaningful life. Cowbird is a community of storytellers, focused on deeper, longer-lasting, more personal storytelling than you're likely to find anywhere else on the Web. Cowbird allows you to keep a beautiful audio-visual diary of your life (here's mine), and to collaborate with others in documenting the overarching "sagas" that shape our world today (starting with the Occupy Wall Street movement).
Our short-term goal is to pioneer a new form of participatory journalism, grounded in the simple human stories behind major news events. Our long-term goal is to build a public library of human experience -- kind of like a Wikipedia for real life (but much more beautiful). Here is an overview of the project: http://cowbird.com/about
Here are some good Cowbird stories:
http://cowbird.com/author/jonathan/#/2278
http://cowbird.com/author/scottthrift/#/608
http://cowbird.com/author/annie/#/2509
Here is the story of me getting arrested a few weeks ago at Occupy Oakland :) http://cowbird.com/saga/occupy/story/2312
Our community is still very small. We are looking for excellent storytellers -- photographers, writers, filmmakers, journalists, etc. If you would like to become a Cowbird storyteller, please request an invitation (we're trying to grow slowly). If you know other folks who would be a natural fit for Cowbird, please send them our way. Please spread the word about this project. Tweet it. Facebook it. Tumble it. Email it. Talk about it. Shout it from the rooftops, where those silly pigeons are roosting and pooping. Tell those dirty birds to move over, and make a little room for a different kind of bird. Beautiful things lie ahead.
Jonathan--
Jonathan Harris http://number27.org

Another project by Jonathan Harris, and a favorite of mine, is the beautifully designed Sputnik Observatory website, where lateral conversations dedicated to modern culture interconnect and archive ideas from "extraordinary minds shaping modern thought."
on Saturday, December 3, 2011

The iPad app shown here is a great example of how the technology behind the interface does not overwhelm the user.

NuFormer 3D Video Mapping Interactivity Test 

 

Like Gradient VFX's iPad app created to navigate the Paramount backlot (referenced in my prior blog entry here), this is another sign that we are at the flashpoint of design and technology.

on Thursday, December 1, 2011
There is no stopping workflow convergence in the realm of design for narrative media, as was shown last night by Gradient VFX with their presentation of GLoW visualization software at the ASC Technology Committee meeting.
    Co-founders Olcun Tan and Thomas Tannenberger won the 2010 HPA Award* for GLoW (Gradient Location-Optimized Workflow), which they created as a 2D or stereoscopic 3D previsualization and nonlinear production system.
     GLoW consists of a proprietary software plug-in and related workflow that uses lidar scan metadata of sets and/or locations. The metadata can be imported into third-party software such as Maya and used to create high-quality CGI that is extremely accurate. The locations can be modified or supplemented with assets as needed. They had a library of assets to choose from, including cars, trees and people, that could be dropped into place with the click of a button. Also included were cameras, lights and other production tools.
     The scenes are rendered in REAL TIME with high-quality image-based textures by using 'the cloud' to stream the data online, and can be accessed in many ways, including from a computer, iPad or iPhone.
     They are currently working with Paramount Studios and have a complete 3D model of the backlot that can be accessed via an iPad app, as demonstrated.
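Real-time, cloud-streamed rendering of a lidar scan depends on keeping the scene data light. As a rough illustration of the kind of thinning involved (this is my own minimal sketch, not Gradient's actual, proprietary pipeline, and the point data and voxel size are invented for the example), here is a voxel-grid downsampling function in Python:

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Thin a lidar-style point cloud by averaging all points that fall
    into the same cubic voxel -- a common way to produce a lighter,
    preview-quality version of a heavy scan."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    # One representative point (the centroid) per occupied voxel.
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in buckets.values()
    ]

# A dense cluster of scan points collapses to a single representative.
scan = [(0.1, 0.1, 0.1), (0.2, 0.15, 0.1), (0.12, 0.18, 0.05), (5.0, 5.0, 5.0)]
preview = voxel_downsample(scan, voxel_size=1.0)
print(len(scan), "->", len(preview))  # 4 -> 2
```

In a real pipeline the voxel size would trade fidelity against streaming bandwidth, and tools such as Maya would consume the thinned cloud rather than the raw scan.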


*During the HPA Awards ceremony on November 11, 2010, Gradient VFX was recognized for their inventive thinking and execution of concept with a special honor, the HPA Judges Award for Creativity and Innovation in Post Production.



on Thursday, November 24, 2011
Address Is Approximate

A beautiful stop-motion animation short produced, animated, filmed, lit, edited & graded by Tom Jenkins (theoryfilms.co.uk)
on Wednesday, November 16, 2011
The design world at large is looking for non-linear collaboration, it seems. Parallel to PREVIS in filmmaking, there is Building Information Modeling (BIM) in architecture.
Is BIM the holy grail of architectural collaboration? 

I came across an interesting article from The Guardian on this subject (linked here).


The UK Government is poised to mandate BIM for all public buildings, which will influence the way designers, architects and engineers collaborate on future projects.


Nemetschek VectorWorks offered a webinar on a third-party plug-in for their modeling software that allows virtual online collaboration in 3D, which is very useful for the early design phase. As a VectorWorks user, I am very interested to see how this will be applied to the VIRTUAL ART DEPARTMENT.


Register for the second part of this series: Using an openBIM Workflow (1 AIA LU Hour)
(see the link at the end of this post for Part One - video)


on Monday, November 14, 2011
With rapidly evolving advances in ‘real-time’ software and processing power, I wonder how long it will be before digital ‘previs’ becomes just plain ‘vis’.
"Previs is a collaborative process that generates preliminary versions of shots or sequences, predominantly using 3D animation tools and a virtual environment. It enables filmmakers to visually explore creative ideas, plan technical solutions, and communicate a shared vision for efficient production."

Brian Pohl, Visual Effects (VFX) artist and Previs Supervisor, says in his blog:
“…Technological integration has gone even further with a new process called virtual production… the seamless unification between the virtual and the real… Virtual production goes beyond previs' planning procedures and actually moves them out of preproduction and into principal photography itself.”

The technological integration Pohl cites above is in reference to James Cameron’s development of virtual production, used on the movie Avatar, and is closer to “vis.” Pohl, now with Autodesk's product design team, is a founding member of the Previs Society, a relatively new association of professional cinematic artists based in Los Angeles.
Previs artists are crossing the boundaries of the narrative design fields. In production of film, television, games, live events, architectural and industrial design, digital previs facilitates collaboration, and supports an immersive design process. The cross-pollination of disciplines is at the heart of 5D | Future of Immersive Design Conference and one reason why I joined the Previs Society.
Shared knowledge is a key part of the group's mission, imparted through online discussion and monthly events.
I attended a recent full-day event, the Previs Annual Forum 2011, held on October 22, featuring informative presentations and stimulating discussion with several top Previs Artists. A number of technology companies participated, demonstrating products, hosting lunch, and providing chances to mingle during coffee breaks, followed by a lively cocktail hour that evening.
 Activision Tour
In September, Activision invited the Previs Society for a ‘sneak peek’ of its new state-of-the-art performance capture studio. This opportunity was of special interest to me with regard to 5D, because here is a game company opening their facility to filmmakers. Activision gave us a tour of their large-volume motion-capture space and demonstrated use of their virtual camera system, which I found intriguing.
Activision uses industry-standard infrared camera capture technology with their custom-built wireless virtual camera system. The camera system has a small monitor for viewing a motion-capture performance in a virtual environment, similar to what is typically seen in the ‘making-of-Avatar’ videos.
Flashback
I had seen earlier versions of these types of camera systems before but hadn’t fully grasped the powerful pull of where technology is taking the design process.
In 2008, Chris Edwards, CEO of Third Floor Inc., a company that specializes in previs, was an early adopter of this technology and demonstrated a virtual reality camera made by InterSense, at 5D08 | Future of Immersive Design Conference.
Gamecaster GCS3 Virtual Camera
In 2010, Tino Schaedler, 5D Founding Member, invited a handful of colleagues to his studio for a demo of another virtual camera system that Schaedler thought might be an excellent previs tool for use within the Art Department. He introduced us to developer David Macintosh, CEO of the San Diego-based company Gamecaster, who presented the GCS3, a camera system that allows artists to explore 3D animated environments. At the time I found it difficult to navigate, but with a little practice it would allow a novice such as myself to explore and record shots, saving them as virtual walk/fly-throughs that could be shared with a director.

Several Previs Artists speaking on a Visual Effects Society panel following a screening of ‘Real Steel’ at Disney Studios demonstrated a more advanced version of the virtual camera system. The presentation touched on processes including motion-capture, image-based capture, photogrammetry, virtual camera and simul-cam. The artists raved about the great success of digital previs as a new hub of efficient production planning, while still staying within the parameters of a studio budget and timeframe. In the end the panelists called it ‘modern filmmaking.’
The previs I had been exposed to in the past consisted primarily of crude animatics, with models and characters created in-house by the previs team, typically to figure out complicated VFX shots or stunts. It became clear that the newer processes used by the ‘Real Steel’ panel are pushing previs into a virtual world based on real-world parameters, and on a much larger scale than I had seen before. With this new understanding, I revisited Activision and met with Matt Karnes, Director of Motion Capture, and Sylvain Doreau, Cinematics Director, for another look at their ‘technological tour de force’ motion-capture facility, one that specializes in capturing the data used to create previs, the key driver in modern filmmaking.
The Studio
Activision’s new facility is located in the heart of ‘mo-cap alley’, an area near Marina del Rey that is home to several VFX and premiere mo-cap facilities including House of Moves (HOM) and Giant Studios. Activision started doing in-house mo-cap five years ago, in studio space originally rented from HOM. It has since expanded to become one of the top five of its kind in the world.
30' x 60' Mo-Cap large volume performance capture space 
180º Facial performance capture & audio booth
The studio boasts a 30’ x 60’ (with a 15’-high grid) large-volume performance-capture space with 62 Vicon cameras (used with traditional dot-marker suits) that can easily accommodate 6-10 actors. They have a full-body 3D scanner and a separate 3D head scanner (Cyberware), Canon 5D & 4D cameras for high-resolution texture mapping, and a 180º facial performance-capture system that uses 18 Vicon cameras inside a sound booth for recording simultaneous high-quality audio during a capture session.

The Process
Actors in reflector-dot suits perform on the mo-cap stage within a preloaded virtual environment, provided by the game developer, that has been mapped to the mo-cap stage. The actors are able to orient themselves in the virtual world by viewing a large-screen projection at one end of the space. Simplified props and set pieces at real scale are provided for the cast to interact with, for a more natural performance.
Volume space with a ramp for actor performance
Once the performance data is captured, the actors are finished. The performance data is retargeted onto a digital character, which is combined with a virtual environment so that previs animatics can be made. Much like in live-action filmmaking, the Director of Photography and/or Director, using a custom-built virtual camera with the previs as reference, can then go back onto the mo-cap stage and start filming in real time – but now in a virtual production.
New camera ‘footage’, transmitted wirelessly to a computer, is viewed simultaneously by the DP in the camera ‘viewfinder’ – a 10” high-definition video screen mounted on the camera – and by production/cast on a large viewing screen at one end of the volume space. All the rigs of a traditional camera setup, such as a dolly or Steadicam, may be employed.
Mo-cap client work desks
Sylvain Doreau, Cinematics Director at Activision, explained how virtual camera data allows him to create and edit QuickTime movies in a computer on site as a crosscheck for coverage. Doreau also creates a previs for each shot as a necessary part of his pre-planning, not only for the performance capture but also for the DP, who is shooting the virtual ‘footage’ long after the actors are gone.
 According to Matt Karnes, Activision is transitioning to newer technology in which facial and full-body performance capture will be integrated in the volume space. New helmet rigs worn by the actors will consist of 4 to 6 mini HD cameras capturing video at 60 frames per second. The cameras will be mounted directly onto a lightweight helmet, along with a microphone, to capture the audio and video data. Software tracks dark contrast dots on the actor’s face in the video from each camera; the dots are then reconstructed as positional point-cloud data, much like in a traditional passive-marker mo-cap system. The positional point-cloud data can be reapplied to any facial rig that accepts this type of data.
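As a toy illustration of the retargeting idea above (positional data reapplied to a rig that accepts it), here is a minimal Python sketch. The marker names and poses are invented for the example; this is a simplified sketch of the general concept, not Activision's actual pipeline:

```python
def retarget(neutral_src, captured_src, neutral_dst):
    """Reapply each marker's displacement from the source (actor) neutral
    pose onto the destination (rig) neutral pose. Markers are matched by
    name; positions are (x, y, z) tuples."""
    out = {}
    for name, rig_pos in neutral_dst.items():
        # Displacement of this marker relative to the actor's neutral pose.
        delta = tuple(c - n for c, n in zip(captured_src[name], neutral_src[name]))
        out[name] = tuple(r + d for r, d in zip(rig_pos, delta))
    return out

# The actor raises a brow marker by 0.5 units...
neutral_actor = {"brow_L": (1.0, 2.0, 0.0)}
captured      = {"brow_L": (1.0, 2.5, 0.0)}
# ...and the same displacement lands on the differently placed rig marker.
neutral_rig   = {"brow_L": (0.75, 2.0, 0.0)}
print(retarget(neutral_actor, captured, neutral_rig))  # {'brow_L': (0.75, 2.5, 0.0)}
```

Production systems solve this per frame for thousands of reconstructed points, and map them onto rig controls rather than raw positions, but the name-matched displacement transfer is the core of the idea.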


Cross-pollination of technology in media design
Game developers such as Activision, applying film technology to the creation of games, are an example of the cross-pollination of technology in the field of media design. At the same time, filmmakers are taking technology from the game world, specifically real-time rendering software, and developing virtual production.

The Modern Art Department
New technology, real-time rendering power, and better 3D modeling software together enable immersive design visualization within the Art Department. By designing a virtual world in which the story resides, the Art Department provides a hub for collaboration that all other departments can feed into, giving Previs Artists a world “on the front end” of production within which to operate and facilitate.
Alex McDowell, Production Designer and 5D Co-founder states:

“In the 'modern' design department, where every designer is using digital tools and working virtually, design visualization is in fact a continual by-product of the design flow. Every design asset created by every designer is potentially an asset for Previs, and the design hub should be considered the front end of a continuous flow of data, through Previs, and to VFX.”

Digital assets normally created by Visual Effects artists on the ‘back end’ are now more and more created on the ‘front end.’ Previs Artists can pull digital design elements from all departments – including Art, Costumes, Cinematography and Visual Effects – assemble these elements into a production ‘pipeline,’ and use them to pre-visualize shots for early editorial feedback.
Previs acts as a bridge from pre-production to production to post-production. It can prevent the movie from being “made three times.” Previs, in effect, also creates a prototype that can be tested before the film is made: a new planning tool that, for all production departments, can result in dramatic savings of time, effort and cost.

on Saturday, November 12, 2011
Imagine a virtual ocean that is participatory in that you can create a fish, and if your creation is purchased by another participant, you earn money. Strange but true....

Wemo Media and the Los Angeles Chapter of ACM SIGGRAPH held a special behind-the-scenes look at “The Blu”, a new online social-media experience created in collaboration with Hollywood and academia, on Oct. 11, 2011.

screen capture of user interface published on Deep Sea News


The invite announced:
     "The Wemo Media team is comprised of Academy award winning artists, technology innovators and execs from film, games and the web, including: Andy Jones (Academy award winner, Avatar), Kevin Mack (Academy award winner, What Dreams May Come), Louie Psihoyos (Academy award winner, The Cove) and Joichi Ito (Director, MIT Media lab). 
    Currently in private beta, “The Blu” is the first experience using the Maker Media Platform from Wemo Media. “The Blu” brings the ocean to life on the web. “The Blu” is a beautiful interactive online geosocial web application where people connect across the Internet and explore a vast ocean on the web. Every species and habitat is a unique work of art created by digital artists and developers around the world. Picture hundreds of thousands of aquatic species and tens of thousands of underwater habitats, beautiful works of art created by artists and developers all over the world, spread across the canvas of the World Wide Web as a globally shared art gallery.
     Wemo Media is passionate about enabling artists and developers to co-create globally shared media experiences and storytelling. At the core of Wemo Studios is the Maker Media Platform, a disruptive online creative collaboration platform enabling digital artists and developers around the world to peer-produce innovative digital media."


I asked Neville Spiteri, creator of the project, whether this space would be available for rent, and he thought I meant the studio. I clarified and said 'the virtual space,' which is now an asset. Filmmakers could conceivably use this in production... I am not sure whether that idea had occurred to him before, but it seems like a 'no-brainer,' especially since the plan is to create a realistic ocean environment that can be used in part for education and research.

LA Times:  INTO THE BLU

Deep Sea News – an interesting article from a marine biologist:
http://deepseanews.com/2011/11/into-theblu-virtually/
TheBlu website: http://theblu.com
The pictures included in this post can be found on Facebook: https://www.facebook.com/intheblu


on Friday, November 11, 2011
TV VFX is delivering like never before thanks to advances in software. Film technology is trickling down to TV at a much faster rate, allowing production to meet very short deadlines, often inside of a week.
Jurassic Jolts Terra Nova

NewTek hosted an excellent panel of stellar VFX artists working in television on Nov. 10, 2011 at the Academy of Television Arts & Sciences.

Moderated by David Cohen, Variety tech reporter, with panelists:
Chuck Comisky - VFX Supervisor and Film Producer TERMINATOR 2 3D, ALIENS OF THE DEEP, AVATAR (Stereoscopic VFX Supervisor); On set stereo supervision -SANCTUM 3D, FINAL DESTINATION 5 3D
Doug Drexler – CG Supervisor, VES, Emmy and Academy Award winner for BATTLESTAR GALACTICA and DICK TRACY
Ron Thornton – VFX Producer STAR TREK: THE MOTION PICTURE/Emmy Award winner – BABYLON 5 (VFX Supervisor)
Eric Hance - TERRA NOVA VFX Supervising Artist/Emmy Award winner

David Cohen commented that while in film there are many VFX-driven features, in television story and content is still king. Much VFX work is minimal and hard to see, such as clean-up work or location fixes; however, Doug Drexler and Eric Hance are working on two of the heaviest VFX shows on TV at the moment. Both commented on the fast turnaround and the role of VFX in story. They are able to work on the fly, directly in production, alongside Art and Editorial (vs. in post), and enjoy the direct collaboration they share with the writers to develop story when needed. Active team participation in production is relished by these artists.
More virtual sets are being used than ever before. The thought is that once production gets used to green screen there will be no going back, since the cost savings are so great. There were no Production Designers present, and I find myself wondering how much input the Production Designer is giving VFX to go on.





on Thursday, November 10, 2011
Tom Walsh, ADG President, invited a handful of colleagues, myself included, to participate in a roundtable discussion with Dr. Gianluca Sergi, Director of the Institute for Screen Industries Research, University of Nottingham, UK, on Oct. 10, 2011.

Sergi is working on a publication based on his findings from a survey of the world's most prominent film studios:

 Joining the dots - or around the world in 80 studios
The University of Nottingham in the UK has begun a programme of research projects, industry engagements with leading practitioners and partnerships with major studios to build the kind of research capacity and skill-base that are necessary to help filmmakers and studios alike build a more stable, sustainable and productive environment within which to work.

As our first research project, we asked a 'simple' question: what is the role of traditional studio space in the digital age? The question has emerged out of debates around the introduction of digital technologies, coupled with recent large investment in traditional spaces of production (see Warner's investment in Leavesden in the UK, Cinecittà's revamping of their Rome studios, Pinewood opening studios in several countries, and the remarkable success of WETA in New Zealand).

By looking at key studio 'models' around the world – what they have in common, what differentiates them, and how studio production is changing – the project provides a snapshot of what the future may have in store for studio production models, their infrastructures, and how these may impact the dynamics of filmmaking.
China and Brazil, according to Sergi, are poised to release a production 'tsunami' that will impact US film production in unforeseen ways unless studios start talking to each other. I asked him whether the studios would ever talk to one another, given that they are competitors. I did not get a satisfactory answer to my question... perhaps because he has an academic point of view.

He suggested we learn to speak Mandarin!  The University of Nottingham has a campus in Shanghai.
One anecdote concerned the number of skyscrapers being built there. Sergi says they keep building regardless of tenancy (there is a high vacancy rate) to keep people employed. In addition, new cinemas are opening at a rate of three a day.
What a difference a decade makes!
on Wednesday, November 9, 2011
Los Angeles ACM SIGGRAPH hosted a great filmmaking panel last night at the Art Institute of California in Santa Monica.



"The visceral robot boxing matches in REAL STEEL were conceived, planned and executed using an innovative process that gave the filmmakers unprecedented creative freedom and efficiency – by taking virtual production out of the studio and bringing it into real world settings. From robot animation and lighting to set extensions and arena crowds, the teams at Digital Domain and Giant Studios streamlined production from start to finish for maximum efficiency and photorealism."


Digital Domain Panelists:

Erik Gamache - Animation Supervisor 
Geoffrey Baumann -  Digital Environments Supervisor
Justin van der Lek - Digital Environments Lead 
Blake Sweeney - Digital Lighting Lead 

Casey Schatz - Previs Supervisor from Giant Studios

'Real Steel' is among the first film productions, if not THE FIRST, to bring on a digital team in preproduction to capture all the real-world data from the locations and combine it with design data from the Art Department, for use in planning the production from the concept phase through editorial and into post-production. This is modern filmmaking.

This was the second panel I attended on the making of this film. The Visual Effects Society (VES) hosted a similar panel following a screening on Oct. 8, 2011 at Disney Studios. That panel inspired me to write an article about previs and virtual production for Perspective magazine, published by the Art Directors Guild.

on Friday, November 4, 2011
California Poppies
This is a test post using a photo taken with my iPhone in Antelope Valley during peak poppy season at the Antelope Valley Poppy Reserve in 2009. The iPhone has an incredible camera.